RobotFramework, Chromedriver and Docker
One of my team members implemented RobotFramework support for automated browser testing of our platform a while ago. At the time, we were using Codeship Basic, and I built a helper to run a robot test suite within a tox environment. It all just worked, because chromedriver and all its dependencies were already installed.
But time passes, and we needed to move to Codeship Pro, which has some neater features but required me to build docker images for everything. We already use docker for deployment, but I didn’t really want to build a bunch of distinct images just for testing that re-implemented the same stuff we have in our deployment images. Even just appending new stuff to them means things could turn out to be a pain in the arse to manage.
And getting chromedriver installed into a docker image is not neat.
I did find a docker image that just has an instance of chromedriver, and exposes that. But getting that to work with robot was still a bunch of work. After much experimentation, I was able to get the connections between everything to work.
First, we need to have the chromedriver container running:
$ docker run -p 4444:4444 -e CHROMEDRIVER_WHITELISTED_IPS='' robcherry/docker-chromedriver:latest
Then, there are a few moving parts that need to be in place to get things to work. Using my djangobot management command (which I had to extend a bit here), a single command can spin up a Django runserver, apply migrations (if necessary), and then run the robot commands. The trick is that you need to teach Robot to speak to the remote WebDriver instance, which in turn speaks to the running django webserver.
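The shape of that orchestration can be sketched in plain Python. This is illustrative only: the option names (`--variable`, `--include`, `--exclude`, `--outputdir`) are robot’s real CLI flags, but the function itself is not djangobot’s actual code.

```python
def build_robot_argv(hostname, port, remote_url="", include_tags=(),
                     exclude_tags=(), tests_dir="robot_tests",
                     output_dir="robot_results"):
    """Assemble the argv for a `robot` run, passing the server details
    through as Robot variables (they land in resource.robot as
    ${HOSTNAME}, ${PORT} and ${REMOTE_URL})."""
    argv = ["robot", "--outputdir", output_dir]
    argv += ["--variable", f"HOSTNAME:{hostname}"]
    argv += ["--variable", f"PORT:{port}"]
    if remote_url:
        # Only set when driving a remote chromedriver container.
        argv += ["--variable", f"REMOTE_URL:{remote_url}"]
    for tag in include_tags:
        argv += ["--include", tag]
    for tag in exclude_tags:
        argv += ["--exclude", tag]
    argv.append(tests_dir)
    return argv
```

A management command would then migrate, start the dev server in a thread or subprocess, and hand this argv to robot’s runner.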
First, the RobotFramework side; my resource.robot file, which is referenced by all of my robot test suites, contains:
*** Variables ***
${HOSTNAME}       127.0.0.1
${PORT}           8000
${SCHEME}         http
${SERVER}         ${SCHEME}://${HOSTNAME}:${PORT}
${BROWSER}        headlesschrome
${TIMEOUT}        30
${REMOTE_URL}

*** Settings ***
Documentation     A resource file with reusable keywords and variables.
Library           SeleniumLibrary    timeout=${TIMEOUT}    implicit_wait=1
Library           Collections
Library           DebugLibrary
Library           DateTime
Library           String
Library           djangobot.DjangoLibrary    ${HOSTNAME}    ${PORT}
*** Keywords ***
Create Remote Webdriver
    ${chrome_options} =    Evaluate    sys.modules['selenium.webdriver'].ChromeOptions()    sys, selenium.webdriver
    Call Method    ${chrome_options}    add_argument    headless
    Call Method    ${chrome_options}    add_argument    disable-gpu
    Call Method    ${chrome_options}    add_argument    no-sandbox
    ${options}=    Call Method    ${chrome_options}    to_capabilities
    Create Webdriver    Remote    command_executor=${REMOTE_URL}    desired_capabilities=${options}
    Open Browser    ${SERVER}    ${BROWSER}    remote_url=${REMOTE_URL}    desired_capabilities=${options}

Start Session
    Run Keyword If    '${REMOTE_URL}'    Create Remote Webdriver
    Run Keyword If    '${REMOTE_URL}' == ''    Open Browser    ${SERVER}    ${BROWSER}
    Set Window Size    2048    2048
    Fetch Url    login
    Add Cookie    robot    true
    Register Keyword To Run On Failure    djangobot.DjangoLibrary.Dump Error Data

End Session
    Close Browser

Logout
    Fetch Url    logout
Notice that the Start Session keyword determines which type of browser to open: either a local or a remote one.
Thus, each *.robot file starts with:
*** Settings ***
Resource          resource.robot
Suite Setup       Start Session
Suite Teardown    End Session
Test Setup        Logout
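A minimal suite built on top of that might look like the following; the test case itself is illustrative (Fetch Url is a djangobot keyword, Page Should Contain Element comes from SeleniumLibrary), not one of my actual tests:

*** Settings ***
Resource          resource.robot
Suite Setup       Start Session
Suite Teardown    End Session
Test Setup        Logout

*** Test Cases ***
Login Page Loads
    Fetch Url    login
    Page Should Contain Element    css:form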
Because the requests will no longer be coming from localhost, you need to ensure that your runserver is listening on the interface the requests will come from. If you can’t determine this, and your machine is not exposed to an insecure network, you can use 0.0.0.0 to get the django devserver to listen on all interfaces. You will also need to supply the hostname you will be using for the requests (which won’t be localhost any more), and ensure it is in your Django settings.ALLOWED_HOSTS.
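For example, assuming the test host is mymachine.local as in the command below, the relevant settings fragment would be along these lines (the hostnames are illustrative):

```python
# settings.py (sketch) — hosts the remote browser is allowed to request
ALLOWED_HOSTS = [
    "localhost",
    "127.0.0.1",
    "mymachine.local",  # the hostname passed to the robot run
]
```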
In my case, I needed to make my robot command accept all of this, but ultimately I can now do:
$ ./manage.py robot --runserver 0 \
--listen 0.0.0.0 \
--hostname mymachine.local \
--remote-url http://localhost:4444 \
--include tag
This runs against the database I already have prepared, but in my codeship-steps.yml I needed to do a bit more, and hook it up to the other containers:
coverage run --branch --parallel \
/app/manage.py robot --migrate \
--server-url=http://web:8000 \
--remote-url=http://chromedriver:4444 \
--tests-dir=/app/robot_tests/ --output-dir=/coverage/robot_results/ \
--exclude skip --exclude expected-failure
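For the hostnames web and chromedriver in those URLs to resolve, the containers have to be linked in codeship-services.yml. A sketch of what that needs to contain (service and image names here are illustrative, not my exact file):

web:
  build:
    dockerfile: Dockerfile
  links:
    - chromedriver
chromedriver:
  image: robcherry/docker-chromedriver:latest
  environment:
    CHROMEDRIVER_WHITELISTED_IPS: ''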
Now, if only Codeship’s jet tool actually cached multi-stage builds correctly. Maybe I need to try this.