Reputation: 4037
For executing tests I usually run a separate container with:
docker-compose run --rm web /bin/bash
where web is the container with Django in it. From that shell I run py.test from time to time.
In order to reach Selenium from the Django container, and to let the browser in the Selenium container reach Django's live server, I decided to use the "net" parameter, which lets containers share a network stack. So I added it to the yml:
selenium:
  image: selenium/standalone-firefox
  net: "container:web"
Unfortunately this does not work: I do not see port 4444 in my Django container.
It only works if, instead of net: "container:web", I specify the autogenerated container name, like net: "container:project_web_run_1".
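In other words, only something like this works (the container name being whatever Compose generated for that particular run session):

selenium:
  image: selenium/standalone-firefox
  # works, but this name changes with every docker-compose run session
  net: "container:project_web_run_1"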
I also tried using docker-compose up --no-deps instead of docker-compose run --rm ...., changing the command parameter to py.test functional_tests, but that did not work either.
Is this the right way of using Selenium with containers?
Upvotes: 12
Views: 6783
Reputation: 451
Just set the DJANGO_LIVE_TEST_SERVER_ADDRESS in the main settings.py:
Example:
### settings.py
import os
import socket

# Bind the live test server to this container's IP so the selenium container can reach it
os.environ['DJANGO_LIVE_TEST_SERVER_ADDRESS'] = socket.gethostbyname(socket.gethostname())
Upvotes: 0
Reputation: 291
I have just specified host = 'web' for LiveServerTestCase. Here is my working solution.
test.py
from django.test import LiveServerTestCase
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities


class FunctionalTestCase(LiveServerTestCase):
    host = 'web'

    def setUp(self):
        self.browser = webdriver.Remote(
            command_executor="http://selenium:4444/wd/hub",
            desired_capabilities=DesiredCapabilities.FIREFOX
        )

    def test_user_registration(self):
        self.browser.get(self.live_server_url)
        self.assertIn('Django', self.browser.title)

    def tearDown(self):
        self.browser.close()
docker-compose.yml
version: '3'
services:
  db:
    image: postgres
  web:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
      - selenium
  selenium:
    image: selenium/standalone-firefox
Remember you have to install selenium in your Docker image for this to work:
$ docker-compose exec web bash
> pip install selenium
...
> pip freeze > ../requirements.txt
> exit
$ ...
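After that, presumably the image just needs to be rebuilt (so selenium is installed from the updated requirements.txt) and the tests can be run from the web container, assuming the usual manage.py entry point:

$ docker-compose build web
$ docker-compose run --rm web python manage.py test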
Upvotes: 1
Reputation: 381
For anyone running pytest, and possibly pytest-splinter (a Selenium wrapper):
version: '3'
services:
  db:
    image: postgres
  django:
    build: .
    ports:
      - "8000:8000"
    depends_on:
      - db
      - selenium
  selenium:
    image: selenium/standalone-firefox-debug:latest
    ports:
      - "4444:4444"  # Selenium
      - "5900:5900"  # VNC
Define a conftest.py in your root directory to make these fixtures available to all your tests
import socket

import pytest
from pytest_django.live_server_helper import LiveServer


@pytest.fixture(scope='session')
def test_server() -> LiveServer:
    addr = socket.gethostbyname(socket.gethostname())
    server = LiveServer(addr)
    yield server
    server.stop()


@pytest.fixture(autouse=True, scope='function')
def _test_server_helper(request):
    """
    Configures test_server fixture so you don't have to mark
    tests with @pytest.mark.django_db
    """
    if "test_server" not in request.fixturenames:
        return
    request.getfixturevalue("transactional_db")


# Settings below here are exclusive to splinter,
# I'm just overriding the default browser fixture settings.
# If you just use selenium, no worries, just take note of the remote url and use
# it wherever you define your selenium browser.
@pytest.fixture(scope='session')
def splinter_webdriver():
    return 'remote'


@pytest.fixture(scope='session')
def splinter_remote_url():
    return 'http://selenium:4444/wd/hub'
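If you use plain Selenium rather than splinter, a session-scoped browser fixture pointing at the same remote URL might look roughly like this (a sketch using the Selenium 3-style API seen elsewhere in this thread; the fixture name browser is my own choice, not something pytest-django provides):

import pytest
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities


@pytest.fixture(scope='session')
def browser():
    # Connect to the standalone-firefox container by its compose service name,
    # using the same remote URL as splinter_remote_url above
    driver = webdriver.Remote(
        command_executor='http://selenium:4444/wd/hub',
        desired_capabilities=DesiredCapabilities.FIREFOX,
    )
    yield driver
    driver.quit()

With plain Selenium the test would then call browser.get(...) rather than splinter's browser.visit(...).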
Don't forget to set ALLOWED_HOSTS in your config file:
if env('USE_DOCKER') == 'yes':
    import socket
    ALLOWED_HOSTS = [socket.gethostbyname(socket.gethostname())]
    # or just
    ALLOWED_HOSTS = ['*']
Then just test away!
from django.urls import reverse


def test_site_loads(browser, test_server):
    browser.visit(test_server.url + reverse('admin:index'))
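With the compose file above, the tests can then be run from the django service, for example:

$ docker-compose run --rm django pytest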
Upvotes: 6
Reputation: 964
In my case, the "web" container runs only one command, which is bash -c "sleep infinity".
Then, I start the whole stack with docker-compose up -d.
Then, I use docker-compose exec web bash -c "cd /usr/src/app && tox", for example.
This way, my web host is accessible from selenium, always under the same name.
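A minimal sketch of what that might look like in docker-compose.yml (the build and volume details are assumptions, not from the original setup, except /usr/src/app which matches the exec command above):

services:
  web:
    build: .
    # keep the container alive so tests can be exec'd into it on demand
    command: bash -c "sleep infinity"
    volumes:
      - .:/usr/src/app
    depends_on:
      - selenium
  selenium:
    image: selenium/standalone-firefox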
Using docker-compose run web ... generates a new (predictable, but still different) host name every single time.
Upvotes: 0
Reputation: 3340
Here is how I do it. The basic problem is that docker-compose run will generate a different hostname (project_container_run_x) where x is hard to know for sure, so I ended up just going off the IP address. I'm also ensuring DEBUG is True, otherwise I get a bad request.
I'm using StaticLiveServerTestCase like this:
import os
import socket

from django.conf import settings
from django.contrib.staticfiles.testing import StaticLiveServerTestCase
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities

os.environ['DJANGO_LIVE_TEST_SERVER_ADDRESS'] = '0.0.0.0:8000'


class IntegrationTests(StaticLiveServerTestCase):
    live_server_url = 'http://{}:8000'.format(
        socket.gethostbyname(socket.gethostname())
    )

    def setUp(self):
        settings.DEBUG = True
        self.browser = webdriver.Remote(
            command_executor="http://selenium:4444/wd/hub",
            desired_capabilities=DesiredCapabilities.CHROME
        )

    def tearDown(self):
        self.browser.quit()
        super().tearDown()

    def test_home(self):
        self.browser.get(self.live_server_url)
My docker-compose file has this for selenium and extends the web container (where django is running). Port 5900 is open for VNC. I like to keep this isolated in something like docker-compose.selenium.yml
version: '2'
services:
  web:
    environment:
      SELENIUM_HOST: http://selenium:4444/wd/hub
      TEST_SELENIUM: 'yes'
    depends_on:
      - selenium
  selenium:
    image: selenium/standalone-chrome-debug
    ports:
      - "5900:5900"
I can run tests like
docker-compose run --rm web ./manage.py test
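If the selenium bits live in a separate docker-compose.selenium.yml as mentioned above, the override file presumably has to be passed explicitly, e.g.:

$ docker-compose -f docker-compose.yml -f docker-compose.selenium.yml run --rm web ./manage.py test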
So my web container is accessing selenium via the "selenium" host. Selenium then accesses the web container by ip address which is determined on the fly.
Another gotcha is that it's tempting to just use "web" as the hostname. If your docker-compose run command starts up a separate web container, this will appear to work; however, the requests will hit the long-running web service instead of the test server, so it won't be using your test database, which makes for a poor test.
Upvotes: 8