Section author: Mike Fitzpatrick <>

3.5. Python Unit Testing


This document is currently just a PROPOSED testing standard. It is derived from the LSST document in the hopes of leveraging a common test framework.


This document was derived from version 6.0 of the LSST/DM Python Testing document. External documents referenced in the original LSST/DM document have been partially imported as needed for clarity, or else now reference similarly modified Data Lab documents.

This page provides technical guidance to developers writing unit tests for Data Lab’s Python code base. Tests should be written using the unittest framework, with default test discovery, and should support being run using the pytest test runner as well as from the command line.

3.5.1. Introduction to unittest

This document will not attempt to explain the full details of how to use unittest but instead shows common scenarios encountered in the codebase.

A simple unittest example is shown below:

 1import unittest
 2import math
 3
 4
 5class DemoTestCase1(unittest.TestCase):
 6    """Demo test case 1."""
 7
 8    def testDemo(self):
 9        self.assertGreater(10, 5)
10        with self.assertRaises(TypeError):
11            1 + "2"
12
13
14class DemoTestCase2(unittest.TestCase):
15    """Demo test case 2."""
16
17    def testDemo1(self):
18        self.assertNotEqual("string1", "string2")
19
20    def testDemo2(self):
21        self.assertAlmostEqual(3.14, math.pi, places=2)
22
23
24if __name__ == "__main__":
25    unittest.main()

The important things to note in this example are:

  • Test file names must begin with test_ to allow pytest to automatically detect them without requiring an explicit test list, which can be hard to maintain and can lead to missed tests.

  • If the test is being executed using python from the command line, the unittest.main() call performs the test discovery and executes the tests, setting the exit status to non-zero if any of the tests fail.

  • Under pytest, test classes are executed in the order in which they appear in the test file, while unittest.main() loads classes and methods in alphabetical order. In this example the tests in DemoTestCase1 will be executed before those in DemoTestCase2 either way; tests should never rely on execution order.

  • Test classes must, ultimately, inherit from unittest.TestCase in order to be discovered. The tests themselves must be methods of the test class with names that begin with test. All other methods and classes will be ignored by the test system but can be used by tests.

  • Specific test asserts, such as assertGreater(), assertIsNone() or assertIn(), should be used wherever possible. It is always better to use a specific assert because the error message will contain more useful detail and the intent is more obvious to someone reading the code. Only use assertTrue() or assertFalse() if you are checking a boolean value, or a complex statement that is unsupported by other asserts.

  • When testing that an exception is raised always use assertRaises() as a context manager, as shown in line 10 of the above example.

  • If a test method completes, the test passes; if it raises an uncaught exception (such as a failed assertion), the test fails.

3.5.2. Supporting Pytest

pytest provides a rich execution and reporting environment for tests and can be used to run multiple test files together.

The pytest scheme for discovering tests inside Python modules is much more flexible than that provided by unittest, but test files should not take advantage of that flexibility as it can lead to inconsistency in test reports that depend on the specific test runner, and it is required that an individual test file can be executed by running it directly with python. In particular, care must be taken not to have free functions that use a test prefix or non-TestCase test classes that are named with a Test prefix in the test files.
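As an illustration of these naming constraints, the following sketch (the helper name is hypothetical) keeps a shared module-level helper out of test collection by avoiding the test prefix:

```python
import unittest


# Helper deliberately avoids a "test" prefix so that neither pytest nor
# unittest mistakes it for a free-standing test function.
def check_positive(value):
    """Shared helper used by the tests below."""
    return value > 0


class NamingDemoTestCase(unittest.TestCase):
    def testHelperUse(self):
        self.assertTrue(check_positive(3))
        self.assertFalse(check_positive(-1))


if __name__ == "__main__":
    unittest.main(exit=False)  # exit=False keeps the example embeddable
```

Run directly with python or via pytest, exactly one test is collected: the helper is ignored by both runners.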

3.5.3. Testing Flask Applications

Data Lab services are written using the Flask microframework. See the discussion of Flask testing for more information on how to use pytest with these applications.
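As a sketch of the general pattern (the application and its /health route are hypothetical, not an actual Data Lab service), a Flask application can be exercised from a normal unittest test case through Flask's built-in test client, without starting a real server:

```python
import unittest

from flask import Flask

# Hypothetical application; a real Data Lab service would import its
# existing app object instead of constructing one here.
app = Flask(__name__)


@app.route("/health")
def health():
    return "OK"


class HealthTestCase(unittest.TestCase):
    def setUp(self):
        app.config["TESTING"] = True
        # The test client issues requests in-process, with no server.
        self.client = app.test_client()

    def testHealth(self):
        response = self.client.get("/health")
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.data, b"OK")


if __name__ == "__main__":
    unittest.main(exit=False)  # exit=False keeps the example embeddable
```

Because the test client drives the WSGI application directly, such tests remain fast and run unchanged under both python and pytest.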

3.5.4. Common Issues

This section describes some common problems that are encountered when using pytest.

Testing global state

pytest can run tests from more than one file in a single invocation and this can be used to verify that there is no state contaminating later tests. To run pytest use the pytest executable:

$ pytest

to run all files in the tests directory named test_*.py. To ensure that the order of test execution does not matter, it is sometimes useful to run the tests in reverse order by listing the test files manually:

$ pytest `ls -r tests/test_*.py`
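A common source of such order dependence is a test that mutates global state without restoring it. A minimal sketch of the save-and-restore pattern in setUp/tearDown (the DL_MODE environment variable is a hypothetical placeholder):

```python
import os
import unittest


class GlobalStateTestCase(unittest.TestCase):
    """Restore any global state touched by a test so that later tests
    pass regardless of execution order."""

    def setUp(self):
        # DL_MODE is a hypothetical variable used for illustration only.
        self._saved = os.environ.get("DL_MODE")
        os.environ["DL_MODE"] = "test"

    def tearDown(self):
        # Put the environment back exactly as we found it.
        if self._saved is None:
            os.environ.pop("DL_MODE", None)
        else:
            os.environ["DL_MODE"] = self._saved

    def testUsesMode(self):
        self.assertEqual(os.environ["DL_MODE"], "test")


if __name__ == "__main__":
    unittest.main(exit=False)  # exit=False keeps the example embeddable
```

After the suite runs, the environment is unchanged, so the test behaves identically whether it runs first, last, or alone.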


pytest plugins are usually all enabled by default.

Test Skipping and Expected Failures

When writing tests it is important that tests are skipped using the proper unittest mechanisms rather than by returning from the test early. unittest supports skipping individual tests and entire classes using decorators or skip exceptions. It is also possible to mark a particular test as an expected failure; such a test is reported as an error if it unexpectedly passes. Expected failures can be used to write test code that triggers a reported bug before the fix has been implemented, without causing the continuous integration system to report an overall failure.

One of the primary advantages of using a modern test runner such as pytest is that it is very easy to generate machine-readable pass/fail/skip/xfail statistics to see how the system is evolving over time, and it is also easy to enable code coverage. Jenkins now provides test result information.
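The skipping and expected-failure mechanisms can be sketched as follows (the skip conditions and the "known bug" are purely illustrative):

```python
import sys
import unittest


class SkipDemoTestCase(unittest.TestCase):
    @unittest.skip("demonstrating unconditional skipping")
    def testAlwaysSkipped(self):
        self.fail("this body is never executed")

    @unittest.skipIf(sys.platform == "nonexistent-os", "platform-specific")
    def testConditionallySkipped(self):
        # Condition is false here, so this test runs normally.
        self.assertTrue(True)

    @unittest.expectedFailure
    def testKnownBug(self):
        # Exercises a hypothetical reported bug: reported as an expected
        # failure until the fix lands, and as an error if it passes.
        self.assertEqual(1 + 1, 3)


if __name__ == "__main__":
    unittest.main(exit=False)  # exit=False keeps the example embeddable
```

Under pytest the three outcomes appear as one skip, one pass, and one xfail, and the suite as a whole still succeeds.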

3.5.5. Enabling additional Pytest options: flake8

As described in Code MAY be validated with flake8, Python modules can be configured using the setup.cfg file. This configuration is supported by pytest and can be used to enable additional testing or tuning on a per-package basis. pytest uses the [tool:pytest] block in the configuration file. To enable automatic flake8 testing as part of the normal test execution the following can be added to the setup.cfg file:

[tool:pytest]
addopts = --flake8
flake8-ignore = E133 E211 E221 E223 E226 E228 N802 N803 N806 W504

The addopts parameter adds additional command-line options to the pytest command when it is run from the command line. A wrinkle with the configuration of the pytest-flake8 plugin is that it inherits the max-line-length and exclude settings from the [flake8] section of setup.cfg, but you are required to explicitly list the codes to ignore when running within pytest by using the flake8-ignore parameter. One advantage of this approach is that you can ignore error codes from specific files so that the unit tests will pass, while running flake8 from the command line will remind you that there is an outstanding issue. This feature should be used sparingly, but can be useful when you wish to enable code linting for the bulk of the project but have some issues preventing full compliance.

With this configuration each Python file tested by pytest will have flake8 run on it.

3.5.6. Using a shared base class

For some tests it is helpful to provide a base class and then share it amongst multiple test classes that are configured with different attributes. If this is required, be careful not to have helper functions prefixed with test. Do not name the base class with a Test prefix, and ensure it does not inherit from TestCase; if it does, pytest will attempt to find tests inside it and will issue a warning if none can be found. (Under plain unittest that situation can be worked around by creating an explicit test suite that includes only the concrete test classes and omits the base class, but this workaround does not work in a pytest environment.)

Consider the following test code:

import unittest

class BaseClass(object):
    """Helper base class; deliberately not a unittest.TestCase."""

    def testParam(self):
        self.assertLess(self.PARAM, 5)

class ThisIsTest1(BaseClass, unittest.TestCase):
    PARAM = 3

if __name__ == "__main__":
    unittest.main()
Here ThisIsTest1 inherits from both the helper class and unittest.TestCase, so the runner executes its single test without attempting to run any tests in BaseClass:

$ pytest -v python/examples/
======================================= test session starts ========================================
platform darwin -- Python 3.4.3, pytest-3.2.1, py-1.4.30, pluggy-0.3.1 -- /usr/local/bin/python3.4
cachedir: python/examples/.cache
rootdir: python/examples, inifile:
collected 1 items

python/examples/ PASSED

===================================== 1 passed in 0.02 seconds =====================================