###################
Python Unit Testing
###################

.. note::

   This document is currently just a **PROPOSED** testing standard.  It is
   derived from the LSST document in the hopes of leveraging a common test
   framework.

.. note::

   This document was derived from version 6.0 of the LSST/DM Python Testing
   document
   (https://github.com/lsst-dm/dm_dev_guide/blob/master/python/testing.rst).
   External documents referenced in the original LSST/DM document have been
   partially imported as needed for clarity, or else now reference similarly
   modified Data Lab documents.

This page provides technical guidance to developers writing unit tests for
Data Lab's Python code base.  Tests should be written using the
:mod:`unittest` framework, with default test discovery, and should support
being run using the `pytest`_ test runner as well as from the command line.

.. _pytest: http://pytest.org

Introduction to ``unittest``
============================

This document will not attempt to explain the full details of how to use
:mod:`unittest` but instead shows common scenarios encountered in the
codebase.

A simple :mod:`unittest` example is shown below:

.. literalinclude:: examples/test_basic_example.py
   :linenos:
   :language: python

The important things to note in this example are:

* Test file names must begin with ``test_`` to allow `pytest`_ to
  automatically detect them without requiring an explicit test list, which
  can be hard to maintain and can lead to missed tests.

* If the test is being executed using :command:`python` from the command
  line, the :py:func:`unittest.main` call performs the test discovery and
  executes the tests, setting the exit status to non-zero if any of the
  tests fail.

* Test classes are executed in the order in which they appear in the test
  file.  In this case the tests in ``DemoTestCase1`` will be executed before
  those in ``DemoTestCase2``.

* Test classes must, ultimately, inherit from :class:`unittest.TestCase` in
  order to be discovered.
* The tests themselves must be methods of the test class with names that
  begin with ``test``.  All other methods and classes will be ignored by the
  test system but can be used by tests.

* Specific test asserts, such as :meth:`~unittest.TestCase.assertGreater`,
  :meth:`~unittest.TestCase.assertIsNone` or
  :meth:`~unittest.TestCase.assertIn`, should be used wherever possible.  It
  is always better to use a specific assert because the error message will
  contain more useful detail and the intent is more obvious to someone
  reading the code.  Only use :meth:`~unittest.TestCase.assertTrue` or
  :meth:`~unittest.TestCase.assertFalse` if you are checking a boolean
  value, or a complex statement that is unsupported by other asserts.

* When testing that an exception is raised, always use
  :meth:`~unittest.TestCase.assertRaises` as a context manager, as shown in
  line 10 of the above example.

* If a test method completes, the test passes; if it throws an uncaught
  exception, the test has failed.

Supporting Pytest
=================

`pytest`_ provides a rich execution and reporting environment for tests and
can be used to run multiple test files together.

The `pytest`_ scheme for discovering tests inside Python modules is much
more flexible than that provided by :mod:`unittest`, but test files should
not take advantage of that flexibility, as it can lead to inconsistency in
test reports that depend on the specific test runner, and it is required
that an individual test file can be executed by running it directly with
:command:`python`.  In particular, care must be taken not to have free
functions that use a ``test`` prefix, or non-\ :class:`~unittest.TestCase`
test classes that are named with a ``Test`` prefix, in the test files.

Testing Flask Applications
==========================

Data Lab services are written using the
`Flask <https://flask.palletsprojects.com/>`_ microframework.  See the
discussion of `Flask testing
<https://flask.palletsprojects.com/en/latest/testing/>`_ for more
information on how to use ``pytest`` with these applications.
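As a rough illustration of the approach (the application factory
``create_app`` and the ``/health`` route below are assumptions for the
sketch, not part of the Data Lab code base), a Flask application can be
exercised through its built-in test client from a normal
:class:`unittest.TestCase`:

.. code-block:: python

   import unittest

   from flask import Flask


   def create_app():
       """Build a small Flask app; a stand-in for a real application factory."""
       app = Flask(__name__)

       @app.route("/health")
       def health():
           # Flask converts a returned dict to a JSON response.
           return {"status": "ok"}

       return app


   class HealthEndpointTestCase(unittest.TestCase):

       def setUp(self):
           # The test client issues requests without starting a server.
           self.client = create_app().test_client()

       def testHealth(self):
           response = self.client.get("/health")
           self.assertEqual(response.status_code, 200)
           self.assertEqual(response.get_json()["status"], "ok")


   if __name__ == "__main__":
       unittest.main()

Because the test class follows the :mod:`unittest` conventions above, this
file can be run either directly with :command:`python` or via
:command:`pytest`.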
Common Issues
=============

This section describes some common problems that are encountered when using
`pytest`_.

Testing global state
--------------------

`pytest`_ can run tests from more than one file in a single invocation, and
this can be used to verify that there is no state contaminating later tests.
To run `pytest`_ use the :command:`pytest` executable:

.. code-block:: shell

   $ pytest

to run all files in the ``tests`` directory named ``test_*.py``.  To ensure
that the order of test execution does not matter, it is sometimes useful to
run the tests in reverse order by listing the test files manually:

.. code-block:: shell

   $ pytest `ls -r tests/test_*.py`

.. note::

   `pytest`_ plugins are usually all enabled by default.

Test Skipping and Expected Failures
-----------------------------------

When writing tests it is important that tests are skipped using the proper
:mod:`unittest` mechanisms rather than returning from the test early.
:mod:`unittest` supports skipping of individual tests and entire classes
using decorators or skip exceptions.  It is also possible to indicate that a
particular test is expected to fail, and it will be reported as an error if
the test unexpectedly passes.  Expected failures can be used to write test
code that triggers a reported bug before the fix to the bug has been
implemented, without causing the continuous integration system to fail.

One of the primary advantages of using a modern test runner such as
`pytest`_ is that it is very easy to generate machine-readable
pass/fail/skip/xfail statistics to see how the system is evolving over time,
and it is also easy to enable code coverage.  Jenkins now provides test
result information.

.. _testing-flake8:

Enabling additional Pytest options: flake8
==========================================

As described in :ref:`style-guide-py-flake8`, Python modules can be
configured using the :file:`setup.cfg` file.
This configuration is supported by `pytest`_ and can be used to enable
additional testing or tuning on a per-package basis.  `pytest`_ uses the
``[tool:pytest]`` block in the configuration file.

To enable automatic :command:`flake8` testing as part of the normal test
execution, the following can be added to the :file:`setup.cfg` file:

.. code-block:: ini

   [tool:pytest]
   addopts = --flake8
   flake8-ignore = E133 E211 E221 E223 E226 E228 N802 N803 N806 W504

The ``addopts`` parameter adds additional command-line options to the
:command:`pytest` command when it is run from the command line.

A wrinkle with the configuration of the ``pytest-flake8`` plugin is that it
inherits the ``max-line-length`` and ``exclude`` settings from the
``[flake8]`` section of :file:`setup.cfg`, but you are required to
explicitly list the codes to ignore when running within `pytest`_ by using
the ``flake8-ignore`` parameter.  One advantage of this approach is that you
can ignore error codes from specific files such that the unit tests will
pass, but running :command:`flake8` from the command line will remind you
there is an outstanding issue.  This feature should be used sparingly, but
it can be useful when you wish to enable code linting for the bulk of the
project but have some issues preventing full compliance.

With this configuration each Python file tested by :command:`pytest` will
have :command:`flake8` run on it.

Using a shared base class
=========================

For some tests it is helpful to provide a base class and then share it
amongst multiple test classes that are configured with different attributes.
If this is required, be careful not to have helper functions prefixed with
``test``.  Do not name the base class with a ``Test`` prefix, and ensure it
does not inherit from :class:`~unittest.TestCase`; if you do, `pytest`_ will
attempt to find tests inside it and will issue a warning if none can be
found.
In :mod:`unittest` this could be dealt with by creating a test suite that
only includes the classes to be tested, omitting the base class, but that
approach does not work in a `pytest`_ environment.  Consider the following
test code:

.. literalinclude:: examples/test_baseclass.py
   :language: python

Here ``ThisIsTest1`` inherits from the helper class and
:class:`unittest.TestCase`, and `pytest`_ runs its single test without
attempting to run any tests in ``BaseClass``:

.. code-block:: text

   $ pytest -v python/examples/test_baseclass.py
   ======================================= test session starts ========================================
   platform darwin -- Python 3.4.3, pytest-3.2.1, py-1.4.30, pluggy-0.3.1 -- /usr/local/bin/python3.4
   cachedir: python/examples/.cache
   rootdir: python/examples, inifile:
   collected 1 items

   python/examples/test_baseclass.py::ThisIsTest1::testParam PASSED

   ===================================== 1 passed in 0.02 seconds =====================================
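The shared-base-class pattern can be sketched roughly as follows (this is
not the actual ``examples/test_baseclass.py`` file; the ``param`` and
``expected`` attributes are assumptions for illustration):

.. code-block:: python

   import unittest


   class BaseClass:
       """Shared test logic.

       Not named with a ``Test`` prefix and does not inherit from
       unittest.TestCase, so pytest will not try to collect it directly.
       """

       param = None      # subclasses override these attributes
       expected = None

       def testParam(self):
           # Runs only in subclasses that also inherit from TestCase.
           self.assertEqual(self.param, self.expected)


   class ThisIsTest1(BaseClass, unittest.TestCase):
       """Concrete test case configured with specific attributes."""

       param = 2
       expected = 2


   if __name__ == "__main__":
       unittest.main()

Additional concrete test classes can be defined in the same way, each
supplying its own attribute values, and each will be collected and run
independently.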