Testing with Pytest

At AHL, automated testing is a key tool that helps us produce high-quality code. For our Python-based tests we rely heavily on Pytest. In this article we look at why we chose Pytest and how we use it.

Why Pytest?

Here at AHL we are dedicated to automated software testing. In the same spirit of automation that we apply to systematic trading, automated tests ensure that any change to our code is rigorously checked for bugs and regressions in behaviour.

To help us achieve this, we are heavy users of the Pytest testing framework. Its simple function-based syntax and powerful feature set are attractive to new and experienced developers alike. Its modular plugin system and re-usable fixtures have allowed us to grow a very large library of tests with minimal duplication of setup code.

Here are some of our favourite features of Pytest:

Tests Are Just Functions

Instead of constructing boilerplate classes, tests can be expressed as simple functions. Given this short function under test, here’s a comparison between the standard-library unittest implementation and the pytest one.


  ## Contents of analytics.py
  import numpy

  def fastsum(array_like):
      return numpy.sum(array_like, axis=0)
  

UnitTest


  import unittest
  import numpy
  import analytics

  class TestAnalytics(unittest.TestCase):
      def test_fastsum(self):
          self.assertEqual(analytics.fastsum(numpy.array([1, 2, 3])), 6)
  

PyTest


  import numpy
  import analytics

  def test_fastsum():
      assert analytics.fastsum(numpy.array([1, 2, 3])) == 6
  

Smart Assertions

Pytest rewrites the built-in assert statement in your tests so that a failing assertion reports in-depth comparison information about the values being asserted.

Here’s what that looks like for a deliberately failing version of our test.
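
A minimal test_analytics.py consistent with the output below (the expected value of 7 is deliberately wrong, so the assertion fails):


  import analytics

  def test_fastsum():
      assert analytics.fastsum([1, 2, 3]) == 7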


  $ pytest test_analytics.py
  ========================== test session starts ===========================
  platform linux2 -- Python 2.7.11, pytest-2.9.1, py-1.4.31, pluggy-0.3.1
  collected 1 items

  test_analytics.py F

  ================================ FAILURES ================================
  ______________________________ test_fastsum ______________________________

      def test_fastsum():
  >       assert analytics.fastsum([1, 2, 3]) == 7
  E       assert 6 == 7
  E        +  where 6 = analytics.fastsum([1, 2, 3])

  test_analytics.py:4: AssertionError
  =========================  1 failed in 0.01 seconds ======================
  

Data-driven test cases

Often a function under test can have many different types of inputs. It is laborious to write out each permutation of inputs and outputs as separate tests, so pytest has a great feature for creating data-driven parametrized test cases.


  import pytest
  import numpy as np
  import analytics

  @pytest.mark.parametrize('fn_in,expected_result',[
       ([1, 2, 3], 6),                  # Integers
       ([1.1, 2.2, 3.3], 6.6),          # Floats
       (np.array([1, 2, 3]), 6),        # Numpy Arrays
       # ... etc
       ([1.0, np.nan, 3.0], np.nan),    # Nan Handling
  ])
  def test_fastsum(fn_in, expected_result):
      result = analytics.fastsum(fn_in)
      if np.isnan(expected_result):
          assert np.isnan(result)
      else:
          # isclose() avoids a spurious failure from floating-point rounding
          assert np.isclose(result, expected_result)
  

Powerful Re-usable Fixtures

Simple, parametrized test functions and smart assertions are only the start of why we like pytest. The real power comes from its dependency-injection-style test fixtures.

  • A test fixture is an object that is created by the test framework with some initial state and passed into any tests that have requested it by specifying the fixture name as a function argument.
  • Test fixtures have a scope, which determines their lifetime within the test run. The scope can be one of:
    • function: a single test function
    • class: a unittest-style test class
    • module: a single Python test module
    • session: the entire session, from when pytest starts up until it finishes running all the tests
  • Fixtures have setup and teardown code that runs at the start and end of their lifetime.

Here’s a simple example that creates an in-memory SQLite database session and hands it to the test function:


  import pytest
  from sqlalchemy import create_engine
  from sqlalchemy.orm import sessionmaker

  @pytest.fixture
  def db_session():
      engine = create_engine('sqlite:///:memory:', echo=True)
      Session = sessionmaker(bind=engine)
      return Session()

  def test_select(db_session):
      """ the 'db_session' argument here is matched to the name of the
          fixture above
      """
      db_session.execute('select name from user')
      ... 
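
The fixture above has no teardown step. As a sketch of how teardown fits in (using the yield style that also appears later in this article), the same fixture could close the session and dispose of the engine once each test has finished:


  import pytest
  from sqlalchemy import create_engine
  from sqlalchemy.orm import sessionmaker

  @pytest.yield_fixture
  def db_session():
      # --- Setup ------
      engine = create_engine('sqlite:///:memory:', echo=True)
      session = sessionmaker(bind=engine)()
      yield session
      # --- Teardown ---
      session.close()
      engine.dispose()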
  

Fixtures Can Depend On Other Fixtures

In the same way that test functions depend on fixtures by specifying them by name as arguments, fixtures can depend on each other by specifying other fixture names in their own arguments. These dependencies form a graph of objects that the test runner creates and tears down in the correct dependency order.

One can quickly see that this encourages libraries of re-usable test code that can be assembled to run complex integration tests with many moving parts. Here’s an example:


  import pytest

  import backend
  import frontend

  @pytest.yield_fixture(scope='session')
  def backend_server():
      # --- Setup ------
      server = backend.Server()
      server.start()
      yield server
      # --- Teardown ---
      server.stop()

  @pytest.fixture
  def frontend_client(backend_server):
      client = frontend.Client(host=backend_server.host, port=backend_server.port)
      client.connect()
      return client


  def test_client(frontend_client):
      frontend_client.login('admin', 'password')
      ...
  
  • In this example we have a session-scoped server fixture that is only created once, and a frontend client that is created once per test function.
  |   Session Scope    |            Function Scope            |
  |--------------------|--------------------------------------|
  |  backend_server <--|-- frontend_client <----- test_client |
  |                    |                                      |

  • The frontend client depends on the server fixture. This both ensures that the server is running at the time the client is needed, and also means the client can pull configuration details like host and port numbers out of the server class.
  • The test itself only asks for the client; pytest will ensure the server is started in time for the client to connect.

There is a gotcha here: fixture dependencies must obey scope precedence. A session-scoped fixture cannot depend on a function-scoped fixture, because the function-scoped fixture is created and destroyed once per test and so would not survive for the lifetime of the session-scoped one; pytest rejects this combination with a scope-mismatch error.
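
As a minimal sketch of the gotcha (the fixture names here are made up for illustration), pytest refuses to set up any test that requests such a fixture:


  import pytest

  @pytest.fixture
  def temp_config():
      # function-scoped (the default)
      return {'host': 'localhost'}

  @pytest.fixture(scope='session')
  def shared_server(temp_config):
      # session-scoped fixture depending on a function-scoped one: not allowed
      return temp_config['host']

  def test_scope_mismatch(shared_server):
      # pytest errors during setup with a ScopeMismatch, along the lines of:
      # "You tried to access the 'function' scoped fixture 'temp_config'
      #  with a 'session' scoped request object"
      pass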

Good Practice - Distribute Fixtures With The Code

As codebases grow, it makes sense for the team that maintains a project to distribute that project’s test fixtures alongside its normal code whenever the project is used by another team. This way the fixtures remain ‘first-class citizens’ and changes to them are propagated to the tests of other projects.

An example layout for a PnL service could be:

  pnl_service/
      __init__.py
      server.py
      client.py
      fixtures.py   # this module contains a fixture called 'pnl_client'
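
Here’s a sketch of what pnl_service/fixtures.py might contain. Only the pnl_client fixture name comes from the layout above; the Server and Client classes and the pnl_server fixture are illustrative assumptions, modelled on the backend/frontend example earlier:


  ## Contents of pnl_service/fixtures.py
  import pytest

  from pnl_service import client, server

  @pytest.yield_fixture(scope='session')
  def pnl_server():
      # hypothetical server class, started once per test session
      svr = server.Server()
      svr.start()
      yield svr
      svr.stop()

  @pytest.fixture
  def pnl_client(pnl_server):
      # hypothetical client class, pointed at the session-scoped server
      return client.Client(host=pnl_server.host, port=pnl_server.port)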
  

Another project whose tests use the PnL service can then pull in the fixtures maintained by the PnL team:


  import attribution

  # This tells pytest that we want to use the fixtures from
  # the other project
  pytest_plugins = 'pnl_service.fixtures'

  def test_attribution(pnl_client):
      pnl_data = pnl_client.get_pnl(product='TEST_PRODUCT')
      ftl_data = attribution.get_market_attribution('FTL')
      assert ftl_data == [ .... ]
  


Open Sourced: Here’s Some We Created Earlier

There are lots of plugins for Pytest that provide fixtures and other functionality, including many built-in ones.
Many of the fixtures we’ve written over the years are generally useful and as such have been open-sourced. They are all available in the GitHub repository pytest-plugins. Here’s the list of plugins:

  | Plugin                     | Description                                                                 |
  |----------------------------|-----------------------------------------------------------------------------|
  | pytest-server-fixtures     | Extensible server-running framework with a suite of well-known databases and webservices included: mongodb, redis, rethinkdb, Jenkins, Apache httpd, Xvfb |
  | pytest-shutil              | Unix shell and environment management tools                                 |
  | pytest-profiling           | Profiling plugin with tabular heat graph output and gprof support for C extensions |
  | pytest-devpi-server        | DevPI server-running fixture for testing package management code            |
  | pytest-pyramid-server      | Pyramid server fixture for running up servers in integration tests          |
  | pytest-webdriver           | Selenium webdriver fixture for testing web applications                     |
  | pytest-virtualenv          | Create and tear down virtual environments; run tests and commands in the scope of the virtualenv |
  | pytest-qt-app              | PyQt application fixture                                                    |
  | pytest-listener            | TCP listener/receiver for testing remote systems                            |
  | pytest-git                 | Git repository fixture                                                      |
  | pytest-svn                 | SVN repository fixture                                                      |
  | pytest-fixture-config      | Configuration tools for py.test fixtures                                    |
  | pytest-verbose-parametrize | Makes py.test’s parametrize output a little more verbose                    |

How to get them

All of these plugins are available on PyPI and can be installed individually using pip:


    $ pip install pytest-server-fixtures
  


Further Reading

If you love testing your code as much as we do, go ahead and install Pytest and get testing! It’s easy to get started, and satisfying to know your software is in good hands when the tests all pass. For further reading, the pytest documentation and the pytest-plugins repository on GitHub are good places to start.
