Tooling for converting subunit streams into a SQL DB

subunit2SQL README

subunit2SQL, as its name implies, is a tool for converting subunit streams into data in a SQL database. The motivation is that when multiple distributed test runs are generating subunit output, it is useful to store the results in a unified repository. This is also the motivation behind the testrepository project, which does a good job of centralizing the results from multiple test runs.

However, imagine something like the OpenStack CI system, where the same basic test suite is normally run several hundred times a day. To provide useful introspection on the data from those runs, and to track trends over time, the test results need to be stored in a format that allows for easy querying. A SQL database is a natural fit for this.

subunit2SQL uses alembic migrations to set up a DB schema, which the subunit2sql binary then uses when parsing subunit streams and populating the DB. Additionally, it provides a DB API that can be used to query the stored results to build other tooling.
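
For example, once the schema has been created and some streams have been stored, results can be read back out through the DB API. The snippet below is only a rough sketch: the function names get_session and get_all_runs, the session keyword, and the run attributes passes and fails are assumptions about the DB API and may not match the actual interface. It also assumes the database connection has already been configured through the oslo.db config options::

    # Hypothetical sketch: print a simple pass/fail summary for each stored run.
    # Assumes the oslo.db connection options are already set, and that the DB
    # API exposes get_session() and get_all_runs() as used here.
    from subunit2sql.db import api as db_api

    session = db_api.get_session()
    for run in db_api.get_all_runs(session=session):
        print('run %s: %s passes, %s failures' % (run.id, run.passes, run.fails))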