Testing
This is a collection of random tips about testing. Most of the test configurations you will use are already defined in the Taskfile.
Warning
If you find yourself doing any testing that doesn't have a task entry for it already, let's talk!
Unit Tests
We run unit tests with pytest; the standard entry point is:
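```
task test
```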
The task test helper automatically creates a local Postgres instance for testing and builds a testing data warehouse in the "dwh" database from the testing_dwh.csv.zst file.
testing_sheet.csv is the corresponding spreadsheet that simulates a typical table configuration for the participant type data above.
Various tests are also run as part of our GitHub Actions test workflow.
conftest.py defines fixtures used by many of the tests.
Google Integration Tests
Some of our pytests are marked 'integration'. These usually run only in GHA, but you can run them locally by setting GOOGLE_APPLICATION_CREDENTIALS and selecting the marked tests.
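A sketch of a local invocation, assuming the 'integration' mark is selected via pytest's -m flag (the credentials path is illustrative, borrowed from the BigQuery example below):

```
GOOGLE_APPLICATION_CREDENTIALS=~/.config/gspread/service_account.json \
  uv run pytest -m integration
```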
BigQuery Integration Tests
Method 1: Trigger via GitHub CLI
- Push the branch to GitHub (e.g. as a draft PR)
- Use the GitHub CLI to trigger the BigQuery integration tests on a specific branch, as sketched below:
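A sketch of the dispatch command; the workflow name tests and the input name run-bq-integration come from the notes below, while the branch name and input value are placeholders:

```
gh workflow run tests --ref your-branch -f run-bq-integration=true
```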
Here, tests refers to the name of the workflow containing the integration tests, and run-bq-integration refers to the name of the workflow_dispatch input that determines whether the BigQuery integration tests are run. The branch must already be pushed to GitHub.
Method 2: Trigger via GitHub UI
- Push the branch to GitHub (e.g. as a draft PR)
- Go to the Testing workflow
- Click "Run workflow"
- Select the PR's branch name
- Tick the "bq integration" box.
You can also trigger the BigQuery integration tests to run in GHA by putting run:bqintegration in a PR comment.
Showing additional test output
Use pytest -rA to print out all stdout from your tests, or -rx for just the failing ones. (See the pytest docs for more info.)
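For example, assuming pytest is invoked via uv as in this repo's other examples:

```
# -rA reports on all tests; swap in -rx to report on just the failing ones.
uv run pytest -rA
```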
API Test Scripts
We use a small custom file format called Xurl; see apitest.strata.xurl for a complete example of how to write an API test script.
test_api.py tests that use testdata/*.xurl data can be automatically updated with the actual server results by prefixing your pytest run with the environment variable UPDATE_API_TESTS=1, or just run task update-api-tests.
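For example (the uv invocation is assumed, matching the other examples here):

```
# Re-record expected API responses with the actual server results.
UPDATE_API_TESTS=1 uv run pytest
# Or use the Taskfile shortcut:
task update-api-tests
```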
Loading the testing DWH with a different schema?
Use the CLI command:

```
uv run xngin-cli create-testing-dwh \
  --dsn=$DATABASE_URL \
  --password=$PASSWORD \
  --table-name=test_participant_type \
  --schema-name=alt
```
How can I run the unittests against an arbitrary data warehouse?
Tests that exercise integration with a data warehouse do so via two mechanisms:
- xngin.testing.settings.json defines some static configuration for stable databases.
- The XNGIN_QUERIES_TEST_URI environment variable defines a SQLAlchemy URI that some of the apiserver.dwh tests are run against (see the example after this list).
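A minimal sketch, assuming a Postgres warehouse; the connection details in the URI are illustrative:

```
XNGIN_QUERIES_TEST_URI='postgresql+psycopg://user:pass@localhost:5432/dwh' \
  uv run pytest
```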
By default, task test will run all the tests against Postgres. See the bq-integration GHA job for an example of how to run some of the tests against BigQuery.
Testing environment for BigQuery as the Service Provider
You can create a test dataset on a project you've configured with BigQuery using our xngin-cli tool:

```
export GSHEET_GOOGLE_APPLICATION_CREDENTIALS=~/.config/gspread/service_account.json
xngin-cli create-testing-dwh --dsn 'bigquery://xngin-development-dc/ds'
```
The tool is also used for interacting with the BQ API directly (e.g. bigquery_dataset_set_default_expiration).
These commands use Google's [Application Default Credentials](https://cloud.google.com/docs/authentication/application-default-credentials) process.
Note
The GHA service account has permissions to access the xngin-development-dc.ds dataset.
Note
You do not need the gcloud CLI or related tooling installed.