nbsapi_verify is a standalone tool designed to verify that your API implementation conforms to the https://nbsapi.org OpenAPI specification, currently at version 2.0.0.
Using pipx:
```bash
pipx run nbsapi_verify --help
```
Using uvx:
```bash
uvx nbsapi_verify --help
```
If you would prefer the tool to be installed on your PATH, you can run `pipx install nbsapi_verify` or `uv tool install nbsapi_verify`. You can then run `nbsapi_verify` without a prefix.
You can also install the package using your preferred Python package manager:
```bash
pip install nbsapi_verify
uv add nbsapi_verify
poetry add nbsapi_verify
```
After installation, you can run the tool using the installed script:
```bash
nbsapi_verify --help
```
nbsapi_verify requires a small amount of configuration:
- First, generate a verification config. This requires you to specify:
- the host the API is running on
- a valid username
- the password for that username
- the ID of that user
- a path for the verification config to be stored (optional: it defaults to the current working directory)
- the test type to be run:
`all`, `auth`, or `user`: the `auth` tests will exercise the write API functions, while the `user` tests will exercise the read API functions (defaults to `all`).
To test your API during local development, the command might look like this:
```bash
nbsapi_verify --generate \
    --host http://localhost:8000 \
    --test-type all \
    --username testuser \
    --password testpass \
    --project 1 \
    --solution 1 \
    --impact 1 \
    --measure 1 \
    --config-dir ~
```
If the command completes successfully, you can run the verification tool:
```bash
nbsapi_verify --config-dir ~
```
You can also generate JSON and HTML reports of the test results:
```bash
# Generate default JSON report (nbsapi_verify_report.json)
nbsapi_verify --config-dir ~ --json-output

# Generate default HTML report (nbsapi_verify_report.html)
nbsapi_verify --config-dir ~ --html-output

# Generate both reports
nbsapi_verify --config-dir ~ --json-output --html-output
```
When all tests pass, your API implementation is conformant to the NbsAPI specification!
This document describes the test data requirements for running API conformance tests against the NBS API.
The conformance tests validate that the API implementation matches the OpenAPI specification. However, since these tests may run against production databases where data creation/deletion is not appropriate, they rely on existing test data.
For all conformance tests to pass, the following data must exist in the database:
- Project ID: `1`
  - Must exist for project-related endpoint tests
  - Required for: GET, PATCH, DELETE `/v2/api/projects/1`
  - Required for: GET `/v2/api/projects/1/export`
  - Required for: GET `/v2/api/projects/1/export/deltares`
  - Required for: POST/DELETE `/v2/api/projects/1/solutions/1`
- Solution ID: `1`
  - Must exist for solution-related tests
  - Required for: GET, PATCH `/v2/api/solutions/solutions/1`
  - Required for: GET `/v2/api/solutions/solutions/1/geojson`
  - Required for: project-solution relationship tests
- Impact ID: `1`
  - Must exist for impact tests
  - Required for: GET `/v2/api/impacts/impacts/1`
  - Required for: GET `/v2/api/impacts/solutions/1/impacts`
- Measure ID: `1`
  - Must exist for measure type tests
  - Required for: GET, PATCH `/v2/measure_types/1`
  - Note: DELETE tests are intentionally excluded from conformance tests
- Username: `testuser` with Password: `testpass`
  - Required for authenticated endpoints
  - Used for obtaining JWT tokens for protected endpoints
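Before running the full suite, you can sanity-check that this data is reachable over HTTP. The sketch below is illustrative: it assumes the `/auth/token` endpoint and form-encoded credentials shown in the troubleshooting steps later in this document, and that the token response contains an `access_token` field.
```bash
#!/usr/bin/env bash
# Illustrative pre-flight check: confirm the required test records are
# reachable. The /auth/token endpoint and the access_token field are
# assumptions based on the troubleshooting examples in this document.
HOST=http://localhost:8000
TOKEN=$(curl -s -X POST "$HOST/auth/token" \
  -d "username=testuser&password=testpass" \
  | python -c 'import json,sys; print(json.load(sys.stdin)["access_token"])')

for path in /v2/api/projects/1 \
            /v2/api/solutions/solutions/1 \
            /v2/api/impacts/impacts/1 \
            /v2/measure_types/1; do
  code=$(curl -s -o /dev/null -w '%{http_code}' \
    -H "Authorization: Bearer $TOKEN" "$HOST$path")
  echo "$code $path"
done
```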
The test runner accepts these IDs as configuration options:
```bash
# Generate configuration with test data IDs
nbsapi_verify --generate \
    --host http://localhost:8000 \
    --username testuser \
    --password testpass \
    --project 1 \
    --solution 1 \
    --impact 1 \
    --measure 1
```
Important: the conformance tests use the `project_id` query parameter to create projects with predictable IDs for subsequent tests to reference. No special server configuration is required.
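For illustration, a creation request using that parameter might look like the following; the request body fields here are hypothetical, so consult the OpenAPI specification for the actual schema:
```bash
# Hypothetical example: create a project with a predictable ID via the
# project_id query parameter. Body fields are illustrative only.
curl -X POST "http://localhost:8000/v2/api/projects?project_id=1" \
  -H "Authorization: Bearer <token>" \
  -H "Content-Type: application/json" \
  -d '{"title": "Test Project", "description": "Created for conformance tests"}'
```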
For development environments, you can create the required test data:
```sql
-- Create test user (adjust based on your auth system)
INSERT INTO users (username, password_hash) VALUES ('testuser', '<hashed_password>');

-- Create test measure type
INSERT INTO measure_types (id, name, description) VALUES ('1', 'Test Measure', 'Test measure type');

-- Create test solution
INSERT INTO naturebasedsolution (id, name, definition, measure_id) VALUES
    (1, 'Test Solution', 'Test nature-based solution', '1');

-- Create test project
INSERT INTO projects (id, title, description) VALUES
    ('1', 'Test Project', 'Test project for conformance tests');

-- Create test impact
INSERT INTO impacts (id, magnitude, solution_id) VALUES
    (1, 100.0, 1);
```
For production environments:
- Identify existing data that can serve as test subjects
- Update the configuration to use those IDs
- Ensure the test user has appropriate permissions but limited scope
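To identify candidate records, queries along these lines can help; the table and column names follow the development SQL above and may differ in your implementation:
```sql
-- Find existing records that can serve as test subjects (table and
-- column names follow the development examples above).
SELECT id, title FROM projects ORDER BY id LIMIT 5;
SELECT id, name FROM naturebasedsolution ORDER BY id LIMIT 5;
SELECT id, name FROM measure_types ORDER BY id LIMIT 5;
```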
The following types of tests are intentionally excluded from conformance testing:
DELETE endpoints are skipped because:
- They permanently destroy data
- Running against production databases would be destructive
- They create dependencies between test execution order
Excluded DELETE endpoints:
- `DELETE /v2/measure_types/{measure_id}`
- `DELETE /v2/api/projects/{project_id}`
- `DELETE /v2/api/projects/{project_id}/solutions/{solution_id}`
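If you do want to exercise one of these endpoints, do so manually in a development environment; a request might look like this (the auth header follows the pattern used elsewhere in this document):
```bash
# Development only: manually exercise an excluded DELETE endpoint.
curl -X DELETE "http://localhost:8000/v2/measure_types/1" \
  -H "Authorization: Bearer <token>"
```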
Tests that create new data may fail in production environments due to:
- Unique constraints (if test data already exists)
- Permission restrictions
- Database triggers or validation rules
If you see 404 errors such as "Project not found" or "Solution not found":
- Verify the required test data exists in the database
- Check that the IDs in your configuration match existing data
- Ensure the test user has read access to the data
If you see 409 errors for creation endpoints:
- This is normal behavior when test data already exists
- The tests expect either 200 (success) or 409 (conflict) responses
- No action needed unless you're getting different error codes
If you see 401/403 errors:
- Verify the test user credentials are correct
- Check that the user has appropriate permissions
- Ensure the JWT token is being generated correctly
- Use Dedicated Test Data: Create specific test records rather than using production data
- Document Test IDs: Keep track of which IDs are reserved for testing
- Isolate Test Data: Use specific naming conventions (e.g., "Test Project") to identify test records
- Regular Cleanup: In development environments, periodically clean up accumulated test data
- Monitor Production: When running against production, monitor for any unintended side effects
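For the isolation and cleanup points above, the naming convention makes test records easy to locate; a sketch using the tables from the development SQL (adjust names to your schema):
```sql
-- Locate test records by naming convention for periodic cleanup
-- (development environments only; table names follow the examples above).
SELECT id, title FROM projects WHERE title LIKE 'Test%';
SELECT id, name FROM naturebasedsolution WHERE name LIKE 'Test%';
```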
- "Project not found" errors: The most common issue is missing project ID 1
- Authentication failures: Usually incorrect test user credentials
- Measure type conflicts: Trying to create measure types with existing IDs
To debug, work through the following checks:

- Check the database for the required test data:

  ```sql
  SELECT id, title FROM projects WHERE id = '1';
  SELECT id, name FROM naturebasedsolution WHERE id = 1;
  SELECT id FROM impacts WHERE id = 1;
  SELECT id, name FROM measure_types WHERE id = '1';
  ```

- Verify that the test user can authenticate:

  ```bash
  curl -X POST http://localhost:8000/auth/token \
      -d "username=testuser&password=testpass"
  ```

- Test individual endpoints manually:

  ```bash
  curl -H "Authorization: Bearer <token>" \
      http://localhost:8000/v2/api/projects/1
  ```
When adding new conformance tests, document any additional data requirements here. Consider:
- What IDs/entities the test requires
- Whether the test is destructive
- If it should be excluded from production testing
- What configuration options might be needed