This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
OcotilloAPI (also known as NMSampleLocations) is a FastAPI-based geospatial sample data management system for the New Mexico Bureau of Geology and Mineral Resources. It uses PostgreSQL with PostGIS for storing and querying spatial data related to sample locations, field observations, water chemistry, and more.
This project is migrating data from the legacy AMPAPI system (SQL Server, NM_Aquifer schema) to a new PostgreSQL + PostGIS stack. Transfer scripts in transfers/ handle data conversion from legacy tables.
```bash
# Install dependencies (requires uv package manager)
uv venv
source .venv/bin/activate  # On Mac/Linux
uv sync --locked

# Set up pre-commit hooks
pre-commit install

# Configure environment
cp .env.example .env
# Edit .env with database credentials
```

```bash
# Run migrations
alembic upgrade head
```
```bash
# Create a new migration
alembic revision --autogenerate -m "description"

# Rollback one migration
alembic downgrade -1
```

```bash
# Local development (requires PostgreSQL + PostGIS installed)
uvicorn main:app --reload
```
```bash
# Docker (includes database)
docker compose up --build
```

```bash
# Run all tests
uv run pytest

# Run specific test file
uv run pytest tests/test_sample.py

# Run specific test function
uv run pytest tests/test_sample.py::test_add_sample

# Run with coverage
uv run pytest --cov
```
```bash
# Set up test database (PostgreSQL with PostGIS required)
createdb -h localhost -U <user> ocotilloapi_test
psql -h localhost -U <user> -d ocotilloapi_test -c "CREATE EXTENSION IF NOT EXISTS postgis;"
```

Test Database: Tests automatically use the `ocotilloapi_test` database. The test framework sets `POSTGRES_DB=ocotilloapi_test` in `tests/__init__.py` before importing the database engine.
Environment Variables: Tests read from the `.env` file but override the database name:

```bash
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_USER=<username>
POSTGRES_PASSWORD=<password>
# POSTGRES_DB in .env is ignored during tests - always uses ocotilloapi_test
```

```bash
# Transfer data from legacy AMPAPI (NM_Aquifer) to new schema
python -m transfers.transfer
```

The system follows a hierarchical structure for field data collection:
```
Location (geographic point)
└── Thing (monitoring point at location: well, spring, etc.)
    └── FieldEvent (visit to a thing on a date)
        └── FieldActivity (specific activity during event: water level, chemistry, etc.)
            └── Sample (physical sample collected during activity)
                └── Observation (measurement/result from sample: pH, groundwater level, etc.)
```
Key Relationships:
- Each level inherits context from parent (location → thing → event → activity → sample → observation)
- `Thing` has geometry (PostGIS Point, WGS84/SRID 4326) and attributes (depth, construction details)
- `FieldEvent` links participants (contacts) to field visits
- `Sample` can have depth intervals (`depth_top`, `depth_bottom`) and QC types
- `Observation` links to a `Parameter` (from the lexicon) and stores a value/units
```
├── alembic/      # Database migrations
├── api/          # Route handlers (one file per resource)
├── cli/          # Ocotillo CLI commands (oco)
├── core/         # Application configuration
│   ├── app.py            # FastAPI app initialization
│   ├── dependencies.py   # Dependency injection (auth, DB session)
│   └── permissions.py    # Authentication/authorization logic
├── db/           # SQLAlchemy models (one file per table/resource)
│   ├── engine.py         # Database connection configuration
│   └── ...
├── schemas/      # Pydantic schemas (validation, serialization)
├── services/     # Business logic and database interactions
├── tests/        # Pytest test suite
│   ├── conftest.py       # Shared fixtures (test data setup)
│   └── __init__.py       # Sets test database (ocotilloapi_test)
├── transfers/    # Data migration scripts from AMPAPI (SQL Server)
│   ├── transfer.py           # Main transfer orchestrator
│   ├── well_transfer.py      # Well/thing data migration
│   └── ...
└── main.py       # Application entry point
```
The system uses Authentik for OAuth2 authentication with role-based access control:
Permission Levels (defined in core/dependencies.py):
- Viewer: Read-only access to all public entities
- Editor: Can modify existing records (includes Viewer permissions)
- Admin: Can create new records (includes Editor + Viewer permissions)
AMP-Specific Roles: AMPAdmin, AMPEditor, AMPViewer for legacy AMPAPI integration
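The nesting of permission levels can be modeled as simple set containment. This is an illustrative sketch, not the actual logic in `core/dependencies.py` or `core/permissions.py`:

```python
# Illustrative sketch of the role hierarchy (not the actual implementation).
# Each higher role includes all permissions of the roles below it.
ROLE_PERMISSIONS = {
    "Viewer": {"read"},
    "Editor": {"read", "update"},           # includes Viewer permissions
    "Admin": {"read", "update", "create"},  # includes Editor + Viewer
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the given role grants the requested action."""
    return action in ROLE_PERMISSIONS.get(role, set())
```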
The application supports two database modes (configured via DB_DRIVER in .env):
- Google Cloud SQL (`DB_DRIVER=cloudsql`): Uses the Cloud SQL Python Connector
- Standard PostgreSQL (default): Direct pg8000/asyncpg connection

Connection String Format (standard mode):

```
postgresql+pg8000://{user}:{password}@{host}:{port}/{database}
```
Important: db/engine.py uses load_dotenv(override=False) so that environment variables set before import (e.g., by the test framework) are preserved.
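The effect of `override=False` can be mimicked with `os.environ.setdefault` — a stdlib illustration of the semantics, not the actual `db/engine.py` code:

```python
import os

# Simulate the test framework setting a variable before the engine imports.
os.environ["POSTGRES_DB"] = "ocotilloapi_test"

# load_dotenv(override=False) behaves like setdefault: values from .env
# only fill in variables not already set in the process environment.
dotenv_values = {"POSTGRES_DB": "ocotilloapi"}  # pretend this came from .env
for key, value in dotenv_values.items():
    os.environ.setdefault(key, value)

print(os.environ["POSTGRES_DB"])  # prints "ocotilloapi_test": the pre-set value wins
```

This is why `tests/__init__.py` can redirect the engine to the test database simply by setting `POSTGRES_DB` before the import happens.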
- Coordinate System: WGS84 (SRID 4326) for all geometries
- Geometry Types: PostGIS `Point` for thing locations
- Legacy Migration: Transfer scripts convert from UTM (SRID 26913) to WGS84
- GeoAlchemy2: Used for SQLAlchemy ↔ PostGIS integration
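GeoAlchemy2 geometry columns accept EWKT text on insert, so a point can be written as a string. A minimal sketch — the helper name is hypothetical:

```python
def to_ewkt_point(longitude: float, latitude: float, srid: int = 4326) -> str:
    """Build an EWKT string for a PostGIS Point (hypothetical helper).

    GeoAlchemy2 Geometry columns accept EWKT text like this on insert.
    Note: WKT orders coordinates as POINT(lon lat), not (lat lon).
    """
    return f"SRID={srid};POINT({longitude} {latitude})"

# Example: a point near Socorro, NM in WGS84
print(to_ewkt_point(-106.89, 34.06))
# SRID=4326;POINT(-106.89 34.06)
```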
All custom exceptions should use PydanticStyleException for consistent API error responses:
```python
from services.exceptions_helper import PydanticStyleException

raise PydanticStyleException(
    status_code=409,
    detail=[{
        "loc": ["body", "sample_name"],
        "msg": "Sample with sample_name X already exists.",
        "type": "value_error",
        "input": {"sample_name": "X"}
    }]
)
```

Validation Strategy:
- 422 errors: Pydantic validation on incoming request data (automatic)
- 409 errors: Database constraint violations (manual checks in endpoints)
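The 409 pattern typically amounts to a pre-insert existence check. A self-contained sketch — the exception class below is a stand-in for the real `PydanticStyleException`, and the in-memory set replaces an actual uniqueness query:

```python
# Stand-in for services.exceptions_helper.PydanticStyleException,
# defined here only so this sketch runs without the project installed.
class PydanticStyleException(Exception):
    def __init__(self, status_code: int, detail: list[dict]):
        super().__init__(detail)
        self.status_code = status_code
        self.detail = detail

# In-memory stand-in for a database uniqueness query.
existing_sample_names = {"S-001", "S-002"}

def check_sample_name_available(sample_name: str) -> None:
    """Raise a 409-style error if the name is already taken (illustrative)."""
    if sample_name in existing_sample_names:
        raise PydanticStyleException(
            status_code=409,
            detail=[{
                "loc": ["body", "sample_name"],
                "msg": f"Sample with sample_name {sample_name} already exists.",
                "type": "value_error",
                "input": {"sample_name": sample_name},
            }],
        )
```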
When modifying data models:

1. Update DB Model: Revise the model in the `db/` directory
2. Update Schemas: Revise the Pydantic schemas in `schemas/`
   - Add field validators using `@field_validator` or `@model_validator`
   - Input validation (422 errors) → Pydantic validators
   - Database validation (409 errors) → Manual checks in the endpoint
3. Create Migration: `alembic revision --autogenerate -m "description"`
4. Update Tests:
   - Update fixtures in `tests/conftest.py`
   - Update POST test payloads and assertions
   - Update PATCH test payloads and assertions
   - Update GET test assertions
   - Add validation tests if needed
5. Update Transfer Scripts: Revise field mappings in `transfers/` (if migrating legacy data)

Schema Conventions:
- `Create` schemas: `<type>` for non-nullable fields, `<type> | None = None` for nullable
- `Update` schemas: All fields optional with `None` defaults
- `Response` schemas: `<type>` for non-nullable fields, `<type> | None` for nullable
- Test Database: Uses `ocotilloapi_test` (set automatically by `tests/__init__.py`)
- Test Client: `TestClient` from FastAPI (`tests/__init__.py`)
- Authentication Override: Tests bypass Authentik auth using the `override_authentication()` fixture
- Fixtures: Session-scoped fixtures in `conftest.py` create test data
- Cleanup Helpers:
  - `cleanup_post_test(model, id)`: Delete records created by POST tests
  - `cleanup_patch_test(model, payload, original_data)`: Roll back PATCH test changes
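The cleanup pattern can be sketched against an in-memory store. This is illustrative only — the real helpers operate on SQLAlchemy models and take slightly different arguments:

```python
# In-memory stand-in for a database table, keyed by record id.
fake_table: dict[int, dict] = {}

def cleanup_post_test(table: dict[int, dict], record_id: int) -> None:
    """Delete a record created by a POST test (illustrative version)."""
    table.pop(record_id, None)

def cleanup_patch_test(table: dict[int, dict], record_id: int,
                       original_data: dict) -> None:
    """Restore a record's original fields after a PATCH test (illustrative)."""
    table[record_id] = dict(original_data)
```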
GitHub Actions workflows (.github/workflows/):
- tests.yml: Runs pytest with PostGIS Docker service container
- format_code.yml: Code formatting checks
- release.yml: Sentry release tracking
Source: AMPAPI (SQL Server, NM_Aquifer schema)
Target: OcotilloAPI (PostgreSQL + PostGIS)
Transfer Scripts (`transfers/`):
- `well_transfer.py`: Migrates well/thing data with coordinate transformation
- `waterlevels_transfer.py`: Migrates groundwater level observations
- `contact_transfer.py`: Migrates contact records
- `link_ids_transfer.py`: Migrates legacy ID mappings
- API Docs: `http://localhost:8000/docs` (Swagger UI) or `/redoc` (ReDoc)
- OGC API: `http://localhost:8000/ogcapi` for OGC API - Features endpoints
- CLI: `oco --help` for Ocotillo CLI commands
- Sentry: Error tracking and performance monitoring integrated