This is a backend server built with Express.js and TypeScript, providing CRUD operations for items stored in a SQLite/PostgreSQL database via TypeORM. The application includes input validation, security middleware, logging, authentication, rate limiting, and comprehensive testing.
It is a complete, production-ready Node.js backend API that can be developed, tested, and deployed independently. It contains everything needed for a modern REST API service:
- ✅ Independent Dependencies: Own `package.json` with Express, TypeScript, TypeORM
- ✅ Complete Source Code: API routes, middleware, database models, utilities
- ✅ Comprehensive Testing: 184 tests including unit, integration, E2E, and load testing
- ✅ Database Setup: SQLite/PostgreSQL with migrations and seeding
- ✅ CI/CD Pipeline: GitHub Actions with automated testing and deployment
- ✅ Container Ready: Docker multi-stage builds with service orchestration
- ✅ Cloud Ready: Kubernetes manifests for production deployment
- ✅ Documentation: Complete API documentation and setup guide
- ✅ API Versioning: RESTful API with v1 endpoints (`/api/v1/*`)
- ✅ User Authentication: JWT-based authentication with bcrypt password hashing, refresh tokens, and token blacklisting
- ✅ User Registration: Register new users with secure password storage
- ✅ CRUD Operations: Create, Read, Update, Delete items with proper validation
- ✅ Soft Delete: Items are marked as deleted instead of being removed from database
- ✅ Data Ownership: Users can only access/modify their own items
- ✅ Input Validation: Comprehensive validation using Zod schemas
- ✅ API Throttling: Redis-backed rate limiting with distributed throttling support
- ✅ ORM Integration: TypeORM for database operations with support for SQLite and PostgreSQL
- ✅ Database Migrations: Automatic schema synchronization
- ✅ Security: Helmet, CORS, global and user-specific rate limiting, and secure headers
- ✅ Logging: Structured logging with Winston and request tracing middleware
- ✅ Health Checks: Enhanced health endpoint with system metrics
- ✅ API Documentation: Interactive Swagger/OpenAPI documentation
- ✅ Docker Support: Containerized deployment with multi-stage build
- ✅ CI/CD: GitHub Actions pipeline for automated testing and building
- ✅ Linting: ESLint for code quality
- ✅ Testing: Jest & Supertest with comprehensive test coverage
- ✅ Environment Configuration: Flexible configuration management
- ✅ Compression: Gzip compression for responses
- ✅ Clustering: Multi-core support with Node.js cluster module
- ✅ Monitoring: Prometheus metrics for observability
- ✅ Load Balancing: Nginx configuration for horizontal scaling
- ✅ Kubernetes: Deployment manifests for cloud-native scaling
- ✅ Performance Optimization: Database indexes and query optimization
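The input-validation feature listed above uses Zod in the real codebase; as an illustration only, a hand-rolled check with the same shape (required `name`, optional `description`, per the API docs below) looks like this:

```typescript
// Hand-rolled stand-in for the project's Zod item schema (illustrative only):
// `name` is required and non-empty, `description` is an optional string.
type ItemInput = { name: string; description?: string };

type ValidationResult =
  | { ok: true; data: ItemInput }
  | { ok: false; errors: string[] };

function validateItemInput(body: unknown): ValidationResult {
  if (typeof body !== "object" || body === null) {
    return { ok: false, errors: ["body must be an object"] };
  }
  const b = body as Record<string, unknown>;
  const errors: string[] = [];
  if (typeof b.name !== "string" || b.name.trim() === "") {
    errors.push("name must be a non-empty string");
  }
  if (b.description !== undefined && typeof b.description !== "string") {
    errors.push("description must be a string when provided");
  }
  return errors.length > 0
    ? { ok: false, errors }
    : { ok: true, data: { name: b.name as string, description: b.description as string | undefined } };
}
```

The equivalent Zod schema would express the same rules declaratively and produce typed output automatically.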
This API uses URI versioning to ensure backward compatibility and smooth transitions between versions.
All endpoints are prefixed with /api/v1/.
- URI Versioning: `/api/v1/resource` - the version is part of the URL path
- Backward Compatibility: New versions maintain compatibility with previous versions where possible
- Deprecation Notices: Deprecated endpoints will be marked in the changelog and documentation
- Sunset Policy: Deprecated versions will be supported for at least 6 months after deprecation
When introducing breaking changes, a new version (v2, v3, etc.) will be created with:
- Updated endpoint paths
- Modified request/response formats
- New features and improvements
To migrate from v1 to future versions:
- Update client code to use new endpoint paths
- Handle new response formats
- Test thoroughly in staging environment
- Update API documentation references
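As a small illustration of the URI-versioning scheme, the version segment can be parsed straight from the request path (in the real app, one Express Router would be mounted per version instead; this helper is hypothetical):

```typescript
// Extracts the API version segment from a request path,
// e.g. "/api/v1/items" -> "v1". A real app would mount one
// Express Router per version rather than parsing paths by hand.
function apiVersion(path: string): string | null {
  const m = path.match(/^\/api\/(v\d+)\//);
  return m && m[1] ? m[1] : null;
}
```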
Authenticate user and get JWT token.
Request Body:

```json
{
  "username": "admin",
  "password": "password"
}
```

Response:

```json
{
  "token": "jwt-token-here",
  "refreshToken": "refresh-token-here",
  "user": {
    "id": 1,
    "username": "admin"
  }
}
```

Register a new user account.
Request Body:

```json
{
  "username": "newuser",
  "password": "mypassword"
}
```

Response:

```json
{
  "token": "jwt-token-here",
  "refreshToken": "refresh-token-here",
  "user": {
    "id": 2,
    "username": "newuser"
  }
}
```

Logout user and blacklist the current access token. Requires authentication.
Response: 200 OK

```json
{
  "message": "Logged out successfully"
}
```

Refresh access token using refresh token.
Request Body:

```json
{
  "refreshToken": "refresh-token-here"
}
```

Response:

```json
{
  "token": "new-jwt-token-here",
  "refreshToken": "new-refresh-token-here"
}
```

All item endpoints require authentication and enforce data ownership: users can only access or modify their own items.
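A minimal in-memory sketch of the token lifecycle described above (issue on login, rotate on refresh, blacklist on logout). Real tokens are signed JWTs; the opaque strings and function names here are illustrative:

```typescript
import { randomUUID } from "node:crypto";

// Simplified token store: random strings stand in for signed JWTs.
const refreshTokens = new Map<string, number>(); // refresh token -> user id
const blacklist = new Set<string>();             // revoked access tokens

function login(userId: number) {
  const token = randomUUID();
  const refreshToken = randomUUID();
  refreshTokens.set(refreshToken, userId);
  return { token, refreshToken };
}

// Mirrors POST /logout: the current access token is rejected from now on.
function logout(accessToken: string): void {
  blacklist.add(accessToken);
}

function isAccessTokenValid(accessToken: string): boolean {
  return !blacklist.has(accessToken);
}

// Mirrors the refresh endpoint: the old refresh token is rotated out
// and a fresh token pair is issued.
function refresh(refreshToken: string) {
  const userId = refreshTokens.get(refreshToken);
  if (userId === undefined) return null;
  refreshTokens.delete(refreshToken);
  return login(userId);
}
```

The production service additionally signs and verifies JWTs, sets expiries, and persists the blacklist (e.g. in Redis) so revocation survives restarts.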
Create a new item. Requires authentication.
Request Body:

```json
{
  "name": "Item Name",
  "description": "Optional description"
}
```

Response:

```json
{
  "id": 1,
  "name": "Item Name",
  "description": "Optional description",
  "userId": 1,
  "createdAt": "2025-12-02T02:00:00.000Z",
  "updatedAt": "2025-12-02T02:00:00.000Z"
}
```

List user's items with optional filters and pagination. Requires authentication.
Query Parameters:
- `name` (optional): Filter by name (partial match)
- `limit` (optional): Number of items to return (default: 10, max: 100)
- `offset` (optional): Number of items to skip (default: 0)
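The defaults and bounds above can be sketched as a small normalizer for the query string (the helper name is hypothetical):

```typescript
// Normalizes list-endpoint pagination, applying the documented
// defaults: limit 10 (capped at 100), offset 0.
function parsePagination(query: { limit?: string; offset?: string }) {
  const rawLimit = Number(query.limit ?? "10");
  const rawOffset = Number(query.offset ?? "0");
  const limit = Number.isFinite(rawLimit)
    ? Math.min(Math.max(Math.trunc(rawLimit), 1), 100)
    : 10;
  const offset = Number.isFinite(rawOffset)
    ? Math.max(Math.trunc(rawOffset), 0)
    : 0;
  return { limit, offset };
}
```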
Response:

```json
[
  {
    "id": 1,
    "name": "Item Name",
    "description": "Optional description",
    "userId": 1,
    "createdAt": "2025-12-02T02:00:00.000Z",
    "updatedAt": "2025-12-02T02:00:00.000Z"
  }
]
```

Get details of a specific item. Requires authentication and ownership.
Response:

```json
{
  "id": 1,
  "name": "Item Name",
  "description": "Optional description",
  "userId": 1,
  "createdAt": "2025-12-02T02:00:00.000Z",
  "updatedAt": "2025-12-02T02:00:00.000Z"
}
```

Update an existing item. Requires authentication and ownership.
Request Body:

```json
{
  "name": "Updated Name",
  "description": "Updated description"
}
```

Response:

```json
{
  "id": 1,
  "name": "Updated Name",
  "description": "Updated description",
  "userId": 1,
  "createdAt": "2025-12-02T02:00:00.000Z",
  "updatedAt": "2025-12-02T02:00:00.000Z"
}
```

Soft delete an item (marks it as deleted rather than removing it from the database). Requires authentication and ownership.
Response: 204 No Content
This API implements soft delete functionality:
- Deleted items are marked with a `deleted_at` timestamp instead of being removed from the database
- All GET operations automatically exclude soft-deleted items
- Attempting to delete an already deleted item returns 404
- This preserves data integrity and allows for potential future restoration features
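The rules above can be sketched with a tiny in-memory model (the real project does this through TypeORM; the data and function names here are illustrative):

```typescript
// In-memory sketch of soft delete: deletion stamps `deletedAt`,
// reads filter those rows out, and a repeat delete behaves like a 404.
type Item = { id: number; name: string; deletedAt: Date | null };

const items: Item[] = [
  { id: 1, name: "keep me", deletedAt: null },
  { id: 2, name: "delete me", deletedAt: null },
];

// GET operations exclude soft-deleted rows.
function listItems(): Item[] {
  return items.filter((i) => i.deletedAt === null);
}

// Returns 204 on success, 404 if the item is missing or already deleted.
function softDelete(id: number): 204 | 404 {
  const item = items.find((i) => i.id === id && i.deletedAt === null);
  if (!item) return 404;
  item.deletedAt = new Date();
  return 204;
}
```

With TypeORM, the same behavior is typically achieved with a date column that deletion populates, while queries filter on it being unset.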
Health check endpoint with system info.
Response:

```json
{
  "status": "OK",
  "timestamp": "2025-12-01T12:00:00.000Z",
  "uptime": 123.45,
  "itemsCount": 10,
  "usersCount": 5,
  "database": "SQLite",
  "cache": "in-memory",
  "version": "1.0.0"
}
```

Interactive API documentation powered by Swagger UI.
Prometheus metrics endpoint for monitoring and observability.
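For illustration, Prometheus scrapes metrics as plain text in its exposition format; a counter section might be rendered like this (the metric names and helper are hypothetical, not the project's actual metrics):

```typescript
// Renders counters in the Prometheus text exposition format,
// one `# TYPE` line followed by the sample for each metric.
function renderMetrics(counters: Record<string, number>): string {
  return Object.entries(counters)
    .map(([name, value]) => `# TYPE ${name} counter\n${name} ${value}`)
    .join("\n");
}
```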
Cache statistics endpoint showing hits, misses, and health information.
Response:

```json
{
  "stats": {
    "hits": 150,
    "misses": 25,
    "sets": 50,
    "deletes": 5
  },
  "health": {
    "status": "healthy",
    "l1Size": 10,
    "redisConnected": true
  }
}
```

- Install dependencies:

```bash
npm install
```

- Create a `.env` file in the root directory:

```
PORT=3000
NODE_ENV=development
JWT_SECRET=your-super-secret-jwt-key-change-this-in-production
LOG_LEVEL=info
USE_REDIS=false
REDIS_HOST=localhost
REDIS_PORT=6379
```
Environment Variables:
- `PORT`: Server port (default: 3000)
- `NODE_ENV`: Environment mode
- `JWT_SECRET`: Secret key for JWT tokens (change in production!)
- `LOG_LEVEL`: Logging level (error, warn, info, debug)
- `DATABASE_URL`: Database connection URL (default: SQLite)
- `USE_REDIS`: Enable Redis caching (true/false)
- `REDIS_HOST`: Redis server hostname
- `REDIS_PORT`: Redis server port
- `REDIS_PASSWORD`: Redis server password
- `RATE_LIMIT_WINDOW_MS`: Rate limit window in milliseconds (default: 900000)
- `RATE_LIMIT_MAX_REQUESTS`: Max requests per window (default: 100)
- `BCRYPT_ROUNDS`: Bcrypt hashing rounds (default: 12)
- `CORS_ORIGIN`: CORS allowed origin (default: http://localhost:3000)
- `SESSION_TIMEOUT`: JWT session timeout in milliseconds (default: 3600000)
- `REFRESH_TOKEN_EXPIRY`: Refresh token expiry in milliseconds (default: 604800000)
- `MAX_LOGIN_ATTEMPTS`: Maximum login attempts before lockout (default: 5)
- `LOCKOUT_DURATION`: Lockout duration in milliseconds (default: 900000)
- `CACHE_TTL`: Cache TTL in seconds (default: 300)
- `MAX_ITEMS_PER_PAGE`: Maximum items per page (default: 100)
- `USE_CLUSTER`: Enable Node.js clustering (true/false)
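Reading these variables with their documented defaults can be sketched as a small loader (the function name and selection of fields are illustrative):

```typescript
// Sketch of loading environment configuration with the documented defaults.
// Only a subset of the variables is shown.
function loadConfig(env: Record<string, string | undefined>) {
  return {
    port: Number(env.PORT ?? 3000),
    rateLimitWindowMs: Number(env.RATE_LIMIT_WINDOW_MS ?? 900_000),
    rateLimitMaxRequests: Number(env.RATE_LIMIT_MAX_REQUESTS ?? 100),
    bcryptRounds: Number(env.BCRYPT_ROUNDS ?? 12),
    useRedis: env.USE_REDIS === "true",
    cacheTtlSeconds: Number(env.CACHE_TTL ?? 300),
  };
}
```

In the server itself this would be called with `process.env` once at startup.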
- Build the project:

```bash
npm run build
```

- Start the server:

```bash
npm start
```

Or for development (with auto-reload):

```bash
npm run dev
```

The server will run on http://localhost:3000.
Build and run with Docker:

```bash
docker build -t problem5 .
docker run -p 3000:3000 problem5
```

Run the automated test suite with Jest and Supertest:

```bash
npm test
```

For watch mode:

```bash
npm run test:watch
```

The tests cover all CRUD operations, error handling, validation, and edge cases.
For comprehensive end-to-end testing with full infrastructure (PostgreSQL + Redis), use Docker Compose:
```bash
# Run E2E tests with Docker
npm run test:e2e:docker
```

This command will:
- Start PostgreSQL and Redis containers
- Build and start the application container
- Run the E2E test suite against the containerized application
- Automatically clean up containers after completion
Manual Docker E2E Testing:
```bash
# Start the test infrastructure
docker-compose -f docker-compose.test.yml up -d postgres redis app

# Wait for services to be healthy
docker-compose -f docker-compose.test.yml ps

# Run E2E tests
npm run test:e2e

# Clean up
docker-compose -f docker-compose.test.yml down -v
```

The Docker E2E tests validate the complete user journey, including authentication, item CRUD operations, and error handling, in a production-like environment.
Run ESLint to check code quality:
```bash
npm run lint
```

To auto-fix issues:

```bash
npm run lint:fix
```

This project uses GitHub Actions for comprehensive continuous integration and deployment. The CI/CD pipeline runs on every push and pull request to the main/develop branches, performing:
Test Stage:
- Matrix testing across Node.js 18.x and 20.x
- PostgreSQL and Redis service containers for integration testing
- Security audit with npm audit
- ESLint code quality checks
- Unit tests, integration tests, and E2E tests
- Coverage reporting to Codecov with 70%+ thresholds
- Build artifact uploads
Docker Stage:
- Multi-stage Docker image building with caching
- Optimized production images
Load Testing Stage:
- K6 load testing against running application
- Performance validation before deployment
Deploy Stage:
- Automated deployment to staging on main branch pushes
- Ready for production deployment integration
```bash
npm test                 # Run all tests (unit + integration + E2E)
npm run test:unit        # Run unit tests only
npm run test:integration # Run integration tests only
npm run test:e2e         # Run E2E tests only
npm run test:coverage    # Run tests with coverage report
npm run test:load:k6     # Run K6 load tests
```

- Path-based triggers: Only runs when problem5 files change
- Service dependencies: PostgreSQL and Redis for realistic testing
- Artifact management: Build and test result artifacts
- Security scanning: Automated dependency vulnerability checks
- Performance validation: Load testing ensures production readiness
Enable clustering for multi-core utilization:
```bash
USE_CLUSTER=true npm start
```

Use the provided nginx.conf for load balancing across multiple instances:

```bash
nginx -c /path/to/nginx.conf
```

Deploy to Kubernetes using the manifests in k8s/:

```bash
kubectl apply -f k8s/deployment.yaml
```

Metrics are exposed at the /metrics endpoint. Configure Prometheus to scrape this endpoint.

Scale the application with Docker Compose:

```bash
docker-compose up --scale app=3
```

- Database indexes on frequently queried columns
- Redis caching for hot data
- Gzip compression for responses
- Connection pooling for database
- Health checks for load balancer compatibility
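The caching behavior above, and the hit/miss statistics reported by the cache-stats endpoint, can be sketched with a single in-memory store (the real service layers an in-memory L1 over Redis; a plain Map stands in here, and the names are illustrative):

```typescript
// Sketch of a cache that tracks the same statistics the cache-stats
// endpoint reports (hits, misses, sets, deletes).
const store = new Map<string, string>();
const stats = { hits: 0, misses: 0, sets: 0, deletes: 0 };

function cacheGet(key: string): string | undefined {
  const value = store.get(key);
  if (value === undefined) {
    stats.misses++;
    return undefined;
  }
  stats.hits++;
  return value;
}

function cacheSet(key: string, value: string): void {
  store.set(key, value);
  stats.sets++;
}

function cacheDelete(key: string): void {
  if (store.delete(key)) stats.deletes++;
}
```

A production version would also honor the `CACHE_TTL` expiry and fall back to Redis when the L1 layer misses.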
- Jest & Supertest: Testing framework
- ESLint: Code linting
- Docker: Containerization
- GitHub Actions: CI/CD pipeline
- dotenv: Environment configuration