A Node.js example demonstrating how to track events to both Harness Feature Management & Experimentation (FME) and AWS S3 using a server-side batching approach.
This example shows how to:
- Track events to Harness FME using the Split SDK's `track()` method
- Simultaneously batch and store the same events to AWS S3 as NDJSON files
- Use a Node.js Express server for secure credential management and event batching
- Avoid exposing AWS credentials on the client side
Why this approach? Tracking events to S3 provides a backup data warehouse for analytics, compliance, or custom data processing while maintaining real-time tracking to Harness FME.
```
┌──────────────────────────────────────────────────────────┐
│                         Browser                          │
│                    ┌─────────────────┐                   │
│                    │    index.ejs    │                   │
│                    │   (HTML + JS)   │                   │
│                    └────────┬────────┘                   │
│                             │                            │
│             ┌───────────────┴───────────────┐            │
│             ▼                               ▼            │
│    ┌─────────────────┐             ┌─────────────────┐   │
│    │   Harness FME   │             │    track.js     │   │
│    │   SDK (Split)   │             │     wrapper     │   │
│    └────────┬────────┘             └────────┬────────┘   │
│             │                               │            │
└─────────────┼───────────────────────────────┼────────────┘
              │                               │
              ▼                               ▼
       ┌──────────────┐          ┌──────────────────────────┐
       │ Harness FME  │          │     Express Server       │
       │    Cloud     │          │      (server.js)         │
       └──────────────┘          │                          │
                                 │  • Batches events        │
                                 │  • Flushes @ 100 events  │
                                 │  • NDJSON format         │
                                 └────────┬─────────────────┘
                                          │
                                          ▼
                                 ┌──────────────────────────┐
                                 │      AWS S3 Bucket       │
                                 │     events/*.ndjson      │
                                 └──────────────────────────┘
```
- Node.js v14 or higher
- AWS Account with S3 access
- Harness FME Account (sign up free)
```bash
npm install
```

```bash
# Using AWS CLI
aws s3 mb s3://your-bucket-name --region us-east-1
```

Or create the bucket via the AWS Console.
Option 1: AWS CLI (Recommended for Local Development)
```bash
aws configure
# Enter your credentials when prompted
```

This stores credentials in `~/.aws/credentials`. The server will automatically use them, so there is no need to put credentials in `.env`.
Option 2: Explicit Credentials in .env
For deployment environments, you can set credentials directly in .env:
- Go to AWS IAM Console
- Create a new user with programmatic access
- Attach a policy with `s3:PutObject` permission
- Add credentials to `.env` (see Step 3)
Create a .env file:
```bash
cp .env.example .env
```

Edit `.env` with your configuration:
```bash
# Harness FME SDK Key (get from https://app.harness.io/)
HARNESS_FME_API_KEY=your_harness_fme_sdk_key_here

# AWS Configuration
AWS_REGION=us-east-1
S3_BUCKET_NAME=your-bucket-name

# Server Configuration (optional)
PORT=3000
BATCH_SIZE=100

# Only needed if NOT using AWS CLI credentials:
# AWS_ACCESS_KEY_ID=AKIA...
# AWS_SECRET_ACCESS_KEY=...
```

Get your Harness FME SDK key:
- Log in to Harness
- Go to Feature Management and Experimentation → Environments
- Select your environment
- Copy the Client-side SDK Key
```bash
npm start
```

For development with auto-reload:

```bash
npm run dev
```

You should see:
```
========================================
Server running on http://localhost:3000
S3 Bucket: your-bucket-name
Batch Size: 100
========================================
```
Visit http://localhost:3000 in your browser.
The server will render the HTML page with your Harness FME API key automatically injected from the .env file.
- Click the "Send Track Event" button
- Watch the event log in the UI
- Check the server console for batching progress
- After 100 events, the batch will automatically flush to S3
To flush events before reaching the batch size:
```bash
curl -X POST http://localhost:3000/api/flush
```

To check the current batch status:

```bash
curl http://localhost:3000/api/status
```

Events are stored as NDJSON (newline-delimited JSON) files:
S3 Path: s3://your-bucket-name/events/batch-YYYY-MM-DDTHH-MM-SS-mmmZ.ndjson
File Content:
```json
{"trafficType":"user","name":"button_click","value":1,"properties":{"timestamp":"2025-10-24T10:30:15.123Z","counter":1,"userAgent":"Mozilla/5.0...","page":"/"},"timestamp":"2025-10-24T10:30:15.456Z"}
{"trafficType":"user","name":"button_click","value":2,"properties":{"timestamp":"2025-10-24T10:30:16.234Z","counter":2,"userAgent":"Mozilla/5.0...","page":"/"},"timestamp":"2025-10-24T10:30:16.567Z"}
```

Python:
```python
import json

with open('batch.ndjson', 'r') as f:
    for line in f:
        event = json.loads(line)
        print(event)
```

Command Line:
```bash
# Pretty print
cat batch.ndjson | jq '.'

# Filter events
cat batch.ndjson | jq 'select(.name == "button_click")'

# Count events
wc -l batch.ndjson
```

AWS Athena: NDJSON can be queried directly using AWS Athena for SQL-based analytics.
`POST /api/track`: Track an event (used by the frontend).
Request:
```json
{
  "trafficType": "user",
  "name": "button_click",
  "value": 1,
  "properties": {
    "timestamp": "2025-10-24T10:30:15.123Z",
    "counter": 1
  }
}
```

Response:
```json
{
  "success": true,
  "batchSize": 45,
  "flushed": null
}
```

When the batch size is reached (100 events):
```json
{
  "success": true,
  "batchSize": 0,
  "flushed": {
    "flushed": 100,
    "filename": "events/batch-2025-10-24T10-30-15-123Z.ndjson",
    "bucket": "your-bucket-name"
  }
}
```

`POST /api/flush`: Manually trigger a flush to S3.
Response:
```json
{
  "success": true,
  "flushed": 45,
  "filename": "events/batch-2025-10-24T10-30-15-123Z.ndjson",
  "bucket": "your-bucket-name"
}
```

`GET /api/status`: Get server status.
Response:
```json
{
  "status": "running",
  "batchSize": 45,
  "maxBatchSize": 100,
  "s3Bucket": "your-bucket-name",
  "region": "us-east-1"
}
```

Health check and API information.
Response:
```json
{
  "service": "Harness FME + S3 Event Batching Server",
  "version": "1.0.0",
  "endpoints": {
    "track": "POST /api/track",
    "flush": "POST /api/flush",
    "status": "GET /api/status"
  }
}
```

Serves the main HTML application page with the Harness FME API key injected from `.env`.
Change the batch size in .env:
```bash
BATCH_SIZE=50  # Flush every 50 events instead of 100
```

Change the AWS region in `.env`:

```bash
AWS_REGION=eu-west-1
```

Change the port in `.env`:

```bash
PORT=8080
```

```
track-multi/
├── .env.example    # Environment variable template
├── .gitignore      # Git ignore file
├── index.ejs       # HTML template (rendered by server)
├── package.json    # Node.js dependencies
├── README.md       # This file
├── server.js       # Express server with batching logic
└── track.js        # Client-side tracking wrapper
```
The track.js module provides a simple wrapper:
- Initialization: Call `initTracker(client)` once when the Harness FME SDK is ready
- Tracking: Call `track(trafficType, eventName, value, properties)` to track events
- Dual Send: Events are sent to both:
  - Harness FME via the SDK's native `track()` method
  - The Express server via `POST /api/track`
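A wrapper with this shape could look roughly like the following sketch. This is not the actual track.js; the server URL and the `buildEvent` helper are assumptions for illustration:

```javascript
// Sketch of a dual-send tracking wrapper (illustrative, not the real track.js).

let fmeClient = null;

// Call once, after the Harness FME (Split) SDK client is created.
function initTracker(client) {
  fmeClient = client;
}

// Build the payload shared by both destinations.
function buildEvent(trafficType, name, value, properties) {
  return {
    trafficType,
    name,
    value,
    properties: { ...properties, timestamp: new Date().toISOString() },
  };
}

// Send each event to Harness FME via the SDK and to the batching server.
async function track(trafficType, name, value, properties = {}) {
  const event = buildEvent(trafficType, name, value, properties);

  // 1. Real-time tracking via the Split SDK's native track() method.
  if (fmeClient) {
    fmeClient.track(trafficType, name, value, event.properties);
  }

  // 2. Batched tracking via the Express server (assumed URL).
  const res = await fetch('http://localhost:3000/api/track', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(event),
  });
  return res.json();
}
```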
```javascript
import { initTracker, track } from './track.js';

// Initialize once
const client = factory.client();
initTracker(client);

// Track events
await track('user', 'button_click', 1, { page: '/home' });
```

The Express server:

- Receives events via `POST /api/track`
- Adds them to an in-memory batch array
- When the batch size reaches 100 (configurable):
  - Converts events to NDJSON format
  - Uploads to S3 with a timestamp-based filename
  - Clears the batch
- On graceful shutdown, flushes remaining events
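The batching steps above can be sketched as follows. This is a simplified illustration, not the actual server.js: the S3 call (which the real server would make with the AWS SDK) is abstracted behind an `upload` callback so the batching behavior stands on its own:

```javascript
// Simplified sketch of the server's batching logic (not the real server.js).

const MAX_BATCH_SIZE = Number(process.env.BATCH_SIZE) || 100;
let batch = [];

// Add one event; flush to storage once the batch is full.
// `upload(filename, body)` abstracts the S3 PutObject call.
async function addEvent(event, upload) {
  batch.push(event);
  if (batch.length < MAX_BATCH_SIZE) {
    return { success: true, batchSize: batch.length, flushed: null };
  }
  const flushed = await flush(upload);
  return { success: true, batchSize: 0, flushed };
}

// Serialize the batch as NDJSON and hand it to the uploader.
// The real server would also call this once on graceful shutdown.
async function flush(upload) {
  if (batch.length === 0) return null;
  // NDJSON: one JSON object per line, newline-terminated.
  const body = batch.map((e) => JSON.stringify(e)).join('\n') + '\n';
  const timestamp = new Date().toISOString().replace(/[:.]/g, '-');
  const filename = `events/batch-${timestamp}.ndjson`;
  const count = batch.length;
  batch = []; // clear first so new events start a fresh batch
  await upload(filename, body); // e.g. s3.send(new PutObjectCommand(...))
  return { flushed: count, filename };
}
```

Keeping the batch in memory is simple but means events are lost if the process crashes between flushes; the graceful-shutdown flush covers clean restarts only.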
The server renders index.ejs with the Harness FME API key injected from .env:
```html
<!-- index.ejs -->
<script>
  const factory = window.splitio({
    core: {
      authorizationKey: '<%= splitApiKey %>' // From .env
    }
  });
</script>
```

This keeps secrets server-side and out of version control.
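Conceptually, the injection is a simple placeholder substitution performed before the page reaches the browser. The real server presumably uses Express with the ejs package; this stdlib-only sketch just illustrates the idea:

```javascript
// Stdlib-only sketch of EJS-style placeholder substitution (the actual
// server would use Express's res.render with the ejs package).
function renderTemplate(template, data) {
  // Replace each <%= name %> placeholder with the matching value.
  return template.replace(/<%=\s*(\w+)\s*%>/g, (_, name) => data[name] ?? '');
}

const page = "authorizationKey: '<%= splitApiKey %>'";
console.log(renderTemplate(page, { splitApiKey: process.env.HARNESS_FME_API_KEY || 'demo-key' }));
```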
If you see CORS errors in the browser console:
- Ensure the server is running (`npm start`)
- Check that the server URL in `track.js` matches your server port
If you see AWS credential errors:
- Double-check your `.env` file
- Verify the IAM user has S3 write permissions
- Confirm the bucket name is correct and the bucket exists
- If using AWS CLI: run `aws s3 ls` to verify your credentials work
If the Harness FME SDK times out:
- This is normal if you don't have a valid API key
- Events will still be sent to S3
- For production, get a real API key from Harness
- Events only flush after 100 events (by default)
- Use `POST /api/flush` to manually flush
- Check server logs for errors
If you see `Cannot use import statement outside a module`:

- Ensure your HTML is served by the Express server at http://localhost:3000
- Don't open `index.ejs` directly as a file (it needs server-side templating)
For Production Deployments:
- Don't commit `.env`: Already in `.gitignore`
- Use IAM Roles: If running on AWS (EC2, Lambda), use IAM roles instead of access keys
- Least Privilege: Grant only `s3:PutObject` permission, not `AmazonS3FullAccess`
- Environment-specific Keys: Use different AWS credentials for dev/staging/production
- HTTPS: Use HTTPS in production
- Rate Limiting: Add rate limiting to prevent abuse
- Authentication: Add authentication to the API endpoints
- Input Validation: Validate event data before storage
- Monitoring: Set up CloudWatch alarms for S3 write failures
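As an illustration of the input-validation point above, a minimal validator for `/api/track` payloads might look like this. The field names follow the Request example earlier; the function itself is a sketch, not part of the example's server.js:

```javascript
// Minimal validation sketch for POST /api/track request bodies.
function validateEvent(body) {
  if (typeof body !== 'object' || body === null) {
    return { valid: false, errors: ['body must be a JSON object'] };
  }
  const errors = [];
  if (typeof body.trafficType !== 'string' || body.trafficType.length === 0) {
    errors.push('trafficType must be a non-empty string');
  }
  if (typeof body.name !== 'string' || body.name.length === 0) {
    errors.push('name must be a non-empty string');
  }
  if (body.value !== undefined && typeof body.value !== 'number') {
    errors.push('value must be a number when present');
  }
  if (body.properties !== undefined &&
      (typeof body.properties !== 'object' || body.properties === null)) {
    errors.push('properties must be an object when present');
  }
  return { valid: errors.length === 0, errors };
}
```

The Express handler could reject invalid payloads with a 400 before they ever reach the batch.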
This example is useful for:
- Compliance & Auditing: Keep immutable event logs in S3 for compliance
- Custom Analytics: Process events with AWS Athena, EMR, or Glue
- Data Warehousing: Load events into Redshift or Snowflake
- Backup: Maintain a backup of all events outside Harness FME
This is a community example. Feel free to:
- Open issues for bugs or questions
- Submit pull requests for improvements
- Adapt this example for your own use cases
For Harness FME support, visit the Harness Community or documentation.
For AWS support, refer to AWS Support.
Note: This example uses the Split SDK (now part of Harness FME). The SDK naming conventions reference "Split" for backward compatibility, but the service is now Harness Feature Management & Experimentation (FME).