Updates code for Bug FIXES for AWS working

This commit is contained in:
Chandresh Kerkar 2025-12-19 19:38:36 +05:30
parent d67df2fcd4
commit 736bfe4766
16 changed files with 2240 additions and 67 deletions

62
.env Working Normal file
View File

@ -0,0 +1,62 @@
# ============================================
# REQUIRED ENVIRONMENT VARIABLES
# ============================================
# Copy this file to .env and fill in your values
# Database Connection (PostgreSQL)
DATABASE_URL=postgres://postgres:password123@localhost:5433/farmmarket
# JWT Secrets (use strong random strings)
# Generate with: node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"
JWT_ACCESS_SECRET=add74b258202057143382e8ee9ecc24a1114eddd3da5db79f3d29d24d7083043
JWT_REFRESH_SECRET=94a09772321fa15dc41c6c1e07d3b97a5b50d770e29f2ade47e8de5c571a611d
# ============================================
# OPTIONAL ENVIRONMENT VARIABLES
# ============================================
# Server Configuration
PORT=3000
NODE_ENV=development
# CORS Configuration (comma-separated list, required in production)
# Example: https://yourdomain.com,https://www.yourdomain.com
#CORS_ALLOWED_ORIGINS=
# JWT Token Expiration (default values shown)
JWT_ACCESS_TTL=15m
JWT_REFRESH_TTL=7d
# Refresh Token Inactivity Timeout (in minutes, default: 4320 = 3 days)
REFRESH_MAX_IDLE_MINUTES=4320
# OTP Configuration
OTP_MAX_ATTEMPTS=5
# ============================================
# TWILIO SMS CONFIGURATION (Optional)
# ============================================
# Required for sending OTP via SMS
# If not configured, OTP will be logged to console in development
TWILIO_ACCOUNT_SID=ACa6723cb1475351e13d9ca60059c23b28
TWILIO_AUTH_TOKEN=67ecdfb2bc70285b45b969940e18e443
# Use either TWILIO_MESSAGING_SERVICE_SID (recommended) OR TWILIO_FROM_NUMBER
#TWILIO_MESSAGING_SERVICE_SID=your-messaging-service-sid
# OR
TWILIO_FROM_NUMBER=+16597322424
# ============================================
# Twilio SMS Configuration (Optional)
# ============================================
# TWILIO_ACCOUNT_SID=your-twilio-account-sid
# TWILIO_AUTH_TOKEN=your-twilio-auth-token
# TWILIO_MESSAGING_SERVICE_SID=your-messaging-service-sid (recommended)
# OR
# TWILIO_FROM_NUMBER=+1234567890
# ============================================
# ADMIN DASHBOARD CONFIGURATION
# ============================================
ENABLE_ADMIN_DASHBOARD=true

158
MIGRATION_SUMMARY.md Normal file
View File

@ -0,0 +1,158 @@
# AWS Database Migration - Implementation Summary
## ✅ Completed Changes
All code changes have been implemented to migrate from local Docker PostgreSQL to AWS PostgreSQL using AWS SSM Parameter Store for secure credential management.
## 📁 Files Modified
### 1. `src/utils/awsSsm.js`
**Changes:**
- ✅ Updated to use correct SSM parameter paths:
- Read-Write: `/test/livingai/db/app`
- Read-Only: `/test/livingai/db/app/readonly`
- ✅ Added support for `DB_USE_READONLY` environment variable
- ✅ Improved error handling with detailed error messages
- ✅ Added `buildDatabaseConfig()` function for SSL support
- ✅ Updated credential validation and parsing
### 2. `src/db.js`
**Changes:**
- ✅ Added SSL configuration for self-signed certificates
- ✅ Updated to use `buildDatabaseConfig()` instead of connection string
- ✅ Improved error handling and logging
- ✅ Auto-detection of AWS database when `DB_HOST=db.livingai.app`
- ✅ Connection pool configuration (max: 20, idleTimeout: 30s)
### 3. `src/config.js`
**Changes:**
- ✅ Updated to require AWS credentials when using SSM
- ✅ Removed `DATABASE_URL` from required env vars when `USE_AWS_SSM=true`
- ✅ Added validation for AWS credentials
### 4. Documentation
**New Files:**
- ✅ `docs/getting-started/AWS_DATABASE_MIGRATION.md` - Complete migration guide
- ✅ `docs/getting-started/ENV_VARIABLES_REFERENCE.md` - Environment variables reference
- ✅ `MIGRATION_SUMMARY.md` - This file
## 🔒 Security Implementation
### ✅ Credentials Management
- **NO database credentials in `.env` files**
- Credentials fetched from AWS SSM Parameter Store at runtime
- Only AWS credentials (for SSM access) in `.env`
- Supports both read-write and read-only users
### ✅ SSL Configuration
- SSL enabled with `rejectUnauthorized: false` for self-signed certificates
- Connection string includes `?sslmode=require`
- Proper SSL configuration in connection pool
## 📋 Required Environment Variables
### For AWS Database (Production)
```env
# AWS Configuration (for SSM access)
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=your_aws_access_key
AWS_SECRET_ACCESS_KEY=your_aws_secret_key
USE_AWS_SSM=true
# JWT Configuration
JWT_ACCESS_SECRET=your_secret
JWT_REFRESH_SECRET=your_secret
```
### Optional
```env
DB_USE_READONLY=false # false = read_write_user, true = read_only_user
DB_HOST=db.livingai.app
DB_PORT=5432
DB_NAME=livingai_test_db
```
## 🔄 Migration Steps
1. **Set up AWS SSM Parameters:**
- Create `/test/livingai/db/app` with read-write user credentials (JSON format)
- Create `/test/livingai/db/app/readonly` with read-only user credentials (optional)
2. **Update `.env` file:**
- Add AWS credentials (AWS_REGION, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
- Set `USE_AWS_SSM=true`
- Remove any database credentials (DB_USER, DB_PASSWORD, DATABASE_URL)
3. **Verify IAM Permissions:**
- Ensure IAM user/role has `ssm:GetParameter` permission for both SSM parameter paths
4. **Test Connection:**
- Start application: `npm start`
- Verify logs show successful SSM credential fetch and database connection
## 🧪 Testing Checklist
- [ ] AWS SSM parameters created with correct paths and JSON format
- [ ] IAM user has SSM read permissions
- [ ] `.env` file has AWS credentials (no DB credentials)
- [ ] `USE_AWS_SSM=true` in `.env`
- [ ] Application starts without errors
- [ ] Database connection established successfully
- [ ] SSL connection working (no SSL errors)
- [ ] API endpoints respond correctly
- [ ] Database queries execute successfully
- [ ] All business logic works as before
## 🔍 Verification Commands
### Check AWS SSM Parameter
```bash
aws ssm get-parameter --name "/test/livingai/db/app" --with-decryption --region ap-south-1
```
### Test Database Connection
```bash
npm start
# Look for these log messages:
# ✅ Successfully fetched DB credentials from SSM: /test/livingai/db/app (read-write user)
# ✅ Using database credentials from AWS SSM Parameter Store
# ✅ Database connection established successfully
```
## ⚠️ Important Notes
1. **No Breaking Changes**: All business logic remains unchanged. Only database connection configuration was updated.
2. **Backward Compatibility**: Local development still works with `DATABASE_URL` when `USE_AWS_SSM=false`.
3. **Security**: Database credentials are never stored in files. They are fetched from AWS SSM at runtime.
4. **SSL**: Self-signed certificates are supported via `rejectUnauthorized: false` configuration.
5. **Connection Pooling**: Configured with sensible defaults (max 20 connections, 30s idle timeout).
## 📚 Documentation
For detailed information, see:
- `docs/getting-started/AWS_DATABASE_MIGRATION.md` - Complete migration guide
- `docs/getting-started/ENV_VARIABLES_REFERENCE.md` - Environment variables reference
## 🐛 Troubleshooting
Common issues and solutions are documented in `docs/getting-started/AWS_DATABASE_MIGRATION.md` under the "Troubleshooting" section.
## ✨ Next Steps
1. Review the changes in the modified files
2. Set up AWS SSM parameters with your database credentials
3. Update your `.env` file with AWS credentials
4. Test the connection
5. Deploy to your AWS environment
---
**Migration Status**: ✅ Complete
**All Requirements Met**: ✅ Yes
**Security Requirements Met**: ✅ Yes
**Backward Compatibility**: ✅ Maintained

View File

@ -0,0 +1,310 @@
# AWS Database Migration Guide
This guide explains how to migrate the authentication service from a local Docker PostgreSQL database to an AWS PostgreSQL database using AWS SSM Parameter Store for secure credential management.
## Overview
**Security Model**: Database credentials are fetched from AWS SSM Parameter Store at runtime. NO database credentials are stored in `.env` files or code.
## Prerequisites
1. AWS Account with access to Systems Manager Parameter Store
2. IAM user/role with permissions to read SSM parameters:
- `ssm:GetParameter` for `/test/livingai/db/app`
- `ssm:GetParameter` for `/test/livingai/db/app/readonly`
3. AWS PostgreSQL database instance (RDS or managed PostgreSQL)
4. Database users created:
- `read_write_user` (for authentication service)
- `read_only_user` (optional, for read-only operations)
## AWS Configuration
### 1. Set Up AWS SSM Parameters
Store database credentials in AWS SSM Parameter Store as **SecureString** parameters:
#### Read-Write User (Authentication Service)
**Parameter Path**: `/test/livingai/db/app`
**Parameter Value** (JSON format):
```json
{
"user": "read_write_user",
"password": "your_secure_password_here",
"host": "db.livingai.app",
"port": "5432",
"database": "livingai_test_db"
}
```
#### Read-Only User (Optional)
**Parameter Path**: `/test/livingai/db/app/readonly`
**Parameter Value** (JSON format):
```json
{
"user": "read_only_user",
"password": "your_secure_password_here",
"host": "db.livingai.app",
"port": "5432",
"database": "livingai_test_db"
}
```
### 2. Create IAM Policy
Create an IAM policy that allows reading SSM parameters:
```json
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"ssm:GetParameter",
"ssm:GetParameters"
],
"Resource": [
"arn:aws:ssm:ap-south-1:*:parameter/test/livingai/db/app",
"arn:aws:ssm:ap-south-1:*:parameter/test/livingai/db/app/readonly"
]
}
]
}
```
Attach this policy to your IAM user or role.
## Environment Variables
### Required Variables
Create or update your `.env` file with **ONLY** these AWS credentials:
```env
# AWS Configuration (for SSM Parameter Store access)
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=your_aws_access_key_here
AWS_SECRET_ACCESS_KEY=your_aws_secret_key_here
# Enable AWS SSM for database credentials
USE_AWS_SSM=true
# JWT Configuration (REQUIRED)
JWT_ACCESS_SECRET=your_jwt_access_secret_here
JWT_REFRESH_SECRET=your_jwt_refresh_secret_here
```
### Optional Variables
```env
# Control which database user to use
# false = read_write_user (default for auth service)
# true = read_only_user
DB_USE_READONLY=false
# Database connection settings (auto-detected if not set)
DB_HOST=db.livingai.app
DB_PORT=5432
DB_NAME=livingai_test_db
```
### ⚠️ DO NOT Include
**NEVER** include these in your `.env` file:
- `DB_USER`
- `DB_PASSWORD`
- `DATABASE_URL` (with credentials)
- Any database credentials
Database credentials are fetched from AWS SSM at runtime.
## Database Configuration
### AWS PostgreSQL Settings
- **Host**: `db.livingai.app`
- **Port**: `5432`
- **Database**: `livingai_test_db`
- **SSL**: Required (self-signed certificates supported)
### SSL Configuration
The connection automatically handles SSL with self-signed certificates:
- `rejectUnauthorized: false` is set for self-signed certificate support
- Connection string includes `?sslmode=require`
## Migration Steps
### Step 1: Update Environment Variables
1. Remove any existing database credentials from `.env`:
```bash
# Remove these if present:
# DB_HOST=...
# DB_USER=...
# DB_PASSWORD=...
# DATABASE_URL=...
```
2. Add AWS credentials to `.env`:
```env
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=your_aws_access_key
AWS_SECRET_ACCESS_KEY=your_aws_secret_key
USE_AWS_SSM=true
```
### Step 2: Verify SSM Parameters
Ensure your SSM parameters are set correctly:
```bash
# Using AWS CLI (if configured)
aws ssm get-parameter --name "/test/livingai/db/app" --with-decryption --region ap-south-1
```
### Step 3: Test Database Connection
Start your application:
```bash
npm start
```
You should see:
```
✅ Successfully fetched DB credentials from SSM: /test/livingai/db/app (read-write user)
✅ Using database credentials from AWS SSM Parameter Store
Host: db.livingai.app, Database: livingai_test_db, User: read_write_user
✅ Database connection established successfully
```
### Step 4: Run Database Migrations
If you have schema changes, run migrations:
```bash
node run-migration.js
```
## Connection Pool Configuration
The connection pool is configured with:
- **Max connections**: 20
- **Idle timeout**: 30 seconds
- **Connection timeout**: 2 seconds
These can be adjusted in `src/db.js` if needed.
## Troubleshooting
### Error: "Cannot access AWS SSM Parameter Store"
**Causes**:
- Missing AWS credentials in `.env`
- IAM user doesn't have SSM permissions
- Wrong AWS region
**Solutions**:
1. Verify `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` are set
2. Check IAM permissions for SSM access
3. Verify `AWS_REGION` matches your SSM parameter region
### Error: "Database connection failed: SSL required"
**Cause**: Database requires SSL but connection isn't using it.
**Solution**: SSL is automatically configured. Verify your database security group allows SSL connections.
### Error: "Parameter not found"
**Cause**: SSM parameter path doesn't exist or has wrong name.
**Solution**:
1. Verify parameter path: `/test/livingai/db/app`
2. Check parameter is in correct region
3. Ensure parameter type is "SecureString"
### Error: "Invalid credentials format"
**Cause**: SSM parameter value is not valid JSON.
**Solution**: Ensure parameter value is JSON format:
```json
{
"user": "username",
"password": "password",
"host": "host",
"port": "5432",
"database": "dbname"
}
```
## Local Development
For local development without AWS SSM, you can use `DATABASE_URL`:
```env
# Disable AWS SSM for local development
USE_AWS_SSM=false
# Use local database
DATABASE_URL=postgresql://postgres:password@localhost:5432/farmmarket
```
**Note**: This should only be used for local development. Production must use AWS SSM.
## Security Best Practices
1. **Never commit `.env` files** - Add `.env` to `.gitignore`
2. **Rotate credentials regularly** - Update SSM parameters periodically
3. **Use least privilege** - IAM user should only have SSM read permissions
4. **Monitor SSM access** - Enable CloudTrail to audit SSM parameter access
5. **Use different credentials per environment** - Separate SSM parameters for test/prod
## Code Changes Summary
### Files Modified
1. **`src/utils/awsSsm.js`**
- Updated to use correct SSM parameter paths
- Added support for read-only/read-write user selection
- Improved error handling and validation
2. **`src/db.js`**
- Added SSL configuration for self-signed certificates
- Updated to use `buildDatabaseConfig` instead of connection string
- Improved error handling and logging
3. **`src/config.js`**
- Removed `DATABASE_URL` from required env vars when using SSM
- Added AWS credentials validation
### No Changes Required
- All business logic remains unchanged
- All API endpoints work as before
- All database queries work as before
- Only connection configuration changed
## Verification Checklist
- [ ] AWS SSM parameters created with correct paths
- [ ] IAM user/role has SSM read permissions
- [ ] `.env` file has AWS credentials (no DB credentials)
- [ ] `USE_AWS_SSM=true` in `.env`
- [ ] Application starts without errors
- [ ] Database connection established successfully
- [ ] API endpoints respond correctly
- [ ] Database queries execute successfully
## Support
For issues or questions:
1. Check application logs for detailed error messages
2. Verify SSM parameters using AWS Console or CLI
3. Test AWS credentials with AWS CLI
4. Review IAM permissions for SSM access

View File

@ -0,0 +1,278 @@
# Database Mode Switch Guide
This guide explains how to switch between **Local Database** and **AWS Database** modes.
## Quick Switch
Add this to your `.env` file to switch between modes:
```env
# For Local Database (Docker PostgreSQL)
DATABASE_MODE=local
# For AWS Database (AWS RDS/PostgreSQL with SSM)
DATABASE_MODE=aws
```
## Local Database Mode
### Configuration
```env
# Set database mode
DATABASE_MODE=local
# Local PostgreSQL connection string
DATABASE_URL=postgresql://postgres:password123@localhost:5433/farmmarket
# JWT Configuration (required)
JWT_ACCESS_SECRET=your_jwt_access_secret
JWT_REFRESH_SECRET=your_jwt_refresh_secret
# Redis (optional)
REDIS_URL=redis://localhost:6379
```
### Requirements
- Docker PostgreSQL running (from `docker-compose.yml`)
- `DATABASE_URL` set in `.env`
- No AWS credentials needed
### Start Local Database
```bash
cd db/farmmarket-db
docker-compose up -d postgres
```
### Connection String Format
```
postgresql://[user]:[password]@[host]:[port]/[database]
```
Example:
```
postgresql://postgres:password123@localhost:5433/farmmarket
```
## AWS Database Mode
### Configuration
```env
# Set database mode
DATABASE_MODE=aws
# AWS Configuration (for SSM Parameter Store)
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=your_aws_access_key
AWS_SECRET_ACCESS_KEY=your_aws_secret_key
# Optional: Control which database user
DB_USE_READONLY=false # false = read_write_user, true = read_only_user
# Optional: Database connection settings (auto-detected)
# DB_HOST=db.livingai.app
# DB_PORT=5432
# DB_NAME=livingai_test_db
# JWT Configuration (required)
JWT_ACCESS_SECRET=your_jwt_access_secret
JWT_REFRESH_SECRET=your_jwt_refresh_secret
# Redis (optional)
REDIS_URL=redis://your-redis-host:6379
```
### Requirements
- AWS SSM Parameter Store configured with database credentials
- AWS credentials (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`)
- IAM permissions to read SSM parameters
- AWS PostgreSQL database instance
### SSM Parameter Setup
Store credentials in AWS SSM Parameter Store:
**Path**: `/test/livingai/db/app` (read-write user)
**Value** (JSON):
```json
{
"user": "read_write_user",
"password": "secure_password",
"host": "db.livingai.app",
"port": "5432",
"database": "livingai_test_db"
}
```
## Mode Detection Priority
The system determines database mode in this order:
1. **`DATABASE_MODE`** environment variable (highest priority)
- `DATABASE_MODE=local` → Local mode
- `DATABASE_MODE=aws` → AWS mode
2. **`USE_AWS_SSM`** environment variable (legacy support)
- `USE_AWS_SSM=true` → AWS mode
- `USE_AWS_SSM=false` or not set → Local mode
3. **`DB_HOST`** auto-detection
- `DB_HOST=db.livingai.app` → AWS mode
- Otherwise → Local mode
## Examples
### Example 1: Local Development
```env
# .env file
DATABASE_MODE=local
DATABASE_URL=postgresql://postgres:password123@localhost:5433/farmmarket
JWT_ACCESS_SECRET=dev-secret
JWT_REFRESH_SECRET=dev-refresh-secret
REDIS_URL=redis://localhost:6379
```
**Start services:**
```bash
cd db/farmmarket-db
docker-compose up -d
```
### Example 2: AWS Production
```env
# .env file
DATABASE_MODE=aws
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
DB_USE_READONLY=false
JWT_ACCESS_SECRET=prod-secret
JWT_REFRESH_SECRET=prod-refresh-secret
REDIS_URL=redis://your-redis-host:6379
```
### Example 3: Using Legacy USE_AWS_SSM
```env
# .env file (still works for backward compatibility)
USE_AWS_SSM=true
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=your_key
AWS_SECRET_ACCESS_KEY=your_secret
DATABASE_URL=postgresql://... # Not used when USE_AWS_SSM=true
```
## Switching Between Modes
### From Local to AWS
1. Update `.env`:
```env
DATABASE_MODE=aws
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=your_key
AWS_SECRET_ACCESS_KEY=your_secret
```
2. Remove or comment out `DATABASE_URL`:
```env
# DATABASE_URL=postgresql://... # Not needed in AWS mode
```
3. Restart application
### From AWS to Local
1. Update `.env`:
```env
DATABASE_MODE=local
DATABASE_URL=postgresql://postgres:password123@localhost:5433/farmmarket
```
2. Remove or comment out AWS credentials (optional):
```env
# AWS_REGION=ap-south-1
# AWS_ACCESS_KEY_ID=...
# AWS_SECRET_ACCESS_KEY=...
```
3. Start local PostgreSQL:
```bash
cd db/farmmarket-db
docker-compose up -d postgres
```
4. Restart application
## Verification
### Check Current Mode
When you start the application, you'll see:
**Local Mode:**
```
📊 Database Mode: Local (using DATABASE_URL)
Using DATABASE_URL from environment (local database mode)
✅ Database connection established successfully
```
**AWS Mode:**
```
📊 Database Mode: AWS (using SSM Parameter Store)
✅ Successfully fetched DB credentials from SSM: /test/livingai/db/app (read-write user)
✅ Using database credentials from AWS SSM Parameter Store
Host: db.livingai.app, Database: livingai_test_db, User: read_write_user
✅ Database connection established successfully
```
## Troubleshooting
### Error: "Missing required environment variables"
**Local Mode:**
- Ensure `DATABASE_URL` is set
- Check PostgreSQL is running: `docker-compose ps`
**AWS Mode:**
- Ensure `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` are set
- Verify SSM parameters exist in AWS Console
- Check IAM permissions for SSM access
### Error: "Database connection failed"
**Local Mode:**
- Verify PostgreSQL is running: `docker ps`
- Check `DATABASE_URL` format is correct
- Test connection: `psql postgresql://postgres:password123@localhost:5433/farmmarket`
**AWS Mode:**
- Verify AWS credentials are correct
- Check SSM parameter exists and has correct JSON format
- Verify database security group allows connections
- Test AWS credentials: `aws ssm get-parameter --name "/test/livingai/db/app" --with-decryption`
## Best Practices
1. **Use `DATABASE_MODE` explicitly** - Most clear and reliable
2. **Never commit `.env` files** - Keep credentials secure
3. **Use different `.env` files** - `.env.local` for local, `.env.production` for AWS
4. **Test both modes** - Ensure your application works in both environments
5. **Document your setup** - Note which mode each environment uses
## Summary
- **`DATABASE_MODE=local`** → Uses `DATABASE_URL` from `.env`
- **`DATABASE_MODE=aws`** → Fetches credentials from AWS SSM Parameter Store
- Both modes work independently
- Switch by changing one environment variable
- All business logic remains unchanged

View File

@ -0,0 +1,89 @@
# .env File Template
Copy this template to create your `.env` file in the project root.
## Quick Setup
1. Copy `.env.example` to `.env`:
```bash
cp .env.example .env
```
2. Update the values with your actual credentials
3. Set `DATABASE_MODE` to `local` or `aws`
## Complete Template
Create a file named `.env` in the root directory (`G:\LivingAi\farm-auth-service\.env`) with this content:
```env
# =====================================================
# DATABASE MODE SWITCH
# =====================================================
# Options: 'local' or 'aws'
DATABASE_MODE=local
# =====================================================
# LOCAL DATABASE (when DATABASE_MODE=local)
# =====================================================
DATABASE_URL=postgresql://postgres:password123@localhost:5433/farmmarket
# =====================================================
# AWS DATABASE (when DATABASE_MODE=aws)
# =====================================================
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=your_aws_access_key_here
AWS_SECRET_ACCESS_KEY=your_aws_secret_key_here
DB_USE_READONLY=false
# =====================================================
# JWT Configuration (REQUIRED)
# =====================================================
JWT_ACCESS_SECRET=your_jwt_access_secret_here
JWT_REFRESH_SECRET=your_jwt_refresh_secret_here
# =====================================================
# Redis Configuration (Optional)
# =====================================================
REDIS_URL=redis://localhost:6379
# =====================================================
# Application Configuration
# =====================================================
NODE_ENV=development
PORT=3000
CORS_ALLOWED_ORIGINS=http://localhost:3000
```
## Minimal Setup for Local Development
If you just want to get started quickly with local database:
```env
DATABASE_MODE=local
DATABASE_URL=postgresql://postgres:password123@localhost:5433/farmmarket
JWT_ACCESS_SECRET=change-this-to-a-secret-key
JWT_REFRESH_SECRET=change-this-to-another-secret-key
REDIS_URL=redis://localhost:6379
NODE_ENV=development
PORT=3000
```
## Minimal Setup for AWS Production
For AWS database with SSM:
```env
DATABASE_MODE=aws
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
JWT_ACCESS_SECRET=your-production-secret-key
JWT_REFRESH_SECRET=your-production-refresh-secret-key
REDIS_URL=redis://your-redis-host:6379
NODE_ENV=production
PORT=3000
CORS_ALLOWED_ORIGINS=https://your-app-domain.com
```

View File

@ -0,0 +1,175 @@
# Environment Variables Reference
## Quick Reference: `.env` File Format
### For AWS Database (Production)
```env
# =====================================================
# AWS Configuration (REQUIRED for SSM access)
# =====================================================
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=your_aws_access_key_here
AWS_SECRET_ACCESS_KEY=your_aws_secret_key_here
USE_AWS_SSM=true
# =====================================================
# JWT Configuration (REQUIRED)
# =====================================================
JWT_ACCESS_SECRET=your_jwt_access_secret_here
JWT_REFRESH_SECRET=your_jwt_refresh_secret_here
# =====================================================
# Application Configuration
# =====================================================
NODE_ENV=production
PORT=3000
CORS_ALLOWED_ORIGINS=https://your-app-domain.com
```
### For Local Development
```env
# =====================================================
# Local Database (Local Development Only)
# =====================================================
USE_AWS_SSM=false
DATABASE_URL=postgresql://postgres:password@localhost:5432/farmmarket
# =====================================================
# JWT Configuration (REQUIRED)
# =====================================================
JWT_ACCESS_SECRET=your_jwt_access_secret_here
JWT_REFRESH_SECRET=your_jwt_refresh_secret_here
# =====================================================
# Application Configuration
# =====================================================
NODE_ENV=development
PORT=3000
```
## Variable Descriptions
### AWS Configuration
| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| `AWS_REGION` | Yes (for AWS) | `ap-south-1` | AWS region for SSM Parameter Store |
| `AWS_ACCESS_KEY_ID` | Yes (for AWS) | - | AWS access key for SSM access |
| `AWS_SECRET_ACCESS_KEY` | Yes (for AWS) | - | AWS secret key for SSM access |
| `USE_AWS_SSM` | Yes (for AWS) | `false` | Set to `true` to use AWS SSM for DB credentials |
| `DB_USE_READONLY` | No | `false` | Set to `true` to use read-only user |
| `DB_HOST` | No | `db.livingai.app` | Database host (auto-detected) |
| `DB_PORT` | No | `5432` | Database port |
| `DB_NAME` | No | `livingai_test_db` | Database name |
### Database Credentials
⚠️ **IMPORTANT**: Database credentials (`DB_USER`, `DB_PASSWORD`, `DATABASE_URL` with credentials) should **NEVER** be in `.env` files when using AWS SSM.
Credentials are fetched from AWS SSM Parameter Store:
- Read-Write: `/test/livingai/db/app`
- Read-Only: `/test/livingai/db/app/readonly`
### JWT Configuration
| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| `JWT_ACCESS_SECRET` | Yes | - | Secret for signing access tokens |
| `JWT_REFRESH_SECRET` | Yes | - | Secret for signing refresh tokens |
| `JWT_ACCESS_TTL` | No | `15m` | Access token expiration time |
| `JWT_REFRESH_TTL` | No | `7d` | Refresh token expiration time |
### Application Configuration
| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| `NODE_ENV` | No | `development` | Environment: `development`, `production`, `test` |
| `PORT` | No | `3000` | Server port |
| `CORS_ALLOWED_ORIGINS` | Yes (prod) | - | Comma-separated list of allowed origins |
### Redis Configuration (Optional)
| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| `REDIS_URL` | No | - | Full Redis connection URL (e.g., `redis://localhost:6379`) |
| `REDIS_HOST` | No | `localhost` | Redis host |
| `REDIS_PORT` | No | `6379` | Redis port |
| `REDIS_PASSWORD` | No | - | Redis password (optional) |
**Note**: Redis is optional. If not configured, rate limiting uses in-memory storage.
### Local Development Only
| Variable | Required | Description |
|----------|----------|-------------|
| `DATABASE_URL` | Yes (if not using SSM) | PostgreSQL connection string for local database |
## Security Notes
1. **Never commit `.env` files** - Add to `.gitignore`
2. **Use AWS SSM in production** - No database credentials in files
3. **Rotate credentials regularly** - Update SSM parameters periodically
4. **Use environment-specific values** - Different values for dev/test/prod
## Example: Complete Production `.env`
```env
# AWS Configuration
AWS_REGION=ap-south-1
AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
USE_AWS_SSM=true
DB_USE_READONLY=false
# JWT Configuration
JWT_ACCESS_SECRET=your-super-secret-access-key-change-this-in-production
JWT_REFRESH_SECRET=your-super-secret-refresh-key-change-this-in-production
JWT_ACCESS_TTL=15m
JWT_REFRESH_TTL=7d
# Redis Configuration (Optional)
REDIS_URL=redis://your-redis-host:6379
# OR
# REDIS_HOST=your-redis-host
# REDIS_PORT=6379
# REDIS_PASSWORD=your-redis-password
# Application Configuration
NODE_ENV=production
PORT=3000
CORS_ALLOWED_ORIGINS=https://app.example.com,https://api.example.com
```
## Example: Local Development `.env`
```env
# Local Database
USE_AWS_SSM=false
DATABASE_URL=postgresql://postgres:password123@localhost:5433/farmmarket
# JWT Configuration
JWT_ACCESS_SECRET=dev-secret-key
JWT_REFRESH_SECRET=dev-refresh-secret-key
# Redis Configuration (Optional - use local Docker Redis)
REDIS_URL=redis://localhost:6379
# OR start Redis with docker-compose and use:
# REDIS_HOST=localhost
# REDIS_PORT=6379
# Application Configuration
NODE_ENV=development
PORT=3000
```
## Verification
To verify your environment variables are set correctly:
```bash
# Check required variables are set
node -e "require('dotenv').config(); console.log('AWS_REGION:', process.env.AWS_REGION); console.log('USE_AWS_SSM:', process.env.USE_AWS_SSM);"
```

View File

@ -0,0 +1,73 @@
# Fix Database Permissions Error
## Problem
You're getting this error:
```
error: permission denied for schema public
code: '42501'
```
This happens because the `read_write_user` doesn't have CREATE permission on the `public` schema.
## Solution
You need to grant permissions using a **database admin/superuser account**. The `read_write_user` cannot grant permissions to itself.
## Option 1: Using Admin Database URL (Recommended)
1. **Get admin database credentials** from your AWS RDS console or database administrator
- You need a user with superuser privileges or the schema owner
2. **Add to your `.env` file:**
```env
ADMIN_DATABASE_URL=postgresql://admin_user:admin_password@db.livingai.app:5432/livingai_test_db
```
3. **Run the setup script:**
```bash
npm run setup-db
```
## Option 2: Manual SQL (If you have database access)
Connect to your database using any PostgreSQL client (psql, pgAdmin, DBeaver, etc.) as an admin user and run:
```sql
GRANT USAGE ON SCHEMA public TO read_write_user;
GRANT CREATE ON SCHEMA public TO read_write_user;
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
```
## Option 3: AWS RDS Console
If you're using AWS RDS:
1. Go to AWS RDS Console
2. Find your database instance
3. Use "Query Editor" or connect via psql with master credentials
4. Run the SQL commands from Option 2
## Verification
After running the fix, verify permissions:
```sql
SELECT
has_schema_privilege('read_write_user', 'public', 'USAGE') as has_usage,
has_schema_privilege('read_write_user', 'public', 'CREATE') as has_create;
```
Both should return `true`.
## Why This Happens
- PostgreSQL doesn't allow users to grant permissions to themselves
- The `read_write_user` needs CREATE permission to create tables (like `otp_codes`)
- Only a superuser or schema owner can grant these permissions
## After Fixing
1. Restart your application
2. Try creating an OTP - it should work now

View File

@ -0,0 +1,109 @@
# How to Get Admin Database Credentials
## For AWS RDS Databases
### Option 1: AWS RDS Master User (Easiest)
The **master user** created when you set up the RDS instance has superuser privileges.
1. **Go to AWS RDS Console**
- Navigate to: https://console.aws.amazon.com/rds/
- Select your database instance
2. **Find Master Username**
- In the instance details, look for "Master username"
- This is usually `postgres` or a custom name you set
3. **Get Master Password**
- If you forgot it, you can reset it:
- Select your instance → "Modify" → Change master password
- Or use AWS Secrets Manager if configured
4. **Use in .env:**
```env
ADMIN_DATABASE_URL=postgresql://master_username:master_password@db.livingai.app:5432/livingai_test_db
```
### Option 2: AWS RDS Query Editor
If you have AWS Console access:
1. Go to RDS Console → Your Database → "Query Editor"
2. Connect using master credentials
3. Run the SQL commands directly:
```sql
GRANT USAGE ON SCHEMA public TO read_write_user;
GRANT CREATE ON SCHEMA public TO read_write_user;
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
```
### Option 3: Store Admin Credentials in AWS SSM
If you want to automate this:
1. **Store admin credentials in AWS SSM Parameter Store:**
```bash
aws ssm put-parameter \
--name "/test/livingai/db/admin" \
--type "SecureString" \
--value '{"user":"admin_user","password":"admin_password","host":"db.livingai.app","port":"5432","database":"livingai_test_db"}' \
--region ap-south-1
```
2. **Or set the parameter path in .env:**
```env
AWS_SSM_ADMIN_PARAM=/test/livingai/db/admin
```
3. **Run the setup script:**
```bash
npm run setup-db
```
### Option 4: Use psql Command Line
If you have psql installed and network access:
```bash
psql -h db.livingai.app -p 5432 -U master_username -d livingai_test_db
```
Then run:
```sql
GRANT USAGE ON SCHEMA public TO read_write_user;
GRANT CREATE ON SCHEMA public TO read_write_user;
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
```
## For Local Docker Databases
If using local Docker PostgreSQL:
```env
ADMIN_DATABASE_URL=postgresql://postgres:password123@localhost:5433/farmmarket
```
The default postgres user has superuser privileges.
## Security Notes
⚠️ **Important:**
- Never commit admin credentials to git
- Use AWS SSM Parameter Store or AWS Secrets Manager for production
- Rotate admin passwords regularly
- Use the admin account only for setup/maintenance, not for application connections
## After Getting Credentials
1. Add to `.env`:
```env
ADMIN_DATABASE_URL=postgresql://admin:password@host:port/database
```
2. Run setup:
```bash
npm run setup-db
```
3. Restart your application

View File

@ -0,0 +1,96 @@
# Quick Fix: Database Permissions
## Current Situation
✅ You can fetch credentials from AWS SSM:
- `read_only_user` - Read-only access
- `read_write_user` - Read-write access (but can't grant permissions to itself)
❌ You need **admin/master user** credentials to grant CREATE permission
## Solution: Get AWS RDS Master User Credentials
### Step 1: Find Master User in AWS RDS
1. Go to **AWS RDS Console**: https://console.aws.amazon.com/rds/
2. Click on your database instance (`db.livingai.app`)
3. Look for **"Master username"** in the instance details
- Usually it's `postgres` or a custom name you set during creation
### Step 2: Get or Reset Master Password
**Option A: You know the password**
- Use it directly
**Option B: You forgot the password**
1. Select your RDS instance
2. Click **"Modify"**
3. Change the master password
4. Apply changes (may require a maintenance window)
### Step 3: Store Admin Credentials in AWS SSM
Run this command in your farm-auth-service directory:
```bash
npm run store-admin
```
When prompted, enter:
- **Username**: Your RDS master username (e.g., `postgres`)
- **Password**: Your RDS master password
- **Host**: `db.livingai.app` (default)
- **Port**: `5432` (default)
- **Database**: `livingai_test_db` (default)
This will store credentials at: `/test/livingai/db/admin`
### Step 4: Run Setup
```bash
npm run setup-db
```
The script will automatically:
1. Find admin credentials from SSM
2. Grant CREATE permission to `read_write_user`
3. Create the `uuid-ossp` extension
4. Verify permissions
### Step 5: Restart Application
```bash
npm start
```
## Alternative: Manual SQL
If you prefer to run SQL directly:
1. Connect to your database using any PostgreSQL client with master credentials
2. Run:
```sql
GRANT USAGE ON SCHEMA public TO read_write_user;
GRANT CREATE ON SCHEMA public TO read_write_user;
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
```
## Why This Is Needed
PostgreSQL security model:
- Users cannot grant permissions to themselves
- Only superusers or schema owners can grant CREATE permission
- The `read_write_user` needs CREATE permission to create tables like `otp_codes`
## Verification
After setup, verify permissions:
```sql
SELECT
has_schema_privilege('read_write_user', 'public', 'USAGE') as has_usage,
has_schema_privilege('read_write_user', 'public', 'CREATE') as has_create;
```
Both should return `true`.

View File

@ -0,0 +1,288 @@
# Redis Setup Guide
Redis is used for distributed rate limiting and OTP tracking. It's **optional** - the service will use in-memory rate limiting if Redis is not available.
## Quick Start
### Option 1: Using Docker Compose (Recommended for Local Development)
Redis is already configured in `db/farmmarket-db/docker-compose.yml`. Start it with:
```bash
cd db/farmmarket-db
docker-compose up -d redis
```
Or start all services (PostgreSQL + Redis):
```bash
docker-compose up -d
```
Then add to your `.env` file:
```env
REDIS_URL=redis://localhost:6379
```
### Option 2: Install Redis Locally
#### macOS (using Homebrew)
```bash
brew install redis
brew services start redis
```
#### Ubuntu/Debian
```bash
sudo apt update
sudo apt install redis-server
sudo systemctl start redis-server
sudo systemctl enable redis-server
```
#### Windows
Download and install from: https://github.com/microsoftarchive/redis/releases
Then add to your `.env` file:
```env
REDIS_URL=redis://localhost:6379
```
### Option 3: Use Cloud Redis (Production)
For production, use a managed Redis service:
- **AWS ElastiCache**: `redis://your-elasticache-endpoint:6379`
- **Redis Cloud**: `redis://user:password@redis-cloud-host:6379`
- **Azure Cache for Redis**: `redis://your-cache.redis.cache.windows.net:6380?ssl=true`
Add to your `.env` file:
```env
REDIS_URL=redis://your-redis-host:6379
# OR with password
REDIS_URL=redis://:password@your-redis-host:6379
```
## Environment Variables
### Using REDIS_URL (Recommended)
Single connection string with all details:
```env
# Without password
REDIS_URL=redis://localhost:6379
# With password
REDIS_URL=redis://:password@localhost:6379
# With username and password
REDIS_URL=redis://username:password@localhost:6379
# With SSL (AWS ElastiCache, etc.)
REDIS_URL=rediss://your-redis-host:6379
```
### Using REDIS_HOST and REDIS_PORT
Separate host and port:
```env
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=your_password # Optional
```
## Verification
### Test Redis Connection
Start your application and look for:
```
✅ Redis Client: Ready
```
If Redis is not available, you'll see:
```
⚠️ Redis not available. Rate limiting will use in-memory fallback.
```
### Test Redis Manually
```bash
# Using redis-cli (if installed)
redis-cli ping
# Should return: PONG
# Test connection with password
redis-cli -h localhost -p 6379 -a your_password ping
```
## Docker Compose Setup
The `docker-compose.yml` file includes Redis:
```yaml
services:
redis:
image: redis:7-alpine
container_name: farmmarket-redis
restart: always
ports:
- "6379:6379"
volumes:
- redis_data:/data
command: redis-server --appendonly yes
```
### Start Redis Only
```bash
cd db/farmmarket-db
docker-compose up -d redis
```
### Stop Redis
```bash
docker-compose stop redis
```
### View Redis Logs
```bash
docker-compose logs -f redis
```
## Production Recommendations
### 1. Use Managed Redis Service
For production, use a managed Redis service:
- **AWS ElastiCache** (recommended for AWS deployments)
- **Redis Cloud**
- **Azure Cache for Redis**
- **Google Cloud Memorystore**
### 2. Enable Redis Persistence
If using your own Redis instance:
- Enable AOF (Append-Only File) persistence
- Configure regular snapshots (RDB)
- Set up replication for high availability
### 3. Security
- Use password authentication in production
- Enable SSL/TLS for connections
- Restrict network access (firewall/VPC)
- Use IAM roles (for AWS ElastiCache)
### Example: AWS ElastiCache Configuration
```env
# AWS ElastiCache endpoint
REDIS_URL=redis://your-cluster.xxxxx.cache.amazonaws.com:6379
# With password
REDIS_URL=redis://:your-password@your-cluster.xxxxx.cache.amazonaws.com:6379
# With SSL (if enabled)
REDIS_URL=rediss://your-cluster.xxxxx.cache.amazonaws.com:6379
```
## Benefits of Using Redis
### Without Redis (In-Memory)
- ✅ Simple setup
- ✅ No external dependencies
- ❌ Rate limits reset on server restart
- ❌ Not shared across multiple server instances
- ❌ Limited scalability
### With Redis
- ✅ Persistent rate limits (survive restarts)
- ✅ Shared across multiple server instances
- ✅ Better scalability
- ✅ Can handle high traffic
- ❌ Requires Redis setup and maintenance
## Troubleshooting
### Connection Refused
**Error**: `ECONNREFUSED` or `Connection refused`
**Solutions**:
1. Check if Redis is running: `docker-compose ps` or `redis-cli ping`
2. Verify Redis port is correct (default: 6379)
3. Check firewall settings
4. Verify REDIS_URL or REDIS_HOST is correct
### Authentication Failed
**Error**: `NOAUTH Authentication required`
**Solutions**:
1. Add password to REDIS_URL: `redis://:password@host:6379`
2. Or set `REDIS_PASSWORD` environment variable
3. Verify password is correct
### Timeout Errors
**Error**: Connection timeout
**Solutions**:
1. Check network connectivity
2. Verify Redis host and port are correct
3. Check if Redis is behind a firewall
4. Increase connection timeout if needed
## Performance Tuning
### Connection Pooling
The Redis client automatically manages connections. Default settings are suitable for most use cases.
### Memory Optimization
If using local Redis:
```bash
# Set max memory (e.g., 256MB)
redis-cli CONFIG SET maxmemory 256mb
redis-cli CONFIG SET maxmemory-policy allkeys-lru
```
## Monitoring
### Check Redis Stats
```bash
redis-cli INFO stats
```
### Monitor Commands
```bash
redis-cli MONITOR
```
### Check Memory Usage
```bash
redis-cli INFO memory
```
## Next Steps
1. Set up Redis using one of the options above
2. Add Redis configuration to your `.env` file
3. Restart your application
4. Verify connection with `✅ Redis Client: Ready` message
5. Test rate limiting to ensure it's working

128
example.env Normal file
View File

@ -0,0 +1,128 @@
# =====================================================
# FARM AUTH SERVICE - ENVIRONMENT CONFIGURATION
# =====================================================
# Copy this file to .env and update with your actual values
# DO NOT commit .env file to git (it's in .gitignore)
# =====================================================
# =====================================================
# DATABASE MODE SWITCH
# =====================================================
# Options: 'local' or 'aws'
# - 'local': Uses DATABASE_URL for local Docker PostgreSQL
# - 'aws': Uses AWS SSM Parameter Store for AWS PostgreSQL
# =====================================================
DATABASE_MODE=aws
# =====================================================
# LOCAL DATABASE CONFIGURATION
# =====================================================
# Only used when DATABASE_MODE=local
# Format: postgresql://user:password@host:port/database
DATABASE_URL=postgresql://postgres:password123@localhost:5433/farmmarket
# =====================================================
# AWS DATABASE CONFIGURATION
# =====================================================
# Only used when DATABASE_MODE=aws
# These credentials are used ONLY to access AWS SSM Parameter Store
# Database credentials are fetched from SSM at runtime - NOT stored here
# AWS Region for SSM Parameter Store
AWS_REGION=ap-south-1
# AWS Access Key (for SSM access only)
AWS_ACCESS_KEY_ID=your_aws_access_key_here
# AWS Secret Key (for SSM access only)
AWS_SECRET_ACCESS_KEY=your_aws_secret_key_here
# Optional: Control which database user to use
# false = use read_write_user from /test/livingai/db/app (default for auth service)
# true = use read_only_user from /test/livingai/db/app/readonly
DB_USE_READONLY=false
# Optional: Database connection settings (auto-detected if not set)
# DB_HOST=db.livingai.app
# DB_PORT=5432
# DB_NAME=livingai_test_db
# =====================================================
# JWT Configuration (REQUIRED for both modes)
# =====================================================
# These secrets are used to sign and verify JWT tokens
# Generate strong random secrets for production
JWT_ACCESS_SECRET=add74b258202057143382e8ee9ecc24a1114eddd3da5db79f3d29d24d7083043
JWT_REFRESH_SECRET=94a09772321fa15dc41c6c1e07d3b97a5b50d770e29f2ade47e8de5c571a611d
# Optional JWT settings
JWT_ACCESS_TTL=15m
JWT_REFRESH_TTL=7d
# =====================================================
# Redis Configuration (Optional - for rate limiting)
# =====================================================
# Redis is optional - if not set, rate limiting uses in-memory storage
# For local development with Docker Compose:
REDIS_URL=redis://localhost:6379
# OR use separate host/port:
# REDIS_HOST=localhost
# REDIS_PORT=6379
# REDIS_PASSWORD=your_redis_password
# For production (AWS ElastiCache, etc.):
# REDIS_URL=redis://your-redis-host:6379
# REDIS_URL=redis://:password@your-redis-host:6379
# =====================================================
# Application Configuration
# =====================================================
# Environment: development, production, test
NODE_ENV=development
# Server port
PORT=3000
# =====================================================
# CORS Configuration
# =====================================================
# For local development, you can leave empty (allows all origins)
# For production, REQUIRED - comma-separated list of allowed origins
CORS_ALLOWED_ORIGINS=http://localhost:3000
# Production example:
# CORS_ALLOWED_ORIGINS=https://app.example.com,https://api.example.com
# =====================================================
# Twilio Configuration (Optional - for SMS OTP)
# =====================================================
# Uncomment and fill in if using Twilio for SMS OTP
# TWILIO_ACCOUNT_SID=your_twilio_account_sid
# TWILIO_AUTH_TOKEN=your_twilio_auth_token
# TWILIO_PHONE_NUMBER=+1234567890
# =====================================================
# SECURITY NOTES
# =====================================================
# 1. DO NOT commit this file - it's already in .gitignore
# 2. For AWS mode: Database credentials are fetched from SSM Parameter Store
# SSM Parameter Paths:
# - Read-Write User: /test/livingai/db/app
# - Read-Only User: /test/livingai/db/app/readonly
#
# SSM Parameter Format (JSON):
# {
# "user": "read_write_user",
# "password": "secure_password_here",
# "host": "db.livingai.app",
# "port": "5432",
# "database": "livingai_test_db"
# }
#
# 3. For local mode: Use DATABASE_URL with local PostgreSQL
# Start PostgreSQL with: docker-compose up -d postgres (from db/farmmarket-db/)
#
# 4. Replace all placeholder values with your actual credentials
# 5. Use strong random secrets for JWT_ACCESS_SECRET and JWT_REFRESH_SECRET

View File

@ -0,0 +1,261 @@
#!/usr/bin/env node
/**
* Database Permissions Setup Script
*
* This script attempts to grant CREATE permission on the public schema
* to the read_write_user. It should be run as a database admin/superuser.
*
* Usage:
* node scripts/setup-db-permissions.js
*
* Or with admin credentials:
* DATABASE_URL=postgresql://admin:password@host:port/database node scripts/setup-db-permissions.js
*/
require('dotenv').config();
const { Pool } = require('pg');
const { getDbCredentials, buildDatabaseConfig } = require('../src/utils/awsSsm');
const config = require('../src/config');
async function setupPermissions() {
let pool;
let adminPool = null;
try {
console.log('🔧 Setting up database permissions...\n');
// Check if admin DATABASE_URL is provided
let adminDatabaseUrl = process.env.ADMIN_DATABASE_URL;
// Try to get admin credentials from AWS SSM if available
if (!adminDatabaseUrl) {
try {
const { getDbCredentials, buildDatabaseConfig } = require('../src/utils/awsSsm');
const config = require('../src/config');
const ssmClient = require('../src/utils/awsSsm').getSsmClient();
const { GetParameterCommand } = require('@aws-sdk/client-ssm');
// Check multiple common admin parameter paths
const adminParamPaths = [
process.env.AWS_SSM_ADMIN_PARAM, // Custom path from env
'/test/livingai/db/admin', // Standard admin path
'/test/livingai/db/master', // Alternative: master user
'/test/livingai/db/root', // Alternative: root user
'/test/livingai/db/postgres', // Alternative: postgres user
].filter(Boolean); // Remove undefined values
console.log('🔍 Checking AWS SSM for admin credentials...');
for (const adminParamPath of adminParamPaths) {
try {
const response = await ssmClient.send(new GetParameterCommand({
Name: adminParamPath,
WithDecryption: true,
}));
const adminCreds = JSON.parse(response.Parameter.Value);
const { buildDatabaseUrl } = require('../src/utils/awsSsm');
// Use credentials from SSM or fallback to environment/defaults
adminDatabaseUrl = buildDatabaseUrl({
user: adminCreds.user || adminCreds.username,
password: adminCreds.password,
host: adminCreds.host || process.env.DB_HOST || 'db.livingai.app',
port: adminCreds.port || process.env.DB_PORT || '5432',
database: adminCreds.database || adminCreds.dbname || process.env.DB_NAME || 'livingai_test_db',
});
console.log(`✅ Found admin credentials in AWS SSM: ${adminParamPath}`);
console.log(` User: ${adminCreds.user || adminCreds.username}`);
break; // Found credentials, stop searching
} catch (ssmError) {
// Parameter not found, try next path
if (ssmError.name === 'ParameterNotFound') {
continue;
} else {
console.log(`⚠️ Error checking ${adminParamPath}: ${ssmError.message}`);
}
}
}
if (!adminDatabaseUrl) {
console.log(' No admin credentials found in AWS SSM Parameter Store');
console.log(' You can store admin credentials at: /test/livingai/db/admin');
}
} catch (error) {
// SSM check failed, continue with other methods
console.log(`⚠️ Could not check AWS SSM: ${error.message}`);
}
}
if (adminDatabaseUrl) {
console.log('📝 Using admin DATABASE_URL for setup...');
adminPool = new Pool({ connectionString: adminDatabaseUrl });
} else {
// Try to use the application database connection
console.log('📝 Using application database connection...');
const databaseMode = config.getDatabaseMode();
if (databaseMode === 'aws') {
const credentials = await getDbCredentials(false);
const poolConfig = buildDatabaseConfig(credentials);
pool = new Pool(poolConfig);
console.log(` Connected as: ${poolConfig.user}@${poolConfig.host}/${poolConfig.database}`);
} else {
if (!config.databaseUrl) {
throw new Error('DATABASE_URL not set. Set DATABASE_URL or ADMIN_DATABASE_URL in .env');
}
pool = new Pool({ connectionString: config.databaseUrl });
console.log(` Connected using DATABASE_URL`);
}
}
const dbPool = adminPool || pool;
// Test connection
await dbPool.query('SELECT 1');
console.log('✅ Database connection established\n');
// Get current user and check if superuser
const userResult = await dbPool.query(`
SELECT
current_user,
current_database(),
(SELECT usesuper FROM pg_user WHERE usename = current_user) as is_superuser
`);
const currentUser = userResult.rows[0].current_user;
const currentDb = userResult.rows[0].current_database;
const isSuperuser = userResult.rows[0].is_superuser;
console.log(`👤 Current user: ${currentUser}`);
console.log(`📊 Database: ${currentDb}`);
console.log(`🔑 Superuser: ${isSuperuser ? 'Yes ✅' : 'No ❌'}\n`);
// Check if read_write_user exists
const userCheck = await dbPool.query(
"SELECT 1 FROM pg_roles WHERE rolname = 'read_write_user'"
);
if (userCheck.rows.length === 0) {
console.log('⚠️ WARNING: read_write_user does not exist in the database.');
console.log(' You may need to create it first, or use a different user name.\n');
}
// Warn if trying to grant to self (PostgreSQL allows this but it's a no-op)
if (currentUser === 'read_write_user' && !isSuperuser) {
console.log('⚠️ WARNING: You are connected as read_write_user trying to grant permissions to itself.');
console.log(' PostgreSQL does not allow users to grant permissions to themselves.');
console.log(' You need to connect as a superuser or schema owner.\n');
console.log('📋 To fix this:');
console.log(' 1. Get admin/superuser database credentials');
console.log(' 2. Set ADMIN_DATABASE_URL in .env:');
console.log(' ADMIN_DATABASE_URL=postgresql://admin:password@host:port/database');
console.log(' 3. Run this script again\n');
throw new Error('Cannot grant permissions to self. Need admin credentials.');
}
// Attempt to grant permissions
console.log('🔐 Granting permissions...\n');
try {
// Grant USAGE on schema
await dbPool.query('GRANT USAGE ON SCHEMA public TO read_write_user');
console.log('✅ Granted USAGE on schema public to read_write_user');
// Grant CREATE on schema
await dbPool.query('GRANT CREATE ON SCHEMA public TO read_write_user');
console.log('✅ Granted CREATE on schema public to read_write_user');
// Set default privileges
await dbPool.query(`
ALTER DEFAULT PRIVILEGES IN SCHEMA public
GRANT SELECT, INSERT, UPDATE, DELETE ON TABLES TO read_write_user
`);
console.log('✅ Set default privileges for tables');
await dbPool.query(`
ALTER DEFAULT PRIVILEGES IN SCHEMA public
GRANT ALL ON SEQUENCES TO read_write_user
`);
console.log('✅ Set default privileges for sequences');
// Try to create extension (may fail if not superuser)
try {
await dbPool.query('CREATE EXTENSION IF NOT EXISTS "uuid-ossp"');
console.log('✅ Created uuid-ossp extension');
} catch (extError) {
if (extError.code === '42501') {
console.log('⚠️ Could not create uuid-ossp extension (requires superuser)');
console.log(' Extension may need to be created manually by database admin');
} else {
throw extError;
}
}
// Verify permissions were actually granted
console.log('\n🔍 Verifying permissions...');
const permCheck = await dbPool.query(`
SELECT
has_schema_privilege('read_write_user', 'public', 'USAGE') as has_usage,
has_schema_privilege('read_write_user', 'public', 'CREATE') as has_create
`);
const hasUsage = permCheck.rows[0].has_usage;
const hasCreate = permCheck.rows[0].has_create;
if (!hasUsage || !hasCreate) {
console.log('❌ WARNING: Permissions verification failed!');
console.log(` USAGE: ${hasUsage ? '✅' : '❌'}`);
console.log(` CREATE: ${hasCreate ? '✅' : '❌'}`);
console.log('\n This usually means you need admin credentials to grant permissions.');
throw new Error('Permissions were not granted. Need admin/superuser access.');
}
console.log('✅ Permissions verified:');
console.log(` USAGE: ${hasUsage ? '✅' : '❌'}`);
console.log(` CREATE: ${hasCreate ? '✅' : '❌'}`);
console.log('\n✅ Database permissions setup completed successfully!');
console.log('\n📋 Next steps:');
console.log(' 1. Restart your application');
console.log(' 2. Try creating an OTP - it should work now');
} catch (permError) {
if (permError.code === '42501') {
console.error('\n❌ ERROR: Permission denied. You need to run this script as a database admin/superuser.');
console.error('\n📋 To fix this:');
console.error(' 1. Connect to your database as an admin/superuser');
console.error(' 2. Run these SQL commands:');
console.error('\n GRANT USAGE ON SCHEMA public TO read_write_user;');
console.error(' GRANT CREATE ON SCHEMA public TO read_write_user;');
console.error(' CREATE EXTENSION IF NOT EXISTS "uuid-ossp";');
console.error('\n 3. Or set ADMIN_DATABASE_URL in .env with admin credentials:');
console.error(' ADMIN_DATABASE_URL=postgresql://admin:password@host:port/database');
process.exit(1);
} else {
throw permError;
}
}
} catch (error) {
console.error('\n❌ Error setting up database permissions:');
console.error(` ${error.message}`);
console.error('\n📋 Manual setup instructions:');
console.error(' Connect to your database as admin and run:');
console.error(' GRANT USAGE ON SCHEMA public TO read_write_user;');
console.error(' GRANT CREATE ON SCHEMA public TO read_write_user;');
console.error(' CREATE EXTENSION IF NOT EXISTS "uuid-ossp";');
console.error('\n📖 For detailed instructions on getting admin credentials, see:');
console.error(' docs/getting-started/GET_ADMIN_DB_CREDENTIALS.md');
process.exit(1);
} finally {
if (pool) await pool.end();
if (adminPool) await adminPool.end();
}
}
// Run the setup
setupPermissions().catch((error) => {
console.error('Fatal error:', error);
process.exit(1);
});

View File

@ -0,0 +1,135 @@
#!/usr/bin/env node
/**
* Store Admin Database Credentials in AWS SSM Parameter Store
*
* This script helps you store admin database credentials in AWS SSM
* so the setup script can automatically use them.
*
* Usage:
* node scripts/store-admin-credentials.js
*
* Or provide credentials via environment variables:
* ADMIN_DB_USER=postgres ADMIN_DB_PASSWORD=password node scripts/store-admin-credentials.js
*/
require('dotenv').config();
const readline = require('readline');
const { SSMClient, PutParameterCommand } = require('@aws-sdk/client-ssm');
// AWS Configuration
const REGION = process.env.AWS_REGION || 'ap-south-1';
const ACCESS_KEY = process.env.AWS_ACCESS_KEY_ID;
const SECRET_KEY = process.env.AWS_SECRET_ACCESS_KEY;
if (!ACCESS_KEY || !SECRET_KEY) {
console.error('❌ Error: AWS credentials required');
console.error(' Set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in .env');
process.exit(1);
}
const ssmClient = new SSMClient({
region: REGION,
credentials: {
accessKeyId: ACCESS_KEY,
secretAccessKey: SECRET_KEY,
},
});
// Default values from environment or existing app credentials
const DB_HOST = process.env.DB_HOST || 'db.livingai.app';
const DB_PORT = process.env.DB_PORT || '5432';
const DB_NAME = process.env.DB_NAME || 'livingai_test_db';
const ADMIN_PARAM_PATH = process.env.AWS_SSM_ADMIN_PARAM || '/test/livingai/db/admin';
const rl = readline.createInterface({
input: process.stdin,
output: process.stdout,
});
function question(prompt) {
return new Promise((resolve) => {
rl.question(prompt, resolve);
});
}
async function storeAdminCredentials() {
try {
console.log('🔐 Store Admin Database Credentials in AWS SSM\n');
console.log(`📋 Parameter Path: ${ADMIN_PARAM_PATH}`);
console.log(`🌍 Region: ${REGION}\n`);
// Get admin credentials
let adminUser = process.env.ADMIN_DB_USER;
let adminPassword = process.env.ADMIN_DB_PASSWORD;
let adminHost = process.env.ADMIN_DB_HOST || DB_HOST;
let adminPort = process.env.ADMIN_DB_PORT || DB_PORT;
let adminDatabase = process.env.ADMIN_DB_NAME || DB_NAME;
if (!adminUser) {
adminUser = await question('Enter admin database username (e.g., postgres): ');
}
if (!adminPassword) {
adminPassword = await question('Enter admin database password: ');
// Hide password input
process.stdout.write('\x1B[1A\x1B[2K'); // Move up and clear line
}
const useDefaults = await question(`\nUse default values? (Host: ${adminHost}, Port: ${adminPort}, Database: ${adminDatabase}) [Y/n]: `);
if (useDefaults.toLowerCase() === 'n') {
adminHost = await question(`Database host [${adminHost}]: `) || adminHost;
adminPort = await question(`Database port [${adminPort}]: `) || adminPort;
adminDatabase = await question(`Database name [${adminDatabase}]: `) || adminDatabase;
}
// Create credentials object
const credentials = {
user: adminUser,
password: adminPassword,
host: adminHost,
port: adminPort,
database: adminDatabase,
};
console.log('\n📤 Storing credentials in AWS SSM...');
console.log(` User: ${adminUser}`);
console.log(` Host: ${adminHost}:${adminPort}`);
console.log(` Database: ${adminDatabase}`);
// Store in SSM
const command = new PutParameterCommand({
Name: ADMIN_PARAM_PATH,
Type: 'SecureString',
Value: JSON.stringify(credentials),
Description: 'Admin database credentials for farm-auth-service setup',
Overwrite: true, // Allow overwriting existing parameter
});
await ssmClient.send(command);
console.log('\n✅ Admin credentials stored successfully!');
console.log(`\n📋 Next steps:`);
console.log(` 1. Run: npm run setup-db`);
console.log(` 2. The setup script will automatically use these credentials`);
console.log(`\n💡 To use a different parameter path, set AWS_SSM_ADMIN_PARAM in .env`);
} catch (error) {
console.error('\n❌ Error storing credentials:');
if (error.name === 'AccessDeniedException') {
console.error(' Permission denied. Ensure your AWS user has permission to write to SSM Parameter Store.');
console.error(` Required permission: ssm:PutParameter for ${ADMIN_PARAM_PATH}`);
} else {
console.error(` ${error.message}`);
}
process.exit(1);
} finally {
rl.close();
}
}
// Run the script
storeAdminCredentials().catch((error) => {
console.error('Fatal error:', error);
process.exit(1);
});

View File

@ -28,12 +28,16 @@ async function initializeDb() {
// Try to get credentials from AWS SSM Parameter Store
// Only if USE_AWS_SSM is explicitly set to 'true'
let credentials = null;
if (process.env.USE_AWS_SSM === 'true' || process.env.USE_AWS_SSM === '1') {
try {
const credentials = await getDbCredentials();
credentials = await getDbCredentials();
if (credentials) {
connectionString = buildDatabaseUrl(credentials);
console.log('✅ Using database credentials from AWS SSM Parameter Store');
console.log(` 📊 Database: ${credentials.dbname || credentials.database || 'unknown'}`);
console.log(` 🌐 Hostname: ${credentials.host || 'unknown'}`);
console.log(` 👤 User: ${credentials.user || 'unknown'}`);
} else {
console.log('⚠️ AWS SSM not available, using DATABASE_URL from environment');
}
@ -54,10 +58,32 @@ async function initializeDb() {
process.exit(-1);
});
// Test connection
// Test connection and log current user
try {
await pool.query('SELECT 1');
console.log('✅ Database connection established');
// Try to get connection details (may fail if user lacks permissions)
try {
const userResult = await pool.query('SELECT current_user, current_database(), inet_server_addr() as server_ip');
const dbInfo = userResult.rows[0];
console.log(` 👤 Connected as: ${dbInfo.current_user}`);
console.log(` 📊 Database: ${dbInfo.current_database}`);
// Show hostname from SSM (what we connected to) and server IP (actual server)
if (credentials && credentials.host) {
console.log(` 🌐 Hostname (from SSM): ${credentials.host}`);
}
if (dbInfo.server_ip) {
console.log(` 🖥️ Server IP: ${dbInfo.server_ip}`);
}
} catch (infoErr) {
// If we can't get user info, at least show what we know from credentials
if (credentials) {
console.log(` 👤 User: ${credentials.user || 'unknown'}`);
console.log(` 📊 Database: ${credentials.dbname || credentials.database || 'unknown'}`);
console.log(` 🌐 Hostname (from SSM): ${credentials.host || 'unknown'}`);
}
}
} catch (err) {
console.error('❌ Database connection failed:', err.message);
throw err;

View File

@ -19,8 +19,7 @@
*
* Sensitive tables (always logged if enabled):
* - users
* - otp_codes
* - otp_requests
* - otp_requests (primary OTP table from final_db.sql)
* - refresh_tokens
* - auth_audit
*/
@ -33,8 +32,7 @@ const LOG_LEVEL = process.env.DB_ACCESS_LOG_LEVEL || 'sensitive'; // 'all' or 's
// Tables that contain sensitive data (always logged)
const SENSITIVE_TABLES = [
'users',
'otp_codes',
'otp_requests',
'otp_requests', // Primary OTP table (from final_db.sql)
'refresh_tokens',
'auth_audit',
'user_devices',

View File

@ -11,8 +11,6 @@ const OTP_TTL_SECONDS = parseInt(process.env.OTP_TTL_SECONDS || '120', 10); // 2
const OTP_EXPIRY_MS = OTP_TTL_SECONDS * 1000;
const MAX_OTP_ATTEMPTS = Number(process.env.OTP_VERIFY_MAX_ATTEMPTS || process.env.OTP_MAX_ATTEMPTS || 5);
let otpTableReadyPromise;
// === SECURITY HARDENING: TIMING ATTACK PROTECTION ===
// Pre-computed dummy hash for constant-time comparison when OTP not found
// This ensures bcrypt.compare() always executes with similar timing
@ -27,44 +25,18 @@ async function getDummyOtpHash() {
return dummyOtpHash;
}
function ensureOtpCodesTable() {
if (!otpTableReadyPromise) {
otpTableReadyPromise = db.query(`
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
CREATE TABLE IF NOT EXISTS otp_codes (
id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
phone_number VARCHAR(20) NOT NULL,
otp_hash VARCHAR(255) NOT NULL,
expires_at TIMESTAMPTZ NOT NULL,
attempt_count INT NOT NULL DEFAULT 0,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX IF NOT EXISTS idx_otp_codes_phone ON otp_codes (phone_number);
CREATE INDEX IF NOT EXISTS idx_otp_codes_expires ON otp_codes (expires_at);
DO $$
BEGIN
IF EXISTS (
SELECT 1 FROM information_schema.columns
WHERE table_name = 'otp_codes' AND column_name = 'code'
) THEN
ALTER TABLE otp_codes RENAME COLUMN code TO otp_hash;
END IF;
IF EXISTS (
SELECT 1 FROM information_schema.columns
WHERE table_name = 'otp_codes' AND column_name = 'verified_at'
) THEN
ALTER TABLE otp_codes DROP COLUMN verified_at;
END IF;
IF NOT EXISTS (
SELECT 1 FROM information_schema.columns
WHERE table_name = 'otp_codes' AND column_name = 'attempt_count'
) THEN
ALTER TABLE otp_codes ADD COLUMN attempt_count INT NOT NULL DEFAULT 0;
END IF;
END $$;
`);
/**
* Extract country code from phone number (e.g., +91 from +919876543210)
* @param {string} phoneNumber - Phone number in E.164 format
* @returns {string} Country code (default: '+91')
*/
function extractCountryCode(phoneNumber) {
if (!phoneNumber || !phoneNumber.startsWith('+')) {
return '+91'; // Default to India
}
return otpTableReadyPromise;
// Extract country code (typically 1-3 digits after +)
const match = phoneNumber.match(/^\+(\d{1,3})/);
return match ? `+${match[1]}` : '+91';
}
/**
@ -80,7 +52,6 @@ function generateOtpCode() {
* @returns {Promise<{code: string}>} - The generated OTP code
*/
async function createOtp(phoneNumber) {
await ensureOtpCodesTable();
const code = generateOtpCode();
const expiresAt = new Date(Date.now() + OTP_EXPIRY_MS);
const otpHash = await bcrypt.hash(code, 10);
@ -88,20 +59,23 @@ async function createOtp(phoneNumber) {
// === SECURITY HARDENING: FIELD-LEVEL ENCRYPTION ===
// Encrypt phone number before storing
const encryptedPhone = encryptPhoneNumber(phoneNumber);
const countryCode = extractCountryCode(phoneNumber);
// === ADDED FOR 2-MIN OTP VALIDITY & NO-RESEND ===
// Delete any existing OTPs for this phone number
// Mark existing OTPs as deleted for this phone number
// Note: We search by encrypted phone to handle both encrypted and plaintext (backward compatibility)
await db.query(
'DELETE FROM otp_codes WHERE phone_number = $1 OR phone_number = $2',
`UPDATE otp_requests
SET deleted = TRUE
WHERE (phone_number = $1 OR phone_number = $2) AND deleted = FALSE`,
[encryptedPhone, phoneNumber] // Try both encrypted and plaintext for backward compatibility
);
// Insert new OTP with encrypted phone number
// Insert new OTP with encrypted phone number (using otp_requests table from final_db.sql)
await db.query(
`INSERT INTO otp_codes (phone_number, otp_hash, expires_at, attempt_count)
VALUES ($1, $2, $3, 0)`,
[encryptedPhone, otpHash, expiresAt]
`INSERT INTO otp_requests (phone_number, country_code, otp_hash, expires_at, attempt_count, deleted)
VALUES ($1, $2, $3, $4, 0, FALSE)`,
[encryptedPhone, countryCode, otpHash, expiresAt]
);
// === ADDED FOR 2-MIN OTP VALIDITY & NO-RESEND ===
@ -129,16 +103,16 @@ async function createOtp(phoneNumber) {
* - All code paths take similar execution time regardless of outcome
*/
async function verifyOtp(phoneNumber, code) {
await ensureOtpCodesTable();
// === SECURITY HARDENING: FIELD-LEVEL ENCRYPTION ===
// Encrypt phone number for search (handles both encrypted and plaintext for backward compatibility)
const encryptedPhone = encryptPhoneNumber(phoneNumber);
const result = await db.query(
`SELECT id, otp_hash, expires_at, attempt_count, phone_number
FROM otp_codes
WHERE phone_number = $1 OR phone_number = $2
`SELECT id, otp_hash, expires_at, attempt_count, phone_number, consumed_at
FROM otp_requests
WHERE (phone_number = $1 OR phone_number = $2)
AND deleted = FALSE
AND consumed_at IS NULL
ORDER BY created_at DESC
LIMIT 1`,
[encryptedPhone, phoneNumber] // Try both encrypted and plaintext for backward compatibility
@ -189,14 +163,20 @@ async function verifyOtp(phoneNumber, code) {
}
if (isExpired) {
// OTP expired - delete and return (but we already did bcrypt.compare for constant time)
await db.query('DELETE FROM otp_codes WHERE id = $1', [otpRecord.id]);
// OTP expired - mark as consumed (but we already did bcrypt.compare for constant time)
await db.query(
'UPDATE otp_requests SET consumed_at = NOW() WHERE id = $1',
[otpRecord.id]
);
return { ok: false, reason: 'expired' };
}
if (isMaxAttempts) {
// Max attempts exceeded - delete and return (but we already did bcrypt.compare for constant time)
await db.query('DELETE FROM otp_codes WHERE id = $1', [otpRecord.id]);
// Max attempts exceeded - mark as consumed (but we already did bcrypt.compare for constant time)
await db.query(
'UPDATE otp_requests SET consumed_at = NOW() WHERE id = $1',
[otpRecord.id]
);
return { ok: false, reason: 'max_attempts' };
}
@ -205,27 +185,34 @@ async function verifyOtp(phoneNumber, code) {
// === ADDED FOR OTP ATTEMPT LIMIT ===
// Increment attempt count
await db.query(
'UPDATE otp_codes SET attempt_count = attempt_count + 1 WHERE id = $1',
'UPDATE otp_requests SET attempt_count = attempt_count + 1 WHERE id = $1',
[otpRecord.id]
);
return { ok: false, reason: 'invalid' };
}
// === ADDED FOR 2-MIN OTP VALIDITY & NO-RESEND ===
// Delete OTP once verified to prevent reuse
// Also clears the active OTP marker (via deletion, Redis TTL will handle cleanup)
await db.query('DELETE FROM otp_codes WHERE id = $1', [otpRecord.id]);
// Mark OTP as consumed once verified to prevent reuse
// Using consumed_at instead of deleting (matches final_db.sql schema)
await db.query(
'UPDATE otp_requests SET consumed_at = NOW() WHERE id = $1',
[otpRecord.id]
);
return { ok: true };
}
/**
* Clean up expired OTPs (can be called periodically)
* Marks expired OTPs as consumed instead of deleting (matches final_db.sql schema)
*/
async function cleanupExpiredOtps() {
await ensureOtpCodesTable();
await db.query(
'DELETE FROM otp_codes WHERE expires_at < NOW()'
`UPDATE otp_requests
SET consumed_at = NOW()
WHERE expires_at < NOW()
AND consumed_at IS NULL
AND deleted = FALSE`
);
}