mirror of
https://github.com/pacnpal/thrilltrack-explorer.git
synced 2025-12-20 11:51:14 -05:00
Create shared rateLimitConfig.ts with tiers (strict, moderate, lenient, generous, per-user variants) and update edge functions to import centralized rate limiters. Replace inline rate limiter usage with new config, preserving backward compatibility. Add documentation guide for rate limiting usage.
8.6 KiB
# Rate Limiting Guide for Edge Functions
This guide helps you choose the appropriate rate limit tier for each edge function and explains how to implement rate limiting consistently across the application.
## Quick Reference

### Rate Limit Tiers
| Tier | Requests/Min | Use Case |
|---|---|---|
| STRICT | 5 | Expensive operations (uploads, exports, batch processing) |
| MODERATE | 10 | Moderation actions, content submission, security operations |
| STANDARD | 20 | Typical read/write operations, account management |
| LENIENT | 30 | Lightweight reads, public data, validation |
| GENEROUS | 60 | High-frequency operations (webhooks, polling, health checks) |
### Per-User Tiers (rate limits by user ID instead of IP)
| Tier | Requests/Min | Use Case |
|---|---|---|
| PER_USER_STRICT | 5 | User-specific expensive operations |
| PER_USER_MODERATE | 10 | User-specific moderation actions |
| PER_USER_STANDARD | 20 | User-specific standard operations |
| PER_USER_LENIENT | 40 | User-specific frequent operations |
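Taken together, the two tables above suggest a centralized config shaped roughly like the sketch below. This is a hypothetical reconstruction; the field names (`windowMs`, `maxRequests`, `keyBy`) and the exact contents of `_shared/rateLimitConfig.ts` are assumptions, not the repository's actual API:

```typescript
// Hypothetical shape of the centralized tier config.
// Field names are illustrative, not the repo's actual exports.
type RateLimitTier = {
  windowMs: number;        // size of the window in milliseconds
  maxRequests: number;     // requests allowed per window
  keyBy: 'ip' | 'user';    // whether the limit is tracked per IP or per user ID
};

const tiers: Record<string, RateLimitTier> = {
  strict:          { windowMs: 60_000, maxRequests: 5,  keyBy: 'ip' },
  moderate:        { windowMs: 60_000, maxRequests: 10, keyBy: 'ip' },
  standard:        { windowMs: 60_000, maxRequests: 20, keyBy: 'ip' },
  lenient:         { windowMs: 60_000, maxRequests: 30, keyBy: 'ip' },
  generous:        { windowMs: 60_000, maxRequests: 60, keyBy: 'ip' },
  perUserStrict:   { windowMs: 60_000, maxRequests: 5,  keyBy: 'user' },
  perUserModerate: { windowMs: 60_000, maxRequests: 10, keyBy: 'user' },
  perUserStandard: { windowMs: 60_000, maxRequests: 20, keyBy: 'user' },
  perUserLenient:  { windowMs: 60_000, maxRequests: 40, keyBy: 'user' },
};
```

Keeping all tiers in one table like this makes the numbers auditable in a single diff when limits change.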
## How to Implement Rate Limiting

### Basic Implementation

```typescript
import { serve } from 'https://deno.land/std@0.168.0/http/server.ts';
import { corsHeaders } from '../_shared/cors.ts';
import { rateLimiters, withRateLimit } from '../_shared/rateLimiter.ts';

// Your handler function
const handler = async (req: Request): Promise<Response> => {
  // Your edge function logic here
  return new Response(JSON.stringify({ success: true }), {
    headers: { ...corsHeaders, 'Content-Type': 'application/json' }
  });
};

// Apply rate limiting with the appropriate tier
serve(withRateLimit(handler, rateLimiters.moderate, corsHeaders));
```

### Per-User Rate Limiting

```typescript
// Rate limit by user ID instead of IP address
serve(withRateLimit(handler, rateLimiters.perUserModerate, corsHeaders));
```

### Custom Rate Limiting

```typescript
import { createRateLimiter } from '../_shared/rateLimiter.ts';

// Create a custom rate limiter
const customLimiter = createRateLimiter({
  windowMs: 60000,   // 1-minute window
  maxRequests: 15,
  keyGenerator: (req) => {
    // Custom key logic
    return req.headers.get('x-custom-key') || 'default';
  }
});

serve(withRateLimit(handler, customLimiter, corsHeaders));
```
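For intuition, the limiter object `createRateLimiter` returns can be as simple as a fixed-window counter keyed by IP or user ID. The sketch below is illustrative only; it is not the actual `_shared/rateLimiter.ts` implementation:

```typescript
// Minimal in-memory fixed-window limiter (illustrative, not the repo's code).
type Options = { windowMs: number; maxRequests: number };

function makeFixedWindowLimiter({ windowMs, maxRequests }: Options) {
  // One counter window per key (an IP address or user ID).
  const windows = new Map<string, { start: number; count: number }>();

  // Returns true if the request identified by `key` is within the limit.
  return function allow(key: string, now: number = Date.now()): boolean {
    const w = windows.get(key);
    if (!w || now - w.start >= windowMs) {
      windows.set(key, { start: now, count: 1 }); // start a fresh window
      return true;
    }
    w.count += 1;
    return w.count <= maxRequests;
  };
}
```

A production limiter would additionally evict stale keys and, ideally, share its counters across function instances; per-instance in-memory state undercounts traffic behind a load balancer.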
## Recommended Rate Limits by Function Category

### 🔴 STRICT (5 req/min)

Currently implemented:

- ✅ `upload-image` - Cloudflare image upload

Recommended:

- `export-user-data` - Data export operations
- Any function that makes expensive external API calls
- Batch data processing operations
- Functions that manipulate large datasets

### 🟠 MODERATE (10 req/min)

Currently implemented:

- ✅ `process-selective-approval` - Moderation approvals
- ✅ `process-selective-rejection` - Moderation rejections

Recommended:

- `admin-delete-user` - Admin user deletion
- `manage-moderator-topic` - Admin moderation management
- `merge-contact-tickets` - Admin ticket management
- `mfa-unenroll` - Security operations
- `resend-deletion-code` - Prevent code spam
- `send-escalation-notification` - Admin escalations
- `send-password-added-email` - Security emails
- User submission functions (parks, rides, edits)

### 🟡 STANDARD (20 req/min)

Recommended:

- `cancel-account-deletion` - Account management
- `cancel-email-change` - Account management
- `confirm-account-deletion` - Account management
- `request-account-deletion` - Account management
- `create-novu-subscriber` - User registration
- `send-contact-message` - Contact form submissions
- Email validation functions
- Authentication-related functions

### 🟢 LENIENT (30 req/min)

Recommended:

- `detect-location` - Lightweight location lookup
- `check-transaction-status` - Status polling
- `validate-email-backend` - Email validation
- `sitemap` - Public sitemap generation
- Read-only public endpoints

### 🔵 GENEROUS (60 req/min)

Recommended:

- `novu-webhook` - External webhook receiver
- `scheduled-maintenance` - Health checks
- Internal service-to-service communication
- Real-time status endpoints

### ⚫ NO RATE LIMITING NEEDED

These functions are typically called internally or on a schedule:

- `cleanup-old-versions` - Scheduled cleanup
- `process-expired-bans` - Scheduled task
- `process-scheduled-deletions` - Scheduled task
- `run-cleanup-jobs` - Scheduled task
- `migrate-novu-users` - One-time migration
- Internal notification functions (`notify-*`)
- `seed-test-data` - Development only
## Best Practices

### 1. Choose the Right Tier

- **Start restrictive**: Begin with a lower tier and increase if needed
- **Consider cost**: Match the rate limit to the operation's resource cost
- **Think about abuse**: Higher abuse risk = stricter limits
- **Monitor usage**: Use edge function logs to track rate limit hits

### 2. Use Per-User Limits for Authenticated Endpoints

```typescript
// ✅ Good: rate limit authenticated operations per user
serve(withRateLimit(handler, rateLimiters.perUserModerate, corsHeaders));

// ❌ Less effective: rate limit authenticated operations per IP
// (multiple users behind the same IP can hit each other's limits)
serve(withRateLimit(handler, rateLimiters.moderate, corsHeaders));
```
### 3. Handle Rate Limit Errors Gracefully

Rate-limited responses automatically include:

- `429 Too Many Requests` status code
- `Retry-After` header (seconds to wait)
- `X-RateLimit-Limit` header (maximum requests allowed)
- `X-RateLimit-Remaining` header (requests remaining)
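On the client side, callers can honor these headers when retrying. The helper below is a hypothetical sketch (not part of the shared library) that turns a `Retry-After` value into a wait time in milliseconds, assuming the header carries seconds as described above:

```typescript
// Compute how long (in ms) to wait before retrying a rate-limited request.
// Assumes Retry-After carries seconds; falls back to a default when the
// header is missing or unparsable.
function retryDelayMs(retryAfterHeader: string | null, fallbackMs = 1_000): number {
  const seconds = Number(retryAfterHeader);
  return Number.isFinite(seconds) && seconds > 0 ? seconds * 1_000 : fallbackMs;
}
```

A caller would check for status 429, read `response.headers.get('Retry-After')`, wait `retryDelayMs(...)` milliseconds, and then retry (ideally with a cap on total attempts).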
### 4. Document Your Choice

Always add a comment explaining why you chose a specific tier:

```typescript
// Apply moderate rate limiting (10 req/min) for moderation actions
// to prevent abuse while allowing legitimate moderator workflows
serve(withRateLimit(handler, rateLimiters.moderate, corsHeaders));
```
### 5. Test Rate Limits

```shell
# Test rate limiting locally
for i in {1..15}; do
  curl -X POST https://your-project.supabase.co/functions/v1/your-function \
    -H "Authorization: Bearer YOUR_ANON_KEY" \
    -H "Content-Type: application/json" \
    -d '{"test": true}'
  echo " - Request $i"
  sleep 1
done
```
## Migration Checklist

When adding rate limiting to an existing function:

- ✅ Determine the appropriate tier based on operation cost
- ✅ Import `rateLimiters` and `withRateLimit` from `_shared/rateLimiter.ts`
- ✅ Import `corsHeaders` from `_shared/cors.ts`
- ✅ Wrap your handler with `withRateLimit(handler, rateLimiters.TIER, corsHeaders)`
- ✅ Add a comment explaining the tier choice
- ✅ Test that the rate limit works correctly
- ✅ Monitor edge function logs for rate limit hits
- ✅ Adjust the tier if needed based on real usage
## Troubleshooting

### Rate Limits Too Strict

**Symptoms**: Legitimate users hitting rate limits frequently

**Solutions**:

- Increase to the next tier up (strict → moderate → standard → lenient)
- Consider per-user rate limiting instead of per-IP
- Check whether the operation can be optimized to reduce call frequency

### Rate Limits Too Lenient

**Symptoms**: Abuse patterns, high costs, slow performance

**Solutions**:

- Decrease to the next tier down
- Add additional validation before expensive operations
- Consider implementing a captcha for public endpoints

### Per-User Rate Limiting Not Working

**Check**:

- Is the `Authorization` header being sent?
- Is the JWT valid and parsable?
- Are logs showing IP-based limits instead of user-based?
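For the second check, it can help to decode the JWT payload by hand and confirm it actually carries a user ID. The sketch below is a debugging aid, not part of the shared library; it assumes a standard three-part JWT and deliberately skips signature verification, so never use it for authentication:

```typescript
// Extract the `sub` (user ID) claim from a JWT payload WITHOUT verifying
// the signature. Debugging aid only; do not use for auth decisions.
function jwtSub(token: string): string | null {
  const parts = token.split('.');
  if (parts.length !== 3) return null;           // not a header.payload.signature token
  // Convert base64url to base64 and restore any stripped padding.
  const b64 = parts[1].replace(/-/g, '+').replace(/_/g, '/');
  const padded = b64.padEnd(b64.length + ((4 - (b64.length % 4)) % 4), '=');
  try {
    const payload = JSON.parse(atob(padded));
    return typeof payload.sub === 'string' ? payload.sub : null;
  } catch {
    return null;                                  // malformed base64 or JSON
  }
}
```

If `jwtSub` returns `null` for the token your client sends, a per-user key generator has nothing to key on and would typically fall back to the IP address.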
## Examples from Production

### Example 1: Upload Function (STRICT)

```typescript
// upload-image needs strict limiting because it:
// - makes external Cloudflare API calls ($$)
// - processes large file uploads
// - has high abuse potential
serve(withRateLimit(async (req) => {
  // Upload logic here
}, rateLimiters.strict, getCorsHeaders(allowedOrigin)));
```

### Example 2: Moderation Function (MODERATE)

```typescript
// process-selective-approval needs moderate limiting because it:
// - modifies database records
// - triggers notifications
// - is used by moderators, who need reasonable throughput
serve(withRateLimit(handler, rateLimiters.moderate, corsHeaders));
```

### Example 3: Validation Function (LENIENT)

```typescript
// validate-email-backend can be lenient because it:
// - is a lightweight operation (just validation)
// - performs no database writes
// - may need to be retried multiple times by users
serve(withRateLimit(async (req) => {
  // Validation logic here
}, rateLimiters.lenient, corsHeaders));
```
## Future Enhancements
Potential improvements to consider:
- Dynamic Rate Limits: Adjust limits based on user role/tier
- Distributed Rate Limiting: Use Redis for multi-region support
- Rate Limit Analytics: Track and visualize rate limit metrics
- Custom Error Messages: Provide context-specific retry guidance
- Whitelist Support: Bypass limits for trusted IPs/users
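As a starting point for the first item, dynamic limits can begin as a simple lookup from role to request budget. The role names and numbers below are purely illustrative assumptions, not values from this codebase:

```typescript
// Illustrative role-to-limit mapping for dynamic rate limits.
// Role names and budgets are hypothetical.
type Role = 'anonymous' | 'user' | 'moderator' | 'admin';

const maxRequestsByRole: Record<Role, number> = {
  anonymous: 10,   // strictest: unauthenticated traffic
  user: 20,
  moderator: 40,   // review queues need more throughput
  admin: 60,
};

// Unknown or missing roles fall back to the anonymous budget.
function limitFor(role: Role | undefined): number {
  return role !== undefined ? maxRequestsByRole[role] : maxRequestsByRole.anonymous;
}
```

A limiter's `maxRequests` could then be resolved per request from the caller's JWT claims instead of being fixed per function.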