Rate Limiting

Client-side rate limiting, Supabase built-in limits, and Edge Function rate limiting patterns.

Overview

Rate limiting protects your application from abuse, brute-force attacks, and excessive API usage. ScaleRocket implements rate limiting at multiple layers: client-side throttling, Supabase built-in limits, and custom Edge Function rate limiting.

Client-Side Rate Limiting

Login Throttling

The login form prevents rapid-fire submissions with a client-side cooldown:

const [isSubmitting, setIsSubmitting] = useState(false);

const handleLogin = async (e: FormEvent) => {
  e.preventDefault();
  if (isSubmitting) return;

  setIsSubmitting(true);
  try {
    // signInWithPassword returns an error object rather than throwing on
    // bad credentials, so check it explicitly.
    const { error } = await supabase.auth.signInWithPassword({ email, password });
    if (error) throw error;
  } catch (error) {
    toast({ title: "Login failed", variant: "destructive" });
  } finally {
    setTimeout(() => setIsSubmitting(false), 2000); // 2s cooldown before re-enabling the button
  }
};

This prevents duplicate requests but does not protect against determined attackers. Server-side limiting is essential.

Supabase Built-In Rate Limits

Supabase applies automatic rate limits to auth endpoints:

Endpoint           Default Limit
Sign up            10 requests per hour per IP
Sign in            30 requests per hour per IP
Password reset     5 requests per hour per IP
OAuth              10 requests per minute per IP
API (PostgREST)    Configurable per project

These limits are enforced server-side by Supabase. You can adjust some of them in your Supabase project settings under Auth > Rate Limits.
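When a built-in limit is hit, supabase-js surfaces it as an auth error. A minimal sketch of detecting that case on the client, assuming the error object carries an HTTP `status` field (as supabase-js auth errors do; the `isAuthRateLimitError` helper name is our own):

```typescript
// Detects Supabase's built-in auth rate limit (HTTP 429) on a returned error.
function isAuthRateLimitError(error: { status?: number } | null): boolean {
  return error?.status === 429;
}

// Usage inside a login handler, following the page's earlier example:
// const { error } = await supabase.auth.signInWithPassword({ email, password });
// if (isAuthRateLimitError(error)) {
//   toast({ title: "Too many attempts", description: "Please wait and try again." });
// }
```

Showing a "try again later" message here is friendlier than a generic "Login failed" toast.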

Edge Function Rate Limiting

For custom Edge Functions, implement rate limiting with an in-memory store or a Redis-backed solution.

Simple In-Memory Pattern

// supabase/functions/_shared/rate-limit.ts
// Note: expired entries are overwritten on the next request from the same key
// but never evicted, so memory grows with the number of distinct keys.
const requests = new Map<string, { count: number; resetAt: number }>();

export function isRateLimited(key: string, maxRequests = 10, windowMs = 60000): boolean {
  const now = Date.now();
  const entry = requests.get(key);

  // Start a new window if this key is unseen or its window has expired.
  if (!entry || now > entry.resetAt) {
    requests.set(key, { count: 1, resetAt: now + windowMs });
    return false;
  }

  entry.count++;
  return entry.count > maxRequests;
}

Usage in an Edge Function

import { isRateLimited } from "../_shared/rate-limit.ts";

Deno.serve(async (req) => {
  // x-forwarded-for can contain a comma-separated proxy chain; the first
  // entry is the original client.
  const clientIP = req.headers.get("x-forwarded-for")?.split(",")[0].trim() || "unknown";

  if (isRateLimited(clientIP, 20, 60000)) {
    return new Response(JSON.stringify({ error: "Too many requests" }), {
      status: 429,
      headers: { "Content-Type": "application/json", "Retry-After": "60" },
    });
  }

  // Handle the request
  return new Response(JSON.stringify({ ok: true }), {
    headers: { "Content-Type": "application/json" },
  });
});

Limitations

In-memory rate limiting resets when the function cold-starts. For persistent rate limiting, use:

  • Supabase table: Store request counts in a database table with a cleanup cron.
  • Upstash Redis: Use @upstash/ratelimit for distributed, serverless-friendly rate limiting.
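A minimal sketch of the table-backed option: it assumes a hypothetical `check_rate_limit(p_key, p_max, p_window_seconds)` Postgres function that atomically increments a counter row and returns whether the limit was exceeded (neither the RPC nor the `isRateLimitedPersistent` name exists in ScaleRocket; both are illustrative). The client is injected so the logic stays testable:

```typescript
// Shape of the Supabase client surface this sketch depends on.
type RpcClient = {
  rpc: (fn: string, args: Record<string, unknown>) =>
    Promise<{ data: boolean | null; error: { message: string } | null }>;
};

async function isRateLimitedPersistent(
  client: RpcClient,
  key: string,
  maxRequests = 20,
  windowSeconds = 60,
): Promise<boolean> {
  const { data, error } = await client.rpc("check_rate_limit", {
    p_key: key,
    p_max: maxRequests,
    p_window_seconds: windowSeconds,
  });
  // Fail open: a transient database error should not block all traffic.
  if (error) return false;
  return data === true;
}
```

Doing the increment inside a single Postgres function keeps the check atomic, which a read-then-write from the Edge Function would not be.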

Production Recommendations

  1. Keep Supabase auth rate limits enabled -- they are on by default; do not disable them.
  2. Add rate limiting to all public Edge Functions -- especially webhooks and payment endpoints.
  3. Use progressive delays -- increase cooldown time after repeated failures (exponential backoff).
  4. Return 429 status codes -- include a Retry-After header so clients know when to retry.
  5. Monitor rate limit hits -- log when users hit limits to detect abuse patterns.
  6. Use Cloudflare or Vercel WAF -- for IP-level rate limiting before requests reach your app.
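The progressive-delay recommendation (point 3) can be sketched as a small helper: double the cooldown after each consecutive failure, capped so a legitimate user is never locked out for long. The base delay and cap here are illustrative, not ScaleRocket defaults:

```typescript
// Exponential backoff for repeated failures: 2s, 4s, 8s, ... capped at 5 minutes.
function cooldownMs(failures: number, baseMs = 2000, maxMs = 5 * 60 * 1000): number {
  if (failures <= 0) return 0;
  return Math.min(baseMs * 2 ** (failures - 1), maxMs);
}
```

Reset the failure counter on a successful attempt so well-behaved users return to zero delay.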

Tips

  • Client-side rate limiting is a UX improvement, not a security measure. Always enforce limits server-side.
  • Avoid rate limiting by user ID alone; an attacker can cycle through accounts. Use IP-based limiting as the first layer.
  • For paid API endpoints, consider per-API-key rate limiting tied to the user's plan.
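The per-API-key tip can reuse the fixed-window pattern from _shared/rate-limit.ts above, keyed by API key with a plan-dependent quota. The plan names and quotas below are illustrative, not ScaleRocket's actual tiers:

```typescript
// Requests allowed per minute for each (hypothetical) plan.
const PLAN_LIMITS: Record<string, number> = { free: 60, pro: 600, enterprise: 6000 };

const windows = new Map<string, { count: number; resetAt: number }>();

function isKeyRateLimited(apiKey: string, plan: string, windowMs = 60_000): boolean {
  const max = PLAN_LIMITS[plan] ?? PLAN_LIMITS.free; // unknown plans get the lowest quota
  const now = Date.now();
  const entry = windows.get(apiKey);

  if (!entry || now > entry.resetAt) {
    windows.set(apiKey, { count: 1, resetAt: now + windowMs });
    return false;
  }

  entry.count++;
  return entry.count > max;
}
```

Keying by API key rather than IP lets a customer's quota follow them across machines, while the IP-based layer above still catches unauthenticated abuse.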
