Supabase Storage Bucket Privacy

Storage bucket privacy means restricting downloads and listings so only authorized backend code can fetch objects. This page explains it in plain English, then goes deeper into how it works in Supabase/Postgres, what commonly goes wrong, and how to fix it without relying on fragile client-side rules.

What “Supabase Storage Bucket Privacy” means (plain English)

Private buckets plus server-issued signed URLs ensure that only your backend can fetch files directly, and that clients receive access only when your server explicitly grants it.

How Supabase Storage Bucket Privacy works in Supabase/Postgres (technical)

Mark buckets private, avoid PUBLIC/anon read policies on storage.objects, and generate short-lived signed URLs server-side so clients can neither list nor fetch objects directly.
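A minimal server-side sketch of that flow. The `StorageClient` interface here stands in for `supabase.storage.from("invoices")` created with the service role key; only the one supabase-js call used (`createSignedUrl(path, expiresIn)`, with `expiresIn` in seconds) is mirrored, and `MAX_TTL_SECONDS` is an illustrative policy choice, not a Supabase default.

```typescript
// Sketch only: StorageClient mirrors the supabase-js createSignedUrl
// signature so this compiles without the real client.
interface StorageClient {
  createSignedUrl(
    path: string,
    expiresIn: number
  ): Promise<{ data: { signedUrl: string } | null; error: Error | null }>;
}

const MAX_TTL_SECONDS = 300; // illustrative cap: keep signed URLs short-lived

// Clamp any requested TTL into a short, sane range.
function clampTtl(requestedSeconds: number): number {
  return Math.min(Math.max(requestedSeconds, 30), MAX_TTL_SECONDS);
}

// Server-only: issue a time-limited download URL for a private object.
async function issueDownloadUrl(
  storage: StorageClient,
  path: string,
  requestedTtl = 60
): Promise<string> {
  const { data, error } = await storage.createSignedUrl(path, clampTtl(requestedTtl));
  if (error || !data) throw new Error(`could not sign URL for ${path}`);
  return data.signedUrl;
}
```

Because the TTL is clamped on the server, a client asking for a week-long URL still gets a five-minute one.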

Attack paths & failure modes for Supabase Storage Bucket Privacy

  • Public bucket leaks invoices: a SaaS app stored PDF invoices in a public bucket with names like invoice_1234.pdf; the public bucket plus predictable filenames allowed brute-force downloads of sensitive billing data.
  • Guessable filenames enable enumeration: user uploads were stored as {userId}/{timestamp}.png while the bucket still allowed weak reads, so attackers guessed object keys and retrieved many files.
  • Buckets are public, so object URLs bypass your application authorization.
  • Buckets are listable, enabling enumeration of filenames and paths.
  • Filenames embed identifiers (emails, invoice numbers) that leak information even without contents.
  • Access is “secured” in the UI but files remain directly fetchable via Storage endpoints.
  • Signed URLs are used but with long TTLs or shared broadly (turning them into public links).
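The enumeration path above is mechanical. This sketch shows an attacker's key generator for sequential invoice names (bucket name and base URL are placeholders):

```typescript
// Illustrative attacker's view: sequential invoice names make a public
// bucket enumerable with a trivial loop.
function candidateInvoiceKeys(start: number, count: number): string[] {
  const keys: string[] = [];
  for (let i = start; i < start + count; i++) {
    keys.push(`invoice_${String(i).padStart(4, "0")}.pdf`);
  }
  return keys;
}

// Each key is then tried against the public object endpoint, e.g.
// `${baseUrl}/storage/v1/object/public/invoices/${key}`; no credentials needed.
```

No authentication, no rate pressure: just a counter and the public object endpoint.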

Why Supabase Storage Bucket Privacy matters for Supabase security

Public or listable buckets let attackers enumerate filenames and download sensitive documents, so privacy is breached even when filenames seem harmless.

Common Supabase Storage Bucket Privacy mistakes that lead to leaks

  • Leaving a bucket public for convenience while assuming filenames are secret.
  • Using predictable filenames that can be brute-forced.
  • Skipping server-side signed URLs and giving clients direct bucket access.

Where to look for Supabase Storage Bucket Privacy in Supabase

  • Bucket privacy settings (public vs private) and whether listing is allowed.
  • Storage policies for storage.objects (read/list/write rules).
  • Object naming conventions that make enumeration or guessing easier.
  • Any client-side code that downloads private files without a backend token exchange.

How to detect Supabase Storage Bucket Privacy issues (signals + checks)

Use this as a quick checklist to validate your current state:

  • Try the same queries your frontend can run (anon/authenticated). If sensitive rows come back, you have exposure.
  • Verify RLS is enabled and (for sensitive tables) forced.
  • List policies and look for conditions that don’t bind rows to a user or tenant.
  • Audit grants to anon / authenticated on sensitive tables and functions.
  • Attempt to fetch a known object path with only the anon key; a 200 means the bucket is effectively public.
  • Check whether object keys are predictable (sequential IDs, timestamps); predictable keys plus public buckets equals enumeration.
  • Confirm signed URLs are short-lived and issued only from backend code.
  • Re-test after every migration that touches security-critical tables or functions.
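Detection can be partly scripted: probe the public object endpoint (`${projectUrl}/storage/v1/object/public/${bucket}/${key}`) with no credentials and classify the HTTP status. The status mapping below is a reasonable reading of common Supabase responses, not an official contract:

```typescript
// Run the probe with NO credentials: a 200 means the object is
// world-readable. Supabase commonly answers 400 or 404 for objects it
// refuses to serve publicly (hedged: treat anything else as inconclusive
// and investigate by hand).
function classifyStorageProbe(status: number): "exposed" | "protected" | "inconclusive" {
  if (status === 200) return "exposed";
  if ([400, 401, 403, 404].includes(status)) return "protected";
  return "inconclusive";
}
```

Wire this into CI with a handful of known object paths and the check becomes a regression test instead of a one-off audit.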

How to fix Supabase Storage Bucket Privacy (backend-only + zero-policy posture)

Mockly’s safest default is backend-only access: the browser should not query tables, call RPC, or access Storage directly.

  1. Decide which operations must remain client-side (often: none for sensitive resources).
  2. Create server endpoints (API routes or server actions) for required reads/writes.
  3. Apply hardening SQL: enable+force RLS where relevant, remove broad policies, and revoke grants from client roles.
  4. Generate signed URLs for private Storage downloads on the server only.
  5. Re-run a scan and confirm the issue disappears.
  6. Add a regression check to your release process so drift doesn’t reintroduce exposure.

Fixes that worked in linked incidents:

  • Public bucket leaks invoices: make the bucket private, rename objects to random UUIDs, and serve downloads via backend-generated signed URLs.
  • Guessable filenames enable enumeration: switch to random UUID object keys, keep the bucket private, and issue signed URLs after ownership checks.
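The "random UUID keys" fix is small in code. A sketch of a key builder, assuming you keep a per-user prefix so ownership checks stay simple (the prefix convention is an assumption, not a Supabase requirement):

```typescript
import { randomUUID } from "node:crypto";

// Replace guessable keys like `invoice_1234.pdf` with unguessable ones.
// The userId prefix keeps ownership checks cheap; the UUID defeats
// enumeration; the original extension is preserved for content types.
function privateObjectKey(userId: string, filename: string): string {
  const dot = filename.lastIndexOf(".");
  const ext = dot > 0 ? filename.slice(dot) : "";
  return `${userId}/${randomUUID()}${ext}`;
}
```

Note the UUID is a defense-in-depth measure, not the access control itself: the bucket still stays private and the signed URL still does the gating.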

Verification checklist for Supabase Storage Bucket Privacy

  1. Attempt to list objects or fetch a private file using anon/authenticated credentials.
  2. Make buckets private and serve downloads through a backend endpoint that returns signed URLs.
  3. Keep signed URLs short-lived and generated server-side only.
  4. Verify direct Storage access fails while the app works via your backend download flow.
  5. Re-check caching/CDN behavior so old public URLs don’t remain accessible.

SQL sanity checks for Supabase Storage Bucket Privacy (optional, but high signal)

If you prefer evidence over intuition, run a small set of SQL checks after each fix.

The goal is not to memorize catalog tables — it’s to make sure the access boundary you intended is the one Postgres actually enforces:

  • Confirm RLS is enabled (and forced where appropriate) for tables tied to this term.
  • List policies and read them as plain language: who can do what, under what condition?
  • Audit grants for anon/authenticated and PUBLIC on the tables, views, and functions involved.
  • If Storage is involved: review bucket privacy and policies for listing/reads.
  • If RPC is involved: review EXECUTE grants for functions and whether privileged functions are server-only.
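Policy reading can be partly mechanized. This sketch flags overbroad read policies from `pg_policies` rows; the heuristic is an assumption on my part (a USING clause that never references `auth.uid()` or `owner` usually means any caller in the role passes), so treat hits as prompts for manual review:

```typescript
// Feed rows from:
//   select policyname, qual from pg_policies
//   where schemaname = 'storage' and tablename = 'objects';
interface PolicyRow {
  policyname: string;
  qual: string | null; // the USING expression, rendered as text
}

// Heuristic: a read policy that never mentions the caller is overbroad.
function looksOverbroad(p: PolicyRow): boolean {
  const qual = (p.qual ?? "true").toLowerCase();
  return !qual.includes("auth.uid()") && !qual.includes("owner");
}
```

A null `qual` is treated as `true` (allow everything), which is exactly the case you most want surfaced.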

Pair these checks with a direct API access test using client credentials. When both agree, you can ship the fix with confidence.

Over time, keep a small “query pack” for the checks you trust and run it after every migration. That’s how you prevent quiet regressions.

Prevent Supabase Storage Bucket Privacy drift (so it doesn’t come back)

  • Adopt a default: private buckets only; require an explicit exception for public assets.
  • Add a release checklist item: confirm bucket privacy and listing settings in prod.
  • Monitor for unexpected spikes in Storage reads from anonymous contexts.
  • Keep one reusable verification test for “Public bucket leaks invoices” and rerun it after every migration that touches this surface.
  • Keep one reusable verification test for “Guessable filenames enable enumeration” and rerun it after every migration that touches this surface.

Rollout plan for Supabase Storage Bucket Privacy fixes (without breaking production)

Many hardening changes fail because teams revoke direct access first and only later discover missing backend paths.

Use this sequence to reduce both risk and outage pressure:

  1. Implement and verify the backend endpoint or server action before permission changes.
  2. Switch clients to that backend path behind a feature flag when possible.
  3. Then revoke direct client access (broad grants, permissive policies, public bucket reads, or broad EXECUTE).
  4. Run direct-access denial tests and confirm authorized backend flows still succeed.
  5. Re-scan after deployment and again after the next migration.
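Step 2 above is easiest with a single switch point in the client. A sketch (the paths and flag name are illustrative):

```typescript
// One switch point for file downloads: behind the flag, the client asks
// the backend, which checks ownership and returns a signed URL; without
// it, the legacy direct path stays live until rollout completes.
function downloadPathFor(objectKey: string, flags: { backendDownloads: boolean }): string {
  return flags.backendDownloads
    ? `/api/files/${encodeURIComponent(objectKey)}`
    : `/storage/v1/object/public/files/${objectKey}`;
}
```

Flipping the flag back instantly restores the old path, which is what makes step 3 (revoking direct access) safe to attempt.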

This turns security fixes into repeatable rollout mechanics instead of one-off emergency changes.

Incident breakdowns for Supabase Storage Bucket Privacy (real scenarios)

Public bucket leaks invoices

Scenario: A SaaS app stored PDF invoices in a public bucket with names like invoice_1234.pdf.

What failed: Public bucket plus predictable filenames allowed brute-force downloads of sensitive billing data.

What fixed it: Make the bucket private, rename objects to random UUIDs, and serve downloads via backend-generated signed URLs.

Why the fix worked: Private buckets block direct access, and signed URLs expire quickly, so guessing alone no longer works.

Key takeaways:

  • Predictable object keys plus public buckets equals enumeration.
  • Private buckets should be the default for sensitive data.
  • Signed URLs should be short-lived and issued from backend code.
  • Rotate legacy objects when renaming strategies change.

Read full example: Public bucket leaks invoices

Guessable filenames enable enumeration

Scenario: User uploads were stored as {userId}/{timestamp}.png while the bucket still allowed weak reads.

What failed: Attackers guessed object keys and retrieved many files because listing or weak reads were permitted.

What fixed it: Switch to random UUID object keys, keep the bucket private, and issue signed URLs after ownership checks.

Why the fix worked: Random keys break enumeration and private buckets demand credentials before any request succeeds.

Key takeaways:

  • Predictable keys invite brute force.
  • Combine private buckets with non-guessable names.
  • Signed URLs should expire quickly and be rate limited.
  • Monitor for unusual download patterns.

Read full example: Guessable filenames enable enumeration


Related terms

  • Signed URLs → /glossary/signed-urls
  • Service Role Key → /glossary/service-role-key

FAQ

Is Supabase Storage Bucket Privacy enough to secure my Supabase app?

It’s necessary, but not sufficient. You also need correct grants, secure Storage/RPC settings, and a backend-only access model for sensitive operations.

What’s the quickest way to reduce risk with Supabase Storage Bucket Privacy?

Remove direct client access to sensitive resources, enable/force RLS where appropriate, and verify via a repeatable checklist that anon/authenticated cannot query what they shouldn’t.

How do I verify the fix is real (not just a UI change)?

Attempt direct API queries using the same client credentials your app ships. If the database denies access (401/403) and your backend endpoints still work, your fix is effective.

Next step

Want a quick exposure report for your own project? Run a scan in Mockly to find public tables, storage buckets, and RPC functions — then apply fixes with verification steps.

Explore related pages

  • Glossary (parent) → /glossary
  • Service Role Key (sibling) → /glossary/service-role-key
  • Signed URLs (sibling) → /glossary/signed-urls
  • Make a bucket private + serve files with signed URLs → /templates/storage-safety/make-bucket-private-signed-urls
  • Public bucket leaks invoices → /examples/supabase-storage-bucket-privacy/public-bucket-leaks-invoices
  • Pricing → /pricing