2 changes: 1 addition & 1 deletion bin/installcheck
@@ -52,7 +52,7 @@ else
fi

# Execute the test fixtures
psql -v ON_ERROR_STOP= -f test/fixtures.sql -f lints/0001*.sql -f lints/0002*.sql -f lints/0003*.sql -f lints/0004*.sql -f lints/0005*.sql -f lints/0006*.sql -f lints/0007*.sql -f lints/0008*.sql -f lints/0009*.sql -f lints/0010*.sql -f lints/0011*.sql -f lints/0013*.sql -f lints/0014*.sql -f lints/0015*.sql -f lints/0016*.sql -f lints/0017*.sql -f lints/0018*.sql -f lints/0019*.sql -f lints/0020*.sql -f lints/0021*.sql -f lints/0022*.sql -d contrib_regression
psql -v ON_ERROR_STOP= -f test/fixtures.sql -f lints/0001*.sql -f lints/0002*.sql -f lints/0003*.sql -f lints/0004*.sql -f lints/0005*.sql -f lints/0006*.sql -f lints/0007*.sql -f lints/0008*.sql -f lints/0009*.sql -f lints/0010*.sql -f lints/0011*.sql -f lints/0013*.sql -f lints/0014*.sql -f lints/0015*.sql -f lints/0016*.sql -f lints/0017*.sql -f lints/0018*.sql -f lints/0019*.sql -f lints/0020*.sql -f lints/0021*.sql -f lints/0022*.sql -f lints/0023*.sql -f lints/0024*.sql -d contrib_regression

# Run tests
${REGRESS} --use-existing --dbname=contrib_regression --inputdir=${TESTDIR} ${TESTS}
6 changes: 5 additions & 1 deletion dockerfiles/docker-compose.yml
@@ -13,5 +13,9 @@ services:
interval: 5s
timeout: 5s
retries: 10
volumes:
- ../results:/home/splinter/results_out
command:
- ./bin/installcheck
- bash
- -c
- "./bin/installcheck; cp /home/splinter/regression.diffs /home/splinter/regression.out /home/splinter/results/* /home/splinter/results_out/ 2>/dev/null || true"
110 changes: 110 additions & 0 deletions docs/0023_sensitive_columns_exposed.md
@@ -0,0 +1,110 @@

Level: ERROR

### Rationale

Tables exposed via the Supabase Data APIs that contain columns with potentially sensitive data (such as passwords, SSNs, credit card numbers, API keys, or other PII) pose a significant security risk when Row Level Security (RLS) is not enabled. Without RLS, anyone with access to the project's URL and an anonymous or authenticated role can read all data in these tables, potentially exposing sensitive user information.

This lint identifies tables that:
1. Are accessible via the Data API (in exposed schemas like `public`)
2. Have RLS disabled
3. Contain columns with names matching common sensitive data patterns

### Sensitive Column Patterns Detected

The following categories of sensitive data are detected (a query sketch for spotting such column names in your own schema follows the list):

**Authentication & Credentials:**
- `password`, `passwd`, `pwd`, `secret`, `api_key`, `token`, `jwt`, `access_token`, `refresh_token`, `session_token`, `auth_code`, `otp`, `2fa_secret`

**Personal Identifiers:**
- `ssn`, `social_security`, `driver_license`, `passport_number`, `national_id`, `tax_id`

**Financial Information:**
- `credit_card`, `card_number`, `cvv`, `bank_account`, `account_number`, `routing_number`, `iban`, `swift_code`

**Health & Medical:**
- `health_record`, `medical_record`, `patient_id`, `insurance_number`, `diagnosis`

**Device & Digital Identifiers:**
- `mac_address`, `imei`, `device_uuid`, `ssh_key`, `pgp_key`, `certificate`

**Biometric Data:**
- `fingerprint`, `biometric`, `facial_recognition`
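
A rough manual approximation of this check is sketched below: it searches the `public` schema for column names containing a handful of these patterns and reports whether RLS is enabled on the owning table. The schema name, the abbreviated pattern list, and the regular-expression matching are illustrative assumptions only; the lint itself matches normalized column names against its full pattern list.

```sql
-- Sketch only: approximate the lint manually for the public schema.
-- The pattern subset and regex matching here are illustrative, not the lint's exact logic.
select
  c.table_schema,
  c.table_name,
  c.column_name,
  pc.relrowsecurity as rls_enabled
from information_schema.columns c
join pg_catalog.pg_class pc
  on pc.oid = format('%I.%I', c.table_schema, c.table_name)::regclass
where c.table_schema = 'public'
  and c.column_name ~* '(password|secret|api_key|token|ssn|credit_card|ssh_key)';
```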

### How to Resolve

**Option 1: Enable Row Level Security (Recommended)**

Enable RLS on the table and create appropriate policies:

```sql
-- Enable RLS
alter table <schema>.<table> enable row level security;

-- Create a policy that restricts access
create policy "Users can only view their own data"
on <schema>.<table>
for select
using (auth.uid() = user_id);
```

**Option 2: Remove sensitive columns from the table**

If the data doesn't need to be stored, remove the sensitive columns:

```sql
alter table <schema>.<table> drop column <sensitive_column>;
```

**Option 3: Move sensitive data to a separate, protected table**

Store sensitive data in a separate table with proper RLS:

```sql
-- Create a protected table for sensitive data
create table <schema>.<table>_secure (
id uuid primary key references <schema>.<table>(id),
<sensitive_column> text
);

-- Enable RLS on the secure table
alter table <schema>.<table>_secure enable row level security;

-- Remove from the exposed table
alter table <schema>.<table> drop column <sensitive_column>;
```

**Option 4: Remove the schema from API exposure**

If the table should not be accessible via APIs at all, remove the schema from the [Exposed schemas in API settings](https://supabase.com/dashboard/project/_/settings/api).

### Example

Given the schema:

```sql
create table public.users(
id uuid primary key,
email text not null,
password_hash text not null,
ssn text,
created_at timestamptz default now()
);

grant select on public.users to anon, authenticated;
```

This table is flagged because it contains sensitive columns (`password_hash`, `ssn`) and is accessible via the API without RLS protection. Any user with the project URL can query this table and retrieve every user's password hash and social security number.

To fix, enable RLS and create appropriate policies:

```sql
alter table public.users enable row level security;

-- Allow users to only read their own data
create policy "Users can view own profile"
on public.users
for select
using (auth.uid() = id);
```
128 changes: 128 additions & 0 deletions docs/0024_permissive_rls_policy.md
@@ -0,0 +1,128 @@

Level: WARN

### Rationale

Row Level Security (RLS) policies that use always-true expressions like `USING (true)` or `WITH CHECK (true)` effectively bypass the security that RLS is meant to provide. While RLS appears to be enabled on the table, these permissive policies allow unrestricted access to all rows for the specified roles.

This is a common misconfiguration that occurs when:
- Developers create placeholder policies during development and forget to update them
- Policies are incorrectly configured with the assumption that other policies will restrict access
- Copy-paste errors from documentation examples

### Patterns Detected

The lint identifies policies with these always-true patterns (illustrated after the list):

**USING Clause (controls which rows can be read):**
- `USING (true)` - explicitly allows reading all rows
- `USING (1=1)` - tautology that always evaluates to true
- `USING ('a'='a')` - string comparison tautology
- Missing USING clause on permissive SELECT policies

**WITH CHECK Clause (controls which rows can be written):**
- `WITH CHECK (true)` - allows writing any row
- `WITH CHECK (1=1)` - tautology that always evaluates to true
- Missing WITH CHECK clause on permissive INSERT/UPDATE policies
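
For illustration, the hypothetical policies below (on an assumed `public.posts` table) show what these patterns look like in DDL; each matches a pattern listed above:

```sql
-- Hypothetical examples of policies matching the patterns above
create policy "placeholder_read" on public.posts
  for select
  using (true);          -- always-true USING clause

create policy "tautology_write" on public.posts
  for update
  using (1 = 1)          -- tautology in USING
  with check (1 = 1);    -- tautology in WITH CHECK
```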

### Security Impact

When a permissive policy with `USING (true)` exists:
- **For SELECT**: Any user with the specified role can read ALL rows in the table
- **For INSERT**: Any user can insert ANY data into the table
- **For UPDATE**: Any user can modify ANY row in the table
- **For DELETE**: Any user can delete ANY row from the table

This is particularly dangerous when the policy applies to `anon` or `authenticated` roles, as it exposes data to all API users.
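
To locate such policies in your own database, one hedged sketch is to inspect the `pg_policies` system view, whose `qual` and `with_check` columns hold the text of each policy's USING and WITH CHECK expressions (a bare `using (true)` is rendered as the string `true`):

```sql
-- Sketch: list policies whose USING or WITH CHECK expression is a bare 'true'.
-- Note: tautologies such as (1 = 1) render differently and are not caught by this simple filter.
select schemaname, tablename, policyname, roles, cmd, qual, with_check
from pg_catalog.pg_policies
where qual = 'true'
   or with_check = 'true';
```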

### How to Resolve

**Option 1: Add proper row-level conditions**

Replace the permissive policy with one that properly restricts access:

```sql
-- Instead of: USING (true)
-- Use a proper condition:
drop policy "allow_all" on public.posts;

create policy "users_own_posts"
on public.posts
for select
using (auth.uid() = user_id);
```

**Option 2: Use restrictive policies in combination**

If you need a base permissive policy, combine it with restrictive policies:

```sql
-- Base permissive policy
create policy "authenticated_access"
on public.posts
for select
to authenticated
using (true);

-- Restrictive policy to limit access
create policy "only_published"
on public.posts
as restrictive
for select
to authenticated
using (status = 'published' or auth.uid() = user_id);
```

**Option 3: Remove the policy if RLS is not needed**

If you don't need row-level restrictions, consider whether RLS should be disabled:

```sql
drop policy "allow_all" on public.posts;
alter table public.posts disable row level security;
```

Note: Only disable RLS if you're certain the table should be fully accessible.

### Example

Given this problematic configuration:

```sql
create table public.user_data(
id uuid primary key,
user_id uuid references auth.users(id),
sensitive_info text
);

alter table public.user_data enable row level security;

-- This policy defeats the purpose of RLS!
create policy "allow_all_select"
on public.user_data
for select
to authenticated
using (true);
```

The `allow_all_select` policy allows ANY authenticated user to read ALL rows, including other users' sensitive information.

Fix by adding a proper condition:

```sql
drop policy "allow_all_select" on public.user_data;

create policy "users_own_data"
on public.user_data
for select
to authenticated
using (auth.uid() = user_id);
```

### False Positives

In some cases, `USING (true)` may be intentional:
- Public read-only tables (e.g., blog posts, product catalogs)
- Tables where access is controlled by other means (e.g., API layer)

If the policy is intentional, you can document why in a comment or consider suppressing this lint for specific tables.
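
If you keep an intentionally permissive policy, one way to record that decision is a comment on the policy itself. This is only a sketch with hypothetical names (`public_read`, `public.blog_posts`):

```sql
-- Sketch (hypothetical names): record why a permissive policy is intentional
comment on policy "public_read" on public.blog_posts is
  'Intentionally permissive: blog posts are public, read-only content.';
```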
108 changes: 108 additions & 0 deletions lints/0023_sensitive_columns_exposed.sql
@@ -0,0 +1,108 @@
create view lint."0023_sensitive_columns_exposed" as

-- Detects tables exposed via API that contain columns with sensitive names
-- Inspired by patterns from security scanners that detect PII/credential exposure
with sensitive_patterns as (
select unnest(array[
-- Authentication & Credentials
'password', 'passwd', 'pwd', 'passphrase',
'secret', 'secret_key', 'private_key', 'api_key', 'apikey',
'auth_key', 'token', 'jwt', 'access_token', 'refresh_token',
'oauth_token', 'session_token', 'bearer_token', 'auth_code',
'session_id', 'session_key', 'session_secret',
'recovery_code', 'backup_code', 'verification_code',
'otp', 'two_factor', '2fa_secret', '2fa_code',
-- Personal Identifiers
'ssn', 'social_security', 'social_security_number',
'driver_license', 'drivers_license', 'license_number',
'passport_number', 'passport_id', 'national_id', 'tax_id',
-- Financial Information
'credit_card', 'card_number', 'cvv', 'cvc', 'cvn',
'bank_account', 'account_number', 'routing_number',
'iban', 'swift_code', 'bic',
-- Health & Medical
'health_record', 'medical_record', 'patient_id',
'insurance_number', 'health_insurance', 'medical_insurance',
'treatment',
-- Device Identifiers
'mac_address', 'macaddr', 'imei', 'device_uuid',
-- Digital Keys & Certificates
'pgp_key', 'gpg_key', 'ssh_key', 'certificate',
'license_key', 'activation_key',
-- Biometric Data
'facial_recognition'
]) as pattern
Collaborator comment: pass, diagnosis, fingerprint, biometric, and facial_recognition don't seem high enough signal-to-noise compared with the others. Let's remove those.

),
exposed_tables as (
select
n.nspname as schema_name,
c.relname as table_name,
c.oid as table_oid
from
pg_catalog.pg_class c
join pg_catalog.pg_namespace n
on c.relnamespace = n.oid
where
c.relkind = 'r' -- regular tables
and (
pg_catalog.has_table_privilege('anon', c.oid, 'SELECT')
or pg_catalog.has_table_privilege('authenticated', c.oid, 'SELECT')
)
and n.nspname = any(array(select trim(unnest(string_to_array(current_setting('pgrst.db_schemas', 't'), ',')))))
and n.nspname not in (
'_timescaledb_cache', '_timescaledb_catalog', '_timescaledb_config', '_timescaledb_internal', 'auth', 'cron', 'extensions', 'graphql', 'graphql_public', 'information_schema', 'net', 'pgmq', 'pgroonga', 'pgsodium', 'pgsodium_masks', 'pgtle', 'pgbouncer', 'pg_catalog', 'pgtle', 'realtime', 'repack', 'storage', 'supabase_functions', 'supabase_migrations', 'tiger', 'topology', 'vault'
)
-- Only flag tables without RLS enabled
and not c.relrowsecurity
),
sensitive_columns as (
select
et.schema_name,
et.table_name,
a.attname as column_name,
sp.pattern as matched_pattern
from
exposed_tables et
join pg_catalog.pg_attribute a
on a.attrelid = et.table_oid
and a.attnum > 0
and not a.attisdropped
cross join sensitive_patterns sp
where
-- Match column name against sensitive patterns (case insensitive), allowing '-'/'_' variants
replace(lower(a.attname), '-', '_') = sp.pattern
)
select
'sensitive_columns_exposed' as name,
'Sensitive Columns Exposed' as title,
'ERROR' as level,
'EXTERNAL' as facing,
array['SECURITY'] as categories,
'Detects tables exposed via API that contain columns with potentially sensitive data (PII, credentials, financial info) without RLS protection.' as description,
format(
'Table `%s.%s` is exposed via API without RLS and contains potentially sensitive column(s): %s. This may lead to data exposure.',
schema_name,
table_name,
string_agg(distinct column_name, ', ' order by column_name)
) as detail,
'https://supabase.com/docs/guides/database/database-linter?lint=0023_sensitive_columns_exposed' as remediation,
jsonb_build_object(
'schema', schema_name,
'name', table_name,
'type', 'table',
'sensitive_columns', array_agg(distinct column_name order by column_name),
'matched_patterns', array_agg(distinct matched_pattern order by matched_pattern)
) as metadata,
format(
'sensitive_columns_exposed_%s_%s',
schema_name,
table_name
) as cache_key
from
sensitive_columns
group by
schema_name,
table_name
order by
schema_name,
table_name;