Conversation

@AdriiiPRodri AdriiiPRodri commented Dec 1, 2025

Context

This PR implements the backend support for the "Finding Severity Over Time" chart component in the UI. The chart will display the evolution of failed findings by severity level over a configurable time range.

The main challenge was performance: the existing ScanSummary table holds far too many rows in production to query directly on every page load.

Description

New aggregation table DailyFindingsSeverity

  • Stores pre-aggregated daily snapshots with one row per provider per day
  • Contains FAIL counts by severity (critical, high, medium, low, informational) plus muted count
  • Denormalizes provider_type to avoid JOINs when filtering by cloud provider type
  • Includes three indexes optimized for different query patterns (no filter, provider filter, type filter)
  • Uses Row Level Security (RLS) for tenant isolation
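
For orientation, here is a minimal sketch of what the model could look like given the columns and index names cited in the performance evidence below. Field types, the unique constraint, and the scan FK are assumptions; models.py is the source of truth.

# Hypothetical sketch -- field types, the unique constraint, and the scan FK
# nullability are assumptions; the column and index names come from this PR.
from django.db import models


class DailyFindingsSeverity(models.Model):
    tenant_id = models.UUIDField()  # RLS policies filter on this column
    provider = models.ForeignKey("Provider", on_delete=models.CASCADE)
    provider_type = models.CharField(max_length=32)  # denormalized: no JOIN needed
    scan = models.ForeignKey("Scan", on_delete=models.CASCADE, null=True)  # latest contributing scan
    date = models.DateField()
    critical = models.PositiveIntegerField(default=0)
    high = models.PositiveIntegerField(default=0)
    medium = models.PositiveIntegerField(default=0)
    low = models.PositiveIntegerField(default=0)
    informational = models.PositiveIntegerField(default=0)
    muted = models.PositiveIntegerField(default=0)

    class Meta:
        db_table = "daily_findings_severity"
        constraints = [
            # "one row per provider per day" from the description above
            models.UniqueConstraint(
                fields=["tenant_id", "provider", "date"],
                name="dfs_unique_provider_day",  # hypothetical name
            ),
        ]
        indexes = [
            models.Index(fields=["tenant_id", "date"], name="dfs_tenant_date_idx"),
            models.Index(fields=["tenant_id", "provider", "date"], name="dfs_tenant_provider_date_idx"),
            models.Index(fields=["tenant_id", "provider_type", "date"], name="dfs_tenant_type_date_idx"),
        ]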

New Celery task

  • update_daily_findings_severity_task runs after each scan completes
  • Aggregates data from ScanSummary (not Finding table) for efficiency
  • Uses update_or_create so multiple scans on the same day keep only the latest data
  • Runs in parallel with generate_outputs_task to minimize scan completion time
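
Since the chain/group wiring is the part most likely to be misread, here is a sketch of the intended shape. perform_scan_task and the exact .si() signatures are illustrative; only the two named tasks and the group() come from this PR, and tasks/tasks.py has the real wiring.

# Hypothetical sketch of the post-scan fan-out.
from celery import chain, group

from tasks.tasks import generate_outputs_task, update_daily_findings_severity_task


def launch_post_scan_tasks(tenant_id: str, scan_id: str):
    return chain(
        perform_scan_task.si(tenant_id=tenant_id, scan_id=scan_id),  # hypothetical name
        # Both follow-up tasks start as soon as the scan completes
        group(
            generate_outputs_task.si(tenant_id=tenant_id, scan_id=scan_id),
            update_daily_findings_severity_task.si(tenant_id=tenant_id, scan_id=scan_id),
        ),
    ).apply_async()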

New API endpoint

  • GET /api/v1/overviews/findings_severity_over_time
  • Required parameter: filter[date_from]
  • Optional filters: date_to, provider_id, provider_id__in, provider_type, provider_type__in
  • Maximum date range: 365 days
  • Implements fill-forward logic for days without scans (sketched after this list)
  • Excludes soft-deleted providers automatically via ActiveProviderManager
  • Respects RBAC provider restrictions
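
Fill-forward here means that a day with no snapshot repeats the most recent earlier day's counts rather than dropping to zero. A minimal sketch of that logic, assuming rows maps date to per-day severity counts (the helper name and input shape are assumptions):

# Minimal sketch: missing days reuse the last known counts
# (zeros before the first snapshot).
from datetime import timedelta


def fill_forward(rows, date_from, date_to):
    last = {"critical": 0, "high": 0, "medium": 0,
            "low": 0, "informational": 0, "muted": 0}
    series = []
    day = date_from
    while day <= date_to:
        last = rows.get(day, last)  # carry forward when no snapshot exists
        series.append({"date": day.isoformat(), **last})
        day += timedelta(days=1)
    return series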

Steps to review

  1. Migration: Check 0060_dailyfindingsseverity.py creates the table with proper indexes and RLS constraint

  2. Model: Review DailyFindingsSeverity in models.py:

    • Uses ActiveProviderManager to exclude soft-deleted providers
    • Has denormalized provider_type field
    • Three indexes cover all query patterns
  3. Task chain: In tasks/tasks.py, verify update_daily_findings_severity_task runs in parallel with generate_outputs_task (inside a group())

  4. Aggregation function: In tasks/jobs/scan.py, check aggregate_daily_findings_severity:

    • Reads from read replica
    • Aggregates from ScanSummary, not Finding
    • Uses update_or_create for idempotency (a sketch follows this list)
  5. Endpoint: In v1/views.py, review findings_severity_over_time:

    • Validates date_from is required
    • Limits to 365 days max
    • Uses denormalized provider_type for filtering (no JOIN)
    • Fill-forward logic for missing days
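
For step 4, a sketch of the aggregation shape. The "replica" alias and the ScanSummary field names (severity, fail) are assumptions; only the read-replica read, the ScanSummary source, and update_or_create come from this PR. The muted count is omitted since its source column is not shown here.

# Hypothetical sketch of aggregate_daily_findings_severity.
from django.db.models import Sum
from django.utils import timezone

from api.models import DailyFindingsSeverity, ScanSummary


def aggregate_daily_findings_severity(tenant_id, scan):
    # Group the scan's summary rows by severity on the read replica
    rows = (
        ScanSummary.objects.using("replica")  # alias name assumed
        .filter(tenant_id=tenant_id, scan_id=scan.id)
        .values("severity")
        .annotate(total=Sum("fail"))
    )
    by_severity = {r["severity"]: r["total"] for r in rows}
    # update_or_create keeps only the latest data for (provider, day)
    DailyFindingsSeverity.objects.update_or_create(
        tenant_id=tenant_id,
        provider_id=scan.provider_id,
        date=timezone.now().date(),
        defaults={
            "provider_type": scan.provider.provider,  # field name assumed
            "scan": scan,  # latest contributing scan
            "critical": by_severity.get("critical", 0),
            "high": by_severity.get("high", 0),
            "medium": by_severity.get("medium", 0),
            "low": by_severity.get("low", 0),
            "informational": by_severity.get("informational", 0),
        },
    )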

Checklist

UI

  • All issue/task requirements work as expected on the UI
  • Screenshots/Video of the functionality flow (if applicable) - Mobile (X < 640px)
  • Screenshots/Video of the functionality flow (if applicable) - Tablet (640px < X < 1024px)
  • Screenshots/Video of the functionality flow (if applicable) - Desktop (X > 1024px)
  • Ensure new entries are added to CHANGELOG.md, if applicable.

API

  • Verify if API specs need to be regenerated.
  • Check if version updates are required (e.g., specs, Poetry, etc.).
  • Ensure new entries are added to CHANGELOG.md, if applicable.

License

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

github-actions bot added the component/api and review-django-migrations (This PR contains changes in Django migrations) labels Dec 1, 2025

github-actions bot commented Dec 1, 2025

✅ All necessary CHANGELOG.md files have been updated.


github-actions bot commented Dec 1, 2025

Conflict Markers Resolved

All conflict markers have been successfully resolved in this pull request.


github-actions bot commented Dec 1, 2025

🔒 Container Security Scan

Image: prowler-api:1c7beff
Last scan: 2025-12-05 09:31:03 UTC

📊 Vulnerability Summary

Severity Count
🔴 Critical 4
Total 4

3 package(s) affected

⚠️ Action Required

Critical severity vulnerabilities detected. These should be addressed before merging:

  • Review the detailed scan results
  • Update affected packages to patched versions
  • Consider using a different base image if updates are unavailable

@AdriiiPRodri AdriiiPRodri force-pushed the PROWLER-25-finding-severity-over-time-component-api branch from bfda570 to 22c6a6a on December 2, 2025 14:31
@AdriiiPRodri AdriiiPRodri marked this pull request as ready for review December 2, 2025 14:47
@AdriiiPRodri AdriiiPRodri requested a review from a team as a code owner December 2, 2025 14:47

codecov bot commented Dec 2, 2025

Codecov Report

❌ Patch coverage is 77.04280% with 59 lines in your changes missing coverage. Please review.
✅ Project coverage is 92.38%. Comparing base (dbdce98) to head (28019e1).
⚠️ Report is 25 commits behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #9363      +/-   ##
==========================================
+ Coverage   84.46%   92.38%   +7.92%     
==========================================
  Files        1498      156    -1342     
  Lines       46805    22279   -24526     
==========================================
- Hits        39532    20582   -18950     
+ Misses       7273     1697    -5576     
Flag | Coverage Δ
api | 92.38% <77.04%> (?)
prowler-py3.{9,10,11,12}-{aws, azure, config, gcp, github, iac, kubernetes, lib, m365, mongodbatlas, nhn, oraclecloud} | ? (all 48 prowler flags)

Flags with carried forward coverage won't be shown.

Components Coverage Δ
prowler ∅ <ø> (∅)
api 92.38% <77.04%> (∅)


AdriiiPRodri commented Dec 2, 2025

Summary

This PR implements the backend support for the "Finding Severity Over Time" chart component in the UI. The chart displays the evolution of failed findings by severity level over a configurable time range.

Key Implementation Details

  • New Table DailyFindingsSeverity - Pre-aggregated daily snapshots (one row per provider per day)
  • New Task update_daily_findings_severity_task - Runs in parallel with generate_outputs_task
  • New Endpoint GET /api/v1/overviews/findings_severity_over_time
  • Backfill Migration 0062_backfill_daily_severity - Populates 90 days of historical data (a sketch follows this list)
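
Since the backfill ships as a data migration, a plausible shape is sketched here. The selection criteria, batching, and dependency are assumptions; the RunPython body in the actual 0062 migration is authoritative.

# Hypothetical sketch of 0062_backfill_daily_severity; the real migration may
# batch differently or reuse the aggregation job from tasks/jobs/scan.py.
from datetime import timedelta

from django.db import migrations
from django.utils import timezone


def backfill_last_90_days(apps, schema_editor):
    Scan = apps.get_model("api", "Scan")
    cutoff = timezone.now() - timedelta(days=90)
    # "completed" is assumed to be the stored value of StateChoices.COMPLETED
    for scan in Scan.objects.filter(
        state="completed", completed_at__gte=cutoff
    ).iterator():
        # Re-aggregate this scan's ScanSummary rows into DailyFindingsSeverity,
        # mirroring what the Celery task does after a live scan.
        ...


class Migration(migrations.Migration):
    dependencies = [("api", "0061_backfill_daily_severity")]  # per the migrate log below
    operations = [
        migrations.RunPython(backfill_last_90_days, migrations.RunPython.noop),
    ]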

API Response Example

Request:

GET /api/v1/overviews/findings_severity_over_time?filter[date_from]=2025-12-01

Response:

{
  "data": [
    {
      "type": "findings-severity-over-time",
      "id": "2025-12-01",
      "attributes": {
        "date": "2025-12-01",
        "critical": 0,
        "high": 0,
        "medium": 0,
        "low": 0,
        "informational": 0,
        "muted": 0
      }
    },
    {
      "type": "findings-severity-over-time",
      "id": "2025-12-02",
      "attributes": {
        "date": "2025-12-02",
        "critical": 0,
        "high": 0,
        "medium": 2,
        "low": 25,
        "informational": 0,
        "muted": 0
      }
    }
  ],
  "meta": {
    "version": "v1"
  }
}
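
The optional filters documented above combine in the same query-string style; for example (values are illustrative):

GET /api/v1/overviews/findings_severity_over_time?filter[date_from]=2025-11-01&filter[date_to]=2025-12-01&filter[provider_type__in]=aws,azure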

Performance Evidence

Test Data Population

Created test data with 100 providers and 365 days of history (36500 rows):

$ poetry run python manage.py populate_daily_severity --tenant 12646005-9067-4d2a-a098-8bb378604362 --providers 100 --days 365

============================================================
DailyFindingsSeverity Performance Test Data Population
============================================================
  Tenant:        12646005-9067-4d2a-a098-8bb378604362
  Providers:     100
  Days:          365
  Total rows:    36500
  Batch size:    1000
  Alias prefix:  perf-test
============================================================

Step 1/3: Creating providers...
Providers: 100%|████████████████████████████████████████████████████| 100/100 [00:00<00:00, 621.47it/s]
  Created 100 providers

Step 2/3: Generating severity data...
Generating: 100%|█████████████████████████████████████████████| 36500/36500 [00:00<00:00, 82264.50it/s]
  Generated 36500 rows

Step 3/3: Inserting into database...
Inserting: 100%|███████████████████████████████████████████████████████| 37/37 [00:03<00:00, 11.12it/s]
  Inserted 36500 rows

============================================================
Population complete!
============================================================

Query Performance & Index Usage

All three query patterns were tested with proper index usage:

1. No Filters (Most Common)

SELECT "daily_findings_severity"."date",
       SUM("daily_findings_severity"."critical") AS "critical",
       SUM("daily_findings_severity"."high") AS "high",
       SUM("daily_findings_severity"."medium") AS "medium",
       SUM("daily_findings_severity"."low") AS "low",
       SUM("daily_findings_severity"."informational") AS "informational",
       SUM("daily_findings_severity"."muted") AS "muted"
FROM "daily_findings_severity"
INNER JOIN "providers" ON ("daily_findings_severity"."provider_id" = "providers"."id")
WHERE (NOT "providers"."is_deleted"
       AND "daily_findings_severity"."date" >= '2025-11-01'
       AND "daily_findings_severity"."date" <= '2025-12-01'
       AND "daily_findings_severity"."tenant_id" = '12646005-9067-4d2a-a098-8bb378604362')
GROUP BY "daily_findings_severity"."date"
ORDER BY "daily_findings_severity"."date" ASC
Metric         | Value
Execution Time | 5.091 ms
Index Used     | dfs_tenant_date_idx

Query Plan:

GroupAggregate  (cost=14.01..14.04 rows=1 width=52)
  Group Key: daily_findings_severity.date
  ->  Sort  (cost=14.01..14.02 rows=1 width=28)
        Sort Key: daily_findings_severity.date
        ->  Hash Join  (cost=8.34..14.00 rows=1 width=28)
              ->  Hash  (cost=8.32..8.32 rows=1 width=44)
                    ->  Index Scan using dfs_tenant_date_idx on daily_findings_severity
                          Index Cond: ((tenant_id = '...'::uuid) AND (date >= '2025-11-01') AND (date <= '2025-12-01'))

2. Provider ID Filter

-- filter[provider_id]=b440ccec-8865-4c3f-b079-13c9939ebbfb
Metric         | Value
Execution Time | 5.058 ms
Index Used     | dfs_tenant_provider_date_idx

Query Plan:

GroupAggregate  (cost=0.41..14.21 rows=1 width=52)
  ->  Nested Loop  (cost=0.41..14.19 rows=1 width=28)
        ->  Index Scan using dfs_tenant_provider_date_idx on daily_findings_severity
              Index Cond: ((tenant_id = '...'::uuid) AND (provider_id = '...'::uuid) AND (date >= '2025-11-01') AND (date <= '2025-12-01'))

3. Provider Type Filter

-- filter[provider_type__in]=aws,azure
Metric         | Value
Execution Time | 7.486 ms
Index Used     | dfs_tenant_date_idx (see note below)

Note: The dfs_tenant_type_date_idx index exists, but PostgreSQL's planner falls back to dfs_tenant_date_idx because the RLS policy references dynamic session variables (current_setting). Without RLS, the type index is chosen and the query runs in 0.619ms; with RLS, 7.5ms is still acceptable for production.


Backfill Migration Evidence

Tested the backfill migration with 1000 providers and 60 days of scan history:

$ poetry run python manage.py populate_scan_history --tenant 12646005-9067-4d2a-a098-8bb378604362 --providers 1000 --days 60
Creating 1000 providers with 60 days of scan history...
Created 1000 providers, 60000 scans with summaries

$ poetry run python manage.py migrate api 0060 --fake --database admin && poetry run python manage.py migrate api 0062 --database admin
Operations to perform:
  Target specific migration: 0060_dailyfindingsseverity, from api
Running migrations:
  Rendering model states... DONE
  Unapplying api.0061_backfill_daily_severity... FAKED
Operations to perform:
  Target specific migration: 0062_backfill_daily_severity, from api
Running migrations:
  Applying api.0062_backfill_daily_severity... OK

Worker Task Evidence

The scan-daily-severity task executes successfully after each scan completes:

worker-dev-1   | [2025-12-02 10:49:48,326: INFO/MainProcess] Task scan-daily-severity[b48a9c95-27b9-4dc9-8478-9bd087a3541e] received
worker-dev-1   | [2025-12-02 10:49:48,814: INFO/ForkPoolWorker-10] scan-daily-severity[b48a9c95-27b9-4dc9-8478-9bd087a3541e]: Updated daily findings severity for provider 9aa465d0-b958-4a42-9f85-83b67fc6dd4a on 2025-12-02


AdriiiPRodri commented Dec 2, 2025

Backfill example

from datetime import timedelta
from django.utils import timezone
from api.models import Scan, StateChoices, DailyFindingsSeverity
from tasks.tasks import backfill_daily_findings_severity_task

# Get scan IDs that already have backfill data (to skip them)
already_backfilled = set(
    DailyFindingsSeverity.objects.using("admin")
    .values_list("scan_id", flat=True)
)
print(f"Found {len(already_backfilled)} scans already backfilled")

# Get completed scans from the last 90 days, excluding already backfilled
cutoff = timezone.now() - timedelta(days=90)
scans = Scan.objects.using("admin").filter(
    state=StateChoices.COMPLETED,
    completed_at__gte=cutoff,
).exclude(
    id__in=already_backfilled
).values_list("tenant_id", "id")

# Verify how many scans will be backfilled
print(f"Found {scans.count()} scans to backfill")

# Queue one backfill task per scan (already-backfilled scans were excluded above)
for tenant_id, scan_id in scans:
    backfill_daily_findings_severity_task.delay(
        tenant_id=str(tenant_id),
        scan_id=str(scan_id),
    )
...
worker-dev-1   | [2025-12-02 15:57:30,217: INFO/MainProcess] Task backfill-daily-findings-severity[11830f59-cae3-4c9c-9c78-206d78080801] received
worker-dev-1   | [2025-12-02 15:57:30,218: INFO/MainProcess] Task backfill-daily-findings-severity[f439f775-f953-4757-b331-98e278e5e2ed] received
worker-dev-1   | [2025-12-02 15:57:30,545: INFO/MainProcess] Task backfill-daily-findings-severity[e603373a-b694-4d1f-ad3f-ee61cdd97914] received
worker-dev-1   | [2025-12-02 15:57:30,564: INFO/MainProcess] Task backfill-daily-findings-severity[f4a84613-ccd7-4625-bfbb-48eb0518218b] received
...

Query

SELECT COUNT(*), SUM(critical+high+medium+low) FROM daily_findings_severity;
 count |  sum   
-------+--------
  9000 | 992897
(1 row)

@vicferpoy vicferpoy left a comment

Changelog has conflicts.


AdriiiPRodri commented Dec 3, 2025

Query review directly against the database

prowler@prowler-api:~/backend$ poetry run python manage.py benchmark_severity_over_time --populate --tenant 12646005-9067-4d2a-a098-8bb378604362

Populating 50 providers × 365 days × ~2 scans/day...
This may take a few minutes...
  Creating 50 providers...
  Processing day 0/365...
  Processing day 30/365...
  Processing day 60/365...
  Processing day 90/365...
  Processing day 120/365...
  Processing day 150/365...
  Processing day 180/365...
  Processing day 210/365...
  Processing day 240/365...
  Processing day 270/365...
  Processing day 300/365...
  Processing day 330/365...
  Processing day 360/365...

Created:
  - 50 providers
  - 45539 scans
  - 5236179 scan_summaries
  - Time: 645.09s

prowler@prowler-api:~/backend$ poetry run python manage.py benchmark_severity_over_time --tenant 12646005-9067-4d2a-a098-8bb378604362 --explain

================================================================================
EXPLAIN ANALYZE - DailySeveritySummary queries
================================================================================

1. Query with date range filter (30 days):
  Incremental Sort  (cost=1.92..80.09 rows=147 width=44) (actual time=0.115..0.340 rows=298 loops=1)
    Sort Key: provider_id, date DESC
    Presorted Key: provider_id
    Full-sort Groups: 5  Sort Method: quicksort  Average Memory: 29kB  Peak Memory: 29kB
    Buffers: shared hit=51
    ->  Index Scan using daily_severity_summaries_provider_id_1e41c4dc on daily_severity_summaries  (cost=0.28..76.10 rows=147 width=44) (actual time=0.034..0.268 rows=298 loops=1)
          Index Cond: (provider_id = ANY ('{ca7945b1-dc5e-4aa5-9c20-649f62301608,b94e8fe1-894b-40d0-a4ea-0e65ab68fc30,9eaf0ae9-750c-4d52-9159-9adf3be18511,4d577e3e-f239-4085-b3c2-0d4d073d2c22,55209da2-5918-4cf8-a913-1b655a16e0ff,1d8220e6-b576-4f6e-972a-b99b3c189211,722adf5d-612e-4da2-9d64-109c354e8429,3f088d3c-9806-4a8b-a5cb-d5c022d50ff7,1ace2823-53b7-4496-85e6-5b210d742485,0f2767d9-2f04-4f7f-850a-001960166ead}'::uuid[]))
          Filter: ((date >= '2025-11-03'::date) AND (date <= '2025-12-03'::date) AND (tenant_id = '12646005-9067-4d2a-a098-8bb378604362'::uuid) AND CASE WHEN (current_setting('api.tenant_id'::text, true) IS NULL) THEN false ELSE (tenant_id = (current_setting('api.tenant_id'::text))::uuid) END)
          Rows Removed by Filter: 563
          Buffers: shared hit=45
  Planning:
    Buffers: shared hit=186
  Planning Time: 0.871 ms
  Execution Time: 0.371 ms

2. Query all data up to date (fill-forward pattern):
  Incremental Sort  (cost=1.89..87.04 rows=431 width=44) (actual time=0.056..0.497 rows=861 loops=1)
    Sort Key: provider_id, date DESC
    Presorted Key: provider_id
    Full-sort Groups: 10  Sort Method: quicksort  Average Memory: 29kB  Peak Memory: 29kB
    Pre-sorted Groups: 10  Sort Method: quicksort  Average Memory: 30kB  Peak Memory: 30kB
    Buffers: shared hit=42
    ->  Index Scan using daily_severity_summaries_provider_id_1e41c4dc on daily_severity_summaries  (cost=0.28..73.95 rows=431 width=44) (actual time=0.006..0.350 rows=861 loops=1)
          Index Cond: (provider_id = ANY ('{ca7945b1-dc5e-4aa5-9c20-649f62301608,b94e8fe1-894b-40d0-a4ea-0e65ab68fc30,9eaf0ae9-750c-4d52-9159-9adf3be18511,4d577e3e-f239-4085-b3c2-0d4d073d2c22,55209da2-5918-4cf8-a913-1b655a16e0ff,1d8220e6-b576-4f6e-972a-b99b3c189211,722adf5d-612e-4da2-9d64-109c354e8429,3f088d3c-9806-4a8b-a5cb-d5c022d50ff7,1ace2823-53b7-4496-85e6-5b210d742485,0f2767d9-2f04-4f7f-850a-001960166ead}'::uuid[]))
          Filter: ((date <= '2025-12-03'::date) AND (tenant_id = '12646005-9067-4d2a-a098-8bb378604362'::uuid) AND CASE WHEN (current_setting('api.tenant_id'::text, true) IS NULL) THEN false ELSE (tenant_id = (current_setting('api.tenant_id'::text))::uuid) END)
          Buffers: shared hit=42
  Planning Time: 0.037 ms
  Execution Time: 0.526 ms

3. Aggregation by date (sum across providers):
  Sort  (cost=81.67..81.85 rows=73 width=52) (actual time=0.188..0.189 rows=31 loops=1)
    Sort Key: date
    Sort Method: quicksort  Memory: 27kB
    Buffers: shared hit=42
    ->  HashAggregate  (cost=78.68..79.41 rows=73 width=52) (actual time=0.175..0.177 rows=31 loops=1)
          Group Key: date
          Batches: 1  Memory Usage: 24kB
          Buffers: shared hit=42
          ->  Index Scan using daily_severity_summaries_provider_id_1e41c4dc on daily_severity_summaries  (cost=0.28..76.10 rows=147 width=28) (actual time=0.013..0.147 rows=298 loops=1)
                Index Cond: (provider_id = ANY ('{ca7945b1-dc5e-4aa5-9c20-649f62301608,b94e8fe1-894b-40d0-a4ea-0e65ab68fc30,9eaf0ae9-750c-4d52-9159-9adf3be18511,4d577e3e-f239-4085-b3c2-0d4d073d2c22,55209da2-5918-4cf8-a913-1b655a16e0ff,1d8220e6-b576-4f6e-972a-b99b3c189211,722adf5d-612e-4da2-9d64-109c354e8429,3f088d3c-9806-4a8b-a5cb-d5c022d50ff7,1ace2823-53b7-4496-85e6-5b210d742485,0f2767d9-2f04-4f7f-850a-001960166ead}'::uuid[]))
                Filter: ((date >= '2025-11-03'::date) AND (date <= '2025-12-03'::date) AND (tenant_id = '12646005-9067-4d2a-a098-8bb378604362'::uuid) AND CASE WHEN (current_setting('api.tenant_id'::text, true) IS NULL) THEN false ELSE (tenant_id = (current_setting('api.tenant_id'::text))::uuid) END)
                Rows Removed by Filter: 563
                Buffers: shared hit=42
  Planning:
    Buffers: shared hit=7 read=1
  Planning Time: 0.187 ms
  Execution Time: 0.237 ms

4. Count rows per provider:
  HashAggregate  (cost=56.95..57.45 rows=50 width=24) (actual time=0.422..0.423 rows=10 loops=1)
    Group Key: provider_id
    Batches: 1  Memory Usage: 24kB
    Buffers: shared hit=21
    ->  Index Only Scan using dss_tenant_provider_idx on daily_severity_summaries  (cost=0.28..54.80 rows=431 width=16) (actual time=0.101..0.370 rows=861 loops=1)
          Index Cond: ((tenant_id = '12646005-9067-4d2a-a098-8bb378604362'::uuid) AND (provider_id = ANY ('{ca7945b1-dc5e-4aa5-9c20-649f62301608,b94e8fe1-894b-40d0-a4ea-0e65ab68fc30,9eaf0ae9-750c-4d52-9159-9adf3be18511,4d577e3e-f239-4085-b3c2-0d4d073d2c22,55209da2-5918-4cf8-a913-1b655a16e0ff,1d8220e6-b576-4f6e-972a-b99b3c189211,722adf5d-612e-4da2-9d64-109c354e8429,3f088d3c-9806-4a8b-a5cb-d5c022d50ff7,1ace2823-53b7-4496-85e6-5b210d742485,0f2767d9-2f04-4f7f-850a-001960166ead}'::uuid[])))
          Filter: CASE WHEN (current_setting('api.tenant_id'::text, true) IS NULL) THEN false ELSE (tenant_id = (current_setting('api.tenant_id'::text))::uuid) END
          Heap Fetches: 0
          Buffers: shared hit=21
  Planning:
    Buffers: shared hit=3
  Planning Time: 0.045 ms
  Execution Time: 0.434 ms

prowler@prowler-api:~/backend$ poetry run python manage.py benchmark_severity_over_time --tenant 12646005-9067-4d2a-a098-8bb378604362 --benchmark

================================================================================
BENCHMARK: DailySeveritySummary Performance Analysis
================================================================================
Configuration:
  - Providers: 50
  - Total DailySeveritySummary rows: 4300
  - Avg rows per provider: 86.0
--- Test 1: Fetch all data up to date (endpoint pattern) ---
Range        |    Rows |      Query |      Fetch |      Total
------------------------------------------------------------
7 days       |    4300 |     4.04ms |     3.91ms |     7.96ms
30 days      |    4300 |     4.24ms |     3.77ms |     8.01ms
90 days      |    4300 |     3.92ms |     3.62ms |     7.54ms
180 days     |    4300 |     3.94ms |     3.81ms |     7.75ms
365 days     |    4300 |     3.93ms |     3.62ms |     7.55ms

--- Test 2: Aggregation by date (sum across providers) ---
Range        |    Days |      Query |      Fetch |      Total
------------------------------------------------------------
7 days       |       8 |     0.78ms |     0.09ms |     0.86ms
30 days      |      31 |     1.07ms |     0.01ms |     1.08ms
90 days      |      91 |     2.14ms |     0.02ms |     2.16ms
180 days     |     181 |     3.08ms |     0.06ms |     3.14ms
365 days     |     366 |     4.52ms |     0.15ms |     4.67ms

--- Test 3: Query with varying provider counts (30 days) ---
Providers    |    Rows |      Query |      Fetch |      Total
------------------------------------------------------------
1            |      29 |     0.29ms |     0.03ms |     0.32ms
5            |     149 |     0.40ms |     0.12ms |     0.52ms
10           |     298 |     0.53ms |     0.25ms |     0.78ms
25           |     741 |     0.95ms |     0.61ms |     1.57ms
50           |    1471 |     1.66ms |     1.21ms |     2.87ms

@vicferpoy vicferpoy self-requested a review December 4, 2025 09:42
@AdriiiPRodri AdriiiPRodri force-pushed the PROWLER-25-finding-severity-over-time-component-api branch 2 times, most recently from 21bca44 to f8ddcdb on December 4, 2025 15:13
@AdriiiPRodri AdriiiPRodri force-pushed the PROWLER-25-finding-severity-over-time-component-api branch from f8ddcdb to 93360c4 on December 4, 2025 15:17
josemazo previously approved these changes Dec 4, 2025

@josemazo josemazo left a comment
🚀

@AdriiiPRodri AdriiiPRodri merged commit 2170e5f into master Dec 5, 2025
37 of 44 checks passed
@AdriiiPRodri AdriiiPRodri deleted the PROWLER-25-finding-severity-over-time-component-api branch December 5, 2025 10:19