Add DML workflow SPA and local demo bootstrap #18

Merged
jruszo merged 9 commits into master from feature/dml-workflow-spa on Mar 31, 2026

Conversation


@jruszo jruszo commented Mar 31, 2026

Summary

  • add Vue workflow management and dedicated DML submission flows with matching sql_api endpoints
  • bring local development closer to turnkey testing with demo MySQL/PostgreSQL services and idempotent local seed commands
  • fix follow-up workflow regressions around self-service listing, scheduling lookup, and submission edge cases

Testing

  • npm run build in frontend/
  • docker exec datamingle-app python manage.py test sql_api.tests.TestWorkflow
  • docker exec datamingle-app python manage.py test sql.test_local_demo
  • docker exec datamingle-app python manage.py smoke_local_demo

Summary by CodeRabbit

  • New Features

    • Full workflow console: create/review/execute/schedule/rollback DML requests plus approval-preview and submission-metadata endpoints
    • New DML request creation UI and workflow detail panel; router entries added
    • Local demo seeding and smoke-check commands for quick local demo setup
  • Documentation

    • Local demo guide added; updated instructions to run management commands inside the app container
  • Tests

    • New idempotent demo seeding and workflow endpoint tests
  • Chores

    • Docker compose: added demo MySQL/Postgres services and enabled pgsql engine


coderabbitai Bot commented Mar 31, 2026

Warning

Rate limit exceeded

@jruszo has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 2 minutes and 13 seconds before requesting another review.

Your organization is not enrolled in usage-based pricing. Contact your admin to enable usage-based pricing to continue reviews beyond the rate limit, or try again in 2 minutes and 13 seconds.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: bbe72c03-ab63-4a50-b47b-5ec63729836e

📥 Commits

Reviewing files that changed from the base of the PR and between a106d06 and e15a5ae.

📒 Files selected for processing (1)
  • src/docker-compose/mysql-demo/init/001-demo-schema.sql
📝 Walkthrough

Walkthrough

This PR adds workflow functionality and demo tooling: new frontend Vue views and routes, TypeScript API types/functions, backend workflow REST endpoints/serializers/tests, demo seeding scripts and management commands, local MySQL/Postgres demo services and init scripts, Docker/compose updates, and documentation changes.

Changes

Cohort / File(s) Summary
Frontend Workflow API
frontend/src/lib/api.ts
Added ~20 TypeScript types for workflows and ~12 API functions calling /v1/workflow/... endpoints (list, check, create, detail, content, rollback, review, execute, schedule, execution-window, metadata, approval preview).
Frontend Workflow Views
frontend/src/views/DmlWorkflowCreateView.vue, frontend/src/views/WorkflowsView.vue
Added DML request creation view with SQL editing/checking, approval preview, and submission; replaced minimal workflow view with full console (filters, pagination, detail drawer, polling, review/execute/schedule/rollback actions).
Frontend Routing
frontend/src/router/index.ts
Added routes: /workflows/dml/new (workflow-dml-new) and /workflows/:workflowId (workflow-detail).
Backend Workflow API Layer
sql_api/api_workflow.py
Reworked workflow list to custom GenericAPIView filtering; added endpoints/classes: WorkflowSubmissionMetadata, WorkflowApprovalPreview, WorkflowDetail, WorkflowContentDetail, WorkflowRollbackDetail, WorkflowScheduleCreate, WorkflowExecutionWindowUpdate; added result parsing, permission logic, and scheduling handling.
Backend Serializers
sql_api/serializers.py
Added WorkflowSummarySerializer, WorkflowScheduleSerializer, and WorkflowWindowSerializer.
Backend API Tests
sql_api/tests.py
Added/updated workflow tests covering list scopes, submission metadata, approval preview, detail/content/rollback, cancel flows, scheduling, and execution-window behaviors; also timing/TOTP adjustments and teardown cleanup.
Backend URL Routing
sql_api/urls.py
Wired new /v1/workflow/... routes for submission-metadata, approval-preview, and workflow detail/content/rollback/schedule/execution-window endpoints.
Demo Data Seeding Infrastructure
sql/local_demo.py
New idempotent seed_local_demo() to create demo auth groups, resource groups, users (passwords), instances, instance tags, and workflow audit settings; exports helper lists and demo passwords.
Django Management Commands
sql/management/commands/seed_local_demo.py, sql/management/commands/smoke_local_demo.py, sql/management/__init__.py, sql/management/commands/__init__.py
Added seed_local_demo command calling seeding function and smoke_local_demo for verification (entity presence, API auth, approval-preview, engine connection, DB resources).
Docker Demo Database Services
src/docker-compose/docker-compose.local-arm.yml, src/docker-compose/.env.local-arm
Added mysql_demo and postgres_demo services with init scripts and healthchecks; renamed archery image/container to datamingle-app; added RUN_LOCAL_DEMO_SEED=1; enabled pgsql in .env.
Demo Database Initialization Scripts
src/docker-compose/mysql-demo/init/001-demo-schema.sql, src/docker-compose/postgres-demo/init/001-demo-schema.sql
Added MySQL and PostgreSQL init SQL to create and seed demo databases/tables and upsert rows for the demo instances.
Container Startup & Configuration
src/docker/startup.sh
Added conditional execution of python3 manage.py seed_local_demo when RUN_LOCAL_DEMO_SEED=1.
Documentation Updates
AGENTS.md, README.md, src/docker-compose/LOCAL_DEMO.md
Updated AGENTS.md to reference datamingle-app container; added "Local Demo Users" section to README with demo credentials and reference to LOCAL_DEMO.md; added LOCAL_DEMO.md documenting seeded entities, demo DBs, seed/smoke commands, and reset instructions.

Sequence Diagrams

sequenceDiagram
    actor User
    participant Frontend as Frontend UI<br/>(DML Workflow<br/>Create View)
    participant API as Backend API<br/>(Workflow Endpoints)
    participant DB as Database<br/>(SqlWorkflow,<br/>Audit)

    User->>Frontend: Open /workflows/dml/new
    Frontend->>API: GET /v1/workflow/submission-metadata/
    API->>DB: Query resource_groups, instance_grants
    DB-->>API: Metadata
    API-->>Frontend: Metadata (groups, instances, perms)
    
    User->>Frontend: Select group
    Frontend->>API: GET /v1/workflow/approval-preview/?group_id=X
    API->>DB: Query approval chain
    DB-->>API: Approval nodes
    API-->>Frontend: Approval flow preview

    User->>Frontend: Enter SQL
    Frontend->>API: POST /v1/workflow/check/
    API->>API: Parse & validate SQL
    API-->>Frontend: Check result

    User->>Frontend: Submit workflow
    Frontend->>API: POST /v1/workflow/
    API->>DB: Create SqlWorkflow, logs, audits
    DB-->>API: workflow_id
    API-->>Frontend: Created workflow info
sequenceDiagram
    actor Reviewer
    participant Frontend as Frontend UI<br/>(Workflows Console)
    participant API as Backend API<br/>(Workflow Endpoints)
    participant DB as Database<br/>(SqlWorkflow,<br/>Audit, Timing)
    participant Queue as Task Queue

    Reviewer->>Frontend: Open /workflows
    Frontend->>API: GET /v1/workflow/?scope=pending_review
    API->>DB: Query pending workflows
    DB-->>API: List
    API-->>Frontend: Summary list

    Reviewer->>Frontend: View workflow detail
    Frontend->>API: GET /v1/workflow/{id}/
    API->>DB: Query workflow, approval, logs
    DB-->>API: Detail
    API-->>Frontend: Detail + action flags

    Reviewer->>Frontend: Schedule execution
    Frontend->>API: POST /v1/workflow/{id}/schedule/
    API->>DB: Update status, add schedule
    API->>Queue: add_sql_schedule(run_date)
    Queue-->>API: Job created
    API-->>Frontend: Success

    Frontend->>Frontend: Poll every 5s for status
    Frontend->>API: GET /v1/workflow/?scope=mine
    API->>DB: Query updated workflows
    DB-->>API: Updated state (run_date/status)
    API-->>Frontend: Updated list

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~70 minutes

Poem

🐰 I nibbled code and planted seeds,
Docker pots and SQL deeds.
Forms aligned, approvals sing,
Demo users sprout in spring—
Hopping, checking, workflows bloom 🌱

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage — ⚠️ Warning: docstring coverage is 3.57%, which is below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.

✅ Passed checks (2 passed)

  • Description Check — ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check — ✅ Passed: the title 'Add DML workflow SPA and local demo bootstrap' accurately summarizes the main changes: adding a Vue-based workflow SPA for DML submissions and local development improvements including demo database services and seeding.
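The failing docstring check above can be approximated locally. A rough stdlib sketch (the per-function counting rule is an assumption about how the check works, not CodeRabbit's actual algorithm):

```python
# Rough docstring-coverage estimate over a source string using the stdlib
# ast module; counting rules here are an assumption, not CodeRabbit's exact
# implementation.
import ast

source = '''
def documented():
    """Has a docstring."""

def undocumented():
    pass
'''

tree = ast.parse(source)
funcs = [
    node for node in ast.walk(tree)
    if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef))
]
covered = sum(1 for func in funcs if ast.get_docstring(func))
coverage = 100.0 * covered / len(funcs) if funcs else 100.0
# coverage -> 50.0 for this two-function example
```

Running something like this over the touched modules would show which functions need docstrings before the 80% threshold is met.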

✏️ Tip: You can configure your own custom pre-merge checks in the settings.




@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 5

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
sql_api/urls.py (1)

76-80: ⚠️ Potential issue | 🟡 Minor

Duplicate URL route definitions for instance tags.

Lines 76-77 and 79-80 define identical routes for v1/instance/tag/ and v1/instance/tag/<int:pk>/. While Django uses the first match, these duplicates should be removed to avoid confusion and potential maintenance issues.

🔧 Proposed fix
     path("v1/instance/tag/", api_instance.InstanceTagList.as_view()),
     path("v1/instance/tag/<int:pk>/", api_instance.InstanceTagDetail.as_view()),
     path("v1/instance/metadata/", api_instance.InstanceMetadata.as_view()),
-    path("v1/instance/tag/", api_instance.InstanceTagList.as_view()),
-    path("v1/instance/tag/<int:pk>/", api_instance.InstanceTagDetail.as_view()),
     path(
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@sql_api/urls.py` around lines 76 - 80, The URL patterns contain duplicate
route entries for the instance tag endpoints: remove the repeated path lines
that register api_instance.InstanceTagList.as_view() and
api_instance.InstanceTagDetail.as_view() so each route (v1/instance/tag/ and
v1/instance/tag/<int:pk>/) is only declared once; locate the duplicated uses of
api_instance.InstanceTagList and api_instance.InstanceTagDetail in the
urlpatterns and delete the redundant entries, leaving a single registration for
each view.
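Illustrative only: duplicate registrations like this can be caught mechanically by comparing route path strings. A minimal sketch (the route strings below come from the review comment; the check itself is an assumption, not part of the project's test suite):

```python
# Count duplicate path strings in a urlpatterns-like list; the route strings
# are taken from the review comment, the check itself is just a sketch.
from collections import Counter

routes = [
    "v1/instance/tag/",
    "v1/instance/tag/<int:pk>/",
    "v1/instance/metadata/",
    "v1/instance/tag/",           # duplicate
    "v1/instance/tag/<int:pk>/",  # duplicate
]

duplicates = [route for route, count in Counter(routes).items() if count > 1]
# duplicates -> ["v1/instance/tag/", "v1/instance/tag/<int:pk>/"]
```

A variant of this over `urlpatterns` (using each pattern's route string) would prevent the duplication from reappearing.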
🧹 Nitpick comments (5)
README.md (1)

48-64: Add an explicit “local-only / never production” warning near demo credentials.

This section is useful, but a short caution line will reduce accidental reuse of seeded credentials outside local demo environments.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@README.md` around lines 48 - 64, Add a clear, prominent one-line warning
under the "Local Demo Users" heading (near the seeded credentials list) that
reads something like: "WARNING: These demo credentials are for local development
and testing only — never use in production." Ensure the line is visually
distinct (e.g., bold or prefixed with WARNING) and placed immediately above or
below the seeded app users list so readers cannot miss it; update the README.md
"Local Demo Users" section accordingly.
frontend/src/views/WorkflowsView.vue (2)

1049-1054: Shared reviewForm.audit_remark between review and terminate sections.

Both the "Review action" section (lines 1049-1054) and "Terminate workflow" section (lines 1137-1142) bind to the same reviewForm.audit_remark model. This means text entered in one textarea will appear in the other, which could confuse users or cause unintended submissions.

Consider using separate form fields for review remarks and termination reasons.

Also applies to: 1137-1142

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/src/views/WorkflowsView.vue` around lines 1049 - 1054, The two
textareas both bind to reviewForm.audit_remark which causes shared state between
"Review action" and "Terminate workflow"; create distinct fields on the form
object (e.g., reviewForm.review_remark for the review textarea and
reviewForm.terminate_reason for the terminate textarea), update the v-model
bindings on the corresponding <textarea> elements, and update any submission
handlers or validators that read audit_remark (e.g., review submission and
terminate submission functions) to use the new field names while keeping
existing props like :disabled="reviewSubmitting" and :class="textareaClass".

580-594: Polling watcher should also trigger on selection changes.

The watcher on shouldPollSelectedWorkflow starts/stops the poll timer when the workflow status changes, but the timer keeps the old selectedWorkflow.value.id in closure. If the user selects a different workflow while polling is active, the timer continues polling the previous workflow until status changes.

Consider using selectedWorkflowId as a dependency to reset the timer when the selected workflow changes:

♻️ Proposed fix to watch both status and selection
 watch(
-  shouldPollSelectedWorkflow,
-  (shouldPoll) => {
+  [shouldPollSelectedWorkflow, selectedWorkflowId],
+  ([shouldPoll]) => {
     clearPollTimer()
     if (!shouldPoll || !selectedWorkflow.value) {
       return
     }
     pollTimer = window.setInterval(() => {
       if (selectedWorkflow.value) {
         void loadSelectedWorkflow(selectedWorkflow.value.id, true)
         void loadWorkflows()
       }
     }, 5000)
   },
 )
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/src/views/WorkflowsView.vue` around lines 580 - 594, The poll timer
is only watching shouldPollSelectedWorkflow so its interval callback can close
over an old selectedWorkflow.value.id; update the watcher to depend on both
polling state and selection so the timer resets when selection changes (watch an
array or a computed tuple including shouldPollSelectedWorkflow and
selectedWorkflowId), call clearPollTimer() then recreate pollTimer when
shouldPollSelectedWorkflow is true and selectedWorkflowId is present, and ensure
the interval callback uses selectedWorkflowId (or reads selectedWorkflow.value)
so loadSelectedWorkflow and loadWorkflows always run for the current selection;
keep existing helpers clearPollTimer, pollTimer, loadSelectedWorkflow, and
loadWorkflows.
sql/management/commands/smoke_local_demo.py (1)

116-119: Minor: Error check default value is inconsistent with ResultSet API.

Per sql/engines/models.py, ResultSet.error is None on success, not an empty string. While getattr(connection_result, "error", "") works due to Python truthiness, using None as the default would be more accurate:

♻️ Minor consistency fix
-            if getattr(connection_result, "error", ""):
+            if getattr(connection_result, "error", None):
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@sql/management/commands/smoke_local_demo.py` around lines 116 - 119, The
error check uses getattr(connection_result, "error", "") but ResultSet.error is
None on success; change the default to None and check truthiness accordingly by
using getattr(connection_result, "error", None) (keeping the existing raise of
CommandError with instance.instance_name and connection_result.error) so the
code aligns with the ResultSet.error API and is more semantically correct when
no error is present.
frontend/src/views/DmlWorkflowCreateView.vue (1)

326-347: Redundant database reload when group changes but instance is retained.

When form.groupId changes but the current instance remains valid (line 344-346), loadDatabases is called. However, the database list shouldn't change just because the resource group changed—the instance is the same.

This is a minor inefficiency but causes an unnecessary API call.

♻️ Skip database reload when instance is unchanged
 watch(
   () => form.groupId,
   (groupIdValue) => {
     const retainedInstanceId = Number(form.instanceId)
     const keepsCurrentInstance = filteredInstances.value.some((instance) => instance.id === retainedInstanceId)

     approvalPreview.value = null
     approvalError.value = ''
     form.instanceId = keepsCurrentInstance ? form.instanceId : ''
-    form.dbName = ''
-    availableDatabases.value = []
+    if (!keepsCurrentInstance) {
+      form.dbName = ''
+      availableDatabases.value = []
+    }
     databasesError.value = ''
     invalidateCheck()

     const groupId = Number(groupIdValue)
     if (groupId) {
       void loadApprovalPreview(groupId)
     }
-    if (groupId && keepsCurrentInstance && retainedInstanceId) {
-      void loadDatabases(retainedInstanceId)
-    }
   },
 )
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@frontend/src/views/DmlWorkflowCreateView.vue` around lines 326 - 347, The
watcher on form.groupId currently calls loadDatabases(retainedInstanceId) when
the instance is still valid (keepsCurrentInstance), causing an unnecessary API
call; remove the conditional call to loadDatabases from this watch (the block
using retainedInstanceId / keepsCurrentInstance / loadDatabases) and instead
ensure database loading is triggered from the instance-change logic (e.g., the
watch or handler that reacts to form.instanceId), so databases are only loaded
when the instance selection actually changes.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@sql_api/api_workflow.py`:
- Around line 451-464: The loop that builds group_names and review_info from
audit_auth_groups can crash on malformed or deleted IDs; wrap the parsing and
Group.objects.get(id=int(auth_group_id)) lookup in validation that catches
ValueError and Group.DoesNotExist (and any non-integer/empty parts), and instead
of letting it 500, raise or return a clear configuration error indicating
stale/malformed approval groups for the workflow (mentioning audit_auth_groups),
so callers see a user-facing config error; keep the rest of the logic that
builds group_names and review_info unchanged.
- Around line 778-793: The scheduler registration add_sql_schedule(...) is being
called inside the transaction.atomic() block along with workflow.status update
and Audit.add_log, which can leave an external django-q job registered if the DB
transaction later rolls back; move the call to add_sql_schedule(schedule_name,
run_date, workflow_id) out of the transaction.atomic() so that only the DB
mutations (workflow.status change and Audit.add_log) occur inside the atomic
block and the external side effect is performed after the transaction commits
successfully; keep the same parameters and ensure the call happens only after
workflow.save(update_fields=["status"]) and Audit.add_log(...) complete.

In `@sql_api/serializers.py`:
- Around line 29-35: Remove the unused imports from serializers.py: drop Const
from the common.utils.const import line and remove can_execute, can_timingtask,
can_cancel, can_rollback and task_info (the latter from sql.utils.tasks) so
serializers.py only imports symbols it uses (e.g., WorkflowType, WorkflowStatus,
SysConfig, traceback, logging, OffLineDownLoad). Then update api_workflow.py so
it no longer re-exports those functions from .serializers — import can_execute,
can_timingtask, can_cancel, can_rollback directly from sql.utils.sql_review
instead of from serializers to eliminate the unnecessary re-export pattern.

In `@sql/local_demo.py`:
- Around line 242-260: seed_local_demo currently performs destructive actions
(creating a fixed-password superuser and overwriting auth groups) without
guarding environment; add an explicit environment safety check at the start of
seed_local_demo that refuses to run unless the process is a local/dev
environment (e.g., settings.DEBUG is True or a dedicated env var like DJANGO_ENV
== "local"/"dev"); if the check fails, raise an exception (RuntimeError or
PermissionError) or log and abort before calling _seed_auth_groups, _seed_users,
or _seed_instances to ensure superuser/group seeding never runs against
shared/prod DBs. Ensure the check references seed_local_demo and prevents calls
to helpers _seed_users and _seed_auth_groups when not in a safe environment.

In `@src/docker-compose/mysql-demo/init/001-demo-schema.sql`:
- Around line 1-8: The SQL script depends on an external env var to create
demo_orders but only explicitly creates demo_billing; add an explicit CREATE
DATABASE IF NOT EXISTS for demo_orders at the top of the script (before the
existing CREATE DATABASE IF NOT EXISTS demo_billing) so both databases are
created within this script and the subsequent GRANT ALL PRIVILEGES and USE
demo_orders statements target an explicitly-created database.


ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 47736e8a-4cb6-49d5-b0ff-6f8e11dc68e6

📥 Commits

Reviewing files that changed from the base of the PR and between e94ad72 and 1b10099.

📒 Files selected for processing (22)
  • AGENTS.md
  • README.md
  • frontend/src/lib/api.ts
  • frontend/src/router/index.ts
  • frontend/src/views/DmlWorkflowCreateView.vue
  • frontend/src/views/WorkflowsView.vue
  • sql/local_demo.py
  • sql/management/__init__.py
  • sql/management/commands/__init__.py
  • sql/management/commands/seed_local_demo.py
  • sql/management/commands/smoke_local_demo.py
  • sql/test_local_demo.py
  • sql_api/api_workflow.py
  • sql_api/serializers.py
  • sql_api/tests.py
  • sql_api/urls.py
  • src/docker-compose/.env.local-arm
  • src/docker-compose/LOCAL_DEMO.md
  • src/docker-compose/docker-compose.local-arm.yml
  • src/docker-compose/mysql-demo/init/001-demo-schema.sql
  • src/docker-compose/postgres-demo/init/001-demo-schema.sql
  • src/docker/startup.sh

Comment thread sql_api/api_workflow.py
Comment thread sql_api/api_workflow.py
Comment on lines +778 to +793
with transaction.atomic():
    workflow.status = "workflow_timingtask"
    workflow.save(update_fields=["status"])
    add_sql_schedule(schedule_name, run_date, workflow_id)
    audit_id = Audit.detail_by_workflow_id(
        workflow_id=workflow_id,
        workflow_type=WorkflowType.SQL_REVIEW,
    ).audit_id
    Audit.add_log(
        audit_id=audit_id,
        operation_type=4,
        operation_type_desc="Scheduled Execution",
        operation_info="Scheduled execution time: {}".format(run_date),
        operator=request.user.username,
        operator_display=request.user.display,
    )

⚠️ Potential issue | 🟠 Major

Move scheduler registration out of the atomic block.

add_sql_schedule() is an external side effect. If anything later in this transaction rolls back, the django-q job can survive while the workflow status/log do not, leaving scheduler state and DB state out of sync.

Suggested direction
         with transaction.atomic():
             workflow.status = "workflow_timingtask"
             workflow.save(update_fields=["status"])
-            add_sql_schedule(schedule_name, run_date, workflow_id)
             audit_id = Audit.detail_by_workflow_id(
                 workflow_id=workflow_id,
                 workflow_type=WorkflowType.SQL_REVIEW,
             ).audit_id
             Audit.add_log(
                 audit_id=audit_id,
                 operation_type=4,
                 operation_type_desc="Scheduled Execution",
                 operation_info="Scheduled execution time: {}".format(run_date),
                 operator=request.user.username,
                 operator_display=request.user.display,
             )
+            transaction.on_commit(
+                lambda: add_sql_schedule(schedule_name, run_date, workflow_id)
+            )
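The hazard here is independent of Django: side effects registered during a transaction should fire only after commit. The defer-until-commit pattern behind `transaction.on_commit` can be sketched with a toy transaction manager (illustrative only, not Django's implementation):

```python
# Toy context manager: callbacks registered via on_commit() run only if the
# block exits without an exception, mirroring Django's transaction.on_commit.
class Transaction:
    def __init__(self):
        self._on_commit = []

    def on_commit(self, callback):
        self._on_commit.append(callback)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:  # commit: fire deferred side effects
            for callback in self._on_commit:
                callback()
        return False  # rollback: callbacks are dropped, exception propagates

scheduled = []

# Rolled-back transaction: the "scheduler job" is never registered.
try:
    with Transaction() as txn:
        txn.on_commit(lambda: scheduled.append("job-1"))
        raise ValueError("simulated rollback")
except ValueError:
    pass

# Committed transaction: the side effect runs after the block completes.
with Transaction() as txn:
    txn.on_commit(lambda: scheduled.append("job-2"))
# scheduled -> ["job-2"]
```

With this shape, a rollback can never leave an orphaned django-q job behind, which is exactly what the suggested `transaction.on_commit` change achieves.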

Comment thread sql_api/serializers.py
Comment on lines +29 to +35
from common.utils.const import WorkflowType, WorkflowStatus, Const
from common.config import SysConfig
import traceback
import logging
from sql.offlinedownload import OffLineDownLoad
from sql.utils.sql_review import can_execute, can_timingtask, can_cancel, can_rollback
from sql.utils.tasks import task_info

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Check if these imports are actually used in serializers.py
rg -n "can_execute|can_timingtask|can_cancel|can_rollback|task_info|Const" sql_api/serializers.py | grep -v "^.*:.*import"

Repository: jruszo/Datamingle

Length of output: 43


🏁 Script executed:

# First, let's see the exact imports at lines 29-35
head -40 sql_api/serializers.py | tail -15

Repository: jruszo/Datamingle

Length of output: 552


🏁 Script executed:

# Check if api_workflow.py imports these from serializers.py
rg -n "from.*serializers import|import.*serializers" sql_api/api_workflow.py

Repository: jruszo/Datamingle

Length of output: 170


🏁 Script executed:

# Check if these utilities are used elsewhere in api_workflow.py
rg -n "can_execute|can_timingtask|can_cancel|can_rollback|task_info" sql_api/api_workflow.py | head -20

Repository: jruszo/Datamingle

Length of output: 1002


🏁 Script executed:

rg -n "Const\." sql_api/serializers.py

Repository: jruszo/Datamingle

Length of output: 43


Remove unused imports from serializers.py.

The imports Const, can_execute, can_timingtask, can_cancel, can_rollback, and task_info are present at lines 29 and 34-36 but are not used within this file. Additionally, api_workflow.py imports can_execute, can_timingtask, can_cancel, and can_rollback from .serializers (lines 39-42), creating an unnecessary re-export pattern. These should instead be imported directly from sql.utils.sql_review in api_workflow.py. Remove the unused imports from serializers.py.


Comment thread sql/local_demo.py
Comment on lines +242 to +260
def seed_local_demo(write_line=None):
    def log(message):
        if write_line:
            write_line(message)

    with transaction.atomic():
        auth_groups = _seed_auth_groups(log)
        resource_groups = _seed_resource_groups(log)
        tags = _seed_instance_tags(log)
        users = _seed_users(auth_groups, resource_groups, log)
        instances = _seed_instances(resource_groups, tags, log)
        _seed_workflow_settings(auth_groups, resource_groups, log)

    return {
        "auth_groups": list(auth_groups.keys()),
        "resource_groups": [group.group_name for group in resource_groups.values()],
        "users": list(users.keys()),
        "instances": [instance.instance_name for instance in instances.values()],
    }

⚠️ Potential issue | 🔴 Critical

Refuse to run this seeder outside local/dev.

seed_local_demo() currently reaches helpers that create a superuser with a fixed password and overwrite generic auth groups like DBA/RD in place. If someone points this at a shared database, it can create a known-credential admin and clobber existing role permissions.

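One way to implement the requested guard, sketched without Django (the `DJANGO_ENV` variable name and its allowed values are assumptions here — the project might key off `settings.DEBUG` or another flag instead):

```python
# Hypothetical environment guard for seed_local_demo(): refuse to run unless
# an environment variable marks this process as local/dev. The variable name
# and allowed values are illustrative assumptions.
import os

def assert_local_environment(env_var="DJANGO_ENV", allowed=("local", "dev")):
    value = os.environ.get(env_var, "").lower()
    if value not in allowed:
        raise RuntimeError(
            f"seed_local_demo refused to run: {env_var}={value!r} "
            f"is not one of {allowed}"
        )

# seed_local_demo() would call assert_local_environment() as its first
# statement, before touching auth groups, users, or instances.
```

Failing closed like this means a misconfigured shared or production deployment aborts loudly instead of silently gaining a known-credential superuser.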

Comment thread src/docker-compose/mysql-demo/init/001-demo-schema.sql
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>