feat: Contact Merge + Duplicate Detection #157
Conversation
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…s table Three migrations: merge tracking on contacts, pg_trgm + name trigram index for fuzzy matching, and contact_duplicate_dismissals table. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…point Merge engine handles all FK references: simple re-points, trip_travelers sub-merge, relationship canonicalization, tag merge, portal/Stripe policies. Includes duplicate detection with pg_trgm similarity() and dismissals. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
POST /contacts/merge, GET /contacts/duplicates, POST /contacts/duplicates/dismiss. Registered before ContactsController for route priority. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Primary selector, field-by-field radio overrides, inheritance summary. Defaults to primary's values, auto-selects non-null for empty fields. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
When navigating to a contact with mergedIntoContactId set, immediately redirect to the surviving contact via router.replace and show a holding screen while the redirect resolves. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…apper Adds mergedIntoContactId, mergedAt, mergedBy to ContactResponseDto and the contacts service mapper so the frontend redirect and merge UI work. Also fixes type narrowing in merge-editor-dialog. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Caution: Review failed. The pull request is closed.
ℹ️ Recent review info
⚙️ Run configuration
Configuration used: defaults | Review profile: CHILL | Plan: Pro | Run ID:
📒 Files selected for processing (7)
📝 Walkthrough
This PR implements a comprehensive contact merge and duplicate detection feature, including UI components for merging contacts, backend API endpoints with atomic database operations, duplicate detection algorithms (email/phone/DOB matching), database schema extensions, and supporting React Query hooks.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant MergeDialog as Merge Editor Dialog
    participant API as Backend API
    participant DB as Database
    User->>MergeDialog: Select primary contact, configure field overrides
    MergeDialog->>API: POST /contacts/merge (ContactMergeRequest)
    API->>DB: BEGIN TRANSACTION
    DB->>DB: Validate contacts exist, not previously merged
    DB->>DB: Update primary contact with field overrides
    DB->>DB: Re-point related records (trips, tags, groups, etc.)
    DB->>DB: Handle trip_travelers, relationships, shares, portal users
    DB->>DB: Soft-delete secondary contact (merged_into_contact_id, merged_at, merged_by)
    DB->>DB: COMMIT TRANSACTION
    DB-->>API: Merge complete with repoint counts
    API-->>MergeDialog: ContactMergeResult { success, mergedContactId, repointed }
    MergeDialog->>MergeDialog: Show success toast
    MergeDialog-->>User: Close dialog, refresh contact list
```

```mermaid
sequenceDiagram
    participant User
    participant DuplicatesPage as Duplicates Page
    participant API as Backend API
    participant DB as Database
    User->>DuplicatesPage: Click "Scan for Duplicates"
    DuplicatesPage->>API: GET /contacts/duplicates
    API->>DB: Query Tier 1: exact email matches (high confidence)
    DB-->>API: Email duplicate pairs
    API->>DB: Query Tier 2: phone + name similarity >= 0.4 (high confidence)
    DB-->>API: Phone/name duplicate pairs
    API->>DB: Query Tier 3: DOB + name similarity >= 0.4 (medium confidence)
    DB-->>API: DOB/name duplicate pairs
    API->>DB: Exclude previously dismissed duplicates
    DB-->>API: Filtered DuplicateDetectionResult
    API-->>DuplicatesPage: { groups: DuplicateGroup[], totalGroups }
    DuplicatesPage->>DuplicatesPage: Render DuplicateGroupCards grid
    User->>DuplicatesPage: Click "Merge" or "Not Duplicates"
    DuplicatesPage->>API: POST /contacts/duplicates/dismiss (or trigger merge)
    API->>DB: Insert contact_duplicate_dismissals record
    DB-->>API: Dismissal recorded
    API-->>DuplicatesPage: Success response
    DuplicatesPage->>DuplicatesPage: Refetch duplicates, show toast
```
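The tiered scan in the second diagram can be sketched as a query. Everything below is a hypothetical reconstruction from the walkthrough: only `agency_id` and `merged_into_contact_id` appear in the PR itself, so column names like `phone`, `first_name`, and `last_name` are assumptions, while `similarity()` comes from the `pg_trgm` extension this PR installs.

```typescript
// Hypothetical Tier 2 (phone + name similarity) query sketched from the
// walkthrough. The 0.4 threshold and the merged_into_contact_id soft-delete
// column are from this PR; table/column names are illustrative assumptions.
const TIER2_PHONE_NAME_SQL = `
  SELECT c1.id AS contact_id1,
         c2.id AS contact_id2,
         similarity(c1.first_name || ' ' || c1.last_name,
                    c2.first_name || ' ' || c2.last_name) AS name_score
  FROM contacts c1
  JOIN contacts c2
    ON c2.agency_id = c1.agency_id
   AND c2.id > c1.id            -- canonical ordering avoids mirrored pairs
   AND c2.phone = c1.phone      -- assumed column name
  WHERE c1.merged_into_contact_id IS NULL
    AND c2.merged_into_contact_id IS NULL
    AND similarity(c1.first_name || ' ' || c1.last_name,
                   c2.first_name || ' ' || c2.last_name) >= 0.4
`
```

Tier 1 (exact email) and Tier 3 (DOB + name) would follow the same shape with a different join predicate and confidence label.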
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks | ✅ 2 | ❌ 1
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
- Guard in handleDragEnd prevents Lead contacts from being dragged to non-prospecting columns; shows a descriptive toast error instead of making a rejected API call
- handleActiveChange now destructures isActive out of filters when "All" is selected, so the key is completely absent from the filter object rather than being set to undefined
- Fix activeValue default from 'active' to 'all' so the select correctly reflects the unfiltered state

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Missing from the simple repoints list per spec. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Actionable comments posted: 12
🧹 Nitpick comments (5)
apps/admin/src/app/contacts/_components/bulk-actions-toolbar.tsx (2)
81-114: Missing Sentry error capture for runtime errors.
Per coding guidelines, the admin application must capture runtime errors with Sentry. The `catch` blocks in `handleStatusChange` (and similarly in `handleDelete`) silently count errors without reporting them to Sentry. Consider capturing unexpected errors:
🛡️ Proposed fix
```diff
+import * as Sentry from '@sentry/nextjs'

 // In the catch block:
 } catch (e) {
+  Sentry.captureException(e, { tags: { operation: 'bulk_status_change' } })
   errorCount++
 }
```

As per coding guidelines: "Admin application must capture runtime errors with Sentry."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/admin/src/app/contacts/_components/bulk-actions-toolbar.tsx` around lines 81 - 114, The catch blocks in handleStatusChange (and the similar catch in handleDelete) only increment error counters and do not report exceptions to Sentry; update those catch blocks to accept the error object (e) and call Sentry.captureException(e, { contexts: { action: 'bulkStatusChange', id } }) or similar contextual metadata so runtime errors are recorded; ensure you import Sentry where needed and include enough context (id, newStatus or delete action) to make debugging easier while preserving the existing success/error counting and toast behavior.
189-194: CSV escape function doesn't handle carriage returns.
The `escapeCsv` helper checks for `,`, `"`, and `\n`, but doesn't handle `\r` (carriage return). Some CSV parsers treat `\r` as a line terminator.
♻️ Suggested fix
```diff
 const escapeCsv = (val: string) => {
-  if (val.includes(',') || val.includes('"') || val.includes('\n')) {
+  if (val.includes(',') || val.includes('"') || val.includes('\n') || val.includes('\r')) {
     return `"${val.replace(/"/g, '""')}"`
   }
   return val
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/admin/src/app/contacts/_components/bulk-actions-toolbar.tsx` around lines 189 - 194, The escapeCsv function fails to treat carriage returns as line terminators; update the escapeCsv helper so it also checks for '\r' (e.g., include val.includes('\r') in the condition) and still returns the quoted value with internal double-quotes escaped (keep the existing val.replace(/"/g, '""') behavior) so values containing CR are properly quoted for CSV parsers.

packages/shared-types/src/api/contacts.types.ts (1)
660-664: Consider using `DuplicateMatchType` for `matchType` in `DuplicateDismissRequest`.
`DuplicateDismissRequest.matchType` is typed as `string`, but `DuplicateMatchType` is already defined as `'email' | 'phone_name' | 'dob_name'`. Using the union type would provide compile-time validation.
♻️ Proposed fix
```diff
 export interface DuplicateDismissRequest {
   contactId1: string
   contactId2: string
-  matchType: string
+  matchType: DuplicateMatchType
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/shared-types/src/api/contacts.types.ts` around lines 660 - 664, DuplicateDismissRequest currently types matchType as string which loses compile-time safety; change the matchType property on the DuplicateDismissRequest interface to use the existing union type DuplicateMatchType (i.e., replace the string type with DuplicateMatchType in the DuplicateDismissRequest declaration) so only 'email' | 'phone_name' | 'dob_name' values are allowed.

apps/api/src/contacts/contact-merge.service.ts (1)
113-120: Consider adding Sentry error capture for unexpected failures.
Per coding guidelines, API runtime errors must be captured by Sentry. While the service throws typed exceptions (BadRequestException, etc.), unexpected errors during the complex merge transaction should be captured for monitoring.
Consider wrapping the transaction in a try-catch that captures to Sentry before re-throwing.
As per coding guidelines: "All runtime errors in the API must be captured by Sentry."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/contacts/contact-merge.service.ts` around lines 113 - 120, The ContactMergeService currently runs a complex merge transaction without capturing unexpected runtime errors for Sentry; wrap the transaction execution inside a try-catch in the method that performs the merge (the merge/transaction block inside ContactMergeService) and in the catch call Sentry.captureException(err) (or the project Sentry wrapper) before re-throwing the error so typed exceptions still propagate; ensure you import the Sentry capture function and keep the existing logger (this change should target the transaction/merge method within ContactMergeService).apps/admin/src/app/contacts/_components/merge-editor-dialog.tsx (1)
163-169: Capture merge failures to Sentry.
Per coding guidelines, runtime errors in the admin app must be captured by Sentry. The merge failure is shown to the user via toast, but the underlying error is not reported for monitoring.
🛡️ Proposed fix
```diff
+import * as Sentry from '@sentry/nextjs'

 } catch (error) {
+  Sentry.captureException(error, {
+    tags: { operation: 'contact_merge' },
+    extra: { primaryId: primaryId, secondaryId: secondaryId },
+  })
   toast({
     title: 'Merge failed',
```

As per coding guidelines: "Admin application must capture runtime errors with Sentry."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/admin/src/app/contacts/_components/merge-editor-dialog.tsx` around lines 163 - 169, The catch block in merge-editor-dialog.tsx currently shows a toast but does not report the error to Sentry; import Sentry (e.g. from '@sentry/react' or your app's Sentry wrapper) and call Sentry.captureException(error, { extra: { action: 'mergeContacts', /* include relevant context such as contact IDs or merge payload from the scope where this catch lives */ } }) inside the catch where the toast is emitted (keep the toast as-is) and ensure the caught error is forwarded to Sentry (cast to any if needed) so runtime merge failures are recorded for monitoring.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/admin/src/app/contacts/_components/contact-print-report.ts`:
- Line 13: Escape each tag before injecting into the report HTML: replace the
raw join expression that renders tags (the template expression using (c.tags ||
[]).join(', ')) with an escaped version that maps over c.tags and applies the
existing esc() helper (e.g., (c.tags || []).map(esc).join(', ')); ensure you use
esc() (or an equivalent HTML-encoding helper) so every tag value is HTML-escaped
before document.write() renders the report.
- Line 12: The DOB rendering uses new Date(c.dateOfBirth) which causes timezone
shifts for date-only strings; replace that parsing with parseFlexibleDate from
`@tailfire/shared-types/api` so date-only values are treated correctly (use
parseFlexibleDate(c.dateOfBirth) before calling toLocaleDateString('en-CA')
wherever dateOfBirth is rendered in contact-print-report.ts). Ensure you import
parseFlexibleDate and null-check its result the same way the current code guards
c.dateOfBirth.
In `@apps/admin/src/app/contacts/_components/contacts-filter-panel.tsx`:
- Around line 93-101: The "All" branch currently deletes isActive from filters
which can trigger server default filtering; change the "All" case in
contacts-filter-panel.tsx to call onFiltersChange({ ...filters, isActive: null,
page: 1 }) instead of removing the key, and update the hook that builds API
params (use-contacts.ts — the function that constructs query/params) to
recognize isActive === null as the explicit "no filter" signal and omit sending
the isActive query parameter to the server, leaving true/false values intact.
In `@apps/admin/src/app/contacts/duplicates/_components/duplicate-group-card.tsx`:
- Around line 64-67: The current DOB rendering uses new
Date(contact.dateOfBirth).toLocaleDateString() which misparses YYYY-MM-DD as
UTC; replace it with the safe parser or component: either call
parseISODate(contact.dateOfBirth) and then format via toLocaleDateString (or
pass the parsed Date to your output), or render the DateDisplay component with
value={contact.dateOfBirth} and format="date" (instead of new Date(...)); update
the JSX in the DuplicateGroupCard where contact.dateOfBirth is used to use
parseISODate or DateDisplay to ensure correct local-day rendering.
In `@apps/admin/src/app/contacts/duplicates/page.tsx`:
- Around line 15-18: The duplicates page currently ignores query errors and
treats failed scans like "no results" and only toasts dismiss failures; update
the UI logic around useDuplicateDetection (data, isLoading, isFetching, refetch)
to explicitly handle and render an error state when the duplicates query has an
error (show retry button that calls refetch), and when using useDismissDuplicate
(dismissMutation) capture failures in Sentry via Sentry.captureException(error)
tagging the event with process.env.NEXT_PUBLIC_SENTRY_ENVIRONMENT and also
render or surface a persistent error UI (not just a toast) for dismiss failures
so the admin can retry; ensure both query error handling and
dismissMutation.onError call include Sentry capture with the environment tag and
surface a retryable error component instead of falling back to the pre-scan
empty state.
In `@apps/admin/src/hooks/use-contact-merge.ts`:
- Around line 22-29: useDuplicateDetection() is disabled (enabled: false) so
invalidateQueries() in the dismiss mutation doesn't update the cached
DuplicateDetectionResult and dismissed pairs remain visible; fix by updating the
query cache in the dismiss mutation's onSuccess: use the queryClient (or
equivalent) to call setQueryData for the queryKey [...contactKeys.all,
'duplicates'] and remove or mark the dismissed pair from the
DuplicateDetectionResult payload (or alternatively call and await refetch() from
useDuplicateDetection() before closing dismiss UI); apply the same change for
the other duplicate-detection hook instance mentioned (lines 31-39).
In `@apps/api/src/contacts/contact-merge.controller.ts`:
- Around line 20-23: The controller is importing ContactMergeRequest and
DuplicateDismissRequest as type-only interfaces so NestJS can't perform runtime
validation for destructive endpoints; replace these type-only uses with concrete
DTO classes (e.g., ContactMergeDto, DuplicateDismissDto) that declare validation
decorators (class-validator) and use class-transformer, or apply explicit
ValidationPipe at the controller methods (merge/dismiss) to validate the
incoming @Body(); update the controller method signatures to accept the new DTO
classes (and add @UsePipes(new ValidationPipe({ whitelist: true, transform: true }))
or module-level ValidationPipe) so runtime validation runs for the merge and
dismiss endpoints.
In `@apps/api/src/contacts/contact-merge.service.ts`:
- Around line 980-1044: The returned object from rowToListItem is missing the
new merge-tracking fields from ContactResponseDto (mergedIntoContactId,
mergedAt, mergedBy); update the rowToListItem helper to include these properties
(e.g., set mergedIntoContactId: null, mergedAt: null, mergedBy: null or map from
existing variables if available) so the object conforms to
ContactListItemDto/ContactResponseDto and TypeScript errors are resolved.
- Around line 925-944: The current SELECT-then-INSERT in dismissDuplicate is
racy; replace it with a single INSERT INTO contact_duplicate_dismissals
(agency_id, contact_id1, contact_id2, match_type, dismissed_by) VALUES (...) ON
CONFLICT (agency_id, contact_id1, contact_id2, match_type) DO NOTHING so the
operation is atomic and idempotent, and amend the migration that created the
UNIQUE constraint on contact_duplicate_dismissals to include agency_id in the
unique key (use UNIQUE(agency_id, contact_id1, contact_id2, match_type) or
equivalent) so conflicts are scoped per agency.
In `@docs/superpowers/plans/2026-04-03-contact-merge.md`:
- Around line 52-65: The UNIQUE constraint on the contact_duplicate_dismissals
table is missing agency scoping; update the UNIQUE clause in the
contact_duplicate_dismissals migration (and any spec) to include agency_id so
the constraint reads uniqueness over agency_id, contact_id1, contact_id2,
match_type rather than only contact_id1, contact_id2, match_type.
In `@docs/superpowers/specs/2026-04-03-contact-merge-design.md`:
- Around line 227-237: The UNIQUE constraint on contact_duplicate_dismissals
currently uses UNIQUE (contact_id1, contact_id2, match_type) which omits
agency_id; update the table spec (contact_duplicate_dismissals) to include
agency_id in the unique index so it reads UNIQUE (agency_id, contact_id1,
contact_id2, match_type) to match the corrected migration and ensure uniqueness
is scoped per agency for rows identified by contact_id1/contact_id2 and
match_type.
In
`@packages/database/src/migrations/20260403150610_create_duplicate_dismissals.sql`:
- Around line 1-10: The UNIQUE constraint on contact_duplicate_dismissals
currently enforces (contact_id1, contact_id2, match_type) but dismissDuplicate()
looks up by agency_id, match_type and the canonical pair via LEAST/GREATEST;
update the schema to support that lookup by either (A) enforcing canonical
ordering on insert/update (ensure contact_id1 = LEAST(...), contact_id2 =
GREATEST(...)) and keep a UNIQUE (agency_id, contact_id1, contact_id2,
match_type), or (B) add a unique expression index/constraint using
LEAST(contact_id1, contact_id2), GREATEST(contact_id1, contact_id2) scoped by
agency_id and match_type so dismissDuplicate() queries use the index instead of
scanning; pick one approach and apply it to contact_duplicate_dismissals to
match the canonical-pair lookup used in dismissDuplicate().
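The last two items (canonical pair ordering plus an agency-scoped unique key with `ON CONFLICT ... DO NOTHING`) can be sketched as follows. `canonicalPair`, the placeholder IDs, and the exact SQL text are hypothetical illustrations of the suggested approach, not the project's code:

```typescript
// Order the pair so contact_id1 is always the lexicographically smaller id;
// combined with ON CONFLICT ... DO NOTHING this makes dismissal a single
// atomic, idempotent insert instead of a racy SELECT-then-INSERT.
function canonicalPair(a: string, b: string): [string, string] {
  return a < b ? [a, b] : [b, a]
}

// Hypothetical placeholder IDs, purely for illustration.
const [contactId1, contactId2] = canonicalPair('id-beta', 'id-alpha')

const DISMISS_SQL = `
  INSERT INTO contact_duplicate_dismissals
    (agency_id, contact_id1, contact_id2, match_type, dismissed_by)
  VALUES ($1, $2, $3, $4, $5)
  ON CONFLICT (agency_id, contact_id1, contact_id2, match_type) DO NOTHING
`
```

With the pair canonicalized on write, the unique constraint `(agency_id, contact_id1, contact_id2, match_type)` covers both orderings of the same two contacts.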
---
Nitpick comments:
In `@apps/admin/src/app/contacts/_components/bulk-actions-toolbar.tsx`:
- Around line 81-114: The catch blocks in handleStatusChange (and the similar
catch in handleDelete) only increment error counters and do not report
exceptions to Sentry; update those catch blocks to accept the error object (e)
and call Sentry.captureException(e, { contexts: { action: 'bulkStatusChange', id
} }) or similar contextual metadata so runtime errors are recorded; ensure you
import Sentry where needed and include enough context (id, newStatus or delete
action) to make debugging easier while preserving the existing success/error
counting and toast behavior.
- Around line 189-194: The escapeCsv function fails to treat carriage returns as
line terminators; update the escapeCsv helper so it also checks for '\r' (e.g.,
include val.includes('\r') in the condition) and still returns the quoted value
with internal double-quotes escaped (keep the existing val.replace(/"/g, '""')
behavior) so values containing CR are properly quoted for CSV parsers.
In `@apps/admin/src/app/contacts/_components/merge-editor-dialog.tsx`:
- Around line 163-169: The catch block in merge-editor-dialog.tsx currently
shows a toast but does not report the error to Sentry; import Sentry (e.g. from
'@sentry/react' or your app's Sentry wrapper) and call
Sentry.captureException(error, { extra: { action: 'mergeContacts', /* include
relevant context such as contact IDs or merge payload from the scope where this
catch lives */ } }) inside the catch where the toast is emitted (keep the toast
as-is) and ensure the caught error is forwarded to Sentry (cast to any if
needed) so runtime merge failures are recorded for monitoring.
In `@apps/api/src/contacts/contact-merge.service.ts`:
- Around line 113-120: The ContactMergeService currently runs a complex merge
transaction without capturing unexpected runtime errors for Sentry; wrap the
transaction execution inside a try-catch in the method that performs the merge
(the merge/transaction block inside ContactMergeService) and in the catch call
Sentry.captureException(err) (or the project Sentry wrapper) before re-throwing
the error so typed exceptions still propagate; ensure you import the Sentry
capture function and keep the existing logger (this change should target the
transaction/merge method within ContactMergeService).
In `@packages/shared-types/src/api/contacts.types.ts`:
- Around line 660-664: DuplicateDismissRequest currently types matchType as
string which loses compile-time safety; change the matchType property on the
DuplicateDismissRequest interface to use the existing union type
DuplicateMatchType (i.e., replace the string type with DuplicateMatchType in the
DuplicateDismissRequest declaration) so only 'email' | 'phone_name' | 'dob_name'
values are allowed.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 2f76632e-04ea-4d3e-bce4-426b93f0c0ff
📒 Files selected for processing (25)
- apps/admin/src/app/contacts/[id]/page.tsx
- apps/admin/src/app/contacts/_components/bulk-actions-toolbar.tsx
- apps/admin/src/app/contacts/_components/contact-print-report.ts
- apps/admin/src/app/contacts/_components/contacts-filter-panel.tsx
- apps/admin/src/app/contacts/_components/contacts-kanban.tsx
- apps/admin/src/app/contacts/_components/merge-editor-dialog.tsx
- apps/admin/src/app/contacts/duplicates/_components/duplicate-group-card.tsx
- apps/admin/src/app/contacts/duplicates/page.tsx
- apps/admin/src/app/contacts/page.tsx
- apps/admin/src/hooks/use-contact-merge.ts
- apps/api/src/contacts/contact-merge.controller.ts
- apps/api/src/contacts/contact-merge.service.ts
- apps/api/src/contacts/contacts.module.ts
- apps/api/src/contacts/contacts.service.ts
- docs/superpowers/plans/2026-04-03-contact-merge.md
- docs/superpowers/specs/2026-04-03-contact-merge-design.md
- packages/database/src/migrations/20260403150608_add_contact_merge_columns.sql
- packages/database/src/migrations/20260403150609_add_pg_trgm_extension.sql
- packages/database/src/migrations/20260403150610_create_duplicate_dismissals.sql
- packages/database/src/migrations/meta/_journal.json
- packages/database/src/schema/contact-duplicate-dismissals.schema.ts
- packages/database/src/schema/contacts.schema.ts
- packages/database/src/schema/index.ts
- packages/database/tsconfig.tsbuildinfo
- packages/shared-types/src/api/contacts.types.ts
```typescript
<td>${esc(c.contactType === 'lead' ? 'Lead' : 'Client')}</td>
<td>${esc(statusLabel(c.contactStatus))}</td>
<td>${esc(c.city)}${c.province ? `, ${esc(c.province)}` : ''}</td>
<td>${c.dateOfBirth ? new Date(c.dateOfBirth).toLocaleDateString('en-CA') : ''}</td>
```
🧩 Analysis chain
🏁 Scripts executed (output elided):

```shell
cat -n apps/admin/src/app/contacts/_components/contact-print-report.ts | head -30

# Search for dateOfBirth usage patterns across the codebase
rg "dateOfBirth" --type ts --type tsx -B 2 -A 2

# Check if there are any date utility functions or timezone handling
fd -e ts -e tsx | xargs grep -l "toLocaleDateString\|parseISO\|parse.*date" | head -20

# Find the ContactListItemDto type definition
rg "ContactListItemDto" --type ts -A 10 | head -40

# Check date utilities available
cat packages/shared-types/src/utils/date-utils/parsers.ts

# Check validators
cat packages/shared-types/src/utils/date-utils/validators.ts

# Search for how dateOfBirth is handled elsewhere in the codebase
rg "dateOfBirth" --type ts -B 1 -A 1 | head -50
```

Repository: Systemsaholic/tailfire
Use date utility function to avoid timezone shift in DOB rendering.
When `dateOfBirth` is stored as a date-only string like `1990-01-01`, `new Date(c.dateOfBirth)` parses it at UTC midnight. Calling `toLocaleDateString('en-CA')` then applies local timezone conversion, shifting the date backward for users in negative UTC zones (e.g., PST renders as `1989-12-31`).
Use `parseFlexibleDate()` from `@tailfire/shared-types/api` instead, which safely parses date-only strings without UTC assumptions:
Example fix

```typescript
<td>${c.dateOfBirth ? parseFlexibleDate(c.dateOfBirth)?.toLocaleDateString('en-CA') : ''}</td>
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/admin/src/app/contacts/_components/contact-print-report.ts` at line 12,
The DOB rendering uses new Date(c.dateOfBirth) which causes timezone shifts for
date-only strings; replace that parsing with parseFlexibleDate from
`@tailfire/shared-types/api` so date-only values are treated correctly (use
parseFlexibleDate(c.dateOfBirth) before calling toLocaleDateString('en-CA')
wherever dateOfBirth is rendered in contact-print-report.ts). Ensure you import
parseFlexibleDate and null-check its result the same way the current code guards
c.dateOfBirth.
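The shift described in this comment is easy to reproduce. The demo below uses a hypothetical local-date parser standing in for the project's `parseFlexibleDate`, whose real implementation is not shown in this PR:

```typescript
// 'YYYY-MM-DD' given to the Date constructor is parsed as UTC midnight, so
// formatting it in a timezone west of UTC renders the previous calendar day.
const utcParsed = new Date('1990-01-01')
console.log(utcParsed.toISOString()) // 1990-01-01T00:00:00.000Z

// Hypothetical stand-in for parseFlexibleDate: build a local Date from the
// parts so the calendar day survives any display timezone.
function parseDateOnly(value: string): Date | null {
  const m = /^(\d{4})-(\d{2})-(\d{2})$/.exec(value)
  if (!m) return null
  return new Date(Number(m[1]), Number(m[2]) - 1, Number(m[3])) // local midnight
}

const dob = parseDateOnly('1990-01-01')
console.log(dob?.getFullYear(), (dob?.getMonth() ?? 0) + 1, dob?.getDate())
// 1990 1 1, in every timezone
```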
```typescript
<td>${esc(statusLabel(c.contactStatus))}</td>
<td>${esc(c.city)}${c.province ? `, ${esc(c.province)}` : ''}</td>
<td>${c.dateOfBirth ? new Date(c.dateOfBirth).toLocaleDateString('en-CA') : ''}</td>
<td>${(c.tags || []).join(', ')}</td>
```
Escape tags before writing the report HTML.
`tags` is the only user-provided field here that bypasses `esc()`. Because this content is injected with `document.write()`, a crafted tag can execute script in the admin origin when the report window opens.
🔒 Suggested fix
```diff
- <td>${(c.tags || []).join(', ')}</td>
+ <td>${esc((c.tags ?? []).join(', '))}</td>
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/admin/src/app/contacts/_components/contact-print-report.ts` at line 13,
Escape each tag before injecting into the report HTML: replace the raw join
expression that renders tags (the template expression using (c.tags ||
[]).join(', ')) with an escaped version that maps over c.tags and applies the
existing esc() helper (e.g., (c.tags || []).map(esc).join(', ')); ensure you use
esc() (or an equivalent HTML-encoding helper) so every tag value is HTML-escaped
before document.write() renders the report.
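For illustration, here is a minimal escaper of the shape this comment assumes. The real `esc()` lives in contact-print-report.ts and is not shown in this PR, so this version is a hypothetical equivalent:

```typescript
// HTML-encode the five significant characters before the string reaches
// document.write(), so a crafted tag cannot inject script into the report.
const esc = (val: string): string =>
  val
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;')

const tags = ['VIP', '<img src=x onerror=alert(1)>']
console.log(esc(tags.join(', ')))
// VIP, &lt;img src=x onerror=alert(1)&gt;
```

Note that `&` must be replaced first; otherwise the `&` produced by the later replacements would be double-encoded.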
```typescript
if (value === 'active') {
  onFiltersChange({ ...filters, isActive: true, page: 1 })
} else if (value === 'inactive') {
  onFiltersChange({ ...filters, isActive: false, page: 1 })
} else {
  // "All" — remove isActive entirely so the API returns all contacts
  const { isActive: _removed, ...rest } = filters
  onFiltersChange({ ...rest, page: 1 })
}
```
❓ Verification inconclusive
🏁 Script executed (output elided):

```shell
#!/bin/bash
# Verify how the server handles missing isActive parameter
ast-grep --pattern $'if (filters.isActive !== undefined) {
  $$$
} else {
  $$$
}'
```

Repository: Systemsaholic/tailfire
🌐 Web query: "Systemsaholic/tailfire contacts service isActive default filter"
💡 Result:
No definitive information found on "Systemsaholic/tailfire contacts service isActive default filter". The GitHub user Systemsaholic exists and has recently contributed to a repository named tailfire (as of Feb-Mar 2026), but the repository itself is not publicly indexed or accessible in search results, likely private. No official documentation, code snippets, or mentions of a "contacts service", "isActive" field, or default filters were discovered across GitHub, general web, company pages, or related technical sources. Tailfire appears to be a private project, possibly a Tailwind/Firestore or similar web app given the user's other TypeScript/Next.js repos (e.g., garageview, call-helm). Without public sources, the default filter behavior cannot be confirmed.
The repository is private and cannot be accessed for independent verification. The review comment's claims about server-side filtering defaults (contacts.service.ts:165-169), hook parameter logic (use-contacts.ts:41), and the resulting functional bug cannot be confirmed without direct access to the codebase.
The analysis presented is detailed and logically structured, but relies on context from specific code snippets that I cannot independently verify. A developer with repository access should confirm:
- The actual behavior of the server when `isActive` is omitted
- The hook's parameter-building logic for undefined values
- Whether users selecting "All" actually see only active contacts (the claimed bug)

If these claims are verified as accurate, the review should be escalated to .
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/admin/src/app/contacts/_components/contacts-filter-panel.tsx` around
lines 93 - 101, The "All" branch currently deletes isActive from filters which
can trigger server default filtering; change the "All" case in
contacts-filter-panel.tsx to call onFiltersChange({ ...filters, isActive: null,
page: 1 }) instead of removing the key, and update the hook that builds API
params (use-contacts.ts — the function that constructs query/params) to
recognize isActive === null as the explicit "no filter" signal and omit sending
the isActive query parameter to the server, leaving true/false values intact.
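The contract this prompt suggests can be sketched as follows. `ContactFilters` and `buildQueryParams` are hypothetical names standing in for the real hook internals in use-contacts.ts: `null` means "All" and suppresses the parameter, while `true`/`false` are forwarded.

```typescript
// isActive semantics suggested above: null is the explicit "no filter" signal
// (omit the query param entirely); true/false are sent through; an absent
// key also sends nothing.
interface ContactFilters {
  isActive?: boolean | null
  page?: number
}

function buildQueryParams(filters: ContactFilters): Record<string, string> {
  const params: Record<string, string> = {}
  if (filters.page !== undefined) params.page = String(filters.page)
  if (filters.isActive === true || filters.isActive === false) {
    params.isActive = String(filters.isActive)
  }
  return params
}

console.log(buildQueryParams({ isActive: null, page: 1 }))  // { page: '1' }
console.log(buildQueryParams({ isActive: false, page: 1 })) // { page: '1', isActive: 'false' }
```

Keeping the key present but `null` (instead of deleting it) also makes the "All" state survive spread-based updates like `{ ...filters, page: 1 }`.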
```tsx
{contact.dateOfBirth && (
  <p className="text-xs text-muted-foreground">
    DOB: {new Date(contact.dateOfBirth).toLocaleDateString()}
  </p>
```
🧩 Analysis chain
🏁 Script executed:
# First, locate the file and read the specific lines
cat -n "apps/admin/src/app/contacts/duplicates/_components/duplicate-group-card.tsx" | sed -n '60,75p'Repository: Systemsaholic/tailfire
Length of output: 683
🏁 Script executed:
# Also get context around the file to understand the overall structure
wc -l "apps/admin/src/app/contacts/duplicates/_components/duplicate-group-card.tsx"
Length of output: 144
🏁 Script executed:
# Search for any date utility functions or type definitions for dateOfBirth
rg "dateOfBirth|toLocaleDateString" "apps/admin/src/" -A 2 -B 2
Length of output: 38055
🏁 Script executed:
# Check if there are any date utilities or helpers in the codebase
fd -e ts -e tsx "date|time|util" "apps/admin/src/" | head -20
Length of output: 971
🏁 Script executed:
# Check the parseISODate function and DateDisplay component
cat -n "apps/admin/src/components/ui/date-display.tsx" | head -50
Length of output: 1788
🏁 Script executed:
# Check for parseISODate definition
rg "parseISODate|export function parseISODate" "apps/admin/src/" -A 3
Length of output: 27080
🏁 Script executed:
# Check the flight-journey-display.tsx to see the pattern used there
cat -n "apps/admin/src/components/ui/flight-journey-display.tsx" | head -40
Length of output: 1580
Use the parseISODate utility or DateDisplay component to handle date-only strings correctly.
new Date(contact.dateOfBirth).toLocaleDateString() treats YYYY-MM-DD as UTC, causing users west of UTC to see the previous day. The codebase already has parseISODate() in date-utils.ts for safe parsing, or use the DateDisplay component with format="date" as shown in its JSDoc example.
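A minimal sketch of the failure mode. The `parseLocalDate` helper below is a hypothetical stand-in for the repo's `parseISODate`, whose actual implementation is not shown in this review:

```typescript
// Date-only ISO strings are parsed as UTC midnight by the Date constructor,
// so toLocaleDateString() can render the previous calendar day for users
// west of UTC. A safe parser builds the Date from local components instead.
// parseLocalDate is a hypothetical stand-in for the repo's parseISODate.
function parseLocalDate(iso: string): Date {
  const [year, month, day] = iso.split('-').map(Number)
  return new Date(year, month - 1, day) // local midnight, not UTC
}

const risky = new Date('1990-05-15') // UTC midnight per the ECMAScript spec
const safe = parseLocalDate('1990-05-15')

console.log(risky.getUTCDate()) // 15 in UTC; risky.getDate() may be 14 locally
console.log(safe.getDate()) // 15 in every timezone
```

The `DateDisplay` component mentioned above presumably wraps the same idea; the point is that the parse, not the formatting, is where the day shifts.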
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/admin/src/app/contacts/duplicates/_components/duplicate-group-card.tsx`
around lines 64 - 67, The current DOB rendering uses new
Date(contact.dateOfBirth).toLocaleDateString() which misparses YYYY-MM-DD as
UTC; replace it with the safe parser or component: either call
parseISODate(contact.dateOfBirth) and then format via toLocaleDateString (or
pass the parsed Date to your output), or render the DateDisplay component with
value={contact.dateOfBirth} and format="date" (instead of new Date(...)); update
the JSX in the DuplicateGroupCard where contact.dateOfBirth is used to use
parseISODate or DateDisplay to ensure correct local-day rendering.
```tsx
const { data, isLoading, isFetching, refetch } = useDuplicateDetection()
const dismissMutation = useDismissDuplicate()
const { toast } = useToast()
const [mergeIds, setMergeIds] = useState<[string, string] | null>(null)
```
Don't treat failed scans as “no scan results yet.”
This page never reads the duplicates query error, so a failed scan falls back to the pre-scan empty state instead of a retryable error UI. Dismiss failures are also only toasted. Please surface those failures explicitly and capture them in Sentry with NEXT_PUBLIC_SENTRY_ENVIRONMENT.
As per coding guidelines, "Admin application must capture runtime errors with Sentry. Errors must be tagged with NEXT_PUBLIC_SENTRY_ENVIRONMENT as development, preview, or production."
Also applies to: 22-33, 67-83
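One way to keep a failed scan from collapsing into the pre-scan empty state is to derive a single view state from the query result. This is a sketch under assumptions: the `ScanView` and `DuplicateGroup` names and the query shape are illustrative, not the page's actual types:

```typescript
// Hypothetical view-state reducer for the duplicates page: a failed scan
// maps to a retryable 'error' state instead of the pre-scan 'empty' state.
type DuplicateGroup = { contactIds: string[] }

type ScanView =
  | { kind: 'loading' }
  | { kind: 'error'; error: unknown } // render a retry button that calls refetch()
  | { kind: 'empty' } // no scan run yet, or scan found nothing
  | { kind: 'results'; groups: DuplicateGroup[] }

function scanView(q: {
  isLoading: boolean
  error: unknown
  data?: { groups: DuplicateGroup[] }
}): ScanView {
  if (q.isLoading) return { kind: 'loading' }
  if (q.error) return { kind: 'error', error: q.error } // never falls through to 'empty'
  if (!q.data || q.data.groups.length === 0) return { kind: 'empty' }
  return { kind: 'results', groups: q.data.groups }
}
```

In the real page, the `error` branch is the natural place to call `Sentry.captureException` tagged with `NEXT_PUBLIC_SENTRY_ENVIRONMENT`, per the guideline quoted above.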
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/admin/src/app/contacts/duplicates/page.tsx` around lines 15 - 18, The
duplicates page currently ignores query errors and treats failed scans like "no
results" and only toasts dismiss failures; update the UI logic around
useDuplicateDetection (data, isLoading, isFetching, refetch) to explicitly
handle and render an error state when the duplicates query has an error (show
retry button that calls refetch), and when using useDismissDuplicate
(dismissMutation) capture failures in Sentry via Sentry.captureException(error)
tagging the event with process.env.NEXT_PUBLIC_SENTRY_ENVIRONMENT and also
render or surface a persistent error UI (not just a toast) for dismiss failures
so the admin can retry; ensure both query error handling and
dismissMutation.onError call include Sentry capture with the environment tag and
surface a retryable error component instead of falling back to the pre-scan
empty state.
```ts
// Check if already dismissed
const [existing] = await this.db.client.execute(sql`
  SELECT id FROM contact_duplicate_dismissals
  WHERE agency_id = ${auth.agencyId}
    AND LEAST(contact_id1, contact_id2) = ${canonId1}
    AND GREATEST(contact_id1, contact_id2) = ${canonId2}
    AND match_type = ${matchType}
  LIMIT 1
`)

if (existing) {
  return // Already dismissed, idempotent
}

await this.db.client.execute(sql`
  INSERT INTO contact_duplicate_dismissals
    (agency_id, contact_id1, contact_id2, match_type, dismissed_by)
  VALUES
    (${auth.agencyId}, ${canonId1}, ${canonId2}, ${matchType}, ${auth.userId})
`)
```
TOCTOU race condition in dismissDuplicate relies on broken unique constraint.
The SELECT-before-INSERT pattern (lines 926-937) is vulnerable to a race condition: two concurrent calls could both see no existing row and both insert. This is typically mitigated by a unique constraint, but as noted in Context snippet 1, the constraint is missing agency_id.
With the current constraint UNIQUE(contact_id1, contact_id2, match_type), the INSERT might fail for the wrong reason (collision with a different agency's dismissal) or succeed when it shouldn't.
Recommended fix: Use INSERT ... ON CONFLICT DO NOTHING to make the operation atomic:
🔒 Proposed fix
- // Check if already dismissed
- const [existing] = await this.db.client.execute(sql`
- SELECT id FROM contact_duplicate_dismissals
- WHERE agency_id = ${auth.agencyId}
- AND LEAST(contact_id1, contact_id2) = ${canonId1}
- AND GREATEST(contact_id1, contact_id2) = ${canonId2}
- AND match_type = ${matchType}
- LIMIT 1
- `)
-
- if (existing) {
- return // Already dismissed, idempotent
- }
-
- await this.db.client.execute(sql`
- INSERT INTO contact_duplicate_dismissals
- (agency_id, contact_id1, contact_id2, match_type, dismissed_by)
- VALUES
- (${auth.agencyId}, ${canonId1}, ${canonId2}, ${matchType}, ${auth.userId})
- `)
+ // Upsert with ON CONFLICT for atomic idempotency
+ await this.db.client.execute(sql`
+ INSERT INTO contact_duplicate_dismissals
+ (agency_id, contact_id1, contact_id2, match_type, dismissed_by)
+ VALUES
+ (${auth.agencyId}, ${canonId1}, ${canonId2}, ${matchType}, ${auth.userId})
+ ON CONFLICT (agency_id, contact_id1, contact_id2, match_type) DO NOTHING
+ `)

This also requires fixing the migration's unique constraint to include agency_id.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/api/src/contacts/contact-merge.service.ts` around lines 925 - 944, The
current SELECT-then-INSERT in dismissDuplicate is racy; replace it with a single
INSERT INTO contact_duplicate_dismissals (agency_id, contact_id1, contact_id2,
match_type, dismissed_by) VALUES (...) ON CONFLICT (agency_id, contact_id1,
contact_id2, match_type) DO NOTHING so the operation is atomic and idempotent,
and amend the migration that created the UNIQUE constraint on
contact_duplicate_dismissals to include agency_id in the unique key (use
UNIQUE(agency_id, contact_id1, contact_id2, match_type) or equivalent) so
conflicts are scoped per agency.
- [ ] **Step 3: Create dismissals table migration**

```sql
CREATE TABLE IF NOT EXISTS contact_duplicate_dismissals (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  agency_id UUID NOT NULL,
  contact_id1 UUID NOT NULL REFERENCES contacts(id) ON DELETE CASCADE,
  contact_id2 UUID NOT NULL REFERENCES contacts(id) ON DELETE CASCADE,
  match_type VARCHAR(50) NOT NULL,
  dismissed_by UUID NOT NULL,
  dismissed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  UNIQUE (contact_id1, contact_id2, match_type)
);
```
Plan shows unique constraint without agency_id.
Same issue as the spec and migration: the dismissals table unique constraint should include agency_id:
UNIQUE (agency_id, contact_id1, contact_id2, match_type)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@docs/superpowers/plans/2026-04-03-contact-merge.md` around lines 52 - 65, The
UNIQUE constraint on the contact_duplicate_dismissals table is missing agency
scoping; update the UNIQUE clause in the contact_duplicate_dismissals migration
(and any spec) to include agency_id so the constraint reads uniqueness over
agency_id, contact_id1, contact_id2, match_type rather than only contact_id1,
contact_id2, match_type.
```sql
CREATE TABLE contact_duplicate_dismissals (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  agency_id UUID NOT NULL,
  contact_id1 UUID NOT NULL REFERENCES contacts(id),
  contact_id2 UUID NOT NULL REFERENCES contacts(id),
  match_type VARCHAR(50) NOT NULL,
  dismissed_by UUID NOT NULL,
  dismissed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  UNIQUE (contact_id1, contact_id2, match_type)
);
```
Spec shows unique constraint without agency_id.
The dismissals table definition shows UNIQUE (contact_id1, contact_id2, match_type) without agency_id. This is the same issue as the migration. While contacts are agency-scoped via FK, the unique constraint should explicitly include agency_id to prevent edge cases where contact IDs from different agencies could collide (especially if contact deletion cascades differently).
Update the spec to match the corrected migration:
UNIQUE (agency_id, contact_id1, contact_id2, match_type)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@docs/superpowers/specs/2026-04-03-contact-merge-design.md` around lines 227 -
237, The UNIQUE constraint on contact_duplicate_dismissals currently uses UNIQUE
(contact_id1, contact_id2, match_type) which omits agency_id; update the table
spec (contact_duplicate_dismissals) to include agency_id in the unique index so
it reads UNIQUE (agency_id, contact_id1, contact_id2, match_type) to match the
corrected migration and ensure uniqueness is scoped per agency for rows
identified by contact_id1/contact_id2 and match_type.
```sql
CREATE TABLE IF NOT EXISTS contact_duplicate_dismissals (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  agency_id UUID NOT NULL,
  contact_id1 UUID NOT NULL REFERENCES contacts(id) ON DELETE CASCADE,
  contact_id2 UUID NOT NULL REFERENCES contacts(id) ON DELETE CASCADE,
  match_type VARCHAR(50) NOT NULL,
  dismissed_by UUID NOT NULL,
  dismissed_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
  UNIQUE (contact_id1, contact_id2, match_type)
);
```
Match the index to the canonical-pair lookup.
dismissDuplicate() checks rows by agency_id, match_type, and the canonicalized pair via LEAST/GREATEST (apps/api/src/contacts/contact-merge.service.ts:911-945). The current UNIQUE (contact_id1, contact_id2, match_type) does not support that predicate, so idempotency checks and dismissal filtering will devolve into scans as this table grows. Please either enforce stored ordering and query raw columns, or add an expression index/unique constraint on the canonical pair, ideally scoped by agency_id.
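Option (A), enforcing stored ordering at write time, can be illustrated with a small helper. This is a sketch only; `canonicalPair` is not an existing function in the service:

```typescript
// Hypothetical helper: order the pair before INSERT so the stored columns
// are already canonical (contact_id1 <= contact_id2). A plain UNIQUE
// (agency_id, contact_id1, contact_id2, match_type) index then serves the
// lookup directly, with no LEAST/GREATEST expressions in the WHERE clause.
function canonicalPair(a: string, b: string): [string, string] {
  return a <= b ? [a, b] : [b, a]
}

const [canonId1, canonId2] = canonicalPair(
  '9f1c2e34-0000-0000-0000-000000000002',
  '1a2b3c44-0000-0000-0000-000000000001',
)
// canonId1 sorts before canonId2 regardless of argument order
```

The alternative, option (B), keeps LEAST/GREATEST in SQL but needs a unique expression index over those expressions, scoped by agency_id and match_type, so the dismissal lookup stays indexed.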
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@packages/database/src/migrations/20260403150610_create_duplicate_dismissals.sql`
around lines 1 - 10, The UNIQUE constraint on contact_duplicate_dismissals
currently enforces (contact_id1, contact_id2, match_type) but dismissDuplicate()
looks up by agency_id, match_type and the canonical pair via LEAST/GREATEST;
update the schema to support that lookup by either (A) enforcing canonical
ordering on insert/update (ensure contact_id1 = LEAST(...), contact_id2 =
GREATEST(...)) and keep a UNIQUE (agency_id, contact_id1, contact_id2,
match_type), or (B) add a unique expression index/constraint using
LEAST(contact_id1, contact_id2), GREATEST(contact_id1, contact_id2) scoped by
agency_id and match_type so dismissDuplicate() queries use the index instead of
scanning; pick one approach and apply it to contact_duplicate_dismissals to
match the canonical-pair lookup used in dismissDuplicate().
🧹 Nitpick comments (4)
apps/api/src/contacts/contact-merge.service.ts (4)
214-220: Access check outside transaction relies on FOR UPDATE locks.

The `canUseContact` calls query outside the transaction context, but the contacts were locked with `FOR UPDATE` inside the transaction. While the locks prevent concurrent modifications to the contact rows themselves, the access check queries `contact_shares` without locking. In a rare scenario, another transaction could revoke a share between this access check and the merge completion, though the practical impact is minimal since the merge has already been validated.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/contacts/contact-merge.service.ts` around lines 214 - 220, Access checks call contactAccess.canUseContact(primaryId/secondaryId, auth) outside the transaction that holds FOR UPDATE locks, which allows a race where contact_shares could be changed between the check and merge; move the access verification into the same transaction (or modify contactAccess.canUseContact to accept and use the transaction/EntityManager) so the queries run under the transaction and apply appropriate locking (e.g., SELECT ... FOR UPDATE on contact_shares) for primaryId and secondaryId using the same auth context.
232-239: Dead code: `val` is computed but never used.

Line 234 computes `const val = secondary[col]`, but this value is never used within the loop iteration. The value is recomputed later at line 249 when building the actual update parts. This dead code should be removed.

🧹 Proposed fix

```diff
  if (source === 'secondary') {
-   // Take the value from the secondary contact
-   const val = secondary[col]
-   // Build a SET clause — we handle this via parameterized update below
    setClauses.push(field)
  }
  // If source === 'primary', no action needed — primary keeps its value
```

🤖 Prompt for AI Agents
if (source === 'secondary') { - // Take the value from the secondary contact - const val = secondary[col] - // Build a SET clause — we handle this via parameterized update below setClauses.push(field) } // If source === 'primary', no action needed — primary keeps its value🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/contacts/contact-merge.service.ts` around lines 232 - 239, Remove the unused local variable by deleting the dead assignment "const val = secondary[col]" inside the loop where "source === 'secondary'"; keep pushing the "field" into "setClauses" as-is so the later update-building code (which recomputes the value) remains responsible for using "secondary[col]". Ensure the block that checks "if (source === 'secondary')" only performs setClauses.push(field) and no longer declares "val".
728-902: Duplicate detection logic is well-designed.

The three-tier approach with proper exclusion of higher-confidence matches from lower tiers prevents redundant results. The `similarity() >= 0.4` threshold is reasonable for catching typos while avoiding false positives. For large agencies, consider adding composite indexes to optimize these queries:

- `(agency_id, is_active, email)` for the email tier
- `(agency_id, is_active, phone)` for the phone tier
- `(agency_id, is_active, date_of_birth)` for the DOB tier

The pg_trgm GIN index mentioned in the migration should help with the similarity calculations.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/contacts/contact-merge.service.ts` around lines 728 - 902, Queries in detectDuplicates (specifically the emailMatches, phoneNameMatches and dobNameMatches SQL blocks) will scan large contact sets for big agencies; add a DB migration that creates composite indexes to speed those WHERE clauses: create indexes on (agency_id, is_active, LOWER(email)) or (agency_id, is_active, email) for the email tier, (agency_id, is_active, phone) for the phone tier, and (agency_id, is_active, date_of_birth) for the DOB tier; ensure the migration also keeps or adds the pg_trgm GIN index used for similarity on the concatenated name searches so phoneNameMatches and dobNameMatches can use trigram acceleration.
247-255: Pass JSONB objects directly instead of stringifying.

When `val` is an object (e.g., `travelPreferences`), the current code calls `JSON.stringify(val)`, which sends a text parameter to PostgreSQL. PostgreSQL then implicitly casts text to JSONB, which is fragile and unreliable. Pass the value directly and let Drizzle handle JSONB serialization:

♻️ Proposed fix

```diff
  const updateParts = setClauses.map((field) => {
    const col = FIELD_TO_COLUMN[field]
    const val = secondary[col]
    if (val === null || val === undefined) {
      return sql.raw(`${col} = NULL`)
    }
-   // Use parameterized value for safety
-   return sql`${sql.raw(col)} = ${typeof val === 'object' ? JSON.stringify(val) : val}`
+   // Pass value directly - Drizzle handles JSONB serialization
+   return sql`${sql.raw(col)} = ${val}`
  })
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/contacts/contact-merge.service.ts` around lines 247 - 255, The updateParts mapping currently stringifies object values (in the updateParts function that iterates setClauses using FIELD_TO_COLUMN and secondary) which sends text to Postgres; instead remove the JSON.stringify branch and pass val directly into the parameterized SQL expression so Drizzle can serialize JSON/JSONB (i.e., change the return from sql`${sql.raw(col)} = ${typeof val === 'object' ? JSON.stringify(val) : val}` to simply return sql`${sql.raw(col)} = ${val}` while keeping the null/undefined check and existing use of sql.raw(col)).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@apps/api/src/contacts/contact-merge.service.ts`:
- Around line 214-220: Access checks call
contactAccess.canUseContact(primaryId/secondaryId, auth) outside the transaction
that holds FOR UPDATE locks, which allows a race where contact_shares could be
changed between the check and merge; move the access verification into the same
transaction (or modify contactAccess.canUseContact to accept and use the
transaction/EntityManager) so the queries run under the transaction and apply
appropriate locking (e.g., SELECT ... FOR UPDATE on contact_shares) for
primaryId and secondaryId using the same auth context.
- Around line 232-239: Remove the unused local variable by deleting the dead
assignment "const val = secondary[col]" inside the loop where "source ===
'secondary'"; keep pushing the "field" into "setClauses" as-is so the later
update-building code (which recomputes the value) remains responsible for using
"secondary[col]". Ensure the block that checks "if (source === 'secondary')"
only performs setClauses.push(field) and no longer declares "val".
- Around line 728-902: Queries in detectDuplicates (specifically the
emailMatches, phoneNameMatches and dobNameMatches SQL blocks) will scan large
contact sets for big agencies; add a DB migration that creates composite indexes
to speed those WHERE clauses: create indexes on (agency_id, is_active,
LOWER(email)) or (agency_id, is_active, email) for the email tier, (agency_id,
is_active, phone) for the phone tier, and (agency_id, is_active, date_of_birth)
for the DOB tier; ensure the migration also keeps or adds the pg_trgm GIN index
used for similarity on the concatenated name searches so phoneNameMatches and
dobNameMatches can use trigram acceleration.
- Around line 247-255: The updateParts mapping currently stringifies object
values (in the updateParts function that iterates setClauses using
FIELD_TO_COLUMN and secondary) which sends text to Postgres; instead remove the
JSON.stringify branch and pass val directly into the parameterized SQL
expression so Drizzle can serialize JSON/JSONB (i.e., change the return from
sql`${sql.raw(col)} = ${typeof val === 'object' ? JSON.stringify(val) : val}` to
simply return sql`${sql.raw(col)} = ${val}` while keeping the null/undefined
check and existing use of sql.raw(col)).
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 7d605d95-6d61-473b-8616-fbd9081e571f
📒 Files selected for processing (1)
apps/api/src/contacts/contact-merge.service.ts
1. trip_travelers sub-merge: add repoints for trip_traveler_insurance, traveler_group_members, and cruise_booking_sessions (data loss fix)
2. Portal merge: sync contacts.portalUserId when transferring portal account
3. Import wizard: default possible_match rows to 'skip' instead of 'create' to prevent duplicate creation
4. "All" active filter: remove backend default (isActive undefined = show all), page.tsx defaults to isActive=true on load
5. Sort keys: add contactType, contactStatus, dateOfBirth to DTO allowlist, make Next Trip column non-sortable (computed field)
6. API typecheck: fix unused vars, type narrowing, missing merge fields in DTO

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Actionable comments posted: 1
🧹 Nitpick comments (3)
packages/shared-types/src/api/contacts.types.ts (1)
660-664: Consider typing `matchType` as `DuplicateMatchType` for consistency.

The `matchType` field is typed as `string` but should match the `DuplicateMatchType` union defined on line 647 to ensure type safety and prevent invalid values from being passed to the API.

♻️ Proposed fix

```diff
  export interface DuplicateDismissRequest {
    contactId1: string
    contactId2: string
-   matchType: string
+   matchType: DuplicateMatchType
  }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/shared-types/src/api/contacts.types.ts` around lines 660 - 664, The DuplicateDismissRequest interface declares matchType as a plain string which allows invalid values; change its type to the existing DuplicateMatchType union to enforce valid values. Update the DuplicateDismissRequest declaration (the interface named DuplicateDismissRequest) so matchType: DuplicateMatchType instead of matchType: string and ensure any call sites constructing DuplicateDismissRequest use values from DuplicateMatchType.

apps/api/src/contacts/contact-merge.service.ts (2)
771-803: Consider adding a functional index for case-insensitive email matching.

The query uses `LOWER(c1.email) = LOWER(c2.email)` for case-insensitive matching. For better performance on larger datasets, consider adding a functional index:

```sql
CREATE INDEX idx_contacts_lower_email
  ON contacts (agency_id, LOWER(email))
  WHERE is_active = true AND merged_into_contact_id IS NULL;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/contacts/contact-merge.service.ts` around lines 771 - 803, The emailMatches query in contact-merge.service.ts uses LOWER(c1.email)=LOWER(c2.email) which is slow on large tables; add a functional partial index (e.g., name it idx_contacts_lower_email) that indexes agency_id and LOWER(email) and includes the same WHERE filters used in the query (is_active = true and merged_into_contact_id IS NULL) so the DB can use the index for case-insensitive lookups; after creating the index, confirm the query plan uses it and remove any redundant scans around the db.client.execute call that selects into emailMatches.
243-262: Potential data type handling issue with `String(val)` conversion.

The value conversion on line 252 uses `String(val)` for non-object values. This may not correctly handle certain data types like `Date` objects or null-ish edge cases. While the parameterized query protects against SQL injection, the string coercion could produce unexpected values. Consider using the raw value directly and letting the database driver handle type conversion:

♻️ Proposed fix

```diff
  const updateParts = setClauses.map((field) => {
    const col = FIELD_TO_COLUMN[field] ?? field
    const val = secondary[col as keyof typeof secondary]
    if (val === null || val === undefined) {
      return sql.raw(`${col} = NULL`)
    }
-   // Use parameterized value for safety
-   return sql`${sql.raw(col)} = ${typeof val === 'object' ? JSON.stringify(val) : String(val)}`
+   // Use parameterized value for safety - let driver handle type conversion
+   if (typeof val === 'object') {
+     return sql`${sql.raw(col)} = ${JSON.stringify(val)}`
+   }
+   return sql`${sql.raw(col)} = ${val}`
  })
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/contacts/contact-merge.service.ts` around lines 243 - 262, The code in the updateParts mapping coerces non-object values with String(val), which can mis-handle types like Date; change the value handling in contact-merge.service.ts inside the setClauses -> updateParts mapping so you pass the raw value to the parameterized SQL instead of String(val). Specifically, keep the current check that returns sql.raw(`${col} = NULL`) for null/undefined, but replace the ternary (typeof val === 'object' ? JSON.stringify(val) : String(val)) with logic that JSON.stringifys plain objects (excluding Date) and otherwise passes val directly (so Dates and numbers remain their native types) before embedding in the sql interpolation used in tx.execute for UPDATE contacts (refer to FIELD_TO_COLUMN, secondary, updateParts, and the tx.execute call).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/admin/src/app/contacts/_components/contact-import-wizard.tsx`:
- Around line 185-190: The confirm-time fallback currently treats missing
entries from the actions Map as 'create', which can reintroduce duplicates for
rows with disposition 'possible_match'; update the confirm/commit logic that
reads actions.get(rowIndex) (the code that currently defaults to 'create') to
default to 'skip' instead, ensuring the actions Map and the possible_match
handling in ContactImportWizard (where result.results and actions are used)
consistently treat unknown/missing actions as 'skip'.
---
Nitpick comments:
In `@apps/api/src/contacts/contact-merge.service.ts`:
- Around line 771-803: The emailMatches query in contact-merge.service.ts uses
LOWER(c1.email)=LOWER(c2.email) which is slow on large tables; add a functional
partial index (e.g., name it idx_contacts_lower_email) that indexes agency_id
and LOWER(email) and includes the same WHERE filters used in the query
(is_active = true and merged_into_contact_id IS NULL) so the DB can use the
index for case-insensitive lookups; after creating the index, confirm the query
plan uses it and remove any redundant scans around the db.client.execute call
that selects into emailMatches.
- Around line 243-262: The code in the updateParts mapping coerces non-object
values with String(val), which can mis-handle types like Date; change the value
handling in contact-merge.service.ts inside the setClauses -> updateParts
mapping so you pass the raw value to the parameterized SQL instead of
String(val). Specifically, keep the current check that returns sql.raw(`${col} =
NULL`) for null/undefined, but replace the ternary (typeof val === 'object' ?
JSON.stringify(val) : String(val)) with logic that JSON.stringifys plain objects
(excluding Date) and otherwise passes val directly (so Dates and numbers remain
their native types) before embedding in the sql interpolation used in tx.execute
for UPDATE contacts (refer to FIELD_TO_COLUMN, secondary, updateParts, and the
tx.execute call).
In `@packages/shared-types/src/api/contacts.types.ts`:
- Around line 660-664: The DuplicateDismissRequest interface declares matchType
as a plain string which allows invalid values; change its type to the existing
DuplicateMatchType union to enforce valid values. Update the
DuplicateDismissRequest declaration (the interface named
DuplicateDismissRequest) so matchType: DuplicateMatchType instead of matchType:
string and ensure any call sites constructing DuplicateDismissRequest use values
from DuplicateMatchType.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 5e800c34-ac48-418f-8c2b-36609800932a
📒 Files selected for processing (10)
- apps/admin/src/app/contacts/_components/contact-import-wizard.tsx
- apps/admin/src/app/contacts/_components/contacts-filter-panel.tsx
- apps/admin/src/app/contacts/_components/contacts-table.tsx
- apps/admin/src/app/contacts/page.tsx
- apps/admin/src/hooks/use-contacts.ts
- apps/api/src/contacts/contact-import.service.ts
- apps/api/src/contacts/contact-merge.service.ts
- apps/api/src/contacts/contacts.service.ts
- apps/api/src/contacts/dto/contact-filter.dto.ts
- packages/shared-types/src/api/contacts.types.ts
✅ Files skipped from review due to trivial changes (2)
- apps/admin/src/hooks/use-contacts.ts
- apps/api/src/contacts/contact-import.service.ts
🚧 Files skipped from review as they are similar to previous changes (2)
- apps/api/src/contacts/contacts.service.ts
- apps/admin/src/app/contacts/_components/contacts-filter-panel.tsx
```diff
  // Default possible_match rows to 'skip' — agent must explicitly opt in
  const actions = new Map<number, 'create' | 'merge' | 'skip'>()
  for (const r of result.results) {
    if (r.disposition === 'possible_match') {
-     actions.set(r.rowIndex, 'create')
+     actions.set(r.rowIndex, 'skip')
    }
```
Make possible_match fallback consistently safe.
You default to 'skip' in preview init, but Line 219 falls back to 'create' when an action entry is missing. That can reintroduce accidental duplicate creation in edge cases. Prefer 'skip' as the confirm-time fallback too.
Suggested fix
```diff
- if (result.disposition === 'possible_match') {
-   action = rowActions.get(rowIndex) ?? 'create'
+ if (result.disposition === 'possible_match') {
+   action = rowActions.get(rowIndex) ?? 'skip'
  } else if (result.disposition === 'update') {
```

Also applies to: 218-220
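The safe fallback can be centralized so preview initialization and confirm time cannot disagree; `resolveAction` below is a hypothetical helper, not existing wizard code:

```typescript
type RowAction = 'create' | 'merge' | 'skip'

// Hypothetical helper: a single source of truth for the fallback, so both
// preview initialization and confirm-time resolution default to 'skip'
// for possible_match rows that have no explicit user choice.
function resolveAction(
  rowActions: Map<number, RowAction>,
  rowIndex: number,
): RowAction {
  return rowActions.get(rowIndex) ?? 'skip'
}

const actions = new Map<number, RowAction>([[0, 'merge']])
console.log(resolveAction(actions, 0)) // 'merge': explicit choice wins
console.log(resolveAction(actions, 1)) // 'skip': a missing entry never creates
```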
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/admin/src/app/contacts/_components/contact-import-wizard.tsx` around
lines 185 - 190, The confirm-time fallback currently treats missing entries from
the actions Map as 'create', which can reintroduce duplicates for rows with
disposition 'possible_match'; update the confirm/commit logic that reads
actions.get(rowIndex) (the code that currently defaults to 'create') to default
to 'skip' instead, ensuring the actions Map and the possible_match handling in
ContactImportWizard (where result.results and actions are used) consistently
treat unknown/missing actions as 'skip'.
… type The .references(() => contacts.id) on mergedIntoContactId causes TS7022 circular type inference. The FK constraint is already defined in the SQL migration — Drizzle doesn't need the JS-level reference. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…s from UI 1. Resolve contacts.module.ts conflict (both Merge + Lifecycle registered) 2. Remove contactType/contactStatus from merge editor (lifecycle-managed) 3. Tighten merge authorization: require owner or admin, not basic share access Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Summary
- `/contacts/duplicates` — on-demand scan with merge/dismiss actions per group

Migrations

- `20260403150608_add_contact_merge_columns.sql` — merged_into_contact_id, merged_at, merged_by on contacts
- `20260403150609_add_pg_trgm_extension.sql` — pg_trgm + trigram index for fuzzy name matching
- `20260403150610_create_duplicate_dismissals.sql` — contact_duplicate_dismissals table

Test plan
🤖 Generated with Claude Code
Summary by CodeRabbit
Release Notes
New Features
Improvements