Merged
@@ -700,6 +700,7 @@ export const ConversationAccordion = ({
       deep: {
         chunks: {
           _limit: 25,
+          _sort: ["-timestamp", "-created_at"],
         },
       },
       // Override filter to add tag filtering while preserving project scope
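The `_sort` added in this hunk makes Directus return nested chunks newest-first, falling back to `created_at` when timestamps tie. A client-side sketch of the same ordering (the `Chunk` shape is a hypothetical simplification, and Directus' own null handling may differ):

```typescript
// Hypothetical chunk shape; the field names mirror the _sort keys in the diff.
type Chunk = { timestamp: string | null; created_at: string | null };

// Equivalent of Directus `_sort: ["-timestamp", "-created_at"]`:
// descending by timestamp, ties broken by descending created_at.
function sortChunks(chunks: Chunk[]): Chunk[] {
  const key = (v: string | null) => v ?? ""; // nulls sort last when descending
  return [...chunks].sort(
    (a, b) =>
      key(b.timestamp).localeCompare(key(a.timestamp)) ||
      key(b.created_at).localeCompare(key(a.created_at)),
  );
}

const sorted = sortChunks([
  { timestamp: "2024-01-01T00:00:00Z", created_at: "2024-01-01T00:00:00Z" },
  { timestamp: "2024-01-02T00:00:00Z", created_at: "2024-01-02T00:00:00Z" },
]);
// sorted[0] is the 2024-01-02 chunk: newest first
```

ISO-8601 strings compare correctly with plain lexicographic comparison, which is why no `Date` parsing is needed here.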
echo/frontend/src/components/conversation/OngoingConversationsSummaryCard.tsx
@@ -1,47 +1,66 @@
-import { readItems } from "@directus/sdk";
+import { aggregate } from "@directus/sdk";
 import { t } from "@lingui/core/macro";
 import { ActionIcon, Group, Stack, Text } from "@mantine/core";
 import { IconRefresh, IconUsersGroup } from "@tabler/icons-react";
-import { useQuery } from "@tanstack/react-query";
+import { useQuery, useQueryClient } from "@tanstack/react-query";
+import { useState } from "react";
 import { directus } from "@/lib/directus";
 import { SummaryCard } from "../common/SummaryCard";
 
-const TIME_INTERVAL_SECONDS = 40;
+const TIME_INTERVAL_SECONDS = 30;
 
 export const OngoingConversationsSummaryCard = ({
   projectId,
 }: {
   projectId: string;
 }) => {
-  // FIXME: could potentially use the "Aggregate" API to just get the count
+  const queryClient = useQueryClient();
+  // Track previous state to detect changes
+  const [hasOngoingConversations, setHasOngoingConversations] = useState(false);
+  // const hasOngoingConversationsRef = useRef<boolean>(false);
 
   const conversationChunksQuery = useQuery({
     queryFn: async () => {
-      const chunks = await directus.request(
-        readItems("conversation_chunk", {
-          fields: ["conversation_id"],
-          filter: {
-            conversation_id: {
-              project_id: projectId,
-            },
-            source: {
-              // @ts-expect-error source is not typed
-              _nin: ["DASHBOARD_UPLOAD", "CLONE"],
-            },
-            timestamp: {
-              // @ts-expect-error gt is not typed
-              _gt: new Date(
-                Date.now() - TIME_INTERVAL_SECONDS * 1000,
-              ).toISOString(),
-            },
-          },
-        }),
-      );
+      const result = await directus.request(
+        aggregate("conversation_chunk", {
+          aggregate: {
+            countDistinct: ["conversation_id"],
+          },
+          query: {
+            filter: {
+              conversation_id: {
+                project_id: projectId,
+              },
Comment on lines +30 to +33 — Contributor (🧹 Nitpick | 🔵 Trivial):

Make the nested project_id filter explicit under conversation_id

Minor clarity/readability nit: you’re currently relying on implicit equality sugar for the nested project filter:

conversation_id: {
  project_id: projectId,
},

I’d make this explicit so nobody has to mentally expand the Directus filter rules and so TypeScript tooling doesn’t get confused by the nesting:

-            conversation_id: {
-              project_id: projectId,
-            },
+            conversation_id: {
+              project_id: {
+                _eq: projectId,
+              },
+            },

Functionally the same intent, but clearer to future readers and less magical.

🤖 Prompt for AI Agents
In echo/frontend/src/components/conversation/OngoingConversationsSummaryCard.tsx
around lines 30 to 33, the Directus nested filter for conversation_id uses
implicit equality shorthand for project_id which is unclear; update the filter
to explicitly use the equality operator (e.g., set project_id: { _eq: projectId
} or the Directus explicit comparator your codebase uses) so the intent is
explicit, TypeScript tooling can infer types, and readers don’t need to expand
the implicit sugar.
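To make the suggestion concrete, here is a sketch of how the implicit shorthand expands to the explicit form — `explicitEq` is a hypothetical illustration helper, not part of the PR or the Directus SDK:

```typescript
// Directus treats { field: value } as shorthand for { field: { _eq: value } }.
// This helper expands that shorthand recursively, leaving operator keys
// (those starting with "_") untouched.
type Primitive = string | number | boolean | null;
type Filter = { [key: string]: Primitive | Filter };

function explicitEq(filter: Filter): Filter {
  const out: Filter = {};
  for (const [key, value] of Object.entries(filter)) {
    if (value !== null && typeof value === "object") {
      // Nested relation filter: recurse; operator objects pass through as-is.
      out[key] = key.startsWith("_") ? value : explicitEq(value);
    } else {
      // Leaf value: wrap in _eq unless the key is itself an operator.
      out[key] = key.startsWith("_") ? value : { _eq: value };
    }
  }
  return out;
}

// The shorthand from the diff...
const implicit = { conversation_id: { project_id: "abc123" } };
// ...expands to the explicit form the reviewer suggests:
// { conversation_id: { project_id: { _eq: "abc123" } } }
const explicit = explicitEq(implicit);
```

Both forms express the same query; the explicit one just spares readers from expanding the equality sugar in their heads.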

+              source: {
+                // @ts-expect-error source is not typed
+                _nin: ["DASHBOARD_UPLOAD", "CLONE"],
+              },
+              timestamp: {
+                // @ts-expect-error gt is not typed
+                _gt: new Date(
+                  Date.now() - TIME_INTERVAL_SECONDS * 1000,
+                ).toISOString(),
+              },
+            },
+          },
+        }),
+      );
 
-      const uniqueConversations = new Set(
-        chunks.map((chunk) => chunk.conversation_id),
-      );
+      const currentCount = Number(
+        // @ts-expect-error aggregate response type is not properly typed
+        (result[0]?.countDistinct?.conversation_id as string) ?? "0",
+      );
 
-      return uniqueConversations.size;
+      if (currentCount > 0 || hasOngoingConversations) {
+        queryClient.invalidateQueries({
+          queryKey: ["projects", projectId, "conversations"],
+        });
+        setHasOngoingConversations(false);
+      }
+
+      setHasOngoingConversations(currentCount > 0);
+
+      return currentCount;
     },
     queryKey: ["conversation_chunks", projectId],
     refetchInterval: 30000,
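On the `Number(... ?? "0")` coercion above: Directus aggregate endpoints commonly serialize counts as strings, and the row or field can be absent entirely when nothing matches. A standalone sketch of that defensive parsing (the `AggregateRow` shape is an assumption for illustration, not the SDK's actual response type):

```typescript
// Minimal shape of one aggregate result row, as assumed by the component.
type AggregateRow = { countDistinct?: { conversation_id?: string } };

// Coerce the string count to a number, treating a missing row/field as zero.
function parseDistinctCount(rows: AggregateRow[]): number {
  return Number(rows[0]?.countDistinct?.conversation_id ?? "0");
}

parseDistinctCount([{ countDistinct: { conversation_id: "7" } }]); // 7
parseDistinctCount([]); // 0 — missing row falls back to "0"
```

The optional chaining plus the `?? "0"` fallback is what lets the query function return a plain `number` without branching on empty responses.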
6 changes: 4 additions & 2 deletions echo/frontend/src/components/conversation/hooks/index.ts
@@ -658,7 +658,7 @@ export const useConversationsByProjectId = (
   query?: Partial<Query<CustomDirectusTypes, Conversation>>,
   filterBySource?: string[],
 ) => {
-  const TIME_INTERVAL_SECONDS = 40;
+  const TIME_INTERVAL_SECONDS = 30;

Comment on lines +661 to 662 — Contributor (🧹 Nitpick | 🔵 Trivial):

Wire refetchInterval to TIME_INTERVAL_SECONDS to avoid future drift

Right now you’ve got TIME_INTERVAL_SECONDS = 30 in both hooks but still hard-code refetchInterval: 30000. That’s fine today, but it’s an easy place for things to fall out of sync the next time someone tweaks the window.

I’d tie the polling to the same constant:

-	export const useConversationsByProjectId = (
+	export const useConversationsByProjectId = (
@@
-	const TIME_INTERVAL_SECONDS = 30;
+	const TIME_INTERVAL_SECONDS = 30;
@@
-		refetchInterval: 30000,
+		refetchInterval: TIME_INTERVAL_SECONDS * 1000,
@@
-export const useInfiniteConversationsByProjectId = (
+export const useInfiniteConversationsByProjectId = (
@@
-	const TIME_INTERVAL_SECONDS = 30;
+	const TIME_INTERVAL_SECONDS = 30;
@@
-		refetchInterval: 30000,
+		refetchInterval: TIME_INTERVAL_SECONDS * 1000,

Keeps the “live window” and polling cadence locked together.

Also applies to: 732-733, 872-873, 955-956

🤖 Prompt for AI Agents
In echo/frontend/src/components/conversation/hooks/index.ts around lines 661-662
(also apply same change at 732-733, 872-873, 955-956): the refetchInterval is
hard-coded as 30000 while TIME_INTERVAL_SECONDS is defined as 30, which can
drift when one is changed; update the refetchInterval to derive from
TIME_INTERVAL_SECONDS (multiply by 1000) so polling cadence is tied to the
single constant, and ensure any other hard-coded 30000 occurrences in the listed
line ranges are replaced similarly.
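A minimal sketch of what the suggestion amounts to — `makeLiveQueryTiming` is a hypothetical helper for illustration, not part of the PR:

```typescript
// Derive both the "live window" cutoff and the polling cadence from one
// constant so they cannot drift apart when the window is tuned later.
const TIME_INTERVAL_SECONDS = 30;

function makeLiveQueryTiming(seconds: number, now: number = Date.now()) {
  return {
    // refetchInterval for useQuery / useInfiniteQuery, in milliseconds
    refetchInterval: seconds * 1000,
    // ISO cutoff for the `timestamp: { _gt: ... }` filter
    cutoff: new Date(now - seconds * 1000).toISOString(),
  };
}

const timing = makeLiveQueryTiming(TIME_INTERVAL_SECONDS);
// timing.refetchInterval is 30000, the same value currently hard-coded
```

With this shape, changing the window to 60 seconds automatically slows the polling to match, which is exactly the drift the reviewer is guarding against.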

   return useQuery({
     queryFn: async () => {
@@ -667,6 +667,7 @@ export const useConversationsByProjectId = (
       deep: {
         chunks: {
           _limit: loadChunks ? 1000 : 1,
+          _sort: ["-timestamp", "-created_at"],
         },
       },
       fields: [
@@ -868,7 +869,7 @@ export const useInfiniteConversationsByProjectId = (
   },
 ) => {
   const { initialLimit = 15 } = options ?? {};
-  const TIME_INTERVAL_SECONDS = 40;
+  const TIME_INTERVAL_SECONDS = 30;
 
   return useInfiniteQuery({
     getNextPageParam: (lastPage: { nextOffset?: number }) =>
@@ -880,6 +881,7 @@ export const useInfiniteConversationsByProjectId = (
       deep: {
         chunks: {
           _limit: loadChunks ? 1000 : 1,
+          _sort: ["-timestamp", "-created_at"],
         },
       },
       fields: [