Merged
2 changes: 1 addition & 1 deletion — src/server/adaptors/googlesheets.ts

@@ -141,7 +141,7 @@ export class GoogleSheetsAdaptor implements DataSourceAdaptor {

   async getRecordCount(): Promise<number | null> {
     try {
-      const url = `https://sheets.googleapis.com/v4/spreadsheets/${this.spreadsheetId}/values/${encodeURIComponent(this.sheetName)}!A:A`;
+      const url = `https://sheets.googleapis.com/v4/spreadsheets/${this.spreadsheetId}/values/${encodeURIComponent(this.sheetName)}!A:Z`;
       const response = await this.makeGoogleSheetsRequest(url);

       if (!response.ok) {

Review comment (Copilot AI, Feb 19, 2026), on the `!A:Z` line:

> Changing from `A:A` to `A:Z` improves accuracy by counting rows with data in any of the first 26 columns, rather than only rows with data in column A. However, this still won't count rows that only have data beyond column Z. For better consistency with `fetchAll()` (which fetches all columns), consider using a larger range like `A:ZZ`, or document this 26-column limitation.

Suggested change:

-      const url = `https://sheets.googleapis.com/v4/spreadsheets/${this.spreadsheetId}/values/${encodeURIComponent(this.sheetName)}!A:Z`;
+      const url = `https://sheets.googleapis.com/v4/spreadsheets/${this.spreadsheetId}/values/${encodeURIComponent(this.sheetName)}!A:ZZ`;
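The review comment's underlying point is that `getRecordCount` infers the row count from how many rows the values API returns for the requested range, so any row whose data lies entirely outside that range is invisible to the count. A minimal sketch of that counting logic (a hypothetical standalone helper, not the adaptor's actual implementation, which goes through an authenticated `makeGoogleSheetsRequest`):

```typescript
// Hypothetical helper illustrating the counting logic discussed in
// the review comment above.
type ValuesResponse = { values?: string[][] };

// The Sheets values API returns one array per non-empty row inside
// the requested range and omits trailing empty rows, so a row whose
// data sits entirely outside the range (e.g. only column AA, under
// an A:Z request) is simply absent from `values` and goes uncounted.
function countRowsInRange(body: ValuesResponse): number {
  return body.values?.length ?? 0;
}
```

Widening the range to `A:ZZ` pushes the undercounting boundary from 26 to 702 columns; only fetching every column, as the comment says `fetchAll()` does, removes it entirely.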
11 changes: 5 additions & 6 deletions — src/server/jobs/importDataRecords.ts

@@ -8,7 +8,6 @@ import { db } from "@/server/services/database";
 import logger from "@/server/services/logger";
 import { batchAsync } from "../utils";
 import { importBatch } from "./importDataSource";
-import type { ColumnDef } from "@/server/models/DataSource";

 const importDataRecords = async (args: object | null): Promise<boolean> => {
   if (!args || !("dataSourceId" in args)) {
@@ -49,20 +48,16 @@ const importDataRecords = async (args: object | null): Promise<boolean> => {
   }

   const batches = batchAsync(dataRecords, DATA_RECORDS_JOB_BATCH_SIZE);
+  const columnDefsAccumulator = [...dataSource.columnDefs];

   for await (const batch of batches) {
     try {
-      const columnDefsAccumulator = [] as ColumnDef[];
       const records = await adaptor.fetchByExternalId(
         batch.map((r) => r.externalId),
       );

       await importBatch(records, dataSource, columnDefsAccumulator);

-      await updateDataSource(dataSource.id, {
-        columnDefs: columnDefsAccumulator,
-      });
-
       await db
         .updateTable("dataRecord")
         .set({ needsImport: false })
@@ -86,6 +81,10 @@
     }
   }

+  await updateDataSource(dataSource.id, {
+    columnDefs: columnDefsAccumulator,
+  });
+
   // Update the recordCount for the data source
   const totalRecordCount = await db
     .selectFrom("dataRecord")
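The net effect of this diff is that column definitions are now seeded from the data source's existing `columnDefs`, accumulated across every batch, and written back once after the loop, rather than being reset to an empty array and persisted on each iteration (which would overwrite the stored defs with only the latest batch's columns). A simplified sketch of that accumulate-then-write pattern, using hypothetical types and an assumed dedupe-by-name merge rule in place of the real `ColumnDef` model and `importBatch` job:

```typescript
// Hypothetical simplified type; the real job uses the codebase's
// ColumnDef model, importBatch, and updateDataSource.
type ColumnDef = { name: string; type: string };

// Merge newly discovered column defs into the accumulator in place,
// deduplicating by column name (an assumed merge rule).
function mergeColumnDefs(acc: ColumnDef[], discovered: ColumnDef[]): void {
  for (const def of discovered) {
    if (!acc.some((d) => d.name === def.name)) acc.push(def);
  }
}

// Seed once from the existing defs, accumulate across all batches,
// and return the final list for a single persistence call at the end.
function accumulateAcrossBatches(
  existing: ColumnDef[],
  batches: ColumnDef[][],
): ColumnDef[] {
  const acc = [...existing];
  for (const discovered of batches) {
    mergeColumnDefs(acc, discovered);
  }
  return acc;
}
```

Persisting once after the loop also plays better with the per-batch try/catch visible in the diff: a batch that fails no longer leaves the data source holding a partially built `columnDefs` array from an earlier iteration.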