feat(appkit): infer numeric SQL type for sql.number(), add typed variants #323
Open
jamesbroadhead wants to merge 1 commit into main
Currently sql.number() unconditionally produces __sql_type: "NUMERIC",
which Databricks SQL binds as DECIMAL(10,0). That breaks LIMIT and
OFFSET, where Spark requires an integer-typed expression and rejects
the parameter with INVALID_LIMIT_LIKE_EXPRESSION.DATA_TYPE, and it
silently truncates non-integer JS numbers (sql.number(3.14) sends
value="3.14" into a DECIMAL(10,0) slot).
Two changes:
1. sql.number() now infers the wire type from the value:
- integer JS number -> BIGINT
- non-integer JS number -> DOUBLE
- numeric string -> NUMERIC (preserves caller's precision intent;
a string is an explicit choice to skip JS-number coercion)
This picks the most-precise type the value can losslessly represent
(see the sketch after this list). Spark coerces numeric types
implicitly, so existing queries against BIGINT/DECIMAL/DOUBLE columns
keep working, and LIMIT now accepts sql.number(10) directly.
2. Typed variants for callers who need to override the inference:
- sql.int(value) -> INT
- sql.bigint(value) -> BIGINT (also accepts JS bigint for values
beyond Number.MAX_SAFE_INTEGER)
- sql.float(value) -> FLOAT
- sql.double(value) -> DOUBLE
- sql.decimal(value) -> NUMERIC (decimal precision preserved)
sql.int() and sql.bigint() reject non-integer inputs at the
helper boundary so the failure surfaces early, before the wire.
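A combined sketch of both changes, assuming a marker shape of { __sql_type, value }; the helper internals below are illustrative, not the library's actual code:

```ts
// Illustrative sketch only: marker shape and internals are assumptions.
type NumericSqlType = "INT" | "BIGINT" | "FLOAT" | "DOUBLE" | "NUMERIC";

interface SQLNumberMarker {
  __sql_type: NumericSqlType;
  value: string;
}

const sql = {
  // Change 1: infer the wire type from the JS value.
  number(value: number | string): SQLNumberMarker {
    if (typeof value === "string") {
      // A string is an explicit opt-out of JS-number coercion: keep NUMERIC.
      return { __sql_type: "NUMERIC", value };
    }
    // Integers are lossless as BIGINT up to Number.MAX_SAFE_INTEGER;
    // anything with a fractional part needs DOUBLE to avoid truncation.
    return Number.isInteger(value)
      ? { __sql_type: "BIGINT", value: String(value) }
      : { __sql_type: "DOUBLE", value: String(value) };
  },

  // Change 2, one of the five variants: explicit override with a boundary check.
  int(value: number): SQLNumberMarker {
    if (!Number.isInteger(value)) {
      throw new TypeError(`sql.int() expects an integer, got ${value}`);
    }
    return { __sql_type: "INT", value: String(value) };
  },
};

sql.number(10);         // -> { __sql_type: "BIGINT",  value: "10" }
sql.number(3.14);       // -> { __sql_type: "DOUBLE",  value: "3.14" }
sql.number("123.4500"); // -> { __sql_type: "NUMERIC", value: "123.4500" }
sql.int(3.14);          // throws at the helper boundary, before the wire
```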
SQLNumberMarker.__sql_type widens from "NUMERIC" to a union of the
numeric SQL types. Non-breaking for callers: any code that previously
held an SQLNumberMarker still type-checks (the union is wider in
return position). The two unit tests that pinned `type: "NUMERIC"`
for sql.number(integer) are updated to expect BIGINT.
Discovered while building a parameterized analytics query with a bound
LIMIT — the bare `LIMIT :limit` case is the most visible failure, but
the underlying issue affects any query where the column type matters.
Signed-off-by: James Broadhead <jamesbroadhead@gmail.com>
Why
sql.number() unconditionally tags the parameter as __sql_type: "NUMERIC", which Databricks SQL binds as DECIMAL(10,0). Two real failures fall out of that:

1. LIMIT/OFFSET reject the binding. Spark requires an integer-typed expression for these clauses and refuses the bind with INVALID_LIMIT_LIKE_EXPRESSION.DATA_TYPE. So useAnalyticsQuery("...", { limit: sql.number(10) }) against a query containing LIMIT :limit fails the moment the page loads. Reproducible directly:

```
SELECT 1 AS x LIMIT :n
  parameters=[{ name: "n", value: "10", type: "NUMERIC" }]  -> FAILED (binds as DECIMAL(10,0))

SELECT 1 AS x LIMIT :n
  parameters=[{ name: "n", value: "10", type: "BIGINT" }]   -> SUCCEEDED
```

2. Non-integer JS numbers truncate silently. sql.number(3.14) sends value: "3.14" with type: "NUMERIC", which the server interprets as DECIMAL(10,0) — zero fractional digits, so the value either rounds or rejects depending on context. The user lost precision they didn't ask to lose.

Change
Infer the wire type from the JS value
sql.number() now picks the most-precise numeric SQL type the value can losslessly represent:

| Input | Inferred type |
| --- | --- |
| sql.number(10) (integer) | BIGINT |
| sql.number(3.14) (non-integer) | DOUBLE |
| sql.number("123.4500") (string) | NUMERIC |

Strings stay NUMERIC because passing a string is a deliberate choice to skip JS-number coercion (currency, exact decimals, values beyond Number.MAX_SAFE_INTEGER). Callers who reach for the string form usually want decimal precision preserved end-to-end. BIGINT covers every JS integer up to Number.MAX_SAFE_INTEGER and Spark coerces it implicitly to INT/DECIMAL/DOUBLE as needed, so existing queries against narrower or wider numeric columns keep working — plus LIMIT :limit now Just Works.
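With the inference in place, the failing case from the Why section works without a cast. A hedged usage sketch (the query text and table name are illustrative, not from the PR):

```ts
// sql.number(10) now binds as { value: "10", type: "BIGINT" },
// which Spark accepts as an integer-typed LIMIT expression.
useAnalyticsQuery("SELECT * FROM events LIMIT :limit", {
  limit: sql.number(10),
});
```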
Typed variants for explicit control

When the inference is wrong for the caller's column or context:

- sql.int(value) -> INT
- sql.bigint(value) -> BIGINT (also accepts a JS bigint for values beyond Number.MAX_SAFE_INTEGER)
- sql.float(value) -> FLOAT
- sql.double(value) -> DOUBLE
- sql.decimal(value) -> NUMERIC (decimal precision preserved)

sql.int() and sql.bigint() reject non-integer inputs at the helper boundary (sql.int(3.14) throws synchronously), so the failure surfaces at the callsite, not at the wire.
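A minimal sketch of that boundary check for sql.bigint(), assuming the same { __sql_type, value } marker shape (illustrative, not the library's actual implementation):

```ts
function bigint(value: number | bigint): { __sql_type: "BIGINT"; value: string } {
  // Fail at the callsite instead of silently truncating on the wire.
  if (typeof value === "number" && !Number.isInteger(value)) {
    throw new TypeError(`sql.bigint() expects an integer, got ${value}`);
  }
  // String(bigint) is exact at any magnitude, so values past
  // Number.MAX_SAFE_INTEGER survive intact when passed as a JS bigint.
  return { __sql_type: "BIGINT", value: String(value) };
}
```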
Type widening

SQLNumberMarker.__sql_type widens from "NUMERIC" to "INT" | "BIGINT" | "FLOAT" | "DOUBLE" | "NUMERIC". This is non-breaking — any value previously typed as the narrow "NUMERIC" literal still satisfies the wider union in return position. The two unit tests that pinned type: "NUMERIC" for sql.number(integer) are updated to expect BIGINT.
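A small illustration of why widening is safe in this direction (type names sketched for this example only):

```ts
type OldMarker = { __sql_type: "NUMERIC" };
type NewMarker = { __sql_type: "INT" | "BIGINT" | "FLOAT" | "DOUBLE" | "NUMERIC" };

declare const previouslyHeld: OldMarker;

// Every old "NUMERIC" marker inhabits the wider union, so code that
// consumed the old return type keeps compiling unchanged.
const stillTypeChecks: NewMarker = previouslyHeld;
```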
Alternatives considered

- Always INT. Rejected — would silently truncate sql.number(3.14) to 3 and break filters on BIGINT/DECIMAL/DOUBLE columns where the literal is outside INT range.
- Always BIGINT. Rejected for the same precision reason on non-integer values; would also surprise callers passing a JS number that's actually meant to be a DOUBLE.
- A CAST(:limit AS INT) workaround in every query. Real cost: every parameterized LIMIT/OFFSET in every consumer needs the cast. The library is the right place to fix it once.
- Typed variants only, leaving sql.number() alone. Rejected — the failure mode is silent (rounding) or cryptic (LIMIT error), and callers won't reach for sql.int() until they've already been bitten. Better to make the common-case path correct.

Test plan
- pnpm -r typecheck — clean across all packages
- npx vitest run — 1761/1761 pass; new tests cover BIGINT inference for integers, DOUBLE for non-integers, NUMERIC for strings, plus all five typed variants
- npx biome check on touched files — clean
- pnpm build — appkit + appkit-ui rebuild green
- Repro run against dd43ee29fedd958d via databricks api post /api/2.0/sql/statements

This pull request and its description were written by Isaac.