
Fix JSON field support in merge_insert operations #5112

Draft
wjones127 wants to merge 1 commit into lance-format:main from wjones127:fix/issue-4831-json-merge-insert

Conversation

@wjones127
Contributor

Previously, merge_insert operations with JSON columns would fail with the error:

Query Execution error: LanceError(Schema): Attempt to project field by different types: LargeBinary and Utf8

This happened because field metadata was not preserved when creating the output schema in prepare_stream_schema, causing the JSON conversion to fail.

Changes

Core Fix

  • Updated prepare_stream_schema to use dataset schema metadata instead of input schema metadata for JSON fields
  • Added schema enrichment logic in check_compatible_schema to add arrow.json metadata to plain string inputs when the dataset has JSON fields
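The enrichment idea above can be sketched in plain Python. This is a simplified stand-in for the real logic, which operates on arrow-rs/Lance schemas inside check_compatible_schema; the Field dataclass, type strings, and function names here are illustrative only. The metadata key "ARROW:extension:name" is the standard Arrow extension-type key.

```python
# Sketch of the schema-enrichment step, NOT the actual Lance implementation.
# Field is a minimal stand-in for an Arrow field.
from dataclasses import dataclass, field

EXT_KEY = "ARROW:extension:name"


@dataclass
class Field:
    name: str
    type: str                      # e.g. "utf8", "large_binary"
    metadata: dict = field(default_factory=dict)


def is_json_field(f: Field) -> bool:
    # Dataset-side JSON fields are LargeBinary tagged lance.json;
    # input-side JSON fields are Utf8 tagged arrow.json.
    return f.metadata.get(EXT_KEY) in ("arrow.json", "lance.json")


def enrich_input_schema(dataset: list[Field], inputs: list[Field]) -> list[Field]:
    """If the dataset declares a field as JSON but the input provides a
    plain string column with no metadata, tag the input field as Arrow JSON
    so the downstream conversion kicks in."""
    by_name = {f.name: f for f in dataset}
    out = []
    for f in inputs:
        ds = by_name.get(f.name)
        if ds is not None and is_json_field(ds) and f.type == "utf8" and not f.metadata:
            f = Field(f.name, f.type, {EXT_KEY: "arrow.json"})
        out.append(f)
    return out
```

This makes the dataset schema, not the input schema, decide which columns are JSON, which is the core behavioral change of the fix.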

New Utility Function

  • Added lance_json_to_arrow_json in lance-arrow/src/json.rs - the inverse of arrow_json_to_lance_json
  • This provides a reusable utility for converting Lance JSON fields (LargeBinary + lance.json metadata) to Arrow JSON fields (Utf8 + arrow.json metadata)
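In the same sketch style, the inverse conversion described above looks roughly like the following. The dict-based field representation and the type strings are assumptions for illustration; the real lance_json_to_arrow_json in lance-arrow operates on arrow-rs Field values.

```python
# Sketch of the Lance-JSON -> Arrow-JSON field conversion, mirroring the
# PR description rather than the exact Rust signature.
EXT_KEY = "ARROW:extension:name"


def lance_json_to_arrow_json(fld: dict) -> dict:
    """Convert a Lance JSON field (LargeBinary + lance.json metadata)
    into an Arrow JSON field (Utf8 + arrow.json metadata)."""
    meta = dict(fld.get("metadata", {}))
    if fld.get("type") == "large_binary" and meta.get(EXT_KEY) == "lance.json":
        meta[EXT_KEY] = "arrow.json"
        return {"name": fld["name"], "type": "utf8", "metadata": meta}
    return fld  # non-JSON fields pass through unchanged
```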

Code Cleanup

  • Simplified wrap_json_stream_for_reading to use the new utility
  • Reduced code duplication across the codebase

Tests

  • Added Rust unit test for merge_insert with JSON columns
  • Added Python test for merge_insert with JSON columns

Result

Users can now:

  • Create a dataset with JSON schema (using pa.json() or arrow.json extension metadata)
  • Perform merge_insert with plain JSON strings (without any metadata)
  • Have the data automatically converted based on the dataset schema

The dataset/table schema is now the source of truth for JSON conversion, not the input data schema.

Fixes #4831

🤖 Generated with Claude Code

Previously, merge_insert operations with JSON columns would fail with
"Attempt to project field by different types: LargeBinary and Utf8"
because field metadata was not preserved when creating the output schema.

This fix uses the dataset schema as the source of truth for JSON metadata,
allowing users to provide plain JSON strings (without pa.json() metadata)
that are automatically converted based on the dataset schema.

Changes:
- Add lance_json_to_arrow_json utility in lance-arrow for field conversion
- Update prepare_stream_schema to use dataset schema metadata for JSON fields
- Add schema enrichment logic to handle plain string inputs for JSON fields
- Simplify wrap_json_stream_for_reading using the new utility
- Add tests for merge_insert with JSON columns

Fixes lance-format#4831

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
@github-actions
Contributor

ACTION NEEDED
Lance follows the Conventional Commits specification for release automation.

The PR title and description are used as the merge commit message. Please update your PR title and description to match the specification.

For details on the error, please inspect the "PR Title Check" action.



Development

Successfully merging this pull request may close this issue: Can't use JSON fields with merge_insert