
Conversation

@roji (Member) commented Sep 9, 2025

Implements #36688 for SQL Server

Description
Adds server-side support for serializing arbitrary relational expressions via JSON_VALUE(JSON_OBJECT(...)) on SQL Server.

Customer impact
Before this PR, attempts to use ExecuteUpdate to update JSON properties from non-constant/parameter sources (i.e. from some database column) failed, except for int/string/bool. ExecuteUpdate over JSON is a new feature being introduced in EF 10 as part of the general support for complex JSON mapping; this adds support for an important missing scenario.
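For illustration, here is a hypothetical sketch of the SQL shape involved (table and column names are invented, and this is not necessarily the exact SQL EF generates): wrapping an arbitrary expression in JSON_OBJECT() and extracting it with JSON_VALUE() yields a correctly JSON-encoded value that can be assigned into a JSON column.

```sql
-- Hypothetical sketch; [Orders], [ShippingAddress], [BillingCity] are illustrative names.
-- JSON_OBJECT() JSON-encodes the column value (proper escaping, null handling);
-- JSON_VALUE() then extracts it back out as a scalar for the assignment.
UPDATE [o]
SET [o].[ShippingAddress] = JSON_MODIFY(
        [o].[ShippingAddress], '$.City',
        JSON_VALUE(JSON_OBJECT('v': [o].[BillingCity]), '$.v'))
FROM [Orders] AS [o];
```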

How found
Testing

Regression
No

Testing
Added

Risk
Very low, affects new codepaths only.

@roji roji requested review from AndriySvyryd and artl93 September 9, 2025 19:10
@roji roji requested a review from a team as a code owner September 9, 2025 19:10
@roji roji added the ask-mode label Sep 9, 2025
@roji roji changed the title Support arbitrary values in ExecuteUpdate JSON for SQL Server [rc2] Support arbitrary values in ExecuteUpdate JSON for SQL Server Sep 9, 2025
@artl93 (Member) commented Sep 9, 2025

@roji - what was our coverage like in terms of testing? I assume we hit all scenarios at least once in integration?

@roji (Member, Author) commented Sep 9, 2025

@artl93 yeah, for rc.1 these scenarios were known to be lacking coverage and possibly (even probably) to need fixing; the space of possibilities here, in terms of what can be expressed in LINQ, is huge, and query features very frequently have this sort of long tail of possibilities which require additional fixes.

@artl93 (Member) left a comment

RC2. Getting more coverage will be good. (I thought you were already on top of that.)

@roji roji force-pushed the JsonObject branch 2 times, most recently from 41ca7cd to cc3173b Compare September 10, 2025 00:20
@roji (Member, Author) commented Sep 10, 2025

/azp run

@azure-pipelines

Azure Pipelines successfully started running 1 pipeline(s).

throw new ArgumentException("The number of property names must match the number of property values.");
}

if (typeMapping.StoreType is not "nvarchar(max)")
@AndriySvyryd (Member) commented Sep 10, 2025

So json is not supported?
Also, use SqlServerStructuralJsonTypeMapping.NvarcharMaxDefault.StoreType instead and do case-insensitive comparison

@roji (Member, Author)

Apparently not - JSON_OBJECT() is documented as returning nvarchar(max) (it was introduced in SQL Server 2022, before the json type was introduced). It's always possible to wrap it in CAST(JSON_OBJECT(...) AS json) if needed.
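A minimal sketch of that wrapping (table/column names are illustrative, and the json type is only available on server versions that support it):

```sql
-- JSON_OBJECT() itself returns nvarchar(max); an explicit cast produces the
-- json store type where one is required. [Customers]/[Name] are invented names.
SELECT CAST(JSON_OBJECT('name': [c].[Name]) AS json)
FROM [Customers] AS [c];
```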

I'm not quite sure about our structural JSON type mapping design (in general). Type-wise, the T-SQL function really does just return an nvarchar(max) - having a "structural" type mapping blurs the line between the bare store type (nvarchar(max)) and modeled facts about the contents it contains (which, in the query pipeline, should be on the shaper side). Specifically here, note that JSON_OBJECT() is being used to construct a JSON value that isn't a structural type in our model (just a temporary JSON string that gets passed into JSON_VALUE()), so I'm not sure it's appropriate. In any case, we can always change this as our needs evolve for JSON_OBJECT().

OK for the case-insensitivity.

@AndriySvyryd (Member)

But semantically it represents a structural store type, which seems to align with the type mapping intent

@roji (Member, Author)

I'm not sure it's the job of the expression/type mapping here to express exactly what this represents or will be used as... For example, the user may be using a (future) EF.Functions.JsonObject() to build up a string which they'll be comparing (or assigning) to some string column, or fetching back as a string (and not as a .NET type) - in those scenarios it really is just a string (one which happens to contain a JSON object representation, but nobody cares).

In any case, I included this check (in the expression itself) to ensure that the return store type is correct, nothing more - users of the class are free to pass in a SqlServerStructuralJsonTypeMapping if they so wish.

Comment on lines +465 to +467
public static DbContextOptionsBuilder SetCompatibilityLevelFromEnvironment(DbContextOptionsBuilder builder)
=> builder.UseSqlServerCompatibilityLevel(SqlServerMajorVersion * 10);

@AndriySvyryd (Member)

Not sure this is a good idea, as it would make tests more dependent on the environment and harder to debug when they fail on CI. I think it's better to have separate tests/theory data for different versions, to make it clear which compatibility level was used/expected.

@roji (Member, Author)

This is an important conversation.

I strongly believe that what we need is something very basic and should be self-evident: we need to test all of our supported SQL Server versions in CI, rather than just one. Right now we have no idea whether something works on the other versions, unless a developer remembers to manually execute the tests locally against multiple versions. This is not the way things should work. We should simply have multiple test runs in CI, one against each supported SQL Server version, and we'd know immediately which test fails against which version (here's a build for the PG provider doing exactly that).

This is something we've brought up several times over the years, but we haven't made actual progress on it. It should be standard and trivial to spin up the database in a Docker container on Linux, at which point it should also be trivial to test all the versions (we could even choose to use testcontainer.net for this, though we don't have to). But somehow, every time I've brought this up, the conversation has gotten complex.

/cc @SamMonoRT @cincuranet maybe now that we're entering the off-season this is a good thing to invest in.

@roji (Member, Author)

Regardless of the Docker conversation above, I think it's good not to have different test suites for different database versions, but rather a single test which just works against any backend and is responsible for varying its assertions if it needs to:

  • We get a single place where we can immediately see how the query behaves, across all version variations (rather than multiple test suites we need to navigate)
  • We should organize our test suites by area/responsibility/feature, not by database version; in other words, we shouldn't have to concentrate all tests whose SQL varies into specific test suites (so that we can then duplicate those suites for different versions). We should just organize tests in whatever way makes sense, and if some test(s) somewhere need to vary, they can do so without affecting other tests in the same suite.

@roji roji merged commit f17a31d into dotnet:release/10.0 Sep 11, 2025
7 checks passed
@roji roji deleted the JsonObject branch September 11, 2025 11:57