Hey @WebReflection !
Can we have a chat about a (universal) testing framework?
As mentioned in the PR, I would love to have a JSON file:
[
  {
    "file": "file://tests/foo.json",
    "normal_sha256": "....",
    "flattened_sha256": "...."
  },
  {
    "file": "https://somewhere/external/foo.json",
    "normal_sha256": "....",
    "flattened_sha256": "...."
  }
]
We normalize the JSON and compute a sha256:
cat foo.json | jq --sort-keys -r . | sha256sum | awk '{ print $1 }'
After flattening, we normalize and sha256 again, and validate that sum.
Then we unflatten and validate against the original sum.
We run this for every implementation; each one must produce the same sha256.
It is also ok if we use the original implementation to generate the flattened JSON fixtures from cyclic graphs.
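To make the cyclic-graph part concrete, here is a toy flatten/unflatten sketch (explicitly NOT the reference implementation's format; the `$ref` marker and index-pool scheme are made up for illustration, and would collide with real data that uses a `$ref` key):

```javascript
// Toy scheme: every object/array goes into a pool, and references to them
// become { $ref: index }, so the result is plain, acyclic, serializable JSON.
function flatten(root) {
  const pool = [];
  const seen = new Map();
  const visit = (value) => {
    if (value && typeof value === 'object') {
      if (!seen.has(value)) {
        const index = pool.length;
        seen.set(value, index);
        pool.push(null); // reserve the slot before recursing, so cycles resolve
        const copy = Array.isArray(value) ? [] : {};
        for (const k of Object.keys(value)) copy[k] = visit(value[k]);
        pool[index] = copy;
      }
      return { $ref: seen.get(value) };
    }
    return value; // primitives pass through untouched
  };
  visit(root);
  return pool;
}

function unflatten(pool) {
  // Create all the shells first, then wire references back up.
  const objects = pool.map((entry) => (Array.isArray(entry) ? [] : {}));
  const resolve = (value) =>
    value && typeof value === 'object' && '$ref' in value ? objects[value.$ref] : value;
  pool.forEach((entry, i) => {
    for (const k of Object.keys(entry)) objects[i][k] = resolve(entry[k]);
  });
  return objects[0];
}

const node = { name: 'root' };
node.self = node; // cycle: plain JSON.stringify would throw here

const flat = flatten(node);      // plain array, safe to JSON.stringify
const back = unflatten(flat);
console.log(back.self === back); // true: the cycle survives the round trip
```

The point for the test harness is only the shape of the workflow: flatten to something hashable, unflatten, and compare digests of the two ends.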