Oskar Dudycz
@oskardudycz
If you could provide a more detailed repro, we could have a look
Chris Jansson
@chrisjansson

Alright, I'll see if I can narrow it down a bit.

Just to sanity check:
Simply running select mt_transform_patch_doc('{ "Number": 123, "Arr": [ { "Id": 1 }, { "Id": 2 } ]}', '{"type":"set","value":100000,"path":"Number"}')
yields {"Arr": [{"Id": 2}], "Number": 100000} on Azure and {"Arr": [{"Id": 1}, {"Id": 2}], "Number": 100000} locally

mt_transform_patch_doc is the same for both instances
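For reference, a minimal Python sketch of what a {"type": "set"} patch should do to that document (an illustration of the expected semantics, not Marten's actual plv8 implementation — the dotted-path walk is an assumption): only the Number leaf is overwritten, and Arr is left untouched, matching the local result above.

```python
import json

def apply_set_patch(doc: dict, patch: dict) -> dict:
    # Minimal sketch of a {"type": "set"} patch: walk the (dotted) path
    # and overwrite the leaf value, leaving everything else untouched.
    assert patch["type"] == "set"
    keys = patch["path"].split(".")
    target = doc
    for key in keys[:-1]:
        target = target[key]
    target[keys[-1]] = patch["value"]
    return doc

doc = json.loads('{ "Number": 123, "Arr": [ { "Id": 1 }, { "Id": 2 } ]}')
patch = {"type": "set", "value": 100000, "path": "Number"}
result = apply_set_patch(doc, patch)
# Arr is unchanged and Number is 100000, i.e. the local (correct) output
```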
Oskar Dudycz
@oskardudycz
@chrisjansson are you sure that the structure of the object didn't change?
In Docker you might have the most recent version, as it's a dev env,
while on Azure you might have an older one,
and the patch won't match
(btw. patch won't match <= nice title for a Hip Hop song)
Chris Jansson
@chrisjansson
The structure is the same. Also, running the above select statement, the result of the mt_transform_patch_doc procedure differs between the envs
Oskar Dudycz
@oskardudycz
hm, that's odd
is it possible that you could get the json from both envs and the exact document structure + document store config?
Chris Jansson
@chrisjansson
Sure, but it seems to me that there is something wonky with the build of (what I assume is) plv8 that the Azure instance is running?
I'll see if I can debug mt_transform_patch_doc on both sides to see where they differ in behavior and go from there. It doesn't seem like it's anything to do with Marten in particular, though
Chris Jansson
@chrisjansson

A final update,
CREATE FUNCTION plv8_test(inp jsonb) RETURNS jsonb AS $$ return inp.Arr.length; $$ LANGUAGE plv8 IMMUTABLE STRICT;

select plv8_test(CAST('{ "Number": 123, "Arr": [ { "Id": 1 }, { "Id": 2 } ]}' as jsonb));

Returns 1 on the Azure instance and 2 on my local instance...
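Side note on the snippet above: inp.Arr.length for that input should be 2, so the local instance's answer looks like the correct one. The equivalent check in plain Python (just an illustration, not plv8):

```python
import json

# The same document passed to plv8_test above
doc = json.loads('{ "Number": 123, "Arr": [ { "Id": 1 }, { "Id": 2 } ]}')

# inp.Arr.length in the plv8 function corresponds to len(doc["Arr"]) here
print(len(doc["Arr"]))  # prints 2, matching the local instance
```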

Marco Dissel
@mdissel
Jeremy D. Miller
@jeremydmiller
That looks pretty weak. I wonder what the point is
Oskar Dudycz
@oskardudycz
@chrisjansson I'll try to look at that tomorrow, but the more details I get, the easier it will be to say something concrete; at this moment I have no idea :|
@mdissel I haven't seen it, but I can tell you that in most of my projects I did what I could to not have any attributes in my POCOs - for me it's an anti-pattern
Oskar Dudycz
@oskardudycz
although I see some benefits for Dapper or EF users
Tony Karalis
@tonykaralis
Evening/Morning all. This might be a bone-headed question which I think I have the answer to, but I wanted to confirm. I have class A, which I have registered in my store options. Class A has a property of type class B, amongst other things. I make a change (add a property to class B). I run the schema validation and it doesn't throw any exceptions. My understanding is that only a change in class A would warrant a schema change, and not one in class B. Am I correct?
Mark Warpool
@CodingGorilla
What kind of schema change are you asking about? None of that would cause the database structure to change.
@tonykaralis
Jeremy D. Miller
@jeremydmiller
What he said. There’s no change to your schema. Everything goes into the data field;) That’s the killer advantage of something like Marten or Raven or Mongo over an ORM + relational DB
Tony Karalis
@tonykaralis
@CodingGorilla @jeremydmiller You have put this into perspective for me. This is extremely powerful. At the same time it has me worried, though. I was under the wrong impression that if I ran AssertDatabaseMatchesConfiguration after adding a property, or more critically removing one, then I'd get an exception thrown. This was my only way of protecting the database from a destructive change.
To provide a little more context, we have built a single core library which we use across different web apps (different solutions, different clients). Most classes persisted by Marten exist in the respective web app's assembly, but there are a few which exist in our core library. Deleting a property off of a class in the core library would then result in the change not being noticed in the web app (and before you know it, or don't know it, some data is gone).
Oskar Dudycz
@oskardudycz
This will only assert if you changed document configuration, e.g. to have an index
@tonykaralis you can always write your own verification
Tony Karalis
@tonykaralis
Yeah makes sense now.
Oskar Dudycz
@oskardudycz
although it might be quite time consuming, because each document might have a different schema
Tony Karalis
@tonykaralis
Yeah, that's one way of doing it
Oskar Dudycz
@oskardudycz
I was thinking about the possibility of storing the json schema
and allowing validation against that: https://github.com/RicoSuter/NJsonSchema
although right now it's only a concept in my head... ;)
Tony Karalis
@tonykaralis
Very interesting :thumbsup:
Oskar Dudycz
@oskardudycz
For now you could try to implement your own wrapper:
e.g. you'd store the json schema somewhere
and then verify that the documents are valid for, e.g., the most recent schema
So write some kind of "contract tests"
A simple implementation dedicated to your needs shouldn't be super hard
Especially if it's run in the tests, e.g. to validate that your core changes didn't break anything
then it should imho be a fine solution
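The contract-test idea above can be sketched in a few lines. NJsonSchema is a .NET library, so this is just the concept in stdlib Python; the schema shape, field names, and validate helper are all hypothetical, not anything Marten provides.

```python
import json

# Hypothetical minimal "contract test": snapshotted sample JSON documents
# are checked against the expected shape of the current document type.

EXPECTED_SCHEMA = {  # field name -> expected JSON type
    "Number": int,
    "Arr": list,
}

def validate(doc: dict, schema: dict) -> list:
    """Return a list of human-readable problems; an empty list means valid."""
    problems = []
    for field, expected_type in schema.items():
        if field not in doc:
            problems.append(f"missing field: {field}")
        elif not isinstance(doc[field], expected_type):
            problems.append(f"wrong type for {field}")
    return problems

# A stored snapshot still matching the current contract passes...
snapshot = json.loads('{"Number": 123, "Arr": [{"Id": 1}]}')
assert validate(snapshot, EXPECTED_SCHEMA) == []

# ...while a document missing a field (e.g. after a destructive core-library
# change) is flagged before it reaches production
assert validate({"Arr": []}, EXPECTED_SCHEMA) == ["missing field: Number"]
```

Run as part of the normal test suite, a check like this would catch the scenario Tony describes: a property deleted in the core library that the web app's own tests never notice.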
Tony Karalis
@tonykaralis
I am using Marten's subtyping as well, though. I have about 15 classes going into a single table.
Absolutely, at the moment any change, in order to get to production, would require approval and all tests to pass. So at least a junior won't wreck it accidentally. So putting your solution into a test would add that extra check. ;)
Oskar Dudycz
@oskardudycz
Understood
so I think that maintaining the json schema
and then validating it should work
I had similar tests in my previous project; I stored snapshotted sample jsons
and then made sure that they're valid in the new schema and all is working fine
I used that to make sure that I didn't introduce a breaking change in my contracts