WHAT I HAVE:
Visual Basic 2019, WinForms, Entity Framework (I'm not sure which version; I think 6)
For reasons too long and complex to explain here, my VB.NET app and entity data model occasionally force me to load a reference to the same entity instance twice (into different variables) while in the data context. Since EF doesn't allow two tracked references to one entity to be attached simultaneously, I have to detach the first variable, using .Entry(...).State to change it from EntityState.Unchanged to EntityState.Detached, before performing the second query. Normally this works just fine. However, for one specific entity type in my app, and under very specific scenarios (which are ostensibly the same as the others), the detach statement instead sets the first reference to EntityState.Added, triggering one of two exceptions: either I'm trying to attach the entity to the context more than once, or I'm trying to add it to the database when it's already present.
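For concreteness, the detach pattern I'm describing looks roughly like this (a minimal sketch; "MyContext", "Widgets", "Id", and "GroupId" are placeholder names, not my real model):

```vb
' Minimal sketch of the detach-then-requery pattern (EF6, VB.NET).
' "MyContext", "Widgets", "Id", and "GroupId" are placeholder names.
Imports System.Data.Entity

Module DetachSketch
    Sub Example(someId As Integer)
        Using db As New MyContext()
            Dim x = db.Widgets.First(Function(w) w.Id = someId)

            ' Stop tracking the first reference so a later query
            ' can attach its own copy of the same row.
            db.Entry(x).State = EntityState.Detached

            ' Defensive check: the state should now read Detached,
            ' not Added, before the second query runs.
            System.Diagnostics.Debug.Assert(
                db.Entry(x).State = EntityState.Detached)

            Dim y = db.Widgets.
                        Where(Function(w) w.GroupId = x.GroupId).ToList()
        End Using
    End Sub
End Module
```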
A surefire way to guarantee detachment is to close the context and re-open it before the second query, but I suspect that has two problems: 1) performance penalties if the database is big, and 2) no way to "transactionalize" the changes made after the first query together with those made after the second, so that they succeed or fail as a unit.
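For context, the all-or-nothing behavior I'm after would look something like this with System.Transactions.TransactionScope wrapping two short-lived contexts (a sketch; "MyContext" is a placeholder, and I don't know whether this is the right approach for my scenario):

```vb
' Sketch: spanning two contexts with one TransactionScope so that
' both sets of changes commit or roll back together.
' "MyContext" is a placeholder name.
Imports System.Data.Entity
Imports System.Transactions

Module TransactionSketch
    Sub Example()
        Using scope As New TransactionScope()
            Using db1 As New MyContext()
                ' first query and changes
                db1.SaveChanges()
            End Using

            Using db2 As New MyContext()
                ' second query and changes
                db2.SaveChanges()
            End Using

            ' If Complete is never called (e.g. an exception above),
            ' everything rolls back when the scope is disposed.
            scope.Complete()
        End Using
    End Sub
End Module
```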
Long story short: is this a bug in my version of EF (I think it's EF6, but don't hold me to that!), or is there something wrong with the way the class for that particular entity is set up? (Full disclosure: the classes for the types where detachment works and the one where it doesn't always work are not fundamentally different, and they all use Partial Class logic to augment them with additional, non-mapped members.)
Something like this is what I do:
Using db = *context*
    Dim x = *query to get entity instance*
    db.Entry(x).State = EntityState.Detached ' usually, but not invariably, works
    ' some other code
    Dim y = *query to get collection including previous instance*
    ' some other code
End Using
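One workaround I've considered is loading the first reference with AsNoTracking so it never attaches to the context in the first place, which would make the detach step unnecessary (a sketch using the same placeholder names; I don't know whether this fits every scenario in my app):

```vb
' Sketch: avoid the detach entirely by reading the first copy untracked.
' "MyContext", "Widgets", "Id", and "GroupId" are placeholder names.
Imports System.Data.Entity

Module NoTrackingSketch
    Sub Example(someId As Integer)
        Using db As New MyContext()
            ' AsNoTracking returns the entity without attaching it,
            ' so a later query can attach its own copy freely.
            Dim x = db.Widgets.AsNoTracking().
                        First(Function(w) w.Id = someId)

            Dim y = db.Widgets.
                        Where(Function(w) w.GroupId = x.GroupId).ToList()
        End Using
    End Sub
End Module
```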
Please keep any recommendations in VB.NET and as simple as possible.