This is used with the Property(() => p).HasDatabaseGeneratedOption() call. Is it perhaps there to turn off default DB value generation?
EF uses DatabaseGeneratedOption to decide what to do with the value of a key column for new entities. If the DatabaseGeneratedOption is Identity, EF knows that whatever value the property is set to can be ignored and that the value coming back from the database should be used. If the DatabaseGeneratedOption is None, EF will insert the value of the property into the database as the value of the key column.
In Code First, when the Code First conventions find an int property that can serve as the key property for a given entity, they will by default configure that column as an identity column (meaning the database generates the value of the key column/property). DatabaseGeneratedOption.None lets you override this if you want to set key values on your own.
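For reference, here is a minimal sketch of that fluent configuration (EF6 API; the Order entity and ShopContext are hypothetical), turning identity generation off so the application supplies its own key values:

    using System.ComponentModel.DataAnnotations.Schema;
    using System.Data.Entity;

    public class Order
    {
        public int OrderId { get; set; }      // value supplied by the application
        public string Reference { get; set; }
    }

    public class ShopContext : DbContext
    {
        public DbSet<Order> Orders { get; set; }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            // Override the convention: do not treat OrderId as store-generated.
            modelBuilder.Entity<Order>()
                .Property(o => o.OrderId)
                .HasDatabaseGeneratedOption(DatabaseGeneratedOption.None);
        }
    }

With this in place, EF inserts whatever value the application assigns to OrderId instead of asking the database for one.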
Its effect is to configure EF to not fetch a new identity value after inserting into the database.
Using Entity Framework Core 2.0
I'm stuck with the company's production database, which has primary keys defined for each table but no foreign keys defined for any relationships.
Dependent records in the database have id fields which are intended to relate to the primary key fields of the parent record like you would normally find with a foreign key relationship/constraint. But these fields were all created as INT NOT NULL and are using a SQL default of '0'.
As a result dependent records have been inserted over time without requiring that a related parent record be specified.
Initially I defined my models in EF with integers and used a fluent configuration to specify "IsRequired". I did this so I could run migrations to create a test database for comparison against the production database, to verify that my code-first model was coded correctly.
This then led to a problem when using "Include" in my LINQ queries: it performs an inner join, which drops the records that contain 0's in the id fields of the dependent record.
The only way that I have found to make this work is to model all of the id fields in the dependent entity as nullable integers and remove the "IsRequired" from the fluent configuration.
When using "Include" it now performs a left outer join, keeping all of the dependent entities. This also means that any reference properties on the included entities are set to null instead of an empty string. That part can probably be fixed fairly easily.
The downside is that if I wanted to use migrations to create a database now, all id fields in the dependent records would be created as nullable (NULL) columns.
Is there anyone who has run up against this type of situation? Does anyone have any suggestions to try other than the approach I am using?
I haven't dealt with this scenario before, but I wonder if you can solve it by defining the FK property as nullable and then, after the migration is created, editing it to add HasDefaultValue so that the default is 0? (doc for that method: https://learn.microsoft.com/en-us/ef/core/modeling/relational/default-values)
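To illustrate the idea (a sketch only; Parent, Child, and the context name are hypothetical, EF Core 2.x): keep the FK nullable in the model so Include produces a left outer join, and give the column a default of 0 so a database generated from migrations is closer to the legacy schema:

    using Microsoft.EntityFrameworkCore;

    public class Parent
    {
        public int ParentId { get; set; }
    }

    public class Child
    {
        public int ChildId { get; set; }
        public int? ParentId { get; set; }   // nullable so Include uses a LEFT JOIN
        public Parent Parent { get; set; }
    }

    public class LegacyContext : DbContext
    {
        public DbSet<Parent> Parents { get; set; }
        public DbSet<Child> Children { get; set; }

        protected override void OnModelCreating(ModelBuilder modelBuilder)
        {
            // The generated migration will then carry defaultValue: 0 for this column.
            modelBuilder.Entity<Child>()
                .Property(c => c.ParentId)
                .HasDefaultValue(0);
        }
    }

Note that the column would still be created as nullable, so this only addresses the default, not the NOT NULL constraint from the production schema.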
I am working on a legacy web app which implements a model-first (edmx file) approach using Entity Framework.
I need to implement optimistic concurrency, so I have added this field as follows:
and inside the database it has been created with the binary(8) type.
But when I try to update, the entity gets updated but the VersionRow value is not (no new value is generated).
P.S.
When I added the column, I bound a default value of 0x0000000000000000 to it because it does not allow null values.
Yeah, I solved it this way:
1) I changed the type of the RowVersion column from binary(10) to timestamp in SQL Server.
2) In the property details inside the .edmx file I set the StoreGeneratedPattern property of RowVersion to Computed.
Computed means that a new value is generated on both insert and update.
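To show why that matters, here is a rough sketch of how the regenerated value is then used as a concurrency token. It uses a code-first stand-in (hypothetical Person entity and LegacyContext) rather than the edmx model, but the behavior once the column is store-generated and marked as a concurrency token is the same:

    using System;
    using System.ComponentModel.DataAnnotations;
    using System.Data.Entity;
    using System.Data.Entity.Infrastructure;
    using System.Linq;

    public class Person
    {
        public int PersonId { get; set; }
        public string Name { get; set; }

        [Timestamp]                      // rowversion: regenerated on insert and update
        public byte[] RowVersion { get; set; }
    }

    public class LegacyContext : DbContext
    {
        public DbSet<Person> People { get; set; }
    }

    public static class PersonUpdater
    {
        public static void Rename(int personId, string newName)
        {
            using (var context = new LegacyContext())
            {
                var person = context.People.Single(p => p.PersonId == personId);
                person.Name = newName;
                try
                {
                    // EF puts the original RowVersion in the UPDATE's WHERE clause
                    // and reads the regenerated value back after a successful save.
                    context.SaveChanges();
                }
                catch (DbUpdateConcurrencyException)
                {
                    Console.WriteLine("The row was changed by someone else; reload and retry.");
                }
            }
        }
    }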
Now it looks like the following:
I'm working on a custom entity framework provider and I need to add support for default column values for this provider. When the user uses the entity framework wizard and selects a table that includes columns with default values, those default values are not being populated into the entity designer.
I'm a little lost on where exactly this population should take place. I believe the appropriate place would be in the GetEdmType method override of DbXmlEnabledProviderManifest but I just don't see how to set the default value, if this is the correct place.
Does anybody have experience writing EF providers that support default values for table columns? How do you implement this?
I am a bit late to the party, but DbXmlEnabledProviderManifest is not the right place for adding default values. The provider manifest describes the capabilities of the database engine itself and is specific (and general) to that database engine, not to a given database and/or table. The default value in the provider manifest tells EF what value to use for a given column property if one is not provided by the user (e.g. if the user does not specify scale or precision for a decimal column, the values from the provider manifest will be used for that column's scale and/or precision).
If you just want to insert a default value for a property, the easiest way is to set the property that corresponds to the column on your entity to this value in the constructor. This way the user can always set it to a different value, but if they do not, the default value will be sent to the database. For some corner-case scenarios where some of the columns in the database do not have corresponding properties on entities, you can use the DefaultValue attribute on the Property element in SSDL, which will be inserted into the database when you add a row. This is especially useful if those properties are not nullable, since without telling EF what value should be inserted, EF would try inserting null, which would obviously fail for non-nullable columns.
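A minimal sketch of the constructor approach described above (the Invoice entity and its properties are made up for illustration):

    using System;

    // Declared partial so it could sit beside a generated class, assuming the
    // generated code does not already define a parameterless constructor.
    public partial class Invoice
    {
        public Invoice()
        {
            // Defaults sent to the database unless the caller assigns other values.
            Status = "Pending";
            CreatedOn = DateTime.UtcNow;
        }

        public int InvoiceId { get; set; }
        public string Status { get; set; }
        public DateTime CreatedOn { get; set; }
    }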
I'm using the Entity Framework and I have a rowversion (timestamp) field on a table to use for concurrency. However, when updating the entity object, it keeps trying to set the rowversion column to null, and I get an error:
The 'VerCol' property on 'LmpDemoRequest' could not be set to a 'null' value. You must set this property to a non-null value of type 'Byte[]'.
I have the VerCol column within the entity definition, but I am unable to remove the "Setter" function.
How do I get the entity framework to stop attempting to set this column?
You can pass any arbitrary, valid values for the RowVersion fields (DateTime.Now for example). They will be overwritten with the server-generated values.
For future releases of EF, there should be support for "shadow properties", which exist in the model but not in your classes. That feature would be useful in situations such as this.
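A rough sketch of that workaround, assuming a DbContext-based model (LmpDemoRequest comes from the question; the stand-in class members, context, and set names here are made up):

    using System.Data.Entity;

    // Stand-ins for the generated entity and context from the model.
    public class LmpDemoRequest
    {
        public int Id { get; set; }
        public string SomeField { get; set; }
        public byte[] VerCol { get; set; }
    }

    public class LmpEntities : DbContext
    {
        public DbSet<LmpDemoRequest> LmpDemoRequests { get; set; }
    }

    public static class LmpDemoRequestRepository
    {
        public static void Update(LmpDemoRequest changed)
        {
            using (var context = new LmpEntities())
            {
                // Any non-null Byte[] placeholder keeps EF happy; the database
                // regenerates the real rowversion value during the update.
                changed.VerCol = changed.VerCol ?? new byte[8];

                context.LmpDemoRequests.Attach(changed);
                context.Entry(changed).State = EntityState.Modified;
                context.SaveChanges();
            }
        }
    }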
I had a case where a view included a RowVersion column from a table that was left joined in the view... so that column could be null sometimes.
But EF4 'knows' that a RowVersion column cannot be null, so even in a simple LINQ query, it was throwing an InvalidOperationException:
The 'PersonRowVersion' property on 'vVoteInfo' could not be set to a 'DBNull' value. You must set this property to a non-null value of type 'Byte[]'
I finally had to change the view to use this for the RowVersion column, so that EF would be happy:
coalesce(p._RowVersion, cast(0 as binary(6))) [PersonRowVersion]
When I create a new diagram and "Update model from database", the StoreGeneratedPattern attribute gets added to some primary key properties, but not all of them. What criteria does the designer use to decide whether to add this or not?
I found that it was because my database was not consistent. It appears to add the attribute to any primary key that is an identity column.