Microsoft.EntityFrameworkCore.DbUpdateException: 'An error occurred while saving the entity changes. See the inner exception for details.' - entity-framework

After upgrading to .NET 6, saving a decimal value with 10 integer digits, for example 1234567890.12345, produces this error:
OverflowException: Conversion overflows
The error occurs on the line that saves the changes:
dbcontext.SaveChanges()
In the database, the Value column is defined as DECIMAL(35,17).
Please suggest how to fix this conversion overflow.

By default in EF Core (and previous versions), if you have not specified the precision of a decimal-typed column in the EF model, it will default to a precision of 18 digits with 2 decimal places: DECIMAL(18,2).
Most likely in EF6 you had a convention that defined all decimals as DECIMAL(35,17).
In .NET 6, EF Core supports conventions again, so we can now add this to the DbContext:
protected override void ConfigureConventions(ModelConfigurationBuilder configurationBuilder)
{
    // DECIMAL(35,17) convention, matching the database column definition
    configurationBuilder.Properties<decimal>()
        .HavePrecision(35, 17);
}
This is based on the solution from Loop/reflect through all properties in all EF Models to set Column Type.
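If only certain columns need the wider type, per-property configuration is an alternative to a model-wide convention. A minimal sketch, assuming EF Core 6 (the Invoice entity here is illustrative, not from the original post):

using Microsoft.EntityFrameworkCore;

public class Invoice
{
    public int Id { get; set; }
    public decimal Value { get; set; }
}

public class AppDbContext : DbContext
{
    public DbSet<Invoice> Invoices { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Configure a single column instead of all decimals.
        modelBuilder.Entity<Invoice>()
            .Property(i => i.Value)
            .HasPrecision(35, 17);
    }
}

EF Core 6 also has a [Precision(35, 17)] attribute that can be placed directly on the property.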
EF Core didn't support the same model for configuring conventions as EF6 out of the box. That was painful for a time, but we moved on; here is an old discussion on the topic: Where are Entity Framework Core conventions?
One of the benefits of this is that the model configuration is the standard place for this type of logic, so by keeping the configuration inline, or at least within your DbContext project, your conventions are more discoverable. In EF6 many of us implemented custom conventions and packaged them in distributable libraries for re-use. This effectively made them black boxes, which would often result in local variants of the same conventions so that the current project's developers had access to the logic when they needed to inspect or change it.

Related

Entity Framework Core can't map official types to PostgreSQL columns

I downloaded a sample database for Postgres (v14) (dvdrental) so I could follow some SQL tutorials. I wanted to create an ASP.NET Core (5) Web API for that database, so using scaffolding, I created the entities based on the database tables and columns, then after some minor changes, I wanted to create a new migration.
That step failed though, as I'm getting two errors (so far) regarding the data types.
The property 'Film.Fulltext' is of type 'NpgsqlTsVector' which is not supported by the current database provider. Either change the property CLR type or ignore the property using the '[NotMapped]' attribute or by using 'EntityTypeBuilder.Ignore' in 'OnModelCreating'
I tried to use [NotMapped], but got the same error. I also tried to specify
[Column(TypeName = "tsvector")]
which is the official type mapping according to https://www.npgsql.org/doc/types/basic.html, but for some reason EF Core seems to ignore it completely and gives the same error.
The property 'Film.SpecialFeatures' could not be mapped because it is of type 'string[]', which is not a supported primitive type or a valid entity type. Either explicitly map this property, or ignore it using the '[NotMapped]' attribute or by using 'EntityTypeBuilder.Ignore' in 'OnModelCreating'
Which is, again, weird, as the Postgres website says that for
public string[] Tags { get; set; }
the provider will create text[] columns for the above property.
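For reference, a rough sketch of what the configuration for these two properties could look like, assuming the Npgsql provider (Npgsql.EntityFrameworkCore.PostgreSQL) is the one registered for the context; the Film property names follow the question, the context name is illustrative:

using Microsoft.EntityFrameworkCore;
using NpgsqlTypes;

public class Film
{
    public int FilmId { get; set; }
    public NpgsqlTsVector Fulltext { get; set; }
    public string[] SpecialFeatures { get; set; }
}

public class DvdRentalContext : DbContext
{
    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Film>(entity =>
        {
            // tsvector mapping per https://www.npgsql.org/doc/types/basic.html
            entity.Property(e => e.Fulltext).HasColumnType("tsvector");
            // string[] maps to text[] with the Npgsql provider,
            // so SpecialFeatures needs no explicit configuration.
        });
    }
}

Both mappings only work when Npgsql is the active provider; the "not supported by the current database provider" wording in the errors above would be expected if a different provider (for example SQL Server) were the one registered.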
So basically EF Core throws errors about something that is the officially recommended way of doing it, so I have no idea why these errors even occur or how I could solve them.
Any help is appreciated.
Thanks

How can we control EF Database First generated class and property names?

I have an Oracle database that has all its tables and columns in upper-case.
For example, table STUDENT has FIRSTNAME, LASTNAME and DATEOFBIRTH.
When I generate classes using the EF Database First approach, I get all classes and names in upper-case as well.
The answer here:
How to force pascal case with Oracle's Entity Framework support?
did not help, as it only generates names with the first letter in upper case (Firstname and Lastname instead of FirstName and LastName).
I thought of doing it manually. Is there a way I can write something in OnModelCreating so that every time I generate the edmx I get the names right?
If I change a name after generation, it will be overwritten the next time I update the model from the database.
ReSharper may be able to help with this, though I have not used ReSharper in a while, nor its naming-convention feature (see: https://www.jetbrains.com/help/resharper/Inspect_the_Whole_Solution_for_Naming_Style_Compliance.html). Tell it you want UpperCamelCase and run it across the generated code. There may be other plug-ins available with this capability.
EF isn't going to do it, as I doubt they'd try to determine word boundaries for naming. howwillitknowhowmanywordsareinhere? :)
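If the table count is manageable, a deterministic alternative is the manual mapping the question hints at: keep PascalCase names in the model and map them to the upper-case database names explicitly. A minimal sketch using the STUDENT example (written against the EF Core fluent API; EF6 Code First has equivalent ToTable/HasColumnName calls), assuming the model is hand-maintained rather than regenerated:

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Student>(entity =>
    {
        // Map PascalCase CLR names to the upper-case Oracle names.
        entity.ToTable("STUDENT");
        entity.Property(e => e.FirstName).HasColumnName("FIRSTNAME");
        entity.Property(e => e.LastName).HasColumnName("LASTNAME");
        entity.Property(e => e.DateOfBirth).HasColumnName("DATEOFBIRTH");
    });
}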

How can I map a nullable tinyint column to a bool where null becomes false in EF Core?

I maintain a suite of applications for a SQL Server database that has no simple creation process, and various instances in production have slight differences (with regard to nullability of columns and varchar sizes). I am moving the data layer of these applications from EF 6 to EF Core 2.1 to increase platform support and to finally have a straightforward way to create new databases with a consistent layout.
I might take this opportunity to clean up my POCOs somewhat. One pattern I wish to do away with is that the original SQL Server database often uses tinyint null instead of bit columns with a default constraint on them. These are mapped to byte? rather than bool in my C# code, which I think needlessly complicates their usage. Going forward, I'd like new databases to use bit fields instead, and in many cases it is appropriate for them to be not null and defaulted to 0. I have this working, but I figured that with all the flexibility of EF Core, I should be able to subclass my DbContext and provide different mappings so that the same code can run against the original "legacy" databases where these are still nullable tinyints.
Attempt 1
I was hoping to use a ValueConverter<bool, byte?> in my LegacyDbContext subclass to accomplish this (passing it into PropertyBuilder<bool>.HasConversion), until I learnt from the limitations section of the EF Core docs that value converters cannot be used to convert nulls. So I get a System.InvalidOperationException stating:
An exception occurred while reading a database value for property '<tableName>.<columnName>'. The expected type was 'System.Boolean' but the actual value was null.
Attempt 2
I tried registering a custom Microsoft.EntityFrameworkCore.Metadata.Internal.IEntityMaterializerSource implementation to take care of this conversion for me, but I couldn't get that working either; I think it is invoked too late for me to do the type conversion...
Surely it is possible to replace nulls with 0 prior to the type conversion from byte to bool, or to use some other mechanism that lets my new POCOs with bools map back to the nullable tinyints of older databases?
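Since value converters cannot convert nulls, one workaround is to keep a mapped byte? property for the legacy column and expose an unmapped bool wrapper over it. A minimal sketch (the entity and column names are illustrative, not from the original post):

using System.ComponentModel.DataAnnotations.Schema;

public class LegacyEntity
{
    // Mapped to the legacy "tinyint null" column.
    [Column("IsActive")]
    public byte? IsActiveRaw { get; set; }

    // Unmapped wrapper: a null in the database reads as false.
    [NotMapped]
    public bool IsActive
    {
        get => (IsActiveRaw ?? 0) != 0;
        set => IsActiveRaw = value ? (byte)1 : (byte)0;
    }
}

The drawback is that both properties are visible on the POCO, so this is a sketch of the idea rather than a drop-in answer to keeping a single bool property across both context types.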

FSharp Record Types With Entity Framework Code-First

I am doing a proof of concept in a line-of-business application where I want to swap out the current C# Code-First Entity Framework implementation with an F# one. I am following this article, which seems to work pretty well, but I was hoping to use F# record types instead of the classes the article uses. When I try to add a data annotation to a record type like this:
type Family = {[<Key>]Id:int; LastName:string; IsRegistered:bool}
I get the following error:
Error 1 The type 'Key' is not defined
Is there a way to use data annotations with record types? Apparently, EF Code-First needs annotations...
Record types support attributes just fine (and with the syntax you have).
Check whether your reference to System.ComponentModel.DataAnnotations is in order; that's where KeyAttribute is defined.
Edit: EF wants to work with properties, which is why using a record doesn't mesh well with EF out of the box. You can still make it work in F# 3.0+ by marking the record with the CLIMutable attribute (this generates property setters and a parameterless constructor, which are taken for granted by C#-centric frameworks and libraries).
The article you're looking at was written with F# 2.0 in mind; CLIMutable wasn't around yet, and there was no way of using records for that.
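Putting both points together, a minimal sketch of the record (assuming F# 3.0+ and a reference to System.ComponentModel.DataAnnotations):

open System.ComponentModel.DataAnnotations

[<CLIMutable>]
type Family = { [<Key>] Id : int; LastName : string; IsRegistered : bool }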

Entity Framework 4.1 for large number of tables (715)

I'm developing a data access layer for a database with over 700 tables. I created a model including all the tables, which generated a huge model. I then changed the model to use DbContext from EF 4.1, which seemed to improve how it compiled and worked. The designer didn't seem to work at all.
I then created a test app which just added two records to a table, but the processor went to 100% in the db.SaveChanges method. Being a black box, it was difficult to ascertain what went wrong.
So my questions are:
Is Entity Framework the best approach for a large database?
If so, should the model be broken down into logical areas? I did note that you can't have the same SQL table in multiple models.
I have read that the code-only approach is best in these large cases. What is that?
Any guidance would be truly appreciated
Thanks
A large database is always something special. Every technology has pros and cons when working with one.
The problem you have encountered is most probably related to building the model. When you start the application and use EF-related functionality for the first time, EF must build the model description and compile it; this is the most time-consuming operation in EF, and its cost grows with the number of entities in the model. Once the model is compiled it is reused for the whole lifetime of the application (if you restart the application or unload the application domain, the model must be compiled again). You can avoid this cost by precompiling the model. That is done at design time: you use a tool to generate code from the model and include that code in your project (it must be regenerated after each change to the model). For EDMX-based models you can use EdmGen.exe to generate views, and for code-first models you can use EF Power Tools CTP1.
EDMX (the designer) was improved in VS 2010 SP1 to work with large models, but I still think "large" in this case means around 100 entities/tables. At the same time, you rarely need 715 tables in the same model. I believe those 715 tables in fact model several domains, so you can divide them into multiple models.
The same is true when you are using DbContext and code first. If you were designing a class, would you consider it good design for the class to expose 715 properties? I don't think so, but that is exactly what your derived DbContext looks like: it has a public property for each exposed entity set (in the simplest mapping, one property per table).
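To illustrate dividing the model, a minimal sketch of two domain-focused contexts (the entity names are illustrative; EF 4.1 code-first, where DbContext lives in System.Data.Entity):

using System.Data.Entity;

public class Order { public int Id { get; set; } }
public class OrderLine { public int Id { get; set; } }
public class Employee { public int Id { get; set; } }

// Each context exposes only the sets its domain needs,
// instead of one context with 715 properties.
public class OrderingContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
    public DbSet<OrderLine> OrderLines { get; set; }
}

public class HrContext : DbContext
{
    public DbSet<Employee> Employees { get; set; }
}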
The same entity can be used in multiple models, but you should try to avoid that as much as possible, because it can introduce complexities when an entity is loaded in one context type and used in another.
Code-only = code-first = Entity Framework where you define the mapping in code without using an EDMX.
Also take a look at this post:
http://blogs.msdn.com/b/adonet/archive/2008/11/24/working-with-large-models-in-entity-framework-part-1.aspx