Entity Framework with .NET Core is inserting duplicate records - entity-framework-core

I am using a .NET 5 web app (on IIS) with EF Core. When a user adds a new record, duplicate records are occasionally inserted into the table every few seconds, as if the request were stuck in a loop; it won't stop until I restart the application pool. It doesn't happen every time, but it does happen every few records.
My database is SQL Server.
My code is below:
[HttpPost]
[ValidateAntiForgeryToken]
public async Task<IActionResult> Create([Bind("ProductId,Name,Category,Color,UnitPrice,AvailableQuantity")] Products products)
{
    if (ModelState.IsValid)
    {
        _context.Add(products);
        await _context.SaveChangesAsync();
        return RedirectToAction(nameof(Index));
    }
    return View(products);
}


Azure Remote Tables And Entity Framework: Join two tables

I read through some answers here, but they don't seem to apply to my use case.
I have a few tables running on an Azure Mobile Apps backend, and I can query them just fine with Entity Framework.
I know that I could do two queries and just join the results on the client side, but I'm sure there is a better way.
My issue, however, is that I have a table inside my middleware, and then a data service with a remote model class that I use to query data from my service. This is how I would query a user by email:
public async Task<IEnumerable<UserItem>?> GetItemByEmail(string email)
{
    await InitializeAsync();
    try
    {
        return await _table.GetAsyncItems().Where(x => x.Email == email).ToListAsync();
    }
    catch (Exception e)
    {
        // Swallowing the exception hides failures; consider logging it.
        return null;
    }
}
The issue, however, is that this only works on the one table (UserItemTable) that is referenced in _table.
There is no reference to a second table with which I want to join.
Maybe the RemoteTable controllers aren't the right place to do this?
Any help would be appreciated.
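For what it's worth, the client-side join mentioned in the question can be sketched with plain LINQ once both result sets have been fetched from their respective remote tables. `OrderItem` and all property names below are hypothetical, not from the question; this is a sketch of the in-memory join, not of the remote table API:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical remote model classes; adjust to your actual tables.
public class UserItem
{
    public string Id { get; set; }
    public string Email { get; set; }
}

public class OrderItem
{
    public string UserId { get; set; }
    public string ProductName { get; set; }
}

public static class ClientSideJoin
{
    // After fetching both lists (e.g. via two GetItemByEmail-style calls),
    // join them in memory on the user id.
    public static List<(string Email, string ProductName)> JoinUserOrders(
        IEnumerable<UserItem> users, IEnumerable<OrderItem> orders)
    {
        return users.Join(
            orders,
            u => u.Id,        // outer key
            o => o.UserId,    // inner key
            (u, o) => (u.Email, o.ProductName)).ToList();
    }
}
```

This keeps the two remote queries simple and does the correlation locally, which is the fallback the question already considered.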

Is it possible to improve EF6 WarmUp time?

I have an application in which I observe the following behavior: the first requests after a long period of inactivity take a long time, and sometimes time out.
Is it possible to control how Entity Framework manages the disposal of its objects? Is it possible to mark some entities to never be disposed?
...in order to avoid/improve the warm-up time?
The reasons that similar queries have an improved response time are manifold.
Most database management systems cache parts of the fetched data, so that similar queries in the near future are faster. If you query Teachers with their Students, the Teachers table is joined with the Students table, and this join result is quite often cached for a while. The next query for Teachers with their Students reuses the join result and is thus faster.
The DbContext caches queried objects. If you query a Single teacher, or Find one, it is kept in local memory; this is how SaveChanges later detects which items have changed. If you Find the same Teacher again, that query is faster. I'm not sure whether the same happens if you query 1000 Teachers.
When you create a DbContext object, the initializer runs to check whether the model has changed.
So it might seem wise not to Dispose() a created DbContext, yet you see that most people keep a DbContext alive for a fairly short time:
using (var dbContext = new MyDbContext(...))
{
    var fetchedTeacher = dbContext.Teachers
        .Where(teacher => teacher.Id == ...)
        .Select(teacher => new
        {
            Id = teacher.Id,
            Name = teacher.Name,
            Students = teacher.Students.ToList(),
        })
        .FirstOrDefault();
    return fetchedTeacher;
}
// DbContext is Disposed()
At first glance it would seem better to keep the DbContext alive: if someone asks for the same Teacher, the DbContext wouldn't have to ask the database for it; it could return the local Teacher.
However, keeping a DbContext alive might mean you get stale data. If someone else changes the Teacher between your first and second query, you would get the old Teacher data.
Hence it is wise to keep the lifetime of a DbContext as short as possible.
Is there nothing I can do to improve the speed of the first query?
Yes you can!
One of the first things you could do is configure the database initializer so that it doesn't check the existence and model of the database. Of course, you can only do this when you are fairly sure that your database exists and hasn't changed.
// constructor; disables initializer
public SchoolDBContext() : base(...)
{
    // Disable initializer
    Database.SetInitializer<SchoolDBContext>(null);
}
Another option: if you have already fetched the object you want to update, and you are sure that no one else has changed it, you can Attach it instead of fetching it again, as shown in this question.
Normal usage:
// update the name of the teacher with teacherId
void ChangeTeacherName(int teacherId, string name)
{
    using (var dbContext = new SchoolContext(...))
    {
        // fetch the teacher, change the name and save
        Teacher fetchedTeacher = dbContext.Teachers.Find(teacherId);
        fetchedTeacher.Name = name;
        dbContext.SaveChanges();
    }
}
Using Attach to update an earlier fetched Teacher:
void ChangeTeacherName(Teacher teacher, string name)
{
    using (var dbContext = new SchoolContext(...))
    {
        dbContext.Teachers.Attach(teacher);
        dbContext.Entry(teacher).Property(t => t.Name).IsModified = true;
        dbContext.SaveChanges();
    }
}
Using this method doesn't require fetching the Teacher again. During SaveChanges, the IsModified value of every property of every attached item is checked, and the corresponding columns are updated as needed.

UnitofWork [aspnetboilerplate] Transaction Management

I am currently working on implementing aspnetboilerplate's transaction management.
Below is the method I am using to insert an order and the products associated with it:
public class OrderController
{
    IOrderAppService _orderAppService;

    public OrderController(IOrderAppService orderAppService)
    {
        _orderAppService = orderAppService;
    }

    public void TestOrder()
    {
        _orderAppService.TestTransaction();
    }
}

public class OrderAppService : IOrderAppService
{
    // repositories are injected here

    public void TestTransaction()
    {
        // Created 'order' and 'products' here
        // Committing the created objects
        CommitOrderTransaction();
    }

    private void CommitOrderTransaction()
    {
        using (var unitOfWork = _unitOfWorkManager.Begin())
        {
            // Inserts the Order record
            CommitInsertOrderHeader(); // Order header is saved in the database by calling SaveChanges()
            // Inserts the Product records associated with the OrderId
            CommitInsertOrderDetails();
            unitOfWork.Complete();
        }
    }
}
As the aspnetboilerplate documentation states:
"if current unit of work is transactional, all changes in the transaction are rolled back if an exception occurs, even saved changes."
In my case, when an exception occurs on inserting the OrderDetails, I would like the header record to be rolled back as well, but I still end up with the Order header record in the database.
You don't need to handle transactions manually. ABP handles it for you! All application service methods are automatically wrapped in a unit of work: it's an atomic operation, so if any exception occurs in the middle of the transaction, all the db operations are rolled back.
For further information, check out https://aspnetboilerplate.com/Pages/Documents/Unit-Of-Work
If you are calling SaveChanges() twice and you aren't using a TransactionScope across both calls, you won't be able to roll back the first one. I don't know what UnitOfWork is doing here, but if the DbContext you are working with isn't enlisted in that UoW, nothing is going to happen; a DbContext is technically its own unit of work already. You should add the Orders and Order Details to the same DbContext and call SaveChanges() just once. Then you would be able to roll back both in that scenario.
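To make that last answer concrete, here is a minimal sketch of adding the header and detail rows to one DbContext and calling SaveChanges once, so that a failure rolls back everything. The entity and context names are assumptions, not from the question:

```csharp
using System.Collections.Generic;

// Sketch: one DbContext, one SaveChanges, so header and details
// succeed or fail together. Order, OrderDetail and OrderDbContext
// are hypothetical names.
public class OrderWriter
{
    public void InsertOrder(OrderDbContext db, Order order, List<OrderDetail> details)
    {
        db.Orders.Add(order);               // header row
        db.OrderDetails.AddRange(details);  // detail rows

        // A single SaveChanges runs in one implicit transaction:
        // if any insert fails, nothing is committed.
        db.SaveChanges();
    }
}
```

With this shape there is no need for a TransactionScope at all, because EF's single flush already gives atomicity across both tables.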

How to use EF core owned entites

I am using a SQLite database with EF Core 2.0 preview in a UWP project.
The address table is split into two different entities:
Delivery Address,
Invoice Address
using
modelBuilder.Entity<Project>().OwnsOne(p => p.DeliveryAddress);
This works great for setting up the database: migrations create the separate table, and with test data that I put in manually, reading from these tables works great too. But how do I save changes to the DeliveryAddress table? Nothing is persisted to the database when I save using:
public void UpdateDeliveryAddress(Project modifiedProject)
{
    using (var db = new SteelFrameCalculatorDataContext())
    {
        db.Entry(modifiedProject).State = EntityState.Modified;
        db.SaveChanges();
    }
}
Project being the parent entity
2017-06-11T23:21:10.9242463+01:00 Warning 8 Microsoft.EntityFrameworkCore.Model.Validation
The key {'ProjectId'} on entity type 'Project.DeliveryAddress->Address' contains properties in shadow state - {'ProjectId'}. To configure this warning use the DbContextOptionsBuilder.ConfigureWarnings API (event id 'CoreEventId.ModelValidationShadowKeyWarning'). ConfigureWarnings can be used when overriding the DbContext.OnConfiguring method or using AddDbContext on the application service provider.
Using the following allowed me to save updates to the database. I assume UpdateRange(entity) marks everything in the graph as modified. Not sure if this is the correct way, but it works.
using (var db = new SteelFrameCalculatorDataContext())
{
    db.UpdateRange(modifiedProject);
    db.SaveChanges();
}
Have you tried setting the state of the child object? Looks like you're only setting the parent Project state.
Adding this should do it:
db.Entry(modifiedProject.DeliveryAddress).State = EntityState.Modified;
db.Entry(modifiedProject).Reference(a=>a.DeliveryAddress).TargetEntry.State = EntityState.Modified;

CreateOrUpdate Operation Over Many Records Using Entity Framework 6

I am writing a web crawler for fun.
I have a remote SQL database in which I want to save information about each page I visit, and I am using Entity Framework 6 to persist the data. For the sake of illustration, let's assume that the only data I want to save about each page is the last time I visited it.
Updating this database is very slow. Here is the operation that I want to make fast:
For the current page, check if it already exists in the database.
If it exists, update the "lastVisited" timestamp field on the record and save.
If it doesn't exist, create it.
Currently I can only do 300 updates per minute. My SQL Server instance shows almost no activity, so I assume I am client-bound.
My code is naive:
public static void AddOrUpdatePage(long id, DataContext db)
{
    Page p = db.Pages.SingleOrDefault(f => f.id == id);
    if (p == null)
    {
        // create
        p = new Page();
        p.id = id;
        db.Pages.Add(p);
    }
    p.lastSeen = DateTime.Now;
    db.SaveChanges();
}
I crawl a bunch of pages (thousands) and then call AddOrUpdatePage in a loop for each page.
It seems like the way to get more speed is batching. What is the best way to fetch 1000 records from my database at a time, given a set of page ids? In SQL I would use a table variable and join against it, or use a lengthy IN clause.
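One common EF 6 approach to the batching described above is to look up each chunk of ids with Contains, which EF translates to an IN clause, and then flush once per chunk instead of once per page. This is a sketch under the question's own Page/DataContext types; the chunk size is an arbitrary assumption, not a tuned value:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class PageBatcher
{
    // Sketch: one SELECT (IN clause) and one SaveChanges per chunk,
    // instead of one round trip per page as in AddOrUpdatePage.
    public static void AddOrUpdatePages(IEnumerable<long> ids, DataContext db)
    {
        const int chunkSize = 1000; // assumption; tune for your workload

        foreach (var chunk in ids
            .Select((id, i) => new { id, i })
            .GroupBy(x => x.i / chunkSize, x => x.id))
        {
            var chunkIds = chunk.ToList();

            // Translates to WHERE id IN (...) — one round trip per chunk.
            var existing = db.Pages
                .Where(p => chunkIds.Contains(p.id))
                .ToDictionary(p => p.id);

            foreach (var id in chunkIds)
            {
                if (!existing.TryGetValue(id, out var p))
                {
                    // create
                    p = new Page { id = id };
                    db.Pages.Add(p);
                }
                p.lastSeen = DateTime.Now;
            }

            db.SaveChanges(); // one flush per chunk
        }
    }
}
```

Note that very long IN lists have their own cost, and the change tracker grows with every chunk; recreating the DbContext between chunks (or disabling AutoDetectChangesEnabled while adding) is a common complementary tweak.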