CodeFluent persistence race condition - codefluent

As soon as the database becomes slow for some reason (a long-running query, a backup, a performance analyzer),
my web application eventually starts getting the following errors:
System.InvalidOperationException: There is already an open DataReader associated with this Command which must be closed first.
at System.Data.SqlClient.SqlInternalConnectionTds.ValidateConnectionForExecute(SqlCommand command)
at System.Data.SqlClient.SqlCommand.ValidateCommand(String method, Boolean async)
at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(TaskCompletionSource`1 completion, String methodName, Boolean sendToPipe, Int32 timeout, Boolean asyncWrite)
at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
at CodeFluent.Runtime.CodeFluentPersistence.InternalExecuteNonQuery(Boolean firstTry)
at CodeFluent.Runtime.CodeFluentPersistence.InternalExecuteNonQuery(Boolean firstTry)
... (stack trace continues into my code)
CodeFluent.Runtime.CodeFluentRuntimeException: CF1044: Nested persistence command is not supported. Current command: '[Contract_Search]', nested command: '[zref_template_document_LoadBlobFile]'.
at CodeFluent.Runtime.CodeFluentPersistence.CreateStoredProcedureCommand(String schema, String package, String intraPackageName, String name)
at CodeFluent.Runtime.BinaryServices.BinaryLargeObject.GetInputStream(CodeFluentContext context, Int64 rangeStart, Int64 rangeEnd)
... (stack trace continues into my code)
The second error (CF1044) happens when I open two browser windows and perform different actions: a search in one, generating a document in the other.
It's difficult to reproduce and never happens the same way twice.
There is a race condition somewhere that I can't figure out.

Here is what actually worked for me:
public byte[] GetRtfDocumentStreamBuffer(TemplateDocumentType templateType, int culture)
{
    var template = TemplateDocument.LoadActiveByDocumentTypeAndLcid(DateTime.Today, templateType.Id, culture);
    var resultStream = new MemoryStream();

    // Use a dedicated CodeFluentContext for this operation instead of the shared/ambient one,
    // so its persistence layer (connection and reader) is not shared with concurrent requests.
    using (var cf = CodeFluentContext.GetNew(Constants.MyApplicationStoreName))
    using (var templateStream = template.File.GetInputStream(cf, 0, 0))
    using (var resultWriter = new StreamWriter(resultStream, Encoding.GetEncoding("windows-1252")))
    {
        GenerateRtfDocument(....);
        resultWriter.Flush();
    }
    return resultStream.GetBuffer();
}
What I saw in the decompiled CodeFluent runtime is that CodeFluentContext.Dispose() calls CodeFluentPersistence.Dispose(), which closes the reader and disposes the connection.

Related

Using JPA Metamodel in criteria API throws NPE first time

I am using the JPA Criteria API with the JPA metamodel.
final CriteriaBuilder cb = em.getCriteriaBuilder();
final CriteriaQuery<User> criteria = cb.createQuery(User.class);
final Root<User> root = criteria.from(User.class);
criteria.select(root).where(cb.equal(root.get(User_.username), value));
return em.createQuery(criteria).getResultList();
The problem is that the root.get method throws an NPE the first time after the server starts (i.e. on the first execution after a server start), but the same code works fine from the second execution onwards. It doesn't matter when that first execution happens after the server starts; it may be after a few seconds or after a few minutes.
Following is the exception thrown:
java.lang.NullPointerException: null
at org.apache.openjpa.persistence.criteria.PathImpl.get(PathImpl.java:245) ~[openjpa-2.4.2.jar:2.4.2]
If you look at the source code of PathImpl.get
public <Y> Path<Y> get(SingularAttribute<? super X, Y> attr) {
    if (getType() != attr.getDeclaringType()) {
        attr = (SingularAttribute)((ManagedType)getType()).getAttribute(attr.getName());
    }
    return new PathImpl<X,Y>(this, (Members.SingularAttributeImpl<? super X, Y>)attr, attr.getJavaType());
}
you can see it accesses attr.getDeclaringType().
If I look at the auto-generated JPA metamodel class, it looks as follows:
public class User_ {
    public static volatile SingularAttribute<User, String> username;
    ...
}
username is definitely null at class-load time, yet the code works fine except on the first execution.
It appears that the JPA provider populates User_.username at runtime.
So the question is: how do I prevent the NPE on the first execution?
Using TomEE 7.0.4

EF Core 2.0: How to discover the exact object, in an object graph, causing an error in an insert operation?

I have a big, complex object graph that I want to insert into the database using a DbContext and its SaveChanges method.
This object is the result of parsing a text file with 40k lines (around 3 MB of data). Some collections inside this object have thousands of items.
I am able to parse the file correctly and add it to the context so that it starts tracking the object graph, but when I call SaveChanges it says:
Microsoft.EntityFrameworkCore.DbUpdateException: An error occurred while updating the entries. See the inner exception for details. ---> System.Data.SqlClient.SqlException: String or binary data would be truncated.
I would like to know if there is a smart and efficient way of discovering which object is causing the issue. It seems that some varchar column is too small for the data being written to it, but there are a lot of tables and fields to check manually.
I would like to get a more specific error somehow. I already configured an ILoggerProvider and enabled the EnableSensitiveDataLogging option on my DbContext so I can see which SQL queries are being generated. I even added MiniProfiler to see the parameter values, because they are not present in the log generated by the DbContext.
Reading around the web, I found out that EF6 performs some validation before the SQL is sent to the database for execution, but it seems this is no longer available in EF Core. So how can I solve this?
After some research, the only approach I've found to solve this is to implement validation myself by overriding the DbContext's SaveChanges method. I merged these two approaches to build mine:
Implementing Missing Features in Entity Framework Core - Part 3
Validation in EF Core
The result is...
ApplicationDbContext.cs
public override int SaveChanges(bool acceptAllChangesOnSuccess)
{
    ValidateEntities();
    return base.SaveChanges(acceptAllChangesOnSuccess);
}

public override async Task<int> SaveChangesAsync(bool acceptAllChangesOnSuccess, CancellationToken cancellationToken = new CancellationToken())
{
    ValidateEntities();
    return await base.SaveChangesAsync(acceptAllChangesOnSuccess, cancellationToken);
}

// Runs data-annotation validation on every added or modified entity before the
// changes are sent to the database, so the failing entity is reported by name.
private void ValidateEntities()
{
    var serviceProvider = this.GetService<IServiceProvider>();
    var items = new Dictionary<object, object>();
    var entities = from entry in ChangeTracker.Entries()
                   where entry.State == EntityState.Added || entry.State == EntityState.Modified
                   select entry.Entity;

    foreach (var entity in entities)
    {
        var context = new ValidationContext(entity, serviceProvider, items);
        var results = new List<ValidationResult>();
        if (Validator.TryValidateObject(entity, context, results, true)) continue;

        foreach (var result in results)
        {
            if (result == ValidationResult.Success) continue;
            var errorMessage = $"{entity.GetType().Name}: {result.ErrorMessage}";
            throw new ValidationException(errorMessage);
        }
    }
}
Note that it's not necessary to override the other SaveChanges overloads, because they call these two.
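Keep in mind that Validator.TryValidateObject only reports what the data annotations describe, so string properties need an explicit [StringLength] (or [MaxLength]) matching the column size for the truncation problem to be caught before SaveChanges. A minimal, hypothetical entity for illustration (names and lengths are mine, not from the original post):

// Requires: using System.ComponentModel.DataAnnotations;
public class Customer
{
    public int Id { get; set; }

    [StringLength(50)]     // should match the column definition, e.g. nvarchar(50)
    public string Name { get; set; }

    [StringLength(200)]
    public string Address { get; set; }
}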
The error tells you that you're writing more characters to a field than it can hold.
For example, this error would be thrown if you create a field as NVARCHAR(4) or CHAR(4) and write 'hello' to it.
So you could simply check the length of the values you read in to find the one causing your problem; there is at least one that is too long for its field.
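A hedged sketch of what such a check could look like using EF Core's model metadata (the method name and exception message are mine, not from this answer). It compares tracked string values against the maximum length configured for each property, where one is configured:

// Requires: using Microsoft.EntityFrameworkCore; using System; using System.Linq;
// Call this before base.SaveChanges (like ValidateEntities above) to pinpoint the offending value.
private void ReportOversizedStrings()
{
    var entries = ChangeTracker.Entries()
        .Where(e => e.State == EntityState.Added || e.State == EntityState.Modified);

    foreach (var entry in entries)
    {
        foreach (var property in entry.Metadata.GetProperties().Where(p => p.ClrType == typeof(string)))
        {
            var maxLength = property.GetMaxLength();               // null when no max length is configured (e.g. nvarchar(max))
            var value = entry.Property(property.Name).CurrentValue as string;

            if (maxLength.HasValue && value != null && value.Length > maxLength.Value)
            {
                throw new InvalidOperationException(
                    $"{entry.Metadata.ClrType.Name}.{property.Name} has {value.Length} characters, " +
                    $"but the column only allows {maxLength.Value}.");
            }
        }
    }
}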

EF 5 code first - using a method to determine the connection

Is it possible to have the connection string supplied by a method (instead of using App.config)?
(App.config is not an option because the location of the .sdf file can change.)
My connection string (for the .sdf) can change via an OpenFileDialog, for instance.
I have already seen the hint concerning SQLEXPRESS.
When debugging, the migration gets as far as IMigrationMetadata.Id.
Then: DataException - An exception occurred while initializing the database.
at System.Data.Common.DbProviderServices.GetProviderManifestToken(DbConnection connection)
ProviderIncompatibleException - The provider did not return a ProviderManifestToken string.
{"An error occurred while getting provider information from the database. This can be caused by Entity Framework using an incorrect connection string. "}
at System.Data.Entity.ModelConfiguration.Utilities.DbProviderServicesExtensions.GetProviderManifestTokenChecked(DbProviderServices providerServices, DbConnection connection)
The default ctor applies a connection string that cannot be used:
Data Source=.\SQLEXPRESS;Initial Catalog=Namespace.AbacusContext;Integrated Security=True;MultipleActiveResultSets=True;Application Name=EntityFrameworkMUE
But when invoking
this.Database.Initialize( false ); //good connStr applied
within the ctor that takes arguments, the default ctor is invoked automatically (not by me), and that one uses the impaired connection string. See the overview:
public AbacusContext( DbConnection connection, bool contextOwnsConnection )
    : base( connection, contextOwnsConnection )
{
    Database.SetInitializer( new MigrateDatabaseToLatestVersion<AbacusContext, Configuration>() );
    string goodConnStr = this.Database.Connection.ConnectionString; // was used for a breakpoint
    this.Database.Initialize( false );
}

public AbacusContext()
{
    string badConnStr = this.Database.Connection.ConnectionString;
    // SQLExpress and some other stuff can be seen now
}
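One direction worth trying (a sketch under assumptions, not a verified fix for the migration scenario above): route every constructor, including the default one, through a connection string built at runtime, so EF never falls back to the convention-based .\SQLEXPRESS string. UserSettings.SelectedSdfPath is a hypothetical place where the path picked in the OpenFileDialog is stored, and the sketch assumes the SQL Server Compact provider/connection factory is configured for .sdf files:

public class AbacusContext : DbContext
{
    public AbacusContext()
        : base(BuildConnectionString())   // the default ctor now also gets the "good" connection string
    {
        Database.SetInitializer(new MigrateDatabaseToLatestVersion<AbacusContext, Configuration>());
    }

    // Hypothetical helper: builds the connection string from the .sdf path the user picked.
    private static string BuildConnectionString()
    {
        return "Data Source=" + UserSettings.SelectedSdfPath;
    }
}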

EF code first migration error "Object has been disconnected or does not exist at the server"

I am using Entity Framework 6.1.1 on SQL Server 2008 and I have a long-running code-first migration (about 20 minutes). It gets to the end and then fails with the following error.
System.Runtime.Remoting.RemotingException: Object '/f10901d8_94fe_4db4_bb9d_51cd19292b01/bq6vk4vkuz5tkri2x8nwhsln_106.rem' has been disconnected or does not exist at the server.
at System.Data.Entity.Migrations.Design.ToolingFacade.ToolLogger.Verbose(String sql)
at System.Data.Entity.Migrations.Infrastructure.MigratorLoggingDecorator.ExecuteSql(DbTransaction transaction, MigrationStatement migrationStatement, DbInterceptionContext interceptionContext)
at System.Data.Entity.Migrations.DbMigrator.ExecuteStatementsInternal(IEnumerable`1 migrationStatements, DbTransaction transaction, DbInterceptionContext interceptionContext)
at System.Data.Entity.Migrations.DbMigrator.ExecuteStatementsInternal(IEnumerable`1 migrationStatements, DbConnection connection)
at System.Data.Entity.Migrations.DbMigrator.<>c__DisplayClass30.<ExecuteStatements>b__2e()
at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.<>c__DisplayClass1.<Execute>b__0()
at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)
at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute(Action operation)
at System.Data.Entity.Migrations.DbMigrator.ExecuteStatements(IEnumerable`1 migrationStatements, DbTransaction existingTransaction)
at System.Data.Entity.Migrations.DbMigrator.ExecuteStatements(IEnumerable`1 migrationStatements)
at System.Data.Entity.Migrations.Infrastructure.MigratorBase.ExecuteStatements(IEnumerable`1 migrationStatements)
at System.Data.Entity.Migrations.DbMigrator.ExecuteOperations(String migrationId, XDocument targetModel, IEnumerable`1 operations, IEnumerable`1 systemOperations, Boolean downgrading, Boolean auto)
at System.Data.Entity.Migrations.DbMigrator.ApplyMigration(DbMigration migration, DbMigration lastMigration)
at System.Data.Entity.Migrations.Infrastructure.MigratorLoggingDecorator.ApplyMigration(DbMigration migration, DbMigration lastMigration)
at System.Data.Entity.Migrations.DbMigrator.Upgrade(IEnumerable`1 pendingMigrations, String targetMigrationId, String lastMigrationId)
at System.Data.Entity.Migrations.Infrastructure.MigratorLoggingDecorator.Upgrade(IEnumerable`1 pendingMigrations, String targetMigrationId, String lastMigrationId)
at System.Data.Entity.Migrations.DbMigrator.UpdateInternal(String targetMigration)
at System.Data.Entity.Migrations.DbMigrator.<>c__DisplayClassc.<Update>b__b()
at System.Data.Entity.Migrations.DbMigrator.EnsureDatabaseExists(Action mustSucceedToKeepDatabase)
at System.Data.Entity.Migrations.Infrastructure.MigratorBase.EnsureDatabaseExists(Action mustSucceedToKeepDatabase)
at System.Data.Entity.Migrations.DbMigrator.Update(String targetMigration)
at System.Data.Entity.Migrations.Infrastructure.MigratorBase.Update(String targetMigration)
at System.Data.Entity.Migrations.Design.ToolingFacade.UpdateRunner.Run()
at System.AppDomain.DoCallBack(CrossAppDomainDelegate callBackDelegate)
at System.AppDomain.DoCallBack(CrossAppDomainDelegate callBackDelegate)
at System.Data.Entity.Migrations.Design.ToolingFacade.Run(BaseRunner runner)
at System.Data.Entity.Migrations.Design.ToolingFacade.Update(String targetMigration, Boolean force)
at System.Data.Entity.Migrations.UpdateDatabaseCommand.<>c__DisplayClass2.<.ctor>b__0()
at System.Data.Entity.Migrations.MigrationsDomainCommand.Execute(Action command)
The point of the migration is to update a field in the database that stores the MIME type of some binary data. It loops through every row, reads the binary data, attempts to determine what kind of content it is, then writes the appropriate MIME type value into that row.
The script below uses ADO.NET to generate a list of update statements to run. I use ADO.NET because I must use .NET's imaging libraries (System.Drawing.Imaging.ImageFormat) to determine the type of binary content in each row (it will be a JPEG, PNG, or PDF).
public override void Up()
{
    List<string> updateStatements = new List<string>();

    using (SqlConnection conn = new SqlConnection(ConfigurationManager.AppSettings["ConnectionString"]))
    {
        SqlCommand cmd = new SqlCommand("SELECT Table1ID, Image FROM Table1", conn);
        conn.Open();

        // Read each record and update the content type value based on the type of data stored.
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                long idValue = Convert.ToInt64(reader["Table1ID"]);
                byte[] data = (byte[])reader["Image"];
                string contentType = GetMimeType(data);
                updateStatements.Add(string.Format("UPDATE Table1 SET Content_Type = {0} WHERE Table1ID = {1}", contentType, idValue));
            }
        }
    }

    foreach (string updateStatement in updateStatements)
        Sql(updateStatement);
}
public string GetMimeType(byte[] document)
{
    if (document != null && document.Length > 0)
    {
        ImageFormat format = null;
        try
        {
            MemoryStream ms = new MemoryStream(document);
            Image img = Image.FromStream(ms);
            format = img.RawFormat;
        }
        catch (Exception)
        {
            /* PDF documents will throw exceptions since they aren't images, but you can check whether it's really a PDF
             * by inspecting the first four bytes, which will be 0x25 0x50 0x44 0x46 ("%PDF"). */
            if (document[0] == 0x25 && document[1] == 0x50 && document[2] == 0x44 && document[3] == 0x46)
                return PDF;
            else
                return NULL;
        }
        if (format.Equals(ImageFormat.Jpeg))
        {
            return JPG;
        }
        else if (format.Equals(System.Drawing.Imaging.ImageFormat.Png))
        {
            return PNG;
        }
    }
    return NULL;
}
I've seen this five-year-old post, but the articles it links to do not seem to exist anymore; at least I can't find them.
Does anyone know what's going on here?
-- UPDATE --
This appears to have something to do with how long the migration takes to run. I created a migration that does absolutely nothing other than sleep for 22 minutes
public override void Up()
{
    System.Threading.Thread.Sleep(1320000);
}
and I got the same error, so it appears to be a timeout thing. I'm not 100% sure what object on the server they are referring to, and I can't find much on this issue as it relates to code-first migrations.
I tried setting the CommandTimeout property in the migrations Configuration.cs file to 5000, but it didn't help. I also tried setting SQL Server's "Remote query timeout" setting to 0 to prevent any timeouts, but that didn't help either.
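For reference, this is roughly what that Configuration.cs attempt looked like (a sketch; the context type name is a placeholder). As the answer below explains, this setting only affects individual SQL commands, not the remoting lease that is actually expiring:

internal sealed class Configuration : DbMigrationsConfiguration<MyDbContext>
{
    public Configuration()
    {
        AutomaticMigrationsEnabled = false;
        CommandTimeout = 5000; // seconds; available from EF 6.1, but it did not help in this case
    }
}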
Poached from GitHub EntityFramework 6 issue #96: https://github.com/aspnet/EntityFramework6/issues/96#issuecomment-289782427
The issue is that the ToolLogger lease lifetime (the base class MigrationsLogger is a MarshalByRefObject) is at the default (5 minutes). The ToolingFacade creates the logger, which lives in the main program's app domain. The migrations run in a different app domain. If a migration takes longer than 5 minutes, the attempt to log any further information results in this error. A solution would be to increase the lease lifetime in the main program. So... in the main program, prior to creating the ToolingFacade, set the lease lifetime to a longer time period:
using System.Runtime.Remoting.Lifetime;
...
LifetimeServices.LeaseTime = TimeSpan.FromHours(1);
It's a known issue in Entity Framework 6 for scripts that take a long time to complete.
A workaround is to generate only SQL script via the Update-Database command and execute the generated SQL directly on the SQL Server. In order to generate only the SQL you have to use the -Script flag:
Update-Database -Script
This has been causing us headaches. The problem appears to be due to the design of the EF migration utility. The program creates a new AppDomain in which to run migrations. The logging for the new AppDomain is handled in the original AppDomain (which is why remoting gets involved). Apparently the logger gets GC'ed if an individual migration takes too much time. I've verified this by replacing all the logger calls with Console.WriteLine, which makes the problem go away. There might be a fix by altering the migrate.exe tool (though it may also require altering the EntityFramework assembly itself).
I had this problem because the connection string in my web.config was pointing to the wrong server name.
Actually, I had to change my c:\windows\system32\drivers\etc\hosts file to specify the correct IP address for this DB host name.
Just make sure your DB server is accessible.

Entity Framework Attaching a Persisted Object to the New Object

I am trying to perform a very simple task: add a user with a role to the database. The roles are already populated in the database and I am simply adding the role to the user's Roles collection, but it keeps throwing the following exception:
The EntityKey property can only be set when the current value of the property is null.
Here is the code in User.cs:
public void AddRole(Role role)
{
    if (!Exists(role))
    {
        role.User = this;
        Roles.Add(role);
    }
}
And here is the test that fails:
[Test]
public void should_save_user_with_role_successfully()
{
    var _role = _roleRepository.GetByName("Student");
    _user.AddRole(_role);
    _userRepository.Save(_user);
    Assert.IsTrue(_user.UserId > 0);
}
The Repository Code:
public bool Save(User user)
{
    bool isSaved = false;
    using (var db = new EStudyDevDatabaseEntities())
    {
        db.AddToUsers(user);
        isSaved = db.SaveChanges() > 0;
    }
    return isSaved;
}
Here are the Exists and AddRole methods:
public bool Exists(Role role)
{
    var assignedRole = (from r in Roles
                        where r.RoleName.Equals(role.RoleName)
                        select r).SingleOrDefault();
    if (assignedRole != null) return true;
    return false;
}

public void AddRole(Role role)
{
    if (!Exists(role))
    {
        role.User = this;
        Roles.Add(role);
    }
}
And here is the whole exception:
------ Test started: Assembly: EStudy.Repositories.TestSuite.dll ------
TestCase 'EStudy.Repositories.TestSuite.Repositories.when_saving_new_user.should_save_user_with_role_successfully'
failed: System.InvalidOperationException : The EntityKey property can only be set when the current value of the property is null.
at System.Data.Objects.EntityEntry.GetAndValidateChangeMemberInfo(String entityMemberName, Object complexObject, String complexObjectMemberName, StateManagerTypeMetadata& typeMetadata, String& changingMemberName, Object& changingObject)
at System.Data.Objects.EntityEntry.EntityMemberChanging(String entityMemberName, Object complexObject, String complexObjectMemberName)
at System.Data.Objects.EntityEntry.EntityMemberChanging(String entityMemberName)
at System.Data.Objects.ObjectStateEntry.System.Data.Objects.DataClasses.IEntityChangeTracker.EntityMemberChanging(String entityMemberName)
at System.Data.Objects.DataClasses.EntityObject.set_EntityKey(EntityKey value)
at System.Data.Objects.Internal.LightweightEntityWrapper`1.set_EntityKey(EntityKey value)
at System.Data.Objects.ObjectStateManager.AddEntry(IEntityWrapper wrappedObject, EntityKey passedKey, EntitySet entitySet, String argumentName, Boolean isAdded)
at System.Data.Objects.ObjectContext.AddSingleObject(EntitySet entitySet, IEntityWrapper wrappedEntity, String argumentName)
at System.Data.Objects.DataClasses.RelatedEnd.AddEntityToObjectStateManager(IEntityWrapper wrappedEntity, Boolean doAttach)
at System.Data.Objects.DataClasses.RelatedEnd.AddGraphToObjectStateManager(IEntityWrapper wrappedEntity, Boolean relationshipAlreadyExists, Boolean addRelationshipAsUnchanged, Boolean doAttach)
at System.Data.Objects.DataClasses.RelatedEnd.IncludeEntity(IEntityWrapper wrappedEntity, Boolean addRelationshipAsUnchanged, Boolean doAttach)
at System.Data.Objects.DataClasses.EntityCollection`1.Include(Boolean addRelationshipAsUnchanged, Boolean doAttach)
at System.Data.Objects.DataClasses.RelationshipManager.AddRelatedEntitiesToObjectStateManager(Boolean doAttach)
at System.Data.Objects.ObjectContext.AddObject(String entitySetName, Object entity)
C:\Projects\EStudy\EStudySolution\EStudy.BusinessObjects\Entities\EStudyModel.Designer.cs(97,0): at EStudy.BusinessObjects.Entities.EStudyDevDatabaseEntities.AddToUsers(User user)
C:\Projects\EStudy\EStudySolution\EStudy.BusinessObjects\Repositories\UserRepository.cs(17,0): at EStudy.BusinessObjects.Repositories.UserRepository.Save(User user)
C:\Projects\EStudy\EStudySolution\EStudy.Repositories.TestSuite\Repositories\Test_UserRepository.cs(47,0): at EStudy.Repositories.TestSuite.Repositories.when_saving_new_user.should_save_user_with_role_successfully()
0 passed, 1 failed, 0 skipped, took 6.07 seconds (NUnit 2.5).
UPDATE:
Here are my UserRepository and RoleRepository; they both use separate contexts:
public bool Save(User user)
{
    bool isSaved = false;
    using (var db = new EStudyDevDatabaseEntities())
    {
        db.AddToUsers(user);
        isSaved = db.SaveChanges() > 0;
    }
    return isSaved;
}

public Role GetByName(string roleName)
{
    using (var db = new EStudyDevDatabaseEntities())
    {
        return db.Roles.SingleOrDefault(x => x.RoleName.ToLower().Equals(roleName.ToLower()));
    }
}
As you can see, the user and the role are loaded using different contexts, which you have already pointed out. The problem with using a single data context is that I cannot layer the application properly.
Updated again based on updated question
I disagree that you "can't layer the application properly" when you share a context between repositories. It's a problem that you need to solve, but it's most certainly solvable. Also, I think you will find it considerably easier to solve than the many problems you create when you attempt to use multiple contexts.
At any rate, there are really only two possible solutions to your problem:
1. Manually keep track of which context a particular entity is attached to, and transfer it (with Attach and Detach) when necessary.
2. Share a context between repository instances.
In our ASP.NET MVC applications, the logical unit of work is a single Request. Therefore, we instantiate an ObjectContext at the beginning of a request, Dispose it at the end of a request, and inject it into new repositories when we create them. Repository instances never outlast a single request.
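For illustration only (not code from the original answer), here's a sketch of what sharing one context between the two repositories could look like, reusing the type and member names already shown above:

// Both repositories receive the same ObjectContext instance instead of creating their own.
public class RoleRepository
{
    private readonly EStudyDevDatabaseEntities _db;
    public RoleRepository(EStudyDevDatabaseEntities db) { _db = db; }

    public Role GetByName(string roleName)
    {
        return _db.Roles.SingleOrDefault(x => x.RoleName.ToLower() == roleName.ToLower());
    }
}

public class UserRepository
{
    private readonly EStudyDevDatabaseEntities _db;
    public UserRepository(EStudyDevDatabaseEntities db) { _db = db; }

    public bool Save(User user)
    {
        _db.AddToUsers(user);
        return _db.SaveChanges() > 0;
    }
}

// Per-request (unit-of-work) usage:
// using (var db = new EStudyDevDatabaseEntities())
// {
//     var roles = new RoleRepository(db);
//     var users = new UserRepository(db);
//     user.AddRole(roles.GetByName("Student"));
//     users.Save(user);
// }

Because the Role returned by GetByName and the User being saved are then tracked by the same context, attaching the role to the user no longer trips over an EntityKey belonging to another context.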
Update based on updated question
Do the role repository and the user repository each have a (separate) context? Here's what's going on in the stack trace:
1. You add the User to the context.
2. The RelationshipManager goes through the User and ensures that any related entities are also in the context. This involves, among other things, setting their EntityKey property.
3. Presuming that the Role came from a different context (which appears to be the case, since otherwise the context should detect that the role is already in the context), you should see an error indicating that you cannot add an entity attached to one context into another context. For some reason, you're not seeing that here. But nevertheless, it's not a valid operation.
4. Instead, you get an error when the EntityKey of the role is assigned.
In my opinion, using a single ObjectContext at a time should be the general rule for working with the EntityFramework. You should use multiple contexts only when you're absolutely forced to, which, in my experience, is almost never. Working with multiple ObjectContexts concurrently is significantly harder than working with one at a time.
OK, I don't know the details of your mapping, but I would expect AddRole to be something more along the lines of:
public void AddRole(Role role)
{
    this.Roles.Add(role);
}
... if User -> Role is *..*, or:
public void AddRole(Role role)
{
    this.Role = role;
}
if User -> Role is *..1.
If this doesn't help, please post the stack trace for the exception.