Testing with a fake DbContext, AutoFixture and Moq - entity-framework

So I followed this example on how to make a fake DbContext for testing. Using just that, my test works fine:
[Test]
public void CiudadIndex()
{
    var ciudades = new FakeDbSet<Ciudad>
    {
        new Ciudad {CiudadId = 1, EmpresaId = 1, Descripcion = "Santa Cruz", FechaProceso = DateTime.Now, MarcaBaja = null, UsuarioId = 1},
        new Ciudad {CiudadId = 2, EmpresaId = 1, Descripcion = "La Paz", FechaProceso = DateTime.Now, MarcaBaja = null, UsuarioId = 1},
        new Ciudad {CiudadId = 3, EmpresaId = 1, Descripcion = "Cochabamba", FechaProceso = DateTime.Now, MarcaBaja = null, UsuarioId = 1}
    };

    // Create mock unit of work
    var mockData = new Mock<IContext>();
    mockData.Setup(m => m.Ciudades).Returns(ciudades);

    // Setup controller
    var homeController = new CiudadController(mockData.Object);

    // Invoke
    var viewResult = homeController.Index();
    var ciudades_de_la_vista = (IEnumerable<Ciudad>)viewResult.Model;

    // Assert...
}
I am now trying to use AutoFixture with Moq to create "ciudades", but I can't get it to work. I tried this:
var fixture = new Fixture();
var ciudades = fixture.Build<FakeDbSet<Ciudad>>().CreateMany<FakeDbSet<Ciudad>>();
var mockData = new Mock<IContext>();
mockData.Setup(m => m.Ciudades).Returns(ciudades);
I get this error:
Cannot convert System.Collections.Generic.IEnumerable<FakeDbSet<Ciudad>> to System.Data.Entity.IDbSet<Ciudad>
Implementation of IContext and FakeDbSet
public interface IContext
{
    IDbSet<Ciudad> Ciudades { get; }
}

public class FakeDbSet<T> : IDbSet<T> where T : class
How can I make this work?

A minor point... In stuff like:
var ciudades_fixture = fixture.Build<Ciudad>().CreateMany<Ciudad>();
The second type argument is unnecessary; it should just be:
var ciudades_fixture = fixture.Build<Ciudad>().CreateMany();
I don't really understand why you need a FakeDbSet, and the article is a bit TL;DR... In general, I try to avoid faking and mucking with ORM bits and instead deal with interfaces returning POCOs to the greatest degree possible.
That aside... The reason the normal syntax for initialising the list works is that there is an Add (and IEnumerable) on FakeDbSet. AutoFixture doesn't have a story for that pattern directly (after all, it is compiler syntactic sugar and not particularly amenable to reflection or in line with any other conventions), but you can use AddManyTo as long as there is an ICollection in play. Luckily, within the implementation of FakeDbSet as in the article, the following gives us an in:
public ObservableCollection<T> Local
{
    get { return _data; }
}
As ObservableCollection<T> derives from ICollection<T>, you should be able to:
var ciudades = new FakeDbSet<Ciudad>();
fixture.AddManyTo(ciudades.Local);
var mockData = new Mock<IContext>();
mockData.Setup(m => m.Ciudades).Returns(ciudades);
It's possible to wire up a customization to make this prettier, but at least you have a way to manage it. The other option is to have something implement ICollection<T> (or add a property with a setter taking IEnumerable<T>) and have AF generate the parent object, causing said collection to be filled in.
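For example, a rough sketch of such a customization (my own illustration, not from the article; FakeDbSetCustomization is a made-up name, and it relies on the Local property shown above):
public class FakeDbSetCustomization<T> : ICustomization where T : class
{
    public void Customize(IFixture fixture)
    {
        // whenever a FakeDbSet<T> is requested, create one and fill it through
        // its Local property, which is the ICollection<T> that AddManyTo needs
        fixture.Register<FakeDbSet<T>>(() =>
        {
            var set = new FakeDbSet<T>();
            fixture.AddManyTo(set.Local);
            return set;
        });
    }
}

// usage:
// fixture.Customize(new FakeDbSetCustomization<Ciudad>());
// var ciudades = fixture.Create<FakeDbSet<Ciudad>>();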
Long superseded side note: In your initial question, you effectively have:
fixture.Build<FakeDbSet<Ciudad>>().CreateMany()
The problem becomes clearer then: you are asking AF to generate many FakeDbSet<Ciudad>s, which is not what you want.

I haven't used AutoFixture in a while, but shouldn't it be:
var ciudades = new FakeDbSet<Ciudad>();
fixture.AddManyTo(ciudades);

For the moment I ended up doing this; I will keep reading about how to use AutoMoq, since I'm new to this:
var fixture = new Fixture();
var ciudades_fixture = fixture.Build<Ciudad>().CreateMany<Ciudad>();
var ciudades = new FakeDbSet<Ciudad>();
foreach (var item in ciudades_fixture)
{
    ciudades.Add(item);
}
var mockData = new Mock<IContext>();
fixture.Create<Mock<IContext>>();
mockData.Setup(r => r.Ciudades).Returns(ciudades);
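In case it helps with the AutoMoq reading, a rough sketch (untested, assuming the AutoFixture.AutoMoq glue package and the same types as above) could look like this:
var fixture = new Fixture().Customize(new AutoMoqCustomization());

var ciudades = new FakeDbSet<Ciudad>();
fixture.AddManyTo(ciudades.Local); // fill the fake set as discussed above

// Freeze the mock so the same instance is injected everywhere it is needed
var mockData = fixture.Freeze<Mock<IContext>>();
mockData.Setup(r => r.Ciudades).Returns(ciudades);

// AutoMoq supplies the controller's constructor dependencies (IContext -> mockData.Object)
var homeController = fixture.Create<CiudadController>();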

Related

How to make sure locks are released with EF Core and Postgres?

I have a console program that moves data between two different servers (DatabaseA and DatabaseB).
DatabaseB is a Postgres server.
It calls a lot of stored procedures and other raw queries.
I use ExecuteSqlRaw a lot.
I also use NpgsqlBulk.EfCore.
The program uses the same context instance for DatabaseB during the whole run it takes to finish.
Somehow I get locks on some of my tables in DatabaseB that never get released.
This always happens on my table mytable_fromdatabase_import.
The code that runs on it is the following:
protected override void AddIdsNew()
{
    var toAdd = IdsNotInDatabaseB();
    var newObjectsToAdd = GetByIds(toAdd).Select(Converter.ConvertAToB);
    DatabaseBContext.Database.ExecuteSqlRaw("truncate mytable_fromdatabase_import; ");
    var uploader = new NpgsqlBulkUploader(DatabaseBContext);
    uploader.Insert(newObjectsToAdd); // inserts data into mytable_fromdatabase_import
    DatabaseBContext.Database.ExecuteSqlRaw("call insert_myTable_from_importTable();");
}
After I run it, the whole table is not accessible anymore, and when I query the locks on the server I can see there is a process holding them.
How can I make sure this process always closes and releases its locks on the tables?
I thought EF Core would do that automatically.
Edit:
I just wanted to add that this is not a temporary problem during the run of the console program. When I run this code and it is finished, my table is still locked and nothing can access it. My understanding was that the EF Core context would release everything after it is disposed (whether due to an error or to finishing).
The problem turned out to have nothing to do with EF Core but with a wrongly configured backup script. The program is now running with no changes to it and works fine.
For a concrete task you need the right tools. You probably get locks when you retrieve the Ids and also when trying not to load already-imported records. These steps are slow!
I would suggest using linq2db (disclaimer: I'm a co-author of this library).
Create two projects with models from the different databases:
Source.Model.csproj - install linq2db.SqlServer
Destination.Model.csproj - install linq2db.PostgreSQL
Follow the instructions in the T4 templates on how to generate a model from the two databases. It is easy, and you can ask questions on linq2db's GitHub site.
I'll post the helper class which I used for transferring tables on a previous project. It additionally uses the CodeJam library for mapping, but in your project you can of course use AutoMapper instead.
public class DataImporter
{
    private readonly DataConnection _source;
    private readonly DataConnection _destination;

    public DataImporter(DataConnection source, DataConnection destination)
    {
        _source = source;
        _destination = destination;
    }

    private long ImportDataPrepared<TSource, TDest>(IOrderedQueryable<TSource> source, Expression<Func<TSource, TDest>> projection) where TDest : class
    {
        var destination = _destination.GetTable<TDest>();
        var tableName = destination.TableName;

        var sourceCount = source.Count();
        if (sourceCount == 0)
            return 0;

        var currentCount = destination.Count();
        if (currentCount > sourceCount)
            throw new Exception($"'{tableName}' what happened here?");
        if (currentCount >= sourceCount)
            return 0;

        IQueryable<TSource> sourceQuery = source;
        if (currentCount > 0)
            sourceQuery = sourceQuery.Skip(currentCount);

        var projected = sourceQuery.Select(projection);

        var copied =
            _destination.BulkCopy(
                new BulkCopyOptions
                {
                    BulkCopyType = BulkCopyType.MultipleRows,
                    RowsCopiedCallback = (obj) => RowsCopiedCallback(obj, currentCount, sourceCount, tableName)
                }, projected);

        return copied.RowsCopied;
    }

    private void RowsCopiedCallback(BulkCopyRowsCopied obj, int currentRows, int totalRows, string tableName)
    {
        var percent = (currentRows + obj.RowsCopied) / (double)totalRows * 100;
        Console.WriteLine($"Copied {percent:N2}% \tto {tableName}");
    }

    public class ImporterHelper<TSource>
    {
        private readonly DataImporter _importer;
        private readonly IOrderedQueryable<TSource> _sourceQuery;

        public ImporterHelper(DataImporter importer, IOrderedQueryable<TSource> sourceQuery)
        {
            _importer = importer;
            _sourceQuery = sourceQuery;
        }

        public long To<TDest>() where TDest : class
        {
            var mapperBuilder = new MapperBuilder<TSource, TDest>();
            return _importer.ImportDataPrepared(_sourceQuery, mapperBuilder.GetMapper().GetMapperExpressionEx());
        }

        public long To<TDest>(Expression<Func<TSource, TDest>> projection) where TDest : class
        {
            return _importer.ImportDataPrepared(_sourceQuery, projection);
        }
    }

    public ImporterHelper<TSource> ImportData<TSource>(IOrderedQueryable<TSource> source)
    {
        return new ImporterHelper<TSource>(this, source);
    }
}
So begin transferring. Note that I have used OrderBy/ThenBy to specify the Id order so as not to re-import already-transferred records - it is important that the order fields form a unique key combination. This makes the sample reentrant: it can be re-run when the connection is lost.
var sourceBuilder = new LinqToDbConnectionOptionsBuilder();
sourceBuilder.UseSqlServer(SourceConnectionString);
var destinationBuilder = new LinqToDbConnectionOptionsBuilder();
destinationBuilder.UsePostgreSQL(DestinationConnectionString);

using (var source = new DataConnection(sourceBuilder.Build()))
using (var destination = new DataConnection(destinationBuilder.Build()))
{
    var dataImporter = new DataImporter(source, destination);

    dataImporter.ImportData(source.GetTable<Source.Model.FirstTable>()
            .OrderBy(e => e.Id1)
            .ThenBy(e => e.Id2))
        .To<Dest.Model.FirstTable>();

    dataImporter.ImportData(source.GetTable<Source.Model.SecondTable>().OrderBy(e => e.Id))
        .To<Dest.Model.SecondTable>();
}
For sure, the boring OrderBy part can be generated automatically, but that would blow up this already not-short answer.
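For illustration only (a sketch, not part of the helper above), such generation could be done with expression trees, building the OrderBy/ThenBy chain from a list of key property names:
// requires System.Linq and System.Linq.Expressions
static IOrderedQueryable<T> OrderByKeys<T>(IQueryable<T> source, params string[] keyNames)
{
    IOrderedQueryable<T> ordered = null;
    foreach (var name in keyNames)
    {
        var parameter = Expression.Parameter(typeof(T), "e");
        var property = Expression.Property(parameter, name);
        var lambda = Expression.Lambda(property, parameter);
        // the first key uses OrderBy, subsequent keys use ThenBy
        var method = ordered == null ? "OrderBy" : "ThenBy";
        var call = Expression.Call(
            typeof(Queryable), method,
            new[] { typeof(T), property.Type },
            (ordered ?? source).Expression, Expression.Quote(lambda));
        ordered = (IOrderedQueryable<T>)source.Provider.CreateQuery<T>(call);
    }
    return ordered;
}

// usage: dataImporter.ImportData(OrderByKeys(source.GetTable<Source.Model.FirstTable>(), "Id1", "Id2"))
//            .To<Dest.Model.FirstTable>();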
Also play with BulkCopyOptions. Native Npgsql COPY may fail, in which case the multiple-rows variant should be used.

Entity Framework 6: is it possible to update a specific object property without getting the whole object?

I have an object with several really large string properties, plus a simple timestamp property.
What I am trying to achieve is to update only the timestamp property without loading the whole huge object.
Eventually, I would like to use EF to do, in the most performant way, something equivalent to this:
update [...]
set [...] = [...]
where [...]
Using the following, you can update a single column:
var yourEntity = new YourEntity() { Id = id, DateProp = dateTime };
using (var db = new MyEfContextName())
{
    db.YourEntities.Attach(yourEntity);
    db.Entry(yourEntity).Property(x => x.DateProp).IsModified = true;
    db.SaveChanges();
}
OK, I managed to handle this. The solution is the same as the one proposed by Seany84, with the only addition of disabling validation in order to overcome an issue with required fields. Basically, I had to add the following line just before SaveChanges():
db.Configuration.ValidateOnSaveEnabled = false;
So, the complete solution is:
var yourEntity = new YourEntity() { Id = id, DateProp = dateTime };
using (var db = new MyEfContextName())
{
    db.YourEntities.Attach(yourEntity);
    db.Entry(yourEntity).Property(x => x.DateProp).IsModified = true;
    db.Configuration.ValidateOnSaveEnabled = false;
    db.SaveChanges();
}

How do you increment a field in MongoDB using C#?

I thought this would be pretty straightforward, but my value remains the same (0).
What I'd like to do is increment my UnreadMessages field when the user receives a message they haven't read, and then decrement it when they have read it. So I thought code like this would work:
var userHelper = new MongoHelper<User>();
//increment
userHelper.Collection.Update(Query.EQ("Id", userId.ToId()), Update.Inc("UnreadMessages", 1));
//decrement
userHelper.Collection.Update(Query.EQ("Id", userId.ToId()), Update.Inc("UnreadMessages", -1));
After running these, no errors are thrown, but the value doesn't change either. And no, I'm not running one right after the other, as the code above could be interpreted :)
Update
Here's my helper class:
public class MongoHelper<T> : Sandbox.Services.IMongoHelper<T> where T : class
{
    public MongoCollection<T> Collection { get; private set; }

    public MongoHelper()
    {
        var con = new MongoConnectionStringBuilder(ConfigurationManager.ConnectionStrings["MongoDB"].ConnectionString);
        var server = MongoServer.Create(con);
        var db = server.GetDatabase(con.DatabaseName);
        Collection = db.GetCollection<T>(typeof(T).Name.ToLower());
    }
}
and thanks to Travis' answer I was able to pull this off:
MongoHelper<UserDocument> userHelper = new MongoHelper<UserDocument>();
var user = userHelper.Collection.FindAndModify(Query.EQ("Username", "a"), SortBy.Null, Update.Inc("MessageCount", 1), true).GetModifiedDocumentAs<UserDocument>();
Not sure what your helper does. Here is a working snippet I use:
var query = Query.And(Query.EQ("_id", keyName));
var sortBy = SortBy.Null;
var update = Update.Inc("KeyValue", adjustmentAmount);
var result = collection.FindAndModify(query, sortBy, update, true);
So, "query" finds the document, update does the increment, and FindAndModify puts them together and actually hits the database.

Using DynamicActivityProperty as OutArgument in ActivityBuilder

Greetings,
I'm trying to create a workflow using an ActivityBuilder and then get the XAML.
This flow uses a custom activity (WaitForInput) to handle bookmarks. This class inherits from NativeActivity.
I'm having a hard time finding a way to set the 'Result' property of my WaitForInput activity, which expects an OutArgument.
When creating this same workflow in the VS designer, I could bind the boolean property 'MyResult' of an InOutArgument called 'Wrapper', like this: [Wrapper.MyResult]
I would like to do this in code, and according to my research I have to use DynamicActivityProperty.
The problem is that I don't know how to use my DynamicActivityProperty as an OutArgument in this case.
This is a simplified version of the code:
var wrapper = new DynamicActivityProperty
{
    Name = "Wrapper",
    Type = typeof(InOutArgument<CommunicationWrapper>),
};

var activityBuilder = new ActivityBuilder();
activityBuilder.Properties.Add(wrapper);

var step1 = new FlowStep
{
    // here's my problem
    Action = new WaitForInput<bool> { BookmarkName = "step1", Result = ??? }
};

var flow = new Flowchart
{
    StartNode = step1,
    Nodes = { step1 }
};
I have found a solution to my own problem:
Result = new OutArgument<bool>(new VisualBasicReference<bool> { ExpressionText = "Wrapper.MyResult" });
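Putting that together with the FlowStep from the question, it ends up looking like this:
var step1 = new FlowStep
{
    Action = new WaitForInput<bool>
    {
        BookmarkName = "step1",
        Result = new OutArgument<bool>(
            new VisualBasicReference<bool> { ExpressionText = "Wrapper.MyResult" })
    }
};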

Changing schema name at runtime - Entity Framework

I need to change the storage schema of the entities at runtime.
I've followed a wonderful post, available here:
http://blogs.microsoft.co.il/blogs/idof/archive/2008/08/22/change-entity-framework-storage-db-schema-in-runtime.aspx?CommentPosted=true#commentmessage
This works perfectly, but only for queries, not for modifications.
Any idea why?
Well, I was looking for this piece of code all around the Internet, and in the end I had to write it myself. It's based on Brandon Haynes' adapter, but this function is all you need to change the schema at runtime - and you don't need to replace the autogenerated context constructors.
public static EntityConnection Create(string schema, string connString, string model)
{
    XmlReader[] conceptualReader = new XmlReader[]
    {
        XmlReader.Create(
            Assembly
                .GetExecutingAssembly()
                .GetManifestResourceStream(model + ".csdl"))
    };
    XmlReader[] mappingReader = new XmlReader[]
    {
        XmlReader.Create(
            Assembly
                .GetExecutingAssembly()
                .GetManifestResourceStream(model + ".msl"))
    };
    var storageReader = XmlReader.Create(
        Assembly
            .GetExecutingAssembly()
            .GetManifestResourceStream(model + ".ssdl"));

    XNamespace storageNS = "http://schemas.microsoft.com/ado/2009/02/edm/ssdl";
    var storageXml = XElement.Load(storageReader);

    foreach (var entitySet in storageXml.Descendants(storageNS + "EntitySet"))
    {
        var schemaAttribute = entitySet.Attributes("Schema").FirstOrDefault();
        if (schemaAttribute != null)
        {
            schemaAttribute.SetValue(schema);
        }
    }

    StoreItemCollection storageCollection =
        new StoreItemCollection(new XmlReader[] { storageXml.CreateReader() });
    EdmItemCollection conceptualCollection = new EdmItemCollection(conceptualReader);
    StorageMappingItemCollection mappingCollection =
        new StorageMappingItemCollection(conceptualCollection, storageCollection, mappingReader);

    var workspace = new MetadataWorkspace();
    workspace.RegisterItemCollection(conceptualCollection);
    workspace.RegisterItemCollection(storageCollection);
    workspace.RegisterItemCollection(mappingCollection);

    var connectionData = new EntityConnectionStringBuilder(connString);
    var connection = DbProviderFactories
        .GetFactory(connectionData.Provider)
        .CreateConnection();
    connection.ConnectionString = connectionData.ProviderConnectionString;

    return new EntityConnection(workspace, connection);
}
The resulting EntityConnection should be passed as a parameter when instantiating the context. You can modify the function so that all SSDL models are modified, not only the specified one.
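For example, a minimal usage sketch (MyEntities and MyModel are placeholder names here; the default designer-generated ObjectContext exposes a constructor accepting an EntityConnection):
// "MyEntities" is the entity connection string name from app.config (System.Configuration)
var connection = Create("new_schema",
    ConfigurationManager.ConnectionStrings["MyEntities"].ConnectionString,
    "MyModel");
using (var context = new MyEntities(connection))
{
    // query as usual; the generated SQL now targets "new_schema"
}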
I've managed to resolve this issue by using a brilliant library located on CodePlex (courtesy of Brandon Haynes), named "Entity Framework Runtime Model Adapter", available here:
http://efmodeladapter.codeplex.com/
I've tweaked it a bit to fit our needs, without having to replace the designer code at all.
So, I'm good.
Thanks anyway, and especially to Brandon - amazing job!
I need to import data from a Postgres database, which by default uses the schema "public". I use Entity Framework CTP 4 "Code First", which by default uses the schema "dbo". To change it at runtime I use:
public class PublicSchemaContext : DbContext
{
    protected override void OnModelCreating(System.Data.Entity.ModelConfiguration.ModelBuilder builder)
    {
        builder.Entity<series_categories>().MapSingleType().ToTable("[public].[series_categories]");
    }

    public DbSet<series_categories> series_categories { get; set; }
}
It works for selecting, inserting, updating and deleting data, so the following test passes:
[Test]
public void AccessToPublicSchema()
{
    // Select
    var db = new PublicSchemaContext();
    var list = db.series_categories.ToList();
    Assert.Greater(list.Count, 0);
    Assert.IsNotNull(list.First().series_category);

    // Delete
    foreach (var item in db.series_categories.Where(c => c.series_category == "Test"))
        db.series_categories.Remove(item);
    db.SaveChanges();

    // Insert
    db.series_categories.Add(new series_categories { series_category = "Test", series_metacategory_id = 1 });
    db.SaveChanges();

    // Update
    var test = db.series_categories.Single(c => c.series_category == "Test");
    test.series_category = "Test2";
    db.SaveChanges();

    // Delete
    foreach (var item in db.series_categories.Where(c => c.series_category == "Test2"))
        db.series_categories.Remove(item);
    db.SaveChanges();
}
Not an answer per se, but a follow-up on Jan Matousek's Create method (returning an EntityConnection) showing how to use it from a DbContext. Note: DB is the DbContext type passed to the generic repository.
public TxRepository(bool pUseTracking, string pServer, string pDatabase, string pSchema = "dbo")
{
    // make our own EF database connection string using server and database names
    string lConnectionString = BuildEFConnectionString(pServer, pDatabase);

    // do nothing special for dbo as that is the default
    if (pSchema == "dbo")
    {
        // supply dbcontext with our connection string
        mDbContext = Activator.CreateInstance(typeof(DB), lConnectionString) as DB;
    }
    else // change the schema in the edmx file before we use it!
    {
        // Create an EntityConnection and use that to create an ObjectContext,
        // then use that to create a DbContext with a different default schema from that specified for the edmx file.
        // This allows us to have parallel tables in the database that we can make available using either schema or synonym renames.
        var lEntityConnection = CreateEntityConnection(pSchema, lConnectionString, "TxData");

        // create regular ObjectContext
        ObjectContext lObjectContext = new ObjectContext(lEntityConnection);

        // create a DbContext from an existing ObjectContext
        mDbContext = Activator.CreateInstance(typeof(DB), lObjectContext, true) as DB;
    }

    // finish EF setup
    SetupAndOpen(pUseTracking);
}
I was able to convert Jan Matousek's solution to work in VB.NET 2013 with Entity Framework 6. I will also try to explain how to use the code in VB.NET.
We have a JD Edwards database which uses different schemas for each environment (TESTDTA, CRPDTA, PRODDTA). This makes switching between environments cumbersome, as you have to manually modify the .edmx file whenever you want to change environments.
The first step is to create a partial class that allows you to pass a value to the constructor of your entities; by default it uses the values from your config file.
Partial Public Class JDE_Entities
    Public Sub New(ByVal myObjectContext As ObjectContext)
        MyBase.New(myObjectContext, True)
    End Sub
End Class
Next, create the function that will modify your store schema (.ssdl) file in memory.
Public Function CreateObjectContext(ByVal schema As String, ByVal connString As String, ByVal model As String) As ObjectContext
    Dim myEntityConnection As EntityConnection = Nothing
    Try
        Dim conceptualReader As XmlReader = XmlReader.Create(Me.GetType().Assembly.GetManifestResourceStream(model + ".csdl"))
        Dim mappingReader As XmlReader = XmlReader.Create(Me.GetType().Assembly.GetManifestResourceStream(model + ".msl"))
        Dim storageReader As XmlReader = XmlReader.Create(Me.GetType().Assembly.GetManifestResourceStream(model + ".ssdl"))

        Dim storageNS As XNamespace = "http://schemas.microsoft.com/ado/2009/11/edm/ssdl"

        Dim storageXml = XDocument.Load(storageReader)
        Dim conceptualXml = XDocument.Load(conceptualReader)
        Dim mappingXml = XDocument.Load(mappingReader)

        For Each myItem As XElement In storageXml.Descendants(storageNS + "EntitySet")
            Dim schemaAttribute = myItem.Attributes("Schema").FirstOrDefault
            If schemaAttribute IsNot Nothing Then
                schemaAttribute.SetValue(schema)
            End If
        Next

        storageXml.Save("storage.ssdl")
        conceptualXml.Save("storage.csdl")
        mappingXml.Save("storage.msl")

        Dim storageCollection As StoreItemCollection = New StoreItemCollection("storage.ssdl")
        Dim conceptualCollection As EdmItemCollection = New EdmItemCollection("storage.csdl")
        Dim mappingCollection As StorageMappingItemCollection = New StorageMappingItemCollection(conceptualCollection, storageCollection, "storage.msl")

        Dim workspace = New MetadataWorkspace()
        workspace.RegisterItemCollection(conceptualCollection)
        workspace.RegisterItemCollection(storageCollection)
        workspace.RegisterItemCollection(mappingCollection)

        Dim connectionData = New EntityConnectionStringBuilder(connString)
        Dim connection = DbProviderFactories.GetFactory(connectionData.Provider).CreateConnection()
        connection.ConnectionString = connectionData.ProviderConnectionString

        myEntityConnection = New EntityConnection(workspace, connection)

        Return New ObjectContext(myEntityConnection)
    Catch ex As Exception
    End Try
End Function
Make sure that the hardcoded storageNS namespace value matches the one used by your model; you can check this by debugging the code and examining the storageXml variable to see what was actually loaded.
Now you can pass a new schema name and different database connection info at runtime when you create your entities. No more manual .edmx changes required.
Using Context As New JDE_Entities(CreateObjectContext("NewSchemaNameHere", ConnectionString_EntityFramework("ServerName", "DatabaseName", "UserName", "Password"), "JDE_Model"))
    Dim myWO = From a In Context.F4801 Where a.WADOCO = 400100
    If myWO IsNot Nothing Then
        For Each r In myWO
            Me.Label1.Text = r.WADL01
        Next
    End If
End Using
These were the .NET namespaces imported:
Imports System.Data.Entity.Core.EntityClient
Imports System.Xml
Imports System.Data.Common
Imports System.Data.Entity.Core.Metadata.Edm
Imports System.Reflection
Imports System.Data.Entity.Core.Mapping
Imports System.Data.Entity.Core.Objects
Imports System.Data.Linq
Imports System.Xml.Linq
Hope that helps anyone out there with the same issues.
I had a lot of problems getting this to work when using EF6 with an OData data service, so I had to find an alternative solution. In my case, I didn't really need to do it on the fly; I could get away with changing the schema when deploying to some test environments, and in the installer.
Use Mono.Cecil to rewrite the embedded .ssdl resources straight in the DLLs. This worked just fine in my case.
Here is a simplified example of how you can do this:
var filename = "/path/to/some.dll";
var replacement = "Schema=\"new_schema\"";

var module = ModuleDefinition.ReadModule(filename);
// materialise the list first, since the loop replaces items in module.Resources
var ssdlResources = module.Resources.Where(x => x.Name.EndsWith(".ssdl")).ToList();

foreach (var resource in ssdlResources)
{
    var item = (EmbeddedResource)resource;
    string rewritten;
    using (var reader = new StreamReader(item.GetResourceStream()))
    {
        var text = reader.ReadToEnd();
        rewritten = Regex.Replace(text, "Schema=\"old_schema\"", replacement);
    }
    var bytes = Encoding.UTF8.GetBytes(rewritten);
    var newResource = new EmbeddedResource(item.Name, item.Attributes, bytes);
    module.Resources.Remove(item);
    module.Resources.Add(newResource);
}

// finally persist the modified assembly (depending on your Mono.Cecil version you may need to
// read the module with ReaderParameters { ReadWrite = true } to overwrite the file in place)
module.Write(filename);