WPF : Update View Model when DB Changes (Concurrency Issue) - entity-framework

I'm currently working on a WPF app that is architected as follows:
MVVM
Entity Framework 4 (with LINQ).
A WCF service that polls the database (Oracle) to get data.
I make my WCF calls in my view model class and put the data into ObservableCollections.
DB changes are made by another app.
So my app does not perform any writes on the DB whatsoever (zero); it only reads data and displays it in the UI.
How can I make my app respond quickly to DB changes? I read about the following solutions, but I'm confused and don't know which to use:
Polling the DB every n seconds with a DispatcherTimer (seems too heavy, because the data changes every millisecond).
SqlDependency; I searched all over the internet but didn't find a proper implementation with EF.
As I said, the DB changes every millisecond (financial data from other sources).
How can I resolve this?
Thank you.

I tried the code below and it seems to be working okay for the moment (though I still have some doubts about that infinite loop); let me know what you think:
public class MyViewModel
{
    BackgroundWorker _bgWorker;

    // some props
    // some funcs

    protected internal MyViewModel(Session session)
    {
        Session = session;
        RefreshData();
    }

    protected void RefreshData()
    {
        try
        {
            _bgWorker = new BackgroundWorker
            {
                WorkerReportsProgress = true,
                WorkerSupportsCancellation = true
            };
            _bgWorker.DoWork += bgDoWork;
            if (!_bgWorker.IsBusy)
            {
                _bgWorker.RunWorkerAsync();
            }
        }
        catch (Exception)
        {
            _bgWorker.CancelAsync();
        }
    }

    private void bgDoWork(object sender, DoWorkEventArgs e)
    {
        var worker = (BackgroundWorker)sender;
        while (!worker.CancellationPending)
        {
            // Thread.Sleep(1000); should I keep this or not?
            Proxy(); // WCF calls
        }
    }
}
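One thing to watch with the loop above: without the Thread.Sleep, the while loop will call the WCF service as fast as it can, pegging a CPU core and flooding the service, so some delay is worth keeping; even a few hundred milliseconds is far faster than the UI can usefully repaint. Also, an ObservableCollection bound to the UI may only be mutated on the dispatcher thread. Here is a minimal sketch of what the handler could look like, assuming Proxy() returns the latest data snapshot and Quotes is your bound collection (both names are placeholders):

private void bgDoWork(object sender, DoWorkEventArgs e)
{
    var worker = (BackgroundWorker)sender;
    while (!worker.CancellationPending)
    {
        var snapshot = Proxy(); // WCF call stays on the background thread

        // Marshal the update onto the UI thread before touching the collection.
        Application.Current.Dispatcher.Invoke(new Action(() =>
        {
            Quotes.Clear();
            foreach (var item in snapshot)
            {
                Quotes.Add(item);
            }
        }));

        Thread.Sleep(250); // throttle; tune to how fresh the UI really needs to be
    }
}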

Related

How to seed images in the database asp.net core 2.1

I have the following database schema and I would like to seed data into the database, but I can't see how to seed the images on the first pass, or what should go in the table entity.
I need help working out where I need to make changes.
Thanks.
Since you referenced Core, here's the easiest way (in Program.Main):
try
{
    var host = BuildWebHost(args);
    using (var scope = host.Services.CreateScope())
    {
        var services = scope.ServiceProvider;
        try
        {
            var context = services.GetRequiredService<myContext>();
            DbInitializer.Seed(context);
        }
        catch
        {
            throw;
        }
    }
    host.Run();
}
catch
{
    throw;
}
and create a class called DbInitializer with a method Seed that takes an EF context. I think you can take it from there.
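For the images specifically, a rough sketch of what such a class might look like (the myContext name comes from the snippet above; the Products DbSet, Product entity, and file paths are made-up placeholders, and you'll need System.IO and System.Linq):

public static class DbInitializer
{
    public static void Seed(myContext context)
    {
        context.Database.EnsureCreated();

        if (context.Products.Any())
        {
            return; // DB has already been seeded
        }

        context.Products.Add(new Product
        {
            Name = "Sample product",
            // Either store the image bytes in a byte[] column...
            Image = File.ReadAllBytes(Path.Combine("SeedImages", "sample.jpg"))
            // ...or store a relative path/URL string and keep the file under wwwroot.
        });

        context.SaveChanges();
    }
}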
(And don't post images of code; post the code itself, using Ctrl+K to format code blocks.)

EF Core 2.1 In memory DB not updating records

I'm using the in-memory database provider for integration tests; however, I don't seem to be able to update a record. I've run the same code against a real SQL database and everything updates fine. Here is my test fixture code.
Test Fixture:
public class TestFixture<TStartup> : IDisposable
{
    private readonly TestServer _testServer;

    public HttpClient TestClient { get; }

    public IDatabaseService DbContext
    {
        get { return _testServer.Host.Services.GetService<DatabaseService>(); }
    }

    public TestFixture() : this(Path.Combine("src")) { }

    protected TestFixture(string relativeTargetProjectPatentDir)
    {
        Environment.SetEnvironmentVariable("ASPNETCORE_ENVIRONMENT", "Testing");

        var builder = new WebHostBuilder()
            .ConfigureServices(services =>
            {
                services.AddDbContext<DatabaseService>(options =>
                    options.UseInMemoryDatabase("TestDB")
                           .EnableSensitiveDataLogging());
            })
            .UseEnvironment("Testing")
            .UseStartup<Startup>();

        _testServer = new TestServer(builder)
        {
            BaseAddress = new Uri("http://localhost:5010")
        };

        TestClient = _testServer.CreateClient();
        TestClient.BaseAddress = _testServer.BaseAddress;
    }

    public void Dispose()
    {
        TestClient.Dispose();
        _testServer.Dispose();
    }
}
I've spent most of the day googling this and haven't come across anyone else talking about it, so I'm assuming it's probably my issue rather than an EF bug. I'm sure someone would have noticed a DB that you can't update.
Updating works with a singleton, but I have a CQRS architecture, and to check whether an entry was updated in an e2e test I have to reload the entry:
Context.Entry(entity).Reload();
I hope this helps someone.
It turned out that changing the lifetime of the DbContext in my test fixture to a singleton solved my issue.
It may be that the DbContext is being used in the wrong way. I had the same problem: I used the DbContext the same way you do, simply returning the instance from .Host.Services.GetService<TContext>(). The problem with this approach is that the DbContext never releases the entities it tracks, so either you set the entity's State to EntityState.Detached to force the DbContext to reload it, or you use scopes:
using (var scope = _testServer.Host.Services.GetRequiredService<IServiceScopeFactory>().CreateScope())
{
    var dbContext = scope.ServiceProvider.GetRequiredService<DatabaseService>();
    // make any operations on dbContext only in scope
}
Adding to Chris's answer. Here is an example of what I had vs. what fixed the issue:
services.AddDbContext<TestDbContext>(options =>
{
    options.UseInMemoryDatabase("TestDb");
});

to

var options = new DbContextOptionsBuilder<TestDbContext>()
    .UseInMemoryDatabase(databaseName: "TestDb")
    .Options;
services.AddSingleton(x => new TestDbContext(options));
Using the NoTracking query behavior, as below, may also work:
services.AddDbContext<TestDbContext>(
    a => a.UseInMemoryDatabase(databaseName: "TestDb")
          .UseQueryTrackingBehavior(QueryTrackingBehavior.NoTracking),
    ServiceLifetime.Singleton);
Also, how are you updating the record? This seems to be tracked properly by EF Core InMemory:
_dbContext.Entry(modifyItem).State = EntityState.Modified;
However, this doesn't seem to work as well:
_dbContext.Entry(existingItem).CurrentValues.SetValues(modifyItem);

Nested DbContext due to method calls - Entity Framework

In the following case where two DbContexts are nested due to method calls:
public void Method_A() {
    using (var db = new SomeDbContext()) {
        //...do some work here
        Method_B();
        //...do some more work here
    }
}

public void Method_B() {
    using (var db = new SomeDbContext()) {
        //...do some work
    }
}
Question:
Will this nesting cause any issues? (and will the correct DbContext be disposed at the correct time?)
Is this nesting considered bad practice? Should Method_A be refactored into:
public void Method_A() {
    using (var db = new SomeDbContext()) {
        //...do some work here
    }
    Method_B();
    using (var db = new SomeDbContext()) {
        //...do some more work here
    }
}
Thanks.
Your DbContext derived class is actually managing at least three things for you here:
the metadata that describes your database and your entity model,
the underlying database connection, and
a client side "cache" of entities loaded using the context, for change tracking, relationship fixup, etc. (Note that although I term this a "cache" for want of a better word, it is generally short-lived and exists just to support EF's functionality. It's not a substitute for proper caching in your application, if applicable.)
Entity Framework generally caches the metadata (item 1) so that it is shared by all context instances (or, at least, all instances that use the same connection string). So here that gives you no cause for concern.
As mentioned in other comments, your code results in using two database connections. This may or may not be a problem for you.
You also end up with two client caches (item 3). If you happen to load an entity from the outer context, then again from the inner context, you will have two copies of it in memory. This would definitely be confusing, and could lead to subtle bugs. This means that, if you don't want to use shared context objects, then your option 2 would probably be better than option 1.
If you are using transactions, there are further considerations. Having multiple database connections is likely to result in transactions being promoted to distributed transactions, which is probably not what you want. Since you didn't make any mention of db transactions, I won't go into this further here.
So, where does this leave you?
If you are using this pattern simply to avoid passing DbContext objects around in your code, then you would probably be better off refactoring MethodB to receive the context as a parameter. The question of how long-lived context objects should be comes up repeatedly. As a rule of thumb, create a new context for a single database operation or for a series of related database operations. (See, for example this blog post and this question.)
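A sketch of that refactoring against the code in the question:

public void Method_A()
{
    using (var db = new SomeDbContext())
    {
        //...do some work here
        Method_B(db); // reuse the caller's context (and its connection)
        //...do some more work here
    }
}

public void Method_B(SomeDbContext db)
{
    //...do some work with the passed-in context;
    // do not dispose it here, the creating method owns it
}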
(As an alternative, you could add a constructor to your DbContext derived class that receives an existing connection. Then you could share the same connection between multiple contexts.)
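DbContext has had a constructor overload taking an existing connection since EF 4.1; a rough sketch (connectionString is assumed to be defined elsewhere, and contextOwnsConnection: false leaves disposing the connection to the caller — note that sharing one open connection between contexts is only officially supported from EF6 onward):

public class SomeDbContext : DbContext
{
    public SomeDbContext(DbConnection existingConnection, bool contextOwnsConnection)
        : base(existingConnection, contextOwnsConnection)
    {
    }
}

// Both contexts now share one physical connection:
using (var connection = new SqlConnection(connectionString))
using (var outer = new SomeDbContext(connection, contextOwnsConnection: false))
using (var inner = new SomeDbContext(connection, contextOwnsConnection: false))
{
    // ...
}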
One useful pattern is to write your own class that creates a context object and stores it as a private field or property. Then you make your class implement IDisposable and its Dispose() method disposes the context object. Your calling code news up an instance of your class, and doesn't have to worry about contexts or connections at all.
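A bare-bones sketch of that pattern (the repository name, Customer entity, and Customers set are illustrative):

public class CustomerRepository : IDisposable
{
    private readonly SomeDbContext _db = new SomeDbContext();

    public Customer GetById(int id)
    {
        return _db.Customers.Find(id);
    }

    public void Dispose()
    {
        _db.Dispose(); // the wrapper owns the context's lifetime
    }
}

// Calling code never touches the context or connection directly:
using (var repo = new CustomerRepository())
{
    var customer = repo.GetById(42);
}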
When might you need to have multiple contexts active at the same time?
This can be useful when you need to write code that is multi-threaded. A database connection is not thread-safe, so you must only ever access a connection (and therefore an EF context) from one thread at a time. If that is too restrictive, you need multiple connections (and contexts), one per thread. You might find this interesting.
You can alter your code to pass the context to Method_B. If you do so, creating the second SomeDbContext will not be necessary.
There is a question and answer on Stack Overflow covering this:
Proper use of "Using" statement for datacontext
This is a bit of a late answer, but people may still be looking, so here is another way. Create a class that takes care of disposal for you. In some scenarios a function is called from several places in the solution; this way you avoid creating multiple instances of the DbContext, and you can nest calls as deeply as you like.
A simple example:
public class SomeContext : SomeDbContext
{
    // How many nested GetContext calls are currently sharing this instance.
    protected int UsingCount = 0;

    public static SomeContext GetContext(SomeContext context)
    {
        if (context != null)
        {
            context.UsingCount++;
        }
        else
        {
            context = new SomeContext();
        }
        return context;
    }

    private SomeContext()
    {
    }

    protected bool MyDisposing = true;

    // Only the outermost using block actually disposes the context;
    // inner blocks just decrement the counter.
    protected override void Dispose(bool disposing)
    {
        if (UsingCount == 0)
        {
            base.Dispose(MyDisposing);
            MyDisposing = false;
        }
        else
        {
            UsingCount--;
        }
    }

    // Likewise, only the outermost scope actually saves.
    public override int SaveChanges()
    {
        if (UsingCount == 0)
        {
            return base.SaveChanges();
        }
        else
        {
            return 0;
        }
    }
}
Example of usage
public class ExampleNesting
{
    public void MethodA()
    {
        using (var context = SomeContext.GetContext(null))
        {
            // manipulate, save it, just do not call Dispose on context in using
            MethodB(context);
        }
        MethodB();
    }

    public void MethodB(SomeContext someContext = null)
    {
        using (var context = SomeContext.GetContext(someContext))
        {
            // manipulate, save it, just do not call Dispose on context in using
            // Even more nested functions if you'd like
        }
    }
}
Simple and easy to use.
If you consider the number of connections to the database, and the cost of opening new connections, not to be an important problem, and you have no requirement for your application to run at peak performance, everything is OK. Your code works well, because creating a DbContext has a low performance impact: the metadata is cached after the first load, and a connection to your database is only opened when the code needs to execute a query. As a small performance and design consideration, I suggest you build a context factory so that you have just one instance of each DbContext per instance of your application.
You can take a look at this link for more performance considerations:
http://msdn.microsoft.com/en-us/data/hh949853

Entity Framework and Entity Tracker Problems

If I run the following code it throws the following error:
An entity object cannot be referenced by multiple instances of IEntityChangeTracker
public void Save(Category category)
{
    using (var db = new NorthwindContext())
    {
        if (category.CategoryID == 0)
        {
            db.AddToCategorySet(category);
        }
        else
        {
            //category.RemoveTracker();
            db.Attach(category);
        }
        db.SaveChanges();
    }
}
The reason, of course, is that the category passed in from the interface came from the GetById method, which already attached an EntityChangeTracker to the category object. I also tried setting the entity tracker to null, but that did not update the category object.
protected void Btn_Update_Category_Click(object sender, EventArgs e)
{
    _categoryRepository = new CategoryRepository();
    int categoryId = Int32.Parse(txtCategoryId.Text);
    var category = _categoryRepository.GetById(categoryId);
    category.CategoryName = txtUpdateCategoryName.Text;
    _categoryRepository.Save(category);
}
I'm still learning Entity Framework myself, but maybe I can help a little. When working with the Entity Framework, you need to be aware of how you're handling different contexts. It looks like you're trying to localize your context as much as possible by saying:
public void Save(Category category)
{
    using (var db = new NorthwindContext())
    {
        ...
    }
}
... within your data access method. Did you do the same thing in your GetById method? If so, did you remember to detach the object you got back so that it could be attached later in a different context?
public Category GetById(int categoryId)
{
    using (var db = new NorthwindContext())
    {
        Category category = (from c in db.Category where c.CategoryID == categoryId select c).First();
        db.Detach(category);
        return category;
    }
}
That way when you call Attach it isn't trying to step on an already-attached context. Does that help?
As you pointed out in your comment, this poses a problem when you're trying to modify an item and then tell your database layer to save it, because once an item is detached from its context, it no longer keeps track of the changes that were made to it. There are a few ways I can think of to get around this problem, none of them perfect.
If your architecture supports it, you could expand the scope of your context enough that your Save method could use the same context that your GetById method uses. This helps to avoid the whole attach/detach problem entirely, but it might push your data layer a little closer to your business logic than you would like.
You can load a new instance of the item out of the new context based on its ID, set all of its properties based on the category that is passed in, and then save it. This costs two database round-trips for what should really only need one, and it isn't very maintainable.
You can dig into the context itself to mark the Category's properties as changed.
For example:
public void Save(Category category)
{
    using (var db = new NorthwindContext())
    {
        db.Attach(category);
        var stateEntry = db.ObjectStateManager.GetObjectStateEntry(category);
        foreach (var propertyName in stateEntry.CurrentValues.DataRecordInfo.FieldMetadata.Select(fm => fm.FieldType.Name))
        {
            stateEntry.SetModifiedProperty(propertyName);
        }
        db.SaveChanges();
    }
}
This looks a little uglier, but should be more performant and maintainable overall. Plus, if you want, you could make it generic enough to throw into an extension method somewhere so you don't have to see or repeat the ugly code, but you still get the functionality out of it.
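For instance, the loop could be pulled out into something like the sketch below (illustrative only, assuming NorthwindContext derives from ObjectContext; it reuses exactly the calls from the Save method above):

public static class ObjectContextExtensions
{
    // Marks every scalar property of an attached entity as modified,
    // so the next SaveChanges issues a full UPDATE for it.
    public static void MarkAllPropertiesModified(this ObjectContext context, object entity)
    {
        var stateEntry = context.ObjectStateManager.GetObjectStateEntry(entity);
        foreach (var propertyName in stateEntry.CurrentValues
            .DataRecordInfo.FieldMetadata
            .Select(fm => fm.FieldType.Name))
        {
            stateEntry.SetModifiedProperty(propertyName);
        }
    }
}

With that in place, the body of Save shrinks to db.Attach(category); db.MarkAllPropertiesModified(category); db.SaveChanges();.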

Is this safe? - NUnit base class opens and rollsback a TransactionScope

I was thinking it would be nice to create a base class for NUnit test fixtures that opens a TransactionScope during the SetUp phase, then rolls back the transaction during tear down.
Something like this:
public abstract class TestFixtureBase
{
    private TransactionScope _transaction;

    [TestFixtureSetUp]
    public void TestFixtureSetup()
    {
        _transaction = new TransactionScope();
    }

    [TestFixtureTearDown]
    public void TestFixtureTearDown()
    {
        if (_transaction != null)
        {
            _transaction.Dispose();
        }
    }
}
Do you think this is a good idea?
Obviously the database is just a test database, not a live database, but it would still be annoying if it filled up with junk data from the unit tests.
What do other people do when running unit tests that involve a lot of data access?
You want to be careful here. TransactionScope is going to promote the transaction to a distributed transaction if you open up more than one connection to the database. I find that it is easier just to write some simple SQL that clears out the tables of interest to my test class before I start running the test.
EDIT: Normally I would call any test that touches the database an integration test since it involves another system. Typically, I will mock out the database when unit testing my code.
[SetUp]
public void Setup()
{
    foreach (string table in new string[] { "table1", "table2" })
    {
        ClearTable(table);
    }
}

private void ClearTable(string table)
{
    // ...standard stuff to set up the connection...
    SqlCommand command = connection.CreateCommand();
    command.CommandText = "delete from " + table;
    command.ExecuteNonQuery();
    // ...stuff to clean up the connection...
}
I've used XtUnit.
It automatically rolls back at the end of a unit test; you can simply add a [Rollback] attribute to the test. It's an extension to NUnit or MbUnit.
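Usage, as far as I recall, is just a matter of decorating the test; the repository and entity below are placeholders:

[Test]
[Rollback]
public void Save_InsertsCustomer()
{
    // Everything in here runs inside a transaction that XtUnit
    // rolls back automatically when the test completes.
    _repository.Save(new Customer { Name = "Temp" });
}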