Trouble Calling Stored Procedure from BackgroundWorker - entity-framework

I'm in ASP.NET MVC and am (mostly) using Entity Framework. I want to call a stored procedure without waiting for it to finish. My current approach is to use a background worker. Trouble is, it works fine without using the background worker, but fails to execute with it.
In the DoWork event handler when I call
command.ExecuteNonQuery();
it just "disappears" (never gets to next line in debug mode).
Anyone have tips on calling a sproc asynchronously? BTW, it'll be SQL Azure in production if that matters; for now SQL Server 2008.
public void ExecAsyncUpdateMemberScoreRecalc(MemberScoreRecalcInstruction instruction)
{
    var bw = new BackgroundWorker();
    bw.DoWork += new DoWorkEventHandler(AsyncUpdateMemberScoreRecalc_DoWork);
    bw.WorkerReportsProgress = false;
    bw.WorkerSupportsCancellation = false;
    bw.RunWorkerAsync(instruction);
}
private void AsyncUpdateMemberScoreRecalc_DoWork(object sender, DoWorkEventArgs e)
{
    var instruction = (MemberScoreRecalcInstruction)e.Argument;
    string connectionString = string.Empty;
    using (var sprocEntities = new DSAsyncSprocEntities()) // getting the connection string
    {
        connectionString = sprocEntities.Connection.ConnectionString;
    }
    using (var connection = new EntityConnection(connectionString))
    {
        connection.Open();
        EntityCommand command = connection.CreateCommand();
        command.CommandText = DSConstants.Sproc_MemberScoreRecalc;
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue(DSConstants.Sproc_MemberScoreRecalc_Param_SageUserId, instruction.SageUserId);
        command.Parameters.AddWithValue(DSConstants.Sproc_MemberScoreRecalc_Param_EventType, instruction.EventType);
        command.Parameters.AddWithValue(DSConstants.Sproc_MemberScoreRecalc_Param_EventCode, instruction.EventCode);
        command.Parameters.AddWithValue(DSConstants.Sproc_MemberScoreRecalc_Param_EventParamId, instruction.EventParamId);
        int result = 0;
        // NEVER RETURNS FROM RUNNING NEXT LINE (and never executes)... yet it works if I do the same thing directly in the main thread.
        result = command.ExecuteNonQuery();
    }
}

Add a try/catch around the call and see whether an exception is being thrown and silently aborting the thread.
try
{
    result = command.ExecuteNonQuery();
}
catch (Exception ex)
{
    // Log the error and, if it can't be handled here, rethrow
    throw;
}
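If the goal is simply fire-and-forget, one alternative worth trying is to bypass EntityConnection in the worker and run a plain SqlCommand on a thread-pool thread, logging any exception inside the callback. A minimal sketch, assuming a plain ADO.NET connection string called sqlConnectionString is available (note that sprocEntities.Connection.ConnectionString above is an *entity* connection string and would not work with SqlConnection); also keep in mind that in ASP.NET any background work can be lost when the app domain recycles:
public void ExecAsyncUpdateMemberScoreRecalc(MemberScoreRecalcInstruction instruction)
{
    // sqlConnectionString is an assumed plain SQL Server connection string,
    // e.g. read from config -- not the EF entity connection string.
    ThreadPool.QueueUserWorkItem(_ =>
    {
        try
        {
            using (var connection = new SqlConnection(sqlConnectionString))
            using (var command = new SqlCommand(DSConstants.Sproc_MemberScoreRecalc, connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.Parameters.AddWithValue(DSConstants.Sproc_MemberScoreRecalc_Param_SageUserId, instruction.SageUserId);
                // ... remaining parameters as in the original code ...
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
        catch (Exception ex)
        {
            // Log(ex); swallowing here is deliberate for fire-and-forget --
            // an unhandled exception on a pool thread would otherwise be invisible.
        }
    });
}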

Related

Cannot attach database file when using Entity Framework Core Migration commands

I am using EntityFramework Core commands to migrate the database. The command I am using is the one the docs suggest: dnx . ef migration apply. The problem is that when specifying AttachDbFileName in the connection string, the following error appears: Unable to Attach database file as database xxxxxxx. This is the connection string I am using:
Data Source=(LocalDB)\mssqllocaldb;Integrated Security=True;Initial Catalog=EfGetStarted2;AttachDbFileName=D:\EfGetStarted2.mdf
Please help: how can I attach the db file from another location?
Thanks
EF Core seems to have trouble with AttachDbFileName, or doesn't handle it at all.
EnsureDeleted changes the database name to master but keeps any AttachDbFileName value, which leads to an error, since we cannot attach the master database to another file.
EnsureCreated opens a connection using the provided AttachDbFileName value, which leads to an error, since the file of the database we want to create does not exist yet.
EF6 has some logic to handle these use cases (see SqlProviderServices.DbCreateDatabase), so everything worked quite well there.
As a workaround I wrote some hacky code to handle these scenarios:
public static void EnsureDatabase(this DbContext context, bool reset = false)
{
    if (context == null)
        throw new ArgumentNullException(nameof(context));
    if (reset)
    {
        try
        {
            context.Database.EnsureDeleted();
        }
        catch (SqlException ex) when (ex.Number == 1801)
        {
            // HACK: EF doesn't interpret error 1801 as an already existing database
            ExecuteStatement(context, BuildDropStatement);
        }
        catch (SqlException ex) when (ex.Number == 1832)
        {
            // nothing to do here (see below)
        }
    }
    try
    {
        context.Database.EnsureCreated();
    }
    catch (SqlException ex) when (ex.Number == 1832)
    {
        // HACK: EF doesn't interpret error 1832 as a non-existing database
        ExecuteStatement(context, BuildCreateStatement);
        // this takes some time (?)
        WaitDatabaseCreated(context);
        // re-ensure create for tables and stuff
        context.Database.EnsureCreated();
    }
}

private static void WaitDatabaseCreated(DbContext context)
{
    var timeout = DateTime.UtcNow + TimeSpan.FromMinutes(1);
    while (true)
    {
        try
        {
            context.Database.OpenConnection();
            context.Database.CloseConnection();
        }
        catch (SqlException)
        {
            if (DateTime.UtcNow > timeout)
                throw;
            continue;
        }
        break;
    }
}

private static void ExecuteStatement(DbContext context, Func<SqlConnectionStringBuilder, string> statement)
{
    var builder = new SqlConnectionStringBuilder(context.Database.GetDbConnection().ConnectionString);
    using (var connection = new SqlConnection($"Data Source={builder.DataSource}"))
    {
        connection.Open();
        using (var command = connection.CreateCommand())
        {
            command.CommandText = statement(builder);
            command.ExecuteNonQuery();
        }
    }
}

private static string BuildDropStatement(SqlConnectionStringBuilder builder)
{
    var database = builder.InitialCatalog;
    return $"drop database [{database}]";
}

private static string BuildCreateStatement(SqlConnectionStringBuilder builder)
{
    var database = builder.InitialCatalog;
    var datafile = builder.AttachDBFilename;
    var dataname = Path.GetFileNameWithoutExtension(datafile);
    var logfile = Path.ChangeExtension(datafile, ".ldf");
    var logname = dataname + "_log";
    return $"create database [{database}] on primary (name = '{dataname}', filename = '{datafile}') log on (name = '{logname}', filename = '{logfile}')";
}
It's far from nice, but I'm using it for integration testing anyway. For "real world" scenarios using EF migrations should be the way to go, but maybe the root cause of this issue is the same...
Update
The next version will include support for AttachDBFilename.
There may be a different *.mdf file already attached to a database named EfGetStarted2... Try dropping/detaching that database and then try again.
You might also be running into problems if the user that LocalDB runs as doesn't have the correct permissions on the path.

ExecuteReader requires an open connection

I am getting the error "ExecuteReader requires an open connection", and I know the fix is to add a connection.Open() / connection.Close() pair. My question about this error is more about understanding exactly what happens under the hood.
I am using a using statement, which I expected to open and close/dispose the connection for me. So I don't understand why it didn't work as expected and I needed to explicitly call connection.Open() / connection.Close() myself to fix the issue. I did some research and found people who experienced a similar issue because they were using a static connection. In my case, I am creating a new instance of the connection... hence it bothers me, and I'm hoping to get to the bottom of this instead of just fixing it and moving on. Thank you in advance.
Here is the code:
try
{
    using (SqlConnection connection = new SqlConnection(myConnStr))
    using (SqlCommand command = new SqlCommand("mySPname", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        // add some parameters
        SqlParameter retParam = command.Parameters.Add("@RetVal", SqlDbType.VarChar);
        retParam.Direction = ParameterDirection.ReturnValue;
        /////////////////////////////////////////////////
        // fix - add this line of code: connection.Open();
        /////////////////////////////////////////////////
        using (SqlDataReader dr = command.ExecuteReader())
        {
            int success = (int)retParam.Value;
            // manually close the connection here if it was manually opened: connection.Close();
            return Convert.ToBoolean(success);
        }
    }
}
catch (Exception ex)
{
    throw;
}
A using block does not open any connections; it only guarantees that Dispose is called when the block exits.
For the SqlConnection, you have to explicitly open it inside the using block; you just don't need to close it yourself.
I also notice that you are missing a set of braces {} around the SqlConnection using (stacked using statements are legal, though, so the essential fix is the Open call). It should be like this:
try
{
    using (SqlConnection connection = new SqlConnection(myConnStr))
    {
        connection.Open();
        using (SqlCommand command = new SqlCommand("InsertProcessedPnLFile", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            // add some parameters
            SqlParameter retParam = command.Parameters.Add("@RetVal", SqlDbType.VarChar);
            retParam.Direction = ParameterDirection.ReturnValue;
            using (SqlDataReader dr = command.ExecuteReader())
            {
                int success = (int)retParam.Value;
                return Convert.ToBoolean(success);
            }
        }
    }
}
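One more subtlety in the code above: ADO.NET does not populate return-value and output parameters until the SqlDataReader is closed, so reading retParam.Value inside the reader's using block can yield a default value. A minimal sketch of the safer ordering:
int success;
using (SqlDataReader dr = command.ExecuteReader())
{
    // consume the result set(s) here
}
// the reader is now disposed, so the return value has been populated
success = (int)retParam.Value;
return Convert.ToBoolean(success);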

Cloud Service for incoming TCP connections hangs

I'm developing a cloud service (worker role) for collecting data from a number of instruments. These instruments report data randomly, every minute or so. The service itself is not performance critical and doesn't need to be asynchronous. The instruments are able to resend their data for up to an hour after a failed connection attempt.
I have tried several implementations for my cloud service, including this one:
http://msdn.microsoft.com/en-us/library/system.net.sockets.tcplistener.stop(v=vs.110).aspx
But all of them hang my cloud service sooner or later (sometimes within an hour).
I suspect something is wrong with my code. I have a lot of logging in my code but I get no errors. The service just stops receiving incoming connections.
In the Azure portal it seems like the service is running fine: no error logs, no suspicious CPU usage, etc.
If I restart the service it will run fine again, until it hangs the next time.
I would be most grateful if someone could help me with this.
public class WorkerRole : RoleEntryPoint
{
    private LoggingService _loggingService;

    public override void Run()
    {
        _loggingService = new LoggingService();
        StartListeningForIncommingTCPConnections();
    }

    private void StartListeningForIncommingTCPConnections()
    {
        TcpListener listener = null;
        try
        {
            listener = new TcpListener(RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["WatchMeEndpoint"].IPEndpoint);
            listener.Start();
            while (true)
            {
                _loggingService.Log(SeverityLevel.Info, "Waiting for connection...");
                var client = listener.AcceptTcpClient();
                var remoteEndPoint = client.Client != null ? client.Client.RemoteEndPoint.ToString() : "Unknown";
                _loggingService.Log(SeverityLevel.Info, String.Format("Connected to {0}", remoteEndPoint));
                var netStream = client.GetStream();
                var data = String.Empty;
                using (var reader = new StreamReader(netStream, Encoding.ASCII))
                {
                    data = reader.ReadToEnd();
                }
                _loggingService.Log(SeverityLevel.Info, "Received data: " + data);
                ProcessData(data); // data is processed and stored in the database (all resources are released when done)
                client.Close();
                _loggingService.Log(SeverityLevel.Info, String.Format("Connection closed for {0}", remoteEndPoint));
            }
        }
        catch (Exception exception)
        {
            _loggingService.Log(SeverityLevel.Error, exception.Message);
        }
        finally
        {
            if (listener != null)
                listener.Stop();
        }
    }

    private void ProcessData(String data)
    {
        try
        {
            var processor = new Processor();
            var lines = data.Split('\n');
            foreach (var line in lines)
                processor.ProcessLine(line);
            processor.ProcessMessage();
        }
        catch (Exception ex)
        {
            _loggingService.Log(SeverityLevel.Error, ex.Message);
            throw new Exception(ex.InnerException.Message);
        }
    }
}
One strange observation I just made:
I checked the log recently and no instrument had connected for the last 30 minutes (which indicates that the service is down).
I connected to the service via a TCP client I've written myself and uploaded some test data.
This worked fine.
When I checked the log again, my test data had been stored.
The strange thing is that 4 other instruments had connected at about the same time and sent their data successfully.
Why couldn't they connect by themselves before I connected with my test client?
Also, what does the idleTimeoutInMinutes setting for an InputEndpoint in the .csdef do?
===============================================
Edit:
For a couple of days my cloud service had been running successfully.
Unfortunately, this morning the last log entry was from this line:
_loggingService.Log(SeverityLevel.Info, String.Format("Connected to {0}", remoteEndPoint));
No other connections could be made after this, not even from my own test TCP client (I didn't get any error, but no data was stored and no new log entries appeared).
This makes me think that following code causes the service to hang:
var netStream = client.GetStream();
var data = String.Empty;
using (var reader = new StreamReader(netStream, Encoding.ASCII))
{
    data = reader.ReadToEnd();
}
I've read somewhere that StreamReader's ReadToEnd() can hang. Is this possible?
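(For context: ReadToEnd only returns once the remote side closes its half of the connection, and by default NetworkStream.Read blocks with no timeout at all. One hedged way to bound the read is to set ReadTimeout on the stream, which makes Read throw an IOException instead of blocking forever; the 30-second value below is an arbitrary assumption:)
var netStream = client.GetStream();
// Bound every blocking read on this stream; without this, Read waits
// indefinitely for a client that never sends or never closes.
netStream.ReadTimeout = 30000; // milliseconds, arbitrary choice

string data;
using (var reader = new StreamReader(netStream, Encoding.ASCII))
{
    data = reader.ReadToEnd(); // now fails fast with an IOException on timeout
}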
I have now changed this piece of code to this:
int i;
var bytes = new Byte[256];
var data = new StringBuilder();
const int dataLimit = 10;
var dataCount = 0;
while ((i = netStream.Read(bytes, 0, bytes.Length)) != 0)
{
    data.Append(Encoding.ASCII.GetString(bytes, 0, i));
    if (dataCount >= dataLimit)
    {
        _loggingService.Log(SeverityLevel.Error, "Reached data limit");
        break;
    }
    dataCount++;
}
Another explanation could be something hanging in the database. I use the SqlConnection and SqlCommand classes to read from and write to my database, and I always close my connection afterwards (in a finally block).
SqlConnection and SqlCommand should have default timeouts, right?
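(They do: SqlConnection.ConnectionTimeout defaults to 15 seconds and SqlCommand.CommandTimeout to 30 seconds, so an indefinite hang is more likely on the network read than on the database side. A quick way to verify on your own setup:)
using (var connection = new SqlConnection(connectionString))
using (var command = connection.CreateCommand())
{
    Console.WriteLine(connection.ConnectionTimeout); // 15 (seconds) by default
    Console.WriteLine(command.CommandTimeout);       // 30 (seconds) by default
}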
===============================================
Edit:
After some more debugging I found out that when the service wasn't responding, it "hung" on this line of code:
while ((i = netStream.Read(bytes, 0, bytes.Length)) != 0)
After some digging I found out that the NetworkStream class and its read methods can actually hang, even though MS declares otherwise:
NetworkStream read hangs
I've now changed my code into this:
Thread thread = null;
var streamReadSucceeded = false;
var task = Task.Factory.StartNew(() =>
{
    thread = Thread.CurrentThread;
    while ((i = netStream.Read(bytes, 0, bytes.Length)) != 0)
    {
        // Translate the data bytes to an ASCII string.
        data.Append(Encoding.ASCII.GetString(bytes, 0, i));
    }
    streamReadSucceeded = true;
});
task.Wait(5000);
if (streamReadSucceeded)
{
    // Process data
}
else
{
    thread.Abort();
}
Hopefully this will stop the hanging.
I'd say that part of your problem is that you are processing your data on the same thread that listens for connections from clients. This prevents new clients from connecting whenever another client has started a long-running operation of some kind. I'd suggest you defer your processing to worker threads, thus freeing the "listener" thread to accept new connections, as in the code below.
Another problem you could be experiencing: if your service throws an unhandled exception, it will stop accepting connections as well.
private static void ListenForClients()
{
    tcpListener.Start();
    while (true)
    {
        TcpClient client = tcpListener.AcceptTcpClient();
        Thread clientThread = new Thread(new ParameterizedThreadStart(HandleClientComm));
        clientThread.Start(client);
    }
}

private static void HandleClientComm(object obj)
{
    try
    {
        using (TcpClient tcpClient = (TcpClient)obj)
        {
            Console.WriteLine("Got Client...");
            using (NetworkStream clientStream = tcpClient.GetStream())
            using (StreamWriter writer = new StreamWriter(clientStream))
            using (StreamReader reader = new StreamReader(clientStream))
            {
                //do stuff
            }
        }
    }
    catch (Exception ex)
    {
        // at minimum, log the exception here; swallowing it silently hides failures
    }
}

Bulk inserts with EntityFramework 4.0 causes abort of transaction

We receive a file from a client (Silverlight) via WCF, and on the server side I parse this file. Each line in the file is transformed into an object and stored in the database. If the file is very large (10,000 entries or more), I get the following error (MSSQLEXPRESS):
The transaction associated with the current connection has completed but has not been disposed. The transaction must be disposed before the connection can be used to execute SQL statements.
I have tried a lot (setting the TransactionOptions timeout and so on), but nothing works. The exception above is raised sometimes after 3,000 and sometimes after 6,000 processed objects, but I never succeed in processing all of them.
I append my source, hopefully somebody got an idea and can help me:
public xxxResponse SendLogFile(xxxRequest request)
{
    const int INTERMEDIATE_SAVE = 100;
    using (var context = new EntityFramework.Models.Cubes_ServicesEntities())
    {
        // start a new transaction scope with a timeout of 0 (unlimited time, for development purposes)
        using (var transactionScope = new TransactionScope(TransactionScopeOption.RequiresNew,
            new TransactionOptions
            {
                IsolationLevel = System.Transactions.IsolationLevel.Serializable,
                Timeout = TimeSpan.FromSeconds(0)
            }))
        {
            try
            {
                // open the connection manually to prevent an undesired close of the DB
                // (MSDTC)
                context.Connection.Open();
                int timeout = context.Connection.ConnectionTimeout;
                int Counter = 0;
                // read the file submitted by the client
                using (var reader = new StreamReader(new MemoryStream(request.LogFile)))
                {
                    try
                    {
                        while (!reader.EndOfStream)
                        {
                            Counter++;
                            string line = reader.ReadLine();
                            if (String.IsNullOrEmpty(line)) continue;
                            // create a new object
                            DomainModel.LogEntry le = CreateLogEntryObject(line);
                            // attach it to the context and set its state to Added
                            context.AttachTo("LogEntry", le);
                            context.ObjectStateManager.ChangeObjectState(le, EntityState.Added);
                            // while fewer than 100 objects are attached, go on
                            if (Counter != INTERMEDIATE_SAVE) continue;
                            // after 100 objects, make a call to SaveChanges
                            context.SaveChanges(SaveOptions.None);
                            Counter = 0;
                        }
                    }
                    catch (Exception exception)
                    {
                        // cleanup
                        reader.Close();
                        transactionScope.Dispose();
                        throw exception;
                    }
                }
                // do a final SaveChanges
                context.SaveChanges();
                transactionScope.Complete();
                context.Connection.Close();
            }
            catch (Exception e)
            {
                // cleanup
                transactionScope.Dispose();
                context.Connection.Close();
                throw e;
            }
        }
        var response = CreateSuccessResponse<ServiceSendLogEntryFileResponse>("SendLogEntryFile successful!");
        return response;
    }
}
There is no bulk insert in Entity Framework. You call SaveChanges after 100 records, but it will execute 100 separate INSERTs, with a database round trip for each one.
Setting the timeout of the transaction is also subject to the maximum transaction timeout configured at machine level (I think the default value is 10 minutes). How long does it take before your operation fails?
The best thing you can do is rewrite your insert logic with plain ADO.NET or with a bulk insert.
Btw. throw exception and throw e? That is the incorrect way to rethrow exceptions.
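For the bulk-insert route, SqlBulkCopy is the usual ADO.NET tool. A minimal sketch, assuming the parsed log lines can be loaded into a DataTable whose columns match the target table (the table and column names here are made up):
// Build a DataTable shaped like the target table (names are hypothetical).
var table = new DataTable();
table.Columns.Add("Timestamp", typeof(DateTime));
table.Columns.Add("Message", typeof(string));
// ... add one DataRow per parsed log line ...

using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "dbo.LogEntry"; // hypothetical table name
    bulkCopy.BatchSize = 1000;
    bulkCopy.WriteToServer(table); // streams rows in batches, no per-row round trip
}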
Important edit:
SaveChanges(SaveOptions.None) means "do not accept changes after saving", so all records remain in the Added state. Because of that, the first call to SaveChanges inserts the first 100 records; the second call inserts the first 100 again plus the next 100; the third call inserts the first 200 again plus the next 100, and so on.
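So the fix, in a minimal sketch against the question's loop, is to accept the changes as part of each intermediate save (SaveOptions.AcceptAllChangesAfterSave is the EF 4 ObjectContext flag for that):
if (Counter == INTERMEDIATE_SAVE)
{
    // Save AND accept changes, so already-inserted entities leave the
    // Added state and are not re-inserted by the next SaveChanges call.
    context.SaveChanges(SaveOptions.AcceptAllChangesAfterSave);
    Counter = 0;
}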
I had exactly the same issue. I wrote EF code to bulk insert 1,000 records at a time.
It worked from the beginning, apart from a little problem with MSDTC, which I configured to allow remote clients and admins; after that it was OK. I did a lot of work with this, but one day it JUST STOPPED WORKING.
I am getting
The transaction associated with the current connection has completed but has not been disposed. The transaction must be disposed before the connection can be used to execute SQL statements.
VERY WEIRD! Sometimes the error changes. My suspicion is MSDTC somehow; strange behavior.
I am now changing the code not to use TransactionScope!
I hate it when something works and then just stops. I also tried to run this in a VM, another enormous waste of time...
My code:
private void AddTicks(FileHelperTick[] fhTicks)
{
    List<ForexEF.Entities.Tick> Ticks = new List<ForexEF.Entities.Tick>();
    var str = LeTicks(ref fhTicks, ref Ticks);
    using (TransactionScope scope = new TransactionScope(TransactionScopeOption.Required, new TransactionOptions()
    {
        IsolationLevel = System.Transactions.IsolationLevel.Serializable,
        Timeout = TimeSpan.FromSeconds(180)
    }))
    {
        ForexEF.EUR_TICKSContext contexto = null;
        try
        {
            contexto = new ForexEF.EUR_TICKSContext();
            contexto.Configuration.AutoDetectChangesEnabled = false;
            int count = 0;
            foreach (var tick in Ticks)
            {
                count++;
                contexto = AddToContext(contexto, tick, count, 1000, true);
            }
            contexto.SaveChanges();
        }
        finally
        {
            if (contexto != null)
                contexto.Dispose();
        }
        scope.Complete();
    }
}

private ForexEF.EUR_TICKSContext AddToContext(ForexEF.EUR_TICKSContext contexto, ForexEF.Entities.Tick tick, int count, int commitCount, bool recreateContext)
{
    contexto.Set<ForexEF.Entities.Tick>().Add(tick);
    if (count % commitCount == 0)
    {
        contexto.SaveChanges();
        if (recreateContext)
        {
            contexto.Dispose();
            contexto = new ForexEF.EUR_TICKSContext();
            contexto.Configuration.AutoDetectChangesEnabled = false;
        }
    }
    return contexto;
}
It times out due to the TransactionScope default maximum timeout; check machine.config for that.
Check out this link:
http://social.msdn.microsoft.com/Forums/en-US/windowstransactionsprogramming/thread/584b8e81-f375-4c76-8cf0-a5310455a394/
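The machine-wide ceiling can be inspected from code; any TransactionScope Timeout above it is silently clamped, and raising it means editing the system.transactions machineSettings maxTimeout entry in machine.config. A quick check (a sketch, assuming .NET Framework with System.Transactions referenced):
using System;
using System.Transactions;

class CheckMaxTimeout
{
    static void Main()
    {
        // Default is 00:10:00 unless machine.config overrides it.
        Console.WriteLine(TransactionManager.MaximumTimeout);
    }
}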

ADO.NET - Bad Practice?

I was reading an article on MSDN several months ago and have recently started using the following snippet to execute ADO.NET code, but I get the feeling it could be bad. Am I overreacting, or is it perfectly acceptable?
private void Execute(Action<SqlConnection> action)
{
    SqlConnection conn = null;
    try
    {
        conn = new SqlConnection(ConnectionString);
        conn.Open();
        action.Invoke(conn);
    }
    finally
    {
        if (conn != null && conn.State == ConnectionState.Open)
        {
            try
            {
                conn.Close();
            }
            catch
            {
            }
        }
    }
}
public SomeThing GetSomethingById(int id)
{
    SomeThing aSomething = null;
    Execute(conn =>
    {
        using (SqlCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText = ....
            ...
            SqlDataReader reader = cmd.ExecuteReader();
            ...
            aSomething = new SomeThing(Convert.ToString(reader["aDbField"]));
        }
    });
    return aSomething;
}
What is the point of doing that when you can do this?
public SomeThing GetSomethingById(int id)
{
    using (var con = new SqlConnection(ConnectionString))
    {
        con.Open();
        using (var cmd = con.CreateCommand())
        {
            // prepare command
            using (var rdr = cmd.ExecuteReader())
            {
                // read fields
                return new SomeThing(data);
            }
        }
    }
}
You can promote code reuse by doing something like this.
public static void ExecuteToReader(string connectionString, string commandText, IEnumerable<KeyValuePair<string, object>> parameters, Action<IDataReader> action)
{
    using (var con = new SqlConnection(connectionString))
    {
        con.Open();
        using (var cmd = con.CreateCommand())
        {
            cmd.CommandText = commandText;
            foreach (var pair in parameters)
            {
                var parameter = cmd.CreateParameter();
                parameter.ParameterName = pair.Key;
                parameter.Value = pair.Value;
                cmd.Parameters.Add(parameter);
            }
            using (var rdr = cmd.ExecuteReader())
            {
                action(rdr);
            }
        }
    }
}
You could use it like this:
// At the top, create an alias
using DbParams = Dictionary<string, object>;

ExecuteToReader(
    connectionString,
    commandText,
    new DbParams() { { "key1", 1 }, { "key2", 2 } },
    reader =>
    {
        // ...
        // No need to dispose
    }
);
IMHO it is indeed a bad practice, since you're creating and opening a new database connection for every statement that you execute.
Why is it bad:
Performance-wise (although connection pooling helps reduce the hit): you should open your connection once, execute the statements that have to be executed, and close the connection when you don't know when the next statement will be executed.
But certainly context-wise. I mean: how will you handle transactions? Where are your transaction boundaries? Your application layer knows when a transaction has to be started and committed, but with this way of working you're unable to span multiple statements in the same SQL transaction (see the sketch below).
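To illustrate: with a connection the caller controls, several commands can share one SqlTransaction, which the Execute(Action&lt;SqlConnection&gt;) wrapper as written cannot express. A minimal sketch (the table and column names are made up):
using (var connection = new SqlConnection(ConnectionString))
{
    connection.Open();
    using (SqlTransaction transaction = connection.BeginTransaction())
    {
        // Both statements commit or roll back together: the application
        // layer owns the transaction boundary, not the helper method.
        using (var debit = new SqlCommand(
            "UPDATE Accounts SET Balance = Balance - @amt WHERE Id = @from",
            connection, transaction))
        {
            debit.Parameters.AddWithValue("@amt", 100m);
            debit.Parameters.AddWithValue("@from", 1);
            debit.ExecuteNonQuery();
        }
        using (var credit = new SqlCommand(
            "UPDATE Accounts SET Balance = Balance + @amt WHERE Id = @to",
            connection, transaction))
        {
            credit.Parameters.AddWithValue("@amt", 100m);
            credit.Parameters.AddWithValue("@to", 2);
            credit.ExecuteNonQuery();
        }
        transaction.Commit();
    }
}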
This is a very reasonable approach to use.
By wrapping your connection logic in a method that takes an Action<SqlConnection>, you're helping prevent duplicated code and the potential for introduced errors. Since we can now use lambdas, this becomes an easy, safe way to handle this situation.
That's acceptable. I created a SqlUtilities class two years ago that had a similar method. You can take it one step further if you like.
EDIT: I couldn't find the code, but I typed up a small example (probably with many syntax errors ;))
SQLUtilities
public delegate T CreateMethod<T>(SqlDataReader reader);
public static T CreateEntity<T>(string query, CreateMethod<T> createMethod, params SqlParameter[] parameters)
{
    // Open the connection and build the command (ConnectionString: assumed field);
    // the using blocks take care of the finally/closing logic.
    using (var connection = new SqlConnection(ConnectionString))
    using (var cmd = new SqlCommand(query, connection))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddRange(parameters);
        connection.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            return createMethod(reader);
        }
    }
}
Calling code
private SomeThing Create(SqlDataReader reader)
{
    SomeThing something = new SomeThing();
    something.ID = Convert.ToInt32(reader["ID"]);
    ...
    return something;
}

public SomeThing GetSomeThingByID(int id)
{
    return SqlUtilities.CreateEntity<SomeThing>("something_getbyid", Create, ....);
}
Of course you could use a lambda expression instead of the Create method, and you could easily make a CreateCollection method and reuse the existing Create method.
However, if this is a new project, check out LINQ to Entities. It is far easier and more flexible than ADO.NET.
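For comparison, a hedged sketch of the same lookup in LINQ to Entities; the context and entity set names here are made up for illustration:
using (var context = new MyEntities()) // hypothetical context type
{
    // Translated to a parameterized SELECT; connection handling is implicit.
    SomeThing something = context.SomeThings.FirstOrDefault(s => s.ID == id);
    return something;
}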
Well, in my opinion, evaluate an approach before going through with it. Something that works isn't necessarily good programming practice. Find a concrete example of the benefit before adopting it. But if you are considering this for big projects, it would be better to use a framework like NHibernate, because a lot of projects and even frameworks have been developed on top of it, like http://www.cuyahoga-project.org/.