should I set T-SQL command timeout back to default? - tsql

I hope this question is not too stupid. I have a long-running T-SQL command in my ADO.NET code, and I would like to increase the command timeout (please see below).
cmd.CommandTimeout = 600; // default is 30 sec. increase to 10 mins
try
{
cmd.ExecuteNonQuery();
cmd.CommandTimeout = 30; // set it back
}
catch (Exception ex)
{
string debug = ex.Message;
throw ex;
}
Should I set the timeout back to the default after the long process is done? I am looking for the best practice. Thank you :-)

If you re-use the command object, you can add a finally block to the try/catch so the timeout is reset regardless of how the query ends (success or exception). But if you dispose of the SqlCommand after you've used it, there's no need to reset the CommandTimeout at all.
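For example, a minimal sketch of the try/finally variant (only relevant if you really do reuse the same SqlCommand instance):
cmd.CommandTimeout = 600; // 10 minutes for the long-running statement
try
{
    cmd.ExecuteNonQuery();
}
finally
{
    cmd.CommandTimeout = 30; // reset even if the query throws
}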

The best practice is not to reuse the command, because you should not be reusing the connection either: holding a connection open is not good for scale.
static private string dbConnectionString = "CONNECTION DETAILS";
using (SqlConnection connection = new SqlConnection(dbConnectionString))
{
try
{
connection.Open();
using (SqlCommand command = connection.CreateCommand())
{
command.CommandTimeout = 600;
// ... set CommandText, parameters, etc.
using (SqlDataReader rdr = command.ExecuteReader())
{
}
}
}
catch (SqlException ex)
{
}
finally
{
connection.Close();
}
}

Related

ExecuteReader requires an open connection

I am getting the error "ExecuteReader requires an open connection", and I know the fix is to add a connection.Open() / connection.Close(). My question about this error is more about understanding exactly what happens under the hood.
I am currently using the using statement, which I expected to open and close/dispose the connection for me. So I guess I don't understand why it didn't work as expected and why I needed to explicitly code the connection.Open() / connection.Close() myself to fix the issue. I did some research and found that people experienced a similar issue because they were using a static connection. In my case, I am creating a new instance of the connection... hence it bothers me, and I'm hoping to get to the bottom of this instead of just fixing it and moving on. Thank you in advance.
Here is the code:
try
{
using (SqlConnection connection = new SqlConnection(myConnStr))
using (SqlCommand command = new SqlCommand("mySPname", connection))
{
command.CommandType = CommandType.StoredProcedure;
//add some parameters
SqlParameter retParam = command.Parameters.Add("@RetVal", SqlDbType.VarChar);
retParam.Direction = ParameterDirection.ReturnValue;
/////////////////////////////////////////////////
// fix - add this line of code: connection.Open();
/////////////////////////////////////////////////
using(SqlDataReader dr = command.ExecuteReader())
{
int success = (int)retParam.Value;
// close the connection manually here if you opened it manually: connection.Close();
return Convert.ToBoolean(success);
}
}
}
catch (Exception ex)
{
throw;
}
A using block does not open any connection; it only guarantees that the object is disposed at the end of the block.
For the SqlConnection, you have to open it explicitly inside the using block; you just don't need to close it, though.
I also notice that you are missing a set of brackets {} around the using SqlConnection. Maybe that's the issue? It should be like this:
try
{
using (SqlConnection connection = new SqlConnection(myConnStr))
{
connection.Open();
using (SqlCommand command = new SqlCommand("InsertProcessedPnLFile", connection))
{
command.CommandType = CommandType.StoredProcedure;
//add some parameters
SqlParameter retParam = command.Parameters.Add("@RetVal", SqlDbType.VarChar);
retParam.Direction = ParameterDirection.ReturnValue;
// (connection.Open() is already called above, right after the SqlConnection is created)
using(SqlDataReader dr = command.ExecuteReader())
{
int success = (int)retParam.Value;
// no explicit connection.Close() needed; the using block disposes the connection
return Convert.ToBoolean(success);
}
}
}
}
catch (Exception ex)
{
throw;
}

ADO.NET and Disposing without Using

I have a project that isn't using using anywhere in its ADO.NET code. I am cleaning up its unclosed connections. Is the code below a best practice with try/catch/finally? I also have some code that contains a SqlTransaction, which I dispose between the command dispose and the connection dispose.
SqlConnection con = new SqlConnection(ConfigurationManager.AppSettings["MyNGConnectDashBoardConnectionString"].ToString());
SqlCommand cmd = new SqlCommand();
DataSet ds = new DataSet();
try
{
con.Open();
cmd.Connection = con;
SqlDataAdapter da = new SqlDataAdapter(cmd);
da.Fill(ds);
}
catch (Exception ex)
{
throw ex;
}
finally
{
cmd.Dispose();
con.Dispose();
}
Actually, there is no need to worry about closing the connection when using the SqlDataAdapter.Fill(dataset) method; Fill closes the connection itself after it has run.
Also, there is no need to call SqlCommand.Dispose(), since the command itself has no unmanaged resources to clean up. What you should be concerned about is whether SqlConnection.Close() is called at some point, and that is done as part of Fill.
What you have is fine. It is always a good idea to dispose of objects that use unmanaged resources. However, if you get tired of always explicitly calling Dispose, the best practice is probably to use the using statement:
using (SqlConnection con = new SqlConnection(ConfigurationManager.AppSettings["MyNGConnectDashBoardConnectionString"].ToString()))
{
using (SqlCommand cmd = new SqlCommand())
{
DataSet ds = new DataSet();
try
{
con.Open();
cmd.Connection = con;
SqlDataAdapter da = new SqlDataAdapter(cmd);
da.Fill(ds);
}
catch (Exception ex)
{
throw; // I changed this too!
}
}
}
Also, you almost always want a plain throw if you're going to "rethrow" an exception. You lose the original stack trace if you throw ex;.
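A minimal illustration (DoWork and Log are placeholder names, not from the code above):
try
{
    DoWork();
}
catch (Exception ex)
{
    Log(ex);
    // throw ex;  // would reset the stack trace to this point
    throw;        // rethrows and preserves the original stack trace
}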
The best practice is using using instead of try/finally :)
However in your case even using is not needed, because Fill() closes the connection:
DataSet ds = new DataSet();
SqlConnection con = new SqlConnection(ConfigurationManager.AppSettings["MyNGConnectDashBoardConnectionString"].ToString());
SqlDataAdapter da = new SqlDataAdapter("your sql is here", con);
da.Fill(ds);
Also, simply rethrowing exceptions like that makes no sense at all. If you need to log the error, just use a plain throw;, as @Cory suggested.

Bulk inserts with EntityFramework 4.0 causes abort of transaction

We are receiving a file from a client (Silverlight) via WCF, and on the server side I parse this file. Each line in the file is transformed into an object and stored in the database. If the file is very large (10,000 entries or more), I get the following error (MSSQLEXPRESS):
The transaction associated with the current connection has completed but has not been disposed. The transaction must be disposed before the connection can be used to execute SQL statements.
I have tried a lot (setting the TransactionOptions timeout and so on), but nothing works. The exception above is raised sometimes after 3,000 and sometimes after 6,000 objects processed, but I never succeed in processing all of them.
I append my source; hopefully somebody has an idea and can help me:
public xxxResponse SendLogFile(xxxRequest request)
{
const int INTERMEDIATE_SAVE = 100;
using (var context = new EntityFramework.Models.Cubes_ServicesEntities())
{
// start a new transactionscope with the timeout of 0 (unlimited time for developing purposes)
using (var transactionScope = new TransactionScope(TransactionScopeOption.RequiresNew,
new TransactionOptions
{
IsolationLevel = System.Transactions.IsolationLevel.Serializable,
Timeout = TimeSpan.FromSeconds(0)
}))
{
try
{
// open the connection manually to prevent undesired close of DB
// (MSDTC)
context.Connection.Open();
int timeout = context.Connection.ConnectionTimeout;
int Counter = 0;
// read the file submitted from client
using (var reader = new StreamReader(new MemoryStream(request.LogFile)))
{
try
{
while (!reader.EndOfStream)
{
Counter++;
string line = reader.ReadLine();
if (String.IsNullOrEmpty(line)) continue;
// Create a new object
DomainModel.LogEntry le = CreateLogEntryObject(line);
// an attach it to the context, set its state to added.
context.AttachTo("LogEntry", le);
context.ObjectStateManager.ChangeObjectState(le, EntityState.Added);
// while not 100 objects were attached, go on
if (Counter != INTERMEDIATE_SAVE) continue;
// after 100 objects, make a call to SaveChanges.
context.SaveChanges(SaveOptions.None);
Counter = 0;
}
}
catch (Exception exception)
{
// cleanup
reader.Close();
transactionScope.Dispose();
throw exception;
}
}
// do a final SaveChanges
context.SaveChanges();
transactionScope.Complete();
context.Connection.Close();
}
catch (Exception e)
{
// cleanup
transactionScope.Dispose();
context.Connection.Close();
throw e;
}
}
var response = CreateSuccessResponse<ServiceSendLogEntryFileResponse>("SendLogEntryFile successful!");
return response;
}
}
There is no bulk insert in Entity Framework. You call SaveChanges after every 100 records, but it still executes 100 separate inserts, with a database round trip for each one.
The timeout you set on the transaction is also capped by the maximum transaction timeout, which is configured at machine level (I think the default value is 10 minutes). How long does it take before your operation fails?
The best thing you can do is rewrite your insert logic with plain ADO.NET or with a bulk insert.
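For illustration, a rough SqlBulkCopy sketch (the table name, column names, and LogEntry properties below are made-up placeholders, not taken from the question):
// Build a DataTable from the parsed lines and push it to the server in one bulk operation.
var table = new DataTable();
table.Columns.Add("Timestamp", typeof(DateTime));
table.Columns.Add("Message", typeof(string));
foreach (string line in lines) // 'lines' stands in for the parsed file contents
{
    DomainModel.LogEntry le = CreateLogEntryObject(line);
    table.Rows.Add(le.Timestamp, le.Message); // assumed properties
}
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "dbo.LogEntry";
    bulkCopy.BatchSize = 1000;
    bulkCopy.WriteToServer(table);
}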
Btw, throw exception and throw e? That is an incorrect way to rethrow exceptions.
Important edit:
SaveChanges(SaveOptions.None) !!! means do not accept changes after saving, so all records are still in the Added state. Because of that, the first call to SaveChanges inserts the first 100 records, the second call inserts the first 100 again plus the next 100, the third call inserts the first 200 plus the next 100, and so on.
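A minimal sketch of the corresponding fix, keeping the batching loop as it is:
// Either accept the changes as part of each intermediate save...
context.SaveChanges(SaveOptions.AcceptAllChangesAfterSave);
// ...or keep SaveOptions.None but accept the changes explicitly afterwards,
// so already-saved entities leave the Added state:
context.SaveChanges(SaveOptions.None);
context.AcceptAllChanges();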
I had exactly the same issue. I wrote EF code to bulk insert 1000 records at a time.
It was working from the beginning, apart from a little problem with MSDTC that I configured to allow remote clients and admin, but after that it was OK. I did a lot of work with this, but one day it JUST STOPPED WORKING.
I am getting
The transaction associated with the current connection has completed but has not been disposed. The transaction must be disposed before the connection can be used to execute SQL statements.
VERY WEIRD! Sometimes the error changes. My suspicion is that it is MSDTC somehow; strange behavior.
I am now changing the code to not use TransactionScope!
I hate it when something works and then just stops. I also tried to run this in a VM, another enormous waste of time...
My code:
private void AddTicks(FileHelperTick[] fhTicks)
{
List<ForexEF.Entities.Tick> Ticks = new List<ForexEF.Entities.Tick>();
var str = LeTicks(ref fhTicks, ref Ticks);
using (TransactionScope scope = new TransactionScope(TransactionScopeOption.Required, new TransactionOptions()
{
IsolationLevel = System.Transactions.IsolationLevel.Serializable,
Timeout = TimeSpan.FromSeconds(180)
}))
{
ForexEF.EUR_TICKSContext contexto = null;
try
{
contexto = new ForexEF.EUR_TICKSContext();
contexto.Configuration.AutoDetectChangesEnabled = false;
int count = 0;
foreach (var tick in Ticks)
{
count++;
contexto = AddToContext(contexto, tick, count, 1000, true);
}
contexto.SaveChanges();
}
finally
{
if (contexto != null)
contexto.Dispose();
}
scope.Complete();
}
}
private ForexEF.EUR_TICKSContext AddToContext(ForexEF.EUR_TICKSContext contexto, ForexEF.Entities.Tick tick, int count, int commitCount, bool recreateContext)
{
contexto.Set<ForexEF.Entities.Tick>().Add(tick);
if (count % commitCount == 0)
{
contexto.SaveChanges();
if (recreateContext)
{
contexto.Dispose();
contexto = new ForexEF.EUR_TICKSContext();
contexto.Configuration.AutoDetectChangesEnabled = false;
}
}
return contexto;
}
It times out due to the TransactionScope default maximum timeout; check the machine.config for that.
Check out this link:
http://social.msdn.microsoft.com/Forums/en-US/windowstransactionsprogramming/thread/584b8e81-f375-4c76-8cf0-a5310455a394/
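If you want to see the effective limits from code, a small sketch (TransactionManager lives in System.Transactions):
// Prints the machine-wide maximum that silently caps any TransactionScope timeout,
// and the default used when no timeout is specified.
Console.WriteLine(TransactionManager.MaximumTimeout);
Console.WriteLine(TransactionManager.DefaultTimeout);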

Trouble Calling Stored Procedure from BackgroundWorker

I'm in ASP.NET MVC and am (mostly) using Entity Framework. I want to call a stored procedure without waiting for it to finish. My current approach is to use a background worker. Trouble is, it works fine without using the background worker, but fails to execute with it.
In the DoWork event handler when I call
command.ExecuteNonQuery();
it just "disappears" (it never gets to the next line in debug mode).
Anyone have tips on calling a sproc asynchronously? BTW, it'll be SQL Azure in production if that matters; for now SQL Server 2008.
public void ExecAsyncUpdateMemberScoreRecalc(MemberScoreRecalcInstruction instruction)
{
var bw = new BackgroundWorker();
bw.DoWork += new DoWorkEventHandler(AsyncUpdateMemberScoreRecalc_DoWork);
bw.WorkerReportsProgress = false;
bw.WorkerSupportsCancellation = false;
bw.RunWorkerAsync(instruction);
}
private void AsyncUpdateMemberScoreRecalc_DoWork(object sender, DoWorkEventArgs e)
{
var instruction = (MemberScoreRecalcInstruction)e.Argument;
string connectionString = string.Empty;
using (var sprocEntities = new DSAsyncSprocEntities()) // getting the connection string
{
connectionString = sprocEntities.Connection.ConnectionString;
}
using (var connection = new EntityConnection(connectionString))
{
connection.Open();
EntityCommand command = connection.CreateCommand();
command.CommandText = DSConstants.Sproc_MemberScoreRecalc;
command.CommandType = CommandType.StoredProcedure;
command.Parameters.AddWithValue(DSConstants.Sproc_MemberScoreRecalc_Param_SageUserId, instruction.SageUserId);
command.Parameters.AddWithValue(DSConstants.Sproc_MemberScoreRecalc_Param_EventType, instruction.EventType);
command.Parameters.AddWithValue(DSConstants.Sproc_MemberScoreRecalc_Param_EventCode, instruction.EventCode);
command.Parameters.AddWithValue(DSConstants.Sproc_MemberScoreRecalc_Param_EventParamId, instruction.EventParamId);
int result = 0;
// NEVER RETURNS FROM RUNNING NEXT LINE (and never executes)... yet it works if I do the same thing directly in the main thread.
result = command.ExecuteNonQuery();
}
}
Add a try/catch around the call and see whether an exception is being thrown and silently aborting the background thread.
try {
result = command.ExecuteNonQuery();
} catch(Exception ex) {
// Log this error and handle it here if needed, or rethrow:
throw;
}
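As a side note (a sketch, not part of the original answer): a BackgroundWorker also captures exceptions thrown in DoWork and exposes them through the RunWorkerCompleted event, so handling that event is another way to see what went wrong:
var bw = new BackgroundWorker();
bw.DoWork += new DoWorkEventHandler(AsyncUpdateMemberScoreRecalc_DoWork);
bw.RunWorkerCompleted += (s, args) =>
{
    if (args.Error != null)
    {
        // args.Error holds the exception thrown inside DoWork;
        // log it here instead of letting the failure vanish silently.
    }
};
bw.RunWorkerAsync(instruction);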

how to retry executing if command.ExecuteNonQuery() fails

How can I retry executing if command.ExecuteNonQuery() fails?
You can try
bool executed = false;
while (!executed)
{
try
{
command.ExecuteNonQuery();
executed = true;
}
catch
{
}
}
You can add some more conditions, like a timer or a counter, but this does not seem like a good idea. You should probably come up with a better recovery scenario.
The simplest way I can think of is:
while(true) {
try {
command.ExecuteNonQuery();
break;
} catch(SqlException ex) { }
}
In any case, you should put some extra control code in the catch block to prevent an infinite loop and/or to log the error.
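For example, a bounded retry with a short delay (a sketch; the attempt limit and delay are arbitrary, and Thread.Sleep needs using System.Threading;):
const int maxAttempts = 3;
for (int attempt = 1; attempt <= maxAttempts; attempt++)
{
    try
    {
        command.ExecuteNonQuery();
        break; // success, stop retrying
    }
    catch (SqlException ex)
    {
        // log ex here
        if (attempt == maxAttempts)
            throw; // give up after the last attempt
        Thread.Sleep(1000); // wait a second before trying again
    }
}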