SqlDataReader not throwing an exception - ado.net

I have this method:
public static SqlDataReader ExecuteReader(SqlCommand cmd, int level = 0)
{
    SqlConnection conn = cmd.Connection;
    SqlDataReader dr = null;
    cmd.CommandTimeout = SqlConnectionTimeOut;
    DateTime start = DateTime.Now;
    DateTime end = new DateTime();
    try
    {
        if (conn.State == System.Data.ConnectionState.Closed)
        {
            conn.Open();
        }
        dr = cmd.ExecuteReader();
    }
    catch (SqlException sqlex)
    {
        end = DateTime.Now;
        TimeSpan requestDuration = end - start;
        LoggerHelper.Info("SqlException - Execute reader duration before exception in milliseconds: " + requestDuration.TotalMilliseconds.ToString());
        LoggerHelper.Error("Execute reader ", sqlex, cmd);
        if (sqlex.Number == 1205 && level <= 3) // will try to execute 3 times before throwing a process deadlock exception
        {
            LoggerHelper.Info(cmd.CommandText + " was a victim of deadlock. This was attempt " + (level + 1).ToString());
            dr = ExecuteReader(cmd, level++);
        }
        else
        {
            throw new Exception("Execute reader ", sqlex);
        }
        // throw new Exception("Execute reader ", sqlex);
    }
    catch (Exception ex)
    {
        end = DateTime.Now;
        TimeSpan requestDuration = end - start;
        LoggerHelper.Info("Exception - Execute reader duration before exception in milliseconds: " + requestDuration.TotalMilliseconds.ToString());
        LoggerHelper.Error("Execute reader ", ex, cmd);
        throw new Exception("Execute reader ", ex);
    }
    return dr;
}
which is called from the DAL:
SqlDataReader reader = DataBridgeDb.ExecuteReader(TrisuraCommand);
datatable.Load(reader, LoadOption.OverwriteChanges, tablenames);
While debugging, a SQL deadlock exception is properly thrown in the ExecuteReader method.
But when I deploy, or change the connection string to the staging database, the exception is thrown in the datatable.Load method instead.
Does anyone know why? I think it comes down to a difference in database properties or SQL Server version?

There is a problem on this line in the catch block:
dr = ExecuteReader(cmd, level++);
The ++ operator doesn't increment level until after the function call, so it passes 0 again the second time. And the third. And so on. You need this:
dr = ExecuteReader(cmd, ++level);
As shown in this sample .Net Fiddle program:
https://dotnetfiddle.net/NWTZtS
I'm not sure how that causes the issue. I'd expect a StackOverflowException, the deadlock to eventually resolve, or an SqlException with a number other than 1205. But this is a bug, and I strongly suspect it somehow allows the method to eventually return a null value for the SqlDataReader object. At that point, of course, the DataTable.Load() method will complain.

Related

Plc4x library Modbus serial (RTU) get is not retrieving data

I am trying to write a sample program to retrieve temperature data from an SHT20 temperature sensor over a serial port, using the Apache PLC4X library.
private void plcRtuReader() {
    String connectionString =
        "modbus:serial://COM5?unit-identifier=1&baudRate=19200&stopBits=" + SerialPort.ONE_STOP_BIT + "&parityBits=" + SerialPort.NO_PARITY + "&dataBits=8";
    try (PlcConnection plcConnection = new PlcDriverManager().getConnection(connectionString)) {
        if (!plcConnection.getMetadata().canRead()) {
            System.out.println("This connection doesn't support reading.");
            return;
        }
        PlcReadRequest.Builder builder = plcConnection.readRequestBuilder();
        builder.addItem("value-1", "holding-register:258[2]");
        PlcReadRequest readRequest = builder.build();
        PlcReadResponse response = readRequest.execute().get();
        for (String fieldName : response.getFieldNames()) {
            if (response.getResponseCode(fieldName) == PlcResponseCode.OK) {
                int numValues = response.getNumberOfValues(fieldName);
                // If it's just one element, output just one single line.
                if (numValues == 1) {
                    System.out.println("Value[" + fieldName + "]: " + response.getObject(fieldName));
                }
                // If it's more than one element, output each in a single row.
                else {
                    System.out.println("Value[" + fieldName + "]:");
                    for (int i = 0; i < numValues; i++) {
                        System.out.println(" - " + response.getObject(fieldName, i));
                    }
                }
            }
            // Something went wrong, so output an error message instead.
            else {
                System.out.println(
                    "Error[" + fieldName + "]: " + response.getResponseCode(fieldName).name());
            }
        }
        System.exit(0);
    } catch (PlcConnectionException e) {
        e.printStackTrace();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Connection is established with the device using serial communication, but it fails to get data and instead prints the below warning messages continuously.
The debugger hangs at the line below:
PlcReadResponse response = readRequest.execute().get();
with the following logs printing continuously:
2021-06-03-17:41:48.425 [nioEventLoopGroup-2-1] WARN io.netty.channel.nio.NioEventLoop - Selector.select() returned prematurely 512 times in a row; rebuilding Selector org.apache.plc4x.java.transport.serial.SerialPollingSelector#131f8986.
2021-06-03-17:41:55.080 [nioEventLoopGroup-2-1] WARN io.netty.channel.nio.NioEventLoop - Selector.select() returned prematurely 512 times in a row; rebuilding Selector org.apache.plc4x.java.transport.serial.SerialPollingSelector#48c328c5.
With the same URL data (i.e. baud rate, stop bits, etc.), modpoll.exe works and returns the data over RTU. I am not sure what is missing here.
Kindly shed some light here.

Why won't BufferedWriter write URL content to text file?

I'm trying to write the text from the URL to a text file in batches of 35 lines, pressing Enter to continue to the next batch of 35 lines. If I don't try to write to the file in batches of 35 lines, it works great and writes all of the content to the text file. But when I try to use the if statement to print in batches of 35, it won't print to the file unless I press Enter around 15 times, and even then it doesn't print everything. It seems like it has something to do with the if statement, but I can't figure it out.
String urlString = "https://www.gutenberg.org/files/46768/46768-0.txt";
try {
    URL url = new URL(urlString);
    try (Scanner input = new Scanner(System.in);
         InputStream stream = url.openStream();
         BufferedReader reader = new BufferedReader(new InputStreamReader(stream));
         BufferedWriter writer = new BufferedWriter(new FileWriter("C:\\Users\\mattj\\Documents\\JuliusCeasar.txt"))) {
        String line;
        int PAGE_LENGTH = 35;
        int lineCount = 0;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
            writer.write(line + "\n");
            lineCount++;
            if (lineCount == PAGE_LENGTH) {
                System.out.println();
                System.out.println("- - - Press enter to continue - - -");
                input.nextLine();
                lineCount = 0;
            }
        }
    }
} catch (MalformedURLException e) {
    System.out.println("We encountered a problem regarding the following URL:\n"
        + urlString + "\nEither no legal protocol could be found or the "
        + "string could not be parsed.");
    e.printStackTrace();
} catch (IOException e) {
    System.out.println("Attempting to open a stream from the following URL:\n"
        + urlString + "\ncaused a problem.");
    e.printStackTrace();
}
I don't know Java, but there are very similar concepts in .NET. I think there are a couple of things to consider here.
BufferedWriter will not write to the file immediately; it acts - as the name suggests - as a buffer, collecting up write requests over time and then doing them in a batch. BufferedWriter has a flush method to flush the 'queued' up writes to the file immediately - so I'd do this when you hit your 35 (never flush on every write).
Also, BufferedReader and BufferedWriter are closeable, so make sure to wrap them in a try-with-resources statement so that resources are properly unlocked/cleared.
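In Java terms, that flush slots into the existing loop at the page boundary. Here is a minimal sketch of the same program with the flush added (same URL and page size as the OP's code; the output path is shortened to a relative file name for brevity):
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.util.Scanner;

public class PagedDownload {
    public static void main(String[] args) throws IOException {
        String urlString = "https://www.gutenberg.org/files/46768/46768-0.txt";
        int pageLength = 35;
        try (Scanner input = new Scanner(System.in);
             BufferedReader reader = new BufferedReader(
                     new InputStreamReader(new URL(urlString).openStream()));
             BufferedWriter writer = new BufferedWriter(new FileWriter("JuliusCeasar.txt"))) {
            String line;
            int lineCount = 0;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
                writer.write(line);
                writer.newLine();
                if (++lineCount == pageLength) {
                    writer.flush(); // push the buffered page to disk before blocking on input
                    System.out.println();
                    System.out.println("- - - Press enter to continue - - -");
                    input.nextLine();
                    lineCount = 0;
                }
            }
        }
    }
}
With the try-with-resources block in place, close() (and therefore a final flush) still happens automatically when the loop ends, so the explicit flush() is only there to make each page visible in the file while the program is waiting for input.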

How can I read 23 million records from postgres using JDBC? I have to read from a table in postgres and write to another table

When I write simple JPA code to findAll() the data, I run into memory issues. For writing, I can do batch updates. But how do I read 23 million records and save them for storing into another table?
Java is a poor choice for processing "batch" stuff (and I love java!).
Instead, do it using pure SQL:
insert into target_table (col1, col2, ...)
select col1, col2, ....
from ...
where ...
or, if you must do some processing in java that can't be done within the query, open a cursor for the query and read rows 1 at a time and write the target row before reading the next row. This approach however will take a looooong time to finish.
I fully agree with Bohemian's answer.
If you can access both the source and the destination tables, you can read and write within the same loop,
with something in a try-catch block like:
PreparedStatement reader = null;
PreparedStatement writer = null;
ResultSet rs = null;
try {
    reader = sourceConnection.prepareStatement("select....");
    writer = destinationConnection.prepareStatement("insert into...");
    rs = reader.executeQuery();
    int chunksize = 10000; // this is your batch size, depends on your system
    int counter = 0;
    while (rs.next()) {
        writer.set.... // do for every field to insert the corresponding set
        writer.addBatch();
        if (counter++ % chunksize == 0) {
            int rowsWritten = writer.executeBatch();
            System.out.println("wrote " + counter + " rows"); // probably a simple message to see progress
        }
    }
    // when finished, do not forget to flush the rest of the batch job
    writer.executeBatch();
} catch (SQLException sqlex) {
    // an error message to your gusto
    System.out.println("SQLException: " + sqlex.getMessage());
} finally {
    try {
        if (rs != null) rs.close();
        if (reader != null) reader.close();
        if (writer != null) writer.close();
        // probably you want to close the connections as well
    } catch (SQLException e) {
        System.out.println("Exception while closing " + e.getMessage());
    }
}
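One PostgreSQL-specific detail worth adding to the sketch above: by default the PostgreSQL JDBC driver buffers the entire result set in memory, and it only streams rows with a server-side cursor when autocommit is disabled and a positive fetch size is set on the statement. That is typically what causes the memory problem with a findAll() over 23 million rows. A minimal, self-contained sketch of the reading side with cursor-based fetching enabled (connection URL, credentials, and table/column names are placeholders):
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class StreamingRead {
    public static void main(String[] args) throws SQLException {
        try (Connection src = DriverManager.getConnection(
                "jdbc:postgresql://localhost/sourcedb", "user", "secret")) {
            // Autocommit must be off for the driver to use a cursor instead of
            // loading the whole result set into memory.
            src.setAutoCommit(false);
            try (PreparedStatement ps = src.prepareStatement(
                    "select col1, col2 from source_table")) {
                ps.setFetchSize(10000); // pull rows from the server in 10k chunks
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        // hand each row to the batch writer shown above
                    }
                }
            }
        }
    }
}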

counting the number of character in a text using FileReader

I am new to this superb place. I have gotten help several times from this site. I have seen many answers regarding my question from previous discussions, but I am facing a problem counting the number of characters using FileReader. It works using Scanner. This is what I tried:
class CountCharacter
{
    public static void main(String args[]) throws IOException
    {
        File f = new File("hello.txt");
        int charCount = 0;
        String c;
        //int lineCount=0;
        if (!f.exists())
        {
            f.createNewFile();
        }
        BufferedReader br = new BufferedReader(new FileReader(f));
        while ((c = br.readLine()) != null) {
            String s = br.readLine();
            charCount = s.length() - 1;
            charCount++;
        }
        System.out.println("NO OF LINE IN THE FILE, NAMED " + f.getName() + " IS " + charCount);
    }
}
It looks to me that each time you go through the loop, you assign charCount to be the length of the line that iteration of the loop is concerned with. i.e. instead of
charCount = s.length() - 1;
try
charCount = charCount + s.length();
EDIT:
If you have, say, a document with the contents "onlyOneLine", then when you first hit the while check, br.readLine() makes the BufferedReader read the first line. During the while's code block, however, br.readLine() is called again, which advances the BufferedReader to the second line of the document and returns null. As null is assigned to s, when you call length() an NPE is thrown.
Try this for the while block:
while ((c = br.readLine()) != null) {
    charCount = charCount + c.length();
}
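For completeness, a corrected version of the whole program might look like the sketch below; it keeps the OP's hello.txt file name, reads each line exactly once, and accumulates the per-line lengths (line separators are not counted):
import java.io.BufferedReader;
import java.io.File;
import java.io.FileReader;
import java.io.IOException;

class CountCharacter {
    public static void main(String[] args) throws IOException {
        File f = new File("hello.txt");
        if (!f.exists()) {
            f.createNewFile();
        }
        int charCount = 0;
        try (BufferedReader br = new BufferedReader(new FileReader(f))) {
            String line;
            while ((line = br.readLine()) != null) { // read each line exactly once
                charCount += line.length();          // accumulate instead of overwriting
            }
        }
        System.out.println("NO OF CHARACTERS IN THE FILE, NAMED " + f.getName() + " IS " + charCount);
    }
}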

Microsoft Robotics and Sql

I have an issue implementing CCR with SQL. When I step through my code, the updates and inserts I am trying to execute work great. But when I run through my interface without any breakpoints, it seems to be working and it shows the inserts and updates, but at the end of the run nothing got updated in the database.
I proceeded to add a pause to my code every time I pull a new thread from my pool and it works... but that defeats the purpose of async coding, right? I want my interface to be faster, not to slow it down...
Any suggestions... here is part of my code:
I use two helper classes to set my ports and get a response back...
/// <summary>
/// Gets the Reader, requires connection to be managed
/// </summary>
public static PortSet<Int32, Exception> GetReader(SqlCommand sqlCommand)
{
    Port<Int32> portResponse = null;
    Port<Exception> portException = null;
    GetReaderResponse(sqlCommand, ref portResponse, ref portException);
    return new PortSet<Int32, Exception>(portResponse, portException);
}

// Wrapper for SqlCommand's GetResponse
public static void GetReaderResponse(SqlCommand sqlCom,
    ref Port<Int32> portResponse, ref Port<Exception> portException)
{
    EnsurePortsExist(ref portResponse, ref portException);
    sqlCom.BeginExecuteNonQuery(ApmResultToCcrResultFactory.Create(
        portResponse, portException,
        delegate(IAsyncResult ar) { return sqlCom.EndExecuteNonQuery(ar); }), null);
}
then I do something like this to queue up my calls...
DispatcherQueue queue = CreateDispatcher();
String[] commands = new String[2];
Int32 result = 0;
commands[0] = "exec someupdateStoredProcedure";
commands[1] = "exec someInsertStoredProcedure '" + Settings.Default.RunDate.ToString() + "'";
for (Int32 i = 0; i < commands.Length; i++)
{
    using (SqlConnection connSP = new SqlConnection(Settings.Default.nbfConn + ";MultipleActiveResultSets=true;Async=true"))
    using (SqlCommand cmdSP = new SqlCommand())
    {
        connSP.Open();
        cmdSP.Connection = connSP;
        cmdSP.CommandTimeout = 150;
        cmdSP.CommandText = "set arithabort on; " + commands[i];
        Arbiter.Activate(queue, Arbiter.Choice(ApmToCcrAdapters.GetReader(cmdSP),
            delegate(Int32 reader) { result = reader; },
            delegate(Exception e) { result = 0; throw new Exception(e.Message); }));
    }
}
where ApmToCcrAdapters is the class name where my helper methods are...
The problem is when I pause my code right after the call to Arbiter.Activate and I check my database, everything looks fine... if I get rid of the pause and run my code through, nothing happens to the database, and no exceptions are thrown either...
The problem here is that you are calling Arbiter.Activate in the scope of your two using blocks. Don't forget that the CCR task you create is queued and the current thread continues... right past the scope of the using blocks. You've created a race condition, because the Choice must execute before connSP and cmdSP are disposed and that's only going to happen when you're interfering with the thread timings, as you have observed when debugging.
If instead you were to deal with disposal manually in the handler delegates for the Choice, this problem would no longer occur, however this makes for brittle code where it's easy to overlook disposal.
I'd recommend implementing the CCR iterator pattern and collecting results with a MultipleItemReceive so that you can keep your using statements. It makes for cleaner code. Off the top of my head it would look something like this:
private IEnumerator<ITask> QueryIterator(
    string command,
    PortSet<Int32, Exception> resultPort)
{
    using (SqlConnection connSP =
        new SqlConnection(Settings.Default.nbfConn
            + ";MultipleActiveResultSets=true;Async=true"))
    using (SqlCommand cmdSP = new SqlCommand())
    {
        Int32 result = 0;
        connSP.Open();
        cmdSP.Connection = connSP;
        cmdSP.CommandTimeout = 150;
        cmdSP.CommandText = "set arithabort on; " + command;
        yield return Arbiter.Choice(ApmToCcrAdapters.GetReader(cmdSP),
            delegate(Int32 reader) { resultPort.Post(reader); },
            delegate(Exception e) { resultPort.Post(e); });
    }
}
and you could use it something like this:
var resultPort = new PortSet<Int32, Exception>();
foreach (var command in commands)
{
    Arbiter.Activate(queue,
        Arbiter.FromIteratorHandler(() => QueryIterator(command, resultPort))
    );
}
Arbiter.Activate(queue,
    Arbiter.MultipleItemReceive(
        resultPort,
        commands.Count(),
        (results, exceptions) => {
            // everything is done and you've got 2
            // collections here, results and exceptions
            // to process as you want
        }
    )
);