I first initialize the BulkWriteOperation and add several inserts to it in a for loop, then I execute it. I then reinitialize the BulkWriteOperation and try to add more inserts, but I keep getting:
java.lang.IllegalStateException: already executed
My Code:
BulkWriteOperation builder = coll.initializeOrderedBulkOperation();
for (int i = 0; i < 10; i++) {
    BasicDBObject doc = new BasicDBObject("Something", something);
    builder.insert(doc);
}
builder.execute();

builder = coll.initializeOrderedBulkOperation();
for (int i = 0; i < 10; i++) {
    BasicDBObject doc = new BasicDBObject("Something", something);
    builder.insert(doc);
}
builder.execute();
There isn't a way to reset the existing BulkWriteOperation object after executing it, so you just need to create a new one like this:
builder = coll.initializeOrderedBulkOperation();
I wrote code to update my table (SecurityQuestionAnswer) with new security password questions and to move the old questions to another table (SecurityQuestionAnswersArchives). The total number of security questions is 3. I am able to update the current table, but when I add the same rows to the history table I get weird data: only two records are added instead of 3, and the data is also duplicated. My code is as follows:
if (oldQuestions.Any())
{
var oldquestionstoarchivelist = new List<SecurityQuestionAnswersArchives>();
var oldquestionstoarchive =new SecurityQuestionAnswersArchives();
for (int i = 0; i < 3; i++)
{
oldquestionstoarchive.Id = oldQuestions[i].Id;
oldquestionstoarchive.SecurityQuestionId = oldQuestions[i].SecurityQuestionId;
oldquestionstoarchive.Answer = oldQuestions[i].Answer;
oldquestionstoarchive.UpdateDate = oldQuestions[i].UpdateDate;
oldquestionstoarchive.IpAddress = oldQuestions[i].IpAddress;
oldquestionstoarchive.SecurityQuestion = oldQuestions[i].SecurityQuestion;
oldquestionstoarchive.User = oldQuestions[i].User;
oldquestionstoarchivelist.Add(oldquestionstoarchive);
}
user.SecurityQuestionAnswersArchives = oldquestionstoarchivelist;
//await Store.UpdateAsync(user);
_dbContext.ArchiveSecurityQuestionAnswers.AddRange(oldquestionstoarchivelist);
_dbContext.SecurityQuestionAnswers.RemoveRange(oldQuestions);
await _dbContext.SaveChangesAsync();
oldquestionstoarchivelist.Clear();
}
UPDATE 1
The loop looks fine; it iterates three times (0, 1, 2), which is expected. The first issue was with the AddRange function, to which I was passing a List, but it takes an IEnumerable input. I rectified it using the following code.
IEnumerable<SecurityQuestionAnswersArchives> finalArchiveses = oldquestionstoarchivelist;
_dbContext.ArchiveSecurityQuestionAnswers.AddRange(finalArchiveses);
The other issue is the duplicate data, and I am unable to figure out where the problem is. Please help me find it.
Your help is much appreciated!
Got it! Just sharing in case anybody has the same issue.
The problem was initialization in the wrong place. I moved
var oldquestionstoarchive = new SecurityQuestionAnswersArchives();
inside the for loop, so now the variable holds a unique instance on each iteration.
var oldquestionstoarchivelist = new List<SecurityQuestionAnswersArchives>();
for (int i = 0; i < 3; i++)
{
    var oldquestionstoarchive = new SecurityQuestionAnswersArchives();
    oldquestionstoarchive.SecurityQuestionId = oldQuestions[i].SecurityQuestionId;
    oldquestionstoarchive.Answer = oldQuestions[i].Answer;
    oldquestionstoarchive.UpdateDate = oldQuestions[i].UpdateDate;
    oldquestionstoarchive.IpAddress = oldQuestions[i].IpAddress;
    oldquestionstoarchive.SecurityQuestion = oldQuestions[i].SecurityQuestion;
    oldquestionstoarchive.User = oldQuestions[i].User;
    oldquestionstoarchivelist.Add(oldquestionstoarchive);
}
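For reference, the same copy can be written more compactly with a LINQ projection. This is only a sketch based on the property names above; it assumes oldQuestions is an in-memory list and requires using System.Linq:
// Sketch: build one new archive entity per old question (property names taken from the code above).
var oldquestionstoarchivelist = oldQuestions
    .Take(3)
    .Select(q => new SecurityQuestionAnswersArchives
    {
        SecurityQuestionId = q.SecurityQuestionId,
        Answer = q.Answer,
        UpdateDate = q.UpdateDate,
        IpAddress = q.IpAddress,
        SecurityQuestion = q.SecurityQuestion,
        User = q.User
    })
    .ToList();
Because Select creates a fresh SecurityQuestionAnswersArchives for each element, the shared-instance bug cannot reappear.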
I open an Excel file with EPPlus.
After reading some data, I would like to close the package:
pck.Stream.Close()
pck.Dispose()
Unfortunately, the Excel file is still locked. I have to close the whole application to get the file unlocked.
I have googled, but found nothing useful except the above.
How are you opening the file? The following creates, saves, reopens, prints, and finally deletes the file, all within the same thread and without issue. I can even set a breakpoint anywhere and delete the file. Reading the file this way should not lock it, since it is pulled into memory:
[TestMethod]
public void OpenReopenPrintDeleteTest()
{
    //Create some data
    var existingFile = new FileInfo(@"c:\temp\temp.xlsx");
    if (existingFile.Exists)
        existingFile.Delete();

    using (var package = new ExcelPackage(existingFile))
    {
        var workbook = package.Workbook;
        workbook.Worksheets.Add("newsheet");
        package.Save();
    }

    using (var package = new ExcelPackage(existingFile))
    {
        var workbook = package.Workbook;
        var worksheet = workbook.Worksheets.First();

        //The data
        worksheet.Cells["A1"].Value = "Col1";
        worksheet.Cells["A2"].Value = "sdf";
        worksheet.Cells["A3"].Value = "ghgh";
        worksheet.Cells["B1"].Value = "Col2";
        worksheet.Cells["B2"].Value = "Group B";
        worksheet.Cells["B3"].Value = "Group A";
        worksheet.Cells["C1"].Value = "Col3";
        worksheet.Cells["C2"].Value = 634.5;
        worksheet.Cells["C3"].Value = 274.5;
        worksheet.Cells["D1"].Value = "Col4";
        worksheet.Cells["D2"].Value = 996440;
        worksheet.Cells["D3"].Value = 185780;
        package.Save();
    }

    //Reopen the file
    using (var package = new ExcelPackage(existingFile))
    {
        var workBook = package.Workbook;
        if (workBook != null)
        {
            if (workBook.Worksheets.Count > 0)
            {
                var currentWorksheet = workBook.Worksheets.First();
                var lastrow = currentWorksheet.Dimension.End.Row;
                var lastcol = currentWorksheet.Dimension.End.Column;
                for (var i = 1; i <= lastrow; i++)
                    for (var j = 1; j <= lastcol; j++)
                        Console.WriteLine(currentWorksheet.Cells[i, j].Value);
            }
        }
    }

    //Delete the file
    existingFile.Delete();
}
I was having the same problem... But I found the solution:
When calling the SaveAs method, I was creating a FileStream that was never being disposed.
Before:
ExcelPackage excel_package = new ExcelPackage(new MemoryStream());
//...
//Do something here
//...
excel_package.SaveAs(new FileStream("filename.xlsx", FileMode.Create));
excel_package.Dispose();
After:
ExcelPackage excel_package = new ExcelPackage(new MemoryStream());
//...
//Do something here
//...
var file_stream = new FileStream("filename.xlsx", FileMode.Create);
excel_package.SaveAs(file_stream);
file_stream.Dispose();
excel_package.Dispose();
Note that the MemoryStream opened in the ExcelPackage constructor is not explicitly disposed, because the final excel_package.Dispose() call already does this internally.
Hope it helps.
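A slightly safer variant of the same fix is to wrap both the package and the FileStream in using blocks, so they are disposed even if SaveAs throws. This is just a sketch of the pattern above, with "filename.xlsx" as a placeholder:
// Sketch: using blocks dispose the package and the output stream automatically,
// even if an exception occurs while writing ("filename.xlsx" is a placeholder).
using (var excel_package = new ExcelPackage(new MemoryStream()))
{
    //...
    //Do something here
    //...
    using (var file_stream = new FileStream("filename.xlsx", FileMode.Create))
    {
        excel_package.SaveAs(file_stream);
    }
}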
I have the following...
DBCollection dbc = test.getCollection();
double count = dbc.count();
DBCursor cursor = dbc.find();
StringBuilder sb = new StringBuilder();
if (cursor.hasNext())
    sb.append(cursor.next().toString());
This outputs only one record but shows a count of 2. This one seems to work...
DBCollection dbc = test.getCollection();
double count = dbc.count();
DBCursor cursor = dbc.find();
StringBuilder sb = new StringBuilder();
for (double i = 0.0; i < count; i++)
    sb.append(cursor.next().toString());
What am I missing?
Do you mean to use
while (cursor.hasNext())
    sb.append(cursor.next().toString());
Right now you use if, but you probably want while.
I am trying to insert multiple records into MongoDB at once, so I created a JavaBean for each record to be inserted and added them to an ArrayList.
Finally, from the ArrayList, I am trying to perform an insert operation as shown below:
public void insert(ArrayList<QuoteReportBean> quotelist) {
    BasicDBList totalrecords = new BasicDBList();
    StringBuffer sb = new StringBuffer();
    int valuecount = 0;
    for (QuoteReportBean reportbean : quotelist) {
        valuecount++;
        BasicDBObject dbrecord = new BasicDBObject();
        dbrecord.append("cust_id", reportbean.getCustomerId());
        dbrecord.append("unique_symbol", reportbean.getUniqueSymbol());
        sb.append(reportbean.getUniqueSymbol() + ",");
        dbrecord.append("exch", reportbean.getExchange());
        dbrecord.append("access_time", reportbean.getDate());
        totalrecords.add(dbrecord);
    }
    WriteResult result = coll.insert(totalrecords, WriteConcern.NORMAL);
}
But I am getting the following error:
Exception in thread "taskExecutor-1" java.lang.IllegalArgumentException: BasicBSONList can only work with numeric keys, not: [_id]
at org.bson.types.BasicBSONList._getInt(BasicBSONList.java:159)
at org.bson.types.BasicBSONList._getInt(BasicBSONList.java:150)
at org.bson.types.BasicBSONList.get(BasicBSONList.java:104)
at com.mongodb.DBCollection.apply(DBCollection.java:501)
at com.mongodb.DBCollection.apply(DBCollection.java:491)
at com.mongodb.DBApiLayer$MyCollection.insert(DBApiLayer.java:195)
at com.mongodb.DBApiLayer$MyCollection.insert(DBApiLayer.java:180)
at com.mongodb.DBCollection.insert(DBCollection.java:58)
Could anybody please help me with how to resolve this?
BasicDBList can't be used to insert multiple documents; it's only used for arrays inside a single document. To do a bulk insert, you need to pass an array of DBObjects into the insert method instead.
I changed your code to do this, and it worked without error:
StringBuffer sb = new StringBuffer();
int valuecount = 0;
final QuoteReportBean[] quotelist = {new QuoteReportBean()};
DBObject[] totalrecords = new BasicDBObject[quotelist.length];
for (int i = 0; i < quotelist.length; i++) {
    QuoteReportBean reportbean = quotelist[i];
    valuecount++;
    BasicDBObject dbrecord = new BasicDBObject();
    dbrecord.append("cust_id", reportbean.getCustomerId());
    dbrecord.append("unique_symbol", reportbean.getUniqueSymbol());
    sb.append(reportbean.getUniqueSymbol() + ",");
    dbrecord.append("exch", reportbean.getExchange());
    dbrecord.append("access_time", reportbean.getDate());
    totalrecords[i] = dbrecord;
}
WriteResult result = coll.insert(totalrecords, WriteConcern.NORMAL);
As the title suggests: is the following code ACID? That is, if I call SaveChanges, will all of the Products.Add INSERT statements be executed together (or rolled back if there is an error)?
using (DBEntities ctx = new DBEntities())
{
    for (int i = 0; i < 10; i++)
    {
        ctx.Products.Add(new Product("Product " + (i + 1)));
    }
    ctx.SaveChanges();
}
MSDN states:
SaveChanges operates within a transaction. SaveChanges will roll back that transaction and throw an exception if any of the dirty ObjectStateEntry objects cannot be persisted.
However, looking at the profiler, this doesn't seem to be the case. Am I required to wrap the block in a TransactionScope?
using (DBEntities ctx = new DBEntities())
{
    for (int i = 0; i < 10; i++)
    {
        ctx.Products.Add(new Product("Product " + (i + 1)));
    }
    ctx.SaveChanges();
}
This SaveChanges() call will run in a transaction automatically; you don't have to wrap it in a new TransactionScope.
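If you ever need several SaveChanges calls (or several contexts) to commit or roll back together, that is when an explicit TransactionScope comes in. Here is a minimal sketch, reusing the DBEntities context and Product constructor from the question and assuming a reference to System.Transactions:
// Sketch: two units of work committed as one ambient transaction.
using (var scope = new System.Transactions.TransactionScope())
using (var ctx = new DBEntities())
{
    ctx.Products.Add(new Product("Product A"));
    ctx.SaveChanges(); // first unit of work

    ctx.Products.Add(new Product("Product B"));
    ctx.SaveChanges(); // second unit of work

    // Both SaveChanges calls commit together; if Complete() is never reached,
    // everything rolls back when the scope is disposed.
    scope.Complete();
}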