My code has an issue but I can't see what it is. The column names match word for word, and it works if I use a single column in the CSV file, but when I try two or three column fields it gives the error below. I have read lots of articles but I can't fix the error. What could be wrong with these lines? The database table was already created with matching fields.
private void DBaktar()
{
string SQLServerConnectionString = "Server =.\\SQLEXPRESS; Database = Qiti; User Id = sa; Password = 7731231xx!!;";
string CSVpath = @"D:\FTP\"; // CSV file path
string CSVFileConnectionString = String.Format("Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties=\"text;HDR=Yes;FMT=Delimited\";", CSVpath);
var AllFiles = new DirectoryInfo(CSVpath).GetFiles("*.CSV");
string File_Name = string.Empty;
foreach (var file in AllFiles)
{
try
{
DataTable dt = new DataTable();
using (OleDbConnection con = new OleDbConnection(CSVFileConnectionString))
{
con.Open();
var csvQuery = string.Format("select * from [{0}]", file.Name);
using (OleDbDataAdapter da = new OleDbDataAdapter(csvQuery, con))
{
da.Fill(dt);
}
}
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(SQLServerConnectionString))
{
bulkCopy.ColumnMappings.Add("LKod", "LKod");
bulkCopy.ColumnMappings.Add("info", "info");
bulkCopy.ColumnMappings.Add("Codex", "Codex");
bulkCopy.ColumnMappings.Add("LthNo", "LthNo");
bulkCopy.ColumnMappings.Add("Datein", "Datein");
bulkCopy.DestinationTableName = "U_Tik";
bulkCopy.BatchSize = 0;
bulkCopy.EnableStreaming = true;
bulkCopy.WriteToServer(dt);
bulkCopy.Close();
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
}
}
Error exception:
The given ColumnName 'LKod' does not match up with any column in data source.
ex.StackTrace:
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerCommon(Int32 columnCount)
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table)
Some information can be found here: https://sqlbulkcopy-tutorial.net/columnmapping-does-not-match
Cause
- You didn't provide any ColumnMappings and there are more columns in the source than in the destination.
- You provided an invalid column name for the source.
- You provided an invalid column name for the destination.
Solution
- ENSURE you provide ColumnMappings.
- ENSURE all source column names are valid; they are case sensitive.
- ENSURE all destination column names are valid; they are case sensitive.
- MAKE the source case insensitive.
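For example, one way to see what column names the Jet text driver actually produced, and to build the mappings from those real source names instead of hard-coded strings, is something like the sketch below. It assumes the dt and SQLServerConnectionString variables from the question plus using System, System.Data, System.Data.SqlClient and System.Linq; the destination names are the ones from the question's U_Tik table.

// Hedged sketch: dump the source columns, then map them case-insensitively.
foreach (DataColumn col in dt.Columns)
{
    Console.WriteLine("Source column: '{0}'", col.ColumnName); // reveals typos, BOMs, stray spaces
}

var destinationColumns = new[] { "LKod", "info", "Codex", "LthNo", "Datein" };

using (var bulkCopy = new SqlBulkCopy(SQLServerConnectionString))
{
    bulkCopy.DestinationTableName = "U_Tik";
    foreach (DataColumn col in dt.Columns)
    {
        string source = col.ColumnName.Trim();
        // Match destination names case-insensitively so e.g. "lkod" in the CSV still maps to "LKod".
        string destination = destinationColumns.FirstOrDefault(
            d => string.Equals(d, source, StringComparison.OrdinalIgnoreCase));
        if (destination != null)
        {
            bulkCopy.ColumnMappings.Add(col.ColumnName, destination);
        }
    }
    bulkCopy.WriteToServer(dt);
}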
I have found a solution that works reliably. The link below shows the approach; I hope it helps anyone who needs the same thing.
https://johnnycode.com/2013/08/19/using-c-sharp-sqlbulkcopy-to-import-csv-data-sql-server/
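For reference, the core of that article's approach is to stream the CSV through an IDataReader and hand that reader straight to SqlBulkCopy rather than loading a DataTable first. A rough sketch follows; the LumenWorks.Framework.IO.Csv package, its CsvReader(TextReader, bool hasHeaders) constructor, and the file path are my assumptions about the details, so check them against the article.

using System.Data.SqlClient;
using System.IO;
using LumenWorks.Framework.IO.Csv; // assumed package: a CSV parser that implements IDataReader

// ...

// true = the first row of the CSV contains the column headers
using (var csv = new CsvReader(new StreamReader(@"D:\FTP\sample.csv"), true))
using (var bulkCopy = new SqlBulkCopy(SQLServerConnectionString))
{
    bulkCopy.DestinationTableName = "U_Tik";
    // Because CsvReader is an IDataReader, SqlBulkCopy can consume it directly, row by row.
    bulkCopy.WriteToServer(csv);
}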
I have a named raw ad-hoc query that I execute with an output parameter. I believe I am adding both the input and output parameters to the command object properly. I am trying to understand the parsing Npgsql does for output parameters and why it is failing. Any ideas? I have tried to provide some info below; let me know if you can help or need additional info. I think this should be a straightforward use case: insert some data and get scalar return values back from a named query using out params.
Postgres
BEGIN
SELECT nextval('Role_seq') into :v_roleId;
INSERT INTO Role (roleId, organizationId, name, notes, locked, roleTypeId, rightsFlags)
VALUES (:v_roleId, :v_organizationId, :v_name, :v_notes, :v_locked, :v_roleTypeId, :v_rightsFlags);
END;
SQL Server
INSERT INTO Role (organizationId, name, notes, locked, roleTypeId, rightsFlags)
VALUES (@organizationId, @name, @notes, @locked, @roleTypeId, @rightsFlags)
SELECT @roleId = SCOPE_IDENTITY()
Oracle
BEGIN
SELECT Role_roleId_SEQ.NEXTVAL into :v_roleId FROM DUAL;
INSERT INTO Role (roleId, organizationId, name, notes, locked, roleTypeId, rightsFlags)
VALUES (:v_roleId, :v_organizationId, :v_name, :v_notes, :v_locked, :v_roleTypeId, :v_rightsFlags);
END;
I am binding all the parameters properly, and this code works on all platforms (providers) except Postgres, where the query parsing fails. Here is how I am adding the params.
dsh.AddNQParameter(cmd, "roleId", ParameterDirection.Output, (object)DBNull.Value, "Int", "Int32", "Integer");
dsh.AddNQParameter(cmd, "organizationId", ParameterDirection.Input, organizationId ?? (object)DBNull.Value, "Int", "Int32", "Integer");
dsh.AddNQParameter(cmd, "name", ParameterDirection.Input, name ?? (object)DBNull.Value, "VarChar", "Varchar2", "Varchar");
dsh.AddNQParameter(cmd, "notes", ParameterDirection.Input, notes ?? (object)DBNull.Value, "VarChar", "Varchar2", "Varchar");
dsh.AddNQParameter(cmd, "locked", ParameterDirection.Input, locked ?? (object)DBNull.Value, "Bit", "Byte", "Boolean");
dsh.AddNQParameter(cmd, "roleTypeId", ParameterDirection.Input, roleTypeId ?? (object)DBNull.Value, "Int", "Int32", "Integer");
dsh.AddNQParameter(cmd, "rightsFlags", ParameterDirection.Input, rightsFlags ?? (object)DBNull.Value, "Image", "Blob", "Bytea");
Stack Trace for Postgres
Result StackTrace:
at Npgsql.SqlQueryParser.ParseRawQuery(String sql, Boolean standardConformantStrings, NpgsqlParameterCollection parameters, List`1 statements)
at Npgsql.NpgsqlCommand.ProcessRawQuery()
at Npgsql.NpgsqlCommand.<Execute>d__71.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.ValueTaskAwaiter`1.GetResult()
at Npgsql.NpgsqlCommand.<ExecuteNonQuery>d__84.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Npgsql.NpgsqlCommand.ExecuteNonQuery()
at LandisGyr.Data.Helper.ExecuteNonQueryReturnInt(DbCommand cmd, String name) in D:\tfs\cc\Command Center\Components\LGDALGenerator\Main\LG.Data.Core\Foundation\Helper.cs:line 76
at DAL_Generator_Test.Data.NamedQueries.Test.NamedQueriesTest.InsRole(DbCommand cmd, Nullable`1 organizationId, String name, String notes, Nullable`1 locked, Nullable`1 roleTypeId, Byte[] rightsFlags, Nullable`1& roleId) in D:\tfs\cc\Command Center\Components\LGDALGenerator\Main\DAL Generator Test\Data\NamedQueries\NamedQueries.Test.Designer.cs:line 985
at DAL_Generator_Test.SqlServerTests.NamedQueriesPostgresTests.Execute_NonQuery_Test_Using_DbCommand() in D:\tfs\cc\Command Center\Components\LGDALGenerator\Main\DAL Generator Test\PostgresTests\NamedQueriesPostgresTests.cs:line 90
Result Message:
Test method DAL_Generator_Test.SqlServerTests.NamedQueriesPostgresTests.Execute_NonQuery_Test_Using_DbCommand threw exception:
System.Exception: Parameter ':v_roleId' referenced in SQL but is an out-only parameter
Code Sample to repro the problem
using Npgsql;
using NpgsqlTypes;
using System;
using System.Configuration;
using System.Data.Common;
namespace ConsoleApp1
{
class Program
{
static void Main(string[] args)
{
string connectString = ConfigurationManager.ConnectionStrings["PostgresTest"].ConnectionString;
// create a table as follows
/*
* CREATE TABLE role
(
roleid integer NOT NULL DEFAULT nextval('role_seq'::regclass),
name character varying(50) COLLATE pg_catalog."default",
notes character varying(255) COLLATE pg_catalog."default",
organizationid integer NOT NULL,
roletypeid integer NOT NULL DEFAULT 0,
locked boolean NOT NULL DEFAULT false,
rightsflags bytea
)
*/
using (NpgsqlConnection con = new NpgsqlConnection(connectString))
{
con.Open();
using (DbCommand cmd = con.CreateCommand())
{
cmd.CommandText = @"BEGIN
SELECT nextval('Role_seq') into :v_roleId;
INSERT INTO Role(roleId, organizationId, name, notes, locked, roleTypeId, rightsFlags)
VALUES(:v_roleId, :v_organizationId, :v_name, :v_notes, :v_locked, :v_roleTypeId, :v_rightsFlags);
END;";
var roleIdParam = new NpgsqlParameter(":v_roleId", NpgsqlDbType.Integer);
roleIdParam.Direction = System.Data.ParameterDirection.Output;
cmd.Parameters.Add(roleIdParam);
var orgParam = new NpgsqlParameter(":v_organizationId", NpgsqlDbType.Integer);
orgParam.Direction = System.Data.ParameterDirection.Input;
orgParam.Value = 1;
cmd.Parameters.Add(orgParam);
var nameParam = new NpgsqlParameter(":v_name", NpgsqlDbType.Varchar);
nameParam.Direction = System.Data.ParameterDirection.Input;
nameParam.Value = "test role";
cmd.Parameters.Add(nameParam);
var lockedParam = new NpgsqlParameter(":v_locked", NpgsqlDbType.Boolean);
lockedParam.Direction = System.Data.ParameterDirection.Input;
lockedParam.Value = false;
cmd.Parameters.Add(lockedParam);
var roleTypeIdParam = new NpgsqlParameter(":v_roleTypeId", NpgsqlDbType.Integer);
roleTypeIdParam.Direction = System.Data.ParameterDirection.Input;
roleTypeIdParam.Value = 1;
cmd.Parameters.Add(roleTypeIdParam);
var rightsFlagsParam = new NpgsqlParameter(":v_rightsFlags", NpgsqlDbType.Bytea);
rightsFlagsParam.Direction = System.Data.ParameterDirection.Input;
rightsFlagsParam.Value = DBNull.Value;
cmd.Parameters.Add(rightsFlagsParam);
cmd.ExecuteNonQuery();
object roleId = cmd.Parameters[":v_roleId"].Value;
Console.WriteLine($"role id is {roleId}");
Console.WriteLine("Press any key to continue");
Console.ReadLine();
}
}
}
}
}
I have read the documentation on "In / Out Parameters".
https://www.npgsql.org/doc/basic-usage.html
I did a test that returned the value of the sequence, in a very similar case.
Note that you do not reference the output parameter in the SQL statement itself, i.e. you do not write something like
INSERT INTO x RETURNING x.roleId INTO :roleId
Sample code
using Npgsql;
using NpgsqlTypes;
using System;
using System.Configuration;
using System.Data.Common;
namespace ConsoleApp1
{
class Program
{
static void Main(string[] args)
{
string connectString = ConfigurationManager.ConnectionStrings["PostgresTest"].ConnectionString;
using (NpgsqlConnection con = new NpgsqlConnection(connectString))
{
con.Open();
using (DbCommand cmd = con.CreateCommand())
{
cmd.CommandText = @"INSERT INTO Role(roleId, organizationId, name, notes, locked, roleTypeId, rightsFlags)
VALUES(nextval('Role_seq'), :v_organizationId, :v_name, :v_notes, :v_locked, :v_roleTypeId, :v_rightsFlags) RETURNING roleId";
var orgParam = new NpgsqlParameter(":v_organizationId", NpgsqlDbType.Integer);
orgParam.Direction = System.Data.ParameterDirection.Input;
orgParam.Value = 1;
cmd.Parameters.Add(orgParam);
var nameParam = new NpgsqlParameter(":v_name", NpgsqlDbType.Varchar);
nameParam.Direction = System.Data.ParameterDirection.Input;
nameParam.Value = "test role";
cmd.Parameters.Add(nameParam);
var lockedParam = new NpgsqlParameter(":v_locked", NpgsqlDbType.Boolean);
lockedParam.Direction = System.Data.ParameterDirection.Input;
lockedParam.Value = false;
cmd.Parameters.Add(lockedParam);
var roleTypeIdParam = new NpgsqlParameter(":v_roleTypeId", NpgsqlDbType.Integer);
roleTypeIdParam.Direction = System.Data.ParameterDirection.Input;
roleTypeIdParam.Value = 1;
cmd.Parameters.Add(roleTypeIdParam);
var rightsFlagsParam = new NpgsqlParameter(":v_rightsFlags", NpgsqlDbType.Bytea);
rightsFlagsParam.Direction = System.Data.ParameterDirection.Input;
rightsFlagsParam.Value = DBNull.Value;
cmd.Parameters.Add(rightsFlagsParam);
var roleIdParam = new NpgsqlParameter("Returning_roleIdParam", NpgsqlDbType.Integer);
roleIdParam.Direction = System.Data.ParameterDirection.Output;
cmd.Parameters.Add(roleIdParam);
cmd.ExecuteNonQuery();
object roleId = cmd.Parameters["Returning_roleIdParam"].Value;
Console.WriteLine($"role id is {roleId}");
Console.WriteLine("Press any key to continue");
Console.ReadLine();
}
}
}
}
}
I'm trying to populate a SQLite table and then retrieve data from it. Following is my code.
public void addQuestion(Question quest)
{
int id = 1;
ContentValues values = new ContentValues();
SQLiteDatabase db = this.getWritableDatabase();
db.execSQL("DROP TABLE IF EXISTS " + TABLE_QUEST1);
onCreate(db);
values.put(KEY_QUES, quest.getQuestion());
values.put(KEY_ANSWER, quest.getAnswer());
values.put(KEY_OPTA, quest.getOptA());
values.put(KEY_OPTB, quest.getOptB());
values.put(KEY_OPTC, quest.getOptC());
db.insert(TABLE_QUEST1, null, values);
System.out.println("Added in database: " + quest.getQuestion());
}
public ArrayList<Question> getAllQuestions() {
System.out.println("getting rows 1");
ArrayList<Question> quesList = new ArrayList<Question>();
System.out.println("getting rows 2");
Cursor cursor = null;
SQLiteDatabase db = getReadableDatabase();
System.out.println("getting rows ");
cursor = db.rawQuery("SELECT * FROM " + TABLE_QUEST1, null);
if (!cursor.moveToFirst()) {
System.out.println("No data in the database ");
} else {
System.out.println("theres data in the database ");
quesList = new ArrayList<Question>();
do {
System.out.print("total rows " + cursor.getCount());
Question quest = new Question();
quest.setID(cursor.getInt(0));
quest.setQuestion(cursor.getString(1));
quest.setAnswer(cursor.getString(2));
quest.setOptA(cursor.getString(3));
quest.setOptB(cursor.getString(4));
quest.setOptC(cursor.getString(5));
quesList.add(quest);
} while (cursor.moveToNext());
cursor.close();
}
return quesList;
}
I have 4 rows of data in my table, and I can see that with the "Added in database" print statement, but when I actually read it the cursor just reads row 1 and drops out of the while loop. What could potentially be wrong? TIA.
Your code is absolutely fine except for the drop command that runs on every insert. As mentioned in the earlier comments, make sure you avoid calling the drop query each time a question is added and you'll get the expected result.
As Santosh has pointed out, dropping the table (as per db.execSQL("DROP TABLE IF EXISTS " + TABLE_QUEST1);) and then re-creating it (as per onCreate(db);) deletes the table and re-creates it, removing any rows/data that had previously been added to it.
As such, it's simply a matter of removing those two lines of code. There also appears to be no need for the line int id = 1;, so perhaps remove that too, as per :-
public void addQuestion(Question quest)
{
ContentValues values = new ContentValues();
SQLiteDatabase db = this.getWritableDatabase();
values.put(KEY_QUES, quest.getQuestion());
values.put(KEY_ANSWER, quest.getAnswer());
values.put(KEY_OPTA, quest.getOptA());
values.put(KEY_OPTB, quest.getOptB());
values.put(KEY_OPTC, quest.getOptC());
db.insert(TABLE_QUEST1, null, values);
System.out.println("Added in database: " + quest.getQuestion());
}
P.S. You may want to consider not using hard-coded column offsets but instead obtaining the offsets by column name via the Cursor's getColumnIndex(column_name) method, e.g. :-
Question quest = new Question();
quest.setID(cursor.getInt(cursor.getColumnIndex("name_of_your_id_column")));
quest.setQuestion(cursor.getString(cursor.getColumnIndex(KEY_QUES)));
quest.setAnswer(cursor.getString(cursor.getColumnIndex(KEY_ANSWER)));
quest.setOptA(cursor.getString(cursor.getColumnIndex(KEY_OPTA)));
quest.setOptB(cursor.getString(cursor.getColumnIndex(KEY_OPTB)));
quest.setOptC(cursor.getString(cursor.getColumnIndex(KEY_OPTC)));
quesList.add(quest);
Noting that instead of "name_of_your_id_column" you may have something like KEY_ID defined; if so, use that. You then have a single definition, which reduces the chance of inadvertently misspelling column names or miscalculating offsets.
I want to clone a Profile__c record. The lead has a Profile__c associated with it. When conversion happens, the Profile__c on the lead is copied to the account that gets created. What I need to do is a deep clone of that Profile__c onto the new account created after the conversion. I am able to copy the Profile__c over, but cloning throws this error:
Error: System.DmlException: Update failed. First exception on row 0 with id 00QJ0000007dDmHMAU; first error: CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY, profile: execution of AfterUpdate caused by: System.DmlException: Insert failed. First exception on row 0; first error: CANNOT_UPDATE_CONVERTED_LEAD, cannot reference converted lead: [] Trigger.profile:, column 1: [] (System Code)
trigger profile on Lead (after update) {
Map<Id, Lead> cl = new Map<Id,Lead>();
Lead parent;
List<Contact> clist = new List<Contact>();
Set<Id> convertedids = new Set<Id>();
//list of converted leads
for (Lead t:Trigger.new){
Lead ol = Trigger.oldMap.get(t.ID);
if(t.IsConverted == true && ol.isConverted == false)
{
cl.put(t.Id, t);
convertedids.add(t.ConvertedContactId);
}
}
Set<Id> leadIds = cl.keySet();
List<Profile__c> mp = [select Id, lock__c, RecordTypeId, reason__c, End_Date__c,startus__c , Opportunity__c, Account__c, Lead__c from Profile__c where Lead__c in :leadIds];
List<ID>AccountIDs = new List<ID>();
List<Profile__c>clonedList = new list<Profile__c>();
for (Profile__c mpi:mp){
parent = cl.get(mpi.Lead__c );
mpi.opportunity__c = parent.ConvertedOpportunityId;
mpi.account__c = parent.ConvertedAccountId;
AccountIDs.add(parent.ConvertedAccountId);
Profile__c profile = mpi.clone(false,true,false,false);
clonedList.add(profile);
mpi.lock__c= true;
mpi.reason__c= 'Converted';
}
update mp;
insert clonedList;
}
You are doing an insert operation (insert clonedList) of records that still hold the converted lead's Id in a field. You can't reference a converted lead in DML operations.
Below is sample code that will work:
trigger ConvertedLead_Trigger on Lead (after update) {
Map<Id, Lead> cl = new Map<Id,Lead>();
Lead parent;
List<Contact> clist = new List<Contact>();
Set<Id> convertedids = new Set<Id>();
//list of converted leads
for (Lead t:Trigger.new){
Lead ol = Trigger.oldMap.get(t.ID);
if(t.IsConverted == true && ol.isConverted == false)
{
cl.put(t.Id, t);
convertedids.add(t.ConvertedContactId);
}
}
Set<Id> leadIds = cl.keySet();
List<ConvertLeadTest__c> mp =[Select Id,Name,Lead__c, Account__c,Opportunity__c from ConvertLeadTest__c where Lead__c in :leadIds];
List<ConvertLeadTest__c> mp1=new List<ConvertLeadTest__c>();
List<ConvertLeadTest__c> mp2=new List<ConvertLeadTest__c>();
for(ConvertLeadTest__c cc:mp)
{
if(cl.containsKey(cc.Lead__c))
{
cc.Account__c=cl.get(cc.Lead__c).ConvertedAccountId;
cc.Opportunity__c=cl.get(cc.Lead__c).ConvertedOpportunityId;
mp1.add(cc);
mp2.add(new ConvertLeadTest__c(Account__c=cl.get(cc.Lead__c).ConvertedAccountId,Opportunity__c=cl.get(cc.Lead__c).ConvertedOpportunityId));
}
}
update mp;
insert mp2;
}
But if you write
mp2.add(new ConvertLeadTest__c(Lead__c=cc.Lead__c, Account__c=cl.get(cc.Lead__c).ConvertedAccountId, Opportunity__c=cl.get(cc.Lead__c).ConvertedOpportunityId));
i.e. keep the reference to the converted lead, then it will throw the error.
Hope this will help you.
Thanks :)
You are not able to perform any operation on a Lead once the lead has been converted.
Anything you do to try to update the converted lead will give you an error.
What eventually did it for me: after the conversion I grabbed the ConvertedAccountIds. Since I was already copying the Profile__c to the account after conversion, I just cloned the profile there, and I had to set the Lead__c lookup on the clone to null since a converted lead can't be referenced.
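A rough Apex sketch of that approach; the Profile__c field names and the clone() arguments come from the question's trigger, while everything else (where it runs, the variable names) is assumed.

// Collect the accounts created by the conversion.
Set<Id> convertedAccountIds = new Set<Id>();
for (Lead l : Trigger.new) {
    if (l.IsConverted && l.ConvertedAccountId != null) {
        convertedAccountIds.add(l.ConvertedAccountId);
    }
}

// Clone the profiles that were copied onto those accounts, dropping the lead reference.
List<Profile__c> clones = new List<Profile__c>();
for (Profile__c original : [SELECT Id, lock__c, RecordTypeId, reason__c, End_Date__c,
                                   startus__c, Opportunity__c, Account__c, Lead__c
                            FROM Profile__c
                            WHERE Account__c IN :convertedAccountIds]) {
    Profile__c copy = original.clone(false, true, false, false);
    copy.Lead__c = null; // a converted lead cannot be referenced in DML
    clones.add(copy);
}
insert clones;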
I've been using JPA to insert entities into a database but I've run up against a problem where I need to do an insert and get the primary key of the record last inserted.
Using PostgreSQL I would use an INSERT RETURNING statement which would return the record id, but with an entity manager doing all this, the only way I know is to use SELECT CURRVAL.
So here is the problem: I have several data sources sending data into a message-driven bean (usually 10-100 messages at once from each source) via OpenMQ, and inside this MDB I persist the data to PostgreSQL via the entity manager. At this point I think there will be a "race condition like" effect from having so many inserts, so I won't necessarily get the last record id using SELECT CURRVAL.
My MDB persists 3 entity beans via an entity manager like below.
Any help on how to do this better is much appreciated.
public void onMessage(Message msg) {
Integer agPK = 0;
Integer scanPK = 0;
Integer lookPK = 0;
Iterator iter = null;
List<Ag> agKeys = null;
List<Scan> scanKeys = null;
try {
iag = (IAgBean) (new InitialContext()).lookup(
"java:comp/env/ejb/AgBean");
TextMessage tmsg = (TextMessage) msg;
// insert this into table only if doesn't exists
Ag ag = new Ag(msg.getStringProperty("name"));
agKeys = (List) (iag.getPKs(ag));
iter = agKeys.iterator();
if (iter.hasNext()) {
agPK = ((Ag) iter.next()).getId();
}
else {
// no PK found so not in dbase, insert new
iag.addAg(ag);
agKeys = (List) (iag.getPKs(ag));
iter = agKeys.iterator();
if (iter.hasNext()) {
agPK = ((Ag) iter.next()).getId();
}
}
// insert this into table always
iscan = (IScanBean) (new InitialContext()).lookup(
"java:comp/env/ejb/ScanBean");
Scan scan = new Scan();
scan.setName(msg.getStringProperty("name"));
scan.setCode(msg.getIntProperty("code"));
iscan.addScan(scan);
scanKeys = (List) iscan.getPKs(scan);
iter = scanKeys.iterator();
if (iter.hasNext()) {
scanPK = ((Scan) iter.next()).getId();
}
// insert into this table the two primary keys above
ilook = (ILookBean) (new InitialContext()).lookup(
"java:comp/env/ejb/LookBean");
Look look = new Look();
if (agPK.intValue() != 0 && scanPK.intValue() != 0) {
look.setAgId(agPK);
look.setScanId(scanPK);
ilook.addLook(look);
}
// ...
The JPA spec requires that after persist, the entity be populated with a valid ID if an ID generation strategy is being used. You don't have to do anything.
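For instance, with a generated ID the key can be read straight off the persisted entity, so no SELECT CURRVAL round trip is needed and concurrent messages don't interfere. A minimal sketch, assuming standard javax.persistence annotations; the class, field and method names here are illustrative, not taken from the question's beans.

import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;

@Entity
class ScanEntity {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY) // backed by a serial/identity column
    private Integer id;

    private String name;

    public Integer getId() { return id; }
    public void setName(String name) { this.name = name; }
}

class ScanDao {

    private final EntityManager em;

    ScanDao(EntityManager em) { this.em = em; }

    // Persists a scan and returns its generated primary key.
    Integer addScan(String name) {
        ScanEntity scan = new ScanEntity();
        scan.setName(name);
        em.persist(scan);    // the provider assigns the id (at persist or flush, depending on the strategy)
        em.flush();          // optional: force the INSERT now so the id is certainly populated
        return scan.getId(); // read the key from the managed entity itself
    }
}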