How to set the schema in ADO.NET after opening the connection - ado.net

I am connecting to a DB2 database that contains several different schemas, and I want to work against one particular schema only.
From what I tried, the schema cannot be given in the connection string; it has to be set after the connection is opened.
I have code that does this, but only using classic ADO (ActiveX Data Objects); I don't know how to do the same in ADO.NET.
Below is the code for the ADO connection:
Below is the code for ado connection
db.Open DBcon_string
db.Execute ("SET SCHEMA=" & AppSchema)
db.Execute ("SET PATH=""SYSIBM"",""SYSFUN"",""SYSPROC"",""SYSIBMADM"",""" & AppSchema & """")
Note: db is an ADODB.Connection object.
Replace AppSchema with 'ETWRMS'.

The link below may prove to be helpful:
http://msdn.microsoft.com/en-us/library/ms971481.aspx

Refer to the following link, "Using an ADO.NET Entity Framework Data Provider":
http://www.datadirect.com/download/eval_docs/dotnet_win_quickstart.htm
A similar issue that may help:
How to see the schema of a db2 table (file)

It would be similar in ADO.NET. Assuming you're using OleDbConnection: create and open it, then create an OleDbCommand using that connection, and issue the same statements you did with the old "db.Execute" calls via the command's ExecuteNonQuery method.

Please try (a C# sketch of the above, using your existing DBcon_string and AppSchema values; requires using System.Data.OleDb):
var db = new OleDbConnection(DBcon_string);
db.Open();
var cmd = new OleDbCommand("SET SCHEMA = " + AppSchema, db);
cmd.ExecuteNonQuery();
cmd.CommandText = "SET PATH = \"SYSIBM\",\"SYSFUN\",\"SYSPROC\",\"SYSIBMADM\",\"" + AppSchema + "\"";
cmd.ExecuteNonQuery();

Related

Delete command in SQL

I want to delete 10 rows from an Access database:
Oledbcommand cmd = new oledbcommand("Delete Top 10 From [Log], cnn);
cnn. Open();
cmd.ExecuteNonQuery();
But I get "Syntax error in DELETE statement".
The SQL syntax seems fine for SQL Server. (I assume you use SQL Server, because you use brackets for delimiting the table or view Log. Sorry if I'm wrong...)
I think you got the error because you forgot a closing quote for the SQL statement in your OleDbCommand constructor. And casing is sensitive in C#. You could try this:
OleDbCommand cmd = new OleDbCommand("Delete Top 10 From [Log]", cnn);
If you get a SQL related error, my assumption might be wrong and you need to look up the specific SQL syntax for the database system that you are using.
Tip: If you are using SQL Server, you may also look into the specialized SqlCommand class and corresponding classes in the System.Data.SqlClient namespace. They offer additional support for SQL Server functionality.
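For illustration, here is a minimal SqlClient sketch, assuming a SQL Server database (the connection string is a placeholder). Note that T-SQL wants parentheses around the TOP count:
using System.Data.SqlClient;

class DeleteTopRows
{
    static void Main()
    {
        // Placeholder connection string - point it at your own server/database.
        using (var cnn = new SqlConnection(@"Server=.;Database=MyDb;Integrated Security=true"))
        using (var cmd = new SqlCommand("DELETE TOP (10) FROM [Log]", cnn))
        {
            cnn.Open();
            // Without an ORDER BY (via a subquery), TOP deletes 10 arbitrary rows.
            int rowsDeleted = cmd.ExecuteNonQuery();
        }
    }
}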

Lesson 3 for MSDN error SQL Server Data Tools

I'm currently following this link and performing the steps:
https://msdn.microsoft.com/en-us/library/ms170625.aspx
However, after importing the Transact-SQL query, it tells me that:
"Could not create a list of fields for the query. Verify that you connect to the data source and that your query syntax is correct. Query (7,24) Parser: The syntax for 'sp' is incorrect"
What could be the problem? I have tested the connection to the database and it is connected. (And yes, the table exists in the database.)

JPA: How to call a stored procedure

I have a stored procedure in my project under sql/my_prod.sql
where I have my function delete_entity.
In my entity I have:
@NamedNativeQuery(name = "delete_entity_prod",
    query = "{call /sql/delete_entity(:lineId)}")
and I call it like this:
Query query = entityManager.createNamedQuery("delete_entity_prod")
    .setParameter("lineId", lineId);
I followed this example: http://objectopia.com/2009/06/26/calling-stored-procedures-in-jpa/
but it does not execute the delete and it does not send any error.
I haven't found clear information about this; am I missing something? Maybe I need to load my_prod.sql first? But how?
JPA 2.1 standardized stored procedure support, if you are able to use it, with examples here: http://en.wikibooks.org/wiki/Java_Persistence/Advanced_Topics#Stored_Procedures
This is actually just the way you create the query:
Query query = entityManager.createNamedQuery("delete_entity_prod")
    .setParameter("lineId", lineId);
To call it you must execute:
query.executeUpdate();
Of course, the DB must already contain the procedure. So if you have it defined in your SQL file, have a look at Executing SQL Statements from a Text File (this is for MySQL, but other database systems use a similar approach to execute scripts).
There is no error shown because the query is not executed at any point - only an instance of Query is created. The query can be executed by calling executeUpdate:
query.executeUpdate();
Then the next problem will arise: writing stored procedures to a file is not enough - procedures live in the database, not in files. So the next thing to do is to check that you have the correct script to create the stored procedure in hand (maybe that is currently the content of sql/my_prod.sql) and then use it to create the procedure via a database client.
Not all JPA implementations support calling stored procedures, but I assume Hibernate is used under the hood, because that is also what the linked tutorial uses.
It can be the case that the current
{call /sql/delete_entity(:lineId)}
is the right syntax for calling a stored procedure in your database, but it looks rather suspicious because of the /sql/ part. If it turns out that this syntax is incorrect, then:
Consult the manual for the correct syntax
Test it via a client
Use that as the value of the query attribute in the @NamedNativeQuery annotation.
All that, for the MySQL + Hibernate combination, is explained for example here.

Using Script Task to create ADO NET (ODBC) Data Flow Source

I need some help with an SSIS Script Task (SQL 2008 R2) that dynamically creates a package. I am refining a package that copies data from a Sage Timberline (now rebranded to Sage 300) Pervasive SQL environment to a SQL Server data warehouse. I can create a package that opens the connection to Timberline and copies the data to a table in SQL Server. The problem is that for each company in Timberline and each table in SQL, I need to create a separate data flow task. Given the three Timberline company folders and the number of tables in each folder, this would take a lot of time to create and be cumbersome to maintain and troubleshoot.
I am trying to create a package that uses a Foreach Loop to build a package that creates an ADO.NET (ODBC) source (Timberline), an OLE DB destination (SQL), and dynamically handles the column mapping. I found code here that almost does what I need.
I tested this code and it works great using OLE DB SQL source and destinations. What makes this script work is that it dynamically handles the column mapping. So, if you placed it into a Foreach Loop over the 100 or so tables, each iteration could dynamically create the data flow, map the columns, and then execute the new package.
My problem is that I can only connect to Timberline using ODBC, so I need to modify the script to create the source connection with ADO.NET (ODBC) instead of OLE DB. I’m having a lot of trouble trying to figure this out. Could someone please help me out with this?
Here are the other things I tried first, other than this approach:
Solution: Setup a Linked server to Timberline Pervasive SQL
Problem: SQL Server is 64-bit and the Timberline driver is 32-bit. Using a linked server returns an architecture mismatch error. I called Sage and they said they have no plans to release a 64-bit driver.
Solution: Use one of the SQL Transfer tasks
Problem: Only works with SQL Server databases, and this source is a Pervasive SQL database.
Solution: Use an “INSERT … INTO …” type script
Problem: This requires a linked server; see the problem above.
Here’s the section of the original VB.NET code I need help with:
'To Create a package named [Sample Package]
Dim package As New Package()
package.Name = "Sample Package"
package.PackageType = DTSPackageType.DTSDesigner100
package.VersionBuild = 1
'To add Connection Manager to the package
'For source database (OLTP)
Dim OLTP As ConnectionManager = package.Connections.Add("OLEDB")
OLTP.ConnectionString = "Data Source=.;Initial Catalog=OLTP;Provider=SQLNCLI10;Integrated Security=SSPI;Auto Translate=False;"
OLTP.Name = "LocalHost.OLTP"
'To add Load Employee Dim to the package [Data Flow Task]
Dim dataFlowTaskHost As TaskHost = DirectCast(package.Executables.Add("SSIS.Pipeline.2"), TaskHost)
dataFlowTaskHost.Name = "Load Employee Dim"
dataFlowTaskHost.FailPackageOnFailure = True
dataFlowTaskHost.FailParentOnFailure = True
dataFlowTaskHost.DelayValidation = False
dataFlowTaskHost.Description = "Data Flow Task"
'-----------Data Flow Inner component starts----------------
Dim dataFlowTask As MainPipe = TryCast(dataFlowTaskHost.InnerObject, MainPipe)
' Source OLE DB connection manager to the package.
Dim SconMgr As ConnectionManager = package.Connections("LocalHost.OLTP")
' Create and configure an OLE DB source component.
Dim source As IDTSComponentMetaData100 = dataFlowTask.ComponentMetaDataCollection.[New]()
source.ComponentClassID = "DTSAdapter.OLEDBSource.2"
' Create the design-time instance of the source.
Dim srcDesignTime As CManagedComponentWrapper = source.Instantiate()
' The ProvideComponentProperties method creates a default output.
srcDesignTime.ProvideComponentProperties()
source.Name = "Employee Dim from OLTP"
' Assign the connection manager.
source.RuntimeConnectionCollection(0).ConnectionManagerID = SconMgr.ID
source.RuntimeConnectionCollection(0).ConnectionManager = DtsConvert.GetExtendedInterface(SconMgr)
' Set the custom properties of the source.
srcDesignTime.SetComponentProperty("AccessMode", 0)
' Mode 0 : OpenRowset / Table - View
srcDesignTime.SetComponentProperty("OpenRowset", "[dbo].[Employee_Dim]")
' Connect to the data source, and then update the metadata for the source.
srcDesignTime.AcquireConnections(Nothing)
srcDesignTime.ReinitializeMetaData()
srcDesignTime.ReleaseConnections()
Thanks in advance!
The C# code here is what you need if you need a Derived Column transform between the Source and Destination...
http://bifuture.blogspot.com/2011/01/ssis-adding-derived-column-to-ssis.html
To get the Source & Destination connections working, there is some secret sauce here to get things working between COM and .Net...
http://blogs.msdn.com/b/mattm/archive/2008/12/30/api-sample-ado-net-source.aspx
There is a similar page showing what to do for OleDB connections too.
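Building on those links, here is a hedged C# sketch of the ADO.NET (ODBC) counterpart of the question's OLE DB source code. The connection-manager qualifier, the component class ID, and the AccessMode/TableOrViewName property names are assumptions pieced together from the posts above; they vary by SSIS and .NET version, so verify them against the linked samples:
using Microsoft.SqlServer.Dts.Runtime;
using DtsPW = Microsoft.SqlServer.Dts.Pipeline.Wrapper;

class AdoNetOdbcSourceSketch
{
    static void Main()
    {
        var package = new Package();

        // ADO.NET connection managers are qualified by the .NET provider's
        // connection type (assumed format; check the mattm post above).
        ConnectionManager conMgr = package.Connections.Add(
            "ADO.NET:System.Data.Odbc.OdbcConnection, System.Data, " +
            "Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089");
        conMgr.ConnectionString = "Dsn=TimberlineDSN;Uid=user;Pwd=pwd;"; // placeholder DSN
        conMgr.Name = "Timberline.ODBC";

        var dataFlowTaskHost = (TaskHost)package.Executables.Add("SSIS.Pipeline.2");
        var dataFlowTask = (DtsPW.MainPipe)dataFlowTaskHost.InnerObject;

        // The ADO NET Source is a managed component, so ComponentClassID is an
        // assembly-qualified .NET type name rather than a COM ProgID like
        // "DTSAdapter.OLEDBSource.2". Version/token below assume SSIS 2008 R2.
        DtsPW.IDTSComponentMetaData100 source = dataFlowTask.ComponentMetaDataCollection.New();
        source.ComponentClassID =
            "Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter, " +
            "Microsoft.SqlServer.ADONETSrc, Version=10.0.0.0, Culture=neutral, " +
            "PublicKeyToken=89845dcd8080cc91";

        DtsPW.CManagedComponentWrapper srcDesignTime = source.Instantiate();
        srcDesignTime.ProvideComponentProperties();
        source.Name = "Timberline Source";

        // Wire the component to the connection manager, as in the VB code.
        source.RuntimeConnectionCollection[0].ConnectionManagerID = conMgr.ID;
        source.RuntimeConnectionCollection[0].ConnectionManager =
            DtsConvert.GetExtendedInterface(conMgr);

        // Property names assumed to mirror the designer: table access mode
        // plus a hypothetical Timberline table name.
        srcDesignTime.SetComponentProperty("AccessMode", 0); // 0 = Table or view
        srcDesignTime.SetComponentProperty("TableOrViewName", "SOME_TIMBERLINE_TABLE");

        // Same metadata refresh dance as the OLE DB version.
        srcDesignTime.AcquireConnections(null);
        srcDesignTime.ReinitializeMetaData();
        srcDesignTime.ReleaseConnections();
    }
}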
Creating the source tables is easy. The ODBC metadata collections available should be retrieved with GetSchema("MetaDataCollections"). This will return a list of the schema collections available for that particular ODBC driver.
Next, you'll want to see the data types returned from GetSchema("DataTypes"), so you can correctly interpret the data types for each column retrieved from GetSchema("Columns") to build your SQL Server CREATE TABLE script (which I'm assuming you've done).
To at least figure out which tables have primary keys, you'll need to loop over each table returned from GetSchema("Tables") in order to work with GetSchema("Indexes"). There's a bug that requires you to query the indexes one table at a time - it is easy to google this: create a string array of restriction values, put the table name in it, and pass it as GetSchema's second argument: GetSchema("Indexes", restrictions).
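As a concrete sketch of that flow (the DSN is a placeholder, and the restriction slot layout is an assumption - most drivers expose GetSchema("Restrictions") so you can confirm which slot takes the table name):
using System;
using System.Data;
using System.Data.Odbc;

class OdbcSchemaWalk
{
    static void Main()
    {
        using (var conn = new OdbcConnection("Dsn=TimberlineDSN;Uid=user;Pwd=pwd;"))
        {
            conn.Open();

            // Which schema collections does this driver support?
            DataTable collections = conn.GetSchema("MetaDataCollections");

            // Driver-specific type info, for mapping columns to SQL Server types.
            DataTable types = conn.GetSchema("DataTypes");

            // All tables and all columns in one pull each.
            DataTable tables = conn.GetSchema("Tables");
            DataTable columns = conn.GetSchema("Columns");

            // Indexes have to be queried one table at a time: the table name goes
            // into a restrictions array passed as GetSchema's second argument.
            // (Slot order/count is driver-specific - see GetSchema("Restrictions").)
            foreach (DataRow table in tables.Rows)
            {
                string tblName = table["TABLE_NAME"].ToString();
                string[] restrictions = { null, null, tblName };
                DataTable indexes = conn.GetSchema("Indexes", restrictions);
                Console.WriteLine("{0}: {1} index rows", tblName, indexes.Rows.Count);
            }
        }
    }
}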
What I did was get the Tables and Columns collections into object variables in my parent SSIS package. Because Timberline is so fast (not), it seemed more efficient to pull all the columns down and filter them locally... which I do to create the tables in SQL Server, if necessary.
Once that is done, use the local copy of Tables again to manipulate an SSIS package in a Script Task in "design mode" (change the source and destination target tables and redo the column mappings), and execute the now-in-memory SSIS package.
For me it took a while to figure out; both URLs above were required. I found and copied the .NET 2.0 Dts.PipelineWrap and Dts.RuntimeWrap DLLs to the Microsoft.Net\FrameworkV2.0xxxxx folder, then referenced these in each script task wanting to use them, before setting up my "using DtsPW = Microsoft.SqlServer.Dts.Pipeline.Wrapper", etc.
Of note, because Timberline is 32-bit ODBC, I think it's necessary to build the SSIS package to target x86, and to target the script tasks at the .NET 2.0 framework.
I used the Derived Column code because I needed to copy multiple Timberline DBs into one SQL Server DB. Derived Column adds a "CompanyID" value to the output pipeline to SQL Server.
In the end, map the Destination's Virtual Input columns to its External Metadata columns, based off of the pipeline the Destination is attached to:
foreach (DtsPW.IDTSVirtualInputColumn100 vColumn in destVirtInput.VirtualInputColumnCollection)
{
    // Mark the virtual input column as used by the destination...
    var vCol = destInst.SetUsageType(destInput.ID, destVirtInput, vColumn.LineageID, DtsPW.DTSUsageType.UT_READWRITE);
    // ...and map it to the external (table) column with the same name.
    destInst.MapInputColumn(destInput.ID, vCol.ID, destInput.ExternalMetadataColumnCollection[vColumn.Name].ID);
}
Anyways, that code will make more sense in the context of the bifuture.blogspot.com page.
The EzApi library could help with this too, but its ADO.NET connection source is coded as a virtual class, so you'd need to implement specific classes to use it. My C# kung fu is not strong enough for that in the time I have...
Also, CozyRoc sells a toolset with custom SSIS controls (data flow source and destination controls...) that looks like it does this on-the-fly input-to-output column mapping as well.
My package seems to work well enough now... Oh, and one more thing: I did not have luck trying to use DSN-less ODBC connections to Timberline, only a DSN-based connection string: Dsn=dsnname;Uid=user;Pwd=pwd;
SSIS packages running in 64-bit land cannot see 32-bit DSNs on a 64-bit OS, it seems... at least, it didn't work for me (Win7 64-bit, 32-bit Text ODBC DSN).

T-SQL 2000: Four part table name

I don't usually work with linked servers, and so I'm not sure what I'm doing wrong here.
A query like this will work against a linked FoxPro server from SQL 2000:
EXEC('Select * from openquery(linkedServer, ''select * from linkedTable'')')
However, from researching on the internet, something like this should also work:
Select * from linkedserver...linkedtable
but I receive this error:
Server: Msg 7313, Level 16, State 1, Line 1
Invalid schema or catalog specified for provider 'MSDASQL'.
OLE DB error trace [Non-interface error: Invalid schema or catalog specified for the provider.].
I realize it's supposed to be ServerAlias.Catalog.Schema.TableName, but if I run sp_tables_ex on the linked server, the catalog for all tables is just the network path to where the data files are, and the schema is null.
Is this server setup incorrectly? Or is what I'm trying to do not possible?
From MSDN:
Always use fully qualified names when working with objects on linked servers. There is no support for implicit resolution to the dbo owner name for tables in linked servers.
You cannot rely on the implicit schema name resolution of the '..' notation for linked servers. For a FoxPro 'server' you're going to have to use the database and schema as they map to their FoxPro counterparts in the driver you use (I think they map to folder and file name, but I haven't used an ISAM file driver in more than 10 years now).
I think you need to be explicit about resources in the linked server part of the query, for example:
EXEC SomeLinkedServer.Database.dbo.SomeStoredProc
In other words, just dotting them out doesn't work in this case; you have to be more specific.
It's actually:
ServerAlias.Catalog.Schema.LinkedTable
Catalog is the database that you're querying on the linked server, and schema is the schema of the remote table. So a valid four-part name would look like this:
ServerAlias.AdventureWorks.HumanResources.Employee
or
ServerAlias.MyDB.dbo.MyTable