DataStage v9.1 - run user-defined SQL query file using ODBC connector

I want to execute multiple DDL and DML statements from a file in DataStage.
I have used the ODBC connector with the write mode set to user-defined SQL, and the SQL statements are available in the file.
But the connector stage is not executing the file. If anyone can provide me with guidance, it would be greatly appreciated.
Thanks

If you can provide more details on how you are using the DDL & DML statements in the file, along with any warning messages, your ODBC configuration, etc., it will help others offer suggestions and save you time in resolving the issue.
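For reference, when the write mode is user-defined SQL read from a file, the file typically just holds the statements themselves, separated by semicolons. A minimal sketch, assuming your target database accepts semicolon-separated statements (the table and column names here are made up):

    -- statements.sql (the file the connector's SQL file property points to)
    DELETE FROM staging_orders;                    -- DML: clear the staging table
    INSERT INTO staging_orders (order_id, amount)
      SELECT order_id, amount FROM src_orders;     -- DML: reload it
    CREATE INDEX idx_staging_orders
      ON staging_orders (order_id);                -- DDL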

Related

Dump DDL and Data of DB2 with DataGrip Ultimate

I recently started using DataGrip with a DB2 engine. Is there a way to export both DDL and data into one SQL file and then later run the file to create a new schema?
I only see options to export ONLY DDL or ONLY data. Thanks in advance.
It is currently impossible; please follow https://youtrack.jetbrains.com/issue/DBE-10677 for updates.
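Until that is implemented, one workaround outside DataGrip is DB2's own command-line tooling. A sketch, assuming you can run commands against the database (SAMPLE and the table name are placeholders):

    db2look -d SAMPLE -e -o schema.sql                      # extract the DDL for the database objects
    db2 "EXPORT TO orders.del OF DEL SELECT * FROM orders"  # export one table's data as a delimited file

You would run schema.sql against the new database first, then load the exported data files.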

Golden Gate ERROR OGG-05263 No GGSCHEMA clause

I am using these instructions from OTN to try to configure GoldenGate with an MSSQL source DB and an Oracle 12c target DB:
http://www.oracle.com/technetwork/articles/datawarehouse/oracle-sqlserver-goldengate-460262.html
Replicating Transactions Between Microsoft SQL Server and Oracle Database Using Oracle GoldenGate
Everything goes okay up till the command:
GGSCI (MSSQL) 2> ADD TRANDATA HRSCHEMA.EMP
Where I get the error:
ERROR OGG-05263 No GGSCHEMA clause was specified in the GLOBALS file. Please specify a GGSCHEMA shema name.
I searched and found that there was currently no "GLOBALS" file, so I created one:
F:\GG\dirprm\globals.prm
And added one line:
GGSCHEMA hrschema
That did not help; I am still getting the same error.
Any suggestions?
Are there GoldenGate environment variables that I need to set?
Thank you in advance for your help.
I got the answer from Oracle Support:
The GLOBALS file should be in the main installation folder, so remove it from the dirprm folder.
Also, the GLOBALS file does not have an extension; you have named it GLOBALS.prm.
I made those changes and it works!
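In other words, the working layout is (paths as in the question):

    F:\GG\GLOBALS        <-- main installation folder, file name has no extension

    contents of GLOBALS:
    GGSCHEMA hrschema

GLOBALS is read when GGSCI starts, so you may need to exit and restart GGSCI after creating or moving the file.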

ERROR: current transaction is aborted, commands ignored until end of transaction block --- export data from Aqua Data Studio

I am trying to export one table from Aqua Data Studio into a CSV file. The table has approximately 4.4 million rows. When I try to use the export window function in Aqua Data Studio, I get the following error:
Error: ERROR: current transaction is aborted, commands ignored until end of transaction block
I do not understand what the problem is. I read a few articles about this error and found that it happens when an earlier PostgreSQL command in the same transaction failed. I did not use any SQL commands for this export, so I don't know how to debug this, and I am also unable to view the log files.
Use ROLLBACK to cancel the failed transaction. After that, you will be able to execute your current query.
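A minimal sketch in a SQL window (my_table is a placeholder):

    ROLLBACK;                  -- abort the failed transaction
    SELECT * FROM my_table;    -- subsequent statements now run in a fresh transaction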
You probably shouldn't be exporting millions of rows through a JDBC/ODBC connection, especially from Redshift.
For Redshift, use the UNLOAD command (see the Redshift documentation). You'll have to UNLOAD the file to S3 and download it from there.
For Postgres, use COPY TO (see the PostgreSQL documentation).
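Sketches of both approaches, with the bucket, IAM role, table, and file paths as placeholders:

    -- Redshift: unload the table to S3 as CSV, then download the parts
    UNLOAD ('SELECT * FROM my_table')
    TO 's3://my-bucket/my_table_'
    IAM_ROLE 'arn:aws:iam::123456789012:role/MyUnloadRole'
    FORMAT AS CSV
    HEADER;

    -- Postgres, server side (needs file-system access on the DB server):
    COPY my_table TO '/tmp/my_table.csv' WITH (FORMAT csv, HEADER);

    -- Postgres, client side from psql:
    \copy my_table TO 'my_table.csv' WITH (FORMAT csv, HEADER)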

Setting up environment for SQL queries

I know the basic syntax of queries, but otherwise I'm a beginner with SQL.
I have an SQL file (.sql), and I downloaded a couple of programs (pgAdmin and SQL Workbench).
I have no idea how to get from where I am now to actually writing queries and finding information. How do I set things up so I can import my SQL file and start writing queries?
pgAdmin is the default GUI for PostgreSQL.
SQL Workbench is a free, DBMS-independent, cross-platform SQL query tool.
Either way, you need to connect to a database to actually run queries. The DBMS can either run on your local machine or you can connect to a remote server - where you need access privileges of course.
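For example, with PostgreSQL installed locally, loading a .sql file and getting to a query prompt can be as simple as (mydb and my_file.sql are placeholders):

    createdb mydb                 # create an empty local database
    psql -d mydb -f my_file.sql   # run your SQL file against it
    psql -d mydb                  # connect interactively and start writing queries

pgAdmin can then connect to that same local database through its connection dialog.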

Enterprise library semantic logging block. SQLDatabase sink. Out of process

I am using the Enterprise Library semantic logging block (out of process) with the SQL Database sink to store all the messages. After putting everything in place and doing a test run, I am getting the following error: could not find stored procedure 'dbo.WriteTraces'.
Has anybody faced a similar issue? Please suggest.
The out-of-process semantic logging assembly comes with some PowerShell scripts and .sql files. We have to edit these (to change the DB name) and run them; this generates the stored procedures and the associated table for us.
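As a sketch, running the shipped database script with sqlcmd might look like this (the server, database, and script names below are placeholders - use the actual .sql file and database name from your package):

    sqlcmd -S .\SQLEXPRESS -d Logging -i CreateSemanticLoggingDatabaseObjects.sql

Remember to edit the DB name inside the shipped script first, as described above.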
I encountered this same error, but it was because we were trying to use a schema other than dbo for our logging database. Once we changed it back to dbo, that resolved the problem. We were using the out-of-process SemanticLogging-svc.exe, which, from what I can tell, assumes that dbo is the schema name.