How to prevent Oracle SQL Developer from writing to AppData\Roaming? - oracle-sqldeveloper

Is there any way to prevent Oracle SQL Developer (ver 4.1.5.21) from writing into AppData\Roaming?

I have a blog post that explains this here:
http://krisrice.io/2012-05-12-sql-developer-shared-setup-from-any/
The crux is to edit sqldeveloper/bin/sqldeveloper.conf and add a VM option that overrides user.home, for example:
AddVMOption -Duser.home=/Users/klrice/Dropbox/sqldev
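On Windows the same AddVMOption approach applies; the folder below is only an illustration of pointing user.home somewhere other than your roaming profile:
AddVMOption -Duser.home=D:\sqldev-home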

Related

SQL Developer missing a library

I'm fairly new to Oracle SQL Developer. While trying to make a new connection via TNS, I'm getting the error below, which I've been trying to solve for a while without success.
Error Message:
Statut : échec - Echec du test : no ocijdbc18 in java.library.path
(Status: failure - Test failed: no ocijdbc18 in java.library.path)
Assuming MySQL is a typo (you are using port 1521, the default port for Oracle Database, and the error refers to ocijdbc18, so I assume it is an Oracle database; remember that you can connect SQL Developer to a MySQL database, so please clarify if you actually want MySQL rather than Oracle), your problem is a missing Oracle Instant Client: the ocijdbc18 native library that the error complains about ships with it. From this link download the latest version of Oracle Instant Client and install it on your machine. Then set ORACLE_HOME to the path of the Instant Client installation (the actual installation folder, the one that contains folders like bin, network and so on), restart SQL Developer, and you should be good to go.
P.S.: as @thatjeffsmith has correctly mentioned in the comments, it's not necessary to have an Oracle Client and/or OCI drivers installed in order to connect to an Oracle database. Using Basic as the connection type and entering the correct connection info would suffice. This article in Oracle Magazine discusses the different connection types in detail.
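For example, a Basic connection only needs a host, port and service name (the values below are placeholders), which SQL Developer hands to its bundled JDBC thin driver as a URL along the lines of:
jdbc:oracle:thin:@//dbhost.example.com:1521/ORCLPDB1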

How to SET_JOB in postgres?

Currently I am migrating a database from Oracle to Postgres, but I am having trouble converting
DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE (JOB_NAME => THE_JOB_NAME,
             ARGUMENT_NAME => 'in_study_count', ARGUMENT_VALUE => IN_STUDY_COUNT)
to Postgres: I can't find a Postgres equivalent for DBMS_SCHEDULER.SET_JOB_ARGUMENT_VALUE. I only recently started working with Postgres, so I don't have much experience with it. Any help would be appreciated.
As @a_horse_with_no_name mentioned in his comment, there is no built-in scheduler in Postgres.
There is a commercial product named EDB Postgres Advanced Server which extends PostgreSQL and provides extra functionality for users migrating from Oracle, including a DBMS_SCHEDULER package that does have a SET_JOB_ARGUMENT_VALUE(job_name, argument_position, argument_value) procedure, just like the Oracle package.
However, if you want to stay with standard open-source PostgreSQL, installing pg_cron is probably your best option (as also mentioned by @a_horse_with_no_name).
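As a sketch of what that looks like, assuming pg_cron is installed and listed in shared_preload_libraries (the procedure name and argument below are made up):
CREATE EXTENSION pg_cron;
-- run a hypothetical procedure every night at 03:00;
-- the argument is passed directly in the scheduled command,
-- so there is no separate SET_JOB_ARGUMENT_VALUE step
SELECT cron.schedule('0 3 * * *', $$CALL set_study_count(42)$$);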

Golden Gate ERROR OGG-05263 No GGSCHEMA clause

I am using these instructions from OTN to try and configure GoldenGate replication from a MSSQL source DB to an Oracle 12c target DB:
http://www.oracle.com/technetwork/articles/datawarehouse/oracle-sqlserver-goldengate-460262.html
Replicating Transactions Between Microsoft SQL Server and Oracle Database Using Oracle GoldenGate
Everything goes okay up till the command:
GGSCI (MSSQL) 2> ADD TRANDATA HRSCHEMA.EMP
Where I get the error:
ERROR OGG-05263 No GGSCHEMA clause was specified in the GLOBALS file. Please specify a GGSCHEMA shema name.
I searched and saw that there was currently no GLOBALS file, so I created one:
F:\GG\dirprm\globals.prm
And added one line:
GGSCHEMA hrschema
That did not help.
Still getting the same error.
Any suggestions?
Are there GoldenGate environment variables that I need to set?
Thank you in advance for your help.
I got the answer from Oracle Support:
The GLOBALS file should be in the main installation folder. Please remove it from the dirprm directory.
Also, the GLOBALS file does not have an extension; I can see that you have created it as globals.prm.
Made those changes and it works!
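In other words, move the file into the GoldenGate home and drop the extension, so that F:\GG\GLOBALS contains just:
GGSCHEMA hrschema
and then restart GGSCI, since GLOBALS is only read when GGSCI starts.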

OpenCobolIDE and DB2 - Connection

I'm currently working on a small COBOL project and I'm using OpenCobolIDE.
I also downloaded DB2 Express and I'm able to use it by running the Command Line Processor.
Now my question is the following: how can I make a connection between OpenCobolIDE and DB2?
I saw that it was possible to use "esqlOC" but I didn't find a lot of documentation and I'm still lost at the moment.
Kind regards
I know that it's against SO policy, but here is a link: http://db2twilight.blogspot.nl/2014/01/linuxdb2-running-cobol-with-inline-sql.html The code isn't that long, but I found no disclaimer, so I assume copyright goes to the blogger, Dick Reitveld. The post is a tutorial on linking DB2 to GnuCOBOL (formerly OpenCOBOL), not on how to tell OpenCobolIDE how to do the build, but hopefully it fits in with your question.
The build rules are listed in a shell script on the same page.
Basically it comes down to creating a COBOL source file with EXEC SQL statements, running it through the DB2 precompiler, and then compiling the generated source with cobc:
db2 connect to sample
db2 prep program.sqb bindfile target ANSI_COBOL
cobc program.cbl -static -Wall -L/path/to/db2libs/sqllib/lib64 -ldb2 -v -x -save-temps -O
db2 bind program.bnd
db2 connect reset
Where "program" is your filename, with .sqb inputs and will generate .cbl and .bnd, and the
-L/path/to/.../
is the full path to where your DB2 install has placed the DB2 support libraries.
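For completeness, here is a minimal sketch of what program.sqb could contain; the program, table and variable names are illustrative (the EMPLOYEE table comes with the DB2 SAMPLE database), and the layout follows fixed-format COBOL:
      * minimal embedded-SQL example -- names are illustrative
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PROGRAM1.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
           EXEC SQL INCLUDE SQLCA END-EXEC.
           EXEC SQL BEGIN DECLARE SECTION END-EXEC.
       01  WS-EMP-COUNT     PIC S9(9) COMP-5.
           EXEC SQL END DECLARE SECTION END-EXEC.
       PROCEDURE DIVISION.
           EXEC SQL CONNECT TO sample END-EXEC.
           EXEC SQL
               SELECT COUNT(*) INTO :WS-EMP-COUNT
               FROM EMPLOYEE
           END-EXEC.
           DISPLAY "ROWS IN EMPLOYEE: " WS-EMP-COUNT.
           EXEC SQL CONNECT RESET END-EXEC.
           STOP RUN.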

PostgreSQL error logging options not recognised

As noted here, I'm trying to use the PostgreSQL COPY ERROR_LOGGING and ERROR_LOGGING_SKIP_BAD_ROWS options.
My SQL looks like this:
COPY users FROM 'C:\Users\admin\osmosis_temp\users.txt'
(ERROR_LOGGING, ERROR_LOGGING_SKIP_BAD_ROWS);
The command fails with: ERROR: option "error_logging" not recognized. Am I missing something that turns on error logging in the first place?
PostgreSQL 9.3
The top of the wiki page you've linked to reads:
Error logging in COPY was a proposed feature developed by Aster Data against the PostgreSQL 9.0 code base. It was submitted and reviewed (1) but not accepted into the core product for that or any other version so far.
So that was only a concept: the options were never part of any released PostgreSQL version (including 9.3), which is why COPY does not recognize them.