Can DB2REMOTE be used to point to a file on another server?

Using the script below, I was able to load data into the table from local files.
db2 load from SOME/LOCAL/File.txt of asc modified by reclen=123 method L \(1 11, 12 14\) REPLACE INTO schema.tablename
However, I want to load the file from another server. I don't want to transfer the files from that server to the Db2 server just so I can use the command above. I found in the documentation that DB2REMOTE can be used for remote files, but I'm not sure how to execute it successfully.
Do I also need to do this? I don't have the right IAM role or the credentials to do so. Can I just skip this and proceed to connect to the other server directly?
This is the script I'm trying with DB2REMOTE:
db2 load from 'DB2REMOTE://centos#123.456.789.0:/folders/directory/file.txt' of asc modified by reclen=123 method L \(1 11, 12 14\) REPLACE INTO schema.tablename
Thank you in advance!

DB2REMOTE is for accessing cloud object storage (e.g. Amazon S3, IBM Cloud Object Storage) from some Db2 commands.
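For completeness, here is a rough sketch of how DB2REMOTE is typically used against object storage on a Db2 11.1+ server; the alias, bucket, and credentials below are placeholders, and this will not help with a plain remote Linux host:
# Hypothetical alias/bucket/credentials -- a sketch, not verified against your environment.
db2 "CATALOG STORAGE ACCESS ALIAS mys3 VENDOR S3 SERVER s3.amazonaws.com USER <access-key-id> PASSWORD <secret-access-key> CONTAINER my-bucket"
# The alias is then used in place of a local path:
db2 "LOAD FROM DB2REMOTE://mys3//folders/directory/file.txt OF ASC MODIFIED BY reclen=123 METHOD L (1 11, 12 14) REPLACE INTO schema.tablename"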
If you are not using cloud object storage, then mount the remote directory locally with appropriate permissions, and specify the local mountpoint in the Db2 load command.
You can remote-mount with SSHFS or similar, once it is installed and properly configured. This is not programming; it is administration and configuration.
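As a rough sketch, assuming SSHFS is installed on the Db2 server and the host, user, and paths below are placeholders:
# Mount the remote directory read-only; the Db2 instance owner must be able to read it.
sshfs centos@remotehost:/folders/directory /mnt/remote_data -o ro,allow_other
# Then load from the local mountpoint as usual:
db2 "LOAD FROM /mnt/remote_data/file.txt OF ASC MODIFIED BY reclen=123 METHOD L (1 11, 12 14) REPLACE INTO schema.tablename"
# Unmount when done:
fusermount -u /mnt/remote_data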

Related

DB2 load (client is remote | file is on the db2 server)

I'm using a db2 client on Windows to connect to a Linux DB2 server.
I'm trying to upload data using my client, but the data is in the /tmp/ directory on the host server.
If I use LOAD FROM "/tmp/file.txt" OF .. it fails with message SQL2036N The path for the file, named pipe, or device "/tmp/file.txt" is not valid.
Is it possible to do this without running db2 connect from the server itself?
regards
Per the comment thread: the solution was to ensure that the Db2-instance owner has read access to the file on the server.
When you use load from, the specified file must reside on the Db2 server, and the Db2 instance owner (e.g. db2inst1) on the server must have read access to it, so double-check the permissions and ownership. If the file is on your workstation, use load client from instead.
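A quick sketch of both checks, assuming db2inst1 is the instance owner and the file names, paths, and format below are placeholders:
# On the server: confirm the instance owner can actually read the file.
ls -l /tmp/file.txt
sudo -u db2inst1 head -1 /tmp/file.txt
# From the Windows client: LOAD CLIENT reads the file from the client machine instead.
db2 load client from "C:\data\file.txt" of del insert into schema.tablename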

How to run Redshift copy command from EC2

I have my log files on EC2 instance and want to load it to Redshift. Two questions:
Do I have to copy this log file to S3 first, or can I copy it directly from my EBS volume?
I can see that I can use the COPY command from SQL Workbench or Data Pipeline, but can I run it from my EC2 instance itself? Which AWS CLI tool do I need to install?
http://docs.aws.amazon.com/cli/latest/reference/redshift/ does not list a COPY command.
Not really. Redshift allows you to copy from a remote host, which, in your case, would be your EC2 instance. Documentation here.
The link you've referred to covers cluster management commands. To run SQL queries on your cluster, you can use the psql tool. Documentation here.
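As a sketch of the remote-host approach, assuming psql is installed on the EC2 instance and the cluster endpoint, credentials, table, manifest location, and IAM role below are placeholders (COPY from a remote host still requires an SSH manifest file stored in S3):
# Run the COPY statement from the EC2 instance with psql.
psql -h mycluster.abc123.us-east-1.redshift.amazonaws.com -p 5439 -U admin -d mydb <<'SQL'
COPY my_log_table
FROM 's3://my-bucket/ssh_manifest.json'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
SSH;
SQL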
You can copy the data directly from EC2, but my recommendation is to save it to S3 first, which also gives you a backup.
All the documentation available online was confusing me. In the end, the solution was to write a simple Java class that opens a connection with DriverManager.getConnection() and runs the COPY command via stmt.executeUpdate(), and it worked seamlessly. The only catch was that executeUpdate() did not return the number of records inserted.

is there any way to create directory in data directory location of Amazon RDS PostgreSQL instance

I am able to connect to an AWS RDS PostgreSQL instance from another PostgreSQL client, but I am not able to see the data directory and configuration files. Is there any way to edit/view the data directory and configuration files?
If you want to work with the file system, use EC2 instances with Postgres installed and configured as you wish. On RDS, neither postgresql.conf nor pg_hba.conf can be edited directly on the file system.
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Appendix.PostgreSQL.CommonDBATasks.html#Appendix.PostgreSQL.CommonDBATasks.Parameters
Instead, use the Amazon-provided interface (DB parameter groups) to change supported parameters, or use the SET command where possible.
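For instance, a sketch using the AWS CLI and psql, with placeholder group, endpoint, and parameter names:
# Change a supported parameter in the instance's DB parameter group.
aws rds modify-db-parameter-group \
    --db-parameter-group-name my-postgres-params \
    --parameters "ParameterName=log_min_duration_statement,ParameterValue=500,ApplyMethod=immediate"
# Or set a session-level parameter from a client where the parameter allows it:
psql -h myinstance.abc123.us-east-1.rds.amazonaws.com -U myuser -d mydb -c "SET work_mem = '64MB';"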

Can db2 import or load be used to populate DashDB?

I'm looking to bulk load millions of rows into a DashDB database. After connecting using the DB2 CLI, I enter a command like:
db2 import from rowsToImport.csv of del insert into MY_TABLE
with results:
SQL0551N "DASHXXX" does not have the required authorization or privilege to
perform operation "BIND" on object "NULLID.SQLUAJ19". SQLSTATE=42501
Is this an inherent limitation with DashDB, or is something configured incorrectly on my client? I get a similar message when trying db2 load:
SQL2019N An error occurred while utilities were being bound to the database.
P.S. I'm aware of the REST client API for DashDB for loading data; I'm asking specifically how/if bulk loads can be done with the DB2 command line as an alternative option.
As per the dashDB documentation, you can use the command line processor plus (CLPPlus). It is included in the dashDB driver package and provides a command-line user interface that you can use to connect to the dashDB database, BLUDB. You can use CLPPlus to define, edit, and run statements, scripts, and commands. Please also take a look at Connecting CLPPlus to the dashDB database to see how to connect and use the CLI.
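As a sketch, a CLPPlus connection typically looks like the following, with placeholder host and user (dashDB listens on port 50000 for plain connections by default):
# Connect to the BLUDB database with CLPPlus; -nw runs it in non-window (console) mode.
clpplus -nw dashuser@dashdb-host.services.dal.bluemix.net:50000/BLUDB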
Please note that in CLPPlus the IMPORT, EXPORT, and LOAD commands have a restriction that the processed files must be on the server: see here. So you would have to copy the input load file onto the remote server first with SCP; however, the SSH/SCP protocols are blocked (not accessible) for a normal dashDB user.
Only geospatial data can be loaded from your local machine to dashDB, using the IDA LOADGEOSPATIALDATA command in CLPPlus.
The file to be loaded into dashDB using that command can be in the local file system, accessible to the CLPPlus user.
Alternative ways to do that are:
dashDB REST API (as you already mentioned; see the curl sketch after this list). See Load delimited data using the REST API and cURL.
load the csv directly from the dashDB dashboard on Bluemix. See Loading data from the desktop into IBM dashDB.
load the csv using IBM Data Studio. See dashDB large file load using IBM Data Studio.
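For the REST API route, here is a rough, unverified sketch with placeholder host, credentials, schema, and table; the exact endpoint form is the one described in the linked cURL article:
# Hypothetical host/credentials; uploads and loads a local delimited file via the dashDB REST load endpoint.
curl -k -u dashuser:dashpassword \
    -F loadFile=@rowsToImport.csv \
    "https://dashdb-host.services.dal.bluemix.net:8443/dashdb-api/load/local/del/MYSCHEMA.MY_TABLE"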
According to this technote, the package NULLID.SQLUAJ19 belongs to one of the early DB2 10.1 fix packs, so I suspect your client version is 10.1. When the IMPORT command is executed, the client needs to bind some packages of that older version, since dashDB is DB2 10.5, obviously.
You may want to try installing the latest DB2 client fix pack, as the necessary packages may be already bound in the database.
To verify that, you could run:
select pkgname from syscat.packages where pkgschema = 'NULLID' and pkgname like 'SQLUA%'
You should see "SQLUAK20", which seems to be the corresponding package in DB2 10.5.
If that doesn't work, your other option might be to move to a dedicated dashDB instance, as you won't have sufficient privileges to bind missing packages in the entry-level shared dashDB service.

How to open Google Cloud SQL instance to see database

I have exported my Google Cloud SQL instance to Google Cloud Storage. I exported the file in compressed format (.gz) to a Cloud Storage bucket, then downloaded it to my system and extracted it using 7-Zip. How can I open it in MySQL Workbench to see the database and values? Its file type is shown as the instance name.
The exported data from Cloud SQL is similar to what you get from mysqldump. It's basically a series of SQL statements that, when you run them on another server, will execute all the commands needed to get from a clean state to the exported state.
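As a sketch, assuming a local MySQL server and that the extracted file is named export.sql (both names are placeholders):
# Create an empty database and replay the exported statements into it.
mysql -u root -p -e "CREATE DATABASE mydb;"
mysql -u root -p mydb < export.sql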
I'm not very familiar with MySQL Workbench, but from what I've read it allows you to manage your MySQL database, browsing tables and data. So you may need to upload your exported data to another MySQL server, for example a local one running on your computer.
Note that you could also connect directly from MySQL Workbench to your Cloud SQL instance by requesting an IP for your instance and authorizing the network that you'll connect from.
You can connect directly to your Cloud SQL instance. All you need to do is whitelist your IP address and connect through MySQL Workbench as if it's a normal database instance.
You can whitelist your IP by:
Navigate to https://console.cloud.google.com/sql and select your project.
Go to the Connections tab and Add Network in the Public IP section.
Use the connection details on the Overview tab to connect.
Then you can browse your database through Workbench as if it was a local instance.
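For example, once your IP is whitelisted, a direct connection with the stock mysql client looks like the following sketch (placeholder IP and user; MySQL Workbench takes the same host/user/password details):
# Connect straight to the Cloud SQL instance's public IP.
mysql -h 203.0.113.10 -u root -p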