Copy data file from AWS S3 to Aurora Postgres - postgresql

Trying to copy a CSV file from AWS S3 to Aurora Postgres.
I added S3 access to the RDS cluster for S3 import.
Is there anything else I am missing?
This is the command I tried:
SELECT aws_s3.table_import_from_s3 ('t1','','DELIMITER '','' CSV HEADER',aws_commons.create_s3_uri('testing','test_1.csv','us-west-2'));
Error:
NOTICE: HINT: make sure your instance is able to connect with S3.
NOTICE: CURL error code: 28 when attempting to validate pre-signed URL, 0 attempt(s) remaining
NOTICE: HINT: make sure your instance is able to connect with S3.
ERROR: Unable to generate pre-signed url, look at engine log for details.
CONTEXT: SQL function "table_import_from_s3" statement 1
Can anyone help me with this, please?
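A rough sketch of the SQL side of the import is below, assuming the IAM role that grants s3:GetObject on the bucket has already been attached to the cluster with the s3Import feature, and that the DB subnets can actually reach S3 (CURL error 28 is a timeout, which usually points to missing outbound access, e.g. no NAT gateway or S3 VPC gateway endpoint). The table, bucket, and key names are the ones from the question:
-- Sketch only: assumes the s3Import role association and S3 network path are in place
CREATE EXTENSION IF NOT EXISTS aws_s3 CASCADE;  -- CASCADE also installs aws_commons

SELECT aws_s3.table_import_from_s3(
    't1',                              -- target table
    '',                                -- empty column list = all table columns
    'DELIMITER '','' CSV HEADER',      -- COPY options, as in the question
    aws_commons.create_s3_uri('testing', 'test_1.csv', 'us-west-2')
);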

Related

Export Amazon RDS into S3 or locally

I am using Amazon RDS Aurora PostgreSQL 10.18. I need to export specific tables with more than 50,000 rows into a CSV file (either locally or into an S3 bucket). I have tried several approaches, but all of them failed:
I tried the export-to-CSV button in the query editor after selecting all rows, but the API responded that the data was too large to return.
I tried to use aws_s3.query_export_to_s3, but got ERROR: credentials stored with the database cluster can't be accessed (Hint: Has the IAM role Amazon Resource Name (ARN) been associated with the feature-name "s3Export"?) — see the sketch after this list.
I tried to take a snapshot of our instance and then export it into an S3 bucket, but ended up with the error "The specified db snapshot engine mode isn't supported and can't be exported".
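For the second attempt, a minimal sketch of what the export call can look like once an IAM role with write access to the bucket has been associated with the cluster using the s3Export feature (the error hint points at exactly that missing association). The table, bucket, and file names below are made up:
-- Sketch only: my_table, my-export-bucket, and the region are placeholders
SELECT * FROM aws_s3.query_export_to_s3(
    'SELECT * FROM my_table',                                          -- query whose result is exported
    aws_commons.create_s3_uri('my-export-bucket', 'my_table.csv', 'us-west-2'),
    options := 'format csv, header true'                               -- COPY options for the output file
);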

Importing CSV file from GCS to Postgres Cloud SQL instance invalid input syntax error

When importing a CSV file from Cloud Storage into a Cloud SQL Postgres instance using Cloud Composer (Airflow), I would like to remove the header or skip rows automatically (in my DAG operator: CloudSQLImportInstanceOperator), but I keep getting an error. It seems CloudSQLImportInstanceOperator doesn't support skipping rows. How can I resolve this issue?
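One possible workaround, assuming the invalid-input-syntax error comes from the header row being loaded into typed columns: since the operator itself cannot skip rows, import into an all-text staging table, delete the header row afterwards, then cast the data into the real table. All table and column names below are placeholders:
-- Staging table with text columns so the header row loads without a cast error
CREATE TABLE staging_raw (id text, amount text, created_at text);

-- ... run CloudSQLImportInstanceOperator against staging_raw here ...

DELETE FROM staging_raw WHERE id = 'id';        -- drop the row holding the header labels

INSERT INTO target_table (id, amount, created_at)
SELECT id::bigint, amount::numeric, created_at::date
FROM staging_raw;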

Creating Postgres table on AWS RDS using CSV file

I'm having an issue creating a table on my Postgres DB on AWS RDS by importing raw CSV data. Here are the steps I have already done:
The CSV file has been uploaded to my S3 bucket.
Followed AWS's tutorial to give RDS permission to import data from S3.
Created an empty table in Postgres.
Tried using pgAdmin's import feature to load the local CSV file into the table, but it kept giving me an error.
So I'm using this query below to import the data into the table:
SELECT aws_s3.table_import_from_s3(
'public.bayarea_property_data',
'',
'(FORMAT CSV, HEADER true)',
'cottage-prop-data',
'clean_ta_file_edit.csv',
'us-west-1'
);
However, I keep getting this message:
ERROR: extra data after last expected column
CONTEXT: COPY bayarea_property_data, line 2: ",2009.0,2009.0,0.0,,0,2019,13061.0,,0,0.0,0.0,,2019,0.0,6767.0,576040,172810,403230,70,1,,1.0,,6081,..."
SQL statement "copy public.bayarea_property_data from '/rdsdbdata/extensions/aws_s3/amazon-s3-fifo-6261-20200819T083314Z-0' with (FORMAT CSV, HEADER true)"
SQL state: 22P04
Can anyone help me with this? I'm an AWS noob and have been struggling over the past few days. Thanks!
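This error means COPY found more fields in the CSV than there are target columns; the leading comma in the CONTEXT line suggests the file starts with an unnamed index column that the table doesn't have. A sketch of the idea with a made-up three-column table: either recreate the table to match the file exactly, or pass an explicit column list (the second argument) naming table columns in the order the fields appear in the file. All column names below are placeholders:
-- Sketch only: demo_props and its columns are invented to illustrate the column mapping
CREATE TABLE demo_props (
    csv_row_index text,   -- placeholder for the unnamed leading column in the CSV
    year_built    text,
    tax_amount    text
);

SELECT aws_s3.table_import_from_s3(
    'demo_props',
    'csv_row_index, year_built, tax_amount',  -- table columns in the order the fields appear in the file
    '(FORMAT CSV, HEADER true)',
    'cottage-prop-data',
    'clean_ta_file_edit.csv',
    'us-west-1'
);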

Insert data into Redshift from Windows txt files

I have 50 txt files on Windows and I would like to insert their data into a single table on Redshift.
I created the basic table structure and now I'm having issues inserting the data. I tried using the COPY command from SQLWorkbench/J, but it didn't work out.
Here's the command:
copy feed
from 'F:\Data\feed\feed1.txt'
credentials 'aws_access_key_id=<access>;aws_secret_access_key=<key>'
Here's the error:
-----------------------------------------------
error: CREDENTIALS argument is not supported when loading from file system
code: 8001
context:
query: 0
location: xen_load_unload.cpp:333
process: padbmaster [pid=1970]
-----------------------------------------------;
Upon removing the Credentials argument, here's the error I get:
[Amazon](500310) Invalid operation: LOAD source is not supported. (Hint: only S3 or DynamoDB or EMR based load is allowed);
I'm not a UNIX user so I don't really know how this should be done. Any help in this regard would be appreciated.
@patthebug is correct in that Redshift cannot see your local Windows drive. You must push the data into an S3 bucket. There are some additional load sources you can use per http://docs.aws.amazon.com/redshift/latest/dg/t_Loading_tables_with_the_COPY_command.html, but they seem outside the context you're working with. I suggest you get a copy of Cloudberry Explorer (http://www.cloudberrylab.com/free-amazon-s3-explorer-cloudfront-IAM.aspx), which you can use to copy those files up to S3.
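Once the files are in S3, the load itself can look roughly like the sketch below, assuming a made-up bucket named my-feed-bucket and the same placeholder credentials as in the question (an IAM role attached to the cluster works too). Loading from a key prefix picks up all 50 files in one command:
-- Sketch only: bucket name and delimiter are assumptions, credentials are the question's placeholders
copy feed
from 's3://my-feed-bucket/feed/'      -- key prefix: matches feed1.txt, feed2.txt, ...
credentials 'aws_access_key_id=<access>;aws_secret_access_key=<key>'
delimiter '\t';                       -- adjust to whatever separates fields in the txt files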

SQL Database + LOAD + CLOB files = error SQL3229W

I'm having trouble loading tables that have CLOB and BLOB columns into a 'SQL Database' database on Bluemix.
The error returned is:
SQL3229W The field value in row "617" and column "3" is invalid. The row was
rejected. Reason code: "1".
SQL3185W The previous error occurred while processing data from row "617" of
the input file.
The same procedure performed in a local environment worked normally.
Here is the command I use to load:
load client from /home/db2inst1/ODONTO/tmp/ODONTO.ANAMNESE.IXF OF IXF LOBS FROM /home/db2inst1/ODONTO/tmp MODIFIED BY IDENTITYOVERRIDE replace into USER12135.TESTE NONRECOVERABLE
Currently, the only way you can upload LOB files to SQLDB or dashDB is to load the data and LOBs from the cloud. The options are to pull the data from Swift object storage in SoftLayer or from Amazon S3 storage. You need an account on one of those services.
After that, you can use the following syntax:
db2 "call sysproc.admin_cmd('load from Softlayer::softlayer_end_point::softlayer_username::softlayer_api_key::softlayer_container_name::mylobs/blob.del of del LOBS FROM Softlayer::softlayer_end_point::softlayer_username::softlayer_api_key::softlayer_container_name::mylobs/ messages on server insert into LOBLOAD')"
Where:
mylobs/ is the directory inside the SoftLayer Swift object storage container, given in the LOBS FROM clause
LOBLOAD is the name of the table to be loaded into
Example:
db2 "call sysproc.admin_cmd('load from Softlayer::https://lon02.objectstorage.softlayer.net/auth/v1.0::SLOS424907-2:SL523907::0ac631wewqewre8af20c576ad5214ec70f163d600d247bd5d4dfef5453f72ff6::TestContainer::mylobs/blob.del of del LOBS FROM Softlayer::https://lon02.objectstorage.softlayer.net/auth/v1.0::SLOS424907-2:SL523907::0ac631wewqewre8af20c576ad5214ec70f163d600d247bd5d4dfef5453f72ff6::TestContainer::mylobs/ messages on server insert into LOBLOAD')"