DB2 restore from encrypted backup - db2

I am trying to restore a DB2 database using an encrypted backup file. The backup zip file contains an .lst file, a .ddl file, over 3000 .ixf files, the same number of message files, and a folder with a few .lob files in it.
I have tried running bind # list_file grant public after placing the .lst file and .ixf files in the /bind directory, but the error said the .ixf files could not be opened.
Any help appreciated.

What you have is not a backup (encrypted or otherwise) but the output of a db2move export run. Read the db2move documentation to learn how to perform the opposite operation.
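A minimal sketch of that opposite operation, assuming the archive really did come from db2move export and the list file is the standard db2move.lst (the database name, DDL file name, and paths below are placeholders):

unzip backup.zip -d /tmp/dbmove && cd /tmp/dbmove   # directory holding the .lst, .ixf and message files
db2 create database MYDB
db2 connect to MYDB
db2 -tvf mydb.ddl        # recreate the schema from the exported DDL first
db2move MYDB import      # reads db2move.lst and imports the .ixf files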

Related

cannot zip locked Access file

I am writing a PowerShell script which creates a zip file of a local folder:
[System.IO.Compression.ZipFile]::CreateFromDirectory('c:\myfolder\', 'c:\myarchive.zip', [System.IO.Compression.CompressionLevel]::Fastest, $true)
This folder contains an MS Access database, which is open at the same time by another user. I cannot ask him to close this database.
The zip operation fails because the database is locked. Is there a way to bypass this lock and make a copy of the database?
Thanks a lot
Copy the folder to a temporary place and zip the copy.
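A hedged PowerShell sketch of that approach (paths are placeholders; it assumes the locked file can still be read for copying, which is usually the case for an Access database opened in shared mode):

$src = 'c:\myfolder'
$tmp = Join-Path $env:TEMP 'myfolder-copy'
Copy-Item -Path $src -Destination $tmp -Recurse -Force   # reading the files does not need exclusive access
Add-Type -AssemblyName System.IO.Compression.FileSystem
[System.IO.Compression.ZipFile]::CreateFromDirectory($tmp, 'c:\myarchive.zip', [System.IO.Compression.CompressionLevel]::Fastest, $true)
Remove-Item $tmp -Recurse -Force                         # clean up the temporary copy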

PostgreSQL unable to read csv files on my desktop

I am trying to import a CSV file into PostgreSQL, but I keep getting an error that there is no such file or directory.
This is the command I execute:
copy mu_data from 'users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV HEADER;
Can anyone suggest how to fix this?
copy is a command run on the server side, so unless your Postgres server happens to be on your localhost, the file very likely doesn't exist from the server's point of view.
One solution is to transfer the file to the server's filesystem somehow. Alternatively, if you're using the psql command-line tool (or at least can use it for this task), you can use the \copy command there, which reads the file on the client side.
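A minimal \copy sketch, assuming the file sits on your own desktop and the client machine is a Mac (the absolute path is a guess; adjust it to wherever FILE.CSV actually lives):

\copy mu_data from '/Users/mysurname/Desktop/FILE.CSV' delimiter ',' csv header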

How to restore .sql file to MS SQL 2008, I do not have a .bak file

How can I restore a .sql file to SQL Server 2008? I do not have a .bak file.
I have searched everywhere but can't find a solution.
Please help if anyone knows.
.sql files are typically run using SQL Server Management Studio. They are basically saved SQL statements, so they could contain anything. You don't "import" them; more precisely, you "execute" them, even though the script may indeed insert data.
To restore a database you need a backup, i.e. a .bak file. If you don't have one, either create it or accept that there is no way to restore your database: a .sql file is a script, not a backup.
EDIT:
As commented by the OP, the solution that helped him is:
C:\Users\Administrator>sqlcmd -S . -E -i C:\Users\Administrator\Desktop\scr.sql
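For reference: -S . connects to the local default SQL Server instance, -E uses Windows (trusted) authentication, and -i names the script file to execute.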

How to import a zipped file into a Postgres table

I would like to import a file into my PostgreSQL system (specifically Redshift). I have found an argument for copy that allows importing a gzip file, but the provider of the data I am trying to load only produces it as a .zip. Are there any built-in Postgres commands for opening a .zip?
From within Postgres:
COPY table_name FROM PROGRAM 'unzip -p input.csv.zip' DELIMITER ',';
From the man page for unzip -p:
-p  extract files to pipe (stdout). Nothing but the file data is sent to stdout, and the files are always extracted in binary format, just as they are stored (no conversions).
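Note that COPY ... FROM PROGRAM runs the command on the database server, so the zip file must live on the server's filesystem and the database user needs the right to execute server-side programs (superuser on older PostgreSQL versions). It is also a plain-PostgreSQL feature; Redshift's COPY does not support FROM PROGRAM.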
Can you just do something like
unzip -p myfile.zip | gzip > myfile.gz
and then load the resulting gzip file? Easy enough to automate if you have enough files.
This might only work when loading Redshift from S3, but you can actually just include a gzip flag when copying data to Redshift tables.
This is the format that works for me if my S3 bucket contains a gzipped .csv:
copy <table> from 's3://mybucket/<foldername>' credentials '<aws-auth-args>' delimiter ',' gzip;
unzip -p /path/to/.zip | psql -U user
This assumes the zip contains a SQL dump. The user must have superuser rights, or any server-side COPY in the dump will fail with:
ERROR: must be superuser to COPY to or from a file
To learn more about this see
https://www.postgresql.org/docs/8.0/static/backup.html
This approach is mainly used when handling large databases.

Using COPY FROM in postgres - absolute filename of local file

I'm trying to import a CSV file using the COPY FROM command with Postgres.
The database is on a Linux server, and my data is stored locally, i.e. C:\test.csv.
I keep getting the error:
ERROR: could not open file "C:\test.csv" for reading: No such file or directory
SQL state: 58P01
I know that I need to use the absolute path for the filename that the server can see, but everything I try brings up the same error
Can anyone help please?
Thanks
Quote from the PostgreSQL manual:
The file must be accessible to the server and the name must be specified from the viewpoint of the server
So you need to copy the file to the server before you can use COPY FROM.
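One hedged way to do that, assuming an scp client is available on the Windows machine (hostnames, credentials, paths, and the table name are placeholders):

scp /c/test.csv user@dbserver:/tmp/test.csv
psql -h dbserver -U user -d mydb -c "COPY mytable FROM '/tmp/test.csv' DELIMITER ',' CSV HEADER;"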
If you don't have access to the server, you can use psql's \copy command which is very similar to COPY FROM but works with local files. See the manual for details.
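A minimal \copy sketch run from psql on the Windows machine itself (the table name is a placeholder):

\copy mytable from 'C:\test.csv' delimiter ',' csv header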