Where are log files stored in DB2?
I am searching for a file named Updatedb20100604182008.log.
from this page:
http://www.ibm.com/developerworks/data/library/techarticle/0301kline/0301kline.html
(The article goes into further detail about default locations as well.)
The database logs are initially created in a directory called SQLOGDIR, a sub-directory of the database directory. You can change the location where active logs and future archive logs are placed by changing the value for this configuration parameter to point to either a different directory or to a device. Archive logs that are currently stored in the database log path directory are not moved to the new location if the database is configured for roll-forward recovery.
Because you can change the log path location, the logs needed for roll-forward recovery may exist in different directories or on different devices. You can change this configuration parameter during the roll-forward process to allow you to access logs in multiple locations.
The change to the value of newlogpath will not be applied until the database is in a consistent state. An informational database configuration parameter, database_consistent, indicates the status of the database.
Note: The database manager writes to transaction logs one at a time. The total size of transactions that can be active is limited by the database configuration parameters.
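For example, a minimal sketch of moving the active log location described above would be to update the NEWLOGPATH configuration parameter and then re-check the log path (db_name and the new path here are placeholders, not values from the question):
db2 update db cfg for db_name using NEWLOGPATH /new/log/path
db2 get db cfg for db_name | grep 'Path to log files'
As the quoted passage notes, the new value is not applied until the database is in a consistent state, so 'Path to log files' may keep showing the old location until then.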
The DB2 log file location can be found from the DB CFG parameter 'Path to log files'.
The command below works without an explicit connection to the DB:
db2 get db cfg for db_name | grep 'Path to log files'
Alternatively, you can connect to the DB first and run:
db2 connect to db_name
db2 get db cfg | grep 'Path to log files'
db2 terminate
db2 connect to database
db2 get db cfg | grep -i log
cd /data/dblogs/NODE0000    (path to the log files)
cd LOGSTREAM0000            (this is the log folder)
ls -altr                    (lists all the log files with the .log extension)
rm abc.log                  (give the name of the log file you want to delete)
Related
I've been given a project to extract data from a PostgreSQL database. I've no previous experience with PostgreSQL but the project I have is to bug fix existing code, so all the logic to connect to the engine and get data is already in place.
The problem is that the database has been given to me as the folders and files straight from the source HDD, not as a backup (and a backup isn't going to happen, so "get the customer to give you a backup instead" isn't an option here).
The folders also contained the actual PostgreSQL binaries, so I looked at the version (9.4.14), downloaded the nearest one (9.4.18) from the PostgreSQL site, and installed it. Now all I have to do is somehow get it to look at the data files I was given.
I tried the obvious approach of copying the contents of the data folder into the installed data folder, but after that the PostgreSQL service won't start.
I did find a option in the conf file:
#data_directory = 'ConfigDir'
I changed this to:
data_directory = 'C:\customer\data'
But again the service won't start after this.
The data directory used by the service is defined on the service's command line, which overrides any setting in postgresql.conf.
You need to re-create the service in order to change the data directory, e.g.:
Remove the service:
pg_ctl unregister -N postgresql-9.1
postgresql-9.1 is the "real" name of the service, not the "Display Name". You can see that in the properties of the service inside the "services" app.
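If you prefer the command line to the services app, a quick sketch for finding the real service name (assuming the name contains "postgres"; sc and findstr are standard Windows tools) is:
sc query state= all | findstr /i postgres
The SERVICE_NAME lines in the output are the names pg_ctl expects.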
Then re-create the service with the correct data directory:
pg_ctl register -D c:\customer\data -N postgresql-9.1
Another way of "debugging" startup errors on Windows is to start Postgres from the command line (not through the service), because some errors during startup are not written to the Postgres logfile but are displayed on the command line. You can do that with e.g.:
pg_ctl start -D c:\customer\data
If the bin directory is not in your PATH you need to specify the full path to it on the command line, e.g.: c:\Postgres9.1\bin\pg_ctl
When I restore a database, by default the data goes to the C: drive, but when I installed DB2 I specified a path on the D: drive only.
Also, the sample database files created by DB2 are stored on the D: drive.
Can anyone please tell me what the issue is?
I have run this command:
SELECT * FROM SYSIBMADM.DBPATHS
Below is the result I fetched:
LOGPATH - D:\DB2\NODE000\SQL00001\SQLOGDIR\
DB_STORAGE_PATH - C:\
LOCAL_DB_DIRECTORY - D:\DB2\NODE000\SQLOGDIR\
DBPATH - D:\DB2\NODE000\SQL00001\
I want to change DB_STORAGE_PATH from C:\ to D:\ for all the databases I will be restoring.
You can run db2set from the DB2 command line; it will confirm the path where DB2 is installed, along with other information:
db2-command-line> db2set
DB2_ATS_ENABLE=YES
DB2_CREATE_DB_ON_PATH=YES
DB2INSTPROF=C:\where\db2\installed\IBM\DB2\DB2COPY1
DB2COMM=TCPIP
You can get more information in "Directory structure for your installed DB2 database product (Windows)" in the IBM documentation.
You can also run SELECT * FROM SYSIBMADM.DBPATHS. This will show the values of the following paths for your installed DB2 database:
LOGPATH
DB_STORAGE_PATH
LOCAL_DB_DIRECTORY
DBPATH
These commands will provide you enough information to locate your installed database. Then you can restore your database providing the exact path.
To add a storage path to an existing database, issue the following ALTER DATABASE statement:
ALTER DATABASE database-name ADD STORAGE ON storage-path
After adding one or more storage paths to the database, you may use the ALTER TABLESPACE statement to rebalance table spaces in the database so that they start to use the new storage paths immediately.
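As a minimal sketch of that sequence (the D:\ storage path and the USERSPACE1 table space name are placeholders; REBALANCE applies to automatic storage table spaces):
db2 connect to db_name
db2 "ALTER DATABASE ADD STORAGE ON 'D:\'"
db2 "ALTER TABLESPACE USERSPACE1 REBALANCE"
db2 terminate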
DB2 has a configuration parameter for the default path for databases, dftdbpath. In addition, the command db2sampl to create a sample database has an option dbpath to specify where to place that database.
db2sampl -dbpath D:
The above would place the new database on drive D:.
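Since this question is about Windows drives, a rough sketch of checking and changing dftdbpath from a DB2 command window might look like this (the D: drive is just an example; findstr is used because grep is not available on plain Windows):
db2 get dbm cfg | findstr /i dftdbpath
db2 update dbm cfg using DFTDBPATH D:
New databases created after that should then default to the D: drive.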
You will find that there are default paths for certain operations. The overview of DB2 database manager configuration parameters lists most of them.
For your specific issue I would assume that a parameter was changed some time after DB2 was installed and used initially.
For RESTORE be aware that the options TO and DBPATH are ignored if restoring an existing database.
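To illustrate that last point, here is a hedged sketch of a restore into a database that does not exist yet, with both the storage paths and the database path placed on D: (the database name, backup directory, and timestamp are placeholders, not values from the question):
db2 restore db mydb from D:\backups taken at 20140101120000 on D:\ dbpath on D:\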
I apologize for the long post. I have a PostgreSQL 9.3 server running on an Amazon Linux AMI. I also have a compressed dump file from another server which I created using pg_dumpall. Now, I want to restore the data from this dump file in my Postgres. However, I want to load this data into a specific location (say /data).
I'm having a fresh installation of Postgres. So when I tried to do a:
sudo service postgresql93 start
I got an error message asking me to initialize the db. So I did a:
sudo service postgresql initdb
which created the required files in /var/lib/pgsql93/data. After that, I changed the 'data_directory' configuration in /var/lib/pgsql93/data/postgresql.conf and pointed it to /data (I had to do this as root user. I couldn't open the file as the default user).
Now when I try to do a
sudo service postgresql93 start
it fails to start, and when I check the /var/lib/pgsql93/pg_startup.log file, it says:
FATAL: "/data/postgresql" is not a valid data directory
DETAIL: File "/data/postgresql/PG_VERSION" is missing.
So I copied the files from the default (/var/lib/pgsql9.3/data) to /data, changed the permissions to 700 and owner to postgres.
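In shell terms, that step amounts to something like this (paths as given earlier in the question; run as root; the postgres group is an assumption):
cp -a /var/lib/pgsql93/data/. /data/    # copy the freshly initialized cluster
chown -R postgres:postgres /data        # owner (and, assumed here, group) postgres
chmod 700 /data                         # PostgreSQL requires the data directory to be 0700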
However, when I try to start the service again, it still fails, and in the pgstartup.log, it only says:
LOG: redirecting log output to logging collector process
HINT: Future log output will appear in directory "pg_log".
And when I check the log in /data/pg_log, it says:
LOG: database system was shut down at 2014-12-30 21:31:18 UTC
LOG: database system is ready to accept connections
LOG: autovacuum launcher started
What else could be the problem? I haven't restored the data yet. I just have the files which were created by the initdb command.
@BMW http://www.linuxquestions.org/questions/linux-server-73/change-postgresql-data-directory-649911/ is exactly what I was looking for. Thanks.
I am using command
db2 restore db S18 from /users/intadm/s18backup/ taken at 20110913113341 on /users/db2inst1/ dbpath on /users/db2inst1/ redirect without rolling forward
to restore database from backup file located in /users/intadm/s18backup/ .
Command execution gives such output:
SQL1277W A redirected restore operation is being performed. Table space
configuration can now be viewed and table spaces that do not use automatic
storage can have their containers reconfigured.
DB20000I The RESTORE DATABASE command completed successfully.
When I try to connect to the restored DB (by executing 'db2 connect to S18'), I get this message:
SQL0752N Connecting to a database is not permitted within a logical unit of
work when the CONNECT type 1 setting is in use. SQLSTATE=0A001
When I try to connect to the DB with a DB viewer like SQuirreL, the error is:
DB2 SQL Error: SQLCODE=-1119, SQLSTATE=57019, SQLERRMC=S18, DRIVER=3.57.82
which means 'an error occurred during a restore function or a restore is still in progress' (from the IBM DB2 manuals).
How can I resolve this and connect to restored database?
UPD: I've executed db2ckbkp on the backup file and it did not identify any issues with the backup file itself.
without rolling forward can only be used when restoring from an offline backup. Was your backup taken offline? If not, you'll need to use roll forward.
When you do a redirected restore, you are telling DB2 that you want to change the locations of the data files in the database you are restoring.
The first step you show above will execute very quickly.
Normally, after you execute this statement, you would issue one or more SET TABLESPACE CONTAINERS statements to set the new locations of each data file. It's not mandatory to issue these statements, but there's no point in specifying the REDIRECT option in your RESTORE DATABASE command if you're not changing anything.
Then you would issue the RESTORE DATABASE S18 CONTINUE command, which actually reads the data from the backup image and writes it to the data files.
If you did not execute RESTORE DATABASE S18 CONTINUE, then your restore process is incomplete and it makes sense that you can't connect to the database.
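A minimal sketch of the whole redirected restore sequence, assuming a single SMS table space with ID 2 being moved (the table space ID and the new container path are placeholders, not values from the question):
db2 restore db S18 from /users/intadm/s18backup/ taken at 20110913113341 redirect
db2 "set tablespace containers for 2 using (path '/new/location/ts2')"
db2 restore db S18 continue
For DMS table spaces the containers would be specified with FILE or DEVICE and a size instead of PATH.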
What I did and what has worked:
Executed:
db2 restore db S18 from /users/intadm/s18backup/ taken at 20110913113341 on /<path with sufficient disk space> dbpath on /<path with sufficient disk space>
Before that, I got warnings that some table spaces were not moved. When I specified a dbpath on a partition with sufficient disk space, the warnings disappeared.
After that, as I have an online backup, I issued:
db2 rollforward db S18 to end of logs and complete
That's it! Now I'm able to connect.
How do I write a T-SQL backup database command to specify a file path containing spaces?
Here is what I have:
BACKUP DATABASE AMDMetrics TO DISK = 'C:\Documents and Settings\daultrd\My Documents\DatabaseBackups\AMD_METRICS.DAT'
And this is the error I get:
Msg 3201, Level 16, State 1, Line 1
Cannot open backup device 'C:\Documents and Settings\daultrd\My Documents\DatabaseBackups\AMD_METRICS.DAT'. Operating system error 3(The system cannot find the path specified.).
Msg 3013, Level 16, State 1, Line 1
BACKUP DATABASE is terminating abnormally.
Try sharing your intended destination folder and using a UNC path to back up from the server to your local machine.
BACKUP DATABASE AMDMetrics
TO DISK = '\\YourMachineName\SharedFolderName\AMD_METRICS.DAT'
This works for me; are you sure that the directory is correct?
backup database master to disk = 'c:\Test Me\master.bak'
Processed 41728 pages for database 'master', file 'master' on file 1.
Processed 5 pages for database 'master', file 'mastlog' on file 1.
BACKUP DATABASE successfully processed 41733 pages
in 22.911 seconds (14.230 MB/sec).
Copy and paste this into Explorer and see if you can get there: C:\Documents and Settings\daultrd\My Documents\DatabaseBackups
This of course has to be on the same machine; otherwise you need to map a drive to the location or use UNC paths.
I was working through this issue as well.
It's possible that the service account SQL Server is running under (Network Service by default) doesn't have permission on the specified folder.
BACKUP DATABASE master TO DISK = 'master1.bak' WITH INIT
The above should back up to the default backup folder.
If that works with no problem, then the permissions issue described above is likely the cause.
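If it does turn out to be a permissions problem, one possible fix (assuming the service really does run as NETWORK SERVICE, and using the folder from the question; icacls ships with Windows Vista/Server 2003 SP2 and later) is to grant that account modify rights on the backup folder:
icacls "C:\Documents and Settings\daultrd\My Documents\DatabaseBackups" /grant "NT AUTHORITY\NETWORK SERVICE":(OI)(CI)M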