Instant logging and file generation in Db2 - db2

I am currently using the Community edition on a Linux server and have configured the db2audit facility, which generates audit files in the configured location. A user then has to manually run the db2audit archive command to archive the log files, run the db2audit extract command to extract the archived files into flat ASCII files, and then load those files into the corresponding tables.
Only then can we analyze the logs by querying the tables. This whole process requires a lot of manual intervention.
Question: Is there any configuration setting or utility that can generate log files automatically and in near real time, including the SQL statement event, host, session ID, timestamp, and so on?
I need an instant-level logging mechanism that produces flat files for any SQL execution or event triggered at the database level in Db2 on a Linux server.
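For reference, the manual sequence I run today looks roughly like this (the database name, file paths, and audit category are placeholders for my environment):

# archive the active audit log for the database
db2audit archive database SAMPLE

# extract the archived log into delimited ASCII files (one .del file per audit category)
db2audit extract delasc delimiter '|' category execute from files db2audit.db.SAMPLE.log.*

# load an extracted file into its audit table (EXECUTE category shown)
db2 connect to SAMPLE
db2 "LOAD FROM execute.del OF DEL MODIFIED BY DELPRIORITYCHAR INSERT INTO AUDIT.EXECUTE"

I could wrap these steps in a cron job, but I am hoping for a setting that writes the flat files continuously instead of having to schedule the archive/extract/load cycle.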

Related

Updating online Mongo Database from offline copy

I have a large Mongo database (5M documents). I edit the database from an offline application, so I store the database on my local computer. However, I want to be able to maintain an online copy of the database, so that my website can access it.
How can I update the online copy regularly, without having to upload multiple GBs of data every time?
Is there some way to "track changes" and upload only the diff, like in Git?
Following up on my comment:
Can't you store the commands you used on your offline DB, and then apply them on the online DB through a script running over SSH, for instance? Or, even better, upload a file with all the commands you ran on your offline database to your server and then execute them with a cron job or a bash script? (The only requirement would be for your databases to have the same starting point and the same state when you execute the script.)
I would recommend storing all the queries you execute on your offline database. To do this you have many options; I can think of the following: you can set the profiling level to log all your queries.
(Here is a more detailed thread on the matter: MongoDB logging all queries)
Then you would have to extract them somehow (grep?), or store them directly in another file on the fly as they are executed.
For uploading the script, it depends on what you would like to use, but I suppose you would need to do it during low-usage hours, and you could automate the task with a cron job and an SSH tunnel.
I guess it all depends on your constraints (security, downtime, etc.).
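As a rough sketch of the profiling idea (the database and host names are placeholders, and note that system.profile is a small capped collection by default, so it needs to be drained regularly or resized):

# enable profiling of all operations on the offline database
mongosh offline_db --eval 'db.setProfilingLevel(2)'

# later, dump the recorded operations and ship them to the server
mongosh offline_db --quiet --eval 'db.system.profile.find().forEach(op => printjson(op))' > ops.json
scp ops.json user@my-server:/tmp/ops.json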

Microsoft SSIS Package running successfully manually from Visual Studio and SSMS Integration Services Catalog, but not via SQL Server Agent

I have a complex SSIS package which detects the file extension of each file in a folder and loads the file into a SQL Server table; a For Each Loop Container iterates over the files in that folder location and loads each one into the table.
After each file is loaded into the SQL Server table, the package has a File System Task in the Control Flow; this File System Task first creates an archive folder and then moves each file into that archive folder.
I am using Environment Variables in the SSMS Integration Services Catalog, to map to the parameters in the SSIS package/project.
The entire process succeeds when I run the SSIS package manually from the SSMS Integration Services Catalog, but when I run it via a SQL Server Agent job, the data loading and the File System Task's folder creation succeed while the File System Task's file move does not. (The Agent job runs as the SQL Server Agent Service Account.)
I get the following error when I see the execution report in the Integration Services Catalog in the SSMS:
File System Task - Move Files: Error: An error occurred with the following error message: "Access to the path is denied."
While the SQL Server Agent is able to create a folder using a File System Task successfully, it is not able to
move the file into this new folder location.
Inside the SQL Server Agent History, I see this in the job step:
Execute as user: NT Service\SQLSERVERAGENT. Microsoft(R) SQL Server Execute Package Utility Version 14.0.2002.14 for 64-bit.
... Package execution on IS Server failed. Execution ID: 30449, Execution Status: 4.
I am not familiar with this kind of permission issue in SQL Server Agent. I have read about proxy settings and the like, but I was not able to make sense of them.
Is there a step-by-step solution you can provide to fix this issue?
The SQL Agent job executes the package using the SQL Agent service account. When you run the package manually, the package executes using the credentials you used to sign in. Most likely the SQL Agent service account does not have enough access to the directory, especially if it has just been created. Make sure the service account has "Full Control" of the directory the package is referencing. To test whether it is an access issue, log on to the server using the service account credentials and manually run the package from the SSIS catalog. If it fails for the same reason, you know you need to look at file system access for the service account.
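For example, if the package writes to D:\Landing\Archive (substitute the folder your package actually references), you could grant the service account access from an elevated command prompt on the server:

rem grant the SQL Agent service account Full Control on the folder and everything created beneath it
icacls "D:\Landing\Archive" /grant "NT Service\SQLSERVERAGENT":(OI)(CI)F /T

Granting the permission on the parent folder means the archive folder the package creates will inherit it.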

Publishing Reports on tableau via tabcmd

I wanted to publish Tableau reports via tabcmd commands and was able to do it successfully. One concern I have is connecting the .twbx file to a data source via tabcmd commands.
Following are the commands which I used.
Log in to the Tableau server:
tabcmd.exe login --server http://serverName --user "userName" --password "password" --site ""
Publishing Tableau reports to the Tableau server:
publish -c "E:\Tableau\ActualReportName.twbx" -n "new Report name.twbx" --project ProjectName --db-user "DBuserName" --db-password "DBpassword"
Although I have given my DB credentials while publishing the reports, I have nowhere mentioned the DB server name or the DB name from which the .twbx files would fetch their data.
I have multiple DBs using the same credentials. Is there any way in tabcmd to specify the DB server name and DB name from which the reports would fetch data?
Any help in this would be great!
Unless you have a pressing reason, I'd publish a .twb file instead of a .twbx file.
The first thing I'd look into is Tableau Server's support for publishing data sources that your published workbook can connect to via the Tableau server. That will allow you to embed your credentials in the shared data source and to update the workbooks and the connections in separate steps. That is especially useful if the data connection and the workbooks change at different tempos.
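If you go that route, the data source can be published with the same tabcmd verb you already use for the workbook (the file, project, and credential names below are placeholders, and the flags mirror your own publish command above):

tabcmd publish "E:\Tableau\SharedDataSource.tdsx" -n "SharedDataSource" --project ProjectName --db-user "DBuserName" --db-password "DBpassword"

The workbook is then pointed at that published data source in Tableau Desktop before it is published as a .twb.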
The unsupported hack is to have your script update the .twb file before publishing. It's just an XML file, and the info you want to change should be with the data connection details. If you go this route, standard disclaimers apply: save backups, don't modify the original, generate a revised version, and expect to have to tweak your script when Tableau versions change. Still, it's not too hard to make sense of their XML. You could probably do it with just a few lines of XSLT, but even a simple string replacement might be good enough.
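As a sketch of that string replacement (the server names are placeholders, and the exact XML attribute you need to change can vary between Tableau versions):

rem work on a copy of the workbook and swap the DB server name inside the XML
copy "E:\Tableau\ActualReportName.twb" "E:\Tableau\ActualReportName_revised.twb"
powershell -Command "(Get-Content 'E:\Tableau\ActualReportName_revised.twb') -replace 'old-db-server', 'new-db-server' | Set-Content 'E:\Tableau\ActualReportName_revised.twb'"

Then publish the revised copy with tabcmd as before.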
Still, I'd go with a shared data source over hacking the TWB internals in almost all cases.

Running an exe file from a SQL job

I'm trying to run an exe from a SQL job.
The DB is on the server, as is the exe file.
The exe is supposed to write to a log file.
Even though the SQL job succeeds, I see no change in the log file.
I've checked the exe locally, and it does work.
The job step runs as type CmdExec, and the command is:
\\ustlvint02\c\FixProjectsWhichFailedSync\FixProjectsWhichFailedSync.exe
(ustlvint02 is the server's name.)
The path is valid, since I tested it by running it from my computer (and there, the log isn't created either).
I'll appreciate any help you can offer.
Hadas
The account that SQL Server Agent runs under needs permissions to 1) run the exe in that location and 2) write to the log file location.
Find out which account is used by SQL Agent, then verify that this user has the proper execute and write permissions.
Look for the log file in %WINDIR%\System32 (for the 64-bit version of SQL Server) or in %WINDIR%\SysWOW64 (for the 32-bit version of SQL Server, because of WOW64 file system redirection), where %WINDIR% is the path to the folder where Windows is installed (typically C:\Windows). This destination does not depend on the account specified for the SQL Agent job. Any file your executable needs to write to or read from must either be specified with an absolute path, or be specified with a relative path and therefore be present in the aforementioned system folder.
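If you cannot change the executable to log to an absolute path, one workaround is to set the working directory in the CmdExec step itself (the share name below is taken from the question; adjust it to your environment):

rem map the UNC share to a temporary drive letter, run the exe from its own folder, then clean up
pushd \\ustlvint02\c\FixProjectsWhichFailedSync
FixProjectsWhichFailedSync.exe
popd

That way a relative log path resolves next to the executable instead of in the Agent's default working directory.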

Restore full external ESENT backup

I've written the code that creates full backups of my ESENT database, using the JetBeginExternalBackup API.
Following the MSDN guidelines, I backed up every file returned by JetGetAttachInfo and JetGetLogInfo.
I've made the backup, erased old database, and copied the backup data to the database folder.
The DB engine was unable to start; the JetInit error code is "JET_errMissingLogFile".
I've checked the backup, it only contains the database file, and "<inst>XXXXX.log" log files. It lacks the current log file (I'm using circular logging, BTW).
Is there any way to restore such backup?
I don't want to use the JetExternalRestore API because it's too complex: I don't need to restore to another location, I don't understand why there are 3 input folders instead of 2, and I don't know the values to supply in the genLow and genHigh arguments.
I do need external backups: the ESENT database is used by ASP.NET on a remote server, and I'm backing it up over the Internet.
Or, maybe there's a way to retrieve the name of the current log file, and I should just add it to the backup?
Thanks in advance!
P.S. I've got no permissions to spawn processes on my web server, so using eseutil.exe is not an option.
Unpack all backed up files to a single folder.
Take the name of your main database file, replace its extension with .pat, and create a zero-length file with that name, e.g. database.pat.
After this simple step, call the JetRestoreInstance API; it will restore the backup from that folder.
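Roughly, with placeholder folder and file names (the restore folder, backup location, and database name are made up):

rem unpack the backup set into a single restore folder
mkdir C:\EsentRestore
copy C:\Backup\*.* C:\EsentRestore\

rem create the zero-length patch file named after the main database file
type NUL > C:\EsentRestore\database.pat

rem then point JetRestoreInstance at C:\EsentRestore from your application code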