This might sound basic, but I want to know: when I connect to a schema, I want to run a couple of queries, so I want to save those queries in the worksheet of SQL Developer itself. How do I save them so that the next time I open the worksheet I can see the list of queries I want to run?
You can save them as reports:
View --> Reports --> User Defined Reports --> New Report
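For example, a query like the following (purely illustrative; any SELECT you want to reuse works the same way) can be pasted into the new report's SQL field, and it will then show up under User Defined Reports every time you connect:

    -- Example query to save as a user-defined report:
    -- lists the tables in the currently connected schema
    SELECT table_name, num_rows
    FROM   user_tables
    ORDER  BY table_name;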
In my company we have 1K+ Tableau workbooks, all using the same Vertica data source via a multiple-table connection or custom SQL. We often end up in a situation where reports stop working because the underlying data source was changed: a table was renamed, a field was removed, etc.
My question is: how can we proactively react to these changes?
Can we edit the source code of the Tableau workbooks to batch-replace deprecated query parts?
Or can we monitor which data tables each workbook uses, with or without parsing the workbook's source code, to create an alert system? (A rough sketch of the alert idea is below.)
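For the alert idea, something like this is what I have in mind on the Vertica side (just a sketch; it assumes our Vertica version exposes v_monitor.query_requests, and the column names may differ):

    -- Hypothetical alert query: list statements that failed recently,
    -- e.g. Tableau queries broken by a table rename or a dropped field
    SELECT user_name,
           start_timestamp,
           request              -- the SQL text that failed
    FROM   v_monitor.query_requests
    WHERE  NOT success
    AND    start_timestamp > CURRENT_TIMESTAMP - INTERVAL '1 day'
    ORDER  BY start_timestamp DESC;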
Thanks
We have many Talend jobs that transfer data from Oracle (tOracleInput) to Redshift (tRedshiftOutputBulkExec). I would like to store the result information in a DB table. For example:
Job name, start time, running time, rows loaded, successful or failed
I know that if I turn on log4j, most of that information can be derived from the log. However, saving it in a DB table will make it easier to check and report the results.
I'm most interested in the records loaded. I checked this link http://www.talendbyexample.com/talend-logs-and-errors-component-reference.html and the manual for tRedshiftOutputBulkExec. Neither of them gives me that information.
Will Talend Administration Center provide such a function? What is the best way to implement this?
Thanks,
Having looked at the URL you provided, I think tLogCatcher should give you what you need (minus the rows loaded, which you can get with a lookup).
I started with Talend Studio version 6.4.1. There you can set up "Stats & Logs" for a job. It can log to the console, to files, or to a database. When writing to a DB you set the JDBC parameters and the names of three tables:
Stats Table: stores start and end timestamps of the job
Logs Table: stores error messages
Meter Table: stores the count of rows for each monitored flow
They correspond to the components tStatCatcher, tLogCatcher, and tFlowMeterCatcher, where you can find the needed table schemas.
To make a flow monitored, select it, open the "Component" tab, and tick the "Monitor this connection" checkbox.
To see the logged values you can use the AMC (Activity Monitoring Console) in Studio or TAC, or query the tables directly, as sketched below.
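A rough sketch of such a query (the column names follow the default tStatCatcher/tFlowMeterCatcher schemas, and "jobstats"/"jobmeter" are placeholders for whatever table names you configured, so double-check against your version):

    -- Rows loaded per job run: join the stats table (job begin/end rows)
    -- with the meter table (row counts of each monitored flow)
    SELECT s.job,
           s.moment     AS ended_at,
           s.duration   AS duration_ms,
           m.label      AS monitored_flow,
           m.count      AS rows_loaded   -- "count" may need quoting in your DB
    FROM   jobstats s
    JOIN   jobmeter m ON m.pid = s.pid
    WHERE  s.message_type = 'end'
    ORDER  BY s.moment DESC;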
I have a SQL Server 2008 R2 instance with a database on it.
How can I find a certain query that was executed, and from what IP address it came?
I have tried to go through the transaction logs, but I can't understand anything there.
You should use SQL Server Profiler. It's usually installed by default - look in the SQL Server folder on the Start Menu. When you open it, start a new trace and select the database. In the Trace Properties dialog choose the TSQL template. This will then record all the queries running on the database, along with a whole lot of other stuff. It's not massively easy to track stuff down in here, but look for the BatchStarting events to find the SQL that gets run. Then you should run the procedure sp_who2 on the database so you can match up SPIDs in the profiler to logins.
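To get the IP address as well, you can combine the Profiler output with the connection DMVs instead of (or alongside) sp_who2. A sketch, assuming you have VIEW SERVER STATE permission:

    -- Current sessions with the client IP address and, where available,
    -- the statement being executed; session_id is the SPID from Profiler
    SELECT s.session_id,
           s.login_name,
           s.host_name,
           c.client_net_address,          -- the client's IP
           t.text AS current_sql
    FROM   sys.dm_exec_sessions s
    JOIN   sys.dm_exec_connections c ON c.session_id = s.session_id
    LEFT JOIN sys.dm_exec_requests r  ON r.session_id = s.session_id
    OUTER APPLY sys.dm_exec_sql_text(r.sql_handle) t
    WHERE  s.is_user_process = 1;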
Here's the situation: I receive reports written by a vendor, which are all developed on their own Oracle DB. Normally there is no issue in setting a new data source to our own Oracle DB, but this one report in particular is not playing nicely.
The report in question has 8 SQL Expressions, and a subreport with an additional 3 SQL Expressions (I mention this because I suspect it has something to do with the problem, but I'm not sure; it's almost as if CR is attempting to verify the SQL Expressions against the old DB). I'm able to update the data source of the subreport just fine, but when I try it with the main report, Crystal prompts me repeatedly for the login to the OLD DB where the report was developed (which I obviously do not have access to). The prompt is inescapable, and I have to terminate Crystal's process each time.
I've tried unchecking all report and database checking/verification options in CR to no avail. If anyone has any advice as to what I could try next, it would be greatly appreciated!
EDIT: Well, it looks like all I had to do was close the login window a BILLION times (OK, more like ~16; twice for each SQL Expression?). Leaving the question open, though, to see if there is any way to avoid having to go through this for future reports.
EDIT 2: Some more details. This is still happening with CR 2008 SP3 attempting to connect to an Oracle 11g database with the 11g R2 client. I'm not sure how these reports were developed, but it was with CR XI at the earliest.
I have seen this with migrated reports before, but it was long ago.
If you had to do it a million times, I would have guessed that you actually had 999,999 subreports, all pointing to the same old data source, all needing verification (or failure) before trying the new data source. Sorry, I just re-read... I meant to put 'a BILLION' minus 1.
Did you check your TNSNames entries against whatever config the vendor supplied?
Are you using the same driver used by the vendor's Oracle reports? (Oracle driver vs. MS ODBC for Oracle vs. CR ODBC for Oracle vs. MyPrettyPony ODBC Dri...)
Did you go through the Set DataSource exercise in CR?
Can you save the subreports separately and run them individually without needing the (insert some large number) login window closures?
Can you create a new report, set to your own Oracle data source? (I have to assume this one is ok, based on your comments)
If you copy the Show SQL Query output and use that as a Command query in a new report, does that run? (Rinse and repeat for the subreports.)
(I'm stalling for time as I search my memory for the last time I experienced the same...)
I'm using the table tool of JasperReports, but when I run the report using iReport, it shows one fewer record for every SQL query! If anyone has used it and faced the same problem, please show me a way!