How to execute an osquery query every 5 minutes

I'm very new to osquery and I'd like to execute a query (e.g. SELECT * FROM last) every 5 minutes. Is there a way to define a script which executes this routine from a crontab or something similar?
It would probably be enough to execute the script with the query as a parameter, but there is nothing about this in the documentation, so I guess it isn't supported yet.
I checked the community and also the FAQ but haven't found anything relating to my problem.
osquery is on the latest version (1.7.3), self-compiled, running on 64-bit Ubuntu Server 15.10.
If you need more information to help me, just let me know.

The recommended method is to use scheduled queries. You create a 'pack' like the examples on GitHub, which defines the queries and their frequencies, then update the osqueryd config to include the pack.
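For illustration, a minimal pack might look like this (the pack name, query name, and file paths below are placeholders, not anything prescribed by osquery):

{
  "queries": {
    "last_logins": {
      "query": "SELECT * FROM last;",
      "interval": 300
    }
  }
}

and the osqueryd config would then reference it:

{
  "packs": {
    "my_pack": "/etc/osquery/packs/my_pack.json"
  }
}

The interval is in seconds, so 300 gives the 5-minute cadence you asked about.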

After reading more documentation and various sites, I found a pretty cool snippet which allows you to pass the query as a parameter when calling the osqueryi process.
/path/to/osqueryi --json "YOUR QUERY"
This returns the result in your terminal in JSON format, so it's pretty easy to write a script (in any language) that executes the snippet above and parses the content. That script can be run from cron as well; see the sketch below.
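For example, a minimal wrapper might look like this (it assumes osqueryi is on the PATH and that jq is installed for the parsing step; both are assumptions, adjust to your setup):

#!/bin/sh
# Run the query and pull a single field out of the JSON result
osqueryi --json "SELECT * FROM last;" | jq -r '.[].username'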

Maybe you could write a script (or a C program) to perform your query, and then use cron to run your program every 5 minutes.
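For example, a crontab entry along these lines would run it every 5 minutes (the script path and log file are placeholders):

*/5 * * * * /path/to/yourscript.sh >> /var/log/last_query.log 2>&1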

Related

SAS Viya - Environment Manager: Job triggers

I am currently looking into SAS Viya 3.4 to replace SAS 9.4.
Now I was curious to see the possibilities of the Environment Manager for scheduling jobs and maintaining and creating job flows. However, I noticed that I could only drag and drop jobs into a flow and connect them, with very few configurable options. Also, as a trigger to start a job flow I was only able to select a time event. I am wondering if there are other trigger types to choose from, e.g. a job being triggered if a specific table exists or a file exists [or ...]. Neither did I see the possibility to trigger/start a job based on the return code of the previous job.
Also it does not seem to be smart enough to make sure two jobs don't access a library with write access at the same time.
I can't see how SAS Viya could replace a job orchestration tool, yet I feel like the tool was built to replace one. Did I miss something, or is it just not possible to do this with the Environment Manager in SAS Viya?
Any help/insights are highly appreciated. I already searched through the documentation but could not find anything. Maybe I was just looking in the wrong place?
Why 3.4 and not 3.5 (or Viya 4)?
If you want to use Viya with your own Job Orchestration software you can consider this tool (built by my team): https://cli.sasjs.io/job/
We deployed it on Jenkins for this customer: https://www.sas.com/en_us/news/press-releases/2021/july/sas-partnership-with-lloyds-list-intelligence.html
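For what it's worth, invoking a Viya job from an external scheduler with that CLI looks roughly like this (the job path and target name are placeholders, and the exact flags should be checked against the sasjs docs linked above):

sasjs job execute /Public/jobs/load_sales --target myViyaServer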

Exporting individual Cognos Reports via command line

I'm trying to work out how I can export individual Cognos reports via the command line, for the purposes of source versioning in Git at a report-by-report level. I presume XML would be the output format.
I read that the Cognos SDK can help, but you need to build your own solution. That may be possible, but this use case feels like something many others would already want, so I'd expect tooling to exist already.
Of course, importing the individual report would also be needed.
Can anyone help here please?
Thanks.
If your end game is version control (Who changed what, when?), you should look into MotioCI. Last time I looked, there was no free version of MotioCI.
You can use tools like the ones provided by companies like http://www.motio.com. With the free version you can export the XML of the reports, but only one by one.
You can also use a Cognos deployment of the reports that generates a zip file with the XML of the reports, but all the reports are in the same file and you will have to extract the XML of the individual reports by hand.
I found the SDK to be cumbersome and, when I got it working, slow.
Yes, report specs are XML.
I have created a process that produces output like what you are asking for. Here's what it involves:
A recursive common table expression (CTE) query to get the report specs along with the folder structure as seen in Cognos.
A PowerShell script to run the query and write the results to the file system.
Another PowerShell script to pull the current content from the remote git repo, run the first PowerShell script, then add, commit, and push the results up to the remote git repo.
I also wrote a PowerShell script to perform the operations associated with git push. It uses a program I found called HTML Tidy (http://tidy.sourceforge.net/) to make the XML human-readable, which helps with diffs in git. I use TFS, so I get a nice side-by-side diff if I have tidied the XML. (Otherwise, it tells me the only line of XML has changed.)
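As a rough sketch only (not the actual code), the export-and-push steps could look something like this in PowerShell. It assumes $sql holds the Content Store query described above and that it returns hypothetical Path and Spec columns; Invoke-Sqlcmd, tidy, and git must all be available, and the server and directory names are placeholders:

# Pull current content, export the specs, then commit and push the results
Set-Location "C:\cognos-export"
git pull

$rows = Invoke-Sqlcmd -ServerInstance "sqlhost" -Database "ContentStore" -Query $sql
foreach ($row in $rows) {
    $file = Join-Path "C:\cognos-export" ($row.Path + ".xml")
    New-Item -ItemType Directory -Path (Split-Path $file) -Force | Out-Null
    $row.Spec | Out-File -FilePath $file -Encoding UTF8
    tidy -xml -indent -quiet -modify $file   # make the XML diff-friendly
}

git add -A
git commit -m "Automated Cognos report export"
git push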
I recently added output for dashboards (exploration) and data sets (dataSet2). Dashboards are stored as JSON, so my routine had to tidy that (simple in PowerShell).
I run my routine daily, getting new and modified content from the last 3 days (just in case), and weekly to do an entire dump (to capture the deletes). The weekly process takes about six minutes. The daily process is negligible.
Before you ask: I hesitate to provide actual code because I can't take any responsibility for your system.
Updates:
Hacking away at the Content Store database is not recommended and it is not supported by IBM.
For reference/comparison: I'm running IBM Cognos 11.0.7 on IIS on Windows 2012 R2 with the Content Store database on MS SQL Server 2016. Your system may be different.
Additional Resources
https://www.cognoise.com/index.php/topic,28289.msg113869.html#msg113869
https://www.cognoise.com/index.php/topic,17411.msg50409.html#msg50409
https://learn.microsoft.com/en-us/powershell/scripting/overview?view=powershell-6
https://learn.microsoft.com/en-us/sql/t-sql/language-reference?view=sql-server-2017
https://git-scm.com/docs
http://tidy.sourceforge.net/

Can I debug a PostgreSQL query sent from an external source that I can't edit?

I see how to debug queries stored as Functions in the database. But my problem is with an external QGIS plugin that connects to my Postgres 10.4 via network and does a complex query and calculations, and stores the results back into PostGIS tables:
FOR r IN c LOOP
    SELECT
        (1 - ST_LineLocatePoint(path.geom, ST_Intersection(r.geom, path.geom))) * ST_Length(path.geom)
    INTO
        station
    (continues ...)
When it errors, it just returns that line number as the failing location, but gives no clue where it was in the loop through hundreds of features. (And any features it has processed are not stored to the output tables when it fails.) I don't know enough about the plugin or about SQL to hack the external query, and I suspect that if it were a reasonable task the plugin author would have included more revealing debug messages.
So is there some way I could use pgAdmin4 (or anything) from the server side to watch the query process? Even being able to see if it fails the first time through the loop or later would help immensely. Knowing the loop count at failure would point me to the exact problem feature. Being able to see "station" or "r.geom" would make it even easier.
Perfectly fine if the process is miserably slow or interferes with other queries, I'm the only user on this server.
This is not actually a way to watch the RiverGIS query in action, but it is the best I have found. It extracts the failing ST_Intersects() call from the RiverGIS code and runs it under your control, where you can display any clues you want.
When you're totally mystified where the RiverGIS problem might be, run this SQL query:
SELECT
    xs."XsecID" AS "XsecID",
    xs."ReachID" AS "ReachID",
    xs."Station" AS "Station",
    xs."RiverCode" AS "RiverCode",
    xs."ReachCode" AS "ReachCode",
    ST_Intersection(xs.geom, riv.geom) AS "Fraction"
FROM
    "<your project name>"."StreamCenterlines" AS riv,
    "<your project name>"."XSCutLines" AS xs
WHERE
    ST_Intersects(xs.geom, riv.geom)
ORDER BY xs."ReachID" ASC, xs."Station" DESC;
Obviously replace <your project name> with the QGIS project name.
Also works for the BankLines step if you replace "StreamCenterlines" with "BankLines". Probably could be adapted to other situations where ST_Intersects() fails without a clue.
You'll get a listing with shorter geometry strings for good cross sections and double-length strings for bad ones. Probably need to widen your display column a lot to see this.
Works for me in pgAdmin4, or in QGIS3 -> Database -> DB Manager -> (click the wrench icon). You could select only bad lines, but I find the background info helpful.

How to set a cron job in Magento using a custom module?

I want to use a cron job or scheduling for my custom module in Magento.
I researched this but was not able to find a suitable answer to help me.
I created a form which allows the user to select day, hour, minute, second, etc., and I store all these values in the database. What I want is for it to take the values from the database and run some schedule (a defined PHP script) according to that time.
I don't have any idea how to do that, so please help me resolve this issue.
Thanks in advance...
The Magento cron system is designed for developer-specified scheduling rather than user-specified scheduling (i.e. the frequency of a task is set in a module's XML configuration and can't be changed via the GUI). That said, you can work around this somewhat by setting your cron to run at the most frequent interval in the XML and wrapping the code in the actual cron function with logic that compares the current time against the database configuration.
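For example, a module's config.xml can declare a job that fires every 5 minutes (the module, job, and model names here are placeholders):

<crontab>
    <jobs>
        <mymodule_check_schedule>
            <schedule>
                <cron_expr>*/5 * * * *</cron_expr>
            </schedule>
            <run>
                <model>mymodule/observer::checkSchedule</model>
            </run>
        </mymodule_check_schedule>
    </jobs>
</crontab>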
One way to do this is to create a cron job that runs every 'x interval' and checks your database to see if anything is scheduled. If something is scheduled, execute it; otherwise do nothing.
See How to Set Up a Cron Job
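A sketch of the observer method for that check-then-execute approach might look like this (the class name, config path, and time format are placeholders for whatever your form actually saves):

class Mycompany_Mymodule_Model_Observer
{
    public function checkSchedule($schedule)
    {
        // Hypothetical config path holding the user's chosen run time, e.g. "14:30"
        $configured = Mage::getStoreConfig('mymodule/schedule/run_at');
        $now = Mage::getModel('core/date')->date('H:i');
        // With the job firing every 5 minutes, treat "due" as falling inside
        // the current window rather than requiring an exact minute match
        if (abs(strtotime($now) - strtotime($configured)) >= 300) {
            return; // not due yet
        }
        // ... run the actual task here ...
    }
}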
You could also try using config_path:

<crontab>
    <jobs>
        <company_export_send_order>
            <schedule>
                <config_path>export/order/cron_settings</config_path>
            </schedule>
            <run>
                <model>company_export/observer::exportOrderData</model>
            </run>
        </company_export_send_order>
    </jobs>
</crontab>
Read more: magento cron in backend configuration

show executed query in phpPgAdmin

Is there a way to show the SQL query executed by phpPgAdmin, the way phpMyAdmin does?
For example, if I modify a column, it should show the ALTER command being executed.
If this is not possible, what other interface could I use to get this feature?
It's not possible with any currently released version of phpPgAdmin, although the feature could probably be added. You'd need to intercept the SQL being sent to the back end and then display it back to the user. SQL execution is pretty well centralized, and if you look at the "history" feature you will see a way to trap/show queries, so munging those bits together would probably get you what you want. HTH. If someone implements this, please send a pull request!
As a quick and dirty hack, you could alter the sources a bit to enable SQL logging. In classes/database/ADODB_base.php, add two lines at the beginning of the execute() function:

function execute($sql) {
    global $misc;
    $misc->saveScriptHistory($sql);
    ...
}
This worked in my 5.0.3 version.