Use PowerShell to write data directly to OMS - powershell

I'd like to write custom data directly to our Microsoft Azure OMS workspace using PowerShell. Does anyone know how this is possible?
There's a lot of information about configuring instances, adding new data sources, or querying the data using AzureRM.OperationalInsights:
https://blogs.technet.microsoft.com/privatecloud/2016/04/05/using-the-oms-search-api-with-native-powershell-cmdlets/
https://learn.microsoft.com/de-ch/powershell/module/azurerm.operationalinsights/?view=azurermps-4.0.0
Has anyone found a way to write data directly to OMS instead of storing it in a log file and then using the custom-log import? I know this is possible; I saw it in a keynote, but cannot find any information about it.
Thank you for your input!

This should do the trick - writing to Log Analytics:
https://learn.microsoft.com/en-us/azure/log-analytics/log-analytics-data-collector-api#sample-requests
There's really not much to add; it's a copy/paste process. Entries will show up under the type you define in $LogType with '_CL' appended, because you are essentially creating a Custom Log entry.
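For reference, here's a condensed version of the PowerShell sample from that page; the workspace ID, shared key and JSON payload below are placeholders you need to replace:

$CustomerId = "<workspace-id>"   # OMS workspace ID
$SharedKey  = "<primary-key>"    # workspace primary key
$LogType    = "MyRecord"         # will appear in OMS as MyRecord_CL

# Build the HMAC-SHA256 signature the Data Collector API requires
Function Build-Signature ($customerId, $sharedKey, $date, $contentLength, $method, $contentType, $resource) {
    $xHeaders     = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
    $bytesToHash  = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $sha256       = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key   = [Convert]::FromBase64String($sharedKey)
    $encodedHash  = [Convert]::ToBase64String($sha256.ComputeHash($bytesToHash))
    return 'SharedKey {0}:{1}' -f $customerId, $encodedHash
}

# POST a JSON payload to the workspace
Function Post-LogAnalyticsData ($customerId, $sharedKey, $body, $logType) {
    $method      = "POST"
    $contentType = "application/json"
    $resource    = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $signature   = Build-Signature $customerId $sharedKey $rfc1123date $body.Length $method $contentType $resource
    $uri         = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"
    $headers     = @{ "Authorization" = $signature; "Log-Type" = $logType; "x-ms-date" = $rfc1123date }
    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}

# Any JSON array of records works; each field becomes a custom column
$json = '[{ "Computer": "server01", "CpuLoad": 42.5 }]'
Post-LogAnalyticsData $CustomerId $SharedKey ([Text.Encoding]::UTF8.GetBytes($json)) $LogType

Expect a 200 status code; the records typically show up in search a few minutes later.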

Related

How to connect an API as a data source in Tableau?

I need to use two data sources: one is SQL and the other is the response from a REST API.
I tried to implement a WDC (Web Data Connector), but it needs an HTML page and the user has to interact with the UI to get the response.
I don't want to create an HTML page.
Is there any way to use an API response as a data source in Tableau?
The short answer is that you cannot use an API directly as a data source; you should build a pipeline that transforms the response into a flat file or populates a database table.
The alternative is to use Python to connect to the REST API. You can use TabPy or follow a pre-built solution like this one. Personally, I can't speak to the performance.
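Any scripting language can do the pipeline job. As a rough illustration, here is a minimal sketch in PowerShell; the endpoint, token and field names are made up and stand in for your actual API:

# Pull the REST API response (hypothetical endpoint and token)
$response = Invoke-RestMethod -Uri "https://api.example.com/v1/orders" `
                              -Headers @{ Authorization = "Bearer $token" }

# Flatten the records and write a CSV that Tableau can use as a data source
$response.items |
    Select-Object id, customer, amount, created_at |
    Export-Csv -Path "C:\data\orders.csv" -NoTypeInformation

Schedule the script so the flat file is rewritten before each Tableau extract refresh.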

How to extract view meta data from tableau

I need to extract metadata from a view. Reading the Tableau REST API documentation, I found no mention of an endpoint for sheet metadata, nor of one for extracting measures and dimensions. If anyone knows a way to extract this data, please help.
Take a look at the Tableau JavaScript API, specifically the getData method.
The REST API is used for automating Tableau Server tasks (adding users, creating schedules, setting permissions, etc.).
You can embed a hidden extension into your dashboard that can access worksheet, field and filter information. See the docs on extensions here.
Here is a good example of one that could get you started on pulling the worksheet and field information.
Use the Tableau Metadata API - https://help.tableau.com/current/api/metadata_api/en-us/index.html
You can find sample scripts here - https://github.com/tableau/metadata-api-samples
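The Metadata API is a GraphQL endpoint, so you query it with a single POST. Here is a minimal PowerShell sketch; the server URL is a placeholder, $authToken is the credentials token from a REST API sign-in, and you should verify the exact schema fields against the documentation for your server version:

# GraphQL query: list workbooks together with the sheets they contain
$body = '{ "query": "query { workbooks { name sheets { name } } }" }'

Invoke-RestMethod -Uri "https://my-tableau-server/api/metadata/graphql" `
                  -Method Post `
                  -ContentType "application/json" `
                  -Headers @{ "X-Tableau-Auth" = $authToken } `
                  -Body $body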

What template was used to create my PostgreSQL database

I have a PostgreSQL database that I think was created using a custom template database.
Is there a way to see what template was used to create the database? There doesn't seem to be any documentation that describes this.
As far as I know, no, there is no system catalogue that tracks this. I'm not sure it would be terribly useful to know that database "X" was created from "template2" unless you could prove exactly what the contents of "template2" were at the point of copying. If you're tracking things to that level of detail then you will already have the creation logged.

Is it possible to store Audit4j audit events into MongoDB?

We want to save Audit4j events/logs (which are usually stored in a text file) to MongoDB.
Is this possible with an existing adapter/plugin, or do we need to write one? If we need to write one, is there any documentation we can refer to?
Most articles talk about auditing MongoDB changes themselves, hence our confusion.
Any pointer will be appreciated.
Thanks and regards
There's a plugin available on GitHub: https://github.com/nipunthathsara/Audit4j-MongoDB

Load a PostgreSQL database using CloudConnect

Alongside my GoodData project, I maintain a small PostgreSQL database that contains a few tables.
I would like to integrate both of my ETL processes using the same tool, and CloudConnect seems to be the easiest way, since my whole GoodData ETL is already in it.
Here is what I tried, without success:
I had a look at the documentation, and it seems that the CloverETL components that would enable this (DBOutput, PostgreSQLDataWriter) are not available in CloudConnect.
I managed to connect to the Agile Data Warehousing Service (the database attached to GoodData), but it seems that only the ADS database understands the request:
COPY MyDataBaseTable (field1,field2) FROM LOCAL '${DATA_TMP_DIR}/CIforADS.csv'
It fails even when I adapt the syntax to PostgreSQL, because the dynamic addressing I use here does not seem to work.
Is there any way to proceed that I'm missing? Can anyone think of a workaround?
In general this could be achieved by using the "DBExecute" component, but
I'm not sure I understand correctly - do you want to load data into your own Postgres instance using CloudConnect?