CosmosDB - MongoDB - Table not formed on Azure Portal, but activity log is showing status success

I am new to Azure databases. I am using Cosmos DB (MongoDB API) and have connected to it via the connection string URL. When I store data through the collection, my application does not show any exception; in fact, the activity log shows a success status, but the data is not visible in Data Explorer. When I query for the complete data, I am able to retrieve it.
The only problem is that the data is visible neither in the MongoDB console nor in Data Explorer.
(screenshot of Data Explorer)

To see the data, click on the Documents node of the testTable database in the Data Explorer (records are called Documents in DocumentDB).
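If you want to double-check from code rather than the portal, here is a minimal sketch using the official MongoDB Node.js driver; the connection string, database name and collection name below are placeholders, not taken from your setup:

// Minimal verification sketch - connection string, database and collection
// names are placeholders, replace them with your own values.
const { MongoClient } = require("mongodb");

async function verify() {
  const client = new MongoClient("<your Cosmos DB connection string>");
  await client.connect();

  const db = client.db("<your database name>");
  // List the collections the database actually contains
  const collections = await db.listCollections().toArray();
  console.log(collections.map(c => c.name));

  // Count and peek at the documents in the collection you wrote to
  const col = db.collection("<your collection name>");
  console.log(await col.countDocuments());
  console.log(await col.findOne());

  await client.close();
}

verify().catch(console.error);

If the collection and documents show up here, the data is in the account and it is just a matter of expanding the right database and collection nodes in Data Explorer.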


Logic Apps trigger Databricks notebook

I have an issue. Is it possible for Logic Apps to trigger a Databricks notebook in its pipeline?
I want to automate processing of files downloaded from SharePoint and trigger a Databricks notebook.
I am starting to build the custom Logic Apps connector, but I am struggling to follow this guide: https://medium.com/@poojaanilshinde/create-azure-logic-apps-custom-connector-for-azure-databricks-e51f4524ab27
Unfortunately, I am struggling to create the OpenAPI JSON file (image below).
1. Created a Databricks resource in Azure.
2. Created a Logic Apps Custom Connector resource with the name Databricksconnector.
3. Once created, clicked on the edit option as shown below.
4. Followed the document https://medium.com/@poojaanilshinde/create-azure-logic-apps-custom-connector-for-azure-databricks-e51f4524ab27 by Pooja Shinde and performed the steps below.
5. Connector creation: kept everything with default values in this section.
6. In General Information, selected the scheme as HTTPS and in Host gave the Databricks URL as https://databricksinstancename.azuredatabricks.net
7. In Security, selected authentication as Basic, as shown below.
8. In the Definition tab, added a new action by clicking on New action.
9. In the General tab, provided the Summary, Description and Operation ID, and selected Visibility as none.
10. In the Request tab, clicked on Import from sample.
11. I am retrieving job details by passing job_id, and the details are (see the sketch after the reference link for the equivalent raw REST call):
Request Type: GET
URL: api/2.0/jobs/get?job_id=*******
Headers: Content-Type: application/json
12. Clicked on Import and the request will be updated as below.
13. Next, clicked on Update connector and the connector will be saved.
14. Connector saved successfully as shown below.
15. We can view all the provided details in JSON format by clicking on the Swagger editor.
16. Now created a logic app with a Recurrence trigger.
17. Able to see the created custom connector as shown below.
18. Created a connection for Azure Databricks as per the document https://medium.com/@poojaanilshinde/create-azure-logic-apps-custom-connector-for-azure-databricks-e51f4524ab27
19. Added the parameter "job_id" and gave the value as shown below.
20. The logic app ran successfully and I was able to get the job details.
Reference: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/2.0/jobs?source=recommendations#--runs-list
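For completeness, the action defined above is a thin wrapper around a single REST call. Here is a rough sketch of that call in Node.js; the instance URL and job id are placeholders, and it uses a personal access token with Bearer authentication rather than the Basic authentication configured in the connector, so treat it only as a way to sanity-check the endpoint:

// Sketch only - instance name, job id and token are placeholders.
const instance = "https://databricksinstancename.azuredatabricks.net";
const jobId = 12345;                          // your job_id
const token = process.env.DATABRICKS_TOKEN;   // e.g. a personal access token

async function getJob() {
  const res = await fetch(`${instance}/api/2.0/jobs/get?job_id=${jobId}`, {
    headers: {
      "Authorization": `Bearer ${token}`,
      "Content-Type": "application/json",
    },
  });
  if (!res.ok) throw new Error(`Jobs API returned ${res.status}`);
  console.log(await res.json()); // job settings, schedule, etc.
}

getJob().catch(console.error);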

Import Azure Monitor log data into Azure Data Factory

Is it possible to connect Azure Data Factory to the Azure Monitor logs to extract the data?
You can connect from Power BI as described here:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/powerbi
But I want to be able to connect from Data Factory to the log.
To my knowledge, there is no direct way in ADF that is similar to the Power BI approach in the link you mentioned. Based on this document and the ADF portal UI, we can send the logs to three destinations:
Azure Storage Account.
Event Hub.
Log Analytics.
For a Storage Account, you can access the logs with a Copy activity.
For Event Hubs, maybe you could use the Event Hubs REST API in a REST dataset with an ADF Web activity, or you could look at Azure Stream Analytics.
For Log Analytics, you could use the Log Analytics REST API in a REST dataset or with an ADF Web activity, for example:
POST https://api.loganalytics.io/v1/workspaces/DEMO_WORKSPACE/query
X-Api-Key: DEMO_KEY
Content-Type: application/json
{
"query": "AzureActivity | summarize count() by Category"
}
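The same request can be exercised outside ADF first, which is a convenient way to validate the query before wiring up the REST dataset or Web activity. A quick sketch in Node.js using the demo workspace and key from the example above:

// Same request as above, issued from Node.js (demo workspace and key from the example).
async function queryLogAnalytics() {
  const res = await fetch(
    "https://api.loganalytics.io/v1/workspaces/DEMO_WORKSPACE/query",
    {
      method: "POST",
      headers: {
        "X-Api-Key": "DEMO_KEY",
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        query: "AzureActivity | summarize count() by Category",
      }),
    }
  );
  const data = await res.json();
  // The API returns tables -> columns/rows, which is what you would parse in ADF as well.
  console.log(JSON.stringify(data.tables, null, 2));
}

queryLogAnalytics().catch(console.error);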
Got it all working. I have done the following:
Created a pipeline which contains 2 Web activities, 1 ForEach loop and a call to a stored procedure to insert the data.
The first Web activity gets the bearer token.
The second Web activity calls the REST API with GET and has a header named Authorization which carries the access_token from the first Web activity: Bearer {access_token}.
Then a ForEach loop to which I pass the output of the second Web activity.
A Stored Procedure activity which passes all my fields into an insert stored procedure.
Finally, that all worked. I had a lot of trouble using the Copy activity, so I resorted to the ForEach loop and a stored procedure call to insert each record from the output of the REST API call in the Web activity.
I will post more detailed info once I get some sleep!
Partial Answer:
I have been able to use 2 Web activities in a pipeline: one which gets the bearer token and a second which then uses the bearer token to carry out the GET command. But now the question is: how can I use the output from the Web activity in a subsequent Copy activity so I can load the data into SQL?
There are two methods, which depend on the method of authentication to the API.
The first is with a service principal; the high-level steps are described above. This blog post on the topic is also useful: https://datasavvy.me/2020/12/24/retrieving-log-analytics-data-with-data-factory/comment-page-1/#comment-28467
The second is with a Managed Identity:
First give ADF access to Log Analytics using IAM (see: How can I use this API in Azure Data Factory).
Then connect to the Log Analytics API with a Web activity or a Copy activity (these are the two I got working).
Web Activity
URL: https://api.loganalytics.io/v1/workspaces/[Workspace ID]/query
Body: {"query":"search '*'| where TimeGenerated >= datetime(#{pipeline().parameters.it_startDate}) and TimeGenerated < datetime(#{pipeline().parameters.it_endDate}) | distinct $table "}
Copy Activity
First the linked service.
ADF Datasets:
Base URL: https://api.loganalytics.io/v1/workspaces/[Workspace ID]/
Copy Source:
Body: {
"query": "#{item()[0]} | where TimeGenerated >= datetime(#{pipeline().parameters.it_startDate}) and TimeGenerated < datetime(#{pipeline().parameters.it_endDate})"
}
Additional:
The Web activity body above gets a list of the table names in Log Analytics, which I then pass to the Copy activity to export a copy of the data for each table.
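For reference, the two Web activities in the service principal approach boil down to the following two HTTP calls. This is a sketch in Node.js; the tenant id, client id/secret and workspace id are placeholders, and it assumes an app registration that has been granted read access to the workspace:

// Sketch of the service-principal flow - placeholders throughout.
const tenantId = "<tenant-id>";
const clientId = "<app-registration-client-id>";
const clientSecret = process.env.CLIENT_SECRET;
const workspaceId = "<log-analytics-workspace-id>";

async function run() {
  // 1) First Web activity: get the bearer token (client credentials grant).
  const tokenRes = await fetch(
    `https://login.microsoftonline.com/${tenantId}/oauth2/token`,
    {
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: new URLSearchParams({
        grant_type: "client_credentials",
        client_id: clientId,
        client_secret: clientSecret,
        resource: "https://api.loganalytics.io",
      }),
    }
  );
  const { access_token } = await tokenRes.json();

  // 2) Second Web activity: query Log Analytics with "Bearer {access_token}".
  const queryRes = await fetch(
    `https://api.loganalytics.io/v1/workspaces/${workspaceId}/query`,
    {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${access_token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ query: "search '*' | distinct $table" }),
    }
  );
  // tables -> columns/rows, which is what gets fed to the ForEach / Copy activity
  console.log(await queryRes.json());
}

run().catch(console.error);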

Error when trying to load tables from desktop with dashDB

I'm trying to load a csv file into a table and receiving the following error:
The steps that I am doing are: launching the dashDB instance that I have, Load, Load from Desktop, Browse for the csv... I have tried to create a new table from the csv and to add to an existing table, both resulting in the error. I have also created a dummy csv with just 1 record in it, and it fails with the same error. When creating the table from the load, the result is the table getting created with no data loaded.
The error you are getting is a known bug in the integration between the Bluemix dashboard and the dashDB dashboard.
The team is working on this issue, but a fully working workaround is available.
In order to complete your data load from your file, you should:
access your application section on the Bluemix dashboard
click on the 'Environment variables' section and select the 'VCAP_SERVICES' button
retrieve the 'https_url' of the dashDB service and also save the related username and password values (see the sketch after these steps)
paste the https_url value into your browser; it will open the dashDB dashboard directly, and you can now log in using the username and password found in the VCAP_SERVICES section
click on the 'Load' section in the left menu of the dashDB dashboard
in this section you can upload your file without any problem.
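If you would rather pull those values out programmatically than copy them from the dashboard, here is a small sketch; it assumes the service appears under a dashDB key in VCAP_SERVICES, which may differ for your service instance:

// Reads the dashDB credentials from the VCAP_SERVICES environment variable
// inside a Bluemix/Cloud Foundry app. The "dashDB" key name is an assumption.
const vcap = JSON.parse(process.env.VCAP_SERVICES || "{}");
const credentials = ((vcap.dashDB || [])[0] || {}).credentials || {};

console.log(credentials.https_url); // dashDB dashboard URL to open in the browser
console.log(credentials.username);
console.log(credentials.password);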

How to get recordsChanged sync status using JS Datastore API?

I'm using the JavaScript SDK flavor of the Dropbox Datastore API with a web app for mobile and desktop. When the recordsChanged event fires while the app is offline, object data about those changes is generated, but the changes can't sync to the datastore until the app is online again.
The event data can be checked against the settings table, for instance, like this:
e.affectedRecordsForTable("settings")
But the array data returned has a lot of layers to wade through.
[t_datastore: t_deleted: false_managed_datastore: t_record_cache: t_rid: "startDate"_tid: "settings"__proto__: t]
I would like to capture the "has been synced" or the "not yet synced" status of each change (each array index) so that I can store the data still waiting to sync in case the session is lost (user closes the app/browser or OS kills the app process). But I also want to know if/when the data does eventually sync successfully. Where can I find the property holding this data?
I found my answer. Steve Marx has a post on the Dropbox developer blog that covers the information I needed. There is a datastore.getSyncStatus().uploading property that returns true or false depending on the state of the datastore sync status.
Source:
https://www.dropbox.com/developers/blog/61/checking-the-datastore-sync-status-in-javascript
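Putting the two pieces together, here is a minimal sketch; it assumes datastore is the already-opened datastore from the question, and only the recordsChanged event and getSyncStatus().uploading from the SDK are used:

// `datastore` is assumed to be the already-opened Dropbox datastore.
datastore.recordsChanged.addListener(function (e) {
  var changed = e.affectedRecordsForTable("settings");

  // true while local changes are still waiting to be uploaded,
  // false once everything has reached the server
  var pending = datastore.getSyncStatus().uploading;

  if (pending) {
    // e.g. persist `changed` (or a simplified copy of it) to localStorage here
    // so it survives the session being killed before the sync completes
  }
});

To find out when the upload eventually finishes, you can re-check datastore.getSyncStatus().uploading later (for example on a timer) and clear whatever you persisted once it returns false.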

ITEM name must be unique

Using Intuit Anywhere with QuickBooks Desktop, I am creating items that are added to the cloud without errors. Looking up the item with an ItemQuery works fine. Here is my XML request and response:
http://pastebin.com/4YaJbgZg
When I run the Intuit Sync Manager, my newly added items enter the error state. These items do not exist in QuickBooks before I run the Sync Manager. After entering the error state, this query will make them appear:
// Query only the objects that entered the error state during sync
ItemQuery iq = new ItemQuery();
iq.ErroredObjectsOnly = true;
var bItems = iq.ExecuteQuery<Item>(dataServices.ServiceContext);
How can I find out why these items enter error state? I created a log file using the Intuit Sync Manager, however, I see no error messages about these items. Here is my log:
http://pastebin.com/QhpKHvWF
QBD Item Create is in beta and is not supported in v2. See: QBD Supported Objects and Operations.
Support for Item Create will not be added until V3 is released. For more information about V3, please see the V3 webinar recording.