How to execute a query in VS Code for Cosmos DB?

I have installed an extension that gives me access to a Microsoft Azure Cosmos DB in VS Code, and I'm able to see the documents inside as expected.
However, I wasn't able to figure out how to run a query that allows me to filter results based on different conditions, e.g.: SELECT * FROM c WHERE c.DocumentId = 123
Is there a way to run SQL queries in VS Code against a Cosmos DB? I couldn't find any helpful tutorial, and the "mssql" extension I installed seems to be mainly focused on ADO.NET connections.

Based on the description of the Cosmos DB extension:
Browse and query your MongoDB databases both locally and in the cloud using scrapbooks with rich Intellisense then connect to Azure to manage your Cosmos DB databases with support for MongoDB, Graph (Gremlin), and SQL (previously known as DocumentDB).
You could follow the official Scrapbooks example to query your SQL API db with mongo shell syntax, e.g.:
db.coll.find({ "DocumentId": 123 })
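If it helps, a slightly fuller scrapbook might look like the sketch below; the collection and field names are only placeholders, and the commands use ordinary mongo shell syntax:

// return only selected fields of the matching documents
db.coll.find({ "DocumentId": 123 }, { "Name": 1, "Status": 1 })

// count how many documents match the filter
db.coll.find({ "DocumentId": 123 }).count()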
Update:
Sorry for the late update. Here are two tools for you:
1. CosmosDbExplorer: https://www.bruttin.com/CosmosDbExplorer/
2. Azure Storage Explorer: https://azure.microsoft.com/en-us/features/storage-explorer/

This tool might help: https://github.com/microsoft/vscode-cosmosdb
The issue is that they do not explain how to use and manipulate the SQL API. Please share if anyone has an idea for using the SQL API with this tool, the way MongoDB is covered.

Related

Connecting to Amazon Redshift in Azure Data Studio via the Postgresql Connector

I've recently joined a company with a mixed set of databases that includes a Redshift cluster and some SQL databases. I'd like to use a single IDE to access both for analytical reporting, so I don't have to switch between tools. I'm currently using workbench, which works, but it's not clicking with me.
I do like Azure Data Studio, but it's SQL Server and Postgres only. Given the similarities between Redshift and Postgres, I thought I'd see if I could connect using the Postgres driver.
I've installed the Postgres extension and can "connect" to the database. However, when I try to explore the database using the tree view, I get the error message 'Cannot Expand Node'. When I run a simple query that works in workbench, e.g.
Select * from [server].[database].[table]
I get the following Error message:
Started executing query at Line 1
cursors can only be used within the transaction that created them.
Total execution time: 00:00:00.019
I know I'm trying to do something that shouldn't be done. And if I can't, I can't. But has anyone here managed to get a Redshift connection going in Azure Data Studio?
FWIW, I've come across a GitHub repository that may be a Redshift driver for Data Studio, but it looks like a clone of the Postgres driver, with no activity since March (not even renaming the 'Postgres' titles to Redshift), and therefore I'm dubious.

Azure Cosmos DB: Clone collection to another database

Currently I am trying to clone a Cosmos DB collection from one database to another database within the same Cosmos DB account. The API of the Cosmos DB is set to the Mongo API.
I already tried to use Azure Data Factory, but it looks like there is no support for the Mongo API so far.
Does anyone have an idea how to do this with respect to efficiency, automation, and performance?
Any ideas are appreciated.
You can use the Data Migration Tool suggested by Microsoft to do the same.
There is no built-in way to take a backup and import it into Cosmos DB.
EDIT:
With the new Cosmic Clone tool, you can take a clone/backup including data, stored procedures, triggers, UDFs, etc. Read my blog on the same.
I already tried to use Azure Data Factory, but it looks like there is no support for the Mongo API so far.
Actually, the Cosmos DB Mongo API and SQL API both belong to the Azure Cosmos DB service. So, you can still create a Cosmos DB linked service and dataset in Azure Data Factory for your database.
Then you could create a copy activity to copy data from one collection to another collection.
If you want to make this an automated task, I suggest one of the following two ways to run the copy activity:
1. An Azure Function with a time trigger.
2. A WebJob running in the background of an Azure Web App.
Hope it helps you. If you have any concerns, please feel free to let me know.
I used mongodump and mongorestore to copy my database (with MongoDB version 4.0.9 installed). From the Windows command line, I ran the following commands from my MongoDB bin directory (C:\Program Files\MongoDB\Server\4.0\bin in my case).
This will copy all the collections, including indexes, in the DB to the specified /out directory (as .bson data files plus .metadata.json files).
mongodump.exe /uri:URI /out:A_DIRECTORY_TO_DUMP_TO
I then ran the following command to take everything in the /out directory and write it to the target DB:
mongorestore.exe /uri:URI /dir:DIRECTORY_TO_RESTORE_FROM
NOTE: Before importing I also had to increase the throughput for the collection, otherwise I ran into rate limiting errors. If you've set throughput at the database level this may need to be changed.
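As a lighter-weight variation (a rough sketch, not part of the answer above): for a small collection you can also copy documents directly from a mongo shell connected to the account's Mongo endpoint, without dumping to disk. The database and collection names below are placeholders:

// run while connected to the source database; copies every document into the target database
db.sourceColl.find().forEach(function (doc) {
    db.getSiblingDB("targetDb").targetColl.insert(doc);
});

The same throughput caveat applies: if the target collection's provisioned RU/s is too low, the inserts may be rate limited.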

Connect to DocumentDB using Robomongo

I have a DocumentDB database (using the DocumentDB interface, NOT the MongoDB interface), so the connection string looks like:
AccountEndpoint=https://SomeDatabase.documents.azure.com:443/;AccountKey=xxxxx;
it does NOT look like this:
mongodb://SomeDatabase:xxxxx==@SomeDatabase.documents.azure.com:10255/?ssl=true&replicaSet=globaldb
Question:
How do I connect using RoboMongo or other MongoDb tools/code?
The guidance I found said things like "take the username shown in the MongoDB version of Cosmos DB", which won't help, as that is a totally different database and its connection string won't work for apps that need the DocumentDB interface.
Is there a way to do this? Or is 'adding support for the MongoDB interface to DocumentDB' like adding the ability to talk to an MS SQL Server using MongoDB, in the sense that you can always download MongoDB and install it on the same machine (and still not be able to get any data passed between them)?
When you use Cosmos DB, you must choose, for your deployed database, which API to use with it (DocumentDB, MongoDB, Tables, Gremlin). You cannot use multiple APIs against the same database.
The only way to use MongoDB tools & frameworks is to deploy a Cosmos DB database with the MongoDB API; that API is what provides compatibility with MongoDB. The DocumentDB API does not surface any of the MongoDB API, so you will not be able to use MongoDB-specific tools against a DocumentDB-API database.
Note: The MongoDB API of Cosmos does not surface an oplog, so tools that rely on reading/tailing the oplog will not work.
Have you seen this how-to by Microsoft: Use Robomongo with an Azure Cosmos DB
And one more related: Connecting to Azure Cosmos DB emulator from RoboMongo

How to get collection names from Cosmos DB using the Mongo API

I am trying to use the Mongo API to perform CRUD operations on Azure Cosmos DB.
I am running the query in the Azure portal's Data Explorer.
This is the query that I am executing: {db.getCollectionNames()}
I am facing {"code":500,"body":"{\"message\":\"There was an error processing your request. Please try again in a few moments.\",\"httpStatusCode\":\"InternalServerError\",\"xMsServerRequestId\":null,\"stackTrace\":null}"}
Can you please suggest changes if I am doing something wrong here?
The Mongo query area in Data Explorer is not the same as a native MongoDB shell. That is, the only thing you can do within the query window is run find() queries, and you only specify the filter document (the part between the {}). For example, entering just a filter such as the following (the property name here is only an illustration) returns the matching documents:
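{ "id": "1" }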
There's also the ability to open a mongo shell via the browser, where you can run queries in the more traditional mongo format, for example (the collection and property names again are only illustrative):
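db.families.find({ "id": "1" })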
With the browser-based shell, you can also do updates (e.g. db.families.update()) and deletes (db.families.remove()). But it doesn't support commands such as db.getCollectionNames().
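If you do need the collection list itself, one possible workaround (a hedged suggestion, not covered in the answer above) is to connect a locally installed mongo shell, or any MongoDB driver, using the account's Mongo connection string (the mongodb://...:10255/?ssl=true format shown earlier); once connected, the usual command is available:

db.getCollectionNames()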

Does ADO work with ODBC drivers or only OLE DB providers?

I am trying to create some VBA code to automate dashboard creation against a PostgreSQL database. I have heard the OLE DB provider is unreliable, and it looks like it hasn't been touched in several years. Does ADO work with an ODBC driver?
Yes, simply reference the DSN:
' requires a project reference to Microsoft ActiveX Data Objects (ADODB)
Dim oConn As New ADODB.Connection
oConn.Open "DSN=mySystemDSN;" & _
           "Uid=myUsername;" & _
           "Pwd=myPassword"