I am trying to upload a 100 GB JSON file to Cosmos DB, and I want to use the MongoDB API to store the data. How can I bulk upload the data to Cosmos DB? Is there any option provided to upload a JSON file to Cosmos DB?
I would probably use mongoimport to do this. There are some things to configure in Cosmos DB when importing data. You can read more here, as well as in a lot of other articles on migrating data into the Azure Cosmos DB API for MongoDB.
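As an illustration only, a mongoimport invocation against a Cosmos DB account might look like the sketch below. The account, database, and collection names, the key placeholder, and the tuning values are assumptions you would replace with your own; the host and port come from the account's Connection String blade in the portal.

    # Hypothetical names and values; replace with your own account details.
    mongoimport --host myaccount.documents.azure.com --port 10255 \
      --ssl --sslAllowInvalidCertificates \
      --username myaccount --password "<primary-key>" \
      --db mydb --collection mycollection \
      --type json --file ./data.json \
      --numInsertionWorkers 4 --batchSize 24

If the 100 GB file is a single JSON array rather than one document per line, add --jsonArray. For a file that large you will also want enough provisioned throughput (RU/s) on the collection, and you may need to tune --numInsertionWorkers and --batchSize down if you see rate-limiting (429) errors.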
I would like to migrate my existing MongoDB database to the Azure Cosmos DB API for MongoDB, but I could not figure out whether all of the functionality we use can be implemented in Cosmos DB. Currently, I am using MongoDB Atlas.
The following are some of the features that we are using in MongoDB Atlas:
GridFS
A master DB for connecting to all the other DBs.
Manual indexing.
Profiler.
TTL for documents (data gets deleted 90 days after its entry date).
Vertical scaling, i.e., more than 500 databases with at least 3 to 4 collections in each database.
Storing certificates whose size is more than 2 MB.
All the files are in JSON format.
Can anyone help me out with this?
Most of these features are already supported and documented. You can refer to the links below for more details:
Azure Cosmos DB's API for MongoDB (3.6 version): supported features and syntax - This covers GridFS and TTL (a TTL sketch is shown below)
Per-account limits - This covers the scaling limits of databases
Manage indexing in Azure Cosmos DB's API for MongoDB - This covers Index creation
Monitor and debug with insights in Azure Cosmos DB
I don't think there is a master DB concept in the Azure Cosmos DB API for MongoDB.
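To illustrate the TTL point from the first link: the Cosmos DB docs describe enabling document expiration by creating an index on the _ts field with an expireAfterSeconds value. A minimal sketch for the 90-day case, run from the Mongo Shell and using hypothetical database/collection names, could be:

    // Hypothetical names; 7776000 seconds = 90 days.
    use mydb
    db.certificates.createIndex({ "_ts": 1 }, { expireAfterSeconds: 7776000 })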
Source: Azure Storage Gen2 (a file with 10 JSON lines)
Sink: Azure Cosmos DB with the MongoDB API
I used an Azure Data Factory pipeline (Copy activity) to move the file data into a Mongo collection. The copy is successful, but when I run find({}) on my collection, it returns 0 records. When I run stats(), it shows the count as 10, which is expected. I cannot figure out what the issue is when reading these records from Robo3T to query the Mongo database.
I created a second pipeline to read the data from Mongo and write it to Azure Storage, to test whether the data is really present in Mongo. I was able to write all 10 records to storage. This proves the data is present in Mongo, but I cannot read/access it.
You won't be able to directly read the collection data from outside tools in this case; you have to use the Mongo Shell via the Azure portal. Go to your Azure Cosmos DB resource -> Data Explorer -> Mongo Shell. If there are any specific errors, here is the troubleshooting document.
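For a quick sanity check in that portal Mongo Shell, something like the following sketch can confirm the documents are readable; mydb and mycollection are placeholders for the database and collection names your Copy activity wrote to:

    // Placeholder names; substitute the database/collection used by the Copy activity.
    use mydb
    db.mycollection.count()
    db.mycollection.find({}).limit(5).pretty()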
I have a DocumentDB database (using the DocumentDB interface, NOT the MongoDB interface), so the connection string looks like:
AccountEndpoint=https://SomeDatabase.documents.azure.com:443/;AccountKey=xxxxx;
it does NOT look like this:
mongodb://SomeDatabase:xxxxx==#SomeDatabase.documents.azure.com:10255/?ssl=true&replicaSet=globaldb
Question:
How do I connect using RoboMongo or other MongoDB tools/code?
The material I looked at said things like "take the username that it shows in the MongoDB version of Cosmos DB", which won't help, as that is a totally different database, and the connection string there won't work for apps that need the DocumentDB interface.
Is there a way to do this? Or is "adding support for the MongoDB interface to DocumentDB" like adding the ability to talk to an MS SQL Server using MongoDB, in the sense that you could always download MongoDB and install it on the same machine (and still not be able to get any data passed between them)?
When you use Cosmos DB, you must choose which API to use with your deployed database (DocumentDB, MongoDB, Tables, Gremlin). You cannot use multiple APIs against the same database.
The only way to use MongoDB tools and frameworks is to deploy a Cosmos DB database with the MongoDB API; the MongoDB API is what provides compatibility with MongoDB (a connection sketch is shown below). Note: the oplog is not provided with the Cosmos MongoDB API, so tools that rely on reading or tailing the oplog will not work.
The DocumentDB API does not surface any of the MongoDB API, so you will not be able to use MongoDB-specific tools against a DocumentDB-specific database.
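As an illustration (placeholder account name and key, not your actual values): once an account has been created with the MongoDB API, the MongoDB-style connection string from the portal's Connection String blade, in the format quoted in the question, is what RoboMongo or the shell would use:

    # Placeholder values; copy the real string from the portal's Connection String blade.
    mongosh "mongodb://SomeDatabase:<primary-key>@SomeDatabase.documents.azure.com:10255/?ssl=true&replicaSet=globaldb"

A DocumentDB-API account never exposes a string in this format, which is why there is nothing for RoboMongo to connect to.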
Have you seen this how-to from Microsoft: Use Robomongo with an Azure Cosmos DB
And one more related: Connecting to Azure Cosmos DB emulator from RoboMongo