I want to create and maintain a MongoDB Atlas Data Lake programmatically, but it seems there is no option available. I found one API which can be used to create/update/delete a data lake, but it only allows setting a few options. Here is the link I am following: https://docs.mongodb.com/datalake/reference/api/dataLakes-create-one-tenant/#request-body-parameters
Does anyone know how to set up the other options, like the data store and storage configuration, which you can configure from the MongoDB Atlas UI?
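For reference, this is roughly what calling that documented endpoint looks like from Python, using the Atlas programmatic API keys with digest auth; the group ID, key pair, and body fields below are placeholders based on the linked reference page:

```python
# pip install requests
import requests
from requests.auth import HTTPDigestAuth

# Placeholders: substitute your own project (group) ID and API key pair.
resp = requests.post(
    "https://cloud.mongodb.com/api/atlas/v1.0/groups/<GROUP-ID>/dataLakes",
    auth=HTTPDigestAuth("<public-key>", "<private-key>"),
    json={
        "name": "my-data-lake",  # hypothetical tenant name
        "cloudProviderConfig": {
            "aws": {"roleId": "<role-id>", "testS3Bucket": "<bucket-name>"}
        },
    },
)
print(resp.status_code, resp.json())
```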
My company has a lot of data (database: PostgreSQL), and the requirement now is to add a search feature on top of it; we have been asked to use Azure Cognitive Search.
I want to know how we can transform the data and send it to the Azure search engine.
There are a few cases we have to handle:
1. How will we transfer and upload the existing data into the search engine's index?
2. What will be the easiest way to keep the search engine updated with new records from our production database? (For now we are using Java back-end code to transform the data and update the index, but it is very time-consuming.)
3. What will be the best way to handle a change to the existing database structure? How do we update the indexer without doing lots of work by recreating the indexers every time?
4. Is there any way we can automatically update the index whenever there is a change in the database records?
You can either write code to push data from your PostgreSQL database into the Azure Search index via the /docs/index API, or you can configure an Azure Search indexer to do the data ingestion. The upside of configuring an indexer is that you can also have it monitor the data source on a schedule for updates, and have those updates reflected in the search index automatically, for example via the SQL Integrated Change Tracking Policy.
PostgreSQL is a supported data source for Azure Search indexers, although the data source is in preview (not yet generally available).
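To illustrate the push model, here is a minimal sketch using the azure-search-documents Python SDK; the service endpoint, index name, admin key, and document fields are hypothetical and must match your own index schema:

```python
# pip install azure-search-documents
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Hypothetical service and index names -- replace with your own.
client = SearchClient(
    endpoint="https://<your-service>.search.windows.net",
    index_name="<your-index>",
    credential=AzureKeyCredential("<admin-api-key>"),
)

# Rows fetched from PostgreSQL, shaped to match the index schema.
docs = [
    {"id": "1", "title": "First record", "body": "..."},
    {"id": "2", "title": "Second record", "body": "..."},
]

results = client.upload_documents(documents=docs)
print([r.succeeded for r in results])
```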
Besides the answer above, which involves coding on your end, there is a solution you can implement using the Azure Data Factory PostgreSQL connector: use a custom query that tracks recent records, and create a pipeline activity that sinks to an Azure Blob Storage account.
Then, within Data Factory, you can link to a pipeline activity that copies to an Azure Cognitive Search index, and add a trigger to the pipeline to run at specified times.
Once the staged data is in the storage account in delimitedText format, you can also use the built-in Azure Blob indexer with change tracking enabled.
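As a rough sketch of that last step, the blob data source and a scheduled indexer can be created through the Azure Cognitive Search REST API; the service, key, container, and index names below are placeholders:

```python
# pip install requests
import requests

SERVICE = "https://<your-service>.search.windows.net"
HEADERS = {"api-key": "<admin-api-key>", "Content-Type": "application/json"}
API = "api-version=2020-06-30"

# Register the storage container that holds the staged delimitedText files.
requests.post(f"{SERVICE}/datasources?{API}", headers=HEADERS, json={
    "name": "staged-blob-datasource",
    "type": "azureblob",
    "credentials": {"connectionString": "<storage-connection-string>"},
    "container": {"name": "staged-data"},
}).raise_for_status()

# Create an indexer that re-runs hourly and parses the staged CSV files.
requests.post(f"{SERVICE}/indexers?{API}", headers=HEADERS, json={
    "name": "staged-blob-indexer",
    "dataSourceName": "staged-blob-datasource",
    "targetIndexName": "<your-index>",
    "schedule": {"interval": "PT1H"},
    "parameters": {"configuration": {
        "parsingMode": "delimitedText",
        "firstLineContainsHeaders": True,
    }},
}).raise_for_status()
```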
I have several images in a container in Blob Storage that have metadata I want to store in Cosmos DB. The images are JPEGs. Would I have to pull the data into Data Factory, or is there a simpler method?
I think the simplest way would be to write a small console app that loops through all your blob containers and items and inserts them into Cosmos DB using its SDK.
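A minimal sketch of that loop in Python, assuming hypothetical account, container, and database names:

```python
# pip install azure-storage-blob azure-cosmos
from azure.cosmos import CosmosClient
from azure.storage.blob import ContainerClient

# Placeholders: substitute your own connection string and resource names.
container = ContainerClient.from_connection_string(
    "<storage-connection-string>", container_name="images")
cosmos = CosmosClient("https://<account>.documents.azure.com", credential="<cosmos-key>")
items = cosmos.get_database_client("metadata-db").get_container_client("image-metadata")

# Copy each blob's name and metadata into a Cosmos DB document.
for blob in container.list_blobs(include=["metadata"]):
    items.upsert_item({
        "id": blob.name.replace("/", "_"),  # Cosmos DB ids cannot contain '/'
        "blobName": blob.name,
        "metadata": blob.metadata or {},
    })
```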
There are many ways to loop through images stored in Blob Storage and store them in Cosmos DB. Please refer to the links below; with a little tweaking, the required functionality can be achieved.
1. Using ADF: Link
2. Using an Azure Logic App: Link 1, Link 2
3. Writing custom code and then populating the result into Cosmos DB: Link
I have data being stored in Google Firebase (i.e. the output of the Google Vision API). I need the same data to be stored in MongoDB, which is running on AWS. Is connectivity possible between Google Cloud and AWS for data migration?
There is no out-of-the-box solution for what you're trying to accomplish. Currently, Firestore supports exporting its data as documented here, though the format of the export is probably not something MongoDB could import right away. Even if the format were compatible, you would need some kind of data processing pipeline to handle the flow from one side to the other.
Depending on how you're handling the ingestion of the Vision API results, you might be able to include code to also send that data to MongoDB. If that's not the case, you might need to design a custom solution for this particular use case.
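For instance, if your ingestion code runs in Python, the dual write could be as simple as the following sketch; the connection string, database, and collection names are hypothetical:

```python
# pip install pymongo
from pymongo import MongoClient

# Placeholder connection string for the MongoDB instance running on AWS.
mongo = MongoClient("mongodb://<aws-host>:27017")
annotations = mongo["vision"]["annotations"]  # hypothetical db/collection

def store_vision_result(doc_id: str, result: dict) -> None:
    """Persist a Vision API result in MongoDB alongside the Firestore write."""
    annotations.replace_one({"_id": doc_id}, {"_id": doc_id, **result}, upsert=True)
```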
I have set up my MongoDB database and connected successfully.
My project is to create an online cookbook.
My database currently holds various recipes, but I want an image linked to each recipe document.
I know I can use GridFS, but I would prefer to store the images in the same place as the recipes.
I have seen that I can use base64, but that is not working for me.
I am very new to Mongo, and any advice is greatly appreciated.
Store the images in an S3 bucket and store the path of the S3 object in the DB.
Using the S3 URL, you can access the image.
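A minimal sketch of that pattern in Python, assuming a hypothetical bucket and collection:

```python
# pip install boto3 pymongo
import boto3
from pymongo import MongoClient

s3 = boto3.client("s3")
recipes = MongoClient("mongodb://localhost:27017")["cookbook"]["recipes"]  # hypothetical

def add_recipe(name: str, image_path: str) -> None:
    # Upload the image to S3 and keep only its URL in the recipe document.
    key = f"recipe-images/{name}.jpg"
    s3.upload_file(image_path, "my-cookbook-bucket", key)  # hypothetical bucket
    url = f"https://my-cookbook-bucket.s3.amazonaws.com/{key}"
    recipes.insert_one({"name": name, "imageUrl": url})
```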
Currently I need to create an admin UI, using Google Cloud Datastore as user storage and Google Cloud Functions as controllers.
I found the Keystone JS CMS platform, which uses MongoDB for storing user data, so I am looking into whether I can use it with Google Cloud Datastore.
Is it possible to create some driver that will redirect all the Keystone DB requests to Google Cloud Datastore instead of MongoDB?
The core of Keystone is built on top of Mongo (and Mongoose) functions and code. At the moment, there isn't a way to use Keystone with another database type or provider. See this GitHub issue for some more information that may be of use.