We have a Loopback v3.8 application using the MongoDB connector v3.1.
It works fine in the environments running native MongoDB, but now we would like to deploy to Azure and use Cosmos DB, which in theory should support all the native MongoDB commands.
The problem we're having is that PATCH operations (which I believe are mapped to Model.updateAttributes by Loopback) are not working.
This is the error we get:
Could not update Client. { Error: No Client found for id
592cc132a31109354c45d1d8 }
Loopback debug strings:
loopback:connector:mongodb updateAttributes +7ms Client 592cc132a31109354c45d1d8 { '$set': { loginDate:2017-06-02T12:30:18.201Z } }
loopback:connector:mongodb MongoDB: model=Client command=findAndModify +2ms [ { _id: 592cc132a31109354c45d1d8 },
[ [ '_id', 'asc' ] ],
{ '$set': { loginDate: 2017-06-02T12:30:18.201Z } },
{}, [Function] ]
loopback:connector:mongodb Result: +399ms { _t: 'FindAndModifyResponse', ok: 1, value: null, lastErrorObject: { n: 1, updatedExisting: false, value: null } }
loopback:connector:mongodb updateAttributes.callback +4ms Client 592cc132a31109354c45d1d8 null { _t: 'FindAndModifyResponse', ok: 1,
value: null,
lastErrorObject: { n: 1, updatedExisting: false, value: null } }
If we do a GET for that Client, using its Id, we get the correct response, so the Client document is there.
Can the Loopback MongoDB connector be used for Cosmos DB?
Are we missing something that requires Loopback to work correctly with Cosmos DB?
Thanks.
This is because the underlying base for Cosmos DB is no longer MongoDB; it simply allows you to access it using a familiar API. The MongoDB connector is not intended for use with Cosmos DB. I have personally been looking for a solution for my own use and came across the npm package loopback-connector-cosmosdb, which worked for some simple applications but is completely unsupported by its developer and by Loopback.
Related
I am trying to set up a Vue.js page to list the data from MongoDB Atlas.
My Node.js server is connecting fine and logs the documents from the API.
See the code here that is working, but I am not able to work out the URL for my localhost.
Every localhost:4000/whatever-needs-to-be-here??? variant gives a "can't get data" error.
The port it is listening on is 4000.
I have axios on a page trying to get data, but I am not sure what URL to enter in axios.get('????'), as my local URL is not working.
What am I missing to have the data display in the browser?
```
npm start

> my-view-app@0.1.0 start /Users/macbookpro/my-vue-app-03-4
> node server.js

Server listening at 4000
```
This is what I get in the console when running npm start.
Also, the cluster name is clusterdives, the database name is: dive_db, the collection name is dives.
```
(node:2276) DeprecationWarning: current Server Discovery and Monitoring engine is deprecated, and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option { useUnifiedTopology: true } to the MongoClient constructor.
[MongoDB connection] SUCCESS
[
{
_id: 5f9ca2dd9a1d712bd137aa73,
dive_number: 1,
dive_date: 2015-06-06T04:00:00.000Z,
dive_location: 'Lake Phoenix, Rawlings, Va',
dive_country: 'United States',
dive_description: 'Dive number 4 of open water training',
dive_note: 'long note and description of the dive. This is where we write down memorable moments of the dive, prior to the dive, during the dive and after the dive.',
dive_duration: { dive_duration_value: 45, dive_duration_unit: 'min' },
dive_depth: { dive_depth_value: 50, dive_depth_unit: 'ft' }
},
{
_id: 5fa2142f137e530a5a4a7cf7,
dive_number: '2',
dive_date: '2015-06-06T04:00:00.000+00:00',
dive_location: ' Lake Phoenix, Rawlings, Va',
dive_country: 'United States',
dive_description: 'Dive number 4 of open water training',
dive_note: 'long note and description of the dive. This is where we write down mem',
dive_duration: { dive_duration_value: '45', dive_duration_unit: 'min' },
dive_depth: { dive_depth_value: '50', dive_depth_unit: 'ft' }
}
]
```
This is the script section in the Mainlist Vue component/page.
<script>
import axios from 'axios';

export default {
  name: 'Mainlist',
  data() {
    return {
      dive_db: [],
    }
  },
  mounted() {
    axios.get('**THIS IS WHAT I AM LOOKING FOR**')
      .then((response) => {
        console.log(response.data);
        this.dive_db = response.data;
      })
      .catch((error) => {
        console.log(error);
      })
  }
}
</script>
I am retrieving my Azure Cosmos DB (MongoDB API) document from a Cosmos DB trigger in Azure Functions, but my ObjectId seems to be garbled. How do I get the correct ObjectId?
For example, ObjectId("5df88e60d588f00c32a3c9ce") is coming through as ]øŽ`Õˆð2£ÉÎ,
and ObjectId("5df88f92d588f00c32a3c9d1") is coming through as ]ø’Õˆð2£ÉÑ.
Is there a way to retrieve the ObjectId in Node.js, Python, or any script if I give ]ø’Õˆð2£ÉÑ as input?
This is my function.json used in the Azure Function:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "name": "documents",
      "direction": "in",
      "leaseCollectionName": "leases1",
      "connectionStringSetting": "devcosmosdb_DOCUMENTDB",
      "databaseName": "devcosmosdb",
      "collectionName": "newCollection",
      "createLeaseCollectionIfNotExists": "true"
    }
  ]
}
This is my Node.js code:
module.exports = async function (context, documents) {
    if (!!documents && documents.length > 0) {
        context.log('Document Id: ', documents[0].id);
        context.log(documents[0]);
    }
}
This is my output, and this is where I am not getting the ObjectId properly:
2020-06-16T17:16:38Z [Information] Executing 'Functions.changeTrigger' (Reason='New changes on collection newCollection at 2020-06-16T17:16:38.2618864Z', Id=adc9556a-133f-4e85-b533-5574283a5a7d)
2020-06-16T17:16:38Z [Information] Document Id: NWRmODhkZGRkNTg4ZjAwYzMyYTNjOWNj
2020-06-16T17:16:38Z [Information] {
id: 'NWRmODhkZGRkNTg4ZjAwYzMyYTNjOWNj',
_rid: 'KEcnAO163B4EAAAAAAAAAA==',
_self: 'dbs/KEcnAA==/colls/KEcnAO163B4=/docs/KEcnAO163B4EAAAAAAAAAA==/',
_ts: 1592327797,
_etag: '"0000c1d2-0000-0300-0000-5ee8fe750000"',
'$t': 3,
'$v': {
_id: { '$t': 7, '$v': ']øÝÕð\f2£ÉÌ' },
name: { '$t': 2, '$v': 'myname' },
email: { '$t': 2, '$v': 'my email' },
},
_lsn: 537
}
Please go to the Azure portal to check the content of your document. I have done a test on my side, and it works fine.
Here is the document I used to test.
{
  "id": "testid1",
  "test1": "testvalue1",
  "test2": {
    "test21": "test21value",
    "objectId": "5df88f92d588f00c32a3c9d1"
  }
}
After clicking the Save button, the function will be triggered.
Here is my testing code.
import logging

import azure.functions as func


def main(documents: func.DocumentList) -> str:
    if documents:
        logging.info('Document id: %s', documents[0]['id'])
        logging.info('%s', documents[0].to_json())
The output is as below.
Update:
Currently, only the SQL (Core) API is supported by the Azure Functions Cosmos DB trigger. You can also find the feature under the Settings section.
The URI should be something like https://testbowman.documents.azure.com:443/
If you create a MongoDB API Cosmos DB account, you won't find the 'Add to Function' feature. And the URI should be something like https://tonycosmosdb.mongo.cosmos.azure.com:443/
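One observation about the trigger output shown in the original question (an inference from the values shown, not documented Cosmos DB behaviour): the garbled _id appears to be the raw 12 ObjectId bytes rendered as text, which is lossy because several bytes are unprintable. The top-level id field, however, looks like the Base64 encoding of the 24-character hex ObjectId string, so the ObjectId can be recovered without touching the garbled value:

```javascript
// 'id' exactly as it appears in the change-feed document from the question
const docId = 'NWRmODhkZGRkNTg4ZjAwYzMyYTNjOWNj';

// Base64-decoding it yields the familiar 24-character hex ObjectId string
const objectIdHex = Buffer.from(docId, 'base64').toString('ascii');
console.log(objectIdHex); // → 5df88dddd588f00c32a3c9cc
```

If this holds for your documents, reconstructing ObjectId("5df88ddd...") from the trigger payload is a one-liner, and the unprintable _id bytes can be ignored.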
I am building a personal blog and chose Gatsby because of the obvious reasons (performance and being easy to start with) and because I have some React background for the frontend. I had also built a simple app to create my content (an HTML string) and store it in a MongoDB database using an Express server. Now, for the blog, I am just trying to pull the data from MongoDB using the gatsby-source-mongodb plugin.
My MongoDB schemas have relationships. For instance, a 'Post' schema has a 'user' property which is an ObjectID that references a user from 'User' schema. My config for the gatsby-source-mongodb looks like:
{
  resolve: 'gatsby-source-mongodb',
  options: {
    dbName: 'KathaDB',
    collection: 'posts',
    server: {
      address: "somecluster",
      port: 27017
    },
    auth: {
      user: 'someuser',
      password: 'somepasswd'
    },
    extraParams: {
      replicaSet: 'test',
      ssl: true,
      authSource: 'admin',
      retryWrites: true,
      preserveObjectIds: true
    }
  }
}
I have a couple of questions:
When I query, I get all the properties from my 'Post' schema, but I don't get the 'user' property in the response. I don't know if it is due to the type of the property. I dug up a bit and found a similar issue here. It seems they solved it by preserving the ObjectID, but I didn't even get the property that is of type ObjectID.
Another thing, does this plugin support relationships? For example, is it possible to get the 'user' data when its ObjectID is given?
It does.
MongoDB relies on ObjectIDs for relationships, so you might have to add preserveObjectIds: true to your plugin options:
{
  resolve: "gatsby-source-mongodb",
  options: {
    dbName: "KathaDB",
    collection: "posts",
    server: {
      address: "somecluster",
      port: 27017,
    },
    auth: {
      user: "someuser",
      password: "somepasswd",
    },
    extraParams: {
      replicaSet: "test",
      ssl: true,
      authSource: "admin",
      retryWrites: true,
      preserveObjectIds: true,
    },
    preserveObjectIds: true, // <= here
  },
}
I'm unsure whether gatsby-source-mongodb creates the relationships out of the box (I don't think it does, if my memory is correct), but with the ObjectIds, you can create foreign-key relationships using GraphQL.
There are two ways of doing this in Gatsby:
Using mappings in gatsby-config.js
Using a GraphQL @link directive through Gatsby's schema customization (from v2.2)
I recommend the second option, since it's a more GraphQL way of doing things, and happens in gatsby-node.js where most node operations are taking place. However, if you're starting out with Gatsby and GraphQL, the first option might be easier to set up.
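For the second option, here is a sketch of what the schema customization could look like in gatsby-node.js. The node type names (mongodbKathaDBPosts, mongodbKathaDBUsers) and field names are assumptions based on how gatsby-source-mongodb typically derives type names from dbName and collection, so check GraphiQL for the exact names your site generates:

```javascript
// gatsby-node.js — hypothetical sketch; verify the exact type and field
// names in GraphiQL before using this as-is
exports.createSchemaCustomization = ({ actions }) => {
  actions.createTypes(`
    type mongodbKathaDBPosts implements Node {
      user: mongodbKathaDBUsers @link(by: "mongodb_id", from: "user")
    }
  `)
}
```

With a declaration like this in place, querying a post's user field in GraphQL resolves to the full user node instead of the bare ObjectId.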
I cannot find documentation for MongoDB 4.0 transactions support for Node.js.
Has it already been made available in the MongoDB driver?
http://mongodb.github.io/node-mongodb-native/3.1/api/
As mentioned in the comments as well, you can find the reference for transactions in the node-mongodb-native v3.1 API under ClientSession. This is because transactions are associated with a session: you start a transaction for a session, and at any given time you can have at most one open transaction per session.
The documentation for MongoDB multi-document transactions also contains example Node.js code snippets. For example (assuming a session already created with client.startSession()):
session.startTransaction({
  readConcern: { level: 'snapshot' },
  writeConcern: { w: 'majority' }
});

const employeesCollection = client.db('hr').collection('employees');
const eventsCollection = client.db('reporting').collection('events');

await employeesCollection.updateOne(
  { employee: 3 },
  { $set: { status: 'Inactive' } },
  { session }
);
await eventsCollection.insertOne(
  {
    employee: 3,
    status: { new: 'Inactive', old: 'Active' }
  },
  { session }
);

try {
  await commitWithRetry(session);
} catch (error) {
  await session.abortTransaction();
  throw error;
}
The reference for the methods above can be found on:
ClientSession.startTransaction()
ClientSession.commitTransaction()
ClientSession.abortTransaction()
In addition to requiring MongoDB Node.js driver v3.1, please note that multi-document transactions are available only for replica sets in MongoDB v4.0.x. Transactions for sharded clusters are available starting with MongoDB v4.2.
I would like to pull data from Postgres into Gatsby using GraphQL. I have written a Node.js server, but I cannot find a way to use it in Gatsby.
(https://github.com/gstuczynski/graphql-postgres-test)
Do you have any ideas?
What you need to do is implement a source plugin as seen here https://www.gatsbyjs.org/docs/create-source-plugin/.
There are many examples within the gatsby repository that implement the source api. See those for inspiration! Basically you need to translate the contents of your Postgres db into a format gatsby understands. Gatsby calls this format “nodes”.
You could implement a plugin which interfaces with your db directly or with whatever api your node server exposes (graphql, REST etc.).
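As a sketch of what that translation looks like, here is a minimal sourceNodes implementation for gatsby-node.js. The row data and the PostgresPost type name are hypothetical; a real plugin would fetch rows from Postgres (e.g. with the pg client) or from your Node server's API instead of using a hard-coded array:

```javascript
// gatsby-node.js — minimal source-plugin sketch (hypothetical data and type name)
exports.sourceNodes = async ({ actions, createNodeId, createContentDigest }) => {
  const { createNode } = actions;

  // In a real plugin, query Postgres here (e.g. via the `pg` module)
  // or call the Node server's REST/GraphQL API
  const rows = [{ postId: 1, title: 'Hello' }];

  rows.forEach((row) => {
    createNode({
      ...row,
      id: createNodeId(`postgres-post-${row.postId}`),
      internal: {
        type: 'PostgresPost', // queryable as allPostgresPost in GraphQL
        contentDigest: createContentDigest(row),
      },
    });
  });
};
```

Once the nodes exist, Gatsby exposes them automatically in the GraphQL layer, so pages can query allPostgresPost like any other source.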
The gatsby-source-pg module connects directly to your database and adds the tables/views/functions/etc to Gatsby's GraphQL API. To use it, install the module:
yarn add gatsby-source-pg
then add it to the plugin list in gatsby-config.js:
module.exports = {
  plugins: [
    /* ... */
    {
      resolve: "gatsby-source-pg",
      options: {
        connectionString: "postgres://localhost/my_db",
      },
    },
  ],
};
The connection string can also include username/password, host, port and SSL if you need to connect to a remote database; e.g.: postgres://pg_user:pg_pass@pg_host:5432/pg_db?ssl=1
You can query it in your components using the root postgres field, e.g.:
{
  postgres {
    allPosts {
      nodes {
        id
        title
        authorId
        userByAuthorId {
          id
          username
        }
      }
    }
  }
}
Gatsby now supports an arbitrary GraphQL endpoint as a source which will help: https://www.gatsbyjs.org/packages/gatsby-source-graphql/
You can also use Hasura to give you an instant GraphQL API on Postgres and then query that from your Gatsby app. You can follow the tutorial here.
Step1: Deploy Hasura against your existing Postgres database: https://docs.hasura.io/1.0/graphql/manual/getting-started/using-existing-database.html
Step 2: Install the gatsby-source-graphql plugin for gatsby: https://www.gatsbyjs.org/packages/gatsby-source-graphql/
Step 3: Configure the plugin
// gatsby-config.js — createHttpLink comes from apollo-link-http, fetch from node-fetch
const { createHttpLink } = require('apollo-link-http')
const fetch = require('node-fetch')

module.exports = {
  plugins: [
    {
      resolve: 'gatsby-source-graphql', // <- Configure plugin
      options: {
        typeName: 'HASURA',
        fieldName: 'hasura', // <- fieldName under which schema will be stitched
        createLink: () =>
          createHttpLink({
            uri: `https://my-graphql.herokuapp.com/v1alpha1/graphql`, // <- Configure connection GraphQL url
            headers: {},
            fetch,
          }),
        refetchInterval: 10, // Refresh every 10 seconds for new data
      },
    },
  ],
}
Step 4: Make the GraphQL query in your component
const Index = ({ data }) => (
  <div>
    <h1>My Authors</h1>
    <AuthorList authors={data.hasura.author} />
  </div>
)

export const query = graphql`
  query AuthorQuery {
    hasura {    # <- fieldName as configured in the gatsby-config
      author {  # Normal GraphQL query
        id
        name
      }
    }
  }
`
Other links:
Sample-app/tutorial:
https://github.com/hasura/graphql-engine/tree/master/community/sample-apps/gatsby-postgres-graphql
Blogpost:
https://blog.hasura.io/create-gatsby-sites-using-graphql-on-postgres-603b5dd1e516
Note: I work at Hasura.