Connect Gatsby with Postgres

I would like to pull data from Postgres into Gatsby using GraphQL. I have written a Node.js server (https://github.com/gstuczynski/graphql-postgres-test), but I cannot find a way to use it in Gatsby.
Do you have any ideas?

What you need to do is implement a source plugin, as described here: https://www.gatsbyjs.org/docs/create-source-plugin/.
There are many examples within the Gatsby repository that implement the source API; see those for inspiration! Basically, you need to translate the contents of your Postgres database into a format Gatsby understands. Gatsby calls this format “nodes”.
You could implement a plugin which interfaces with your database directly, or with whatever API your Node server exposes (GraphQL, REST, etc.). A minimal sketch follows below.
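For illustration, here is roughly what such a plugin's gatsby-node.js could look like when talking to the database directly. This is a sketch, not a drop-in implementation: the posts table, its columns, and the connection string are hypothetical placeholders.

const { Pool } = require("pg");

exports.sourceNodes = async ({ actions, createNodeId, createContentDigest }) => {
  const { createNode } = actions;

  // Hypothetical database and table; replace with your own.
  const pool = new Pool({ connectionString: "postgres://localhost/my_db" });
  const { rows } = await pool.query("SELECT id, title FROM posts");
  await pool.end();

  // Translate each row into a Gatsby node.
  rows.forEach((row) => {
    createNode({
      ...row,
      id: createNodeId(`post-${row.id}`),
      parent: null,
      children: [],
      internal: {
        type: "Post",
        contentDigest: createContentDigest(row),
      },
    });
  });
};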

The gatsby-source-pg module connects directly to your database and adds the tables/views/functions/etc to Gatsby's GraphQL API. To use it, install the module:
yarn add gatsby-source-pg
then add it to the plugin list in gatsby-config.js:
module.exports = {
  plugins: [
    /* ... */
    {
      resolve: "gatsby-source-pg",
      options: {
        connectionString: "postgres://localhost/my_db",
      },
    },
  ],
};
The connection string can also include username/password, host, port, and SSL if you need to connect to a remote database; e.g.: postgres://pg_user:pg_pass@pg_host:5432/pg_db?ssl=1
You can query it in your components using the root postgres field, e.g.:
{
  postgres {
    allPosts {
      nodes {
        id
        title
        authorId
        userByAuthorId {
          id
          username
        }
      }
    }
  }
}

Gatsby now supports an arbitrary GraphQL endpoint as a source, which will help: https://www.gatsbyjs.org/packages/gatsby-source-graphql/
You can also use Hasura to get an instant GraphQL API on Postgres and then query that from your Gatsby app. You can follow the tutorial below.
Step 1: Deploy Hasura against your existing Postgres database: https://docs.hasura.io/1.0/graphql/manual/getting-started/using-existing-database.html
Step 2: Install the gatsby-source-graphql plugin for Gatsby: https://www.gatsbyjs.org/packages/gatsby-source-graphql/
Step 3: Configure the plugin:
// gatsby-config.js
const { createHttpLink } = require('apollo-link-http');
const fetch = require('node-fetch');

module.exports = {
  plugins: [
    {
      resolve: 'gatsby-source-graphql', // <- configure plugin
      options: {
        typeName: 'HASURA',
        fieldName: 'hasura', // <- fieldName under which the schema will be stitched
        createLink: () =>
          createHttpLink({
            uri: 'https://my-graphql.herokuapp.com/v1alpha1/graphql', // <- your GraphQL endpoint URL
            headers: {},
            fetch,
          }),
        refetchInterval: 10, // refresh every 10 seconds for new data
      },
    },
  ],
};
Step 4: Make the GraphQL query in your component:
const Index = ({ data }) => (
  <div>
    <h1>My Authors</h1>
    <AuthorList authors={data.hasura.author} />
  </div>
)

export const query = graphql`
  query AuthorQuery {
    hasura {     # <- fieldName as configured in gatsby-config
      author {   # normal GraphQL query
        id
        name
      }
    }
  }
`
Other links:
Sample-app/tutorial:
https://github.com/hasura/graphql-engine/tree/master/community/sample-apps/gatsby-postgres-graphql
Blogpost:
https://blog.hasura.io/create-gatsby-sites-using-graphql-on-postgres-603b5dd1e516
Note: I work at Hasura.

Related

Using MongoDB Adapter in NextAuth not working

I cloned the MUI Next.js example project and started from there.
The NextAuth docs say I can just copy the MongoDB adapter setup and it will basically work out of the box. I placed this file at this path: /src/lib/mongodb.js
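For reference, that file is the standard clientPromise helper from the NextAuth docs; a sketch of it looks roughly like this:

// /src/lib/mongodb.js — sketch of the clientPromise helper from the NextAuth docs
import { MongoClient } from "mongodb";

if (!process.env.MONGODB_URI) {
  throw new Error("Please add your MongoDB URI to .env.local");
}

const uri = process.env.MONGODB_URI;
const options = {};

let client;
let clientPromise;

if (process.env.NODE_ENV === "development") {
  // In development, reuse the client across module reloads (HMR)
  if (!global._mongoClientPromise) {
    client = new MongoClient(uri, options);
    global._mongoClientPromise = client.connect();
  }
  clientPromise = global._mongoClientPromise;
} else {
  // In production, create one client per server instance
  client = new MongoClient(uri, options);
  clientPromise = client.connect();
}

export default clientPromise;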
Here I'm using CredentialsProvider; basically, I'm using my own form for the login and authentication process.
Here is my /pages/api/auth/[...nextauth].js file:
import { MongoDBAdapter } from "@next-auth/mongodb-adapter";
import NextAuth from 'next-auth';
import CredentialsProvider from 'next-auth/providers/credentials';
import clientPromise from "../../../src/lib/mongodb";

export default NextAuth({
  secret: process.env.SECRET,
  adapter: MongoDBAdapter(clientPromise),
  providers: [
    CredentialsProvider({
      async authorize(credentials) {
        const { email, password } = credentials
        if (email !== 'test@test.com' || password !== 'password123') {
          throw new Error('User does not exist. Please make sure you insert the correct email & password.')
        }
        return {
          id: 1,
          name: 'Tester',
          email: 'test@test.com'
        }
      }
    })
  ],
  callbacks: {
    redirect: async ({ url, baseUrl }) => {
      return baseUrl
    }
  }
})
From what I understood, I can use this adapter straight away and it will create 4 models/tables (User, Session, Account, VerificationToken) by default, so I don't need to create them myself. Ref doc here
According to the NextAuth MongoDB Adapter documentation, I just need to specify the MONGODB_URI in .env.local.
So here is my /.env.local file content:
NEXTAUTH_URL=http://localhost:3000
MONGODB_URI=mongodb+srv://<username>:<password>@rest.lvnpm.mongodb.net/<database_name>?retryWrites=true&w=majority
SECRET=someSecret
NODE_ENV=development
Currently, it does nothing at all. I don't need to set session.strategy to database, since by default NextAuth will recognize that if I use the adapter option.
What do I need to do here to make this work? Any help is appreciated. Here is my GitHub project.
I just found out that in NextAuth, if I use CredentialsProvider, I won't be able to persist data using the database strategy. You may go to the NextAuth documentation itself to learn why.
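In other words, with CredentialsProvider the session falls back to the JWT strategy. A minimal sketch of spelling that out explicitly (this assumes NextAuth v4 option names; adjust for your version):

// /pages/api/auth/[...nextauth].js — with CredentialsProvider, sessions
// cannot be persisted via the adapter, so use the JWT strategy
import NextAuth from 'next-auth';

export default NextAuth({
  // ...providers and callbacks as before...
  session: {
    strategy: 'jwt', // the strategy that works with CredentialsProvider
  },
});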

Getting data from MongoDB Atlas in the console when running npm start, but cannot get the URL

I am trying to set up a Vue.js page that lists the data from MongoDB Atlas.
My Node.js server is connecting fine and gets the API (documents) listed.
See the code here that is working, but I am not able to get the URL for my localhost.
Everything I try at localhost:4000/whatever-needs-to-be-here??? gives an error saying it can't get the data.
The port it is listening on is 4000.
I have axios on a page trying to get the data, but I am not sure what URL to enter in the axios get('????'), as my local URL is not working.
What am I missing to have the data display in the browser?
npm start
> my-view-app@0.1.0 start /Users/macbookpro/my-vue-app-03-4
> node server.js
Server listening at 4000
This is what I get in the console when running npm start.
Also, the cluster name is clusterdives, the database name is dive_db, and the collection name is dives.
```
(node:2276) DeprecationWarning: current Server Discovery and Monitoring engine is deprecated, and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option { useUnifiedTopology: true } to the MongoClient constructor.
[MongoDB connection] SUCCESS
[
  {
    _id: 5f9ca2dd9a1d712bd137aa73,
    dive_number: 1,
    dive_date: 2015-06-06T04:00:00.000Z,
    dive_location: 'Lake Phoenix, Rawlings, Va',
    dive_country: 'United States',
    dive_description: 'Dive number 4 of open water training',
    dive_note: 'long note and description of the dive. This is where we write down memorable moments of the dive, prior to the dive, during the dive and after the dive.',
    dive_duration: { dive_duration_value: 45, dive_duration_unit: 'min' },
    dive_depth: { dive_depth_value: 50, dive_depth_unit: 'ft' }
  },
  {
    _id: 5fa2142f137e530a5a4a7cf7,
    dive_number: '2',
    dive_date: '2015-06-06T04:00:00.000+00:00',
    dive_location: ' Lake Phoenix, Rawlings, Va',
    dive_country: 'United States',
    dive_description: 'Dive number 4 of open water training',
    dive_note: 'long note and description of the dive. This is where we write down mem',
    dive_duration: { dive_duration_value: '45', dive_duration_unit: 'min' },
    dive_depth: { dive_depth_value: '50', dive_depth_unit: 'ft' }
  }
]
```
This is the script section in the Mainlist Vue component/page.
<script>
import axios from 'axios';

export default {
  name: 'Mainlist',
  data() {
    return {
      dive_db: [],
    }
  },
  mounted() {
    axios.get('**THIS IS WHAT I AM LOOKING FOR**').then((response) => {
      console.log(response.data);
      this.dive_db = response.data;
    })
    .catch((error) => {
      console.log(error);
    })
  }
}
</script>

How to sync data from a ydn-db web app to a backend server?

With ydn-db, I want to automatically synchronise data from my web app with my REST back end.
I read the documentation and searched the examples, but I cannot make it work.
https://yathit.github.io/ydn-db/synchronization.html
http://dev.yathit.com/api/ydn/db/schema.html#sync
I tried to define a schema with a sync configuration like this:
var schema = {
  stores: [{
    name: 'contact',
    keyPath: 'id',
    Sync: {
      format: 'rest',
      transport: service,
      Options: {
        baseUri: '/'
      }
    }
  }]
};
and created a function for the transport:
var service = function(args) {
  console.log("contact synch");
};
but my service function is never called.
I certainly misunderstood how ydn-db works, but I didn't find any example.
To complete the picture, here is a jsfiddle:
http://jsfiddle.net/asicfr/y7sL7b3j/
Please see the example http://yathit.github.io/ydndb-demo/entity-sync/app.html
Older example http://yathit.github.io/sprintly-service/playground.html from https://github.com/yathit/sprintly-service

How to seed dev database in Sails.js in a reproducible way

I'm looking for the best way to seed my development database in Sails.js.
In Rails I would just use the seeds.rb file, and even without that I could use a rake task.
With Sails I am unsure how to do this outside of doing it manually with sails console.
Please note that solutions which add logic to config/models and the models themselves for production seeding are not what I am looking for. I don't want these records to exist in production.
You can seed your database in the config/bootstrap.js file.
To seed it for a particular environment, what I usually do is:
// config/bootstrap.js
module.exports.bootstrap = function (cb) {
  if (process.env.NODE_ENV !== 'development')
    return cb();

  // Do whatever you want

  // And don't forget to...
  return cb();
};
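For example, a concrete seed could look like the following sketch; the User model and its attributes are hypothetical, so adapt them to your own models:

// config/bootstrap.js — a fuller sketch; `User` is a hypothetical model
module.exports.bootstrap = function (cb) {
  if (process.env.NODE_ENV !== 'development')
    return cb();

  // Waterline's createEach inserts several records at once
  User.createEach([
    { name: 'Alice', email: 'alice@example.com' },
    { name: 'Bob', email: 'bob@example.com' }
  ]).exec(function (err) {
    // Pass any error along so a failed seed aborts the lift
    return cb(err);
  });
};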
And to drop the database each time Sails lifts:
// config/env/development.js
module.exports = {
  models: {
    migrate: 'drop'
  }
};
You can use a test framework like Mocha, and in development mode switch your table name to a development table. Here it is step by step:
Install Mocha with npm install mocha --save-dev
Create test/bootstrap.test.js and fill it in (configure as you need); note the connections setting below, which overrides the default connections from your config.
var Sails = require('sails'),
    sails;

before(function (done) {
  Sails.lift({
    log: {
      level: 'error'
    },
    connections: {
      mongodbServer: {
        database: 'table_test'
      }
    },
    models: {
      migrate: 'drop'
    }
  }, function (err, server) {
    sails = server;
    done(err, sails);
  });
});

after(function (done) {
  // here you can clear fixtures, etc.
  sails.lower(done);
});
Create another file for seeding your data, for example test/inject/seed.inject.js (the name needs to match the glob in the test script below), and fill it with something like:
describe('data seeding', function () {
  it('should seed data', function (done) {
    sails.models.someModel
      .create({
        name: 'Some Name'
      })
      .then(function (result) {
        done();
      })
      .catch(done);
  });
});
Add this to your package.json under the "scripts" key:
"test": "_mocha test/bootstrap.test.js test/inject/**/*.inject.js --no-timeouts"
Run it with npm test to seed your data.
If you need to use it in development mode when you run sails lift, edit your config/env/development.js and add something like this:
module.exports = {
  connections: {
    mongodbServer: {
      database: 'table_test'
    }
  }
};
Now sails lift will use table_test instead of the production table, so your production table stays clean.

Using Grunt to Mock Endpoints

I'm using Yeoman, Grunt, and Bower to construct a platform for building a frontend independently of a backend. The idea is that all of my (AngularJS) controllers, services, factories, etc. live in this project and get injected afterwards into my server-side codebase based off the result of grunt build.
My question is:
How can I mock endpoints so that the Grunt server responds to the same endpoints as my (Rails) App will?
At the moment I am using:
angular.module('myApp', ['ngResource'])
  .run(['$rootScope', function ($rootScope) {
    $rootScope.testState = 'test';
  }]);
And then in each of my individual services:
mockJSON = {'foo': 'myMockJSON'}
And on every method:
if ($rootScope.testState == 'test') {
  return mockJSON;
} else {
  // real service logic with $q/$http goes here
}
Then after grunt build, testState = 'test' gets removed.
This is clearly a relatively janky architecture. How can I avoid it? How can I have Grunt respond to the same endpoints as my app (some of which have dynamic params), apply some logic (if necessary), and serve out a JSON file (possibly dependent on path params)?
I've fixed this issue by using Express to write a server that responds with static JSON.
First I created a directory in my project called 'api'. Within that directory I have the following files:
package.json:
{
  "name": "mockAPI",
  "version": "0.0.0",
  "dependencies": {
    "express": "~3.3.4"
  }
}
Then I run npm install in this directory.
index.js:
module.exports = require('./lib/server');
lib/server.js:
var express = require('express');
var app = express();

app.get('/my/endpoint', function (req, res) {
  res.json({'foo': 'myMockJSON'});
});

module.exports = app;
and finally in my global Gruntfile.js:
connect: {
  options: {
    port: 9000,
    hostname: 'localhost',
  },
  livereload: {
    options: {
      middleware: function (connect, options) {
        return [
          lrSnippet,
          mountFolder(connect, '.tmp'),
          mountFolder(connect, yeomanConfig.app),
          require('./api')
        ];
      }
    }
  }
},
Then the services make the requests, and the express server serves the correct JSON.
After grunt build, the express server is simply replaced by a rails server.
As of grunt-contrib-connect v0.7.0 you can also just add your custom middleware to the existing middleware stack without having to rebuild it manually.
livereload: {
  options: {
    open: true,
    base: [
      '.tmp',
      '<%= config.app %>'
    ],
    middleware: function (connect, options, middlewares) {
      // inject a custom middleware into the array of default middlewares
      middlewares.push(function (req, res, next) {
        if (req.url !== '/my/endpoint') {
          return next();
        }
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end('{"foo": "myMockJSON"}');
      });
      return middlewares;
    }
  }
},
See https://github.com/gruntjs/grunt-contrib-connect#middleware for the official documentation.
Alternatively you can use the grunt-connect-proxy to proxy everything that is missing in your test server to an actual backend.
It's quite easy to install; the one thing to remember when adding the proxy to your livereload connect middleware is to add it last, like this:
middleware: function (connect) {
  return [
    lrSnippet,
    mountFolder(connect, '.tmp'),
    mountFolder(connect, yeomanConfig.app),
    proxySnippet
  ];
}
grunt-connect-prism is similar to the Ruby project VCR. It provides an easy way for front end developers to record HTTP responses returned by their API (or some other remote source) and replay them later. It's basically an HTTP cache, but for developers working on a Single Page Application (SPA). You can also generate stubs for API calls that don't exist, and populate them the way you want.
It's useful for mocking complex & high latency API calls during development. It's also useful when writing e2e tests for your SPA only, removing the server from the equation. This results in much faster execution of your e2e test suite.
Prism works by adding a custom connect middleware to the connect server provided by the grunt-contrib-connect plugin. While in 'record' mode it will generate a file per response on the filesystem with content like the following:
{
  "requestUrl": "/api/ponies",
  "contentType": "application/json",
  "statusCode": 200,
  "data": {
    "text": "my little ponies"
  }
}
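Configuring it is plugin-specific; as a rough sketch based on the plugin's README (the mode, context, host, and port values here are illustrative, and you should consult the README for wiring it into your connect middleware):

// Gruntfile.js — illustrative grunt-connect-prism configuration
grunt.initConfig({
  prism: {
    options: {
      mode: 'record',    // 'proxy', 'record', 'mock', or 'mockrecord'
      context: '/api',   // requests under this path get recorded/replayed
      host: 'localhost', // the real backend to record from
      port: 8090
    }
  }
});

grunt.loadNpmTasks('grunt-connect-prism');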
DISCLAIMER: I'm the author of this project.
You can use an Apache proxy to connect your REST server with gruntjs.
Apache would do this:
proxy / -> gruntjs
proxy /service -> REST server
You would point your application at Apache, and the angular.js application would think it is talking to itself, so there is no cross-domain problem.
Here is a great tutorial on how to set this up:
http://alfrescoblog.com/2014/06/14/angular-js-activiti-webapp-with-activiti-rest/
Just my alternative way, based on Abraham P's answer. It does not need Express installed within the 'api' folder, and I can separate the mock services into a file per concern. For example, my 'api' folder contains 3 files:
api/
  index.js   // assigns all the "modules" so we can simply require the folder
  user.js    // all mocking for user
  product.js // all mocking for product
user.js:
var user = function (req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/user') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'role': 'admin'
      })
    );
  } else {
    next();
  }
};

module.exports = user;
product.js:
var product = function (req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/product') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'name': 'test',
        'category': 'test'
      })
    );
  } else {
    next();
  }
};

module.exports = product;
index.js just assigns all the "modules", so we can simply require it:
module.exports = {
  product: require('./product.js'),
  user: require('./user.js')
};
My Gruntfile.js:
connect: {
  options: {
    port: 9000,
    // Change this to '0.0.0.0' to access the server from outside.
    hostname: 'localhost',
    livereload: 35729
  },
  livereload: {
    options: {
      open: true,
      middleware: function (connect) {
        return [
          connect.static('.tmp'),
          connect().use(
            '/bower_components',
            connect.static('./bower_components')
          ),
          connect.static(appConfig.app),
          require('./api').user,
          require('./api').product
        ];
      }
    }
  }
}