Using MongoDB Adapter in NextAuth not working - mongodb

I git-cloned and copied the MUI Next.js example project and started from there.
From the NextAuth portal, they say I can just copy the MongoDB adapter setup here and it will basically work out of the box. I placed this file at this path: /src/lib/mongodb.js
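For reference, this is roughly what that helper file looks like in the NextAuth docs (a minimal sketch; the global caching is only there to survive hot reloads in development):
// /src/lib/mongodb.js - a minimal sketch of the client-promise helper described in the NextAuth docs
import { MongoClient } from "mongodb";

if (!process.env.MONGODB_URI) {
  throw new Error("Please add your MongoDB URI to .env.local");
}

const uri = process.env.MONGODB_URI;
const options = {};

let client;
let clientPromise;

if (process.env.NODE_ENV === "development") {
  // In development, reuse the client across hot reloads so a new connection isn't opened on every change.
  if (!global._mongoClientPromise) {
    client = new MongoClient(uri, options);
    global._mongoClientPromise = client.connect();
  }
  clientPromise = global._mongoClientPromise;
} else {
  client = new MongoClient(uri, options);
  clientPromise = client.connect();
}

export default clientPromise;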
Here I'm using CredentialsProvider. Basically I'm using my own form for the login & authentication process.
Here is my /pages/api/auth/[...nextauth].js file:
import { MongoDBAdapter } from "@next-auth/mongodb-adapter";
import NextAuth from 'next-auth';
import CredentialsProvider from 'next-auth/providers/credentials';
import clientPromise from "../../../src/lib/mongodb";

export default NextAuth({
  secret: process.env.SECRET,
  adapter: MongoDBAdapter(clientPromise),
  providers: [
    CredentialsProvider({
      async authorize(credentials) {
        const { email, password } = credentials
        if (email !== 'test@test.com' || password !== 'password123') {
          throw new Error('User does not exist. Please make sure you insert the correct email & password.')
        }
        return {
          id: 1,
          name: 'Tester',
          email: 'test@test.com'
        }
      }
    })
  ],
  callbacks: {
    redirect: async ({ url, baseUrl }) => {
      return baseUrl
    }
  }
})
From what I understand, I can use this adapter straight away and it will create 4 models/collections (User, Session, Account, VerificationToken) by default, so I don't need to create them myself. Ref doc here
According to the NextAuth MongoDB Adapter documentation, I just need to specify the MONGODB_URI in .env.local.
So here is my /.env.local file content:
NEXTAUTH_URL=http://localhost:3000
MONGODB_URI=mongodb+srv://<username>:<password>@rest.lvnpm.mongodb.net/<database_name>?retryWrites=true&w=majority
SECRET=someSecret
NODE_ENV=development
So currently it does nothing at all. I shouldn't need to set session.strategy to 'database', since NextAuth recognizes that by default when the adapter option is used.
What do I need to do here to make this work? Any help is appreciated. Here is my GitHub project

I just found out that in NextAuth, if I use CredentialsProvider, I won't be able to persist sessions using the database strategy. You may go here to the NextAuth documentation itself to see why.
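In other words, with CredentialsProvider the session falls back to JWTs. A minimal sketch of what the config ends up looking like, assuming you keep doing your own credential check inside authorize:
import NextAuth from 'next-auth';
import CredentialsProvider from 'next-auth/providers/credentials';

export default NextAuth({
  secret: process.env.SECRET,
  // No database-backed sessions with CredentialsProvider: use the JWT strategy instead.
  session: {
    strategy: 'jwt'
  },
  providers: [
    CredentialsProvider({
      async authorize(credentials) {
        // Look the user up yourself (e.g. via the MongoDB client) and return it,
        // or return null / throw to reject the sign-in.
        const { email, password } = credentials
        if (email !== 'test@test.com' || password !== 'password123') {
          return null
        }
        return { id: 1, name: 'Tester', email: 'test@test.com' }
      }
    })
  ]
})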

Related

[next-auth][error][adapter_error_getUserByAccount]; Cannot read properties of undefined (reading 'findUnique')

I am making a sign-up page with 3 providers (Twitter, Facebook and Instagram) using next-auth and Prisma with MongoDB. The issue appears when I try to sign up with any of the providers. Here is my nextauth.js file.
import NextAuth from "next-auth"
import { PrismaAdapter } from "@next-auth/prisma-adapter"
import { PrismaClient } from '@prisma/client';
import InstagramProvider from "next-auth/providers/instagram";
import TwitterProvider from "next-auth/providers/twitter";
import FacebookProvider from "next-auth/providers/facebook";

const prisma = new PrismaClient();

export default NextAuth({
  adapter: PrismaAdapter(prisma),
  providers: [
    InstagramProvider({
      clientId: process.env.INSTAGRAM_CLIENT_ID,
      clientSecret: process.env.INSTAGRAM_CLIENT_SECRET
    }),
    TwitterProvider({
      clientId: process.env.TWITTER_CLIENT_ID,
      clientSecret: process.env.TWITTER_CLIENT_SECRET,
      version: "2.0",
    }),
    FacebookProvider({
      clientId: process.env.FACEBOOK_CLIENT_ID,
      clientSecret: process.env.FACEBOOK_CLIENT_SECRET
    }),
  ],
  session: {
    strategy: 'jwt',
  },
});
At first I thought it was the dependencies, so I reinstalled all of them, but that didn't help and I don't see what else could be the problem.
The problem is in your adapter in the [nextauth].js file or wherever you are declaring the prisma instance.
Check out those similar discussions:
https://github.com/nextauthjs/next-auth/discussions/4152
Integrating Prisma Adapter with Next Auth - Unexpected token 'export'
The issue actually comes from the Prisma schema. I fixed it after reading the NextAuth documentation about Prisma and MongoDB.

How to set idField in feathersVuex auth service

I have an issue that I couldn't resolve for quite a long time.
I'm making a shop using FeathersJS as a backend API, Vue.js on the front end, and MongoDB. As I try to authenticate a user on the website, everything goes smoothly until the "setUser" action fires, and in the auth service getters I get an error saying "Cannot read property 'idField' of undefined". I believe that I changed this property to "_id" wherever I could, but the error still occurs. Here are some screenshots that may be helpful. This project is supposed to get me my first job in web dev, so I would be eternally grateful for your support.
AddItem action in vuex,
setUser action in vuex
Feathers-client.js:
import feathers from '@feathersjs/feathers';
import socketio from '@feathersjs/socketio-client';
import auth from '@feathersjs/authentication-client';
import io from 'socket.io-client';
import { iff, discard } from 'feathers-hooks-common';
import feathersVuex from 'feathers-vuex';

const socket = io('http://localhost:3030', { transports: ['websocket'] });

const feathersClient = feathers()
  .configure(socketio(socket))
  .configure(auth({ storage: window.localStorage, debug: false }))
  .hooks({
    before: {
      all: [
        iff(
          context => ['create', 'update', 'patch'].includes(context.method),
          discard('__id', '__isTemp'),
        ),
      ],
    },
  });

export default feathersClient;

// Setting up feathers-vuex
const {
  makeServicePlugin, makeAuthPlugin, BaseModel, models, FeathersVuex,
} = feathersVuex(
  feathersClient,
  {
    idField: '_id', // Must match the id field in your database table/collection
    whitelist: ['$regex', '$options'],
  },
);

export {
  makeAuthPlugin, makeServicePlugin, BaseModel, models, FeathersVuex,
};
auth service file:
import { makeAuthPlugin } from '../../feathers-client';
export default makeAuthPlugin({
userService: 'api/users',
entityIdField: '_id',
});
The issue here was the name of the service, which as a plugin in Vuex was registered under the name "users", not "api/users". The solution was to set nameStyle: "path" instead of "short" in the servicePlugin options.
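For context, a rough sketch of what the users service plugin might look like with that option (the file name and model class here are assumptions, not from the original post):
// users.service.js - hypothetical service plugin file
import feathersClient, { makeServicePlugin, BaseModel } from '../../feathers-client';

class User extends BaseModel {
  static modelName = 'User';
}

const servicePath = 'api/users';

export default makeServicePlugin({
  Model: User,
  service: feathersClient.service(servicePath),
  servicePath,
  nameStyle: 'path', // register the vuex module as 'api/users' instead of the short name 'users'
});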

How do I solve the Keycloak refresh /#state issue on my Angular 7 project - I'm using keycloak-angular

I have also tried all varying combinations of web origins and valid redirect URIs.
I log in via Keycloak and it continuously redirects me back and forth between my localhost application and this URL: http://localhost:4200/#state=166446fd-daf6-4b76-b595-583c01c663df&session_state=57ead1f3-bf41-4117-9ddf-75e37c9248e7&code=8692b58b-0868-4762-b82e-acde9911dd34.57ead1f3-bf41-4117-9ddf-75e37c9248e7.1e8b5b9d-b590-453e-b396-62b46c18cc9f
I have tried it on Firefox and Chrome with the same issue - in the network tab it seems to be looking for the keycloak.json file, even though I can log in to the correct realm via Keycloak:
GET http://localhost:4200/keycloak.json 404 (Not Found)
scheduleTask # zone.js:2969
ERROR An error happened during Keycloak initialization. core.js:1601
Unhandled Promise rejection: An error happened during Keycloak initialization. ; Zone: ; Task: Promise.then ; Value: An error happened during Keycloak initialization. undefined
static init(): Promise<any> {
  const keycloakAuth: any = Keycloak({
    url: 'http://localhost:8080/auth',
    realm: 'ContractPortal',
    clientId: 'secretkey2',
    'ssl-required': 'external',
    'public-client': true
  });
  KeycloakService.auth.loggedIn = false;
  return new Promise((resolve, reject) => {
    keycloakAuth.init({ onLoad: 'login-required' })
      .success(() => {
        console.log(keycloakAuth);
        KeycloakService.auth.loggedIn = true;
        KeycloakService.auth.authz = keycloakAuth;
        KeycloakService.auth.logoutUrl = keycloakAuth.authServerUrl
          + '/realms/angular_keycloak/protocol/openid-connect/logout?redirect_uri='
          + document.baseURI;
        resolve();
      })
      .error(() => {
        reject();
      });
  });
}
It'd be great if one of you could point me in the right direction to solve this issue.
I have found another question similar to mine (link), but I'm not sure how to implement its solution. This is my provider setup:
providers: [
  {
    provide: APP_INITIALIZER,
    useFactory: initializer,
    multi: true,
    deps: [KeycloakService]
  }
],
but the OP puts the following in his providers:
providers: [
  KeyCloakService,
  AssetService,
  {
    provide: LocationStrategy,
    useClass: PathLocationStrategy
  }
]
Please let me know if you require any other information
For anybody who ran into the same issue: I misunderstood the keycloak.json file and did not know where to get it, so I was exporting the complete JSON file from Keycloak - but THIS is not how you should get it!
First you have to go to your realm > Clients and click on Installation,
then download the Keycloak OIDC JSON file,
then place it next to the index.html file in your application (a sketch of what that file looks like is below).
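For reference, the downloaded file is roughly shaped like this (the values shown are only an assumed example, filled in from the question's Keycloak config):
{
  "realm": "ContractPortal",
  "auth-server-url": "http://localhost:8080/auth",
  "ssl-required": "external",
  "resource": "secretkey2",
  "public-client": true,
  "confidential-port": 0
}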
This solved my issue - hope it helps somebody else
I was researching on the internet and it is the same case for me: my Angular page keeps reloading automatically in an infinite loop.
I made some changes in the keycloak-init.ts file (find the following code and change checkLoginIframe from true to false):
Old Code: keycloak.init ({onLoad: 'login-required', "checkLoginIframe": true})
New Code: keycloak.init ({onLoad: 'login-required', "checkLoginIframe": false})
this.loadingService.isLoading$.pipe(delay(100)).subscribe((data => {
  this.loading = data
}))

this.isLogging = await this.keycloak.isLoggedIn();
type roleUser = Array<{ id: number, text: string }>

if (this.isLogging) {
  this.userProfile = await this.keycloak.loadUserProfile();
  this.userRoles = await this.keycloak.getUserRoles();
} else {
  // this.clearSession();
}
If you are using this.keycloak.loadUserProfile(), remove the "clear session" call (commented out above).

Connect Gatsby with Postgres

I would like to pull data from Postgres into Gatsby using GraphQL. I have written a Node.js server, but I cannot find a way to use it in Gatsby.
(https://github.com/gstuczynski/graphql-postgres-test)
Do you have any ideas?
What you need to do is implement a source plugin as seen here: https://www.gatsbyjs.org/docs/create-source-plugin/.
There are many examples within the Gatsby repository that implement the source API. See those for inspiration! Basically you need to translate the contents of your Postgres db into a format Gatsby understands. Gatsby calls this format "nodes".
You could implement a plugin which interfaces with your db directly or with whatever API your Node server exposes (GraphQL, REST etc.), as in the sketch below.
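A minimal sketch of what such a source plugin could look like in gatsby-node.js, assuming a pg Pool and a hypothetical posts table (the table, columns and node type are made-up examples):
// gatsby-node.js
const { Pool } = require("pg");

exports.sourceNodes = async ({ actions, createNodeId, createContentDigest }) => {
  const { createNode } = actions;
  const pool = new Pool({ connectionString: process.env.DATABASE_URL });

  // Pull rows out of Postgres...
  const { rows } = await pool.query("SELECT id, title, body FROM posts");

  // ...and turn each row into a Gatsby node that GraphQL can query.
  rows.forEach((row) => {
    createNode({
      ...row,
      id: createNodeId(`post-${row.id}`),
      parent: null,
      children: [],
      internal: {
        type: "Post",
        contentDigest: createContentDigest(row),
      },
    });
  });

  await pool.end();
};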
The gatsby-source-pg module connects directly to your database and adds the tables/views/functions/etc to Gatsby's GraphQL API. To use it, install the module:
yarn add gatsby-source-pg
then add it to the plugin list in gatsby-config.js:
module.exports = {
  plugins: [
    /* ... */
    {
      resolve: "gatsby-source-pg",
      options: {
        connectionString: "postgres://localhost/my_db",
      },
    },
  ],
};
The connection string can also include username/password, host, port and SSL if you need to connect to a remote database; e.g.: postgres://pg_user:pg_pass@pg_host:5432/pg_db?ssl=1
You can query it in your components using the root postgres field, e.g.:
{
  postgres {
    allPosts {
      nodes {
        id
        title
        authorId
        userByAuthorId {
          id
          username
        }
      }
    }
  }
}
Gatsby now supports an arbitrary GraphQL endpoint as a source which will help: https://www.gatsbyjs.org/packages/gatsby-source-graphql/
You can also use Hasura to give you an instant GraphQL API on Postgres and then query that from your Gatsby app. You can follow the tutorial here.
Step 1: Deploy Hasura against your existing Postgres database: https://docs.hasura.io/1.0/graphql/manual/getting-started/using-existing-database.html
Step 2: Install the gatsby-source-graphql plugin for gatsby: https://www.gatsbyjs.org/packages/gatsby-source-graphql/
Step 3: Configure the plugin
{
  plugins: [
    {
      resolve: 'gatsby-source-graphql', // <- Configure plugin
      options: {
        typeName: 'HASURA',
        fieldName: 'hasura', // <- fieldName under which schema will be stitched
        createLink: () =>
          createHttpLink({
            uri: `https://my-graphql.herokuapp.com/v1alpha1/graphql`, // <- Configure connection GraphQL url
            headers: {},
            fetch,
          }),
        refetchInterval: 10, // Refresh every 10 seconds for new data
      },
    },
  ]
}
Step 4: Make the GraphQL query in your component
const Index = ({ data }) => (
  <div>
    <h1>My Authors</h1>
    <AuthorList authors={data.hasura.author} />
  </div>
)

export const query = graphql`
  query AuthorQuery {
    hasura {     # <- fieldName as configured in the gatsby-config
      author {   # Normal GraphQL query
        id
        name
      }
    }
  }
`
Other links:
Sample-app/tutorial:
https://github.com/hasura/graphql-engine/tree/master/community/sample-apps/gatsby-postgres-graphql
Blogpost:
https://blog.hasura.io/create-gatsby-sites-using-graphql-on-postgres-603b5dd1e516
Note: I work at Hasura.

Using Grunt to Mock Endpoints

I'm using Yeoman, Grunt, and Bower to construct a platform for building a frontend independently of a backend. The idea is that all of my (AngularJS) controllers, services, factories, etc. live in this project and get injected afterwards into my server-side codebase based on the result of grunt build.
My question is:
How can I mock endpoints so that the Grunt server responds to the same endpoints as my (Rails) App will?
At the moment I am using:
angular.module('myApp', ['ngResource'])
  .run(['$rootScope', function ($rootScope) {
    $rootScope.testState = 'test';
  }]);
And then in each of my individual services:
mockJSON = {'foo': 'myMockJSON'}
And on every method:
if ($rootScope.testState == 'test') {
  return mockJSON;
} else {
  // real service logic with $q/$http goes here
}
Then after grunt build, testState = 'test' gets removed.
This is clearly a relatively janky architecture. How can I avoid it? How can I have Grunt respond to the same endpoints as my app (some of which have dynamic params), apply some logic (if necessary), and serve out a JSON file (possibly dependent on path params)?
I've fixed this issue by using Express to write a server that responds with static JSON.
First I created a directory in my project called 'api'. Within that directory I have the following files:
package.json:
{
  "name": "mockAPI",
  "version": "0.0.0",
  "dependencies": {
    "express": "~3.3.4"
  }
}
Then I run npm install in this directory.
index.js:
module.exports = require('./lib/server');
lib/server.js:
var express = require('express');
var app = express();

app.get('/my/endpoint', function (req, res) {
  res.json({ 'foo': 'myMockJSON' });
});

module.exports = app;
and finally in my global Gruntfile.js:
connect: {
  options: {
    port: 9000,
    hostname: 'localhost',
  },
  livereload: {
    options: {
      middleware: function (connect, options) {
        return [
          lrSnippet,
          mountFolder(connect, '.tmp'),
          mountFolder(connect, yeomanConfig.app),
          require('./api')
        ];
      }
    }
  },
Then the services make the requests, and the express server serves the correct JSON.
After grunt build, the express server is simply replaced by a rails server.
As of grunt-contrib-connect v0.7.0 you can also just add your custom middleware to the existing middleware stack without having to manually rebuild it.
livereload: {
  options: {
    open: true,
    base: [
      '.tmp',
      '<%= config.app %>'
    ],
    middleware: function (connect, options, middlewares) {
      // inject a custom middleware into the array of default middlewares
      middlewares.push(function (req, res, next) {
        if (req.url !== '/my/endpoint') {
          return next();
        }
        res.writeHead(200, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ 'foo': 'myMockJSON' }));
      });
      return middlewares;
    }
  }
},
See https://github.com/gruntjs/grunt-contrib-connect#middleware for the official documentation.
Alternatively you can use the grunt-connect-proxy to proxy everything that is missing in your test server to an actual backend.
It's quite easy to install; the one thing to remember when adding the proxy to your livereload connect middleware is to add it last, like this:
middleware: function (connect) {
  return [
    lrSnippet,
    mountFolder(connect, '.tmp'),
    mountFolder(connect, yeomanConfig.app),
    proxySnippet
  ];
}
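For completeness, a rough sketch of the matching proxy configuration (the context, host and port below are assumed examples, not from the original answer): proxySnippet comes from grunt-connect-proxy, and the configureProxies task has to run before connect.
var proxySnippet = require('grunt-connect-proxy/lib/utils').proxyRequest;

grunt.initConfig({
  connect: {
    options: {
      port: 9000,
      hostname: 'localhost'
    },
    // Everything under /api gets forwarded to the real backend.
    proxies: [{
      context: '/api',
      host: 'localhost',
      port: 3000
    }],
    livereload: {
      // ...middleware as shown above, with proxySnippet added last
    }
  }
});

// Run the proxy configuration task before starting the connect server, e.g.:
// grunt.registerTask('serve', ['configureProxies', 'connect:livereload', 'watch']);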
grunt-connect-prism is similar to the Ruby project VCR. It provides an easy way for front end developers to record HTTP responses returned by their API (or some other remote source) and replay them later. It's basically an HTTP cache, but for developers working on a Single Page Application (SPA). You can also generate stubs for API calls that don't exist, and populate them the way you want.
It's useful for mocking complex & high latency API calls during development. It's also useful when writing e2e tests for your SPA only, removing the server from the equation. This results in much faster execution of your e2e test suite.
Prism works by adding a custom connect middleware to the connect server provided by the grunt-contrib-connect plugin. While in 'record' mode it will generate a file per response on the filesystem with content like the following:
{
  "requestUrl": "/api/ponies",
  "contentType": "application/json",
  "statusCode": 200,
  "data": {
    "text": "my little ponies"
  }
}
DISCLAIMER: I'm the author of this project.
You can use an Apache proxy to connect your REST server with gruntjs.
Apache would do this:
proxy / -> gruntjs
proxy /service -> REST server
You would use your application by hitting Apache, and the angular.js application would think it is talking with itself, so there is no cross-domain problem.
Here is a great tutorial on how to set this up:
http://alfrescoblog.com/2014/06/14/angular-js-activiti-webapp-with-activiti-rest/
Just my alternative way, based on Abraham P's answer. It does not require installing Express within the 'api' folder, and I can separate the mock services into individual files. For example, my 'api' folder contains 3 files:
api\
index.js // assign all the "modules" and then simply require that.
user.js // all mocking for user
product.js // all mocking for product
file user.js
var user = function (req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/user') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'role': 'admin'
      })
    );
  } else {
    next();
  }
}
module.exports = user;
file product.js
var product = function (req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/product') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'name': 'test',
        'category': 'test'
      })
    );
  } else {
    next();
  }
}
module.exports = product;
index.js just assigns all the "modules" and we simply require that.
module.exports = {
  product: require('./product.js'),
  user: require('./user.js')
};
My Gruntfile.js file
connect: {
  options: {
    port: 9000,
    // Change this to '0.0.0.0' to access the server from outside.
    hostname: 'localhost',
    livereload: 35729
  },
  livereload: {
    options: {
      open: true,
      middleware: function (connect) {
        return [
          connect.static('.tmp'),
          connect().use(
            '/bower_components',
            connect.static('./bower_components')
          ),
          connect.static(appConfig.app),
          require('./api').user,
          require('./api').product,
        ];
      }
    }
  }
}