How to integrate Spring MVC and Nuxt.js?

I have learned Nuxt.js and Spring MVC, and I want to know how to build a single-page web application by integrating or configuring the two together. I couldn't find any well-documented material on the internet. Basically, I want to handle all CRUD operations asynchronously; the database is MySQL. If possible, can someone show me how to do this? Thank you in advance!

I hope this answers your question, if I understand it correctly.
Assume you have already written the data access operations with Spring, Nuxt.js runs on port 3000, and Tomcat runs on port 8080.
Let's say this is our RestController, which fetches user data from the database (through the repository and service layers). Note the use of @CrossOrigin: it enables cross-origin requests for the RESTful web service by adding Cross-Origin Resource Sharing (CORS) headers to the response. Here we only allow localhost:3000 to send cross-origin requests. You can also go for a global CORS configuration.
@RestController
@RequestMapping("api")
@CrossOrigin(origins = "http://localhost:3000")
public class MainRestController {

    private final IRestService restService;

    @Autowired
    public MainRestController(IRestService restService) {
        this.restService = restService;
    }

    @GetMapping(value = "users", produces = MediaType.APPLICATION_JSON_VALUE)
    public ResponseEntity<Iterable<String>> getUsers() {
        try {
            return new ResponseEntity<>(restService.getAllUsers(), HttpStatus.OK);
        } catch (Exception e) {
            return new ResponseEntity<>(HttpStatus.BAD_REQUEST);
        }
    }
}
Since you are using Nuxt.js, this is our Vue component, which calls the REST endpoint we created above. We use axios to fetch the response.
<template>
  <div class="container">
    <ul>
      <li v-for="user of users">
        {{ user }}
      </li>
    </ul>
  </div>
</template>

<script>
export default {
  async asyncData({ $axios }) {
    const users = await $axios.$get('http://localhost:8080/api/users');
    return { users }
  }
}
</script>
Your nuxt.config.js should contain the following (the @nuxtjs/axios and @nuxtjs/proxy modules must be installed):
modules: [
  '@nuxtjs/axios',
  '@nuxtjs/proxy'
],

axios: {
  proxy: true
},

env: {
  baseUrl: process.env.BASE_URL || 'http://localhost:3000'
},

proxy: {
  '/api/': {
    target: 'http://localhost:8080/',
    pathRewrite: { '^/api': '' },
    changeOrigin: true
  }
},
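With this proxy in place, the component could request a relative path instead of hard-coding the backend host, which also avoids CORS during development because the browser only ever talks to localhost:3000. A minimal sketch, assuming the pathRewrite rule above is dropped so that /api/users still reaches the controller mapped under /api:
// pages/users.vue, script section only (illustrative)
export default {
  async asyncData({ $axios }) {
    // forwarded by @nuxtjs/proxy to http://localhost:8080/api/users
    const users = await $axios.$get('/api/users');
    return { users };
  }
}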


Why is IdentityServer redirecting to http rather than https?

I have a very simple MVC5 website that I'm trying to secure with IdentityServer3.
Both my website and my IdentityServer instance are hosted as separate sites in AppHarbor. Both are behind https.
When I hit a resource in my website that is protected by an [Authorize] attribute (e.g., /Home/About), I am successfully redirected to IdentityServer, and I can successfully authenticate.
When IdentityServer POSTs its response back to the website (via app.FormPostResponse.js), the website responds with a 302 redirect to the requested resource - as expected. However, this redirect is to http, not https (see the network trace below).
I'm sure this is just something wrong with my IdentityServer config, but I'd appreciate any pointers as to what I've got wrong.
(AppHarbor uses a reverse proxy (nginx I believe) in front of IIS, where SSL terminates - so I have RequireSsl = false for this scenario, as per the IdentityServer documentation.)
Here is my website's Startup.cs
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        app.UseCookieAuthentication(new CookieAuthenticationOptions
        {
            AuthenticationType = "Cookies"
        });

        app.UseOpenIdConnectAuthentication(new OpenIdConnectAuthenticationOptions
        {
            Authority = "https://<my-idsrv3>.apphb.com/identity",
            ClientId = "<my-client-id>",
            Scope = "openid profile roles email",
            RedirectUri = "https://<my-website>.apphb.com",
            ResponseType = "id_token",
            SignInAsAuthenticationType = "Cookies",
            UseTokenLifetime = false
        });

        JwtSecurityTokenHandler.InboundClaimTypeMap = new Dictionary<string, string>();
    }
}
Here is Startup.cs from my IdentityServer3 instance:
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        app.Map("/identity", idsrvApp =>
        {
            idsrvApp.UseIdentityServer(new IdentityServerOptions
            {
                SiteName = "My Identity Server",
                SigningCertificate = Certificates.LoadSigningCertificate(),
                RequireSsl = false,
                PublicOrigin = "https://<my-idsrv3>.apphb.com",
                Factory = new IdentityServerServiceFactory()
                    .UseInMemoryUsers(Users.Get())
                    .UseInMemoryClients(Clients.Get())
                    .UseInMemoryScopes(Scopes.Get())
            });
        });
    }
}
Here is the definition of my website Client:
new Client
{
    Enabled = true,
    ClientName = "My Website Client",
    ClientId = "<my-client-id>",
    Flow = Flows.Implicit,
    RedirectUris = new List<string>
    {
        "https://<my-website>.apphb.com"
    },
    AllowAccessToAllScopes = true
}
Here is the trace from Chrome after clicking 'Yes, Allow' on the IdentityServer consent screen (screenshot omitted; it shows the 302 response redirecting to http rather than https).
So it looks like this issue was caused by my client website being behind an SSL-terminating nginx front-end.
With reference to this GitHub issue, I added the following to the start of my website's app configuration:
app.Use(async (ctx, next) =>
{
    string proto = ctx.Request.Headers.Get("X-Forwarded-Proto");
    if (!string.IsNullOrEmpty(proto))
    {
        ctx.Request.Scheme = proto;
    }
    await next();
});
This makes the website aware that incoming requests were over https; this in turn appears to ensure that the IdentityServer3 middleware generates https URIs.
Had the same issue running IdentityServer4 in an Azure App Service. Even with forced HTTPS, the generated URLs in .well-known/openid-configuration were still http://.
Fixed it using the same solution as the other answer, but with the ASP.NET Core ForwardedHeadersExtensions:
var forwardOptions = new ForwardedHeadersOptions
{
    ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto,
    // Needed because of mixing http and https.
    RequireHeaderSymmetry = false
};

// Accept X-Forwarded-* headers from all sources.
forwardOptions.KnownNetworks.Clear();
forwardOptions.KnownProxies.Clear();

app.UseForwardedHeaders(forwardOptions);
See also https://github.com/IdentityServer/IdentityServer4/issues/1331 for more discussion on this subject.
Add forwarded headers in your Startup:
services.Configure<ForwardedHeadersOptions>(options =>
{
    options.ForwardedHeaders =
        ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto | ForwardedHeaders.XForwardedHost;
});
and
app.UseForwardedHeaders(new ForwardedHeadersOptions()
{
    ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
});
Finally, tell the configuration to replace http with https in the redirect URL (I'm still looking for a better way to implement this).
In your .AddOpenIdConnect() call, add:
Func<RedirectContext, Task> redirectToIdentityProvider = (ctx) =>
{
    if (!ctx.ProtocolMessage.RedirectUri.StartsWith("https") && !ctx.ProtocolMessage.RedirectUri.Contains("localhost"))
        ctx.ProtocolMessage.RedirectUri = ctx.ProtocolMessage.RedirectUri.Replace("http", "https");

    return Task.FromResult(0);
};

opt.Events = new OpenIdConnectEvents
{
    OnRedirectToIdentityProvider = redirectToIdentityProvider
};

How to import MongoDB in Vue.js 2?

I have already tried to install mongodb via npm, but I keep getting the error Cannot find module "fs".
My code looks like this:
<script>
const MongoClient = require('mongodb').MongoClient;

export default {
  data() {
    return {
      msg: 'this is a test'
    }
  },
  created: function () {
    MongoClient.connect('mongodb://127.0.0.1:27017', (err, database) => {
      if (err) {
        console.log('1');
      } else {
        console.log('2');
      }
    })
  }
}
</script>
<template>
  <div>
    {{ msg }}
  </div>
</template>
So how do I import MongoDB into my Vue.js 2 project?
Vue.js is a frontend framework; it runs in the browser, which is why Node-only modules such as fs (pulled in by the mongodb driver) cannot be resolved there.
You definitely should not try to talk to the database directly from Vue.
You should have a backend built with whatever language/framework you want: Node.js (if you want to stick with JavaScript), ASP.NET (C#), Spring (Java), etc. The backend deals with the database; the frontend only makes AJAX requests to the backend, sending and receiving JSON that Vue then renders. A rough split along those lines is sketched below.
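For example, a minimal Express backend could own the MongoDB connection and expose a JSON endpoint that the Vue app calls over HTTP. This is only a sketch under those assumptions; the port, database and collection names are illustrative, and with the 2.x driver the connect callback receives the db directly instead of a client:
// server.js, run with Node (never bundled into the Vue app)
const express = require('express');
const { MongoClient } = require('mongodb');

const app = express();

MongoClient.connect('mongodb://127.0.0.1:27017', (err, client) => {
  if (err) throw err;
  const db = client.db('test'); // illustrative database name

  // JSON endpoint the Vue component will call
  app.get('/api/messages', (req, res) => {
    db.collection('messages').find().toArray((err, docs) => {
      if (err) return res.status(500).json({ error: 'query failed' });
      res.json(docs); // in development you may also need CORS or a dev-server proxy
    });
  });

  app.listen(3001, () => console.log('API listening on http://localhost:3001'));
});

// In the Vue component, fetch the JSON instead of requiring mongodb:
// created: function () {
//   fetch('http://localhost:3001/api/messages')
//     .then(response => response.json())
//     .then(docs => { this.msg = JSON.stringify(docs); });
// }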

Credential Validation from database in Oracle JET

I have created a sample application in Oracle JET which routes to the home page upon login.
I want to validate the user credentials (username and password) against a table in the database using RESTful web services, and only upon successful validation should the application route to the home page.
Since I am new to Oracle JET and have little experience integrating and validating user input against data in a database, it would be great if someone could help me with this. Thank you.
You can use the ajax method to call RESTful web services.
Here is a sample that can help you.
self.username = ko.observable("");
self.password = ko.observable("");

self.login = function (data, event) {
  $.ajax({
    url: "https://restservicesforlogin?username=" + self.username() + "&userpwd=" + self.password(),
    type: 'GET',
    headers: {
      // your header details
    },
    success: function (data) {
      // ERROR_CODE is assumed to be an observable populated from the service response
      if (self.ERROR_CODE() == 'S') {
        oj.Router.rootInstance.go('homePage');
      }
      if (self.ERROR_CODE() == 'E') {
        alert("Invalid username/password");
        self.isLoggedIn(false);
      }
    },
    error: function (jqXHR, exception) {
      alert("Internal Server Error");
    }
  });
};

JayData Web API Error $metadata

I get the following error from JayData.
Object {requestUri: "/api/program/getprograms/$metadata", statusCode: 404, statusText: "Not Found", responseText: "<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Stric…↵ </fieldset> ↵</div> ↵</div> ↵</body> ↵</html> ↵"}
This is how I am calling the service. Any idea what I am doing wrong?
$data.service("/api/program/getprograms", function (contextFactory) {
  var remotecontext = contextFactory();
  remotecontext.Program.filter("it.Program.ProgramID == '1'");
  context.Programs.forEach(function (program) {
    console.log(program);
  });
});
I also tried:
var remotedb = new AppContext({ provider: 'webApi', databaseName: 'RemoteDB', dataSource: '/api/program/getprograms' });
$data.service() and $data.initService() were created to generate a dynamic client-side data model on the fly. This is the alternative to generating a static data model with JaySvcUtil.exe.
This won't work with WebAPI endpoints and the webApi provider, since there is no metadata service in WebAPI. The $metadata service is only available on OData endpoints; for WebAPI you have to build your client-side data model manually, as sketched below.
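A rough sketch of what a manually defined client-side model can look like with JayData (entity and field names are illustrative, the provider options are taken from the question, and the exact calls may vary between JayData versions):
// Define the entity type by hand instead of relying on $metadata
var Program = $data.Entity.extend('Program', {
  ProgramID: { type: 'int', key: true },
  Name: { type: 'string' }
});

// Define a context that exposes it as an entity set
var AppContext = $data.EntityContext.extend('AppContext', {
  Programs: { type: $data.EntitySet, elementType: Program }
});

// Point the context at the WebAPI endpoint and query it once it is ready
var remotedb = new AppContext({ provider: 'webApi', databaseName: 'RemoteDB', dataSource: '/api/program/getprograms' });
remotedb.onReady(function () {
  remotedb.Programs
    .filter(function (p) { return p.ProgramID == 1; })
    .forEach(function (program) { console.log(program); });
});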

Using Grunt to Mock Endpoints

I'm using Yeoman, Grunt, and Bower to construct a platform for building a frontend independently of a backend. The idea is that all of my (AngularJS) controllers, services, factories, etc. live in this project and get injected afterwards into my server-side codebase based on the result of grunt build.
My question is:
How can I mock endpoints so that the Grunt server responds to the same endpoints as my (Rails) App will?
At the moment I am using:
angular.module('myApp', ['ngResource'])
  .run(['$rootScope', function ($rootScope) {
    $rootScope.testState = 'test';
  }]);
And then in each of my individual services:
mockJSON = {'foo': 'myMockJSON'}
And on every method:
if ($rootScope.testState == 'test') {
  return mockJSON;
} else {
  // real service logic with $q/$http goes here
}
Then, after grunt build, testState = 'test' gets removed.
This is clearly a relatively janky architecture. How can I avoid it? How can I have Grunt respond to the same endpoints as my app (some of which have dynamic params), apply some logic (if necessary), and serve out a JSON file (possibly dependent on path params)?
I've fixed this issue by using Express to write a server that responds with static JSON.
First I created a directory in my project called 'api'. Within that directory I have the following files:
package.json:
{
  "name": "mockAPI",
  "version": "0.0.0",
  "dependencies": {
    "express": "~3.3.4"
  }
}
Then I run npm install in this directory.
index.js:
module.exports = require('./lib/server');
lib/server.js:
var express = require('express');
var app = express();

app.get('/my/endpoint', function (req, res) {
  res.json({'foo': 'myMockJSON'});
});

module.exports = app;
and finally in my global Gruntfile.js:
connect: {
  options: {
    port: 9000,
    hostname: 'localhost'
  },
  livereload: {
    options: {
      middleware: function (connect, options) {
        return [
          lrSnippet,
          mountFolder(connect, '.tmp'),
          mountFolder(connect, yeomanConfig.app),
          require('./api')
        ];
      }
    }
  },
Then the services make the requests, and the express server serves the correct JSON.
After grunt build, the express server is simply replaced by a rails server.
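Since the question also asks about endpoints with dynamic params, the same Express app can expose parameterised routes and vary the JSON it serves; a small sketch (route and field names are purely illustrative):
// lib/server.js, an additional route with a path parameter
app.get('/my/endpoint/:id', function (req, res) {
  // req.params.id holds the dynamic segment; branch on it or load a fixture file
  res.json({ id: req.params.id, foo: 'myMockJSON for ' + req.params.id });
});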
As of grunt-contrib-connect v0.7.0 you can also just add your custom middleware to the existing middleware stack without having to rebuild it manually.
livereload: {
  options: {
    open: true,
    base: [
      '.tmp',
      '<%= config.app %>'
    ],
    middleware: function (connect, options, middlewares) {
      // inject a custom middleware into the array of default middlewares
      middlewares.push(function (req, res, next) {
        if (req.url !== '/my/endpoint') {
          return next();
        }
        res.writeHead(200, { 'Content-Type': 'application/json' });
        // send valid JSON (double-quoted keys and values)
        res.end(JSON.stringify({ foo: 'myMockJSON' }));
      });
      return middlewares;
    }
  }
},
See https://github.com/gruntjs/grunt-contrib-connect#middleware for the official documentation.
Alternatively, you can use grunt-connect-proxy to proxy everything that is missing from your test server to an actual backend.
It's quite easy to install; the one thing to remember when adding the proxy to your livereload connect middleware is to add it last, like this:
middleware: function (connect) {
  return [
    lrSnippet,
    mountFolder(connect, '.tmp'),
    mountFolder(connect, yeomanConfig.app),
    proxySnippet
  ];
}
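For reference, proxySnippet in the snippet above comes from the plugin itself, and the proxy targets are declared alongside the connect configuration; a minimal sketch along the lines of the grunt-connect-proxy README (context, host and port are illustrative):
// Gruntfile.js (illustrative values)
var proxySnippet = require('grunt-connect-proxy/lib/utils').proxyRequest;

// declared next to the connect options:
// proxies: [{
//   context: '/api',   // requests starting with /api ...
//   host: 'localhost',
//   port: 8080,        // ... are forwarded to the real backend
//   changeOrigin: true
// }],

// and the proxies are configured before the server starts, e.g.:
// grunt.registerTask('serve', ['configureProxies', 'connect:livereload', 'watch']);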
grunt-connect-prism is similar to the Ruby project VCR. It provides an easy way for front end developers to record HTTP responses returned by their API (or some other remote source) and replay them later. It's basically an HTTP cache, but for developers working on a Single Page Application (SPA). You can also generate stubs for API calls that don't exist, and populate them the way you want.
It's useful for mocking complex & high latency API calls during development. It's also useful when writing e2e tests for your SPA only, removing the server from the equation. This results in much faster execution of your e2e test suite.
Prism works by adding a custom connect middleware to the connect server provided by the grunt-contrib-connect plugin. While in 'record' mode it will generate a file per response on the filesystem with content like the following:
{
  "requestUrl": "/api/ponies",
  "contentType": "application/json",
  "statusCode": 200,
  "data": {
    "text": "my little ponies"
  }
}
DISCLAIMER: I'm the author of this project.
You can use an Apache proxy to connect your REST server with gruntjs.
Apache would do this:
proxy / -> gruntjs
proxy /service -> REST server
You would point your application at Apache, and the AngularJS application would think it is talking to itself, so there is no cross-domain problem.
Here is a great tutorial on how to set this up:
http://alfrescoblog.com/2014/06/14/angular-js-activiti-webapp-with-activiti-rest/
Just my alternative way, based on Abraham P's answer. It does not require installing express inside the 'api' folder, and it lets me separate the mock services into individual files. For example, my 'api' folder contains 3 files:
api\
  index.js    // assign all the "modules" and then simply require that.
  user.js     // all mocking for user
  product.js  // all mocking for product
file user.js
var user = function (req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/user') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'role': 'admin'
      })
    );
  } else {
    next();
  }
};

module.exports = user;
file product.js
var product = function (req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/product') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'name': 'test',
        'category': 'test'
      })
    );
  } else {
    next();
  }
};

module.exports = product;
index.js just assigns all the "modules" and we simply require that.
module.exports = {
  product: require('./product.js'),
  user: require('./user.js')
};
My Gruntfile.js file
connect: {
  options: {
    port: 9000,
    // Change this to '0.0.0.0' to access the server from outside.
    hostname: 'localhost',
    livereload: 35729
  },
  livereload: {
    options: {
      open: true,
      middleware: function (connect) {
        return [
          connect.static('.tmp'),
          connect().use(
            '/bower_components',
            connect.static('./bower_components')
          ),
          connect.static(appConfig.app),
          require('./api').user,
          require('./api').product
        ];
      }
    }
  }