How to render React on the server side with a REST API?

Just to make it clear, I'm using the MongoDB, Express, React and Node stack.
I'm trying to learn React right now. I've got the basics down and can code a simple React app with a router. I've also tried server-side rendering a simple React app, and it works perfectly. However, I'm stuck now that I want to build a full app with a REST API and server-side rendering.
1) I don't know how to separate the API and the React code in the server file. Would declaring the API routes first and then doing the server-side rendering work?
Like so:
app.get('/api/whatever', function (req, res) {
  // get whatever
});

app.get('*', function (req, res) {
  // match routes and renderToString the React app
});
2) Also, the reason I couldn't even test the above is that when I try to run the server with nodemon, it throws an error because it doesn't understand the React code. How should I go about this? Should I configure nodemon to handle ES6, have it ignore those files, or configure webpack to run the Express server?
3) The final question, which could settle this whole story quite easily: I've tried finding an answer but got many conflicting ones instead. Can Google's crawlers index a client-rendered React app? I'm learning server-side rendering for SEO; is it all really necessary?
Sorry for the long question, looking forward to reading your answers.

In the project I'm currently working on, I do it the same way as in your code example: I match * and then use React Router to render the different pages. I wrote a blog article about this, with code examples.
In my setup, I use webpack to compile the backend code, just as I do with the frontend code. I use webpack's watch mechanism to listen for code changes and automatically restart the Node server after recompiling, so there is no need for nodemon.
#!/usr/bin/env node
const path = require('path');
const webpack = require('webpack');
const spawn = require('child_process').spawn;

// the config has to be required with a path, not as a bare module name
const serverConfig = require('./webpack.config.server');

const compiler = webpack(serverConfig);

const watchConfig = {
  aggregateTimeout: 300,
  poll: 1000,
  ignored: '**/*.scss'
};

let serverControl;

compiler.watch(watchConfig, (err, stats) => {
  if (err) {
    console.error(err.stack || err);
    if (err.details) {
      console.error(err.details);
    }
    return;
  }

  const info = stats.toJson();

  if (stats.hasErrors()) {
    info.errors.forEach(message => console.log(message));
    return;
  }

  if (stats.hasWarnings()) {
    info.warnings.forEach(message => console.log(message));
  }

  // kill the previous server instance before starting the freshly built one
  if (serverControl) {
    serverControl.kill();
  }

  serverControl = spawn('node', [path.resolve(__dirname, '../../dist/polly-server.js')]);

  serverControl.stdout.on('data', data => console.log(`${new Date().toISOString()} [LOG] ${data}`));
  serverControl.stderr.on('data', data => console.error(`${new Date().toISOString()} [ERROR] ${data}`));
});
Yes, Google crawls client-side React code, but server-side rendering is still a good idea, because crawl results can be inconsistent, especially if you load parts of the page dynamically after Ajax calls.


Mocking REST calls in Svelte

Hi, I've been using Svelte for some weeks now and really enjoy it.
I was trying to set up unit tests according to https://testing-library.com/docs/svelte-testing-library/intro/ and that also went well. What that guide does not cover, however, is how I should mock my REST calls. I have tried the following without success:
jest-mock-fetch
jest-fetch-mock
jest-mock-promise
msw server (this does not respond with anything; maybe it only works for React applications?)
Has anyone successfully mocked REST calls in a Svelte app? If so, could you post a minimal fiddle showing which libs to use and what it looks like? Thank you.
Instead of mocking the request functions, you can mock at the network layer with the msw library. This has the added benefit of not being tied to fetch, so you could switch to axios if need be.
It would look something like this:
import {rest} from 'msw'
import {setupServer} from 'msw/node'

const server = setupServer(
  rest.get('/my-api', (req, res, ctx) => {
    return res(ctx.json({greeting: 'hello there'}))
  }),
)

beforeAll(() => server.listen())
afterEach(() => server.resetHandlers())
afterAll(() => server.close())

test('does what I want it to', async () => {
  // the actual test...
})
You can have a look at the React examples in the testing library. Although they are specifically for React, msw is used the same way when testing Svelte.

Working with URL parameters in custom Kibana plugin

I am working on a custom plugin to Kibana (7.5.2). The plugin is of type 'app'. I would like to be able to pass parameters to this plugin in order to pre-load some data from Elasticsearch. I.e., I need to provide users with some specific URLs containing parameters that will be used by the plugin to show only a relevant portion of data.
My problem is that I was not able to find sufficient documentation on this and I do not know what the correct approach should be. I will try to summarize what I know/have done so far:
I have read the official resources on plugin development
I am aware of the fact that _g and _a URL parameters are used to pass state in Kibana applications. However, a) I am not sure if this is the correct approach in my case and b) I also failed to find any information on how my plugin should access the data from these parameters.
I checked the sources of other known plugins, but again, failed to find any clues.
I am able to inject some configuration values using injectUiAppVars in the init method of my plugin (index.js) and retrieve these values in my app (main.js):
index.js:
export default function (kibana) {
  return new kibana.Plugin({
    require: ['elasticsearch'],
    name: ...,
    uiExports: {
      ...
    },
    ...
    init(server, options) { // eslint-disable-line no-unused-vars
      server.injectUiAppVars('logviewer', async () => {
        var kibana_vars = await server.getInjectedUiAppVars('kibana');
        var aggregated_vars = { ...kibana_vars, ...{ mycustomparameter: "some value" } };
        return aggregated_vars;
      });
      ...
    }
  });
}
main.js:
import chrome from 'ui/chrome';
. . .
const mycustomparameter = chrome.getInjected('mycustomparameter');
Provided that I manage to obtain the parameters from the URL, this would allow me to pass them to my app (via mycustomparameter), but again, I am not sure whether this approach is correct.
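For illustration, if the value were passed as a plain query-string parameter instead of the rison-encoded _g/_a state, reading it could be sketched like this (the parameter name filter and the URL shape are assumptions):

```javascript
// Sketch: Kibana app URLs usually keep the route in the hash fragment,
// e.g. http://host:5601/app/logviewer#/view?filter=error, so check the
// hash's query part first and fall back to the normal search string.
function getUrlParam(href, name) {
  const url = new URL(href);
  const hashQuery = url.hash.split('?')[1];
  if (hashQuery) {
    const fromHash = new URLSearchParams(hashQuery).get(name);
    if (fromHash !== null) return fromHash;
  }
  return url.searchParams.get(name);
}
```

Inside the plugin, the current address would come from window.location.href, and the app could then use the value to build its initial Elasticsearch query.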
I tried to get some help via the Elastic forum, but did not receive any answer yet.
My questions:
1. Is there any source of information on this particular topic? I am aware that the plugin API changes frequently, so I do not expect to find extensive documentation. Maybe a good example?
2. Am I completely off course with the way I am trying to achieve this?
Thanks for reading this, any help would be much appreciated!

How does the RestBase wiki handle caching?

Following the installation of RestBase using the standard config, I have a working version of the summary API.
The problem is that the caching mechanism seems strange to me.
The piece of code below decides whether to look at a table cache for a fast response. But I cannot make the server cache depend on a time constraint (for example, a max-age recorded when the cache entry is written). It means that the decision to use the cache or not depends entirely on clients.
Can someone explain the workflow of RestBase caching mechanism?
// Inside key.value.js
getRevision(hyper, req) {
    // This reads a header from the client request and decides whether to
    // use the cache depending on its value. Does it mean server-side
    // caching is non-existent?
    if (mwUtil.isNoCacheRequest(req)) {
        throw new HTTPError({ status: 404 });
    }
    // If the cache should be used, the code below runs
    const rp = req.params;
    const storeReq = {
        uri: new URI([rp.domain, 'sys', 'table', rp.bucket, '']),
        body: {
            table: rp.bucket,
            attributes: {
                key: rp.key
            },
            limit: 1
        }
    };
    return hyper.get(storeReq).then(returnRevision(req));
}
Cache invalidation is done by the change propagation service, which is triggered on page edits and similar events. Cache control headers are probably set in the Varnish VCL logic. See here for a full Wikimedia infrastructure diagram: it is outdated, but it gives you a general idea of how things are wired together.
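As an illustration of the time-constrained behaviour the question asks about (this is not RestBase's actual code), a server-side check could record a timestamp when the entry is written and compare it against a max-age before serving from the table:

```javascript
// Sketch: decide server-side whether a cached entry is still fresh, based
// on a hypothetical writtenAt timestamp (in ms) stored alongside the entry.
function isFresh(entry, maxAgeSeconds, now = Date.now()) {
  return (now - entry.writtenAt) / 1000 <= maxAgeSeconds;
}
```

A handler could then fall through to a re-render when the entry is stale, instead of leaving the decision entirely to the client's no-cache header.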

self.addEventListener('fetch', function(e) { }) is not working

I have a question about PWAs and would be glad if someone could help me with it. In my PWA I don't have any problem storing static files like HTML, JS, and CSS, but I'm facing issues with dynamic data: my self.addEventListener('fetch', function(e) { }) is not getting called, while the other events ('install' and 'activate') work fine.
To be more specific, I am using @angular/service-worker, which worked fine, but I created another service worker file called sw.js. In sw.js I'm listening to the 'install', 'activate' and 'fetch' events. My sw.js fetch handler is never called, whereas the other two work well; only the fetch handler in ngsw-worker.js gets called.
What I need is to make CRUD operations work in a PWA with Angular.
Thanks in advance!
You can do dynamic caching as below; the service worker will intercept every request and add the response to the cache.
self.addEventListener("fetch", function (event) {
  event.respondWith(
    caches.open("dynamiccache").then(function (cache) {
      return fetch(event.request).then(function (res) {
        // store a copy of the response, then hand the original back
        cache.put(event.request, res.clone());
        return res;
      });
    })
  );
});
Note: you can't cache POST requests; see "Can service workers cache POST requests?"
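A simple guard for that, sketched as a plain function (the Cache API rejects non-GET entries, so checking the method before calling cache.put avoids the error):

```javascript
// Sketch: only GET requests may be stored in the Cache API, so check the
// method before caching; `request` is anything with a `method` field,
// such as the FetchEvent's request.
function shouldCache(request) {
  return request.method === 'GET';
}
```

In the fetch handler above you would wrap the cache.put call in if (shouldCache(event.request)) { ... } and still return the network response either way.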

node.js and socket.io: different connections for different "sessions"

I've got a node.js application that 'streams' tweets to users. At the moment, it just searches Twitter for a hard-coded string, but I'd like to allow users to configure this in the URL (e.g. by visiting /?q=stackoverflow).
At the moment, my code looks a bit like this:
app.get('/', function (req, res) {
  // page rendering skipped

  io.sockets.on('connection', function (socket) {
    twit.stream('user', { track: 'stackoverflow' }, function (stream) {
      stream.on('data', function (data) {
        socket.volatile.emit('tweet', data);
      });
    });
  });
});
The question is, how do I make it so that each user can see a different stream of tweets simultaneously? At the moment it works fine in a single browser tab, but it falls over as soon as a second one is opened, and the error is fairly deep down inside socket.io. Am I misusing it?
I haven't fully got my head around socket.io yet, so that could be the issue.
Thanks in advance!
Every time a new request comes in, you register another connection callback with io.sockets.on. You should move that block of code outside app.get, after the initialization of the io object.
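A sketch of that corrected structure (names are assumptions; trackTermFrom is a hypothetical helper reading ?q=... from the socket handshake):

```javascript
// Hypothetical helper: pick the search term from the connection's query
// string, falling back to the original hard-coded term.
function trackTermFrom(query) {
  return (query && query.q) || 'stackoverflow';
}

// Registered once at startup, after `io` is initialized, not inside
// app.get, where every page view would add another 'connection' listener.
function wireSockets(io, twit) {
  io.sockets.on('connection', function (socket) {
    const term = trackTermFrom(socket.handshake.query);
    twit.stream('user', { track: term }, function (stream) {
      stream.on('data', function (data) {
        socket.volatile.emit('tweet', data);
      });
      // Stop streaming when this client goes away.
      socket.on('disconnect', function () {
        if (stream.destroy) stream.destroy();
      });
    });
  });
}
```

Each connected socket then gets its own Twitter stream, keyed on its own query parameter, instead of all tabs sharing listeners registered per request.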