Is there a way to save a ParseObject without making an HTTP request to the REST API? - mongodb

I didn't find much about this topic, so I wonder whether it is easy to achieve or actually not possible. My problem is that I see a lot of HTTP requests on my server even when a Cloud function is called only once, so I suppose that all the object updates / saves / queries are made through the REST API. I have so many HTTP requests that several hundred are timing out, presumably because of the huge traffic being generated.
Is there a way to save a ParseObject by executing the query directly against MongoDB? If that's not possible at the moment, can you give me some hints on whether there are already helper functions to convert a ParseQuery and a ParseObject to their MongoDB equivalents, so that I can use the MongoDB driver directly?
It's really important for my application to reduce HTTP request traffic at the moment.
Any idea? Thanks!
EDIT:
Here an example to reproduce the concept:
Make a cloud function:
Parse.Cloud.define('hello', async (req) => {
  let testClassObject = new Parse.Object('TestClass');
  await testClassObject.save(null, {useMasterKey: true});

  let query = new Parse.Query('TestClass');
  let testClassRecords = await query.find({useMasterKey: true});
  return testClassRecords;
});
Make a POST request:
POST http://localhost:1337/parse/functions/hello
Capture HTTP traffic on port 1337 using Wireshark:
You can see that for one POST request, two more are made because of the save / query code. My goal would be to avoid these two HTTP calls and instead hit the DB directly, so that less traffic goes through the whole webserver stack.
Link to the Github question: https://github.com/parse-community/parse-server/issues/6549

The Parse Server directAccess option should do the magic for you. Please make sure you are initializing Parse Server like this:
const api = new ParseServer({
  ...
  directAccess: true
});
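For reference, a fuller initialization might look like the sketch below; the databaseURI, appId, masterKey, and cloud values are placeholders, not taken from the question. With directAccess enabled, Parse SDK calls made from Cloud Code in the same process are routed to Parse Server directly instead of going back through the HTTP interface.
const ParseServer = require('parse-server').ParseServer;

const api = new ParseServer({
  databaseURI: 'mongodb://localhost:27017/dev', // placeholder connection string
  appId: 'myAppId',                             // placeholder
  masterKey: 'myMasterKey',                     // placeholder
  serverURL: 'http://localhost:1337/parse',
  cloud: './cloud/main.js',                     // placeholder path to Cloud Code
  directAccess: true // route in-process SDK calls directly, not over HTTP
});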

Related

How to batch requests to the same URL without causing memory leaks

I have a system that processes images. Essentially, I provide an ID to it, and it fetches a source image, and then it begins performing transformations on it to resize and reformat it.
This system gets quite a bit of usage, and one of the things that I've noticed is that I tend to get many requests for the same ID simultaneously, but in different requests to the webserver.
What I'd like to do is "batch" these requests. For example, if there are 5 simultaneous requests for the image "user-upload.png", I'd like there to be only one HTTP request to fetch the source image.
I'm using NestJS with default scopes for my service, so the service is shared between requests. Requests to fetch the image are done with the HttpModule, which is using axios internally.
I only care about simultaneous requests. Once the request finishes, it will be cached, and that prevents new requests from hitting the HTTP url.
I've thought about doing something like this (Pseudocode):
@Injectable()
class ImageFetcher {
  // Store in-flight requests as a map between id and promise
  inFlightRequests = {}

  fetchImage(id: string) {
    if (this.inFlightRequests[id]) {
      return this.inFlightRequests[id]
    }
    this.inFlightRequests[id] = new Promise(async (resolve, reject) => {
      const { data } = await this.httpService.get('/images/' + id)
      // error handling omitted here
      resolve(data)
      delete this.inFlightRequests[id]
    })
    return this.inFlightRequests[id]
  }
}
The most obvious issue I see is the potential for a memory leak. This is solvable with more custom code, but I thought I'd see if anyone has suggestions for doing this without writing more code.
In particular, I've also thought about using an axios interceptor, but I'm not entirely sure how to handle that properly. Any pointers here would be really appreciated.
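One low-code way to keep the map from leaking, assuming HttpService comes from @nestjs/axios (it lived in @nestjs/common in older Nest versions), is to share a single promise per id and always remove the entry in a finally block, whether the request succeeds or fails. A minimal sketch, keeping the '/images/' path from the question:
import { Injectable } from '@nestjs/common';
import { HttpService } from '@nestjs/axios';
import { firstValueFrom } from 'rxjs';

@Injectable()
export class ImageFetcher {
  // One shared promise per image id; the entry is removed as soon as the request settles
  private readonly inFlightRequests = new Map<string, Promise<unknown>>();

  constructor(private readonly httpService: HttpService) {}

  fetchImage(id: string): Promise<unknown> {
    const existing = this.inFlightRequests.get(id);
    if (existing) {
      return existing;
    }
    const request = firstValueFrom(this.httpService.get('/images/' + id))
      .then(response => response.data)
      .finally(() => this.inFlightRequests.delete(id)); // cleanup on success and on error
    this.inFlightRequests.set(id, request);
    return request;
  }
}
Callers that arrive while a request is in flight all receive the same promise, so only one HTTP call is made per id at a time.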

ASP.NET Core 5 route redirection

We have an ASP.NET Core 5 REST API where we have used a pretty simple route:
[Route("api/[controller]")]
The backend is multi-tenant, but tenant-selection has been handled by user credentials.
Now we wish to add the tenant to the path:
[Route("api/{tenant}/{subtenant}/[controller]")]
This makes cross-tenant queries simpler for tools like Excel / PowerQuery, which unfortunately tend to store credentials per URL.
The problem is redirecting all existing calls from the old route to the new one. We can assume that the missing pieces are available in the credentials (the user-id is of the form 'tenant/subtenant/username').
I had hoped to simply intercept the route parsing and fill in the tenant/subtenant route values, but have had no luck so far.
The closest thing so far is to have two Route attributes, but that unfortunately messes up our Swagger documentation; every method appears both with and without the tenant path.
If you want to transparently change the incoming path on a request, you can add a middleware to set Path to a new value, for example:
app.Use(async (context, next) =>
{
    var newPath = ...; // Logic to determine new path

    // Rewrite and continue processing
    context.Request.Path = newPath;
    await next();
});
This should be placed in the pipeline after you can determine the tenant and before the routing happens.

Project Reactor and Server Side Events

I'm looking for a solution that has the backend publish an event to the frontend as soon as a modification is made on the server side. To be more precise, I want to emit a new List of objects as soon as one item is modified.
I've tried implementing this in a Spring Boot project that uses reactive web and MongoDB, which has a @Tailable cursor that publishes an event as soon as the capped collection is modified. The problem is that a capped collection has some limitations and is not really compatible with what I want to do: I cannot update an existing element if the new one has a different size (as I understand it, this is illegal because you cannot make a rollback).
I honestly don't even know if it's doable, but maybe I'm lucky and I'll run into a rocket scientist right here that will prove otherwise.
Thanks in advance!!
*** EDIT:
Sorry for the vague question. Yes I'm more focused on the HOW, using the Spring Reactive framework.
When I had a similar need - to inform the frontend that something is done on the backend side - I used a message queue.
I published a message to the queue from the backend and the frontend consumed it.
But I am not sure if that is what you're looking for.
If you are using WebFlux with Spring Reactor, I think you can simply have the client send an Accept header of 'text/event-stream' or 'application/stream+json', and have an API that produces those content types. This gives you an SSE model without too much effort.
@GetMapping(value = "/stream", produces = {MediaType.TEXT_EVENT_STREAM_VALUE, MediaType.APPLICATION_STREAM_JSON_VALUE, MediaType.APPLICATION_JSON_UTF8_VALUE})
public Flux<Message> get() {
Just as an idea - maybe you need to use a web socket technology here:
The frontend side (I assume it's a client-side application that runs in a browser, written in React, Angular or something like that) can establish a WebSocket connection with the backend server.
When the process on backend finishes, the message from backend to frontend can be sent.
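As a rough illustration of the browser side, assuming a hypothetical WebSocket endpoint at /ws/notifications on the backend, the client could listen for the completion message like this:
// Browser-side sketch: subscribe to backend notifications over a WebSocket
const socket = new WebSocket('ws://localhost:8080/ws/notifications'); // hypothetical endpoint

socket.onmessage = (event) => {
  const updatedList = JSON.parse(event.data); // assumes the backend sends the refreshed list as JSON
  renderList(updatedList);                    // renderList is a placeholder for your UI update
};

socket.onerror = (err) => console.error('WebSocket error', err);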
You can emit changes by hand. For example, an endpoint:
public final Sinks.Many<SimpleInfoEvent> infoEventSink = Sinks.many().multicast().onBackpressureBuffer();
private final AtomicLong counter = new AtomicLong(); // assigns incrementing SSE ids

@RequestMapping(path = "/sseApproach", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<ServerSentEvent<SimpleInfoEvent>> sse() {
    return infoEventSink.asFlux()
        .map(e -> ServerSentEvent.builder(e)
            .id(counter.incrementAndGet() + "")
            .event(e.getClass().getName())
            .build());
}
Then emit data from anywhere in your code:
infoEventSink.tryEmitNext(new SimpleInfoEvent("any custom event"));
Watch out for threading and things like subscribeOn and publishOn, but basically (when not using any third-party code) this should work well enough.

RESTful implementation and general information

I have been reading a lot lately, and experimenting even more with web development. There are some things that I simply can't understand, therefore any help is appreciated.
I am not trying to get my homework done for me. I have some holes in my knowledge, that I desire to fill. Please, help me out with your views :)
REST questions:
Reading the documentation, this is a perfectly understandable (Node.js / Express) example:
EXAMPLE ONE (get):
app.get('/', function(req, res) {
  res.send('please select a collection, e.g., /collections/messages')
})
My explanation: When the root of the server is hit, send the following message.
EXAMPLE TWO (get):
app.get('/collections/:collectionName/:id', function(req, res, next) {
  req.collection.findOne({_id: req.collection.id(req.params.id)},
    function(e, result) {
      if (e) return next(e)
      res.send(result)
    })
})
My explanation: When the URL is hit, take the id from the URL (located in req.params.id) and search based on it (in MongoDB).
EXAMPLE THREE (post):
app.post('/collections/:collectionName', function(req, res, next) {
  req.collection.insert(req.body, {}, function(e, results) {
    if (e) return next(e)
    res.send(results)
  })
})
My explanation: When the URL is hit, take the payload (JSON in this case) that is located in req.body and insert it as a new document.
Questions:
Are examples one and two both RESTful?
I am now totally confused by params.id. I do understand that POST data is transmitted in req.body... what is params.id? Does it contain URL variables, such as :id?
My explanations... are they correct?
Is example three also REST, regardless of the fact that POST is used?
Example three, '/collections/:collectionName': why is ':collectionName' passed in the URL? I could have placed it in req.body as a parameter (along with the new data) and taken it from there. What is the benefit of doing it this way?
Thank you
An API must use HATEOAS to be RESTful. In the first example, if / is the entry point of your API, the response should contain links to the available collections, not a human-readable string like that. That's definitely not RESTful.
Exactly.
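To make that concrete, a small sketch (route and values made up) showing what Express puts in req.params versus req.body:
app.get('/collections/:collectionName/:id', function(req, res) {
  // For a request to /collections/messages/507f191e810c19729de860ea:
  // req.params = { collectionName: 'messages', id: '507f191e810c19729de860ea' }
  // req.body holds the parsed request payload (for POST/PUT), not URL variables
  res.send(req.params)
})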
They're OK, except that there's nothing in the third example implying it's a JSON body. It should check for a Content-Type header sent by the client.
REST isn't dependent on HTTP. As long as you're using the HTTP methods as they were standardized, it's fine. POST is the method to use for any action that isn't standardized, so it's fine to use POST for anything, if there isn't a method specific for that. For instance, it's not correct to use POST for retrieval, but it's fine to use it for creating a new resource if you don't have the full representation.
POST means the data in the body is subordinate to the resource at the target URI. If collectionName were in the POST body, that would mean you were POSTing to /collections, which would make more sense for creating a new collection, not a new item within a collection.
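To illustrate the first point, a hypothetical entry-point handler in the question's Express style could return links to the available collections instead of a human-readable string (the collection names are made up):
app.get('/', function(req, res) {
  // HATEOAS-style entry point: machine-readable links, not prose
  res.json({
    _links: {
      self: { href: '/' },
      messages: { href: '/collections/messages' },
      users: { href: '/collections/users' }
    }
  })
})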

node.js: expressjs with mongoose

I'm working on my first node.js / express / mongoose app and I'm facing a problem due to the asynchronous mechanism of node.js. It seems I'm not doing things correctly...
Here is the test route I defined using express:
app.get('/test', function(req, res){
  var mod = mongoose.model('MyModel');
  mod.find({}, function(err, records){
    records.forEach(function(record){
      console.log('Record found:' + record.id);
      // res.send('Thing retrieved:' + record.id);
    });
  });
});
When I issue a http://localhost/test, I'd like to get the list of records of type 'MyModel' in the response.
The code above works fine, but when it comes to returning the whole list to the client it does not work (the commented res.send line) and only returns the first record.
I'm very new to node.js, so I do not know whether it's a good solution to embed several callback functions within the first callback function of app.get. How can I have the whole list returned?
Any idea ?
What you should be doing is:
mod.find({}, function(err, records){
  res.writeHead(200, {'Content-Type': 'text/plain'});
  records.forEach(function(record){
    res.write('Thing retrieved:' + record.id);
  });
});
Please always check the documentation as well:
http://nodejs.org/docs/v0.3.8/api/http.html#response.write
I missed that you were using Express; the send function is part of Express and extends node's ServerResponse object (my bad).
But my answer still applies: Express's send function sends the data using ServerResponse.end(), so the socket gets closed and you cannot send any more data, whereas the write function uses the native function.
You may also want to call res.end() when the response is fully complete, as some things within Express may otherwise be affected.
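Since the question uses Express, another option, sketched here with the same mod model from the question, is to send the whole list in a single response instead of writing record by record:
app.get('/test', function(req, res, next) {
  var mod = mongoose.model('MyModel');
  mod.find({}, function(err, records) {
    if (err) return next(err);
    // Send the whole array once; Express serializes it to JSON and ends the response
    res.send(records);
  });
});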