I'm practicing with a simple CRUD REST API in Go. I'm finally done with some basic boilerplate (I might switch to GORM), but I noticed that mux is not properly routing my POST method handler. If I POST to /inventory, I instead receive all the items in my DB.
func RegisterItemsRouter(router *mux.Router) {
	itemsRouter := router.PathPrefix("/inventory").Subrouter()
	itemsRouter.HandleFunc("/", addItem).Methods(http.MethodPost)
	itemsRouter.HandleFunc("/", getItems).Methods(http.MethodGet)
	itemsRouter.HandleFunc("/{id:[0-9]+}", getItemById).Methods(http.MethodGet)
	itemsRouter.HandleFunc("/{id}", deleteItem).Methods(http.MethodDelete)
	itemsRouter.HandleFunc("/{id}", updateItem).Methods(http.MethodPut)
}
Here's how it looks in Postman:
I'm looking for a solution that has the backend publish an event to the frontend as soon as a modification is made on the server side. To be more precise, I want to emit a new List of objects as soon as one item is modified.
I've tried implementing this in a Spring Boot project that uses Reactive Web and MongoDB, which offers a @Tailable cursor that publishes an event as soon as the capped collection is modified. The problem is that capped collections have some limitations and aren't really compatible with what I want to do: I cannot update an existing element if the new one has a different size (as I understand it, this is not allowed because the update cannot be rolled back).
I honestly don't even know if it's doable, but maybe I'm lucky and I'll run into a rocket scientist right here that will prove otherwise.
Thanks in advance!!
*** EDIT:
Sorry for the vague question. Yes, I'm more focused on the HOW, using the Spring Reactive framework.
When I had a similar need - to inform the frontend that something was done on the backend side - I used a message queue.
The backend published a message to the queue and the frontend consumed it.
But I am not sure if that is what you're looking for.
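To make that idea a bit more concrete, here is a minimal sketch assuming Spring's STOMP/WebSocket messaging support (spring-boot-starter-websocket); ItemService, ItemRepository, Item, and the /topic/items destination are hypothetical names, since the suggestion above doesn't name a specific broker or library:

import org.springframework.messaging.simp.SimpMessagingTemplate;
import org.springframework.stereotype.Service;

@Service
public class ItemService {

    private final SimpMessagingTemplate messagingTemplate;
    private final ItemRepository itemRepository; // hypothetical repository

    public ItemService(SimpMessagingTemplate messagingTemplate, ItemRepository itemRepository) {
        this.messagingTemplate = messagingTemplate;
        this.itemRepository = itemRepository;
    }

    public Item update(Item item) {
        Item saved = itemRepository.save(item);
        // Push the refreshed list to a topic the frontend subscribes to
        // (e.g. via SockJS/STOMP in the browser).
        messagingTemplate.convertAndSend("/topic/items", itemRepository.findAll());
        return saved;
    }
}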
If you are using WebFlux with Spring Reactor, I think you can simply have the client send a request that accepts 'text/event-stream' or 'application/stream+json', and have an API that can produce those content types. This gives you an SSE model without too much effort.
@GetMapping(value = "/stream", produces = {MediaType.TEXT_EVENT_STREAM_VALUE, MediaType.APPLICATION_STREAM_JSON_VALUE, MediaType.APPLICATION_JSON_UTF8_VALUE})
public Flux<Message> get() {
    return messageService.stream(); // hypothetical service returning a Flux<Message>
}
Just as an idea - maybe you need to use WebSocket technology here:
The frontend side (I assume it's a client-side application that runs in a browser, written in React, Angular or something like that) can establish a WebSocket connection with the backend server.
When the process on the backend finishes, a message can be sent from the backend to the frontend.
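As a rough illustration of that idea, here is a minimal reactive WebSocket sketch for Spring WebFlux; the /ws/items path and the String payload type are assumptions, not part of the suggestion above:

import java.util.Collections;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.reactive.HandlerMapping;
import org.springframework.web.reactive.handler.SimpleUrlHandlerMapping;
import org.springframework.web.reactive.socket.WebSocketHandler;
import org.springframework.web.reactive.socket.server.support.WebSocketHandlerAdapter;
import reactor.core.publisher.Sinks;

@Configuration
public class WebSocketConfig {

    // Backend code pushes updated payloads (e.g. a serialized item list) into this sink.
    private final Sinks.Many<String> updates = Sinks.many().multicast().onBackpressureBuffer();

    @Bean
    public HandlerMapping webSocketMapping() {
        // Forward every emitted payload to each connected client as a text frame.
        WebSocketHandler handler = session ->
                session.send(updates.asFlux().map(session::textMessage));

        SimpleUrlHandlerMapping mapping = new SimpleUrlHandlerMapping();
        mapping.setUrlMap(Collections.singletonMap("/ws/items", handler));
        mapping.setOrder(-1); // before annotated controllers
        return mapping;
    }

    @Bean
    public WebSocketHandlerAdapter handlerAdapter() {
        return new WebSocketHandlerAdapter();
    }
}

The browser then opens ws://host/ws/items and receives a frame every time the backend emits into the sink.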
You can emit changes by hand. For example:
endpoint:
public final Sinks.Many<SimpleInfoEvent> infoEventSink = Sinks.many().multicast().onBackpressureBuffer();
private final AtomicLong counter = new AtomicLong(); // running id for the SSE events

@RequestMapping(path = "/sseApproach", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<ServerSentEvent<SimpleInfoEvent>> sse() {
    return infoEventSink.asFlux()
            .map(e -> ServerSentEvent.builder(e)
                    .id(counter.incrementAndGet() + "")
                    .event(e.getClass().getName())
                    .build());
}
Then, anywhere in your code, emit data:
infoEventSink.tryEmitNext(new SimpleInfoEvent("any custom event"));
Watch out for threads and things like subscribeOn and publishOn, but basically (when not using any third-party code) this should work well enough.
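For the original question ("emit a new list as soon as one item is modified"), a usage sketch might look like the fragment below, living in the same class as the sink above; ItemRepository, Item, and the SimpleInfoEvent constructor used here are assumptions:

// Hypothetical modification endpoint: after saving the change, emit an event so
// every SSE subscriber of /sseApproach gets notified.
@PostMapping("/items")
public Mono<Item> addItem(@RequestBody Item item) {
    return itemRepository.save(item)
            .doOnNext(saved -> infoEventSink.tryEmitNext(
                    new SimpleInfoEvent("item " + saved.getId() + " modified")));
}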
I am working my way through using the IBM Cloud Bluemix environment with their Kitura flavor of server-side Swift.
Of course, key to this is the ability to handle all sorts of HTTP requests.
So far I have been able to handle GET requests and POST requests with pure JSON body.
I am stuck when it comes to form-data or application/x-www-form-urlencoded.
From what I read, it appears that I should be using the Kitura-provided BodyParser class, but I'm afraid I am not even sure how to actually use it in code.
I have mostly used the following very useful posts to make my way so far.
From Rob Allen, Horea Porutiu, and Kevin Hoyt.
As far as I understand it now, I will need to use the BodyParser and Router classes from Kitura, but it seems to me that those are already taken care of in the IBM Cloud Functions implementation of OpenWhisk + Kitura Swift... so I am not too sure now...
Any ideas or pointers, anyone?
Thanks
You can use request.readString() to read the body information in its raw format.
If you have the BodyParser middleware in play using:
router.all("/name", middleware: BodyParser())
Then you can use this for urlencoded bodies:
router.post("/name") { request, response, next in
    guard let parsedBody = request.body else {
        next()
        return
    }
    switch parsedBody {
    case .urlEncoded(let data):
        // data is a [String: String] dictionary of the form fields
        let name = data["name"] ?? ""
        try response.send("Hello \(name)").end()
    default:
        break
    }
    next()
}
Where data is a [String:String] dictionary.
OK, I answered my own question with the further understanding that Kitura and Kitura-Net are two different things. The ClientRequest class in Kitura-Net handles all this.
All here
The Vert.x implementation of calling / invoking / consuming REST APIs through the requestAbs method of the io.vertx.core.http.HttpClient class from vertx-core-3.2.0.jar results in HTTP error 302, with the response data being an HTML error response.
I'm not sure how the requestAbs method behaves, as there's no exception thrown and it does not write any logs either.
The source code for this method is also attached with the Vert.x jars. I suspect the method implementation may have a bug?
The same REST API calls succeed from the browser / Postman.
The traditional approach with Apache HttpClient for REST calls also succeeds, so I wonder why it doesn't work with the Vert.x framework.
Any solution / modification in code snippet below is much appreciated.
Thanks
Code
Your code is a bit confusing (it looks like the variable names are not always the same).
Anyway, you should manage to do what you want with this code:
final HttpClient httpClient = vertx.createHttpClient();
final String url = "http://services.groupkt.com/country/get/iso2code/IN";
httpClient.getAbs(url, response -> {
    if (response.statusCode() != 200) {
        System.err.println("fail");
    } else {
        response.bodyHandler(b -> System.out.println(b.toString()));
    }
}).end();
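One extra thing worth checking, beyond the code above: a 302 status is a redirect. Browsers, Postman and Apache HttpClient follow redirects automatically, while older Vert.x clients such as 3.2 do not (as far as I know), which may explain the difference you see. A minimal sketch of following the Location header by hand, assuming it contains an absolute URL:

httpClient.getAbs(url, response -> {
    if (response.statusCode() == 302) {
        // Follow the redirect manually; assumes Location holds an absolute URL.
        String location = response.headers().get("Location");
        httpClient.getAbs(location, redirected ->
                redirected.bodyHandler(b -> System.out.println(b.toString()))
        ).end();
    } else {
        response.bodyHandler(b -> System.out.println(b.toString()));
    }
}).end();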
Hope this will help.
I have an undetermined number of resources that need to be fetched from a server. I tried to accomplish this by using the repeatElement() and concat() operators, like this:
repeatElement(0, CurrentThreadScheduler.instance).map({ _ -> Observable<[Task]> in
    // API.getTasks() uses Alamofire to request data
    return API.getTasks(loggedUser, after: loggedUser.taskPullTime)
}).concat().takeWhile({ (tasks) -> Bool in
    return tasks.count > 0
})
Unfortunately, repeatElement will just emit an item without waiting for the previous one to be handled. I think the reason is that Alamofire executes on a private serial queue.
However, I cannot figure out how to solve this problem.
I used a strategy inspired by the one here in my Android project. Everything works fine there because Retrofit initiates the HTTP request in a synchronous manner.
I'm having some issues with the newest version of sails.js (0.11.0). It's stated on GitHub that plain socket.io code will be accepted and run in sails.js; however, I am simply trying to emit a message from a client when they click on something, like so:
$('#myBtn').on('click', function(){
    io.socket.emit('message', {
        message: {
            subject: subject
        },
        sender: id
    });
});
I end up getting an "Uncaught TypeError: undefined is not a function" on the io.socket.emit() line, i.e. emit is not a function of io.socket.
Here are some references that I have looked at:
https://github.com/balderdashy/sails/issues/2397
http://www.tysoncadenhead.com/blog/getting-started-with-socket-io#.VQCFjvnF9tU
I have a feeling that with the updated version of Sails, instead of emitting a message I should be doing something along the lines of:
io.socket.post('/user/message', data, function(data, jwres) {
});
Something concerns me with the following answer here:
Sending session specific messages with socket.io and sails.js
It states "class rooms" are being deprecated along with publishCreate, publishDestroy, introduce, and obituary.
So do I follow a Pub/Sub paradigm, re-write my more "socket-io-ish" code to utilize sails Blueprints & Pub/Sub, or continue in my socket-io fashion?
Is there another way of emitting a message from client using sails?
You are correct in that the recommended way of communicating with the server via sockets is to use the RESTful socket client methods. The benefit is that you can use the regular Sails routing and controller/action architecture for socket communication instead of having to support a whole other layer of subscription and event-handling on the backend. This is one of the main reasons why @mikermcneil originally created Sails. Two things to note:
You can use req.isSocket in your controller action to determine whether the request is coming from a socket, and
You can get the raw, underlying socket.io instance on the client with io.socket._raw, which will have the emit method. But again, this is not the recommended practice.