Grpc-gateway makes strange wrapping of com.google.protobuf.wrappers.StringValue result when streaming

I have a gRPC service with the following function:
rpc StreamMessages(StreamMessagesRequest) returns (stream google.protobuf.StringValue) {
  option (google.api.http) = {
    post: "/messages:stream"
    body: "*"
  };
}
With grpc-gateway behind it.
I have a collection of 3 strings: "msg1", "msg2", "msg3". I wrap each one as a com.google.protobuf.wrappers.StringValue and return them as a stream.
On the gRPC side everything is fine, but when I execute a REST request via the gateway, this issue happens:
According to the documentation, the JSON representation of google.protobuf.StringValue is just a JSON string, so the expected streaming result is:
"msg1"
"msg2"
"msg3"
But it returns an unexpected format instead:
{"result":"msg1"}
{"result":"msg2"}
{"result":"msg3"}
Question: How can I make the gateway return the expected format?

To obtain what you're looking for, you need to use the protobuf-specific package for the JSON marshaling. Instead of the standard library encoder, use google.golang.org/protobuf/encoding/protojson.
The interface is the same, but it correctly marshals a StringValue to a plain string when an actual value is provided, and the field is omitted if the StringValue pointer is nil.
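Here's a minimal sketch of wiring that into the gateway, assuming grpc-gateway v2; the generated handler registration and service name are placeholders for your own code. The runtime.JSONPb marshaler is backed by protojson, so registering it on the mux makes well-known wrapper types render according to the proto3 JSON mapping:

package main

import (
	"net/http"

	"github.com/grpc-ecosystem/grpc-gateway/v2/runtime"
	"google.golang.org/protobuf/encoding/protojson"
)

func main() {
	// Use a protojson-backed marshaler for every MIME type so that
	// google.protobuf.StringValue is serialized as a plain JSON string.
	mux := runtime.NewServeMux(
		runtime.WithMarshalerOption(runtime.MIMEWildcard, &runtime.JSONPb{
			MarshalOptions:   protojson.MarshalOptions{UseProtoNames: true},
			UnmarshalOptions: protojson.UnmarshalOptions{DiscardUnknown: true},
		}),
	)

	// RegisterMessageServiceHandlerFromEndpoint(ctx, mux, grpcAddr, opts)
	// would be called here with your generated gateway code (placeholder name).

	http.ListenAndServe(":8080", mux)
}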

Related

Access nested body property from HTTP resolver (AppSync)

I'm new to AWS AppSync and I am trying to access a certain body property (from the HTTP response) in my resolver's response mapping template.
For example, I am able to present the response as-is via $util.toJson($ctx.result.body), but when I try to get some of the nested body properties it fails.
For example, imagine the body looks like this:
{
  "about": {
    "firstName": "Chuck",
    "lastName": "Norris"
  }
}
and $util.toJson($ctx.result.body.about) returns null. Any thoughts?
I found a way to extract the parsed body:
#set ($parsed_body = $util.parseJson($ctx.result.body))
And then I am able to access the properties via dot notation:
$parsed_body.about.firstName
The part I was missing was $util.parseJson(<json-string>); it seems that the body is a JSON string.
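Putting it together, a sketch of a response mapping template along these lines (returning only the nested "about" object is just an illustration):

## Parse the raw body string, then return just the nested "about" object
#set ($parsed_body = $util.parseJson($ctx.result.body))
$util.toJson($parsed_body.about)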

How to send a list as parameter in databricks notebook task?

I am using the Databricks REST API to create a job with a notebook_task on an existing cluster, and I get the job_id in return.
Then I am calling the run-now API to trigger the job.
In this step, I want to send a list as an argument via notebook_params, which throws an error saying "Expected non-array for field value".
Is there any way I can send a list as an argument to the job?
I have tried sending the list argument in base_params as well, with the same error.
user_json = {
    "name": job_name,
    "existing_cluster_id": cluster_id,
    "notebook_task": {
        "notebook_path": notebook_path
    },
    "email_notifications": {
        "on_failure": [email_id]
    },
    "max_retries": 0,
    "timeout_seconds": 3600
}
response = requests.post('https://<databricks_uri>/2.0/jobs/create', headers=head, json=user_json, timeout=5, verify=False)
job_id = response.json()['job_id']
json_job = {"job_id": job_id, "notebook_params": {"name": "john doe", "my_list": my_list}}
response = requests.post('https://<databricks_uri>/2.0/jobs/run-now', headers=head, json=json_job, timeout=200, verify=False)
I haven't found a native solution yet, but my workaround was to pass the list as a string and parse it back out on the other side:
json_job={"job_id":job_id,
"notebook_params":{
"name":"john doe",
"my_list":"spam,eggs"
}
}
Then in Databricks:
my_list = dbutils.widgets.get("my_list")
my_list = my_list.split(",")
Take appropriate care with special characters and with, e.g., conversion back to numeric types.
If the objects in the list are more substantial, then sending them as a file to DBFS using the CLI or API before running the job may be another option to explore.
I may be a bit late, but I found a better solution.
Step 1: Use JSON.stringify() in the console of any browser to convert your value (object, array, JSON, etc.) into a string.
Step 2: Use that string as the parameter value in the body of the run-now request.
Step 3: In the Databricks notebook, convert the string back to JSON using the Python json module, as sketched below.
Hope this helps.
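A rough Python sketch of that idea, reusing the databricks_uri, head and job_id placeholders from the question; json.dumps produces the string on the caller side:

import json
import requests

my_list = ["spam", "eggs", 42]
json_job = {
    "job_id": job_id,
    "notebook_params": {
        "name": "john doe",
        # json.dumps turns the list into a plain string, which the API accepts
        "my_list": json.dumps(my_list)
    }
}
response = requests.post('https://<databricks_uri>/2.0/jobs/run-now', headers=head, json=json_job, timeout=200)

And inside the notebook, json.loads turns it back into a real Python list:

import json
my_list = json.loads(dbutils.widgets.get("my_list"))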

Capture JSON response value and HTTP status from a cpprestsdk pplx task into local variables

I want to write a generic function in C++ that gets JSON data using cpprestsdk and copies out the HTTP status code and the JSON data. The calling method will use the json_resp and http_status values. Later on, I want to make this function more generic by passing in the URL so it can get data from different web services. Please let me know how I can accomplish this.
pplx::task<void> handleWebServerRequest(web::json::value& json_resp, int* http_status)
{
    // ...
    http_client client(L"http://weburl.com:8000/getjsondata");
    return client.request(methods::GET).then([http_status](http_response response) -> pplx::task<json::value> {
        // Store the HTTP status code to be returned to the calling function
        *http_status = response.status_code();
        // ...
        if (response.status_code() == status_codes::OK) {
            return response.extract_json();
        }
        return pplx::task_from_result(json::value());
    }).then([&json_resp](pplx::task<json::value> previousTask) {
        try {
            // Capture the JSON response into json_resp
            json_resp = previousTask.get();
        }
        catch (const http_exception& e) {
            // print error
        }
    });
}
In my research I have found that the only difference between using the cpprest API to consume a PHP web service and a WCF web service is the function parameter. When consuming a PHP web service you can set the function parameter to an empty string, whereas when consuming a WCF service you need to pass it a function parameter, because the protocol for receiving requests and issuing responses in a WCF service is very different.

The process of sending requests and receiving responses is asynchronous, so there will always be at least three modules, functions or tasks involved: one to make the request, one to wait for and receive the response, and another to parse the data, called asynchronously by the function that receives the response. I suppose you could put all three tasks into one function and use goto statements to execute each task, perhaps use some inline assembly to capture the response, and use pointers in place of parameters, but it is still three tasks any way you slice it. The two others run in a thread and do not have access to the application data, but the last function, the one that parses the data (the JSON object) and is called asynchronously, you can make generic.

I don't know which web services you want to consume, but I posted two samples on GitHub: Example of Casablanca (cpprestsdk 2.9.1) consuming a PHP web service, and Example of Casablanca (cpprestsdk 2.9.1) consuming a WCF (.NET) web service. I believe these should get you off to a good start.

To capture the JSON values you can convert them to std::strings (as shown below) and store them in a local hashmap by adding a hashmap pointer argument to all three functions and passing a reference to the local hashmap variable from whichever function you are calling them from; from there they can be converted to whatever data type you need.
void get_field_map_json(json::value& jvalue, unordered_map<string, string>* hashmap)
{
    if (!jvalue.is_null())
    {
        for (auto const& e : jvalue.as_object())
        {
            std::string key(conversions::to_utf8string(e.first));
            std::string value(conversions::to_utf8string(e.second.as_string()));
            (*hashmap)[key] = value;
        }
    }
}
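For completeness, here is a rough sketch (the URL and surrounding setup are assumptions, not taken from the samples) of how the three steps can hand the status code and parsed fields back to the caller through pointer arguments, using get_field_map_json above:

#include <cpprest/http_client.h>
#include <cpprest/json.h>
#include <string>
#include <unordered_map>

using namespace web;
using namespace web::http;
using namespace web::http::client;

pplx::task<void> fetch_into_map(const utility::string_t& url,
                                std::unordered_map<std::string, std::string>* hashmap,
                                int* http_status)
{
    http_client client(url);
    return client.request(methods::GET)
        .then([http_status](http_response response) {
            // Task 2: receive the response and remember the status code
            *http_status = response.status_code();
            return response.extract_json();
        })
        .then([hashmap](pplx::task<json::value> previous) {
            try {
                // Task 3: parse the JSON object into the caller's hashmap
                json::value body = previous.get();
                get_field_map_json(body, hashmap);
            } catch (const http_exception&) {
                // handle or log the error as appropriate
            }
        });
}

// Usage (block on the task only at the call site):
// std::unordered_map<std::string, std::string> fields;
// int status = 0;
// fetch_into_map(U("http://weburl.com:8000/getjsondata"), &fields, &status).wait();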

Using Spring Cloud Contract Groovy DSL, how can I parameterize the response to include values from the request?

I am using Spring Cloud Contract to create stubs for a REST service so I can test with a REST client. I have the stub runner working within a Spring Boot application, and it all works as expected. The problem I am having is that I'd like to see elements of the requests in the responses, to better simulate the eventual behavior of the REST service. For example, in this contract, I'd like whatever is passed in the "code" field of the request to be echoed back in the response:
package contracts

org.springframework.cloud.contract.spec.Contract.make {
    request {
        method('POST')
        url $("/resource")
        body([
            code: $(client(regex('[a-zA-Z0-9]{5,32}')))
        ])
    }
    response {
        status 200
        body([
            code: ???
        ])
    }
}
Obviously the input "code" can be anything that matches the regular expression, so the actual value is unknown until runtime. Is there anything I can put in place of "???" to return the code submitted in the request? I tried accessing, for example:
request.body.serverValue['code']
but it seems that value is generated at compile time, perhaps to enable the auto-generation of tests in ContractVerifierTest.java under generated-test-sources.
Can this be done? Is this an appropriate use of Spring Cloud Contract?
Currently, it's not supported. We prefer an approach where you have simpler contracts. If you need a value from the request in the response, just hard-code both the request and the response parts of the contract.
You can, however, file an issue and we can try to think of something for a future release.
UPDATE:
With version 1.1.0 that's already possible. Check out the docs for more info - http://cloud.spring.io/spring-cloud-static/spring-cloud-contract/1.1.0.RELEASE/#_referencing_request_from_response
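Roughly, with 1.1.0+ the response body can reference the request via fromRequest() (a sketch based on the linked docs, not verified against your setup):

package contracts

org.springframework.cloud.contract.spec.Contract.make {
    request {
        method('POST')
        url $("/resource")
        body([
            code: $(client(regex('[a-zA-Z0-9]{5,32}')))
        ])
    }
    response {
        status 200
        body([
            // echoes back whatever "code" the consumer sent in the request body
            code: fromRequest().body('$.code')
        ])
    }
}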

Restangular sends empty payload for keys starting with "$"

I'm using Restangular to connect with Mongolab. I'd like to use the following code to push an update:
var theData = {"$push":{exercises :{type: "running"}}};
Restangular.all('employees').one(user._id.$oid).customPUT(theData ,null, {apiKey: apiKey});
When I run this and look at the XHR request, the payload is always set to {}.
However, if I pull out the $, my payload looks like this:
{"push":{exercises :{type: "running"}}}
In that instance the payload looks fine, but MongoLab thinks I want to add a field named "push" instead of pushing to the exercises array, since I'm not using the "$push" keyword.
I can have the "$" anywhere in the string except at the front (e.g. " $push" and "push$" work), but unfortunately a leading "$" is exactly what Mongo requires in order to push an update. Is there some setting I'm missing, or is this a bug in Restangular?
Yes, the $ will be stripped: before the data is sent, it is transformed with the angular.toJson function:
@name angular.toJson
@function
@description
Serializes input into a JSON-formatted string. Properties with leading $ characters will be
stripped since angular uses this notation internally.
If you don't want this behavior you have to provide a transformRequest function (http://docs.angularjs.org/api/ng.$http). If your data is already a JSON string you may just write:
transformRequest: function (data) {
    return data;
}
The transformRequest must be provided as an option during the resource configuration; see
http://docs.angularjs.org/api/ngResource.$resource
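With Restangular specifically, one place to plug this in is setDefaultHttpFields (a hedged sketch; the 'app' module name is a placeholder, and JSON.stringify is used instead of returning the data untouched because theData is an object rather than a pre-serialized string):

angular.module('app').config(function (RestangularProvider) {
    RestangularProvider.setDefaultHttpFields({
        // Bypass angular.toJson so keys with a leading "$" (e.g. "$push") survive
        transformRequest: function (data) {
            return angular.isObject(data) ? JSON.stringify(data) : data;
        }
    });
});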