Getting "Cannot read property" in Postman scripts when null is found - REST

I am getting an error when running a script in the Postman Tests tab, when trying to check that a property is not null.
My JSON response:
{
    "statusMessage": "Success",
    "timeStamp": "2018-01-23 05:13:16.7",
    "numberOfRecords": 7,
    "parties": [
        {
            "shippingAddress": null,
            "shippingDetails": null,
            "paExpirationDate": "",
            "historyDate": "01/22/2018",
            "renewal": {
                "renewalRx": "N",
                "priorRxNumber": "",
                "priorSB": "",
                "priorFillNumber": ""
            },
            "noOfRefillingRemaining": "1",
            "ndc": "00074-3799-06",
            "rxId": "7004942",
            "fillingNumber": "0"
        },
        {
            "shippingAddress": {
                "addressLine1": "2150 St",
                "addressLine2": "Home Line 2",
                "city": "Bronx",
                "state": "NY",
                "zipCode": "10453",
                "addressSeqNumber": "1",
                "medFacilityIndicator": "N"
            }
        }
    ]
}
My Postman script is:
var jsonData = JSON.parse(responseBody);
var parties = jsonData.parties;
parties.forEach(function (data) {
    if (data.shippingAddress !== null && data.shippingAddress.addressLine1 !== null) {
        postman.setEnvironmentVariable("addressLine1", data.shippingAddress.addressLine1);
    }
});
I am getting the following error:
"Error running tests for results: TypeError: Cannot read property 'addressLine1' of null"

You could try this; I changed your code slightly, but this should work:
var parties = pm.response.json().parties;
for (let i = 0; i < parties.length; i++) {
    if (parties[i].shippingAddress !== null) {
        pm.environment.set("addressLine1", parties[i].shippingAddress.addressLine1);
    }
}
I tested this locally with the schema that you provided and it wrote 2150 St to my environment file.
The schema you posted doesn't seem to be complete. I think the parties array has a shippingAddress property which is either null or an object containing the shipping address details - I might be wrong, but I can't quite get my head around the data that you posted.
I don't think the check in your if statement is right, and it wouldn't work the way you have it: if the first condition is null (like in your response data) it would never meet the second condition, because the object isn't there, and that data.shippingAddress.addressLine1 reference would always show that error.
Or you could use your code like this:
var jsonData = JSON.parse(responseBody);
var parties = jsonData.parties;
parties.forEach(function (data) {
    if (data.shippingAddress !== null) {
        postman.setEnvironmentVariable("addressLine1", data.shippingAddress.addressLine1);
    }
});
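If you also want to guard against addressLine1 itself being missing or empty, a small variation on the same idea (a sketch using the newer pm API and a plain truthiness check) would be:
var parties = pm.response.json().parties;
parties.forEach(function (data) {
    // Only read addressLine1 once we know shippingAddress is a real object.
    if (data.shippingAddress && data.shippingAddress.addressLine1) {
        pm.environment.set("addressLine1", data.shippingAddress.addressLine1);
    }
});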

Related

A Map-related error in Flutter which happens every time I try to parse it

I have a map which contains two standard key-value pairs, like:
_body = {
  "expected_time": "$Time",
  "payment": "$payment",
  "receiver": {},
};
And another map named receiver inside it as you can see. Values are being passed to the receiver later using a for loop and the information is being added just fine.
for (int i = 0; i <= n; i++) {
  _body['receiver'][i] = {
    "receiver_name": "abc",
  };
}
The issue I'm facing is when trying to send this map to an API call via http.post; jsonEncode(body) is used there to encode the map before sending it. When sending just those simple key-value pairs I get no error, but when I try to include the receiver field as well, I get the error.
Can anyone please tell me what I need to do here? Thanks!
You are not building it in the right way; try this:
var _body = {
  "expected_time": "time",
  "payment": "payment",
  "receiver": {},
};
for (int i = 0; i <= 3; i++) {
  _body.addAll({
    'receiver[$i]': {
      "receiver_name": "abc",
    }
  });
}
print(_body);
and the output is like this
{expected_time: time, payment: payment, receiver: {}, receiver[0]: {receiver_name: abc}, receiver[1]: {receiver_name: abc}, receiver[2]: {receiver_name: abc}, receiver[3]: {receiver_name: abc}}
You can now encode it.
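An alternative sketch, assuming the goal is to keep the receivers nested under the receiver key: jsonEncode only accepts maps whose keys are Strings, so the int keys from the original loop are what make it throw. Using string keys instead (the '$i' keys below are just illustrative) keeps the nesting and still encodes:
import 'dart:convert';

void main() {
  var _body = {
    "expected_time": "time",
    "payment": "payment",
    "receiver": <String, dynamic>{},
  };

  for (int i = 0; i <= 3; i++) {
    // String keys instead of int keys, so jsonEncode can serialize the map.
    (_body['receiver'] as Map<String, dynamic>)['$i'] = {
      "receiver_name": "abc",
    };
  }

  print(jsonEncode(_body));
  // {"expected_time":"time","payment":"payment","receiver":{"0":{"receiver_name":"abc"}, ...}}
}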

Flutter: can't get the full response body from a GET request

I have a GET request in a Flutter app. When I test the request in Postman I get all the data, something like this:
{
    "result": {
        "name": "somename",
        "images": [
            "test.jpg",
            "test2.jpg"
        ],
        "sizes": [
            {
                "id": 1,
                "value": 5
            },
            {
                "id": 2,
                "value": 15
            }
        ]
    }
}
I call the data and print it like this, without using models:
var data = json.decode(response.body);
print(data['result']['name']);
print(data['result']['images']);
print(data['result']['sizes']);
It prints everything except the last one.
Where is the mistake?
Solved, by adding "?sizesView = true" to the link:
final response = await http.get(path + '?sizesView = true');
You should get the last one by index, because each entry of sizes is a dictionary inside a list; do this:
print(data['result']['sizes'][0]['id']); // gets the first element of the sizes list and then its id key
Or you can create a model class to access the items of your list, as sketched below.
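A minimal sketch of that model approach (class and field names here are just illustrative, and a null-safe Dart setup is assumed):
// Illustrative model for one entry of the "sizes" list.
class SizeOption {
  final int id;
  final int value;

  SizeOption({required this.id, required this.value});

  factory SizeOption.fromJson(Map<String, dynamic> json) {
    return SizeOption(id: json['id'], value: json['value']);
  }
}

// Usage: `data` is the decoded response from the question, json.decode(response.body).
final sizes = (data['result']['sizes'] as List)
    .map((e) => SizeOption.fromJson(e as Map<String, dynamic>))
    .toList();
print(sizes.first.id);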

Passing values from an API through another API discord.js

So what I'm trying to do is take some values I get from one API, put them into an array, pass each of them through another API, and then add up the data that comes back.
I already have this working when fetching from one API, but I don't know how to fetch all the values from the first API, put them into an array, and then fetch the value for each item.
let req1 = `https://inventory.roblox.com/v1/users/${imageURL}/assets/collectibles?sortOrder=Asc&limit=100`
let req2 = `https://users.roblox.com/v1/users/${imageURL}`

function getItems() {
    return axios.get(req1)
}
function getUser() {
    return axios.get(req2)
}

await Promise.all([getItems(), getUser(), getValue()])
    .then(function (response) {
        let rap = response[0].data.data
        let items = response[0].data.data.assetId
        let itemarray = 0
        let value = response[2].data.items[items array goes here][4]
        let sum = 0
        for (let i = 0; i < rap.length; i++) { sum += response[0].data.data[i].recentAveragePrice }
    })
This is what I have so far. Any help will be appreciated. Thanks.
Edit:
Here's what req1 returns:
{
    "previousPageCursor": null,
    "nextPageCursor": "49034821991_1_a887f8ff3a6b076aafd71c91a5a4a4fb",
    "data": [
        {
            "userAssetId": 23726188,
            "serialNumber": null,
            "assetId": 7636350,
            "name": "Supa Fly Cap",
            "recentAveragePrice": 23535,
            "originalPrice": null,
            "assetStock": null,
            "buildersClubMembershipType": 0
        },
        {
            "userAssetId": 1329942875,
            "serialNumber": 13717,
            "assetId": 113325603,
            "name": "Recycled Cardboard Shades",
            "recentAveragePrice": 5703,
            "originalPrice": 25,
            "assetStock": 15000,
            "buildersClubMembershipType": 0
        },
        ...
    ]
}
req2 doesn't apply to this part of the code :p
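For the missing piece, here is a rough sketch of one way to wire it up, not a drop-in answer: the second endpoint (valueEndpointFor) and the value field it returns are hypothetical placeholders, since that API's response shape isn't shown in the question.
const axios = require('axios');

// Hypothetical: replace with whatever endpoint actually returns an item's value.
const valueEndpointFor = (assetId) => `https://example.com/items/${assetId}`;

async function sumItemValues(userId) {
    const inventoryUrl = `https://inventory.roblox.com/v1/users/${userId}/assets/collectibles?sortOrder=Asc&limit=100`;
    const inventory = await axios.get(inventoryUrl);

    // 1. Pull every assetId out of the first response into an array.
    const assetIds = inventory.data.data.map((item) => item.assetId);

    // 2. Fire one request per assetId and wait for all of them.
    const responses = await Promise.all(
        assetIds.map((id) => axios.get(valueEndpointFor(id)))
    );

    // 3. Add up whatever value field the second API returns (placeholder name).
    return responses.reduce((sum, res) => sum + (res.data.value || 0), 0);
}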

AppSync: pipeline resolver #return null result

I'm successfully using a pipeline resolver to persist a parent/child relationship, except when the list of child items is empty and I #return early.
I'm guessing the issue is around my response mappers and use of $ctx.prev vs $ctx.result but I can't figure it out.
The pipeline looks like this:
BEFORE template: {}
Function 1:
request = PutItem the parent
response = $utils.toJson($ctx.result)
Function 2:
request = TransactWriteItems (foreach UpdateItem) the children
response = $utils.toJson($ctx.prev.result)
AFTER template: $utils.toJson($ctx.prev.result)
When I call the mutation with
{"parentAttribute":"foo", "children": [{"childAttribute": "bar"}]}
I get a good response like:
{
    "data": {
        "createFoo": {
            "parentAttribute": "foo",
            "children": [
                {
                    "childAttribute": "bar"
                }
            ]
        }
    }
}
If there are no children, the Function 2 request mapper does an early #return to avoid the "TransactWriteItems must have at least one operation" error.
In this scenario I am hoping for the above response to the mutation, just with children: []
Instead, I get:
{
    "data": {
        "createFoo": null
    }
}
The data has been written correctly; if I query it I get back the parent with an empty list of children.
How do I get this pipeline to execute so that it returns the combined parent+child data whether the child array is populated or not?
Detail
The schema is something like:
type Foo {
  id: String!
  attr1: String
  bars: [Bar]
}
type Bar {
  id: String!
  attr2: String
}
type Mutation {
  createFoo(foo: Foo): Foo
}
And a dynamodb representation like this:
pk     | sk             | attr1 | attr2
FOO#1  | METADATA#FOO#1 | Lorem |
FOO#1  | BAR#1          |       | Ipsum
While the pipeline looks like:
before.vtl
{}
createParent-request.vtl
{
    "version": "2017-02-28",
    "operation": "PutItem",
    "key": {
        "pk": $util.dynamodb.toDynamoDBJson(...),
        "sk": $util.dynamodb.toDynamoDBJson(...)
    },
    "attributeValues": {
        "data": $util.dynamodb.toDynamoDBJson(...)
    }
}
createParent-response.vtl
#if($ctx.error)
$utils.error($ctx.error.message, $ctx.error.type)
#end
$utils.toJson($ctx.result)
createChildren-request.vtl
#if($ctx.args.fooInput.children.size() > 0)
{
    "version": "2018-05-29",
    "operation": "TransactWriteItems",
    "transactItems": [
        #foreach( $child in $ctx.args.fooInput.children )
        {
            "table": "${table}",
            "operation": "UpdateItem",
            "key": {
                "pk": $util.dynamodb.toDynamoDBJson(...),
                "sk": $util.dynamodb.toDynamoDBJson(...)
            },
            "update": {
                "expression": "SET #data = :data",
                "expressionNames": {
                    "#data": "data"
                },
                "expressionValues": {
                    ":data": $util.dynamodb.toDynamoDBJson(...)
                }
            }
        }
        #if( $foreach.hasNext ),#end
        #end
    ]
}
#else
    #return
#end
createChildren-response.vtl
#if($ctx.error)
$utils.error($ctx.error.message, $ctx.error.type)
#end
$utils.toJson($ctx.prev.result)
after.vtl
#if($ctx.error)
$utils.error($ctx.error.message, $ctx.error.type)
#end
$utils.toJson($ctx.prev.result)
I figured it out. For the expected behaviour, one needs the 'after' mapper to return the necessary JSON to populate the overall mutation response. In my example above, after.vtl needs to return a parent and nothing else matters (in particular, the result of the individual function response mappers).
I ended up putting the output of the 'create parent' operation into ctx.stash then returning ctx.stash in after.vtl, setting the other resolvers to {}.
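A rough sketch of that wiring, using the file names from the example above (exactly what you stash depends on your schema, and what the function response mappers return no longer matters once after.vtl reads from the stash):
## createParent-response.vtl: stash the parent so later steps can't lose it
#if($ctx.error)
    $utils.error($ctx.error.message, $ctx.error.type)
#end
$util.qr($ctx.stash.put("parent", $ctx.result))
$utils.toJson($ctx.result)

## after.vtl: return the stashed parent whether or not the children step ran
#if($ctx.error)
    $utils.error($ctx.error.message, $ctx.error.type)
#end
$utils.toJson($ctx.stash.parent)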
Note that, if your response type has subtypes (with their own resolvers) and you return it sparsely, AppSync will still call those resolvers. In the context of my example, it's enough to return the parent without any children, and then the normal "get children of a parent" query resolver will run to populate the final response.

Mongoose - can't insert subDocuments of a Dictionary Type

I have a Mongoose schema for the document Company, that has several fields. One of these (documents_banks) is a "free" field, of dictionary type, because I don't know the names of the keys in advance.
The problem is that when I save the document (company.save()), even though the returned saved document has the new sub_docs, no new sub_docs are actually saved in the DB.
var Company = new Schema({
    banks: [{ type: String }], // array of Strings
    documents_banks: {}        // free field
});
Even though documents_banks is not restricted by the schema, it will have this structure (in my mind):
{
    "bank_id1": {
        "doc_type1": {
            "url": { "type": "String" },
            "custom_name": { "type": "String" }
        },
        "doc_type2": {
            "url": { "type": "String" },
            "custom_name": { "type": "String" }
        }
    },
    "bank_id2": {
        "doc_type1": {
            "url": { "type": "String" },
            "custom_name": { "type": "String" }
        }
    }
}
But I don't know the names of the bank_id or doc_type keys in advance, so I used the dictionary type (documents_banks: {}).
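(As a side note: in Mongoose an empty object literal in a schema is treated as the Mixed type, so a more explicit, roughly equivalent declaration would be something like this sketch:)
var mongoose = require('mongoose');
var Schema = mongoose.Schema;

// More explicit declaration of the free-form field.
var Company = new Schema({
    banks: [{ type: String }],
    documents_banks: { type: Schema.Types.Mixed, default: {} }
});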
Below is the function I use to save new sub_docs in documents_banks, with the same logic I always use to save new sub_docs. This time, though, they seem to be saved but are not.
function addBankDocument(company_id, bank_id, doc_type, url, custom_name) {
    // retrieve the company document
    Company.findById(company_id)
        .then(function(company) {
            // create empty sub_docs if needed
            if (!company.documents_banks) {
                company.documents_banks = {};
            }
            if (!company.documents_banks[bank_id]) {
                company.documents_banks[bank_id] = {};
            }
            // add the new sub_doc
            company.documents_banks[bank_id][doc_type] = {
                "url": url,
                "custom_name": custom_name
            };
            return company.save();
        })
        .then(function(saved_company) {
            // I try to check if the new obj has been saved
            console.log(saved_company.documents_banks[bank_id][doc_type]);
            // and it actually prints the new obj!!
        });
}
The saved_company returned by .save() actually has the new sub_docs, but if I check the DB the new sub_doc is not there! I can only save the first one; all the others are not stored.
So the console.log() always prints the new sub_docs, but in the database only the first sub_doc is saved, not the others. In the end, the stored company always has just one sub_doc, the first one.
It seems very strange to me, since saved_company has the new sub_docs. What can have happened?
Below is a real extract from my DB; it will only ever contain the sub_doc "doc_bank#1573807781414", the others will never be present in the DB.
{
    "_id": "5c6eaf8efdc21500146e289c",   // company_id
    "banks": [ "MPS" ],
    "documents_banks": {
        "5c5ac3e025acd98596021a9a": {    // bank_id
            "doc_bank#1573807781414": {  // doc_type
                "url": "http://...",
                "custom_name": "file1"
            }
        }
    }
}
Versions:
$ npm -v
6.4.1
$ npm show mongoose version
5.7.11
$ node -v
v8.16.0
It seems that, since Mongoose doesn't know the exact model of the subdoc, it can't know when it changes. So I have to use markModified to notify it of changes to the "free field" (also known as a dictionary or Mixed type), like this:
company_doc.documents_banks["bank_id2"]["doc_type3"] = obj; // modify
company_doc.markModified('documents_banks'); // <--- notify changes
company_doc.save(); // save changes
As I understand it, markModified forces the model to 'update' that field during save().
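Applied to the original addBankDocument function, a minimal sketch of the fix would be (same logic as above, just with the markModified call added before saving):
function addBankDocument(company_id, bank_id, doc_type, url, custom_name) {
    return Company.findById(company_id)
        .then(function(company) {
            if (!company.documents_banks) {
                company.documents_banks = {};
            }
            if (!company.documents_banks[bank_id]) {
                company.documents_banks[bank_id] = {};
            }
            company.documents_banks[bank_id][doc_type] = {
                "url": url,
                "custom_name": custom_name
            };
            // The Mixed/dictionary field is invisible to Mongoose's change
            // tracking, so flag it as modified before saving.
            company.markModified('documents_banks');
            return company.save();
        });
}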