How to send a nested array parameter using Alamofire's multipart form data - Swift

How do I send these parameters as multipart form data?
let dictionary = [
    "user" : [
        "email" : "\(email!)",
        "token" : "\(loginToken!)"
    ],
    "photo_data" : [
        "name" : "Toko Tokoan1",
        "avatar_photo" : photo,
        "background_photo" : photo,
        "phone" : "0222222222",
        "addresses" : [[
            "address" : "Jalan Kita",
            "provinceid" : 13,
            "cityid" : 185,
            "postal" : "45512"
        ]],
        "banks" : [[
            "bank_name" : "PT Bank BCA",
            "account_number" : "292993122",
            "account_name" : "Tukiyeum"
        ]]
    ]
]
I tried the code below, but I can't encode the value (which is an NSDictionary) to UTF-8:
for (key, value) in current_user {
    if key == "avatar_photo" || key == "background_photo" {
        multipartFormData.appendBodyPart(fileURL: value.data(using: String.Encoding.utf8)!, name: key) // error: value is an NSDictionary
    } else {
        multipartFormData.appendBodyPart(data: value.data(using: String.Encoding.utf8)!, name: value) // error: value is an NSDictionary
    }
}
The value passed to appendBodyPart cannot be used because it's an NSDictionary, not a String. What is the right way to put these parameters into multipartFormData?

It is allowed to have nested multiparts.
The use of a Content-Type of multipart in a body part within another multipart entity is explicitly allowed. In such cases, for obvious reasons, care must be taken to ensure that each nested multipart entity must use a different boundary delimiter.
RFC 1341
So you have to do the same thing you did in the outer loop: simply loop through the contents of the nested dictionary, generating key-value pairs. Obviously you have to set a different part delimiter, so the receiver can distinguish a nested part boundary from a top-level part boundary.
Maybe it is easier to send the whole structure as application/json.
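The flattening described above (turning the nested dictionary into flat key/value pairs) can be sketched as follows. Python is used here for brevity, and the bracketed key convention (user[email], photo_data[addresses][0][address]) is an assumption; match whatever naming scheme your server actually parses.

```python
def flatten(params, prefix=""):
    """Recursively flatten nested dicts/lists into flat (key, value) pairs."""
    pairs = []
    if isinstance(params, dict):
        for key, value in params.items():
            new_key = f"{prefix}[{key}]" if prefix else str(key)
            pairs.extend(flatten(value, new_key))
    elif isinstance(params, list):
        for index, value in enumerate(params):
            pairs.extend(flatten(value, f"{prefix}[{index}]"))
    else:
        pairs.append((prefix, params))
    return pairs

# Hypothetical sample data shaped like the dictionary in the question.
dictionary = {
    "user": {"email": "a@b.com", "token": "t0k3n"},
    "photo_data": {
        "name": "Toko Tokoan1",
        "addresses": [{"address": "Jalan Kita", "postal": "45512"}],
    },
}

for key, value in flatten(dictionary):
    print(key, "=", value)
```

Each resulting pair can then be appended to the multipart body as an ordinary string part (for example `user[email] = a@b.com`), with only the photo fields appended as file/data parts.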


GET a string inside an object nested within an array

I have a GET route that fetches orders by timestamp:
router.get('/api/order/:date/:time', function (req, res, next) {
    Box.find({ orders: { date: req.params.date } }).then(function (order) {
        console.log('GET ORDER / with ', req.params.date);
        res.send(order);
    }).catch(next);
});
The :time parameter just lets my frontend call this specific GET; the timestamp itself is carried inside the :date parameter.
Now, the model:
const orderSchema = new Schema({
    name : { type : String },
    date : { type : String }, // date stamp of only YYYY/MM/DD
    orders : { type : Array }
});
Inside this array of orders you can find elements such as:
"orders" : [
    {
        "type" : "Meat Lover Slice",
        "extraType" : "na",
        "extraInfo" : "na",
        "date" : "2018-09-27:08:47:07",
        "quantity" : "1",
        "size" : "medium",
        "crust" : "normal",
        "split" : false,
        and so on.. (15 or so elements)
As you can see, the entries inside this orders array are timestamped as YYYY-MM-DD:HH:MM:SS (e.g. 2018-09-27:08:47:07).
Inside the router, I do get
console.log('GET ORDER / with ', req.params.date); // > GET ORDER / with 2018-09-27:08:47:07
so it does receive the time at this route.
But given the params, how do I filter out the specific orders matching that timestamp?
If I have understood the question correctly, the short answer is that you can't: there is no way to "filter" sub-documents in a standard query. (Have a look at "Return only matched sub-document elements within a nested array" for a more in-depth answer.)
What you could do is either use MongoDB Aggregations (https://docs.mongodb.com/manual/aggregation/) or do the filtering yourself when you have received the query result.
I would suggest doing the filtering yourself.
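Doing the filtering yourself, after fetching the document, amounts to keeping only the sub-documents whose date field matches the requested timestamp. A minimal sketch of that logic (shown in Python for brevity; the same list filter translates directly into the Node handler, and the sample data is illustrative):

```python
# Hypothetical document shaped like the one in the question.
box = {
    "date": "2018-09-27",
    "orders": [
        {"type": "Meat Lover Slice", "date": "2018-09-27:08:47:07"},
        {"type": "Veggie Slice", "date": "2018-09-27:09:15:22"},
    ],
}

def orders_at(box, timestamp):
    """Keep only the sub-documents whose date matches the requested timestamp."""
    return [order for order in box["orders"] if order["date"] == timestamp]

matching = orders_at(box, "2018-09-27:08:47:07")
print(matching)  # only the "Meat Lover Slice" order
```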

Numeric comparisons such as greater-than on a currency amount stored as a string

So I have an example JSON file with fake bank users, and I want to find the people with a balance greater than $1000. How can I do so? Keep in mind the balance is a String value.
EXAMPLE USER INFO:
{
    "_id" : ObjectId("58d1cae3cba624106c8080ab"),
    "isActive" : false,
    "balance" : "$3,495.58",
    "age" : 24,
    "eyeColor" : "blue",
    "name" : "Webster Sanders",
    "gender" : "male",
    "company" : "HALAP",
    "email" : "webstersanders#halap.com",
    "phone" : "+1 (883) 536-2259",
    "address" : "300 Jewel Street, Sugartown, Federated States Of Micronesia, 9305"
}
You could use a regular expression like:
db.users.find({ balance: { $regex: /^\$[1-9][0-9\,]{3,}/ } });
You are making life very difficult for yourself by storing the monetary value as a string. You would be better off storing it as a numeric value, possibly with a string equivalent (for presentation) in a second field if necessary.
Please try the code below:
var currency = "-$4,400.50";
var number = Number(currency.replace(/[^0-9.-]+/g, ""));
First get the balance as a string, then replace "$" and "," with "" (empty strings), then use int.Parse.
For example:
string a = balance;
a = a.Replace("$", "");
a = a.Replace(",", "");
int parsedBalance = int.Parse(a);
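The parse-then-compare approach the answers describe can be sketched end to end as follows (in Python; the helper name and sample data are illustrative):

```python
import re

def parse_currency(s):
    """Strip everything except digits, '.' and '-', then convert to float."""
    return float(re.sub(r"[^0-9.\-]+", "", s))

# Hypothetical users shaped like the document in the question.
users = [
    {"name": "Webster Sanders", "balance": "$3,495.58"},
    {"name": "Jane Doe", "balance": "$912.04"},
]

rich = [u for u in users if parse_currency(u["balance"]) > 1000]
print([u["name"] for u in rich])  # ['Webster Sanders']
```

As the second answer notes, storing the numeric value in its own field in the first place avoids this parsing entirely and lets the database do the comparison.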

Pig's AvroStorage LOAD removes unicode chars from input

I am using Pig to read Avro files and normalize/transform the data before writing it back out. The Avro files have records of the form:
{
    "type" : "record",
    "name" : "KeyValuePair",
    "namespace" : "org.apache.avro.mapreduce",
    "doc" : "A key/value pair",
    "fields" : [ {
        "name" : "key",
        "type" : "string",
        "doc" : "The key"
    }, {
        "name" : "value",
        "type" : {
            "type" : "map",
            "values" : "bytes"
        },
        "doc" : "The value"
    } ]
}
I have used the AvroTools command-line utility in conjunction with jq to dump the first record to JSON:
$ java -jar avro-tools-1.8.1.jar tojson part-m-00000.avro | ./jq --compact-output 'select(.value.pf_v != null)' | head -n 1 | ./jq .
{
"key": "some-record-uuid",
"value": {
"pf_v": "v1\u0003Basic\u0001slcvdr1rw\u001a\u0004v2\u0003DayWatch\u0001slcva2omi\u001a\u0004v3\u0003Performance\u0001slc1vs1v1w1p1g1i\u0004v4\u0003Fundamentals\u0001snlj1erwi\u001a\u0004v5\u0003My Portfolio\u0001svr1dews1b2b3k1k2\u001a\u0004v0\u00035"
}
}
I run the following pig commands:
REGISTER avro-1.8.1.jar
REGISTER json-simple-1.1.1.jar
REGISTER piggybank-0.15.0.jar
REGISTER jackson-core-2.8.6.jar
REGISTER jackson-databind-2.8.6.jar
DEFINE AvroLoader org.apache.pig.piggybank.storage.avro.AvroStorage();
AllRecords = LOAD 'part-m-00000.avro'
    USING AvroLoader()
    AS (key: chararray, value: map[]);
Records = FILTER AllRecords BY value#'pf_v' is not null;
SmallRecords = LIMIT Records 10;
DUMP SmallRecords;
The corresponding record for the last command above is as follows:
...
(some-record-uuid,[pf_v#v03v1Basicslcviv2DayWatchslcva2omiv3Performanceslc1vs1v1w1p1g1i])
...
As you can see, the unicode characters have been removed from the pf_v value. Those characters are actually used as delimiters in these values, so I need them in order to fully parse the records into their desired normalized state. They are clearly present in the encoded .avro file (as demonstrated by dumping the file to JSON). Is anybody aware of a way to get AvroStorage not to strip the unicode characters when loading records?
Thank you!
Update:
I have also performed the same operation using Avro's Python DataFileReader:
import avro.schema
from avro.datafile import DataFileReader, DataFileWriter
from avro.io import DatumReader, DatumWriter

reader = DataFileReader(open("part-m-00000.avro", "rb"), DatumReader())
for rec in reader:
    if 'some-record-uuid' in rec['key']:
        print rec
        print '--------------------------------------------'
        break
reader.close()
This prints a dict in which the unicode characters appear as hex escapes (which is preferable to having them removed entirely):
{u'value': {u'pf_v': 'v0\x033\x04v1\x03Basic\x01slcvi\x1a\x04v2\x03DayWatch\x01slcva2omi\x1a\x04v3\x03Performance\x01slc1vs1v1w1p1g1i\x1a'}, u'key': u'some-record-uuid'}
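For what it's worth, once the control characters survive loading, the pf_v value can be parsed by splitting on them. The sketch below assumes \x04 separates entries and \x03, \x01, and \x1a separate subfields; that layout is an inference from the dumps above, not a documented format:

```python
# Sample value with the same control-character layout as the dumps above.
pf_v = ('v1\x03Basic\x01slcvdr1rw\x1a\x04'
        'v2\x03DayWatch\x01slcva2omi\x1a\x04'
        'v3\x03Performance\x01slc1vs1v1w1p1g1i')

parsed = []
for entry in pf_v.split('\x04'):          # \x04 separates entries
    version, _, rest = entry.partition('\x03')   # \x03 separates version from name
    name, _, payload = rest.partition('\x01')    # \x01 separates name from payload
    parsed.append((version, name, payload.rstrip('\x1a')))  # \x1a terminates payloads

for item in parsed:
    print(item)
```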

Specifying Numbers in VTL on AWS API Gateway

Doc Ref: http://docs.aws.amazon.com/apigateway/latest/developerguide/models-mappings.html
In AWS VTL one specifies dictionary fields in a model schema thus:
"field1" : {"type":"string"},
"field2" : {"type":"number"},
and so a mapping template can populate such fields thus:
#set($inputRoot = $input.path('$'))
"questions" :
[
#foreach($elem in $inputRoot)
{
"field1" : "$elem.field1",
"field2" : $elem.field2
}#if($foreach.hasNext),#end
#end
]
However... my iOS app complains the received data isn't in JSON format. If I add quotes around $elem.field2 then iOS accepts the JSON and converts all fields to strings.
My Lambda function is returning a standard JSON list of dictionaries, with field2 defined as an integer.
But APIG returns strings for all my fields, delimited with {} and a prefix:
{S=some text}
{N=10000000500}
So I can see that field2 isn't a number but a string {N=10000000500}.
How do I handle numbers in this system?
This is undocumented, but you can simply specify the type after the field name in the mapping template:
#set($inputRoot = $input.path('$'))
"questions" :
[
#foreach($elem in $inputRoot)
{
"field1" : "$elem.field1.S",
"field2" : $elem.field2.N
}#if($foreach.hasNext),#end
#end
]
Note that string fields still need to be wrapped in quotes.
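The {S=...}/{N=...} wrappers are DynamoDB-style typed attribute values, which is why appending .S or .N selects the raw value. Outside of VTL, the same unwrapping can be sketched as follows (a rough illustration in Python, not AWS code; the handling of only S and N types is an assumption):

```python
def unwrap(attr):
    """Convert a DynamoDB-style typed attribute ({'S': ...} / {'N': ...}) to a plain value."""
    if "S" in attr:
        return attr["S"]
    if "N" in attr:
        # DynamoDB numbers arrive as strings; convert to int or float.
        return int(attr["N"]) if attr["N"].isdigit() else float(attr["N"])
    raise ValueError("unsupported attribute: %r" % attr)

# Typed item matching the shapes shown in the question.
item = {"field1": {"S": "some text"}, "field2": {"N": "10000000500"}}
plain = {key: unwrap(value) for key, value in item.items()}
print(plain)  # {'field1': 'some text', 'field2': 10000000500}
```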

Querying MongoDB

I have a collection of emails in MongoDB, with a field containing an array of JSON objects.
Knowing the sender and the receiver, how can I find all the emails exchanged between the two people? I need to do something like
db.email.find({"from" : "i.st20#gmail.com", "tos" : "ron#gmail.com"})
but I can't find the right way to write this query :(
> db.emails.findOne()
{
    "from" : {
        "real_name" : "it",
        "address" : "i.st20#gmail.com"
    },
    "tos" : [
        {
            "real_name" : null,
            "address" : "ron#gmail.com"
        }
    ]
}
Use "from.address" and "tos.address":
db.emails.find({"from.address" : "i.st20#gmail.com", "tos.address" : "ron#gmail.com"})
Each field is itself a document; we can target the expected value inside it through dot notation:
db.emails.find({"tos.address" : "ron#gmail.com", "from.address":"i.st20#gmail.com"})
from and tos are objects, and the data inside them can be accessed through the . operator.
So the query will be:
db.emails.find({"from.address" : "i.st20#gmail.com", "tos.address" : "ron#gmail.com"})
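Dot notation reaches into embedded documents, and when the path crosses an array (tos), MongoDB matches the document if any array element satisfies the condition. A rough sketch of that matching rule (an illustration of the semantics in Python, not MongoDB code):

```python
def matches(doc, path, expected):
    """Walk a dotted path; when a list is encountered, match if any element matches."""
    head, _, rest = path.partition(".")
    value = doc.get(head) if isinstance(doc, dict) else None
    if isinstance(value, list):
        return any(matches(elem, rest, expected) for elem in value)
    if rest:
        return isinstance(value, dict) and matches(value, rest, expected)
    return value == expected

# Document shaped like the findOne() output above.
email = {
    "from": {"real_name": "it", "address": "i.st20#gmail.com"},
    "tos": [{"real_name": None, "address": "ron#gmail.com"}],
}

print(matches(email, "from.address", "i.st20#gmail.com"))  # True
print(matches(email, "tos.address", "ron#gmail.com"))      # True
```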