CloudKit REST API

I'm trying to use the CloudKit API to create records in the public database from my server, using a server-to-server key. I'm using the following shell script to generate the curl command. When I run it, the response from Apple just says there was an internal error.
{
"uuid" : "a6415feb-168b-4615-9577-10c5168d7d7c",
"serverErrorCode" : "INTERNAL_ERROR"
}
This is the script I'm using:
#!/bin/sh
subpath=/database/1/iCloud.com.mycompany.myapp/development/public/records/modify
date=`date -u +"%Y-%m-%dT%H:%M:%SZ"`
body='
{
"operations": [
{
"operationType": "forceReplace",
"record": {
"recordType": "Drawing",
"fields": {
"date": "2021-01-09T12:00:00Z",
"numbers": [14, 26, 38, 45, 46, 13],
"type": 1
}
},
"recordName": "powerball20210109"
}
],
"atomic": true
}
'
encoded=`echo $body | base64`
signature="$date:$encoded:$subpath"
curl -X POST https://api.apple-cloudkit.com$subpath \
-H 'Content-Type: application/json' \
-H 'X-Apple-Cloudkit-Request-KeyID: myKeyHere' \
-H "X-Apple-CloudKit-Request-ISO8601Date: $date" \
-H "X-Apple-CloudKit-Request-SignatureV1: $signature" \
-d "$body"
The CloudKit schema shows Drawing as a custom type with three properties:
date is a Date/Time
numbers is an Int(64) (List)
type is an Int(64)

I can see a couple of issues. Every field needs to be an object with a value property. Also, dates are saved to CloudKit as an integer number of milliseconds since January 1, 1970 (the Unix epoch).
I've never tried saving an array of integers like that, but it looks right.
Try this:
"fields":{
"date": { "value" : 1610193600000 },
"numbers": { "value" : [14, 26, 38, 45, 46, 13] },
"type": { "value" : 1 }
}
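Putting those two fixes together, the body in the original script might look something like the sketch below (1610193600000 is 2021-01-09T12:00:00Z expressed in milliseconds since the epoch). I've also moved recordName inside the record object, which is where the records/modify reference appears to expect it, but double-check that against the CloudKit Web Services docs; the key ID and signing parts of the script are left untouched here:
body='
{
  "operations": [
    {
      "operationType": "forceReplace",
      "record": {
        "recordType": "Drawing",
        "recordName": "powerball20210109",
        "fields": {
          "date": { "value": 1610193600000 },
          "numbers": { "value": [14, 26, 38, 45, 46, 13] },
          "type": { "value": 1 }
        }
      }
    }
  ],
  "atomic": true
}
'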

Related

InfluxDB: Query data from measurement via REST API

I am using the REST API of InfluxDB
curl -s -XPOST -G influxdb_url:8086/query?pretty=true --data-urlencode "db=metrics" --data-urlencode "q=SHOW MEASUREMENTS WITH MEASUREMENT=~/a.b-c*/"
to retrieve the measurements available
{
"results": [
{
"statement_id": 0,
"series": [
{
"name": "measurements",
"columns": [
"name"
],
"values": [
[
"a.b-c"
],
[
"a.b-cd"
],
[
"a.b-cde"
],
[
"a.b-cdfg"
]
]
}
]
}
]
}
and then I try to select all the data from one of them:
curl -s -XPOST -G influxdb_url:8086/query?pretty=true --data-urlencode "db=metrics" --data-urlencode "q=SELECT * FROM "a.b-c""
and I am getting this error
{
"error": "error parsing query: found -, expected ; at line 1, char 18"
}
The exact same query works if I log in to the Influx instance:
select * from "a.b-c"
The issue here is the double quotes around "a.b-c". Because the whole --data-urlencode argument is itself wrapped in double quotes, the shell strips the inner quotation marks, so InfluxDB never sees the identifier quotes and trips over the - in the measurement name. To get the quotation marks through to InfluxDB, wrap the string in single quotes instead (the triple single quotes below are equivalent to a single pair). In this particular case the correct way is the following:
curl -s -XPOST -G influxdb_url:8086/query?pretty=true --data-urlencode "db=metrics" --data-urlencode '''q=SELECT * FROM "a.b-c"'''
which in turn gives you the desired outcome (an equivalent approach that keeps the outer double quotes and escapes the inner ones is sketched after the example below). An example of a response using a test dataset:
{
"results": [
{
"statement_id": 0,
"series": [
{
"name": "a.b-c",
"columns": [
"time",
"m",
"value"
],
"values": [
[
"2020-10-06T15:13:29.562248587Z",
"a",
0
]
]
}
]
}
]
}
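If you prefer to keep the outer double quotes, you can instead backslash-escape the identifier quotes so they survive the shell; a minimal equivalent of the same query:
curl -s -XPOST -G influxdb_url:8086/query?pretty=true \
  --data-urlencode "db=metrics" \
  --data-urlencode "q=SELECT * FROM \"a.b-c\""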

Unknown SeaweedFs Filer API response format

I've installed the latest SeaweedFS version (30GB 1.72 linux amd64) using docker-compose, and I'm running the master, volume and filer servers.
The whole system seems to work OK: I can upload and download files. However, the API response when I query a directory is different from the response shown in the official docs.
For instance, when I query the /dir1 directory with the command:
curl -H "accept: application/json" localhost:8888/dir1/?pretty=y
The response is like:
{
"Path": "/dir1",
"Entries": [
{
"FullPath": "/dir1/nyfile.bin",
"Mtime": "2020-04-16T17:56:55Z",
"Crtime": "2020-04-16T17:56:55Z",
"Mode": 432,
"Uid": 1000,
"Gid": 1000,
"Mime": "application/octet-stream",
"Replication": "000",
"Collection": "",
"TtlSec": 0,
"UserName": "",
"GroupNames": null,
"SymlinkTarget": "",
"Md5": "zQnaPjjZsQpiU+N3RXp7GQ==",
"Extended": null,
"chunks": [
{
"file_id": "7,030d2d9790",
"size": 55320265,
"mtime": 1587059815546104803,
"e_tag": "7b71a215",
"fid": {
"volume_id": 7,
"file_key": 3,
"cookie": 221091728
}
}
]
}
],
"Limit": 100,
"LastFileName": "weed.bin",
"ShouldDisplayLoadMore": false
}
That response is quite different from the example in the docs (https://github.com/chrislusf/seaweedfs/wiki/Filer-Server-API):
> curl -H "Accept: application/json" "http://localhost:8888/javascript/?pretty=y" # list all files under /javascript/
{
"Directory": "/javascript/",
"Files": [
{
"name": "new_name.js",
"fid": "3,034389657e"
},
{
"name": "report.js",
"fid": "7,0254f1f3fd"
}
],
"Subdirectories": null
}
So, I've got some questions:
Where is the documentation (if it exists) for the new Filer REST API?
How can I figure out what is a file and what is a directory with the new API?
Currently I'm using the "chunks" property: if there is a "chunks" entry it's a file, otherwise it's a directory.
How can I get the size of a file? Should I sum the size property over all its chunks?
The documentation is not there yet. The REST API is not used internally; use gRPC for more advanced usages.
Mode follows Go's os.FileMode (https://golang.org/pkg/os/#FileMode). Use the os.ModeDir bit to determine whether an entry is a file or a directory.
Chunks may have overlaps. Use the highest watermark (the largest offset + size across the chunks) for the file size.
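For what it's worth, here is a rough sketch of how those last two points could be applied to the listing shown above, assuming jq is available. os.ModeDir is the top bit of the Go FileMode (1 << 31 = 2147483648), and each chunk may carry an offset field (omitted when zero) alongside size; treat both assumptions as things to verify against your own filer output:
# Flag each entry as file or directory via the ModeDir bit, and estimate
# file size as the highest chunk watermark (offset + size).
curl -s -H "Accept: application/json" "http://localhost:8888/dir1/?pretty=y" | jq -r '
  .Entries[] |
  if .Mode >= 2147483648 then
    "\(.FullPath)\tdirectory"
  else
    "\(.FullPath)\tfile\tsize=\([.chunks[]? | (.offset // 0) + .size] | max // 0)"
  end'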

Firebase Firestore REST example

Hello, I am looking to write a script which uses Firebase Firestore and writes some JSON to a specific collection in Firestore. I have done this with the Realtime Database, but Firestore is a tad different. Below is my Realtime DB snippet that works:
curl -X POST \
-d '{"param1":"'"$1"'", "param2":"'"$2"'"}' \
https://xxxx.firebaseio.com/xxxx.json?
Thanks for the help
After reading the documentation I got to this:
curl -X POST \
-H "Content-Type: application/json" \
-d'{
"fields": {
"Field1": {
"stringValue": "'"$var1"'"
},
"Field2": {
"stringValue": "'"$var2"'"
},
"Field3": {
"stringValue": "$var3"
}
}
}'\"https://firestore.googleapis.com/v1beta1/projects/**PROJECT_ID**/databases/(default)/documents/**COLLECTION_ID**?&key=(YOUR API KEY)"
The accepted answer helped me, but it took me a long time to figure out how I can use data types other than stringValue, so I am adding this answer hoping someone finds it helpful in the future.
curl -X POST \
-H "Content-Type: application/json" \
-d' {
"fields": {
"Field1": {
"arrayValue": {
"values": [{
"mapValue": {
"fields": {
"key1": {
"stringValue": "val1"
},
"key2": {
"stringValue": "val2"
}
}
}
}]
}
},
"Field2": {
"integerValue": <intValue>
},
"Field3": {
"stringValue": "var3"
}
}
}'\"https://firestore.googleapis.com/v1beta1/projects/**PROJECT_ID**/databases/(default)/documents/**COLLECTION_ID**?&key=<YOUR WEB API KEY>"
Use this for reference.
Example field value:
var data = {
  "fields": {
    "productName": { "stringValue": dealname.toString() },
    "companyname": { "stringValue": companyName.toString() },
    "contact": { "stringValue": contact.toString() },
    "email": { "stringValue": email.toString() },
    "domain": { "stringValue": domain.toString() },
    "createdate": { "stringValue": createdate.toString() },
    "salesCode": { "stringValue": code.toString() },
    "price": { "stringValue": amount.toString() },
    "phone": { "stringValue": phone.toString() },
    "orderId": { "stringValue": orderId.toString() }
  }
};
More information: see the Firestore REST API documentation.
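Other Value types follow the same pattern. A hedged sketch with a few of them (placeholder project, collection, and API key as in the answers above; the field names are made up for illustration):
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "fields": {
      "active": { "booleanValue": true },
      "price": { "doubleValue": 19.99 },
      "createdAt": { "timestampValue": "2021-01-09T12:00:00Z" }
    }
  }' \
  "https://firestore.googleapis.com/v1beta1/projects/**PROJECT_ID**/databases/(default)/documents/**COLLECTION_ID**?key=<YOUR WEB API KEY>"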

How can I use the BigQuery REST API from the command line?

Attempting to make a plain GET request to one of the BigQuery REST APIs gives an error that looks like this:
curl https://www.googleapis.com/bigquery/v2/projects/$PROJECT_ID/jobs/$JOBID
Output:
{
"error": {
"errors": [
{
"domain": "global",
"reason": "required",
"message": "Login Required",
"locationType": "header",
"location": "Authorization",
...
What is the correct way to invoke one of the REST APIs from the command-line, such as the query or insert APIs? The API reference has a "Try this API" feature, but the examples don't translate directly to something you can run from the command-line.
As a disclaimer: when working from the command-line, the bq tool will usually be sufficient, and for more complex use cases the BigQuery client libraries enable programming with BigQuery from multiple languages. Even so, it can be useful to make plain requests to the REST APIs to see how certain APIs work at a low level.
First, make sure that you have installed the Google Cloud SDK. This should include the gcloud and bq command-line tools. If you haven't already, authorize your account by running this command from your terminal:
gcloud auth login
This should prompt you to log in and then give you an access code that you can paste into your terminal. (The exact process may change over time).
Now let's try a query using the BigQuery REST API, calling the jobs.query method. Modify this script with your own project name, which you can find from the Google Cloud Console, then paste the script into your terminal:
PROJECT="YOUR_PROJECT_NAME"
QUERY="\"SELECT 1 AS x, 'foo' AS y;\""
REQUEST="{\"kind\":\"bigquery#queryRequest\",\"useLegacySql\":false,\"query\":$QUERY}"
echo $REQUEST | \
curl -X POST -d @- -H "Content-Type: application/json" \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
https://www.googleapis.com/bigquery/v2/projects/$PROJECT/queries
If it worked, you should see output that looks like this:
{
"kind": "bigquery#queryResponse",
"schema": {
"fields": [
{
"name": "x",
"type": "INTEGER",
"mode": "NULLABLE"
},
{
"name": "y",
"type": "STRING",
"mode": "NULLABLE"
}
]
},
"jobReference": {
"projectId": "<your project ID>",
"jobId": "<your job ID>"
},
"totalRows": "1",
"rows": [
{
"f": [
{
"v": "1"
},
{
"v": "foo"
}
]
}
],
"totalBytesProcessed": "0",
"jobComplete": true,
"cacheHit": false
}
If you haven't set up the bq command-line tool, you can use bq init from your terminal to do so. Once you have, you can try running the same query using it:
bq query --use_legacy_sql=False "SELECT 1 AS x, 'foo' AS y;"
You can also see the REST API requests that the bq tool makes by passing the --apilog= option:
bq --apilog= query --use_legacy_sql=False "SELECT [1, 2, 3] AS x;"
Now let's try an example using the jobs.insert method instead of the query API. Run this script, replacing YOUR_PROJECT_NAME with your project name:
PROJECT="YOUR_PROJECT_NAME"
QUERY="\"SELECT 1 AS x, 'foo' AS y;\""
REQUEST="{\"configuration\":{\"query\":{\"useLegacySql\":false,\"query\":${QUERY}}}}"
echo $REQUEST | \
curl -X POST -d @- -H "Content-Type: application/json" \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
https://www.googleapis.com/bigquery/v2/projects/$PROJECT/jobs
Unlike the query API, which returned the results immediately, here you get back a job resource that looks similar to this:
{
"kind": "bigquery#job",
"etag": "\"<etag string>\"",
"id": "<project name>:<job ID>",
"selfLink": "https://www.googleapis.com/bigquery/v2/projects/<project name>/jobs/<job ID>",
"jobReference": {
"projectId": "<project name>",
"jobId": "<job ID>"
},
"configuration": {
"query": {
"query": "SELECT 1 AS x, 'foo' AS y;",
"destinationTable": {
"projectId": "<project name>",
"datasetId": "<anonymous dataset>",
"tableId": "<anonymous table>"
},
"createDisposition": "CREATE_IF_NEEDED",
"writeDisposition": "WRITE_TRUNCATE",
"useLegacySql": false
}
},
"status": {
"state": "RUNNING"
},
"statistics": {
"creationTime": "<timestamp millis>",
"startTime": "<timestamp millis>"
},
"user_email": "<your email address>"
}
Notice the status:
"status": {
"state": "RUNNING"
},
If you want to check on the job now, you can use the jobs.get method. Similar to before, run this from your terminal, using the job ID from the output in the previous step:
PROJECT="YOUR_PROJECT_NAME"
JOB_ID="YOUR_JOB_ID"
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
https://www.googleapis.com/bigquery/v2/projects/$PROJECT/jobs/$JOB_ID
If the query is done, you'll get a response that indicates as much:
...
"status": {
"state": "DONE"
},
...
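If the job is still running, one option is to poll jobs.get until the state flips to DONE. A minimal sketch reusing the same endpoint, assuming Python is available for JSON parsing:
while true; do
  STATE=$(curl -s -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    https://www.googleapis.com/bigquery/v2/projects/$PROJECT/jobs/$JOB_ID \
    | python -c 'import json, sys; print(json.load(sys.stdin)["status"]["state"])')
  echo "Job state: $STATE"
  if [ "$STATE" = "DONE" ]; then break; fi
  sleep 2
done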
Finally, we can make a request to fetch the query results, also using the REST API.
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
https://www.googleapis.com/bigquery/v2/projects/$PROJECT/queries/$JOB_ID
The output will look similar to when we used the jobs.query method above:
{
"kind": "bigquery#getQueryResultsResponse",
"etag": "\"<etag string>\"",
"schema": {
"fields": [
{
"name": "x",
"type": "INTEGER",
"mode": "NULLABLE"
},
{
"name": "y",
"type": "STRING",
"mode": "NULLABLE"
}
]
},
"jobReference": {
"projectId": "<project ID>",
"jobId": "<job ID>"
},
"totalRows": "1",
"rows": [
{
"f": [
{
"v": "1"
},
{
"v": "foo"
}
]
}
],
"totalBytesProcessed": "0",
"jobComplete": true,
"cacheHit": true
}

Doubts about notification format in Orion using APIv2

We are testing the subscription functionality using APIv2, following the guidelines described in http://telefonicaid.github.io/fiware-orion/api/v2/. We are able to create a subscription correctly, but the format of the notification messages we receive is not what we expected: it looks like the APIv1 format. Is this the expected behavior?
We are using the Docker image from https://hub.docker.com/r/fiware/orion/.
Version information about the build:
{
"orion" : {
"version" : "1.0.0-next",
"uptime" : "0 d, 1 h, 28 m, 47 s",
"git_hash" : "a729812c45d2749fffbc19add17631b2fffc8797",
"compile_time" : "Fri Apr 8 10:05:55 UTC 2016",
"compiled_by" : "root",
"compiled_in" : "838a42ae8431"
}
}
Steps to reproduce:
Create an entity:
(curl -X POST http://<cb_url>:<cb_port>/v2/entities?options=keyValues -s -S --header 'Content-Type: application/json' \
--header 'Accept: application/json' -d @- | python -mjson.tool) <<EOF
{
"type":"Room",
"id": "test",
"humidity":40
}
EOF
Create a subscription:
(curl -X POST http://<cb_url>:<cb_port>/v2/subscriptions -s -S --header 'Content-Type: application/json' \
--header 'Accept: application/json' -d @- | python -mjson.tool) <<EOF
{
"description": "One subscription to rule them all",
"subject": {
"entities": [
{
"idPattern": ".*",
"type": "Room"
}
],
"condition": {
"attributes": [
"humidity"
],
"expression": {
"q": "humidity>40"
}
}
},
"notification": {
"callback": "http://192.168.99.1:5000",
"attributes": [
"humidity"
],
"throttling": 5
},
"expires": "2016-05-05T14:00:00.00Z"
}
EOF
Update the attribute:
(curl -X PUT <cb_url>:<cb_port>/v2/entities/test/attrs/humidity/value -s -S --header 'Content-Type: application/json' \
--header 'Accept: application/json' -d @- | python -mjson.tool) <<EOF
{
"value": 50
}
EOF
We get the notification with the following format:
{u'contextResponses': [
{u'contextElement': {
u'attributes': [{
u'name': u'humidity',
u'type': u'none',
u'value': {u'value': 50}
}],
u'id': u'test',
u'isPattern': u'false',
u'type': u'Room'
},
u'statusCode': {
u'code': u'200',
u'reasonPhrase': u'OK'
}}],
u'originator': u'localhost',
u'subscriptionId': u'5707b72882fc213130f4e5b9'}
NGSIv2 notification formats have not yet been implemented (as of Orion 1.0.0). Note that NGSIv2 is still in beta status, and sometimes the specification (where the new notification format is defined) is a step ahead of the implementation.
There is a GitHub issue about this, to which you can subscribe in order to know when this feature gets implemented.
EDIT: NGSIv2 notification formats have been implemented in Orion 1.1.0.
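For reference, the NGSIv2 normalized notification format defined in the specification groups the same information under a data array. A rough sketch of what the notification above would look like in that format (attribute type and value mirrored from the v1 payload; exact metadata and typing may vary by Orion version):
{
  "subscriptionId": "5707b72882fc213130f4e5b9",
  "data": [
    {
      "id": "test",
      "type": "Room",
      "humidity": {
        "type": "none",
        "value": { "value": 50 },
        "metadata": {}
      }
    }
  ]
}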