Couchbase Cordova - doc_ids | Need to get uuids - mongodb

I am trying to get the doc _ids in Couchbase Cordova to be in UUID format.
Currently, any doc that is inserted has _ids like -AexsV4lbjOoH-AdlN1Fi0W, --mpIWIHza6CQEJEHRxPKba, etc.
My setup is such that the _ids need to be UUIDs, because the Cordova app will save the local DB as part of a universal MongoDB on the server
(so the DB on the server will store multiple local DBs). Hence, I need the _ids to be UUIDs.
I did some quick research into how to create UUIDs in JS and found quite a few answers, like:
/**
 * Fast UUID generator, RFC4122 version 4 compliant.
 * @author Jeff Ward (jcward.com).
 * @license MIT license
 * @link http://stackoverflow.com/questions/105034/how-to-create-a-guid-uuid-in-javascript/21963136#21963136
 **/
var UUID = (function() {
    var self = {};
    var lut = []; for (var i = 0; i < 256; i++) { lut[i] = (i < 16 ? '0' : '') + (i).toString(16); }
    self.generate = function() {
        var d0 = Math.random()*0xffffffff|0;
        var d1 = Math.random()*0xffffffff|0;
        var d2 = Math.random()*0xffffffff|0;
        var d3 = Math.random()*0xffffffff|0;
        return lut[d0&0xff]+lut[d0>>8&0xff]+lut[d0>>16&0xff]+lut[d0>>24&0xff]+'-'+
            lut[d1&0xff]+lut[d1>>8&0xff]+'-'+lut[d1>>16&0x0f|0x40]+lut[d1>>24&0xff]+'-'+
            lut[d2&0x3f|0x80]+lut[d2>>8&0xff]+'-'+lut[d2>>16&0xff]+lut[d2>>24&0xff]+
            lut[d3&0xff]+lut[d3>>8&0xff]+lut[d3>>16&0xff]+lut[d3>>24&0xff];
    }
    return self;
})();
which generates UUIDs like d6414228-b07c-4bd2-9aa3-d1df8b548de6.
So my question is: is there a way inside the Couchbase PhoneGap plugin to achieve this directly?

When writing a document to Couchbase Lite, you can either let the database pick a random ID (it will be unique) or specify one.
You can send a POST request to have the database generate the document ID.
curl -H 'Content-Type: application/json' \
-vX POST 'http://localhost:5984/app' \
-d '{"name": "john"}'
{"id":"-UwALc1GlcAcG60uag1oMf1","rev":"1-10dc5637dc2ccb55e007440cca73a415","ok":true}
Or a PUT request to specify one.
curl -H 'Content-Type: application/json' \
-vX PUT 'http://localhost:5984/app/john' \
-d '{"name": "john"}'
{"id":"john","rev":"1-10dc5637dc2ccb55e007440cca73a415","ok":true}
Note that from then on you must specify the current revision number to save further updates to the document.
curl -H 'Content-Type: application/json' \
-vX PUT 'http://localhost:5984/app/john?rev=1-10dc5637dc2ccb55e007440cca73a415' \
-d '{"name": "johnny"}'
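Putting the two together: the Cordova app can generate the UUID itself and PUT the document under that ID. Below is a rough, untested sketch against the Couchbase Lite REST listener, reusing the UUID helper from the question and the same URL, port and database name as in the examples above (adjust to whatever your listener actually exposes):
// Rough sketch: create a document whose _id is a client-generated UUID
// via the Couchbase Lite REST listener (URL, port and db name assumed as above).
var docId = UUID.generate(); // e.g. "d6414228-b07c-4bd2-9aa3-d1df8b548de6"
var xhr = new XMLHttpRequest();
xhr.open('PUT', 'http://localhost:5984/app/' + docId, true);
xhr.setRequestHeader('Content-Type', 'application/json');
xhr.onload = function () {
    var res = JSON.parse(xhr.responseText);
    console.log(res.id, res.rev); // keep res.rev around for later updates
};
xhr.send(JSON.stringify({ name: 'john' }));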

Related

How to add the column to Google Sheets using API and provide the name and type of the column in the same call?

So, what I could achieve using the Google Sheets API is creating a new column using the following cURL-based call:
curl -v \
  -H 'Authorization: Bearer ya29.GlxUB9K_96tyQFyQ64eaYOeImtJt32213zjosf6LW1Inv6MOqQCCodA7CycvL5EFKIpeX4dVEebS4rUl24U1J7euhMjqBZq0QEU7ZK1B64THQXNwBpDvTzoUT9hTRg' \
  -H 'Content-Type: application/json' \
  -d '{
    "requests": [
      {
        "insertDimension": {
          "range": {
            "sheetId": 2052094881,
            "dimension": "COLUMNS",
            "startIndex": 0,
            "endIndex": 1
          }
        }
      }
    ]
  }' \
  https://sheets.googleapis.com/v4/spreadsheets/1mHrPXQILuprO4NdqTgrVKlGazvvzgCFqIphGdsmptD8:batchUpdate
While this call is useful, it does not completely help: the reason I wanted to add a column was to give it a name and a type (or format) for the column values, but as per this API, this is all I see as output.
Is there a way to create the column and add a name and type to it in a single API call?
Thanks a lot!
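For what it's worth, batchUpdate accepts several requests in one call, so the insert could in principle be combined with an updateCells request that writes the column header and a repeatCell request that sets the column's number format. A rough, untested sketch of such a request body (posted as the -d payload of the same batchUpdate call shown above) is below; the field names come from my reading of the Sheets API reference, and "MyColumnName" and the NUMBER format are just placeholders:
{
  "requests": [
    {
      "insertDimension": {
        "range": { "sheetId": 2052094881, "dimension": "COLUMNS", "startIndex": 0, "endIndex": 1 }
      }
    },
    {
      "updateCells": {
        "start": { "sheetId": 2052094881, "rowIndex": 0, "columnIndex": 0 },
        "rows": [ { "values": [ { "userEnteredValue": { "stringValue": "MyColumnName" } } ] } ],
        "fields": "userEnteredValue"
      }
    },
    {
      "repeatCell": {
        "range": { "sheetId": 2052094881, "startRowIndex": 1, "startColumnIndex": 0, "endColumnIndex": 1 },
        "cell": { "userEnteredFormat": { "numberFormat": { "type": "NUMBER" } } },
        "fields": "userEnteredFormat.numberFormat"
      }
    }
  ]
}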

MongoDB to BigQuery

What is the best way to export data from MongoDB hosted in mLab to Google BigQuery?
Initially, I am trying to do a one-time load from MongoDB to BigQuery, and later on I am thinking of using Pub/Sub for a real-time data flow to BigQuery.
I need help with the first one-time load from MongoDB to BigQuery.
In my opinion, the best practice is building your own extractor. That can be done in the language of your choice, and you can extract to CSV or JSON.
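As a rough illustration of that extractor approach (not the tool used below), a minimal Node.js sketch with the official mongodb driver might look like this; the connection string, db/collection names and output path are placeholders:
// Minimal sketch of a hand-rolled extractor: stream a collection out of MongoDB
// as newline-delimited JSON, the format BigQuery can load directly.
const { MongoClient } = require('mongodb');
const fs = require('fs');

async function exportToNdjson() {
    const client = await MongoClient.connect('mongodb://user:pass@db-01:27017/sample');
    const out = fs.createWriteStream('sample.ndjson');
    const cursor = client.db('sample').collection('status').find();

    while (await cursor.hasNext()) {
        const doc = await cursor.next();
        doc.stored_at = doc.stored_at.toISOString(); // flatten the ISODate for BigQuery
        out.write(JSON.stringify(doc) + '\n');
    }

    out.end();
    await client.close();
}

exportToNdjson().catch(console.error);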
But if you are looking for a fast way, and your data is not huge and can fit on one server, then I recommend using mongoexport. Let's assume you have a simple document structure such as the one below:
{
    "_id" : "tdfMXH0En5of2rZXSQ2wpzVhZ",
    "statuses" : [
        {
            "status" : "dc9e5511-466c-4146-888a-574918cc2534",
            "score" : 53.24388894
        }
    ],
    "stored_at" : ISODate("2017-04-12T07:04:23.545Z")
}
Then you need to define your BigQuery Schema (mongodb_schema.json) such as:
$ cat > mongodb_schema.json <<EOF
[
    { "name":"_id", "type": "STRING" },
    { "name":"stored_at", "type": "record", "fields": [
        { "name":"date", "type": "STRING" }
    ]},
    { "name":"statuses", "type": "record", "mode": "repeated", "fields": [
        { "name":"status", "type": "STRING" },
        { "name":"score", "type": "FLOAT" }
    ]}
]
EOF
Now the fun part starts :-) extracting data as JSON from your MongoDB. Let's assume you have a cluster with the replica set name statuses, your db is sample, and your collection is status.
mongoexport \
--host statuses/db-01:27017,db-02:27017,db-03:27017 \
-vv \
--db "sample" \
--collection "status" \
--type "json" \
--limit 100000 \
--out ~/sample.json
As you can see above, I limit the output to 100k records, because I recommend you run a sample and load it into BigQuery before doing it for all your data. After running the above command you should have your sample data in sample.json, BUT there is a field $date which will cause an error when loading into BigQuery. To fix that, we can use sed to replace it with a simple field name:
# Fix Date field to make it compatible with BQ
sed -i 's/"\$date"/"date"/g' sample.json
Now you can compress, upload to Google Cloud Storage (GCS), and then load into BigQuery using the following commands:
# Compress for faster load
gzip sample.json
# Move to GCloud
gsutil mv ./sample.json.gz gs://your-bucket/sample/sample.json.gz
# Load to BQ
bq load \
--source_format=NEWLINE_DELIMITED_JSON \
--max_bad_records=999999 \
--ignore_unknown_values=true \
--encoding=UTF-8 \
--replace \
"YOUR_DATASET.mongodb_sample" \
"gs://your-bucket/sample/*.json.gz" \
"mongodb_schema.json"
If everything was okay, go back and remove --limit 100000 from the mongoexport command, then re-run the commands above to load everything instead of the 100k sample.
ALTERNATIVE SOLUTION:
If you want more flexibility and performance is not your concern, then you can use the mongo CLI tool as well. This way you can write your extract logic in JavaScript, execute it against your data, and then send the output to BigQuery. Here is what I did for the same process, but using JavaScript to output CSV so I could load it much more easily into BigQuery:
# Export Logic in JavaScript
cat > export-csv.js <<EOF
var size = 100000;
var maxCount = 1;
for (x = 0; x < maxCount; x = x + 1) {
    var recToSkip = x * size;
    db.entities.find().skip(recToSkip).limit(size).forEach(function(record) {
        var row = record._id + "," + record.stored_at.toISOString();
        record.statuses.forEach(function (l) {
            print(row + "," + l.status + "," + l.score)
        });
    });
}
EOF
# Execute on Mongo CLI
_MONGO_HOSTS="db-01:27017,db-02:27017,db-03:27017/sample?replicaSet=statuses"
mongo --quiet \
"${_MONGO_HOSTS}" \
export-csv.js \
| split -l 500000 --filter='gzip > $FILE.csv.gz' - sample_
# Load all Splitted Files to Google Cloud Storage
gsutil -m mv ./sample_* gs://your-bucket/sample/
# Load files to BigQuery
bq load \
--source_format=CSV \
--max_bad_records=999999 \
--ignore_unknown_values=true \
--encoding=UTF-8 \
--replace \
"YOUR_DATASET.mongodb_sample" \
"gs://your-bucket/sample/sample_*.csv.gz" \
"ID,StoredDate:DATETIME,Status,Score:FLOAT"
TIP: In the above script I used a small trick: piping the output to split so that it is written to multiple files with the sample_ prefix. The split step also gzips the output so it is easier to load into GCS.
From a basic reading of MongoDB's documentation, it sounds like you can use mongoexport to dump your database as JSON. Once you've done that, refer to the BigQuery loading data topic for a description of how to create a table from JSON files after copying them to GCS.
You can read data from MongoDB and stream it to BigQuery. You can find an example in NodeJS here.
This is an extension of the linked example that prevents duplicated records (as long as they are still in the streaming buffer):
const { BigQuery } = require('@google-cloud/bigquery');
const bigqueryClient = new BigQuery();
...
const jsonData = // Array of documents from MongoDB
const inputRows = jsonData.map(row => ({
    insertId: row._id,
    json: row
}));
const insertOptions = {
    raw: true
};
await bigqueryClient
    .dataset(datasetId)
    .table(tableId)
    .insert(inputRows, insertOptions);

Angular2 - Unable to fetch using a 'where' condition in query to a REST API

I am working with Parse Server, and I am trying to fetch data using the REST API in my Angular2 application.
I am trying to fetch data according to a where condition, but I am not able to do so.
This is the code I am using:
constructor(http) {
    var key = new URLSearchParams();
    this.http = http;
    this.disho = null;
    key.set('where', {"estTime":"5min"});
    this.http.get('https://parseapi.example.com/classes/Setto', { where : key }).subscribe(data => {
        this.disho = data.json().results;
        console.log(this.disho);
    });
}
The above code ignores my where condition and returns all the records.
However, the following code returns the right results when executed in a terminal via cURL:
curl -X GET \
-H "X-Parse-Application-Id: someKey" \
-H "X-Parse-REST-API-Key: someKey" \
-G \
--data-urlencode 'where={"estTime":"5min"}' \
https://parseapi.example.com/classes/Setto
Change { where : key } to { search : key }.
It should work!
I think that there is a typo in your code. Single quotes are missing. You could try the following:
key.set('where', '{"estTime":"5min"}');
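Putting both suggestions together (the search option and the stringified where clause), plus the Parse headers from the working cURL call, something along these lines should work with the Angular 2 Http service; Headers and URLSearchParams come from '@angular/http', and the keys are placeholders:
constructor(http) {
    this.http = http;
    this.disho = null;

    // Stringify the where clause and pass the params under `search`, not `where`.
    var params = new URLSearchParams();
    params.set('where', JSON.stringify({ estTime: '5min' }));

    // The REST API also expects the Parse keys on every request.
    var headers = new Headers({
        'X-Parse-Application-Id': 'someKey',
        'X-Parse-REST-API-Key': 'someKey'
    });

    this.http.get('https://parseapi.example.com/classes/Setto', { search: params, headers: headers })
        .subscribe(data => {
            this.disho = data.json().results;
            console.log(this.disho);
        });
}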

Parse: Creating a New Class Programmatically

Is it possible to create a new Class programmatically (i.e. not from the dashboard) via any of the APIs or the Parse CLI?
The REST API appears to have functionality to fetch, modify and delete individual Schemas (classes) but not to add them. (https://parse.com/docs/rest/guide#schemas).
Hoping for something like the following:
curl -X ADD \
-H "X-Parse-Application-Id: XXXXXX" \
-H "X-Parse-Master-Key: XXXXXXXX" \
-H "Content-Type: application/json" \
https://api.parse.com/1/schemas/City
You seem to have skipped the part of the documentation which deals with adding a schema. To create a new class, according to the documentation, you use the following method in cURL:
curl -X POST \
  -H "X-Parse-Application-Id: Your APP Id" \
  -H "X-Parse-Master-Key: Your master key" \
  -H "Content-Type: application/json" \
  -d '
  {
    "className": "Your Class name goes here",
    "fields": {
      "Your field name here": {
        "type": "Your field's data type e.g. String, Int etc. Add multiple fields if you want"
      }
    }
  }' \
  https://api.parse.com/1/schemas/[Your class name]
Or in Python:
import json,httplib
connection = httplib.HTTPSConnection('api.parse.com', 443)
connection.connect()
connection.request('POST', '/1/schemas/Game', json.dumps({
"className":"[Your class name]","fields":{"Your field name":{"type":"your field's data type"} }
}), {
"X-Parse-Application-Id": "7Lo3U5Ei75dragCphTineRMoCfwD7UJjd1apkPKX",
"X-Parse-Master-Key": "ssOXw9z1ni1unx8tW5iuaHCmhIObOn4nSW9GHj5W",
"Content-Type": "application/json"
})
result = json.loads(connection.getresponse().read())
print result
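For completeness, the same schema-creation POST can be issued from Node.js without extra dependencies; a rough sketch (class name, field definitions and keys are placeholders):
// Rough sketch: create a Parse class by POSTing a schema, same call as the cURL above.
const https = require('https');

const body = JSON.stringify({
    className: 'City',
    fields: {
        name: { type: 'String' },
        population: { type: 'Number' }
    }
});

const req = https.request({
    hostname: 'api.parse.com',
    path: '/1/schemas/City',
    method: 'POST',
    headers: {
        'X-Parse-Application-Id': 'Your APP Id',
        'X-Parse-Master-Key': 'Your master key',
        'Content-Type': 'application/json'
    }
}, res => {
    let data = '';
    res.on('data', chunk => (data += chunk));
    res.on('end', () => console.log(res.statusCode, data));
});

req.end(body);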

Spring MVC 3.1 REST services post method return 415

I'm writing a Spring MVC controller and I still have a problem with the POST operation.
I've read many solutions on Stack Overflow without being able to fix my problem.
What I have achieved so far:
I send a GET request with an Id and return an Object converted to JSON successfully.
I fail to send a POST request with a JSON body; the response is 415 UNSUPPORTED_MEDIA_TYPE.
1) I added the Jackson API to my pom.xml: 1.8.5
2) My Spring configuration file:
I added all the necessary parts:
viewResolver
org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter
MappingJacksonHttpMessageConverter
mvc:annotation-driven
scan my controllers
3) My model object is simple : an Account with Id, Name and an amount
#Document
public class Account implements Serializable {
private static final long serialVersionUID = 9058933587701674803L;
#Id
private String id;
private String name;
private Double amount=0.0;
// and all get and set methods
4) and finally my simplified Controller class :
@Controller
public class AdminController {

    @RequestMapping(value="/account", method=RequestMethod.POST,
        headers = {"content-type=application/json"})
    @ResponseStatus( HttpStatus.CREATED )
    public void addAccount(@RequestBody Account account){
        log.debug("account from json request " + account);
    }

    @RequestMapping(value="/account/{accountId}", method=RequestMethod.GET)
    @ResponseBody
    public Account getAccount(@PathVariable("accountId") long id){
        log.debug("account from json request " + id);
        return new Account();
    }
}
5) On the client side I've just executed curl commands:
The successful GET command:
curl -i -GET -H 'Accept: application/json' http://myhost:8080/compta/account/1
The POST command, which failed:
curl -i -POST -H 'Accept: application/json' -d '{"id":1,"name":"test",amount:"0.0"}' http://myhost:8080/compta/account
Any ideas where I'm going wrong?
Well, "UNSUPPORTED_MEDIA_TYPE" should be a hint. Your curl command is actually sending:
Content-Type: application/x-www-form-urlencoded
Simply add an explicit Content-Type header and you're good to go:
curl -v -i -POST -H 'Accept: application/json' -H 'Content-Type: application/json' -d '{"id":1,"name":"test",amount:"0.0"}' http://myhost:8080/compta/account
Try this :
curl -i -POST -H "Accept: application/json" -H "Content-type: application/json" -d '{"id":1,"name":"test",amount:"0.0"}' http://myhost:8080/compta/account
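The same point applies to any HTTP client, not just cURL: the body is JSON, so the Content-Type header has to say so. A minimal fetch sketch against the same (example) endpoint:
// Minimal sketch: without the explicit Content-Type header, Spring answers 415.
fetch('http://myhost:8080/compta/account', {
    method: 'POST',
    headers: {
        'Accept': 'application/json',
        'Content-Type': 'application/json'
    },
    body: JSON.stringify({ id: 1, name: 'test', amount: 0.0 })
}).then(res => console.log(res.status)); // expect 201 Created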