Tcl rest interface POST with query - rest

Using the tcllib rest package in "interface mode", how would one POST key/value pairs while also passing key/value pairs in the query string?
Given one=foo, two=bar for the query string, and three=baz, four=qux for the form submission, how would the config array be structured and how would the call be implemented?
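
Setting the Tcl specifics aside for a moment, it helps to pin down the request being asked for: a POST whose URL carries one=foo&two=bar as the query string and whose application/x-www-form-urlencoded body carries three=baz&four=qux. Below is a minimal sketch of that target request in TypeScript (not the tcllib rest config itself; the endpoint URL is a placeholder):

// Sketch of the target HTTP request only, not the rest config array.
// one/two travel in the query string, three/four in the form-encoded body.
async function postWithQuery(): Promise<number> {
  const query = new URLSearchParams({ one: "foo", two: "bar" });
  const form = new URLSearchParams({ three: "baz", four: "qux" });
  const response = await fetch(`https://example.com/endpoint?${query}`, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: form.toString(),
  });
  return response.status;
}

Whatever the config array ends up looking like, the resulting request should match that shape: query pairs on the URL, form pairs in the body.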

Related

Creating a REST call with multiple document IDs

I have this URL string:
http://localhost:9033/api/v1/myapi/account/123456/collection/COLL12345
I want to amend it to also include document IDs. There could be one or more documents, so I think these should be in some form of array. Should I change my API call from GET to POST and include the document IDs as a JSON array in the message body, or is it possible to construct a URL string with them?
You can add them either to the URI path or to the URI query. I would use the query for this, something like: /api/v1/myapi/account/123456/collection/COLL12345?ids=[1,4,5]. You need to URI-encode the [1,4,5] part, then URI-decode and JSON-parse it on the server.
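As a rough sketch of that encode/decode round trip (TypeScript; the base URL is the one from the question, and any server framework details are left out):

// Client side: attach the document IDs as a URI-encoded JSON array in the query string.
const ids = [1, 4, 5];
const base = "http://localhost:9033/api/v1/myapi/account/123456/collection/COLL12345";
const url = `${base}?ids=${encodeURIComponent(JSON.stringify(ids))}`;
// -> ...COLL12345?ids=%5B1%2C4%2C5%5D

// Server side: read the parameter back (URLSearchParams URI-decodes it) and JSON-parse it.
const raw = new URL(url).searchParams.get("ids");
const parsedIds: number[] = raw ? JSON.parse(raw) : [];
console.log(parsedIds); // [1, 4, 5]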

Stop mongoose from interpreting a string as a (hex) number

I have stored crypto account addresses in MongoDB and want to retrieve documents using the addresses. One example looks like this: "0x754B2f9797b096f417a24686c4B368F508...". My schema specifies that the addresses are strings, so the addresses are stored in MongoDB as strings and look like the example above. I can retrieve documents as expected in mongosh (the shell). However, with mongoose (using find, findOne, and so on) I cannot retrieve any document, even if I pass a literal string in the filter. The only way to retrieve documents is to not pass any filter at all. I guess mongoose interprets the value as a number and converts it before passing it to MongoDB.
How can I fix the problem?
Thanks.
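
One detail worth checking (a sketch of the relevant behaviour, not a confirmed fix): mongoose casts filter values to the type declared for the schema path, so a hex-looking string is only coerced to a number if the path is (or resolves to) a Number type. With the path declared as String, the value should pass through unchanged. A minimal setup using a hypothetical Account model:

import mongoose from "mongoose";

// Hypothetical model: with the path declared as String, mongoose's query
// casting keeps the filter value a string instead of coercing it to a number.
const accountSchema = new mongoose.Schema({
  address: { type: String, required: true },
});
const Account = mongoose.model("Account", accountSchema);

async function findByAddress(address: string) {
  // The filter value stays a string as long as the schema path is String.
  return Account.findOne({ address }).lean();
}

If the schema really does declare String everywhere the address appears, it is also worth confirming that the model is bound to the same collection the mongosh query was run against.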

How to query mongodb without knowing dynamic field type in Java?

I want to create some APIs that let the user pass in just strings for types like number and boolean, and automatically convert them before querying MongoDB. Is that possible?
Yes, it is possible with MongoDB. You could write your own utility to convert the string into a Mongo-specific query, or you could use an open-source utility like enter link description here.
Ultimately, MongoDB accepts a JSON string to execute, and every client converts each query into that same JSON format; neither MongoDB clients nor MongoDB itself need any predefined mapping or POJO.
Such a utility will convert a string as shown below.
User string:
"select * from users where firstName='Vijay' AND lastName='Rajput'"
This utility would then convert it into:
db.users.find({$and: [{firstName: 'Vijay'}, {lastName: 'Rajput'}]})
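
If you don't need full SQL parsing, the type conversion itself is simple to do by hand before building the filter. Below is a small sketch of that idea, written in TypeScript with the official mongodb driver rather than Java, with a made-up field-type mapping and collection name:

import { MongoClient, Document } from "mongodb";

// Hypothetical declaration of the types of the fields users may filter on.
const fieldTypes: Record<string, "string" | "number" | "boolean"> = {
  firstName: "string",
  age: "number",
  active: "boolean",
};

// Convert a user-supplied string value to the declared type before querying.
function coerce(field: string, raw: string): string | number | boolean {
  switch (fieldTypes[field]) {
    case "number":  return Number(raw);
    case "boolean": return raw === "true";
    default:        return raw;
  }
}

async function findUsers(params: Record<string, string>) {
  const client = await MongoClient.connect("mongodb://localhost:27017");
  try {
    const filter: Document = {};
    for (const [field, raw] of Object.entries(params)) {
      filter[field] = coerce(field, raw);
    }
    return await client.db("test").collection("users").find(filter).toArray();
  } finally {
    await client.close();
  }
}

The same approach translates directly to the Java driver: keep a map of field name to expected type, parse the incoming strings accordingly, and build the query document from the converted values.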

Why use Postgres JSON column type?

The JSON column type accepts non-valid JSON, e.g. [1,2,3] can be inserted without the closing {}.
Is there any difference between JSON and string?
While [1,2,3] is in fact valid JSON (as zerkms has stated in the comments), to answer the primary question: is there any difference between JSON and string?
The answer is yes. A whole new set of query operations, functions, etc. apply to json or jsonb columns that do not apply to text (or related types) columns.
For example, with text columns you would need regular expressions and related string functions (or a custom function) to parse the string, whereas json or jsonb columns come with a separate set of query operators that works with the structured nature of JSON.
From the Postgres doc, given the following JSON:
{
    "guid": "9c36adc1-7fb5-4d5b-83b4-90356a46061a",
    "name": "Angela Barton",
    "is_active": true,
    "company": "Magnafone",
    "address": "178 Howard Place, Gulf, Washington, 702",
    "registered": "2009-11-07T08:53:22 +08:00",
    "latitude": 19.793713,
    "longitude": 86.513373,
    "tags": [
        "enim",
        "aliquip",
        "qui"
    ]
}
The doc then says:
We store these documents in a table named api, in a jsonb column named jdoc. If a GIN index is created on this column, queries like the following can make use of the index:
-- Find documents in which the key "company" has value "Magnafone"
SELECT jdoc->'guid', jdoc->'name' FROM api WHERE jdoc @> '{"company": "Magnafone"}';
This allows you to query the jsonb (or json) fields very differently than if they were simply text or related fields.
Here is some Postgres doc that provides some of those query operators and functions.
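From application code, the same containment query is just a parameterized statement. A small sketch with node-postgres in TypeScript, assuming the api table and jdoc column from the doc's example:

import { Pool } from "pg";

const pool = new Pool(); // connection settings come from the usual PG* environment variables

// Containment query against the jsonb column, mirroring the SQL example above.
async function findByCompany(company: string) {
  const result = await pool.query(
    "SELECT jdoc->'guid' AS guid, jdoc->'name' AS name FROM api WHERE jdoc @> $1::jsonb",
    [JSON.stringify({ company })]
  );
  return result.rows;
}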
Basically, if you have JSON data that you want to treat as JSON data, then a column is best specified as json or jsonb (which one you choose depends on whether you want to store it as plain text or binary, respectively).
The above data could be stored in a text column, but the JSON data types have the advantage that you can apply JSON operators and functions to those columns; there are several JSON-specific functions that cannot be used on text fields.
Refer to this link to learn about the JSON functions/procedures.

How to retrieve all objects in a Mongodb collection including the ids?

I'm using Casbah and Salat to create my own Mongodb dao and am implementing a getAll method like this:
val dao: SalatDAO[T, ObjectId]
def getAll(): List[T] = dao.find(ref = MongoDBObject()).toList
What I want to know is:
Is there a better way to retrieve all objects?
When I iterate through the objects, I can't find the object's _id. Is it excluded? How do I include it in the list?
1°/ The ModelCompanion trait provides a def findAll(): SalatMongoCursor[ObjectType] = dao.find(MongoDBObject.empty) method. You will have to make a dedicated request for every collection your database has.
If you iterate over the objects returned, it may be better to iterate directly on the SalatMongoCursor[T] returned by dao.find rather than doing two iterations (one with toList from the Iterator trait and then another over your List[T]).
2°/ Salat maps the Mongo _id key to your class's id field: if you define a class with an id: ObjectId field, that field is mapped to the _id key.
You can change this behaviour using the @Key annotation, as pointed out in the Salat documentation.
I implemented something like:
MyDAO.ids(MongoDBObject("_id" -> MongoDBObject("$exists" -> true)))
This fetches all the ids, but given the wide range of what you might be doing, it is probably not the best solution for every situation. Right now, I'm building a small system with 5 records of data and using this to help understand how MongoDB works.
If this were a production database with 1,000,000 entries, then this (or any getAll query) would be stupid. Instead, consider writing a targeted query that goes after the real results you seek.