Conversion of an object into a particular datetime format - c#-3.0

I want to convert a query string object into a DateTime in this format: "YYYY-mm-dd HH:mm:ss.xxx" in C#.NET. But when I use Convert.ToDateTime(object) to get the DateTime value, an exception is thrown.
Could anyone provide me with the IFormatProvider for this?
Thanks
Varun Sareen

Have a look at the DateTime.TryParseExact method.

I think your problem is that you are trying to convert the QueryString object itself, instead of getting a value from the query string and then converting that value.
A QueryString object is a keyed collection of the values specified in a URL from an HTTP request. So if you have a URL like: http://example.com?a=1&b=2&c=3, the request QueryString object will contain three values: 1, 2, and 3. To access the values you would use the keys:
var aValue = Request.QueryString["a"];
And the variable aValue would then contain the string value "1" (without the quotes).
After getting the value from the query string, you can use the TryParseExact method suggested by @astander.
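Putting the two steps together, a minimal sketch (the key "date" is a hypothetical example, and the format string is the .NET custom-format equivalent of "YYYY-mm-dd HH:mm:ss.xxx"):
using System.Globalization;

// "date" is a hypothetical query string key; adjust it to your actual key.
string raw = Request.QueryString["date"];   // e.g. "2010-02-08 14:30:15.123"
DateTime parsed;
if (DateTime.TryParseExact(raw, "yyyy-MM-dd HH:mm:ss.fff",
                           CultureInfo.InvariantCulture,
                           DateTimeStyles.None, out parsed))
{
    // parsed now holds the converted DateTime value
}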

Related

Postgres jsonb_set concatenate current value

I'm trying to use jsonb_set to update a range of JSON objects within my database. I can get a query working that updates the object with a string value; however, I cannot seem to get it to update using the current value.
UPDATE entity
SET properties = jsonb_set(properties, '{c_number}', concat('0', properties->>'c_number'))
WHERE type = 1 and length(properties->>'c_number') = 7
The above doesn't work in its current form; I think the issue is the properties->>'c_number' inside the jsonb_set. Is there a way I can access the current value and simply add a leading 0?
Found a solution:
UPDATE entity
SET properties = jsonb_set(properties, '{c_number}', concat('"0', properties->>'c_number', '"')::jsonb)
WHERE type = 1 and length(properties->>'c_number') = 7
Based on this answer I was able to prepare my solution.
My goal was to create a new property in JSON, with a value that is based on the value of one of the properties which my JSON already has.
For example:
I have:
{
property_root: { property_root_child: { source_property_key: "source_property_value" } }
}
I want:
{
property_root: { property_root_child: { source_property_key: "source_property_value", target_property_key: "source_property_value + my custom ending" } }
}
So my query would look like this:
UPDATE database.table_with_json
SET json_column=jsonb_set(
json_column,
'{ property_root, property_root_child, target_property_key }',
concat('"', json_column->'property_root'->'property_root_child'->>'source_property_key', ' + my custom ending', '"')::jsonb)
WHERE
json_column->'property_root'->'property_root_child'->'source_property_key' IS NOT NULL
Why does the concat look messy? Based on the answer mentioned above:
The third argument of jsonb_set() should be of jsonb type. The problem is in casting a text string to a jsonb string: you need the string in double quotes.
That is why we have to wrap the concat result in double quotes.
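If you prefer to avoid the manual quote-wrapping, to_jsonb() (available since PostgreSQL 9.5, the same release that introduced jsonb_set) should produce an equivalent, properly quoted jsonb string. A sketch of the same update:
-- to_jsonb() turns the concatenated text into a quoted jsonb string
UPDATE entity
SET properties = jsonb_set(properties, '{c_number}',
                           to_jsonb('0' || (properties->>'c_number')))
WHERE type = 1 AND length(properties->>'c_number') = 7;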

Issue with JsonPath while extracting field from a json string in Scala

I have a JSON string and I want to extract a parameter from it using JsonPath in Scala. Given this JSON:
{
"message_version":"2.0",
"message":"1.0",
"message_metadata":{
\"event_type\":\"MerchantRegistrationFraudEvaluation\",
\"event_date\":\"1513665186657\"
}
}
When I try the code below to extract "event_type" from this JSON string, it throws an exception:
val eventType = JsonPath.read[String](jsonString, "$.message_metadata.event_type")
Exception:
com.jayway.jsonpath.PathNotFoundException: Expected to find an object with property ['event_type'] in path $['message_metadata'] but found 'java.lang.String'. This is not a json object according to the JsonProvider: 'com.jayway.jsonpath.spi.json.JsonSmartJsonProvider'
But if I use two separate statements, one to extract message_metadata first and then another to extract event_type as below, it works fine:
val eventMetaData = JsonPath.read[String](message, "$.message_metadata")
val eventType = JsonPath.read[String](eventMetaData, "$.event_type")
eventType // MerchantRegistrationFraudEvaluation
Can someone help here to get the event_type in a single line?

Passing an array of object having name and value attribute as query param

Here is the scenario.
I have the POJO below:
class Property {
    String name;
    String value;
}
I want to pass an array of the above POJO as a query param. How do I do that?
Something like
http://myservice.com?property:name=n1&property:value=v1&property:name=n2&property:value=v2
And I want to figure out at the service end that v1 is the value for n1.
Is there a way to achieve this?
One way you can pass this in a URL is the following:
sample.php?arr_args[key1]=value1&arr_args[key2]=value2
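For illustration, a hedged, framework-free sketch of how the service side could split such a query string back into name/value pairs (the class and method names are made up; most web frameworks expose the parsed parameter map directly):
import java.util.ArrayList;
import java.util.List;

public class QueryParamParser {

    static class Property {
        String name;
        String value;

        Property(String name, String value) {
            this.name = name;
            this.value = value;
        }
    }

    // Parses "arr_args[n1]=v1&arr_args[n2]=v2" into Property(name, value) pairs.
    // URL decoding is omitted for brevity.
    static List<Property> parse(String query) {
        List<Property> properties = new ArrayList<>();
        for (String pair : query.split("&")) {
            String[] kv = pair.split("=", 2);
            String key = kv[0];                        // e.g. arr_args[n1]
            String value = kv.length > 1 ? kv[1] : ""; // e.g. v1
            int open = key.indexOf('[');
            int close = key.indexOf(']');
            if (open >= 0 && close > open) {
                properties.add(new Property(key.substring(open + 1, close), value));
            }
        }
        return properties;
    }
}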

Dereference a ReferenceField in Mongoengine

I am trying to dereference a reference field on my Flask backend and return the complete object with that field dereferenced.
The field I am trying to dereference is defined like this:
vouches_received = db.ListField(db.ReferenceField('Vouch'))
The way I am trying to dereference it is like this:
unverified_vouches = []
for vouch in usr.vouches_received:
    unverified_vouches.append(vouch.to_mongo())
usr.vouches_received = unverified_vouches
However, when I then do:
usr.to_json()
On the object, then I get a ValidationError like so:
ValidationError: u'{...}' is not a valid ObjectId, it must be a
12-byte input of type 'str' or a 24-character hex string
The three dots (...) are basically the dereferenced document; it has mostly strings, a date field, and some other reference fields I do not wish to dereference.
I am aware this is a valid error, as it is expecting an ObjectId for the reference field, but then the question arises: how do I dereference that field and return the document?
Thanks
The ListField is expecting elements of type ObjectId, and because you've dereferenced them it throws that error. I'm not sure this is the most elegant way, but could you convert the usr.to_json() output to a dict and then replace the vouches_received list with a dereferenced list afterwards? I can't test it, but something like:
import json

user_dict = json.loads(usr.to_json())
# Replace the ObjectId references with the dereferenced documents themselves
user_dict['vouches_received'] = [
    json.loads(vouch.to_json()) for vouch in usr.vouches_received
]
usr_json = json.dumps(user_dict)
A better solution may be to use an EmbeddedDocument.
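For illustration, a minimal sketch of that approach (the fields on Vouch are made up); embedded documents are serialized in place by to_json(), so no dereferencing step is needed:
class Vouch(db.EmbeddedDocument):
    voucher_name = db.StringField()   # hypothetical fields
    created_at = db.DateTimeField()

class User(db.Document):
    vouches_received = db.ListField(db.EmbeddedDocumentField(Vouch))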

How to query a JSON element

Let's say I have a Postgres database (9.3) and there is a table called Resources. In the Resources table I have the fields id, which is an int, and data, which is of JSON type.
Let's say I have the following records in said table.
1, {'firstname':'Dave', 'lastname':'Gallant'}
2, {'firstname':'John', 'lastname':'Doe'}
What I want to do is write a query that would return all the records in which the data column has a json element with the lastname equal to "Doe"
I tried to write something like this:
records = db_session.query(Resource).filter(Resources.data->>'lastname' == "Doe").all()
PyCharm, however, is giving me a syntax error on the "->>".
Does anyone know how I would write the filter clause to do what I need?
Try using astext:
records = db_session.query(Resource).filter(
    Resources.data["lastname"].astext == "Doe"
).all()
Please note that the column MUST be of type JSONB. The regular JSON column will not work.
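For reference, a minimal sketch of the assumed model with data declared as JSONB (the table and class names here are illustrative):
from sqlalchemy import Column, Integer
from sqlalchemy.dialects.postgresql import JSONB
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Resources(Base):
    __tablename__ = 'resources'
    id = Column(Integer, primary_key=True)
    data = Column(JSONB)  # JSONB, so .astext works on index operations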
Also, you could explicitly cast the string to JSON (see the Postgres JSON type documentation).
from sqlalchemy.dialects.postgresql import JSON
from sqlalchemy.sql.expression import cast
db_session.query(Resource).filter(
    Resources.data["lastname"] == cast("Doe", JSON)
).all()
If you are using the JSON type (not JSONB), the following worked for me. Note the '"object"':
query = db.session.query(ProductSchema).filter(
    cast(ProductSchema.ProductJSON["type"], db.String) != '"object"'
)
I have some GeoJSON in a JSON (not JSONB) type column and none of the existing solutions worked, but as it turns out, in version 1.3.11 some new data casters were added, so now you can:
records = db_session.query(Resource).filter(Resources.data["lastname"].as_string() == "Doe").all()
Reference: https://docs.sqlalchemy.org/en/14/core/type_basics.html#sqlalchemy.types.JSON
Casting JSON Elements to Other Types
Index operations, i.e. those invoked by calling upon the expression
using the Python bracket operator as in some_column['some key'],
return an expression object whose type defaults to JSON by default, so
that further JSON-oriented instructions may be called upon the result
type. However, it is likely more common that an index operation is
expected to return a specific scalar element, such as a string or
integer. In order to provide access to these elements in a
backend-agnostic way, a series of data casters are provided:
Comparator.as_string() - return the element as a string
Comparator.as_boolean() - return the element as a boolean
Comparator.as_float() - return the element as a float
Comparator.as_integer() - return the element as an integer
These data casters are implemented by supporting dialects in order to
assure that comparisons to the above types will work as expected, such
as:
# integer comparison
data_table.c.data["some_integer_key"].as_integer() == 5
# boolean comparison
data_table.c.data["some_boolean"].as_boolean() == True
According to sqlalchemy.types.JSON, you can do it like this:
from sqlalchemy import JSON
from sqlalchemy import cast
records = db_session.query(Resource).filter(Resources.data["lastname"] == cast("Doe", JSON)).all()
According to this, pre-version 1.3.11, the most robust way is the following, as it works for multiple database types, e.g. SQLite, MySQL, and Postgres:
from sqlalchemy import cast, JSON, type_coerce, String

db_session.query(Resource).filter(
    cast(Resources.data["lastname"], String) == type_coerce("Doe", JSON)
).all()
From version 1.3.11 onward, type-specific casters are the new and neater way to handle this:
db_session.query(Resource).filter(
    Resources.data["lastname"].as_string() == "Doe"
).all()