How to convert String to Integer from OData-Model in SAPUI5?

I have an OData model with some Edm.String fields which represent ABAP NUMC(9) or NUMC(10) values. All other Edm types (like Int32) end with an error.
Can I convert these fields to Integer in SAPUI5? The content is e.g. 0000012345. If I use a type in my binding, my text is empty. If I use it without a type, the strings are output correctly.
new sap.m.Text({
    text: {
        path: "{statusData>AnzPdf}",
        type: new sap.ui.model.type.Integer()
    }
})

Apply a formatter in the binding, as in the example below, to convert the value from the model to an integer.
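A minimal sketch of such a formatter binding. Note that inside a binding-info object the path is given without curly braces; the parseInt handling is an assumption based on the zero-padded sample value in the question:
new sap.m.Text({
    text: {
        path: "statusData>AnzPdf", // no curly braces inside a binding-info object
        formatter: function (sValue) {
            // NUMC fields arrive as zero-padded strings, e.g. "0000012345"
            return sValue ? parseInt(sValue, 10) : null;
        }
    }
})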

Related

How to insert jsonb[] data into column using pg-promise

Given a table with a column of type jsonb[], how do I insert a json array into the column?
Using the provided formatters :array and :json won't work in this instance, unless I am missing the correct combination.
const links = [
    {
        title: 'IMDB',
        url: 'https://www.imdb.com/title/tt0076759'
    },
    {
        title: 'Rotten Tomatoes',
        url: 'https://www.rottentomatoes.com/m/star_wars'
    }
];
const result = await db.none(`INSERT INTO tests (links) VALUES ($1:json)`, [links]);
You do not need the library's :json filter in this case, as you need an array of JSON objects, and not a single JSON value containing an array of objects.
The former is formatted correctly by default, which then only needs ::json[] type casting:
await db.none(`INSERT INTO tests(links) VALUES($1::json[])`, [links]);
Other Notes
Use pg-monitor, or the library's query event, to output the queries being executed, for easier diagnostics (see the sketch after these notes).
Method none can only resolve with null, so there is no point storing the result in a variable.
Library pg-promise does not have any :array filter; see the supported filters.
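A minimal pg-monitor hookup, as a sketch; it assumes pgp is initialized with an options object, since pg-monitor attaches to that same object:
const initOptions = {/* pg-promise initialization options */};
const pgp = require('pg-promise')(initOptions);
const monitor = require('pg-monitor');

monitor.attach(initOptions); // logs all executed queries and errors to the console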

Node pg-promise, bind multiple values with type casting

I'm currently using the pg-promise library to insert multiple values into a database in the format:
const cs = new pgp.helpers.ColumnSet(['booking_id', {name: 'timeslot', cast: 'timestamp'}], {table: 'booking'});
// data input values:
const values = [];
bookings.forEach(slot => {
    values.push({booking_id: booking_id, timeslot: slot});
});
Where I need timeslot to be a timestamp. However, it comes into the API as a value like
1515586500.0
With the above cast property, my query gets resolved like so:
insert into "booking"("booking_id","timeslot") values(1,'1515586500.0'::timestamp)
However, this throws the error cannot cast type numeric to timestamp without time zone.
If I use the to_timestamp function instead, it works as I need, e.g.
insert into "booking"("booking_id","timeslot") values(1,to_timestamp('1515586500.0'));
Is there any way I can get pg-promise to use to_timestamp rather than the ::timestamp notation?
Change the column definition to this one:
{
    name: 'timeslot',
    mod: ':raw',
    init: c => pgp.as.format('to_timestamp($1)', c.value)
}
or
{
    name: 'timeslot',
    mod: ':raw',
    init: c => pgp.as.format('to_timestamp(${value})', c)
}
...as per the Column type documentation.
Or you can use Custom Type Formatting on the type, to self-format automatically.
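A sketch of that approach: pg-promise honors an object's toPostgres method plus a truthy rawType, injecting the returned SQL unescaped. The Timeslot wrapper class here is illustrative, not part of the library:
class Timeslot {
    constructor(epoch) {
        this.epoch = epoch;
        this.rawType = true; // tell the formatter the output is raw SQL
    }
    toPostgres(self) {
        return pgp.as.format('to_timestamp($1)', this.epoch);
    }
}
// then pass new Timeslot(slot) as the value for the timeslot column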
Also, you do not need to remap values to suit the ColumnSet object; you use the ColumnSet object to fit the data instead. So if the value for column timeslot arrives in property slot, just add prop: 'slot' to the column definition to change where the value is read from, as in the combined sketch below.
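Putting the :raw/init column and the prop remapping together, a minimal sketch, assuming (as in the question) that bookings is an array of epoch values and booking_id is in scope:
const cs = new pgp.helpers.ColumnSet([
    'booking_id',
    {
        name: 'timeslot',
        prop: 'slot', // read the value from property "slot" of each data object
        mod: ':raw',
        init: c => pgp.as.format('to_timestamp($1)', c.value)
    }
], {table: 'booking'});

// no remapping needed - the ColumnSet is shaped to fit the data:
const data = bookings.map(slot => ({booking_id, slot}));

await db.none(pgp.helpers.insert(data, cs));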

Prevent Json.NET from interpreting a string as a date

I have some attributes returned from a rest service that are served as an array of name-value pairs.
In some cases the value is a date expressed in universal sortable format:
{
    "name": "Modification-Date",
    "value": "2017-11-13T15:15:13.968Z"
}
When it gets parsed by the deserialiser, the value is identified as a date; but since the object the pair is deserialised into has type string for both name and value, the date is then converted back to a string and loses precision: "13/11/2017 15:15:13".
This is easily visible by using a converter for the NameValue type.
if (reader.TokenType == JsonToken.StartObject)
{
    var item = JObject.Load(reader);
    return new NameValueFacet()
    {
        Name = item["name"].Value<string>(),
        Value = item["value"].Value<string>()
    };
}
item["value"].Type shows the type is Date.
How do I get to have Json.NET leave it as a string, "unparsed"?
You can try passing an IsoDateTimeConverter with an explicit format to Newtonsoft:
JsonConvert.DeserializeObject<your_object>(your_json, new IsoDateTimeConverter { DateTimeFormat = "dd/MM/yyyy" });
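If the goal is to leave the value completely unparsed, another documented Json.NET option is to disable automatic date recognition via DateParseHandling; a minimal sketch, assuming json holds the payload from the question:
using Newtonsoft.Json;

var settings = new JsonSerializerSettings
{
    // stop Json.NET from recognising date-like strings as DateTime tokens
    DateParseHandling = DateParseHandling.None
};
var result = JsonConvert.DeserializeObject<NameValueFacet>(json, settings);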

Custom expression in JPA CriteriaBuilder

I have an Entity with a String field (storing JSON) and need to compare a value from its database column with another value. The problem is that the type of this database column is TEXT, but it actually contains JSON. So, is there a way to write something like the following? I.e. I need to compare my value with some field of the JSON in the TEXT column.
criteriaBuilder.equal(root.get("json_column").customExpression(new Expression {
    Object handle(Object data) {
        return ((Object) data).get("json_field");
    }
}), value)
Assuming you have a MySQL server of version 5.7 or later:
I just had the same issue. I wanted to find all entities of a class that had a given JSON field value inside a JSON object column.
The solution that worked for me was something along these lines:
(root, query, builder) -> builder.equal(
    builder.function(
        "JSON_EXTRACT",
        String.class,
        root.get("myEntityJsonAttribute"),
        builder.literal("$.json.path.to.json.field")
    ),
    "searchedValueInJsonFieldOfJsonAttribute"
)
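The lambda above has the shape of a Spring Data JPA Specification. For plain JPA, the same builder.function call can be used directly in a criteria query; a minimal sketch, assuming an EntityManager is available and using MyEntity, myEntityJsonAttribute, and the JSON path as illustrative names:
import javax.persistence.EntityManager;
import javax.persistence.criteria.*;
import java.util.List;

CriteriaBuilder cb = entityManager.getCriteriaBuilder();
CriteriaQuery<MyEntity> cq = cb.createQuery(MyEntity.class);
Root<MyEntity> root = cq.from(MyEntity.class);

// JSON_EXTRACT pulls the field out of the TEXT/JSON column on the MySQL side
cq.where(cb.equal(
        cb.function("JSON_EXTRACT", String.class,
                root.get("myEntityJsonAttribute"),
                cb.literal("$.json.path.to.json.field")),
        "searchedValue"));

List<MyEntity> results = entityManager.createQuery(cq).getResultList();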

Scala reading CSV data with multiple record types directly into data classes

I have a CSV file to process, with multiple record types. The first column of each row tells me which type of data it is. I have written a data class for each record type (there are over 30). I could hand-write a constructor for each data class that accepts an Array[String] from the parser, but as these records have up to 35 fields, that would be error-prone and messy.
As there are multiple record types, there is obviously no first line with field names.
I am using Scala 2.11, so I thought about using reflection; rather than the underlying Java reflection I can of course use Scala reflection directly, but I am having difficulty understanding how.
So what I need to do is:
For each row in the file, read and parse the record into columns
Based on the first column, create an instance of the right data class
For each field in the data object, based on the field type, convert the column value and save it into the object
Now this relies on the fields staying in declaration order; does Scala guarantee that?
Also I have found lots of examples showing how to find the fields of a class, but I have not yet found how to get and set values on those fields. Can someone please point me in the right direction?
Here are the first two records:-
990001,DD002,1995,20150723,143937,O,NMR,Impel Pro,1998v1.00,,,NMR,1998v2.00
091017,1,HOLSTEIN,HF,D ,N,12,99,280,999,305,1,0.1,3,75,99.9,1.5,1.5,7,7,1,1,7,7,0,0,9,9,22000,25000,29000,32000,15,65,1,1047,Y,Y,C
The first record is a header, the second is a definition of a Breed of cattle, in this case a Holstein. 990001 means header and is represented in my code as a Header object, and 091017 means breed, and is represented as a Breed object.
I want to load the Header into:-
import java.time.{LocalDate, LocalTime}

class Header(var dataDictionaryType: String = "",
             var isoVersion: String = "",
             var createdOrUploadedDate: Option[LocalDate] = None, // None rather than Some(null)
             var createdOrUploadedTime: Option[LocalTime] = None,
             var systemStatus: String = "",
             var senderName: String = "",
             var receiverName: String = "",
             var adedNationalVersion: String = "",
             var processComputerType: String = "",
             var adedManufacturerVersion: String = "")
Note that a few extra Options still need to be added.
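As a pointer in the requested direction, a minimal sketch of setting a var field by name with Scala 2.11 runtime reflection. The dispatch on the record-type column and the per-field type conversion are only stubbed out, so treat this as an outline, not a complete loader. Regarding ordering: the MemberScope returned by Type.decls has a sorted view documented to follow declaration order.
import scala.reflect.runtime.universe._
import scala.reflect.ClassTag

object RecordLoader {
  private val mirror = runtimeMirror(getClass.getClassLoader)

  // Set a var field by name, going through its getter to the underlying field.
  def setField[T: TypeTag: ClassTag](target: T, fieldName: String, value: Any): Unit = {
    val getter = typeOf[T].decl(TermName(fieldName)).asTerm
    val field  = getter.accessed.asTerm // the storage behind the accessor
    mirror.reflect(target).reflectField(field).set(value)
  }

  // Dispatch on the record-type column and populate the matching data class.
  def parse(columns: Array[String]): Any = columns(0) match {
    case "990001" =>
      val header = new Header()
      setField(header, "dataDictionaryType", columns(1))
      // ... convert and set the remaining columns according to each field's type
      header
    case other =>
      sys.error(s"Record type not handled yet: $other")
  }
}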