Is there a way to return a JSON object within an xe:restViewColumn?

I'm trying to generate a REST service on an XPage with the viewJsonService service type.
Within a column I need to have a JSON object and tried to solve that with this code:
<xe:restViewColumn name="surveyResponse">
    <xe:this.value>
        <![CDATA[#{javascript:
            var arrParticipants = new Array();
            arrParticipants.push({"participant": "A", "selection": ["a1"]});
            arrParticipants.push({"participant": "B", "selection": ["b1", "b2"]});
            return arrParticipants;
        }]]>
    </xe:this.value>
</xe:restViewColumn>
I was expecting to get this for that specific column:
...
"surveyResponse": [
{ "participant": "A",
"selection": [ "a1" ]
},
{ "participant": "B",
"selection": [ "b1", "b2" ]
}
]
...
What I am getting is this:
...
"surveyResponse": [
"???",
"???"
]
...
When trying to use toJson on the array arrParticipants, the result is not valid JSON:
...
"surveyResponse": "[{\"selection\": [\"a1\"],\"participant\":\"A\"},{\"selection\": [\"b1\",\"b2\"],\"participant\":\"B\"}]"
...
When trying to use fromJson on the array arrParticipants, the result is:
{
"code": 500,
"text": "Internal Error",
"message": "Error while executing JavaScript computed expression",
"type": "text",
"data": "com.ibm.xsp.exception.EvaluationExceptionEx: Error while executing JavaScript computed expression at com.ibm.xsp.binding.javascript.JavaScriptValueBinding.getValue(JavaScriptValueBinding.java:132) at com.ibm.xsp.extlib.component.rest.DominoViewColumn.getValue(DominoViewColumn.java:93) at com.ibm.xsp.extlib.component.rest.DominoViewColumn.evaluate(DominoViewColumn.java:133) at com.ibm.domino.services.content.JsonViewEntryCollectionContent.writeColumns(JsonViewEntryCollectionContent.java:213) at com.ibm.domino.services.content.JsonViewEntryCollectionContent.writeEntryAsJson(JsonViewEntryCollectionContent.java:191) at com.ibm.domino.services.content.JsonViewEntryCollectionContent.writeViewEntryCollection(JsonViewEntryCollectionContent.java:170) at com.ibm.domino.services.rest.das.view.RestViewJsonService.renderServiceJSONGet(RestViewJsonService.java:394) at com.ibm.domino.services.rest.das.view.RestViewJsonService.renderService(RestViewJsonService.java:112) at com.ibm.domino.services.HttpServiceEngine.processRequest(HttpServiceEngine.java:167) at com.ibm.xsp.extlib.component.rest.UIBaseRestService._processAjaxRequest(UIBaseRestService.java:242) at com.ibm.xsp.extlib.component.rest.UIBaseRestService.processAjaxRequest(UIBaseRestService.java:219) at com.ibm.xsp.util.AjaxUtilEx.renderAjaxPartialLifecycle(AjaxUtilEx.java:206) at com.ibm.xsp.webapp.FacesServletEx.renderAjaxPartial(FacesServletEx.java:225) at com.ibm.xsp.webapp.FacesServletEx.serviceView(FacesServletEx.java:170) at com.ibm.xsp.webapp.FacesServlet.service(FacesServlet.java:160) at com.ibm.xsp.webapp.FacesServletEx.service(FacesServletEx.java:138) at com.ibm.xsp.webapp.DesignerFacesServlet.service(DesignerFacesServlet.java:103) at com.ibm.designer.runtime.domino.adapter.ComponentModule.invokeServlet(ComponentModule.java:576) at com.ibm.domino.xsp.module.nsf.NSFComponentModule.invokeServlet(NSFComponentModule.java:1281) at com.ibm.designer.runtime.domino.adapter.ComponentModule$AdapterInvoker.invokeServlet(ComponentModule.java:847) at com.ibm.designer.runtime.domino.adapter.ComponentModule$ServletInvoker.doService(ComponentModule.java:796) at com.ibm.designer.runtime.domino.adapter.ComponentModule.doService(ComponentModule.java:565) at com.ibm.domino.xsp.module.nsf.NSFComponentModule.doService(NSFComponentModule.java:1265) at com.ibm.domino.xsp.module.nsf.NSFService.doServiceInternal(NSFService.java:653) at com.ibm.domino.xsp.module.nsf.NSFService.doService(NSFService.java:476) at com.ibm.designer.runtime.domino.adapter.LCDEnvironment.doService(LCDEnvironment.java:341) at com.ibm.designer.runtime.domino.adapter.LCDEnvironment.service(LCDEnvironment.java:297) at com.ibm.domino.xsp.bridge.http.engine.XspCmdManager.service(XspCmdManager.java:272) Caused by: com.ibm.jscript.InterpretException: Script interpreter error, line=7, col=8: Error while converting from a JSON string at com.ibm.jscript.types.FBSGlobalObject$GlobalMethod.call(FBSGlobalObject.java:785) at com.ibm.jscript.types.FBSObject.call(FBSObject.java:161) at com.ibm.jscript.types.FBSGlobalObject$GlobalMethod.call(FBSGlobalObject.java:219) at com.ibm.jscript.ASTTree.ASTCall.interpret(ASTCall.java:175) at com.ibm.jscript.ASTTree.ASTReturn.interpret(ASTReturn.java:49) at com.ibm.jscript.ASTTree.ASTProgram.interpret(ASTProgram.java:119) at com.ibm.jscript.ASTTree.ASTProgram.interpretEx(ASTProgram.java:139) at com.ibm.jscript.JSExpression._interpretExpression(JSExpression.java:435) at com.ibm.jscript.JSExpression.access$1(JSExpression.java:424) at 
com.ibm.jscript.JSExpression$2.run(JSExpression.java:414) at java.security.AccessController.doPrivileged(AccessController.java:284) at com.ibm.jscript.JSExpression.interpretExpression(JSExpression.java:410) at com.ibm.jscript.JSExpression.evaluateValue(JSExpression.java:251) at com.ibm.jscript.JSExpression.evaluateValue(JSExpression.java:234) at com.ibm.xsp.javascript.JavaScriptInterpreter.interpret(JavaScriptInterpreter.java:221) at com.ibm.xsp.javascript.JavaScriptInterpreter.interpret(JavaScriptInterpreter.java:193) at com.ibm.xsp.binding.javascript.JavaScriptValueBinding.getValue(JavaScriptValueBinding.java:78) ... 27 more Caused by: com.ibm.commons.util.io.json.JsonException: Error when parsing JSON string at com.ibm.commons.util.io.json.JsonParser.fromJson(JsonParser.java:61) at com.ibm.jscript.types.FBSGlobalObject$GlobalMethod.call(FBSGlobalObject.java:781) ... 43 more Caused by: com.ibm.commons.util.io.json.parser.ParseException: Encountered " "object "" at line 1, column 2. Was expecting one of: "false" ... "null" ... "true" ... ... ... ... "{" ... "[" ... "]" ... "," ... at com.ibm.commons.util.io.json.parser.Json.generateParseException(Json.java:568) at com.ibm.commons.util.io.json.parser.Json.jj_consume_token(Json.java:503) at com.ibm.commons.util.io.json.parser.Json.arrayLiteral(Json.java:316) at com.ibm.commons.util.io.json.parser.Json.parseJson(Json.java:387) at com.ibm.commons.util.io.json.JsonParser.fromJson(JsonParser.java:59) ... 44 more "
}
Is there any way to get the desired answer?

Well, the best way to achieve the desired result is to use xe:customRestService if you need to return a nested JSON object.
All other xe:***RestService elements assume that you will return a flat JSON construct of name/value pairs, where the value is a simple data type (like a boolean, a number, or a string, and - oddly enough - arrays of those), but not a complex data type like an object.
This means that this result here
...
"surveyResponse": [
{ "participant": "A",
"selection": [ "a1" ]
},
{ "participant": "B",
"selection": [ "b1", "b2" ]
}
]
...
is only available when using xe:customRestService, where you can define your JSON result yourself.
With the other services, the results are limited to constructs like these:
...
"surveyResponse": true;
...
or
...
"surveyResponse": [
"A",
"B"
]
...
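For completeness, here is a minimal sketch of that approach (the id, pathInfo, and exact doGet wiring are placeholder assumptions; the idea is that you serialize the nested structure yourself with toJson and return it as the response):
<xe:restService id="restSurvey" pathInfo="surveyresponse">
    <xe:this.service>
        <xe:customRestService contentType="application/json">
            <xe:this.doGet><![CDATA[#{javascript:
                // Build the nested structure and serialize it ourselves,
                // so the service emits real JSON instead of escaped strings.
                var arrParticipants = [];
                arrParticipants.push({"participant": "A", "selection": ["a1"]});
                arrParticipants.push({"participant": "B", "selection": ["b1", "b2"]});
                return toJson({"surveyResponse": arrParticipants});
            }]]></xe:this.doGet>
        </xe:customRestService>
    </xe:this.service>
</xe:restService>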

Can't you use the built-in server-side JavaScript function toJson?

You could try intercepting the AJAX call when reading the JSON and then manually de-sanitise the JSON string data.
There are more details here:
http://www.browniesblog.com/A55CBC/blog.nsf/dx/15112012082949PMMBRD68.htm
Personally, I'd recommend against this unless you are absolutely sure the end user can't inject code into the JSON data.
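If you do try that route, the de-sanitising itself amounts to a second parse on the client, since the service delivers the column as an escaped JSON string rather than an object. A minimal sketch (data stands for the already parsed service response; the names are illustrative):
// The column arrives as a string like "[{\"participant\":\"A\",...}]",
// so parse it a second time to recover the actual array:
var data = JSON.parse(xhr.responseText);
var surveyResponse = JSON.parse(data.surveyResponse);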

Related

'TypeError: StructType can not accept object

I'm trying to convert this JSON string data to a DataFrame in Databricks:
a = """{ "id": "a",
"message_type": "b",
"data": [ {"c":"abcd","timestamp":"2022-03-
01T13:10:00+00:00","e":0.18,"f":0.52} ]}"""
the schema I defined for the data is this
schema = StructType(
    [
        StructField("id", StringType(), False),
        StructField("message_type", StringType(), False),
        StructField("data", ArrayType(StructType([
            StructField("c", StringType(), False),
            StructField("timestamp", StringType(), False),
            StructField("e", DoubleType(), False),
            StructField("f", DoubleType(), False),
        ]))),
    ]
)
and when I run this command
df = sqlContext.createDataFrame(sc.parallelize([a]), schema)
I get this error
PythonException: 'TypeError: StructType can not accept object '{ "id": "a",\n"message_type": "JobMetric",\n"data": [ {"c":"abcd","timestamp":"2022-03- \n01T13:10:00+00:00","e":0.18,"f":0.52=} ]' in type <class 'str'>'. Full traceback below:
If anyone could help me with this, I would much appreciate it!
Your variable a is wrong.
"data": [ "{"JobId":"ATLUPS10m2101V1","Timestamp":"2022-03-
01T13:10:00+00:00","number1":0.9098145961761475,"number2":0.5294908881187439}" ]
Should be
"data": [ {"JobId":"ATLUPS10m2101V1","Timestamp":"2022-03-
01T13:10:00+00:00","number1":0.9098145961761475,"number2":0.5294908881187439} ]
And check whether it is OK to map the names JobId to job_id and Timestamp to timestamp.
The issue is that when you pass a string object to a struct schema, it expects an RDD([StringType, StringType, ...]), but in your current scenario it is getting just a string object. To fix it, first convert your string to a JSON object, and from there create an RDD. See the below logic for details -
Input Data -
a = """{"run_id": "1640c68e-5f02-4f49-943d-37a102f90146",
"message_type": "JobMetric",
"data": [ {"JobId":"ATLUPS10m2101V1","timestamp":"2022-03-01T13:10:00+00:00",
"score":0.9098145961761475,
"severity":0.5294908881187439
}
]
}"""
Converting to a RDD using json object -
from pyspark.sql.types import *
import json
schema = StructType(
    [
        StructField("run_id", StringType(), False),
        StructField("message_type", StringType(), False),
        StructField("data", ArrayType(StructType([
            StructField("JobId", StringType(), False),
            StructField("timestamp", StringType(), False),
            StructField("score", DoubleType(), False),
            StructField("severity", DoubleType(), False),
        ]))),
    ]
)
df = spark.createDataFrame(data=sc.parallelize([json.loads(a)]), schema=schema)
df.show(truncate=False)
Output -
+------------------------------------+------------+--------------------------------------------------------------------------------------+
|run_id |message_type|data |
+------------------------------------+------------+--------------------------------------------------------------------------------------+
|1640c68e-5f02-4f49-943d-37a102f90146|JobMetric |[{ATLUPS10m2101V1, 2022-03-01T13:10:00+00:00, 0.9098145961761475, 0.5294908881187439}]|
+------------------------------------+------------+--------------------------------------------------------------------------------------+
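As a side note, a slightly shorter variant should also work (a sketch under the same assumptions: the string is parsed with the standard json module and the resulting dict is passed directly, together with the explicit schema):
import json

# createDataFrame accepts a list of dicts when an explicit schema is given,
# so parsing the string first avoids building the RDD by hand.
df = spark.createDataFrame([json.loads(a)], schema=schema)
df.show(truncate=False)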

How to use clauses with the Neo4j REST API?

I have a complex query that works perfectly in the Neo4j Browser and that I want to use through the REST API, but there are clauses I can't cope with.
The syntax looks like :
MATCH p=()-[*]->(node1)
WHERE …
WITH...
....
FOREACH … SET …
I constructed the query with transactional Cypher as suggested by @cybersam, but I still don't manage to use more than one clause.
To give an example, if I write the statement on one line:
:POST /db/data/transaction/commit {
"statements": [
{
"statement": "MATCH p = (m)-[*]->(n:SOL {PRB : {PRB1}}) WHERE nodes (p)
MATCH q= (o:SOL {PRB : {PRB2}} RETURN n, p, o, q;",
"parameters": {"PRB1": "Title of problem1", "PRB2": "Title of problem2"}
} ],
"resultDataContents": ["graph"] }
I obtain:
{"results":[],"errors":[{"code":"Neo.ClientError.Statement.SyntaxError","message":"Invalid input 'R': expected whitespace, comment, ')' or a relationship pattern (line 1, column 90 (offset: 89))\r\n\"MATCH p = (m)-[*]->(n:SOL {PRB : {PRB1}}) WHERE nodes (p) MATCH q= (o:SOL {PRB : {PRB2}} RETURN n, p, o, q;\"\r\n ^"}]}
But if I put it on several lines:
:POST /db/data/transaction/commit {
"statements": [
{
"statement": "MATCH p = (m)-[*]->(n:SOL {PRB : {PRB1}})
WHERE nodes (p)
MATCH q= (o:SOL {PRB : {PRB2}}
RETURN n, p, o, q;",
"parameters": {"PRB1": "Title of problem1", "PRB2": "Title of problem2"}
}
],
"resultDataContents": ["graph"]
}
I get:
{"results":[],"errors":[{"code":"Neo.ClientError.Request.InvalidFormat","message":"Unable
to deserialize request: Illegal unquoted character ((CTRL-CHAR, code
10)): has to be escaped using backslash to be included in string
value\n at [Source: HttpInputOverHTTP#41fa906c; line: 4, column:
79]"}]}
Please, I need your help!
Alex
Using the transactional Cypher HTTP API, you can just pass the same Cypher statement to the API.
To quote from this section of the doc, here is an example of the simplest way to do that:
Begin and commit a transaction in one request
If there is no need to keep a transaction open across multiple HTTP requests, you can begin a transaction, execute statements, and commit with just a single HTTP request.
Example request
POST http://localhost:7474/db/data/transaction/commit
Accept: application/json; charset=UTF-8
Content-Type: application/json
{
"statements" : [ {
"statement" : "CREATE (n) RETURN id(n)"
} ]
}
Example response
200: OK
Content-Type: application/json
{
"results" : [ {
"columns" : [ "id(n)" ],
"data" : [ {
"row" : [ 6 ],
"meta" : [ null ]
} ]
} ],
"errors" : [ ]
}
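Applied to the statement from your question, the second error is the key: JSON string values may not contain raw line breaks (that is the "CTRL-CHAR, code 10" the parser complains about), so the whole Cypher statement has to stay inside one quoted string, either on a single line or with the breaks escaped as \n. A sketch of the request (note it also adds the closing parenthesis that appears to be missing after {PRB : {PRB2}}, which is what the first error's "Invalid input 'R'" points at):
:POST /db/data/transaction/commit {
  "statements": [ {
    "statement": "MATCH p = (m)-[*]->(n:SOL {PRB : {PRB1}}) WHERE nodes(p) MATCH q = (o:SOL {PRB : {PRB2}}) RETURN n, p, o, q",
    "parameters": {"PRB1": "Title of problem1", "PRB2": "Title of problem2"}
  } ],
  "resultDataContents": ["graph"]
}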

Syntax Error: missing ) after argument list #(shell):2:4

I am getting the above error in MongoDB when entering the following in the shell, but I can't for the life of me see where there is a syntax error...
db.createUser (
... "user":"dbTest",
... "pwd":"testPass",
... "roles": [
... { "role":"readWrite", "db":"test" }
... ]
... )
That has been copied and pasted directly from the console.
You're missing curly braces around the object literal:
db.createUser ({
"user":"dbTest",
"pwd":"testPass",
"roles": [
{ "role":"readWrite", "db":"test" }
]
})
See db.createUser() - Examples
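As a quick check (assuming you are still on the same database you created the user on), you can authenticate as the new user right away:
db.auth("dbTest", "testPass")   // returns 1 on success
db.getUsers()                   // lists the users defined on this database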

How to properly use context variables in OrientDB ETL configuration file?

Summary
Trying to learn about OrientDB ETL configuration json file.
Assuming a CSV file where:
each row is a single vertex
a 'class' column gives the intended class of the vertex
there are multiple classes for the vertices (Foo, Bar, Baz)
How do I set the class of the vertex to be the value of the 'class' column?
Efforts to Troubleshoot
I have spent a LOT of time in the OrientDB ETL documentation trying to solve this. I have tried many different combinations of let and block and code components. I have tried variable names like className and $className and ${classname}.
Current Results:
The code component is able to correctly print the value of className, so I know that it is being set correctly.
The vertex component isn't referencing the variable correctly, and consequently sets the class of each vertex to null.
Context
I have a freshly created database (PLOCAL GRAPH) on localhost called 'deleteme'.
I have a vertex CSV file (nodes.csv) that looks like this:
id,name,class
1,Jack,Foo
2,Jill,Bar
3,Gephri,Baz
And an ETL configuration file (test.json) that looks like this:
{
"config": {
"log": "DEBUG"
},
"source": {"file": {"path": "nodes.csv"}},
"extractor": {"csv": {}},
"transformers": [
{"block": {"let": {"name": "$className",
"value": "$input.class"}}},
{"code": {"language": "Javascript",
"code": "print(className + '\\n'); input;"}},
{"vertex": {"class": "$className"}}
],
"loader": {
"orientdb": {
"dbURL": "remote:localhost:2424/deleteme",
"dbUser": "admin",
"dbPassword": "admin",
"dbType": "graph",
"tx": false,
"wal": false,
"batchCommit": 1000,
"classes": [
{"name": "Foo", "extends": "V"},
{"name": "Bar", "extends": "V"},
{"name": "Baz", "extends": "V"}
]
}
}
}
And when I run the ETL job, I have output that looks like this:
aj#host:~/bin/orientdb-community-2.1.13/bin$ ./oetl.sh test.json
OrientDB etl v.2.1.13 (build 2.1.x#r9bc1a54a4a62c4de555fc5360357f446f8d2bc84; 2016-03-14 17:00:05+0000) www.orientdb.com
BEGIN ETL PROCESSOR
[file] INFO Reading from file nodes.csv with encoding UTF-8
[orientdb] DEBUG - OrientDBLoader: created vertex class 'Foo' extends 'V'
[orientdb] DEBUG orientdb: found 0 vertices in class 'null'
+ extracted 0 rows (0 rows/sec) - 0 rows -> loaded 0 vertices (0 vertices/sec) Total time: 1001ms [0 warnings, 0 errors]
[orientdb] DEBUG - OrientDBLoader: created vertex class 'Bar' extends 'V'
[orientdb] DEBUG orientdb: found 0 vertices in class 'null'
[orientdb] DEBUG - OrientDBLoader: created vertex class 'Baz' extends 'V'
[orientdb] DEBUG orientdb: found 0 vertices in class 'null'
[csv] DEBUG document={id:1,class:Foo,name:Jack}
[1:block] DEBUG Transformer input: {id:1,class:Foo,name:Jack}
[1:block] DEBUG Transformer output: {id:1,class:Foo,name:Jack}
[1:code] DEBUG Transformer input: {id:1,class:Foo,name:Jack}
Foo
[1:code] DEBUG executed code=OCommandExecutorScript [text=print(className); input;], result={id:1,class:Foo,name:Jack}
[1:code] DEBUG Transformer output: {id:1,class:Foo,name:Jack}
[1:vertex] DEBUG Transformer input: {id:1,class:Foo,name:Jack}
[1:vertex] DEBUG Transformer output: v(null)[#3:0]
[csv] DEBUG document={id:2,class:Bar,name:Jill}
[2:block] DEBUG Transformer input: {id:2,class:Bar,name:Jill}
[2:block] DEBUG Transformer output: {id:2,class:Bar,name:Jill}
[2:code] DEBUG Transformer input: {id:2,class:Bar,name:Jill}
Bar
[2:code] DEBUG executed code=OCommandExecutorScript [text=print(className); input;], result={id:2,class:Bar,name:Jill}
[2:code] DEBUG Transformer output: {id:2,class:Bar,name:Jill}
[2:vertex] DEBUG Transformer input: {id:2,class:Bar,name:Jill}
[2:vertex] DEBUG Transformer output: v(null)[#3:1]
[csv] DEBUG document={id:3,class:Baz,name:Gephri}
[3:block] DEBUG Transformer input: {id:3,class:Baz,name:Gephri}
[3:block] DEBUG Transformer output: {id:3,class:Baz,name:Gephri}
[3:code] DEBUG Transformer input: {id:3,class:Baz,name:Gephri}
Baz
[3:code] DEBUG executed code=OCommandExecutorScript [text=print(className); input;], result={id:3,class:Baz,name:Gephri}
[3:code] DEBUG Transformer output: {id:3,class:Baz,name:Gephri}
[3:vertex] DEBUG Transformer input: {id:3,class:Baz,name:Gephri}
[3:vertex] DEBUG Transformer output: v(null)[#3:2]
END ETL PROCESSOR
+ extracted 3 rows (4 rows/sec) - 3 rows -> loaded 3 vertices (4 vertices/sec) Total time: 1684ms [0 warnings, 0 errors]
Oh, and what does DEBUG orientdb: found 0 vertices in class 'null' mean?
Try this. I wrestled with this for a while too, but the setup below worked for me.
Note that setting #class before the vertex transformer will initialize a Vertex with the proper class.
"transformers": [
{"block": {"let": {"name": "$className",
"value": "$input.class"}}},
{"code": {"language": "Javascript",
"code": "print(className + '\\n'); input;"}},
{ "field": {
"fieldName": "#class",
"expression": "$className"
}
},
{"vertex": {}}
]
To get your result, you could use the ETL to import the data from the CSV into a class named "Generic".
Then, through a JS function, separateClass(), create the new classes, taking their names from the 'class' property imported from the CSV, and move the vertices from the Generic class into the new classes.
JSON file:
{
"source": { "file": {"path": "data.csv"}},
"extractor": { "row": {}},
"begin": [
{ "let": { "name": "$className", "value": "Generic"} }
],
"transformers": [
{"csv": {
"separator": ",",
"nullValue": "NULL",
"columnsOnFirstLine": true,
"columns": [
"id:Integer",
"name:String",
"class:String"
]
}
},
{"vertex": {"class": "$className", "skipDuplicates": true}}
],
"loader": {
"orientdb": {
"dbURL": "remote:localhost/test",
"dbType": "graph"
}
}
}
After importing the data with the ETL, create the function in JavaScript:
var g = orient.getGraphNoTx();
var queryResult = g.command("sql", "SELECT FROM Generic");
// example vertex fields: ID, NAME, CLASS
if (!queryResult.length) {
    print("Empty");
} else {
    // for each vertex, create its class if necessary and insert it there
    for (var i = 0; i < queryResult.length; i++) {
        var className = queryResult[i].getProperty("class").toString();
        // check if className has already been created
        var countClass = g.command("sql", "select from V where @class = '" + className + "'");
        if (!countClass.length) {
            g.command("sql", "CREATE CLASS " + className + " extends V");
            g.command("sql", "CREATE PROPERTY " + className + ".id INTEGER");
            g.command("sql", "CREATE PROPERTY " + className + ".name STRING");
            g.commit();
        }
        var id = queryResult[i].getProperty("id").toString();
        var name = queryResult[i].getProperty("name").toString();
        g.command("sql", "INSERT INTO " + className + " (id, name) VALUES (" + id + ",'" + name + "')");
        g.commit();
    }
    // remove the records from the Generic class
    g.command("sql", "truncate class Generic unsafe");
}
The result should look like the one shown in the picture.

GCS Transfer via Source: URL list "errorCode": "UNKNOWN"

I'm trying to transfer 7,860,379 files using the transfer system via URL list; however, I always encounter the same error:
{ //...
"errorBreakdowns": [
{
"errorCode": "UNKNOWN",
"errorCount": "1",
"errorLogEntries": [
{
"url": " or ",
"errorDetails": [
""
]
}
]
}
]
// ...
}
All of my URLs are valid, and the file format is as documented:
TsvHttpData-1.0
^([^ ]+)\t([0-9]+)\t([a-f0-9]{32})$
The error the API returns is very generic; has anyone run into the same problem?
Thanks in advance.
Based on your regex, I suspect you are not providing a base-64 encoded MD5, as it often contains '=' characters. To do this, you need to compute the binary version of your MD5 and then convert it to base64.
Example: Hk2gdsIpWTDz3kQssoTqKg==
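For illustration, a small sketch of computing that value (a hypothetical Python helper; the point is to base64-encode the raw 16-byte digest rather than the 32-character hex string your regex matches):
import base64
import hashlib

def base64_md5(path):
    # Hash the file in chunks, then base64-encode the *binary* digest.
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            md5.update(chunk)
    return base64.b64encode(md5.digest()).decode("ascii")

print(base64_md5("example.bin"))  # e.g. Hk2gdsIpWTDz3kQssoTqKg==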