How to write fields dynamically to a CSV file using tJava - Talend

I'm using Talend to extract database field names from a table and write them to a CSV after replacing the "_" in the field names with " ". I want to have these values against the actual headers.
eg:
|First_Name|Last_Name|
|----------|---------|
|First Name|Last Name|
My job looks similar to the following.
The code in my tJavaRow is as follows:
for (java.lang.reflect.Field field : input_row.getClass().getDeclaredFields()) {
    String fieldName = field.getName();
    String newFieldName = fieldName.replaceAll("_", " ");
    context.columnName = newFieldName;
    System.out.println("Field name is " + context.columnName);
}
How can I get the value of this context variable for each field in the CSV file? If I use it directly in the tMap, it will only hold the name of the last field.
I have a tMap between the tJavaRow and the tFileOutputDelimited.

You cannot change the schema on the fly, because schema columns are treated as declared variables in the Java code that Talend generates in the backend.
Your schema |First_Name|Last_Name| is converted into something like this:
String First_Name = null;
String Last_Name = null;
So you cannot change those schema column names at runtime.
But you can build a single header record from the column names you retrieve from the database, joined with the delimiter you want (let's take a comma):
for (java.lang.reflect.Field field : input_row.getClass().getDeclaredFields()) {
    String fieldName = field.getName().replaceAll("_", " ");
    // guard against a leading "null," on the first iteration
    if (context.columnName == null || context.columnName.isEmpty()) {
        context.columnName = fieldName;
    } else {
        context.columnName = context.columnName + "," + fieldName;
    }
}
Now, before writing your data into the CSV file, write the header record held in context.columnName into that file.
After writing the header record, append your data to the file by checking the "Append" check box in tFileOutputDelimited.
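As a standalone sketch of the same idea, here is the header-building loop outside of Talend. The Row class is a stand-in for the row structure Talend generates from the schema; note that getDeclaredFields() does not guarantee declaration order by specification, though in practice it usually matches it:

```java
import java.lang.reflect.Field;
import java.util.StringJoiner;

public class HeaderBuilder {

    // Stand-in for the row class Talend generates from the schema
    static class Row {
        String First_Name;
        String Last_Name;
    }

    // Builds a header record by humanizing each field name
    // ("First_Name" -> "First Name") and joining with the delimiter.
    static String buildHeader(Object row, String delimiter) {
        StringJoiner header = new StringJoiner(delimiter);
        for (Field field : row.getClass().getDeclaredFields()) {
            header.add(field.getName().replaceAll("_", " "));
        }
        return header.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildHeader(new Row(), ","));
    }
}
```

Writing this single string first, then appending the data rows, avoids touching the generated schema variables at all.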

Related

How to convert a Java type to a Postgres domain with Hibernate (Spring Data)?

I created a domain in Postgres:
create domain arrayofids as numeric[];
Now I want to use the domain in Spring Data like this:
String fakeQuery = "unnest(CAST (:ids AS arrayofids))";
Query nativeQuery = entityManager.createNativeQuery(fakeQuery);
BigInteger[] arrayOfids = new BigInteger[] {new BigInteger("1"), new BigInteger("2")}; // or it can be a List; it is not important
nativeQuery.setParameter("ids", arrayOfids);
List resultList = nativeQuery.getResultList();
Of course I get an error:
org.postgresql.util.PSQLException: ERROR: cannot cast type bytea to arrayofIds
Previously I used https://dalesbred.org/docs/api/org/dalesbred/datatype/SqlArray.html and it worked fine, or I wrote custom types in JDBC myself. Hibernate doesn't make it easy to use my domain.
A query like this:
select * from mtTbale where id in :ids
is not what I'm after. I need to use the domain with unnest and CAST.
Pass :ids as a text representation of an array of numbers, i.e. "{1, 2}", and add SELECT to the query.
Try this:
String fakeQuery = "SELECT unnest(CAST (:ids AS arrayofids))";
Query nativeQuery = entityManager.createNativeQuery(fakeQuery);
String arrayOfids = "{" + 1 + ", " + 2 + "}"; // obtain the string as is relevant in your case
nativeQuery.setParameter("ids", arrayOfids);
List resultList = nativeQuery.getResultList();
and the second query should look like this:
select * from mtTbale where id = ANY(CAST(:ids AS arrayofids));
You may also use Postgres's shorthand (and more readable) cast syntax:
instead of CAST(:ids AS arrayofids), use :ids::arrayofids.
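A small sketch of building that text representation from a Java collection, so the literal doesn't have to be concatenated by hand (the helper name is illustrative, not part of the question):

```java
import java.util.List;
import java.util.stream.Collectors;

public class PgArrayLiteral {

    // Builds the Postgres array literal, e.g. [1, 2] -> "{1, 2}".
    // The result is what gets bound via setParameter("ids", ...) and
    // cast server-side with CAST(:ids AS arrayofids) or :ids::arrayofids.
    static String toArrayLiteral(List<? extends Number> ids) {
        return ids.stream()
                  .map(Number::toString)
                  .collect(Collectors.joining(", ", "{", "}"));
    }

    public static void main(String[] args) {
        System.out.println(toArrayLiteral(List.of(1, 2)));
    }
}
```

Because the parameter is bound as plain text, Postgres performs the cast to the domain, which sidesteps the bytea mapping Hibernate would otherwise choose for the array.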

Accepting parameters with the @Query annotation in a Spring Boot repository with PostgreSQL

Database used: PostgreSQL
I have the following code:
@Query(value = "select id, meta " + "from temp " +
    "where meta #> '{\"animal\": \"donkey\" }'", nativeQuery = true)
List<classDemo> findByMeta();
The meta column has data in the following format:
{
"meta": { "animal":"dog", "type":"dirty"}
}
I would like to provide the value of animal as a parameter (e.g. donkey should be a parameter), so that I can extract all the records of the entered value during a get request. What can I do to the above code so that I can enter a parameter rather than a direct value?
The above is the jsonb data present in the meta column.
If the meta column is json or jsonb, then you can use this for your query:
select id, meta
from temp
where meta->'meta'->>'animal' = 'donkey'
If meta is data type text, then this will work so long as all rows in the table have valid json stored in meta:
select id, meta
from temp
where (meta::jsonb)->'meta'->>'animal' = 'donkey'
It has been a long while since I have worked in Java, and even longer with Spring, but you should be able to replace 'donkey' with a question-mark placeholder and then call setString(1, "donkey") on it.
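With Spring Data JPA specifically, a named bind parameter is usually cleaner than a positional one. A minimal sketch of the parameterized query string (the repository method name below is an assumption, not from the question):

```java
public class JsonbQueryDemo {

    // Parameterized form of the answer's query; :animal is a named bind
    // parameter as used by Spring Data's @Query(nativeQuery = true).
    static final String FIND_BY_ANIMAL =
        "select id, meta from temp where meta->'meta'->>'animal' = :animal";

    public static void main(String[] args) {
        // In the repository this would look like:
        //   @Query(value = "select id, meta from temp "
        //                + "where meta->'meta'->>'animal' = :animal",
        //          nativeQuery = true)
        //   List<classDemo> findByAnimal(@Param("animal") String animal);
        System.out.println(FIND_BY_ANIMAL);
    }
}
```

Since ->> returns text, the bound string compares directly without any manual quoting in the SQL.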

How to convert map&lt;anydata&gt; to json

In my CRUD REST service I do an insert into a DB and want to respond to the caller with the newly created record. I am looking for a nice way to convert the map to json.
I am running Ballerina 0.991.0 and using PostgreSQL.
The return of the update ("INSERT ...") is a map.
I tried with convert and stamp, but it did not work for me.
import ballerinax/jdbc;
...
jdbc:Client certificateDB = new({
    url: "jdbc:postgresql://localhost:5432/certificatedb",
    username: "USER",
    password: "PASS",
    poolOptions: { maximumPoolSize: 5 },
    dbOptions: { useSSL: false }
});
...
var ret = certificateDB->update("INSERT INTO certificates(certificate, typ, scope_) VALUES (?, ?, ?)", certificate, typ, scope_);
// here is the data, it is map<anydata>
ret.generatedKeys
The map should know which data type it holds, right?
Then it should be easy to convert it to json, like this:
{"certificate":"{certificate:
"-----BEGIN
CERTIFICATE-----\nMIIFJjCCA...tox36A7HFmlYDQ1ozh+tLI=\n-----END
CERTIFICATE-----", typ: "mqttCertificate", scope_: "QARC", id_:
223}"}
Right now I do a foreach and build the json manually. Quite ugly; maybe somebody has tips on how to do this in a nicer way.
It cannot be excluded that this is due to my lack of programming skills :-)
The return value of JDBC update remote function is sql:UpdateResult|error.
The sql:UpdateResult is a record with two fields. (Refer https://ballerina.io/learn/api-docs/ballerina/sql.html#UpdateResult)
updatedRowCount of type int - the number of rows affected/updated by the given statement execution
generatedKeys of type map - a map of the auto-generated column values produced by the update operation (only if the corresponding table has auto-generated columns). The data is given as key-value pairs of column name and column value, so this map contains only the auto-generated column values.
But your requirement is to get the entire row inserted by the given update function. It can't be returned by the update operation itself. To get it, you have to execute a jdbc select operation with matching criteria. The select operation returns a table or an error, and that table can be converted to json easily using the convert() function.
For example: let's say the certificates table has an auto-generated primary key column named 'cert_id'. Then you can retrieve that id value using the code below.
int generatedID = <int>updateRet.generatedKeys.CERT_ID;
Then use that generated id to query the data.
var ret = certificateDB->select("SELECT certificate, typ, scope_ FROM certificates WHERE cert_id = ?", (), generatedID);
json convertedJson = {};
if (ret is table<record {}>) {
    var jsonConversionResult = json.convert(ret);
    if (jsonConversionResult is json) {
        convertedJson = jsonConversionResult;
    }
}
Refer to the example https://ballerina.io/learn/by-example/jdbc-client-crud-operations.html for more details.

Inserting a string into a jsonb column in Postgres

I have a function that generates a prepared statement for a batch insert into Postgres, where I am trying to insert a string into a jsonb column.
My struct looks like:
type struct1 struct {
    ID      int
    Comment string
    Extra   string
}
and my table schema looks like:
create table deal (
    id bigserial,
    comment varchar(75),
    extra jsonb
)
and I want to dump []struct1 to Postgres DB "deal".
My function which generates the prepared statement looks like this:
func BulkInsert(str []struct1, ctx context.Context) string {
    log.Debug("inserting records to DB")
    query := fmt.Sprintf(`insert into deal (%s) values `, strings.Join(dbFields, ","))
    var numFields = len(dbFields)
    var values []interface{}
    for i, database := range str {
        values = append(values, database.Comment, `'`+database.Extra+`'`)
        n := i * numFields
        query += `(`
        for j := 0; j < numFields; j++ {
            query += `$` + strconv.Itoa(n+j+1) + `,`
        }
        query = query[:len(query)-1] + `),`
    }
    query = query[:len(query)-1]
    return query
}
The expected result: I should be able to insert the string as json, or, you could say, cast the string to json and store it.
The actual result is:
could not save batch: pq: invalid input syntax for type json
The function json_build_array('exp1'::text, 'exp2'::text) may help you.
It returns a json array: ["exp1", "exp2"]
To extract the values, use the ->> <index> operator, e.g. ->> 1 to get 'exp2'.
If you just want to insert into the database, the function to_json('any element') should also work; it can convert any element to a json value.
You can find more json (jsonb) functions in the Postgres documentation.

Getting Table and Column Names from DbContext

With the new public mapping API, what is the proper way of going from an entity to the corresponding database schema and table names, and to the mapping of the entity's properties to column names?
Here is a reference using an ADO.NET Entity Data Model generated with the Database First approach. It uses the schema provided by the DbContext's connection.
The schema restrictions are not required, but they make the results cleaner. For further information, please take a look at:
https://msdn.microsoft.com/en-us/library/cc716722(v=vs.110).aspx
using (var dbContext = new DbGeneratedByEfEntities())
{
    var conn = dbContext.Database.Connection;
    conn.Open();
    var restrictions = default(string[]);

    // please replace DbName and Owner with the actual values
    restrictions = new string[] { "DbName", "dbo" };
    // get a DataTable from the schema with the collection name "Tables"
    var tablesDataTable = conn.GetSchema(SqlClientMetaDataCollectionNames.Tables, restrictions);
    // show column names
    Debug.WriteLine(string.Join(" | ", tablesDataTable.Columns.Cast<DataColumn>()));
    // show row contents (table names from the schema)
    tablesDataTable.Rows.Cast<DataRow>().ToList().ForEach(r =>
    {
        Debug.WriteLine(string.Join(" | ", r.ItemArray));
    });

    /************************/

    // please replace DbName, Owner and TableName with the actual values
    restrictions = new string[] { "DbName", "dbo", "TableName" };
    // get a DataTable from the schema with the collection name "Columns"
    var colsDataTable = conn.GetSchema(SqlClientMetaDataCollectionNames.Columns, restrictions);
    // show column names
    Debug.WriteLine(string.Join(" | ", colsDataTable.Columns.Cast<DataColumn>()));
    // show row contents (column names from the schema)
    colsDataTable.Rows.Cast<DataRow>().ToList().ForEach(r =>
    {
        Debug.WriteLine(string.Join(" | ", r.ItemArray));
    });
}