Custom CSV field names in Spring Batch writer - spring-batch

I want to write to a CSV file in a Spring Batch writer, but I don't want to use the property names of my bean; I need custom field names. Is there another way to do it without using BeanWrapperFieldExtractor?
Here's my bean:
public class VMi {
    private String Name;
    private String OrgName;
    private String Status;
    private int CPU;
    private int RAM;
    private String IP;
}
I want to output the fields with the same names, except for "OrgName", which I want to appear as "Organization->Name".

If you want to write different names in the header of the output file, you can use a FlatFileHeaderCallback. Here is a quick example:
@Bean
public FlatFileItemWriter<VMi> itemWriter() {
    return new FlatFileItemWriterBuilder<VMi>()
            .name("vmiWriter")
            .resource(new FileSystemResource("vmi.csv"))
            .headerCallback(writer -> writer.write("Name, Organization->Name, Status, CPU, RAM, IP"))
            .delimited()
            .names("Name", "OrgName", "Status", "CPU", "RAM", "IP")
            .build();
}
This will still use the BeanWrapperFieldExtractor to extract values from items based on getters, but writes a different name for OrgName in the output file.
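If you would rather not use BeanWrapperFieldExtractor at all, you can plug in your own FieldExtractor through a line aggregator. A minimal sketch, assuming VMi exposes getters such as getOrgName() (not shown in the posted bean):
@Bean
public FlatFileItemWriter<VMi> itemWriterWithCustomExtractor() {
    // Extract the fields explicitly instead of relying on BeanWrapperFieldExtractor
    DelimitedLineAggregator<VMi> aggregator = new DelimitedLineAggregator<>();
    aggregator.setDelimiter(",");
    aggregator.setFieldExtractor(item -> new Object[] {
            item.getName(), item.getOrgName(), item.getStatus(),
            item.getCPU(), item.getRAM(), item.getIP()
    });
    return new FlatFileItemWriterBuilder<VMi>()
            .name("vmiWriter")
            .resource(new FileSystemResource("vmi.csv"))
            .headerCallback(writer -> writer.write("Name, Organization->Name, Status, CPU, RAM, IP"))
            .lineAggregator(aggregator)
            .build();
}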

Related

Esper EPL aggregation with custom function

I am trying to do an aggregation with Esper 8.8.0 EPL using the following code. When a ProductEvent is published, I am trying to save it into a table after converting the complete ProductEvent bean into JSON. Is there any way I can pass the ProductEvent object itself to the custom function when executing the merge statement?
@public create table OutputTable
(
productId string primary key
, productName string
, productJson string
);
@name('stmtUpdateOutputTable') on ProductEvent pe
merge OutputTable ot
where ot.productId = pe.productId
when not matched
then insert select productId, productName, Utils.getJson(*)
when matched
then update
set
ot.productName= pe.productName,
ot.productJson = Utils.getJson(*)
;
ProductEvent is a Java bean that contains more than 100 properties, so it is not a good idea to pass each individual field when calling the custom function:
public class ProductEvent {
    private String productId;
    private String productName;
    private Double price;
    private LocalDate firstAvailableDate;
    // ..... around 100 more properties here
}
Utils is a helper class that contains a static method:
public static String getJson(ProductEvent event) throws JsonProcessingException {
    return new ObjectMapper().writeValueAsString(event);
}
In the on-merge there are two aliases: "pe" for ProductEvent and "ot" for OutputTable, so passing the event itself with Utils.getJson(pe) in both the insert and the update clause would work.
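For completeness, a minimal sketch of the Esper 8 setup this assumes: ProductEvent has to be registered as an event type and Utils imported so that Utils.getJson(pe) resolves inside the EPL (the epl string below stands for the statements above):
Configuration config = new Configuration();
config.getCommon().addEventType(ProductEvent.class); // lets the EPL refer to ProductEvent
config.getCommon().addImport(Utils.class);           // lets the EPL call Utils.getJson(...)

String epl = "...";                                   // the create-table and on-merge statements above
EPCompiled compiled = EPCompilerProvider.getCompiler()
        .compile(epl, new CompilerArguments(config)); // throws EPCompileException
EPRuntimeProvider.getDefaultRuntime(config)
        .getDeploymentService()
        .deploy(compiled);                            // throws EPDeployException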

How can I get data from a @DBRef document using @Query? -> Spring Data Mongo

I need help getting data from another document. I have the following classes:
@Data
@Document(collection = "tmVersion")
public class TmVersion {
    @Id
    private String id;
    private String cVrVersionId;
    @DBRef
    private TaApplicationVersion taApplicationVersion;
}
and
@Data
@Document(collection = "taApplicationVersion")
public class TaApplicationVersion {
    @Id
    private String id;
    private String dVrAppName;
    private String dVrAppCode;
}
This is my repository, in which I map what I want to be shown, but for taApplicationVersion I need to show the whole object as well. How is that done?
@Query(value="{}", fields="{'cVrVersionId': 1, 'taApplicationVersion.dVrAppName': 2, 'dVrVersionNumber': 3}")
Page<TmVersion> getAllVersionWithOutFile(Pageable pageable);
A couple of things to mention here.
If you want this kind of join between collections, you need to rethink your choice of MongoDB as the database. NoSQL databases thrive on the fact that there is very little coupling between tables (collections), so using @DBRef negates that. MongoDB itself does not recommend using DBRefs.
This cannot be achieved with a method like the one in your repository. You need to use projections. Here is the documentation for that.
Create a projection interface like this. Here you can control which fields of the main class (TmVersion) you need to include.
@ProjectedPayload
public interface TmVersionProjection {

    @Value("#{@taApplicationVersionRepository.findById(target.taApplicationVersion.id)}")
    public TaApplicationVersion getTaApplicationVersion();

    public String getId();

    public String getcVrVersionId();
}
Change the TmVersionRepository like this
public interface TmVersionRepository extends MongoRepository<TmVersion, String> {

    @Query(value="{}")
    Page<TmVersionProjection> getAllVersionWithOutFile(Pageable pageable);
}
Create a new repository for TaApplicationVersion. You can add @Query on top of this method to control which fields of the referenced document need to be returned.
public interface TaApplicationVersionRepository extends MongoRepository<TaApplicationVersion, String> {

    TaApplicationVersion findById(String id);
}
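A short usage sketch of the projection (the repository field, page size and printed fields are illustrative):
// tmVersionRepository is an injected TmVersionRepository
Page<TmVersionProjection> page = tmVersionRepository.getAllVersionWithOutFile(PageRequest.of(0, 20));
page.forEach(v -> System.out.println(
        v.getcVrVersionId() + " -> " + v.getTaApplicationVersion().getDVrAppName()));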

REST API Spring Boot with MongoDB: how to get JSON?

@GetMapping("/getAccount")
public Account validateAccount(@RequestBody) {
}
I'm very new to Spring Boot. My account document has 5+ values, all strings: username, password, id, and a few other fields.
Given this:
{
    "username": "bob",
    "password": "password"
}
It should return this with a 200 OK response code:
{
    "id": "45645646546",
    "username": "bob",
    "password": "password",
    "status": "Single",
    "filler": "filler"
}
However, I'm not sure how to read the "username" and "password" JSON in my validateAccount function.
Not really related to this question, but does anyone know how to send a response code from the function? Something like .sendresponseheader(400).
public class AccountDTO {

    @JsonIgnore
    private Long id;

    @NotNull
    private String username;

    @NotNull
    private String password;

    @JsonIgnore
    private String status;

    @JsonIgnore
    private String filler;

    // getters & setters
}
You may want to create a DTO (Data Transfer Object) as shown above. Here's a link to its wiki.
Next, map the user input into this DTO using the @RequestBody annotation.
@RestController
public class AccountController {

    @GetMapping("/accounts")
    public ResponseEntity<Account> validateAccount(@RequestBody AccountDTO accountDTO) {
        return new ResponseEntity<>(accountService.validate(accountDTO), HttpStatus.OK);
    }
}
Or you can use:
@RestController
public class AccountController {

    @GetMapping("/accounts")
    public ResponseEntity<Account> validateAccount(@RequestBody AccountDTO accountDTO) {
        return ResponseEntity.ok(accountService.validate(accountDTO));
    }
}
The user input will be converted from JSON to AccountDTO by whichever JSON processor you're using, most probably com.fasterxml.jackson (Jackson).
The @JsonIgnore and @NotNull annotations ensure that only the username and password fields are used and the others are ignored when reading input from the user.
You can pass this DTO to your service classes, use something like findByUsername() in your business logic, and return a populated AccountDTO using the mapper function below or an external library such as ModelMapper or MapStruct.
public AccountDTO toAccountDTO(Account account) {
    AccountDTO accountDTO = new AccountDTO();
    accountDTO.setUsername(account.getUsername());
    // and so on...
    return accountDTO;
}
And for your last question: wrap the returned object in a ResponseEntity to provide a proper response code along with your payload. Here's a link to the ResponseEntity Javadoc.
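For context, here is a hypothetical sketch of the service layer these snippets assume; AccountService, AccountRepository and findByUsername are illustrative names, not an existing API:
@Service
public class AccountService {

    private final AccountRepository accountRepository; // hypothetical Spring Data MongoDB repository

    public AccountService(AccountRepository accountRepository) {
        this.accountRepository = accountRepository;
    }

    public Account validate(AccountDTO input) {
        Account account = accountRepository.findByUsername(input.getUsername());
        // verify the password and handle the "not found" case as needed;
        // apply the toAccountDTO(...) mapper above if you prefer to return the DTO
        return account;
    }
}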
AccountDto.java
===============
class AccountDto {
    private Long id;
    private String username;
    private String password;
    private String status;
    private String filler;
    // getters & setters
}

@GetMapping("/getAccount")
public ResponseEntity<AccountDto> validateAccount(@RequestBody AccountDto accountDto) {
    return new ResponseEntity<>(accountService.validate(accountDto), HttpStatus.OK);
}
You can do your custom operations before returning the response. Take a look at REST best practices.
For a JSON response, nothing specific is needed; just mark the class with @RestController.
For @RequestBody, just use a POJO to bind the values.
For error codes and statuses, you can use ResponseEntity.
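Since the question also asked how to send an explicit error code such as 400, here is a minimal sketch; it assumes accountService.validate returns null when the account is not found, which is an assumption made purely for illustration:
@GetMapping("/getAccount")
public ResponseEntity<AccountDto> validateAccount(@RequestBody AccountDto accountDto) {
    AccountDto result = accountService.validate(accountDto);
    if (result == null) {
        // explicit error status instead of 200 OK
        return ResponseEntity.status(HttpStatus.BAD_REQUEST).build(); // or HttpStatus.NOT_FOUND
    }
    return ResponseEntity.ok(result);
}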

(JDBI/Dropwizard) PSQLException when retrieving auto-incremented id from PostgreSQL

I'm trying to set up a Dropwizard project but I'm stuck. When I try to get the auto-generated id field with @GetGeneratedKeys, I get the following exception:
org.postgresql.util.PSQLException: Bad value for type long : foo.
The request is a simple JSON Request
{"name":"foo"}
The INSERT into the database is successful but it seems that the statement returns the value of the name instead of the generated id. How can I solve this?
I use PostgreSQL, and the table project contains a primary key field "id" with nextval('project_id_seq'::regclass) as its default. Here are the POJO, DAO and resource classes I use:
public class Project {

    private long id;
    private String name;

    public Project() {
        // Jackson deserialization
    }

    public Project(long id, String name) {
        this.id = id;
        this.name = name;
    }
    ...
}

@RegisterMapper(ProjectMapper.class)
public interface ProjectDAO {

    @SqlUpdate("insert into project (name) values (:name)")
    @GetGeneratedKeys
    public long insert(@Bind("name") String name);
}

@Path("/project")
@Consumes({MediaType.APPLICATION_JSON})
@Produces({MediaType.APPLICATION_JSON})
public class ProjectResource {

    ProjectDAO projectDAO;

    public ProjectResource(ProjectDAO projectDAO) {
        this.projectDAO = projectDAO;
    }

    @POST
    @Timed
    public Response add(@Valid Project project) {
        long newId = projectDAO.insert(project.getName());
        project.setId(newId);
        return Response.status(Response.Status.CREATED)
                .entity(project).build();
    }
}
===============
UPDATE
I just figured out that this relates to the fact that my id column isn't the first column in my table; the name column is. The problem occurs because @GetGeneratedKeys uses org.skife.jdbi.v2.sqlobject.FigureItOutResultSetMapper, which uses org.skife.jdbi.v2.PrimitivesMapperFactory, which returns org.skife.jdbi.v2.util.LongMapper.FIRST. This mapper calls java.sql.ResultSet.getLong(1) through the method extractByIndex(...) to retrieve the generated id, which isn't the id in my case.
I'll fix the issue by reorganizing the columns in the database, but I'd like to have a robust implementation if possible: is there a way to specify the column name of the id column when using the @GetGeneratedKeys annotation? (The org.skife.jdbi.v2.util.LongMapper class also contains a method called extractByName(...).)
This is an issue in the jdbi implementation and is fixed in a newer version, as described in https://github.com/jdbi/jdbi/issues/114.
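As a sketch of what that fix looks like once you are on a release that contains it: the columnName attribute on @GetGeneratedKeys is the part added by that change, so check that it exists in your jdbi version (in jdbi 3 the annotation takes the column name directly, e.g. @GetGeneratedKeys("id")):
@RegisterMapper(ProjectMapper.class)
public interface ProjectDAO {

    // Name the generated-key column explicitly instead of relying on column position
    @SqlUpdate("insert into project (name) values (:name)")
    @GetGeneratedKeys(columnName = "id")
    long insert(@Bind("name") String name);
}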

Java EE 6 REST API named query result

I have a simple Java EE 6 REST class that gets data from DB2. I am using Jackson in the ApplicationConfig class to convert the entity objects to JSON. It converts with the field names as the keys and the values on the right-hand side. So, for example:
class Entity {
    String name;
    String address;
}
converts to
{name:"hello", address:"world"}
The service is as follows:
public List<T> findAll() {
    javax.persistence.criteria.CriteriaQuery cq = getEntityManager().getCriteriaBuilder().createQuery();
    cq.select(cq.from(entityClass));
    return getEntityManager().createQuery(cq).getResultList();
}
Now I want to return only the name in JSON format, so I created a named query as follows in the entity class:
@NamedQuery(name = "justGetName", query = "SELECT a.name FROM Applications a")
And the service changed to
public List<T> findAll() {
    return getEntityManager().createNamedQuery("justGetName").getResultList();
}
This returns the following array:
["first","second","third"]
But I want to get back:
[{"name":"first"},{"name":"second"},{"name":"third"}]
How do I write the named query so that the class field names are included in the JSON structure? Thank you.
You are querying a list of strings from your database, and that is what the service returns.
There are multiple ways to achieve your goal.
Pure JPA
Use @JsonIgnore to tell Jackson not to serialize an attribute:
class Application {
    String name;
    @JsonIgnore
    String address;
}
Or create a new entity class that only contains the attributes you would like to share:
class ApplicationName {
    String name;
}
Alternatively, you could introduce a separate class that only contains the attributes you would like to share, convert the results from the query into that class, and return the list of converted values, as sketched below.
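A sketch of that option using a JPQL constructor expression; ApplicationName here is a plain value class (not an entity), and com.example as well as the query name are illustrative:
// On the entity:
@NamedQuery(name = "justGetNames",
        query = "SELECT new com.example.ApplicationName(a.name) FROM Applications a")

// The value class needs a constructor matching the select list:
public class ApplicationName {
    private String name;

    public ApplicationName(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }
}

// Service method returning the typed result, which Jackson serializes as [{"name":"first"}, ...]:
public List<ApplicationName> findAllNames() {
    return getEntityManager()
            .createNamedQuery("justGetNames", ApplicationName.class)
            .getResultList();
}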