How to deal with the IResourceStreamWriter API change in Wicket 1.5?

In Wicket 1.4, I had a custom AbstractResourceStreamWriter (used in a custom kind of Link for streaming a file that gets generated on the fly):
private AbstractResourceStreamWriter resourceStreamWriter() {
    return new AbstractResourceStreamWriter() {
        @Override
        public void write(OutputStream output) {
            try {
                reportService.generateReport(output, report);
            } catch (ReportGenerationException e) {
                // ...
            }
        }

        @Override
        public String getContentType() {
            return CONTENT_TYPES.get(report.getOutputType());
        }
    };
}
In Wicket 1.5, the IResourceStreamWriter interface has been changed so that the method gets a Response instead of OutputStream. It is somewhat confusing that the IResourceStreamWriter javadocs still talk about OutputStream:
Special IResourceStream implementation that a Resource can return when
it directly wants to write to an output stream instead of return the
IResourceStream.getInputStream()
...
Implement this method to write the resource data directly the the
given OutputStream.
Anyway, I don't see a quick way of getting an OutputStream from the Response.
Given that I have a method (the call generateReport(output, report) in above code) which expects an OutputStream to write into, what's the simplest way to make this work again?

What about
ByteArrayOutputStream baos = new ByteArrayOutputStream();
reportService.generateReport(baos, report);
response.write(baos.toByteArray());
or something similar?
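Folded back into the writer from the question, that buffering approach might look like this (a sketch, not tested; it trades the true streaming of 1.4 for an in-memory buffer, so very large reports would pay a memory cost):

private AbstractResourceStreamWriter resourceStreamWriter() {
    return new AbstractResourceStreamWriter() {
        @Override
        public void write(Response output) {
            // Generate the report into a buffer, then hand the bytes to the Response
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            try {
                reportService.generateReport(baos, report);
            } catch (ReportGenerationException e) {
                // ...
            }
            output.write(baos.toByteArray());
        }

        @Override
        public String getContentType() {
            return CONTENT_TYPES.get(report.getOutputType());
        }
    };
}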

There is a org.apache.wicket.request.Response#getOutputStream(). But again, I'm not sure it behaves the same as in 1.4.x: in 1.5 it buffers whatever you write to the output stream, even though the javadoc says it shouldn't be buffered.
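If that buffering is acceptable, the original write() body can stay almost unchanged (a sketch built on the getOutputStream() method mentioned above):

@Override
public void write(Response output) {
    try {
        // getOutputStream() adapts the Response to the 1.4-style API,
        // but note that in 1.5 it buffers what you write
        reportService.generateReport(output.getOutputStream(), report);
    } catch (ReportGenerationException e) {
        // ...
    }
}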

Related

How should I define Flink's Schema to read Protocol Buffer data from Pulsar

I am using Pulsar-Flink to read data from Pulsar in Flink, and I am having difficulty when the data's format is Protocol Buffer.
On the project's GitHub page, Pulsar-Flink uses SimpleStringSchema. However, that does not appear to support Protocol Buffer officially. Does anyone know how to deal with this data format? How should I define the schema?
StreamExecutionEnvironment see = StreamExecutionEnvironment.getExecutionEnvironment();
Properties props = new Properties();
props.setProperty("topic", "test-source-topic");
FlinkPulsarSource<String> source = new FlinkPulsarSource<>(serviceUrl, adminUrl, new SimpleStringSchema(), props);
DataStream<String> stream = see.addSource(source);
// chain operations on dataStream of String and sink the output
// end method chaining
see.execute();
FYI, I am writing Scala code, so an explanation for Scala (rather than Java) would be especially helpful. But surely, any kind of advice is welcome!! Including Java.
You should implement your own DeserializationSchema. Let's assume that you have a protobuf message Address and have generated the respective Java class. Then the schema should look like the following:
public class ProtoDeserializer implements DeserializationSchema<Address> {

    @Override
    public TypeInformation<Address> getProducedType() {
        return TypeInformation.of(Address.class);
    }

    @Override
    public Address deserialize(byte[] message) throws IOException {
        return Address.parseFrom(message);
    }

    @Override
    public boolean isEndOfStream(Address nextElement) {
        return false;
    }
}
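Wired into the snippet from the question, the custom schema then simply replaces SimpleStringSchema (a sketch; it assumes the same FlinkPulsarSource constructor the question uses):

Properties props = new Properties();
props.setProperty("topic", "test-source-topic");
// The source now emits parsed Address objects instead of raw Strings
FlinkPulsarSource<Address> source = new FlinkPulsarSource<>(serviceUrl, adminUrl, new ProtoDeserializer(), props);
DataStream<Address> stream = see.addSource(source);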

FlatFileItemWriter write header only in case when data is present

I have a task to write a header to a file only if some data exists; in other words, if the reader returns nothing, the file created by the writer should be empty.
Unfortunately the FlatFileItemWriter implementation, in version 3.0.7, has only private fields and methods, and a nested class that stores all the info about the writing process, so I cannot simply override the write() method. I would have to copy-paste almost the entire content of FlatFileItemWriter just to add a small piece of new functionality.
Any idea how to achieve this more elegantly in Spring Batch?
So, I finally found a more or less elegant solution.
The solution is to use LineAggregators; it seems that in the current implementation of FlatFileItemWriter this is the only extension point that can be used safely when inheriting from this class.
I use a separate line aggregator only for the header, but the solution can be extended to use multiple aggregators.
Also, in my case the header is just a predefined string, so by default I use a PassThroughLineAggregator, which simply returns my string to the FlatFileItemWriter.
public class FlatFileItemWriterWithHeaderOnData extends FlatFileItemWriter {

    private LineAggregator lineAggregator;
    private LineAggregator headerLineAggregator = new PassThroughLineAggregator();
    private boolean applyHeaderAggregator = true;

    @Override
    public void afterPropertiesSet() throws Exception {
        Assert.notNull(headerLineAggregator, "A HeaderLineAggregator must be provided.");
        super.afterPropertiesSet();
    }

    @Override
    public void setLineAggregator(LineAggregator lineAggregator) {
        this.lineAggregator = lineAggregator;
        super.setLineAggregator(lineAggregator);
    }

    public void setHeaderLineAggregator(LineAggregator headerLineAggregator) {
        this.headerLineAggregator = headerLineAggregator;
    }

    @Override
    public void write(List items) throws Exception {
        if (applyHeaderAggregator) {
            // Write the header with its own aggregator the first time write()
            // is called, then restore the item aggregator
            LineAggregator initialLineAggregator = lineAggregator;
            super.setLineAggregator(headerLineAggregator);
            super.write(getHeaderItems());
            super.setLineAggregator(initialLineAggregator);
            applyHeaderAggregator = false;
        }
        super.write(items);
    }

    private List<String> getHeaderItems() throws ItemStreamException {
        // your actual implementation goes here
        return Arrays.asList("Id,Name,Details");
    }
}
PS. This solution assumes that if write() is called, then some data exists.
Try this in your writer:
writer.setShouldDeleteIfEmpty(true);
If you have no data, there is no file.
Otherwise, you write your header and your items.
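A minimal configuration sketch of that idea (the Report item type, the resource path and the header line are made-up placeholders):

@Bean
public FlatFileItemWriter<Report> reportWriter() {
    FlatFileItemWriter<Report> writer = new FlatFileItemWriter<>();
    writer.setResource(new FileSystemResource("out/report.csv"));
    // The header is written when the file is opened...
    writer.setHeaderCallback(w -> w.write("Id,Name,Details"));
    writer.setLineAggregator(new PassThroughLineAggregator<>());
    // ...but if no items were written, the file is deleted on close
    writer.setShouldDeleteIfEmpty(true);
    return writer;
}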
I'm thinking of a way as below.
In beforeStep() (or a Tasklet), if there is no data at all, you set a flag such as "noData" to true; otherwise it will be false.
Then you have two writers, one with a header and one without. You can have a base writer that acts as a parent and two writers that inherit from it; the only difference between them is that one has a header callback and the other doesn't.
Based on the flag, you can switch to either the writer with the header or the writer without it (see the decider sketch after this answer).
Thanks,
Nghia
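A sketch of how that flag could drive the job flow, using a JobExecutionDecider (the "itemCount" context key and the exit statuses are made up for illustration; an earlier step or listener would have to populate the key):

public class NoDataDecider implements JobExecutionDecider {

    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        // Assumes a previous step stored how many items it read
        long count = jobExecution.getExecutionContext().getLong("itemCount", 0L);
        return new FlowExecutionStatus(count == 0 ? "NO_DATA" : "HAS_DATA");
    }
}

The job definition can then route on("NO_DATA") to the step using the writer without a header callback, and on("HAS_DATA") to the one with it.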

Is there a way to use Solr's streaming API with spring data solr?

I have a use case where I need to fetch the ids of my entire Solr collection. For that, with SolrJ, I use the streaming API like this:
CloudSolrServer server = new CloudSolrServer("zkHost1:2181,zkHost2:2181,zkHost3:2181");
SolrQuery query = new SolrQuery("*:*");
server.queryAndStreamResponse(query, handler);
where handler is a class that implements StreamingResponseCallback, omitted from my code for brevity.
Now, the Spring Data repository abstraction gives me the ability to search by pages and by cursors, but I can't seem to find a way to handle the streaming use case.
Is there a workaround?
SolrTemplate allows you to access the underlying SolrClient in a callback style, so you could use that to work around the current limitation.
The result conversion using the MappingSolrConverter available via the SolrTemplate is broken at the moment (I need to check why), but you get the idea of how to do it.
solrTemplate.execute(new SolrCallback<Void>() {

    @Override
    public Void doInSolr(SolrClient solrClient) throws SolrServerException, IOException {
        SolrQuery sq = new SolrQuery("*:*");
        solrClient.queryAndStreamResponse("collection1", sq, new StreamingResponseCallback() {

            @Override
            public void streamSolrDocument(SolrDocument doc) {
                // the bean conversion fails atm
                // ExampleSolrBean bean = solrTemplate.getConverter().read(ExampleSolrBean.class, doc);
                System.out.println(doc);
            }

            @Override
            public void streamDocListInfo(long numFound, long start, Float maxScore) {
                // do something useful
            }
        });
        return null;
    }
});

Reading from streams instead of files in spring batch itemReader

I am getting a CSV file via a web service call, and it needs to be loaded. Right now I am saving it in a temp directory in order to pass it to the reader's setResource.
Is there a way to provide the stream (byte[]) as is, instead of saving the file first?
The setResource method of the ItemReader takes a org.springframework.core.io.Resource as a parameter. This class has a few out-of-the-box implementations, among which you can find org.springframework.core.io.InputStreamResource. This class' constructor takes a java.io.InputStream, for which you can use a java.io.ByteArrayInputStream.
So technically, yes you can consume a byte[] parameter in an ItemReader.
Now, for how to actually do that, here are a few ideas :
1) Create your own FlatFileItemReader (since CSV is a flat file) and make it implement StepExecutionListener
public class CustomFlatFileItemReader<T> extends FlatFileItemReader<T> implements StepExecutionListener {
}
2) Override the beforeStep method, make your web service call within it, and save the result in a variable:
private byte[] stream;

@Override
public void beforeStep(StepExecution stepExecution) {
    // your webservice logic
    stream = yourWebservice.results();
}
3) Override the setResource method to pass this stream as the actual resource.
@Override
public void setResource(Resource resource) {
    // Convert byte array to input stream
    InputStream is = new ByteArrayInputStream(stream);
    // Create Spring Batch input stream resource
    InputStreamResource res = new InputStreamResource(is);
    // Set resource
    super.setResource(res);
}
Also, if you don't want to call your web service within the ItemReader, you can simply store the byte array in the JobExecutionContext and retrieve it in the beforeStep method with stepExecution.getJobExecution().getExecutionContext().get("key");
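A sketch of that alternative (the "csvPayload" key is a made-up name; whichever listener or earlier step fetched the file must have stored it under that key):

@Override
public void beforeStep(StepExecution stepExecution) {
    // The payload was stored earlier, e.g. by a JobExecutionListener
    stream = (byte[]) stepExecution.getJobExecution()
            .getExecutionContext().get("csvPayload");
}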
I am doing this right now with a FlatFileItemReader, reading a file from Google Storage. No need to extend anything:
@Bean
@StepScope
public FlatFileItemReader<MyDTO> itemReader(@Value("#{jobParameters['filename']}") String filename) {
    InputStream stream = googleStorageService.getInputStream(GoogleStorage.UPLOADS, filename);
    return new FlatFileItemReaderBuilder<MyDTO>()
            .name("myItemReader")
            .resource(new InputStreamResource(stream)) // InputStream here
            .delimited()
            .names(FIELDS)
            .fieldSetMapper(new BeanWrapperFieldSetMapper<MyDTO>() {{
                setTargetType(MyDTO.class); // mapped like a normal file
            }})
            .build();
}

Xpage REST service control and service bean

I am trying to implement a REST service using the XPage REST Service control. I have opted for "customRESTService".
I would like to emit JSON when this service is requested. I can write the logic in Server Side JavaScript.
But I noticed that this customRESTService also supports a "serviceBean", meaning I can write the whole logic in pure Java.
The code of the bean is given below. I have declared it in faces-config.xml as well, but it throws an exception while rendering. Has anyone used a "serviceBean" in a customRESTService?
I appreciate any help!! Thanks!!
public class GetApproverJSON {

    public GetApproverJSON() {
        System.out.println("Instantiating Bean");
    }

    public String doGet() throws NotesException {
        JSONObject mainObj = new JSONObject();
        JSONObject itemObj;
        try {
            mainObj.put("label", "name");
            mainObj.put("identifier", "abbr");
            itemObj = new JSONObject();
            itemObj.put("name", "");
            itemObj.put("abbr", "");
            mainObj.accumulate("items", itemObj);
            return mainObj.toString();
        } catch (Exception e) {
            System.out.println("Exception occurred while generating JSON");
            e.printStackTrace();
            return mainObj.toString();
        }
    }
}
Error:
com.ibm.domino.services.ServiceException: Error while rendering service
at com.ibm.xsp.extlib.component.rest.CustomService$ScriptServiceEngine.renderService(CustomService.java:304)
at com.ibm.domino.services.HttpServiceEngine.processRequest(HttpServiceEngine.java:167)
at com.ibm.xsp.extlib.component.rest.UIBaseRestService._processAjaxRequest(UIBaseRestService.java:252)
at com.ibm.xsp.extlib.component.rest.UIBaseRestService.processAjaxRequest(UIBaseRestService.java:229)
at com.ibm.xsp.util.AjaxUtilEx.renderAjaxPartialLifecycle(AjaxUtilEx.java:206)
at com.ibm.xsp.webapp.FacesServletEx.renderAjaxPartial(FacesServletEx.java:221)
at com.ibm.xsp.webapp.FacesServletEx.serviceView(FacesServletEx.java:166)
at com.ibm.xsp.webapp.FacesServlet.service(FacesServlet.java:160)
at com.ibm.xsp.webapp.FacesServletEx.service(FacesServletEx.java:137)
at com.ibm.xsp.webapp.DesignerFacesServlet.service(DesignerFacesServlet.java:103)
at com.ibm.designer.runtime.domino.adapter.ComponentModule.invokeServlet(ComponentModule.java:576)
at com.ibm.domino.xsp.module.nsf.NSFComponentModule.invokeServlet(NSFComponentModule.java:1267)
at com.ibm.designer.runtime.domino.adapter.ComponentModule$AdapterInvoker.invokeServlet(ComponentModule.java:847)
at com.ibm.designer.runtime.domino.adapter.ComponentModule$ServletInvoker.doService(ComponentModule.java:796)
at com.ibm.designer.runtime.domino.adapter.ComponentModule.doService(ComponentModule.java:565)
at com.ibm.domino.xsp.module.nsf.NSFComponentModule.doService(NSFComponentModule.java:1251)
at com.ibm.domino.xsp.module.nsf.NSFService.doServiceInternal(NSFService.java:598)
at com.ibm.domino.xsp.module.nsf.NSFService.doService(NSFService.java:421)
at com.ibm.designer.runtime.domino.adapter.LCDEnvironment.doService(LCDEnvironment.java:341)
at com.ibm.designer.runtime.domino.adapter.LCDEnvironment.service(LCDEnvironment.java:297)
at com.ibm.domino.xsp.bridge.http.engine.XspCmdManager.service(XspCmdManager.java:272)
Caused by: com.ibm.xsp.FacesExceptionEx: Bean getApproverJSON is not a CustomServiceBean
at com.ibm.xsp.extlib.component.rest.CustomService.findBeanInstance(CustomService.java:226)
at com.ibm.xsp.extlib.component.rest.CustomService$ScriptServiceEngine.renderService(CustomService.java:255)
... 20 more
You need to change your code from:
public class GetApproverJSON { ... }
to:
public class GetApproverJSON extends CustomServiceBean {

    @Override
    public void renderService(CustomService service, RestServiceEngine engine) throws ServiceException {
        HttpServletRequest request = engine.getHttpRequest();
        HttpServletResponse response = engine.getHttpResponse();
        response.setHeader("Content-Type", "application/json; charset=UTF-8");
        // Here goes your code; get the response writer or stream
    }
}
since that's the interface the REST service is expecting. You only need to implement renderService. You can get the HTTP method (GET, POST, etc.) from the request object.
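Put together with the JSON-building code from the question, the body could look roughly like this (a sketch; the exception wrapping in particular is an assumption, not confirmed against the ServiceException API):

@Override
public void renderService(CustomService service, RestServiceEngine engine) throws ServiceException {
    HttpServletResponse response = engine.getHttpResponse();
    response.setHeader("Content-Type", "application/json; charset=UTF-8");
    try {
        // Reuse the doGet() method from the question to build the JSON string
        response.getWriter().write(doGet());
    } catch (Exception e) {
        // Assumes a ServiceException(Throwable, String) constructor
        throw new ServiceException(e, "Error while generating JSON");
    }
}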
I've never used the service bean before; I usually create my own parser with a static doGet method very similar to yours, and in the doGet property of the custom REST service I make a call to that static method. But I think (I'm probably wrong on this count) that if you use the service bean it has to be an entire servlet, as if you wrote your own actual REST service, and not just the parser portion.
I've created quite a few of these parsers and have found that a list of maps (List<Map<String, Object>>) is usually the best approach for building the initial data. I then loop through the list to build my JSON. In the Extension Library there is a class called JsonWriter which makes it very easy to build a JSON object. Use the JsonWriter like:
StringWriter sw = new StringWriter();
JsonWriter jw = new JsonWriter(sw);
jw.startObject();
jw.startProperty("SomeProperty");
jw.outStringLiteral("SomeValue");
jw.endProperty();
jw.endObject();
return sw.toString();
For a full-blown example you can take a look at the REST service I built for my jQuery FullCalendar demo. While none of the methods are static (I need to track a couple of properties), you should get the basic idea. What kicks the whole thing off is a call to the writeJson() method, which is invoked in this custom control.
Those examples should get you going on building your own custom JSON parser and emitting that JSON back to your application.