FlowableEmitter fails to emit the items - rx-java2

I have a program which polls two directories, dir1 and dir2, and emits a file as soon as it arrives in either of them. But when files arrive in both directories at the same time, some files from one directory or the other fail to get emitted.
The code is as follows:
Flowable.create((FlowableEmitter<Path> em) -> pollDirectory(Arrays.asList(dir1, dir2), em),
        BackpressureStrategy.BUFFER)
    .subscribe();
The pollDirectory method has the following code:
for (Path directory : pathList) {
    FileAlterationObserver fao = new FileAlterationObserver(directory);
    fao.addListener(new FileAlterationListenerImpl(emitter));
    final FileAlterationMonitor monitor = new FileAlterationMonitor(5000);
    monitor.addObserver(fao);
    try {
        monitor.start();
    } catch (Exception e) {
        handleException(e);
    }
}
The FileAlterationListenerImpl class emits the path on file creation; its code is as follows:
public class FileAlterationListenerImpl implements FileAlterationListener {

    FlowableEmitter<Path> source;

    public FileAlterationListenerImpl(FlowableEmitter<Path> emitter) {
        super();
        this.source = emitter;
    }

    @Override
    public void onFileCreate(final File file) {
        source.onNext(file.toPath());
    }
}
Is there any way in RxJava to handle this scenario so that the emitter emits files from both directories even if the files arrive at the same time?

Each FileAlterationMonitor runs a thread that triggers its observer's emissions, so if you have two or more of them, that results in concurrent onNext invocations.
The javadoc indicates this is not safe, and you need to use serialize() to ensure thread safety:
Flowable.create((FlowableEmitter<Path> em) ->
        pollDirectory(Arrays.asList(dir1, dir2), em.serialize()),
        BackpressureStrategy.BUFFER)
    .subscribe();

Vert.x: How to wait for a future to complete

Is there a way to wait for a future to complete without blocking the event loop?
An example use case, querying Mongo:
Future<Result> dbFut = Future.future();
mongo.findOne("myusers", myQuery, new JsonObject(), res -> {
    if (res.succeeded()) {
        ...
        dbFut.complete(res.result());
    } else {
        ...
        dbFut.fail(res.cause());
    }
});
// Here I need the result of the DB query
if (dbFut.succeeded()) {
    doSomethingWith(dbFut.result());
} else {
    error();
}
I know the doSomethingWith(dbFut.result()); call can be moved into the handler, yet if it's long, the code will get unreadable (callback hell?). Is that the right solution? Is that the only solution without additional libraries?
I'm aware that RxJava simplifies the code, but as I don't know it, learning Vert.x and RxJava at the same time is just too much.
I also wanted to give vertx-sync a try. I put the dependency in the pom.xml; everything got downloaded fine, but when I started my app, I got the following error:
maurice#mickey> java \
-javaagent:~/.m2/repository/co/paralleluniverse/quasar-core/0.7.5/quasar-core-0.7.5-jdk8.jar \
-jar target/app-dev-0.1-fat.jar \
-conf conf/config.json
Error opening zip file or JAR manifest missing : ~/.m2/repository/co/paralleluniverse/quasar-core/0.7.5/quasar-core-0.7.5-jdk8.jar
Error occurred during initialization of VM
agent library failed to init: instrument
I know what the error means in general, but I don't know what it means in this context... I tried to google it but didn't find any clear explanation about which manifest to put where. And as before, unless it's mandatory, I prefer to learn one thing at a time.
So, back to the question: is there a way with "basic" Vert.x to wait for a future without disturbing the event loop?
You can set a handler for the future to be executed upon completion or failure:
Future<Result> dbFut = Future.future();
mongo.findOne("myusers", myQuery, new JsonObject(), res -> {
    if (res.succeeded()) {
        ...
        dbFut.complete(res.result());
    } else {
        ...
        dbFut.fail(res.cause());
    }
});
dbFut.setHandler(asyncResult -> {
    if (asyncResult.succeeded()) {
        // your logic here
    }
});
This is a pure Vert.x way that doesn't block the event loop.
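If the worry is that moving the logic into the handler leads to deeply nested callbacks, futures can also be chained with compose() to keep the code flat without blocking. A minimal sketch along those lines, reusing the query from the question and adding a made-up second "orders" query purely for illustration (Vert.x 3.4+ APIs assumed):
Future<JsonObject> userFut = Future.future();
mongo.findOne("myusers", myQuery, new JsonObject(), res -> {
    if (res.succeeded()) {
        userFut.complete(res.result());
    } else {
        userFut.fail(res.cause());
    }
});
userFut.compose(user -> {
    // Chain a follow-up query without nesting another handler.
    Future<JsonObject> ordersFut = Future.future();
    mongo.findOne("orders", new JsonObject().put("userId", user.getString("_id")), new JsonObject(), res -> {
        if (res.succeeded()) {
            ordersFut.complete(res.result());
        } else {
            ordersFut.fail(res.cause());
        }
    });
    return ordersFut;
}).setHandler(ar -> {
    if (ar.succeeded()) {
        doSomethingWith(ar.result());
    } else {
        error();
    }
});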
I agree that you should not block in the Vert.x processing pipeline, but I make one exception to that rule: start-up. By design, I want to block while my HTTP server is initialising.
This code might help you:
/**
 * @return null when waiting on {@code Future<Void>}
 */
@Nullable
public static <T> T awaitComplete(Future<T> f)
throws Throwable
{
    final Object lock = new Object();
    final AtomicReference<AsyncResult<T>> resultRef = new AtomicReference<>(null);
    synchronized (lock)
    {
        // We *must* be locked before registering a callback.
        // If the result is ready, the callback is called immediately!
        f.onComplete(
            (AsyncResult<T> result) ->
            {
                resultRef.set(result);
                synchronized (lock) {
                    lock.notify();
                }
            });
        while (null == resultRef.get()) {
            // Nested sync on lock is fine.  If we get a spurious wake-up before resultRef is set,
            // we need to reacquire the lock, then wait again.
            // Ref: https://stackoverflow.com/a/249907/257299
            synchronized (lock)
            {
                if (null == resultRef.get()) {
                    // @Blocking
                    lock.wait();
                }
            }
        }
    }
    final AsyncResult<T> result = resultRef.get();
    @Nullable
    final Throwable t = result.cause();
    if (null != t) {
        throw t;
    }
    @Nullable
    final T x = result.result();
    return x;
}
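For illustration, a hypothetical start-up usage; vertx, router and the port are placeholders, and Future.future() plus listen(port, handler) are assumed to be available as in Vert.x 3.8.x, where Future can be passed directly as the result handler:
try {
    Future<HttpServer> listenFut = Future.future();
    vertx.createHttpServer().requestHandler(router).listen(8080, listenFut);
    // Deliberately blocks start-up until the HTTP server is listening (or fails to start).
    HttpServer server = awaitComplete(listenFut);
} catch (Throwable t) {
    throw new RuntimeException("HTTP server failed to start", t);
}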

List files under a folder within a Camel rest service

I am creating a REST service using the Camel Rest DSL. Within the service I need to list all files under a folder and do some processing on them. Please find the code below:
from("direct:postDocument")
.to("file:/home/s469457/service/content-util/composite?noop=true")
.setBody(constant(null))
.log("Scanning file ${file:name.noext}.${file:name.ext}...");
Please advise.
~ Arunava
I would suggest writing a processor or a bean to list the files in the directory. I think that would be more efficient and much simpler. Using Camel's file component, you would have to deal with intricacies you might not expect.
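For illustration, a minimal sketch of that processor approach (assuming this sits inside a RouteBuilder's configure() method, with the usual java.io and java.util imports, and reusing the directory path from the question):
from("direct:listFiles")
    .process(exchange -> {
        // List the plain files in the target directory and return their names, one per line.
        File dir = new File("/home/s469457/service/content-util/composite");
        File[] files = dir.listFiles(File::isFile);
        List<String> names = new ArrayList<>();
        if (files != null) {
            for (File f : files) {
                names.add(f.getName());
            }
        }
        exchange.getIn().setBody(String.join("\n", names));
    });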
Regardless, if you stay with the file component, you will need to do a pollEnrich and afterwards aggregate the whole result. I also think that you would run into trouble and will not be able to read the files multiple times; to solve that you might need to create an idempotent repository, but when reading files there might be concurrency/file-locking issues...
Here's some pseudocode to get you started if you want to go that way:
from("direct:listFiles")
.pollEnrich("file:"+getFullPath()+"?noop=true")
.aggregate(new AggregationStrategy {
public Exchange aggregate(Exchange oldExchange, Exchange newExchange) {
String filename = newExchange.getIn().getHeader("CamelFileName", String.class)
if (oldExchange == null) {
newExchange.getIn().setBody(new ArrayList<String>(Arrays.asList(filename)));
return newExchange;
} else {
...
}
})
// Camel REST API to list files
rest().path("/my-api/")
    .get()
    .produces("text/plain")
    .to("direct:listFiles");

// Camel route to list files
List<String> fileList = new ArrayList<String>();

from("direct:listFiles")
    .loopDoWhile(body().isNotNull())
        .pollEnrich("file:/home/s469457/service/content-util/composite?noop=true&recursive=true&idempotent=false&include=.*.csv")
        .choice()
            .when(body().isNotNull())
                .process(new Processor() {
                    @Override
                    public void process(Exchange exchange) throws Exception {
                        File file = exchange.getIn().getBody(File.class);
                        fileList.add(file.getName());
                    }
                })
            .otherwise()
                .process(new Processor() {
                    @Override
                    public void process(Exchange exchange) throws Exception {
                        if (fileList.size() != 0)
                            exchange.getOut().setBody(String.join("\n", fileList));
                        fileList.clear();
                    }
                })
    .end();

Vertx merge contents of multiple files in single file

What is the best way to append the contents of multiple files into a single file in Vert.x? I have tried the Vert.x FileSystem and AsyncFile, but neither has an option to append to a file, or I did not know of one. Is there any alternative approach to merge or append files in Vert.x asynchronously?
The only solution I could find is to build a list of buffers and, in a loop, write each one at the offset given by the lengths of the previous buffers.
Indeed, as of Vert.x 3.4, there is no helper method on FileSystem to append a file to another file.
You could do it with AsyncFile and Pump as follows.
First create a utility method to open files:
Future<AsyncFile> openFile(FileSystem fileSystem, String path, OpenOptions openOptions) {
    Future<AsyncFile> future = Future.future();
    fileSystem.open(path, openOptions, future);
    return future;
}
Then another one to append a file to another file with a Pump:
Future<AsyncFile> append(AsyncFile source, AsyncFile destination) {
    Future<AsyncFile> future = Future.future();
    Pump pump = Pump.pump(source, destination);
    source.exceptionHandler(future::fail);
    destination.exceptionHandler(future::fail);
    source.endHandler(v -> future.complete(destination));
    pump.start();
    return future;
}
Now you can combine those with sequential composition:
void merge(FileSystem fileSystem, String output, List<String> sources) {
    openFile(fileSystem, output, new OpenOptions().setCreate(true).setTruncateExisting(true).setWrite(true)).compose(outFile -> {
        Future<AsyncFile> mergeFuture = null;
        for (String source : sources) {
            if (mergeFuture == null) {
                mergeFuture = openFile(fileSystem, source, new OpenOptions()).compose(sourceFile -> {
                    return append(sourceFile, outFile);
                });
            } else {
                mergeFuture = mergeFuture.compose(v -> {
                    return openFile(fileSystem, source, new OpenOptions()).compose(sourceFile -> {
                        return append(sourceFile, outFile);
                    });
                });
            }
        }
        return mergeFuture;
    }).setHandler(ar -> {
        System.out.println("Done");
    });
}
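A possible usage example (the paths are placeholders; this assumes access to a Vertx instance, for example inside a verticle):
merge(vertx.fileSystem(), "/tmp/merged.log",
        Arrays.asList("/tmp/part1.log", "/tmp/part2.log", "/tmp/part3.log"));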

opennlp with netbeans is not giving output

How do I use OpenNLP with NetBeans? I made a small program as given in the Apache documentation, but it is not working. I have set the path to the OpenNLP bin as stated in the documentation, but I am still not getting any output: it is not able to find the .bin file, and hence the SentenceModel model.
package sp;

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

import opennlp.tools.sentdetect.SentenceModel;

public class Sp {

    public static void main(String[] args) throws IOException {
        InputStream modelIn;
        modelIn = new FileInputStream("en-token.bin");
        try {
            SentenceModel model = new SentenceModel(modelIn);
        } finally {
            if (modelIn != null) {
                try {
                    modelIn.close();
                } catch (IOException e) {
                }
            }
        }
    }
}
The current working directory when you run in NetBeans is the base project directory (the one with build.xml in it). Place your .bin file there, and you should be able to open the file like that.
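Alternatively, a small sketch that loads the model from the classpath, so the lookup does not depend on the working directory at all. This assumes the model file has been placed on the classpath (for example under the project's resources folder); note also that en-token.bin is a tokenizer model, so it pairs with opennlp.tools.tokenize.TokenizerModel rather than SentenceModel:
try (InputStream modelIn = Sp.class.getResourceAsStream("/en-token.bin")) {
    if (modelIn == null) {
        throw new FileNotFoundException("en-token.bin not found on the classpath");
    }
    TokenizerModel model = new TokenizerModel(modelIn);
}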

Threading concept

Can somebody help me on this:
private Thread workerThread;
private EventWaitHandle waitHandle;

if (workerThread == null)
{
    workerThread = new Thread(new ThreadStart(Work));
    workerThread.Start();
    //workerThread.Join();
}
else if (workerThread.ThreadState == ThreadState.WaitSleepJoin)
{
    waitHandle.Set();
}

private void Work()
{
    while (true)
    {
        string filepath = RetrieveFile();
        if (filepath != null)
            ProcessFile(filepath);
        else
            // If no files left to process then wait
            waitHandle.WaitOne();
    }
}

private void ProcessFile(string filepath)
{
    XMLCreation myXML = new XMLCreation();
    myXML.WriteXml(filepath, XMLFullFilePath);
}

private string RetrieveFile()
{
    if (workQueue.Count > 0)
        return workQueue.Dequeue();
    else
        return null;
}
This is how it all works: I have a file-watcher event that fires only when a new file is added to that folder. The problem is that this is a small part of a bigger application, and when the file watcher fires, another process is still accessing that file, so I get an error saying the file is being used by another process. I have tried to work around it with threading, but with the above code only some of the files are being processed, yet in the log I can see all the files being processed. Is this the right way to do it, or am I missing something?
Thanks in advance.
You will have to use a mutex to control who is accessing the file, so that only one process at a time works with it in the first place. If you think there is a possibility that more than one thread will be waiting to work with the same file, then you will have to implement a producer-consumer threading system with a queue.
Here is the best documentation about threads you can find in .NET:
http://www.albahari.com/threading/
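The question is C#, but as a language-agnostic sketch of the producer-consumer pattern described above, here is the same idea in Java using a BlockingQueue (the file-handling details are placeholders):
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class FileWorker {
    private final BlockingQueue<Path> workQueue = new LinkedBlockingQueue<>();

    // Producer: called by the file-watcher callback whenever a new file shows up.
    public void enqueue(String filepath) {
        workQueue.offer(Paths.get(filepath));
    }

    // Consumer: a single worker thread takes files one at a time, so only one
    // file is processed at any given moment.
    public void startWorker() {
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    Path file = workQueue.take(); // blocks until work is available
                    processFile(file);            // placeholder for the real processing
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        worker.setDaemon(true);
        worker.start();
    }

    private void processFile(Path file) {
        // Real work would go here.
        System.out.println("Processing " + file);
    }
}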