How to read a PDF file in Vert.x?

I am new to Vert.x and I want to read a PDF using the "GET" method. I know that a buffer will be used, but there are no resources on the internet on how to do that.

Omitting the details of how you would get the file from your data store (Couchbase DB), it is fair to assume the data is read correctly into a byte[].
Once the data is read, you can wrap it in an io.vertx.core.buffer.Buffer and write it to the HttpServerResponse as follows:
public void sendPDFFile(byte[] fileBytes, HttpServerResponse response) {
    Buffer buffer = Buffer.buffer(fileBytes);
    response.putHeader("Content-Type", "application/pdf")
            .putHeader("Content-Length", String.valueOf(buffer.length()))
            .setStatusCode(200)
            .end(buffer);
}
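For completeness, here is a minimal sketch of wiring that method into a GET route with Vert.x Web (this assumes the vertx-web dependency). The route path, port and the loadPdfBytes() lookup are placeholders for illustration, not part of the original question:
Vertx vertx = Vertx.vertx();
Router router = Router.router(vertx);

// Hypothetical GET route; loadPdfBytes() stands in for your Couchbase lookup.
router.get("/documents/:id").handler(ctx -> {
    byte[] fileBytes = loadPdfBytes(ctx.pathParam("id"));
    sendPDFFile(fileBytes, ctx.response());
});

vertx.createHttpServer().requestHandler(router).listen(8080);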

Related

Google Cloud Storage atomic creation of a Blob

I'm using the hadoop-connectors project for writing BLOBs to Google Cloud Storage.
I'd like to make sure that a BLOB with a specific target name that is written in a concurrent context is either written in full or not visible at all if an exception occurs while writing.
In the code below, if an I/O exception occurs, the BLOB written will still appear on GCS because the stream is closed in the finally block:
val stream = fs.create(path, overwrite)
try {
  actions.map(_ + "\n").map(_.getBytes(UTF_8)).foreach(stream.write)
} finally {
  stream.close()
}
The other possibility would be to not close the stream and let it "leak", so that the BLOB does not get created. However, this is not really a valid option.
val stream = fs.create(path, overwrite)
actions.map(_ + "\n").map(_.getBytes(UTF_8)).foreach(stream.write)
stream.close()
Can anybody share with me a recipe on how to write to GCS a BLOB either with hadoop-connectors or cloud storage client in an atomic fashion?
I have used reflection within hadoop-connectors to retrieve an instance of com.google.api.services.storage.Storage from the GoogleHadoopFileSystem instance
GoogleCloudStorage googleCloudStorage = ghfs.getGcsFs().getGcs();
Field gcsField = googleCloudStorage.getClass().getDeclaredField("gcs");
gcsField.setAccessible(true);
Storage gcs = (Storage) gcsField.get(googleCloudStorage);
in order to have the ability to make a call based on an input stream corresponding to the data in memory.
private static StorageObject createBlob(URI blobPath, byte[] content, GoogleHadoopFileSystem ghfs, Storage gcs)
    throws IOException
{
    CreateFileOptions createFileOptions = new CreateFileOptions(false);
    CreateObjectOptions createObjectOptions = objectOptionsFromFileOptions(createFileOptions);
    PathCodec pathCodec = ghfs.getGcsFs().getOptions().getPathCodec();
    StorageResourceId storageResourceId = pathCodec.validatePathAndGetId(blobPath, false);
    StorageObject object =
        new StorageObject()
            .setContentEncoding(createObjectOptions.getContentEncoding())
            .setMetadata(encodeMetadata(createObjectOptions.getMetadata()))
            .setName(storageResourceId.getObjectName());
    InputStream inputStream = new ByteArrayInputStream(content, 0, content.length);
    Storage.Objects.Insert insert = gcs.objects().insert(
        storageResourceId.getBucketName(),
        object,
        new InputStreamContent(createObjectOptions.getContentType(), inputStream));
    // The operation succeeds only if there are no live versions of the blob.
    insert.setIfGenerationMatch(0L);
    insert.getMediaHttpUploader().setDirectUploadEnabled(true);
    insert.setName(storageResourceId.getObjectName());
    return insert.execute();
}
/**
 * Helper for converting from a Map<String, byte[]> metadata map that may be in a
 * StorageObject into a Map<String, String> suitable for placement inside a
 * GoogleCloudStorageItemInfo.
 */
@VisibleForTesting
static Map<String, String> encodeMetadata(Map<String, byte[]> metadata) {
    return Maps.transformValues(metadata, QuickstartParallelApiWriteExample::encodeMetadataValues);
}

// A function to encode metadata map values
private static String encodeMetadataValues(byte[] bytes) {
    return bytes == null ? Data.NULL_STRING : BaseEncoding.base64().encode(bytes);
}
Note in the example above that even if there are multiple callers trying to create a blob with the same name in parallel, one and only one will succeed in creating the blob. The other callers will receive 412 Precondition Failed.
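As a hedged sketch of what the losing callers see: the execute() call raises an HTTP-level exception from the google-api-client that can be inspected for the 412 status (the surrounding variable names are placeholders matching the method above):
try {
    StorageObject created = createBlob(blobPath, content, ghfs, gcs);
} catch (GoogleJsonResponseException e) {
    if (e.getStatusCode() == 412) {
        // Precondition Failed: another writer created the blob first, so treat it as "already exists".
    } else {
        throw e;
    }
}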
GCS objects (blobs) are immutable, which means they can be created, deleted or replaced, but not appended.
The Hadoop GCS connector provides the HCFS interface, which gives the illusion of appendable files. But under the hood it is just one blob creation; GCS doesn't know whether the content is complete from the application's perspective, just as you mentioned in the example. There is no way to cancel a file creation.
There are 2 options you can consider:
Create a temp blob/file, copy it to the final blob/file, then delete the temp blob/file. Note that there is no atomic rename operation in GCS; rename is implemented as copy-then-delete.
If your data fits into memory, first read up the stream and buffer the bytes in memory, then create the blob/file.
The GCS connector should also work with the two options above, but I think the GCS client library gives you more control.
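If you go with the Cloud Storage client library instead of the connector, a minimal sketch of an atomic create would use the doesNotExist precondition, which plays the same role as setIfGenerationMatch(0L) above. The bucket, object name and payload below are placeholders:
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageException;
import com.google.cloud.storage.StorageOptions;
import java.nio.charset.StandardCharsets;

Storage storage = StorageOptions.getDefaultInstance().getService();
byte[] content = "line1\nline2\n".getBytes(StandardCharsets.UTF_8); // placeholder payload
BlobInfo blobInfo = BlobInfo.newBuilder("my-bucket", "path/to/blob").build();
try {
    // Succeeds only if no live version of the object exists; otherwise fails with 412.
    storage.create(blobInfo, content, Storage.BlobTargetOption.doesNotExist());
} catch (StorageException e) {
    // e.getCode() == 412 means another writer created the blob first.
}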

Not Reading bytes properly during FTP transfer in Spring Batch

I am doing a project where I have to efficiently transfer data (any file) from one endpoint (HTTP, FTP, SFTP) to another. I want to use Spring Batch's concurrency and parallelism features for jobs. In my case, one file will be one job. So I am trying to read a file (any extension) from FTP (running locally) and write it to the same FTP in a different folder.
My Reader has:
FlatFileItemReader<byte[]> reader = new FlatFileItemReader<>();
reader.setResource(new UrlResource("ftp://localhost:2121/source/1.txt"));
reader.setLineMapper((line, lineNumber) -> {
    return line.getBytes();
});
And Writer has:
URL url = new URL("ftp://localhost:2121/dest/tempOutput/TransferTest.txt");
URLConnection conn = url.openConnection();
DataOutputStream out = new DataOutputStream(conn.getOutputStream());
for (byte[] b : bytes) { // I am getting List<byte[]> in my writer
    out.write(b);
}
out.close();
In the case of a text file, all content shows up on one line (the newline characters are omitted), and in the case of a video file, bytes are missing/corrupted, as the video does not play at the destination.
What am I doing wrong, or is there a better way to transfer a file (irrespective of its extension)?
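One plausible explanation is that FlatFileItemReader is line-oriented: the LineMapper receives each line with its terminator already stripped, which flattens text files and is simply the wrong model for binary data such as video. A hedged sketch of a plain byte-stream copy that preserves the content exactly (using the FTP URLs from the question):
URL source = new URL("ftp://localhost:2121/source/1.txt");
URL target = new URL("ftp://localhost:2121/dest/tempOutput/TransferTest.txt");
try (InputStream in = source.openStream();
     OutputStream out = target.openConnection().getOutputStream()) {
    byte[] buffer = new byte[8192];
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read); // copy raw bytes, no line interpretation
    }
}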

Read large file using vertx

I am new to Vert.x and I am using the Vert.x filesystem API to read a large file.
vertx.fileSystem().readFile("target/classes/readme.txt", result -> {
    if (result.succeeded()) {
        System.out.println(result.result());
    } else {
        System.err.println("Oh oh ..." + result.cause());
    }
});
But all the RAM is consumed while reading, and the resource is not even released after use. The Vert.x filesystem API also warns:
Do not use this method to read very large files or you risk running out of available RAM.
Is there any alternative to this?
To read a large file you should open an AsyncFile:
OpenOptions options = new OpenOptions();
fileSystem.open("myfile.txt", options, res -> {
    if (res.succeeded()) {
        AsyncFile file = res.result();
    } else {
        // Something went wrong!
    }
});
An AsyncFile is a ReadStream, so you can use it together with a Pump to copy the bits to a WriteStream:
Pump.pump(file, output).start();
file.endHandler((r) -> {
    System.out.println("Copy done");
});
There are different kinds of WriteStream: AsyncFile, net sockets, HTTP server responses, etc.
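Put together, a minimal sketch of a file-to-file copy with a Pump could look like the following (the file names are placeholders; Pump lives in io.vertx.core.streams):
fileSystem.open("myfile.txt", new OpenOptions(), res -> {
    if (res.succeeded()) {
        AsyncFile source = res.result();
        fileSystem.open("copy-of-myfile.txt", new OpenOptions(), res2 -> {
            if (res2.succeeded()) {
                AsyncFile destination = res2.result();
                Pump.pump(source, destination).start();
                source.endHandler(v -> {
                    source.close();
                    destination.close();
                    System.out.println("Copy done");
                });
            } else {
                // Something went wrong opening the destination!
            }
        });
    }
});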
To read/process a large file in chunks you need to use the open() method, which will return an AsyncFile on success. On this AsyncFile you call setReadBufferSize() (or not, the default is 8192) and attach a handler(), which will be passed a Buffer of at most the size of the read buffer you just set.
In the example below I have also attached an endHandler() to print a final newline, to stay in line with the sample code you provided in the question:
vertx.fileSystem().open("target/classes/readme.txt", new OpenOptions().setWrite(false).setCreate(false), result -> {
    if (result.succeeded()) {
        result.result().setReadBufferSize(READ_BUFFER_SIZE)
            .handler(data -> System.out.print(data.toString()))
            .endHandler(v -> System.out.println());
    } else {
        System.err.println("Oh oh ..." + result.cause());
    }
});
You need to define READ_BUFFER_SIZE somewhere of course.
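If the goal is to serve the file over HTTP rather than print it, a hedged variant of the same idea (the port and error handling are assumptions) writes each chunk to a chunked HttpServerResponse:
vertx.createHttpServer().requestHandler(request -> {
    HttpServerResponse response = request.response().setChunked(true);
    vertx.fileSystem().open("target/classes/readme.txt", new OpenOptions().setWrite(false).setCreate(false), result -> {
        if (result.succeeded()) {
            AsyncFile file = result.result();
            file.setReadBufferSize(READ_BUFFER_SIZE)
                .handler(response::write) // each Buffer chunk goes straight to the response
                .endHandler(v -> { file.close(); response.end(); });
        } else {
            response.setStatusCode(500).end(String.valueOf(result.cause()));
        }
    });
}).listen(8080);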
The reason for the RAM consumption is that internally .readFile calls Files.readAllBytes.
What you should do instead is create a stream out of your file and pass it to a Vert.x handler:
try (InputStream stream = new FileInputStream("target/classes/readme.txt")) {
    // Your handling here
}

Compact Framework - Upload file via REST

I am looking for the best way to transfer files from the Compact Framework to a server via REST. I have a web service I created using the .NET Web API. I've looked at several SO questions and other sites that dealt with sending files, but none of them seem to work for what I need.
I am trying to send media files from WM 6 and 6.5 devices to my REST service. While most of the files are less than 300k, the odd few may be 2-10 or so megabytes. Does anyone have some snippets I could use to make this work?
Thanks!
I think this is the minimum for sending a file:
using (var fileStream = File.Open(@"\file.txt", FileMode.Open, FileAccess.Read, FileShare.Read))
{
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create("http://www.destination.com/path");
    request.Method = "POST"; // or PUT, depending on what the server expects
    request.ContentLength = fileStream.Length; // see the note below

    using (var requestStream = request.GetRequestStream())
    {
        int bytes;
        byte[] buffer = new byte[1024]; // any reasonable buffer size will do
        while ((bytes = fileStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            requestStream.Write(buffer, 0, bytes);
        }
    }

    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
        }
    }
    catch (WebException ex)
    {
        // failure
    }
}
Note: HTTP needs a way to know when you're "done" sending data. There are three ways to achieve this:
Set request.ContentLength, as used in the example, because we know the size of the file before sending anything.
Set request.SendChunked to send chunks of data including their individual size.
You could also set request.AllowWriteStreamBuffering to write to an in-memory buffer, but I wouldn't recommend wasting that much memory on the Compact Framework.

How to get richtext box value in other form?

I am creating a simple C# Windows application and I want to access my RichTextBox value in another form. I am able to access the RichTextBox, but when I try to access its value it gives me null.
Any suggestion is helpful.
If you use the WinForms RichTextBox then you can simply do the following:
richtextbox2.Text = richtextbox1.Text;
If you use the WPF RichTextBox then look at the MSDN examples for loading and saving TextRanges from/to streams.
save: http://msdn.microsoft.com/en-us/library/ms598701.aspx
load: http://msdn.microsoft.com/en-us/library/system.windows.documents.textrange.load.aspx
Here are the code examples from MSDN:
// This method accepts an input stream and a corresponding data format. The method
// will attempt to load the input stream into a TextRange selection, apply Bold formatting
// to the selection, save the reformatted selection to an alternate stream, and return
// the reformatted stream.
Stream BoldFormatStream(Stream inputStream, string dataFormat)
{
    // A text container to read the stream into.
    FlowDocument workDoc = new FlowDocument();
    TextRange selection = new TextRange(workDoc.ContentStart, workDoc.ContentEnd);
    Stream outputStream = new MemoryStream();

    try
    {
        // Check for a valid data format, and then attempt to load the input stream
        // into the current selection. Note that CanLoad ONLY checks whether dataFormat
        // is a currently supported data format for loading a TextRange. It does not
        // verify that the stream actually contains the specified format. An exception
        // may be raised when there is a mismatch between the specified data format and
        // the data in the stream.
        if (selection.CanLoad(dataFormat))
            selection.Load(inputStream, dataFormat);
    }
    catch (Exception e) { return outputStream; /* Load failure; return an empty stream. */ }

    // Apply Bold formatting to the selection, if it is not empty.
    if (!selection.IsEmpty)
        selection.ApplyPropertyValue(TextElement.FontWeightProperty, FontWeights.Bold);

    // Save the formatted selection to a stream, and return the stream.
    if (selection.CanSave(dataFormat))
        selection.Save(outputStream, dataFormat);

    return outputStream;
}