I would like to take a 512x512 image and convert it into a png byte array in Matlab so that I can stream it via a socket.
At the moment I take the array, write it to a PNG file using imwrite(I,'file.png'), then read it back as a binary file and send it through the socket. This is obviously inefficient because I first write to disk and then read from disk. I want to skip the round trip to disk entirely.
Is there any way to do this in MATLAB?
Probably not directly using the base MATLAB toolbox since the PNG file itself is created by the PNGWRITEC MEX-function. However, there may be some Java classes that can help, such as those in the javax.imageio package.
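For example, here is an untested sketch of that Java route, encoding a grayscale uint8 image to PNG bytes entirely in memory. The class and method names are standard java.awt / javax.imageio API; the transpose and typecast details are assumptions that may need adjusting for RGB or other data types.

I = uint8(randi([0 255], 512, 512));   % placeholder 512x512 grayscale image

% Copy the pixels into a Java BufferedImage (transpose because MATLAB is
% column-major while the Java raster expects row-major bytes).
bi = java.awt.image.BufferedImage(size(I,2), size(I,1), ...
    java.awt.image.BufferedImage.TYPE_BYTE_GRAY);
pixels = typecast(reshape(I', [], 1), 'int8');   % reinterpret bytes for the Java byte[] raster
bi.getRaster().setDataElements(0, 0, size(I,2), size(I,1), pixels);

% Encode to PNG into an in-memory byte stream instead of a file.
baos = java.io.ByteArrayOutputStream();
javax.imageio.ImageIO.write(bi, 'png', baos);
pngBytes = typecast(baos.toByteArray(), 'uint8');   % byte array ready to send over the socket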
Related
I have a file in AVL file format from a program called ArcGIS (formerly ArcView) that I need to convert to a MAT file. It contains data I need to play back. Can anybody suggest a simple way to convert the file? I have done quite a bit of searching, and it seems that AVL is not a binary format but a text format, which means I could write a program to convert it, but only if I also knew the corresponding MATLAB MAT file format. Moreover, this could take quite a while to do, and I need the file converted quickly so I can use the data.
My MATLAB script generates a figure from timeseries data that, when saved, is over 200 MB in size. Is there a way to compress the figure to a smaller size in '*.fig' format? The compression has to be lossless so that I can zoom in and view the details in the figure. The figure has to be saved in *.fig format so that the axis property relations between subplots are preserved and I can use the data cursor tool.
The *.fig format cannot be saved in compressed form as is; the format is just not capable of it. But in MATLAB you can use the zip function to compress files created by savefig, and unzip them before passing them to openfig. This way you can create a simple script to load and save zipped figs. Of course, you will need a temporary file, which should be cleaned up as well.
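A rough sketch of that idea, wrapping savefig/openfig with zip/unzip through a temporary .fig file (the function names savezippedfig and openzippedfig are made up for illustration):

function savezippedfig(fig, zipname)
    tmp = [tempname '.fig'];        % temporary .fig file on disk
    savefig(fig, tmp);
    zip(zipname, tmp);              % lossless compression of the .fig file
    delete(tmp);                    % clean up the temp file
end

function fig = openzippedfig(zipname)
    tmpdir = tempname;
    files = unzip(zipname, tmpdir); % extract the .fig into a temp dir
    fig = openfig(files{1});
    rmdir(tmpdir, 's');             % clean up the temp dir
end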
I have to analyze bioacoustic audio files using MATLAB. Eventually I want to be able to find anomalies in the audio. That's why I need a way to represent the audio that lets me extract and compare features. I'm dealing with mp3 files up to 150 MB. These files are too large for MATLAB to read into its memory, so I want to use the memmapfile() function. I used the following code and a small mp3 file to find out how it actually works.
[testR, ~] = audioread('test.mp3');   % decoded samples: N-by-channels double matrix
testM = memmapfile('test.mp3');       % memory map of the raw file bytes
disp(testM.Data);                     % uint8 byte values of the mp3 file
disp(testR);                          % normalized audio samples
The actual values of testM.Data and testR are different: audioread() returns a 7483391 x 2 matrix and memmapfile() a 4113874 x 1 matrix.
I'm not really sure how memmapfile() works; I expected these to be equal. Is there a way to read mp3 files in the same format audioread() does using memmapfile()? And what does memmapfile() actually return in the case of an audio file? Maybe it's also usable in this vector format for anomaly detection?
Thanks in advance!
NOTE: The original files were in WAV IMA ADPCM format with sizes from 1.5 up to 2.5 GB. Since MATLAB can't deal with that format and the size of the files, I converted them to 8-bit mp3 files.
I think the problem is that memmapfile by default reads the data in uint8 format, while the audioread function reads the data in another way.
As you can see here, you can specify the format of the data when you read it with memmapfile, so try "playing" with different values. From the documentation I read that you can read the data in double format, so try modifying the memmapfile data format and the audioread data format.
One last thing: memmapfile always organizes the data as an "N x 1" column, so if you want the original shape you need to use something like reshape.
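For instance, a small sketch of specifying a Format (the file name, field name 'samples', and the sizes are made up). Note that this maps the raw bytes on disk, so it can only line up with audioread for uncompressed/headerless PCM data, not for the encoded bytes of an mp3:

m = memmapfile('audio.dat', ...
    'Format', {'int16', [2 1000000], 'samples'});   % 2 channels x 1e6 frames
x = double(m.Data(1).samples)' / 32768;             % reshape and scale toward audioread-style output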
Anyway, if you work with big data I suggest you try something other than memmapfile, because it can be very, very slow.
I have applied the JPEG baseline compression algorithm by writing each step in MATLAB. Now I have the JPEG-compressed image data in binary form and the header to be appended. Please tell me how to make a file that would be recognized as a JPEG file by the OS. Should it be a binary file, or what is the process?
Regards
You are going to need to read two things:
1) The JPEG standard
2) The standard for some file format (e.g., JFIF, EXIF).
You are going to need to have a JPEG file header (see the file format standards). You are going to have to create DHT, DQT, SOF, and SOS markers for the compressed data (JPEG standard).
All of the data is in binary format. You have to remember to byte-stuff the compressed data stream: every FF byte must be followed by a 00 (FF becomes FF 00).
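A hedged sketch of assembling such a file in MATLAB. The variables compressedData, jfifHeader, dqtSeg, sofSeg, dhtSeg, and sosSeg are assumed to already hold the byte vectors of the marker segments you built; only the byte stuffing and the SOI/EOI markers are shown concretely.

scan = uint8(compressedData(:));                % your entropy-coded bytes
stuffed = zeros(1, 2*numel(scan), 'uint8');     % worst case: every byte is FF
k = 0;
for b = scan'
    k = k + 1; stuffed(k) = b;
    if b == 255                                 % byte stuffing: FF -> FF 00
        k = k + 1; stuffed(k) = 0;
    end
end
stuffed = stuffed(1:k);

fid = fopen('out.jpg', 'wb');                   % plain binary file
fwrite(fid, uint8([255 216]));                  % SOI marker (FFD8)
fwrite(fid, [jfifHeader dqtSeg sofSeg dhtSeg sosSeg], 'uint8');
fwrite(fid, stuffed, 'uint8');                  % stuffed entropy-coded data
fwrite(fid, uint8([255 217]));                  % EOI marker (FFD9)
fclose(fid);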
I'm finding that writing and reading the native MAT file format becomes very, very slow with larger data structures, around 1 GB in size. In addition, we have other, non-MATLAB software that should be able to read and write these files. So I would like to find an alternative format for serializing MATLAB data structures. Ideally this format would ...
be able to represent an arbitrary MATLAB structure in a file.
have faster I/O than MAT files.
have I/O libraries for other languages like Java, Python, and C++.
Simplifying your data structures and using the new v7.3 MAT file format, which is a variant of HDF5, might actually be the best approach. The HDF5 format is open and already has I/O libraries for your other languages. And depending on your data structure, they may be faster than the old binary mat files.
Simplify the data structures you're saving, preferring large arrays of primitives to complex container structures.
Try turning off compression if your data structures are still complex.
Try the v7.3 MAT file format by passing "-v7.3" to save (see the sketch after this list).
If using a network file system, consider saving and loading to a temporary dir on a fast local drive and copying to/from the network.
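A minimal sketch of the save-related suggestions (variable and file names are placeholders; the '-nocompression' option applies only to -v7.3 files and only in newer MATLAB releases):

bigArray = rand(4096, 4096);                    % large primitive array
save('data_v73.mat', 'bigArray', '-v7.3');      % HDF5-based MAT format
% save('data_v73.mat', 'bigArray', '-v7.3', '-nocompression');  % if your release supports it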
For large data structures, your MAT file I/O speed may be determined more by the internal structure of the data you're writing out than the size of the resulting MAT file itself. (In my experience, this has usually been the major factor in slow MAT files.) When you say "arbitrary Matlab structure", that suggests you might be using cells, structs, or objects to make complex data structures. That slows down MAT I/O because there is per-array overhead in MAT file I/O, and the members of cell and struct arrays (container types) all count as separate arrays. For example, 5,000 strings stored in a cellstr are much, much slower than the same 5,000 strings stored in a 2-D char array. And objects have even more overhead. As a test, try writing out a 1 GB file that contains just a 1 GB primitive array of random uint8s, and see how long that takes. From there, see if you can simplify your data to reduce the total mxarray count, even if that means reshaping it for serialization. (My experience with this is mostly with the v7 format; the newer HDF5 format may have less per element overhead.)
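A quick way to run that test, comparing one big primitive array against the same bytes split across many separate arrays in a cell (sizes are illustrative):

raw = randi(255, 1, 2^30, 'uint8');                     % ~1 GB of random uint8s
tic; save('raw.mat', 'raw', '-v7.3'); toc               % single primitive array

chunks = mat2cell(raw, 1, repmat(2^20, 1, 2^10));       % the same data as 1024 separate arrays
tic; save('chunks.mat', 'chunks', '-v7.3'); toc         % per-array overhead shows up here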
If your data files live on the network, you could also try doing the save and load operations on temporary files on fast local drives, and separately using copy operations to move them back and forth between the network. At least on Windows networks, I've seen speedups of up to 2x from doing this. Possibly due to optimizations the full-file copy operation can do that the MAT I/O code can't.
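A sketch of that local-temp-dir trick, assuming the final destination is a network share (the UNC path is a placeholder):

localFile = fullfile(tempdir, 'results.mat');
save(localFile, 'bigArray', '-v7.3');                   % fast save on the local drive
copyfile(localFile, '\\server\share\results.mat');      % one bulk copy over the network
delete(localFile);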
It would probably be a substantial effort to come up with an alternate file format that supported fully arbitrary Matlab data structures and was portable to other languages. I'd try making smaller changes around your use of the existing format first.
The MAT format has changed across MATLAB versions. v7.3 uses the HDF5 format, which has built-in compression and other features, but it can take a long time to read/write. However, you can force MATLAB to use earlier formats, which are faster (but might take more space).
See here:
http://www.mathworks.com/help/matlab/import_export/mat-file-versions.html
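For instance, a minimal sketch of forcing an older, non-HDF5 version when saving ('bigArray' is a placeholder variable):

save('data_v7.mat', 'bigArray', '-v7');    % older binary format, often faster to write
% save('data_v6.mat', 'bigArray', '-v6');  % even older format, no compression at all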