How to extract F2FS file system based disk img files? - android-source

I am writing a program on Windows (C#) to extract disk images from AOSP. I am using simg2img, but to no avail: simg2img only supports EXT4-based file systems, and some of my IMG files are written in F2FS. How do I extract these files? I have been stuck with this problem for weeks; if anyone can give any direction it would be helpful!
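For reference, one commonly suggested route is to unsparse the image with simg2img and then mount the resulting raw image on a Linux system (or WSL), since Windows cannot read F2FS natively. A minimal sketch, assuming system.img is an Android sparse image and /mnt/f2fs is a placeholder mount point:
simg2img system.img system.raw.img
sudo mount -t f2fs -o loop,ro system.raw.img /mnt/f2fs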

Related

How to convert las file to ply file?

I want to open my 3D point clouds in MATLAB, but they are stored in .las files. How can I display them in MATLAB?
I heard that a .ply file can hold 3D point data for MATLAB, so I want to know how to convert .las files to .ply files.
There is a .las file reader for MATLAB here:
https://es.mathworks.com/matlabcentral/fileexchange/48073-lasdata
Once you have the data in MATLAB you can use these point cloud tools, which are part of the Computer Vision Toolbox:
https://es.mathworks.com/help/vision/3-d-point-cloud-processing.html
If you want to embrace the open-source force, I'm writing a Python (easy transition from MATLAB) library for point cloud processing:
https://github.com/daavoo/pyntcloud
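If you go the MATLAB route, a minimal sketch combining the reader and the point cloud tools linked above might look like the following; the lasdata property names (x, y, z) are an assumption to check against its File Exchange page, and pcwrite is the Computer Vision Toolbox function that writes PLY files:
% Read the LAS file with the File Exchange lasdata reader (assumed API)
c = lasdata('cloud.las');
xyz = [c.x, c.y, c.z];                       % N-by-3 point coordinates (property names assumed)
% Wrap the points in a pointCloud object and write a binary PLY file
ptCloud = pointCloud(xyz);
pcwrite(ptCloud, 'cloud.ply', 'PLYFormat', 'binary');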
You can use the free and open-source CloudCompare software.
On the command line:
CloudCompare -O file_to_convert.las -C_EXPORT_FMT PLY -SAVE_CLOUDS
Take care with the order of the options: it seems that -SAVE_CLOUDS must come last.
That will result in a binary-format PLY file in the same directory as the file to convert, named using the original filename and the date of export, like: file_to_convert_2019-07-18_13h32_06_751.ply
I found no way to specify the output file name (should you find one please comment below).
Should you want a more predictable name, add option -NO_TIMESTAMP before the option -SAVE_CLOUDS (but then you risk overwriting files so be careful).
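For example, to get a predictable output name:
CloudCompare -O file_to_convert.las -C_EXPORT_FMT PLY -NO_TIMESTAMP -SAVE_CLOUDS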
More help (such as how to export in ASCII format) is in the documentation.
I timed this on a powerful PC: it took 170 s to convert a 2.7 GB LAS file with 102 M points (XYZ, intensity, time).
If you have LAStools installed, you can use las2txt to convert your *.las/*.laz files into *.xyz format, which MeshLab can import natively as a point cloud and then convert into a mesh; an example command is sketched below.
There are a multitude of caveats to that, depending on the source of your dataset.
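For example (the -parse string here is an assumption; check the las2txt README for the letters matching your attributes):
las2txt -i input.las -o output.xyz -parse xyz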

Is there a way to convert/export tracking files (.trc files) from Media Cybernetics Image Pro Plus to a .mat file type?

I am a biology graduate student trying to export these files so that they can be used with JAABA, a MATLAB-based automated behavior classification software. It looks like there is no direct way to save .trc files as .mat (http://www.mediacy.com/imageproplus/specifications). At the very least I would like to figure out a way to read the format of the .trc files so that I could write a script to get them to make sense for JAABA. If anyone is familiar with either of these programs or both, or could simply point to a good way to write an importer (definitely outside of my skill set), I would be very grateful.
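A possible starting point, in case the .trc files turn out to be plain delimited text (worth checking by opening one in a text editor): a rough MATLAB sketch like the one below could pull the raw data into a .mat file. The file names, the variable name trx and the assumption that readtable can auto-detect the delimiter are placeholders, not Image Pro Plus specifics.
% Peek at the first few lines to see whether the file is readable text
fid = fopen('tracks.trc', 'r');
for k = 1:5
    disp(fgetl(fid));
end
fclose(fid);
% If it is delimited text, import it and repackage it as a .mat file
T = readtable('tracks.trc', 'FileType', 'text');    % delimiter auto-detected (assumption)
trx = table2struct(T);                              % field/variable names are placeholders
save('tracks.mat', 'trx');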

pkl file for customized image data in pylearn2

I'm a total novice with pylearn2, and right now I'm working on the BraTS database, in which we work with 3D brain MRI for tumor segmentation using pylearn2, the same way as explained in the pylearn2 tutorial for the CIFAR10 database.
My problem is that all the volumetric images in the database are in .mha format, and in order to use a pylearn2 CNN I must have either a .pkl file or the binary files for the image database. The "Image" module can't read .mha files.
Can anybody tell me how to work with .mha files in pylearn2 and how to generate a .pkl file for them?
You could try converting your files from .mha to .png and then making a .pkl file.

MATLAB freezes at certain files when dicomread is used

This problem has been bothering me for a long time and I hope that someone is able to help me. I've searched the internet extensively, but it seems that I'm the only one with this problem.
Occasionally, when I'm loading multiple DICOM files into MATLAB, it freezes at a certain file. I'm unable to terminate the script and I have to force MATLAB to shut down. I do not know if this is a bug, but I hope that there is a workaround, because dicomread does not return an error but freezes MATLAB.
More information:
It happens with multiple datasets from different organisations
It happens on multiple computers
MATLAB versions 2013b/2014a/2014b
I hope that somebody can help me to fix this or find a workaround.
I had the same problem and I am using MATLAB 2014. The same code runs fine on MATLAB 2012.
I solved the issue by copying the DICOM library from MATLAB 2012 to 2014. If you have a Windows machine, the library in the 2012 version is typically installed at
C:\Program Files\MATLAB\R2012a\toolbox\images\iptformats
The 2014 version is at
C:\Program Files\MATLAB\R2014a\toolbox\images\iptformats
Identical problem here with 3D CT scans. I had hundreds of scans stored as DICOM folders (1 file per slice) that I converted to DICOM volumes (1 file per entire volume) with compression. Six of them would trigger a segmentation fault in the dicomparse call inside dicomread, even though I had no problem reading them in other software tools.
The easiest workaround for me was to re-export these DICOMs as uncompressed DICOM volumes with a different software tool.

Read a very large text file in MATLAB (~30 GB)

I have some results in text format, including a text header. The files are about 15-50 GB. I want to import this data into MATLAB for processing. Could you give me some advice on what command I should use for such a big file?
You may use the ROOT software that is available on the CERN website.
It was developed to work with very large files (say > 1 TB).
I solved this problem with the textscan function, but my text files were around 5 GB. The computer had 16 GB of RAM, which wasn't enough, so it had to use pagefile.sys. Reading time was about 60 minutes.
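A rough sketch of chunked reading with textscan, so the whole file never has to sit in memory at once; the file name, format string and block size are assumptions to adapt to your columns:
fid = fopen('results.txt', 'r');
headerLine = fgetl(fid);                 % read (or skip) the text header
blockSize = 1e6;                         % rows per chunk; tune to your RAM
while ~feof(fid)
    C = textscan(fid, '%f %f %f', blockSize, 'CollectOutput', true);
    block = C{1};                        % up to blockSize-by-3 numeric matrix
    % ... process or accumulate results from this block here ...
end
fclose(fid);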