Is Parallel Computing Toolbox Supported by Matlab Coder?

I am developing a parallel application using the MATLAB Parallel Computing Toolbox. Can an application developed with Parallel Computing Toolbox be converted to C/C++ by MATLAB Coder?

MATLAB Coder doesn't support Parallel Computing Toolbox. See http://www.mathworks.co.uk/products/matlab-coder/description2.html for the supported functionality.
You can, however, deploy applications that use Parallel Computing Toolbox with MATLAB Compiler.

MATLAB Coder does in fact support PARFOR for code generation as of R2013b:
http://www.mathworks.com/help/coder/ref/parfor.html
http://www.mathworks.com/help/coder/release-notes.html
It has been supported in MEX since R2012b (see release notes again).
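As a minimal sketch (the function name and input size are placeholders, not from the question), a parfor loop inside a function marked for code generation can be compiled like this; MATLAB Coder typically maps the loop onto OpenMP threads when the target C compiler supports it:
% sum_of_squares.m - illustrative example for code generation
function s = sum_of_squares(x) %#codegen
s = 0;
parfor i = 1:numel(x)
    s = s + x(i)^2;   % reduction variable, supported inside parfor
end
end
% Generate a standalone C library (R2013b+), or drop -config:lib for a MEX file (R2012b+):
codegen sum_of_squares -args {zeros(1,1000)} -config:lib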

Related

interfacing cuSolver with MATLAB

I would like to use cuSolver for eigenvalue decomposition of complex matrices in MATLAB.
I am using the MATLAB CUDA kernel interface, and it seems that it is not possible to interface cuSolver this way, because the cuSolver example contains host code as well as device code (as shown here: http://docs.nvidia.com/cuda/cusolver/#syevd-example1), while the MATLAB CUDA kernel interface works only with kernel functions.
Please comment.
Is there any other way to compute the eigenvalue decomposition of a large number of complex matrices in parallel on the GPU from within the MATLAB environment?
You almost certainly need to use the MEX interface. This allows you to take in gpuArray data, and call kernels and other CUDA library functions.
See the doc: http://uk.mathworks.com/help/distcomp/run-mex-functions-containing-cuda-code.html for more.
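A rough sketch of that workflow, assuming you write a CUDA MEX source eig_cusolver_mex.cu (the name is a placeholder) that wraps the cuSOLVER call through the mxGPUArray API:
mexcuda eig_cusolver_mex.cu -lcusolver          % compile the CUDA MEX file and link against cuSOLVER
A = gpuArray(complex(rand(512), rand(512)));    % example complex matrix on the GPU
[V, D] = eig_cusolver_mex(A);                   % inputs and outputs stay on the GPU as gpuArrays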

Accelerating MATLAB code using GPUs?

AccelerEyes announced in December 2012 that it is working with MathWorks on GPU code and has discontinued its product Jacket for MATLAB:
http://blog.accelereyes.com/blog/2012/12/12/exciting-updates-from-accelereyes/
Unfortunately they do not sell Jacket licences anymore.
As far as I understand, Jacket's GPU array solution, based on ArrayFire, was much faster than the gpuArray solution provided by MATLAB.
I started working with gpuArray, but I see that many functions are implemented poorly. For example, a simple
myArray(:) = 0
is very slow. I have written some custom CUDA kernels (a sketch of that workflow follows the question), but the poorly implemented standard MATLAB functionality adds a lot of overhead, even when working with gpuArrays consistently throughout the code. I have fixed some issues by replacing MATLAB code with hand-written CUDA code, but I do not want to reimplement the standard MATLAB functionality.
Another feature I am missing is sparse GPU matrices.
So my questions are:
How do I speed up the poorly implemented default GPU implementations provided by MATLAB? In particular, how do I speed up sparse matrix operations on the GPU in MATLAB?
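(For reference, a custom kernel is typically hooked into MATLAB like this; the file, entry-point, and variable names are only illustrative, and the .ptx file is assumed to have been compiled beforehand with nvcc -ptx:)
k = parallel.gpu.CUDAKernel('zero_fill.ptx', 'zero_fill.cu');   % load the kernel from PTX + source
k.ThreadBlockSize = [256 1 1];
k.GridSize = [ceil(numel(myArray) / 256) 1 1];
myArray = feval(k, myArray, int32(numel(myArray)));             % launch on the gpuArray; output replaces the input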
MATLAB does support CUDA-based GPUs. You access this support through Parallel Computing Toolbox. I hope these two links also help:
Parallel Computing Toolbox Features
Key Features
Parallel for-loops (parfor) for running task-parallel algorithms on multiple processors
Support for CUDA-enabled NVIDIA GPUs
Full use of multicore processors on the desktop via workers that run locally
Computer cluster and grid support (with MATLAB Distributed Computing Server)
Interactive and batch execution of parallel applications
Distributed arrays and single program multiple data (spmd) construct for large dataset handling and data-parallel algorithms
MATLAB GPU Computing Support for NVIDIA CUDA-Enabled GPUs
Using MATLAB for GPU computing lets you accelerate your applications with GPUs more easily than by using C or Fortran. With the familiar MATLAB language you can take advantage of CUDA GPU computing technology without having to learn the intricacies of GPU architectures or low-level GPU computing libraries.
You can use GPUs with MATLAB through Parallel Computing Toolbox, which supports:
CUDA-enabled NVIDIA GPUs with compute capability 2.0 or higher. For release R2014a and earlier, compute capability 1.3 is sufficient.
GPU use directly from MATLAB
GPU-enabled MATLAB functions such as fft, filter, and several linear algebra operations (see the sketch after this list)
GPU-enabled functions in toolboxes: Image Processing Toolbox, Communications System Toolbox, Statistics and Machine Learning Toolbox, Neural Network Toolbox, Phased Array Systems Toolbox, and Signal Processing Toolbox (Learn more about GPU support for signal processing algorithms)
CUDA kernel integration in MATLAB applications, using only a single line of MATLAB code
Multiple GPUs on the desktop and computer clusters using MATLAB workers in Parallel Computing Toolbox and MATLAB Distributed Computing Server
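A minimal sketch of that gpuArray workflow (array sizes are arbitrary):
x = gpuArray(rand(1e6, 1, 'single'));   % transfer data to the GPU
y = fft(x);                             % GPU-enabled built-in runs on the device
z = gather(abs(y));                     % copy the result back to host memory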
I had the pleasure of attending a talk by John, the founder of AccelerEyes. They did not get their speedup just by removing poorly written code and replacing it with code that saved a few bits here and there. Their speedup came mostly from exploiting the GPU's cache and keeping a lot of operations in GPU memory, whereas MATLAB, if I remember correctly, relied on transferring data between GPU and CPU; hence Jacket's speedup was dramatic.

matlab & beagleboard xm

Can I install MATLAB on a BeagleBoard-xM (running Ubuntu 12.04)? If I can't, how can I run MATLAB on the BeagleBoard-xM? I installed Octave and it runs perfectly, but I need to use MATLAB.
The short answer is NO. zellus is correct. MATLAB requires the SSE2 instruction set, which ARM chips do not support; it is simply a different architecture. See the discussions at MathWorks:
http://www.mathworks.co.kr/matlabcentral/newsreader/view_thread/320518
http://www.mathworks.co.uk/support/solutions/en/data/1-B3MR75/
Referring to System Requirements - Release 2012a, MATLAB only runs on:
Any Intel or AMD x86 processor supporting SSE2 instruction set
As stated on the BeagleBoard-xM Product Details ("BeagleBoard-xM delivers extra ARM Cortex-A8"), the board is equipped with an ARM processor. Therefore I expect MATLAB not to run.
With the proper toolbox, you could generate code that runs on ARM processors like the BeagleBoard.
As already pointed out, you probably don't want to (or cannot) install MATLAB on the BeagleBoard; instead, have MATLAB generate C code that will run on the board. For this you will need either MATLAB Coder or Embedded Coder. The embedded toolbox produces "better/cleaner" code but is pricier than its non-embedded alternative. Please also note that if you need to do the same with Simulink models, you need even more toolboxes.
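A rough sketch of that Coder workflow (the function name is a placeholder, and the exact hardware device string varies by release; check the coder.config documentation for the values your release offers):
cfg = coder.config('lib');                              % generate a standalone C static library
cfg.HardwareImplementation.ProdHWDeviceType = ...
    'ARM Compatible->ARM Cortex';                       % describe the target processor
codegen myController -args {zeros(1,10)} -config cfg    % emits portable C for cross-compilation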

Is there any Matlab toolbox for neural network that can run on GPU?

I tried GPUmat, but the Neural Network Toolbox from MathWorks doesn't support it. Otherwise I would have to modify the NN toolbox myself, but that is too hard for me. Any suggestions?
I don't know whether this will accelerate the Neural Network Toolbox in particular, but the Mathworks now offers CUDA GPU support via the Parallel Computing Toolbox:
http://www.mathworks.com/discovery/matlab-gpu.html?s_cid=HP_MI_tech_gpu
MATLAB provides its own toolbox for training neural networks on the GPU; see here.
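A minimal sketch of that built-in support, using one of the sample datasets shipped with the toolbox; the 'useGPU' option of train requires Parallel Computing Toolbox:
[x, t] = house_dataset;                   % sample regression dataset from the toolbox
net = fitnet(10);                         % feed-forward network with 10 hidden neurons
net = train(net, x, t, 'useGPU', 'yes');  % train on the GPU
y = net(x);                               % evaluate the trained network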
As its author, I also suggest my toolbox ConvNet, which uses kernels from Alex Krizhevsky's library cuda-convnet2. It also has pure CPU and MATLAB versions that work identically. There is also another toolbox for MATLAB, called MatConvNet, but I have not checked it.

convert matlab code to c code

Is there a way to convert Simulink blocks or MATLAB ".m" code to C code automatically?
I'm not aware of any direct translation tool, but there are a few alternatives:
1. MATLAB Compiler will let you create a shared library (callable from C code) from your .m code (sketched below).
2. GNU Octave is an open-source interpreter, written in C, that has many of the same functions as MATLAB. You might feasibly build a C library based on this, although it would be a lot more heavyweight than option (1).
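A hedged sketch of option (1), assuming a file myfunc.m (the exact mcc flags can differ between releases):
mcc -W lib:libmyfunc -T link:lib myfunc.m   % builds the C shared library libmyfunc plus a libmyfunc.h header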
To convert Simulink models or MATLAB m-code to C you need Real-Time Workshop.
It supports only a subset of the MATLAB language and is oriented toward embedded systems.
Also look at other MathWorks products for code generation and application deployment:
http://www.mathworks.com/products
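For the Simulink route, a minimal sketch (the model name is a placeholder):
load_system('myModel');                                % load the model without opening the editor
set_param('myModel', 'SystemTargetFile', 'grt.tlc');   % select the generic real-time target
rtwbuild('myModel');                                   % generate (and build) the C code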
With MATLAB Compiler you will not get C code, but binary code: an executable or library (DLL) that will run on machines without MATLAB installed, provided the MATLAB Compiler Runtime (MCR) library is present. The MCR is quite large, platform specific, and I believe it has to match the MATLAB version used to compile the code.
For the upcoming release R2011a, The MathWorks has developed new code generation products: MATLAB Coder, Simulink Coder, and Embedded Coder.