Does anyone know how to code (or have a code) for generating sobol sequences in matlab?
The choice of initial direction numbers is not a concern.
Thanks.
You'll need the Statistics Toolbox function sobolset, unless you're planning on programming your own from scratch.
Otherwise it looks like these file exchange entries can do it without the stats toolbox:
Global sensitivity analysis toolbox
Bridge Sampling
And also this one if you are running Linux. I recommend you start with the File Exchange options: they are free, don't require the toolbox, and don't require you to start from scratch. Unless you have the toolbox, in which case it's quite well documented, so use that.
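For reference, a minimal sobolset sketch (the dimension, point count, and scramble type here are arbitrary choices, not requirements):
p = sobolset(5);                        % 5-dimensional Sobol point set
p = scramble(p, 'MatousekAffineOwen');  % optional scrambling
X = net(p, 1024);                       % first 1024 points as a 1024-by-5 matrix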
I want to make a biometric identification system of the ECG/EKG.
Given that MATLAB does not perform real-time data acquisition (for monitoring), is there any way to do the monitoring and data acquisition in LabVIEW and then work simultaneously with MATLAB for signal processing?
You could just get a MATLAB-compatible DAQ and run everything in MATLAB. http://www.mathworks.com/products/daq/
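As a rough sketch of that route with the Data Acquisition Toolbox's session-based interface ('ni', 'Dev1' and 'ai0' are placeholders for your vendor, device, and channel):
s = daq.createSession('ni');                         % placeholder vendor
addAnalogInputChannel(s, 'Dev1', 'ai0', 'Voltage');  % placeholder device/channel
s.Rate = 1000;                                       % samples per second
s.DurationInSeconds = 1;
data = s.startForeground();                          % acquire one second of data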
You can indeed do the data acquisition with LabVIEW and work simultaneously with MATLAB for signal processing by calling the MATLAB script node, which executes MATLAB code during VI execution.
You may have some performance issues, though, because both LabVIEW and MATLAB have to run on your machine simultaneously.
Question:
is there any way to make the monitoring and data acquisition on
LabView and then work simultaneously with Matlab for signal processing
Answers:
LabVIEW has a "MathScript" node, which is basic MATLAB built into an add-on. It is not the MATLAB toolboxes; it runs native MATLAB code. It also gets slightly faster with LabVIEW updates. If your code runs there, then LabVIEW will pass data natively to your code. This node does not have direct MATLAB toolbox access, so if you use any toolbox-specific calls, that can cause a problem.
If you have MATLAB on the box, then you can call external MATLAB functions/code using MathScript (link), and MATLAB will run the function.
Clarification:
Real time just means "bounded time" (link), not "instant". If your idea of the bounds is loose enough, then many systems can work. You do not state it in your question, but what do you consider an acceptable response time?
I've worked a lot with LabVIEW and MATLAB. Personally, I would not use the MathScript node; I would opt for the MATLAB Automation Server instead. You can call MATLAB from LabVIEW using the ActiveX palette in LabVIEW (see Functions>>Connectivity>>ActiveX>>Automation Open). A couple of reasons why I'd go for ActiveX and not the MathScript node:
The MathScript node does not allow you to change code dynamically. You must hardcode your data into the MathScript node, and any future change would require a change to LabVIEW's G code, and therefore a recompile of your EXE.
The MathScript node does not support all functions when compiled to an executable, most notably graphing functions. See the help file here to read more on this.
Calling MATLAB via ActiveX is going to give you a lot more flexibility in how data is passed and processed.
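LabVIEW's Automation Open block talks to MATLAB's documented COM interface. As a sketch of that interface, here is how you could drive it from a second MATLAB session on Windows (the computation inside Execute is just an example):
h = actxserver('Matlab.Application');   % launch MATLAB as a COM automation server
h.Execute('y = svd(magic(4));');        % run code in the server's workspace
y = h.GetWorkspaceData('y', 'base');    % pull the result back over COM
h.Quit;                                 % shut the server down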
Is it possible to use MATLAB code in Scilab? Is that what is meant when saying that Scilab is a "clone" of MATLAB?
There is a tool to automatically convert Matlab source to Scilab source; it's called M2SCI. A script parses the Matlab source code and replaces Matlab-specific functions with Scilab ones. See the documentation of the mfile2sci function.
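A minimal sketch of the call, run at the Scilab prompt (both paths are placeholders):
mfile2sci('/home/user/myscript.m', '/home/user/converted')  // source .m file, output directory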
Yes, you can use MATLAB code in Scilab. See these links for more information:
http://help.scilab.org/docs/5.4.0/fr_FR/section_36184e52ee88ad558380be4e92d3de21.html
http://help.scilab.org/docs/5.4.0/en_US/index.html
I would not bet on it, but if your code is simple enough, chances are good.
The problems are:
There is encrypted p-code in MATLAB that Scilab will not be able to open.
MATLAB usually comes with a number of toolboxes that might not be available to you (I think especially Simulink).
Last but not least (I don't know the details for Scilab), there usually are minute differences in how functions are implemented.
There are a number of projects out there trying to replicate/replace MATLAB:
Julia language: has a relatively similar syntax to MATLAB and offers great performance, but it still lacks a lot of toolboxes/libraries and does not have a GUI like MATLAB's. I think this has the brightest future among all MATLAB alternatives.
Python language and its libraries NumPy and matplotlib: the most used alternative. I think at this moment the community is a couple of orders of magnitude bigger than MATLAB's, and Python is the de facto standard in machine learning and data science. Still, the syntax and the memory model are a bit far from what people are used to in the MATLAB ecosystem, and there is no equivalent to Simulink, although the Spyder and Jupyter projects have come a long way in terms of the development environment.
Octave: basically a clone of MATLAB, to the point that its developers consider any incompatibility a bug. If you have long MATLAB code that you don't want to touch, this is the safest bet. But again, there is no alternative for Simulink.
Scilab and its fork ScicosLab are the best alternatives in terms of GUI, having a Simulink replica (xcos / scicos) and graphical-user-interface development features. However, the community is not as big as Octave's, and the syntax is not completely compatible. Sadly, the Scilab development team has gone through a devastating family crisis, leading to the software falling behind.
An honorary mention goes to the Modelica language implementations OpenModelica and JModelica, for being a superior alternative to Simulink/SimScape. You should know that you can also load Modelica scripts in xcos and scicos. If you want to know more about JModelica, you may see this post.
You may check MATLAB's AlternativeTo page to see more free and open-source alternatives.
Is there a module for the Perl Data Language that is similar to the Matlab signal processing toolbox? I'm aware of PDL::FFT(W), but can't find any functions for filter construction or estimation of statistical properties.
Alas, no. While it is likely that you can accomplish the same ends as Matlab's signal processing toolbox with your own combination of convolutions, cross products, etc, nobody has written a comprehensive signal processing toolkit with PDL. PDL started as a competitor to IDL, so its strongest bindings focus on image manipulation.
If you are considering making your own toolbox (which would be FANTASTIC!), I would suggest PDL::TimeSeries.
There is now at least https://metacpan.org/pod/PDL::DSP::Fir for finite impulse response filter kernels.
PDL::Audio has filters (including FIR filters, aka convolution). I'm sure it's nowhere near as full-featured as the Matlab toolbox, and it has lots of stuff you're probably not interested in (it's designed for making sound, after all), but it will do at least some of what you want.
If I were making this module, I'd call it PDL::DSP.
I need to run several independent analyses on the same data set.
Specifically, I need to run batches of 100 GLM (generalized linear model) analyses, and I was thinking of taking advantage of my video card (a GTX 580).
As I have access to Matlab and the Parallel Computing Toolbox (and I'm not good with C++), I decided to give it a try.
I understand that a single GLM is not ideal for parallel computing, but as I need to run 100-200 in parallel, I thought that using parfor could be a solution.
My problem is that it is not clear to me which approach I should follow. I wrote a gpuArray version of the MATLAB function glmfit, but using parfor gives no advantage over a standard "for" loop.
Does this have anything to do with the matlabpool setting? It is not even clear to me how to set it so that it "sees" the GPU card. By default, it is set to the number of cores in the CPU (4 in my case), if I'm not wrong.
Am I completely wrong on the approach?
Any suggestion would be highly appreciated.
Edit
Thanks. I'm aware of GPUmat and Jacket, and I could start writing in C without too much effort, but I'm testing the GPU computing possibilities for a department where everybody uses MATLAB or R. The final goal would be a cluster based on C2050 cards and the MATLAB Distributed Computing Server (or at least this was the first project).
Reading the ads from MathWorks, I was under the impression that parallel computing was possible even without C skills. It is impossible to ask the researchers in my department to learn C, so I'm guessing that GPUmat and Jacket are the better solutions, even if the limitations are quite big and support for several commonly used routines like glm is non-existent.
How can they be interfaced with a cluster? Do they work with some job distribution system?
I would recommend you try either GPUMat (free) or AccelerEyes Jacket (buy, but has free trial) rather than the Parallel Computing Toolbox. The toolbox doesn't have as much functionality.
To get the most performance, you may want to learn some C (no need for C++) and code in raw CUDA yourself. Many of these high level tools may not be smart enough about how they manage memory transfers (you could lose all your computational benefits from needlessly shuffling data across the PCI-E bus).
Parfor will help you utilize multiple GPUs, but not a single GPU. The thing is that a single GPU can do only one thing at a time, so parfor on a single GPU and for on a single GPU achieve the exact same effect (as you are seeing).
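As a sketch of the multi-GPU pattern (assuming one worker per GPU; myGpuGlmfit stands in for your gpuArray port of glmfit, and X, y are placeholder cell arrays of data):
matlabpool open 4                     % hypothetical: one worker per GPU
spmd
    gpuDevice(labindex);              % bind each worker to its own device
end
b = cell(200, 1);
parfor k = 1:200
    b{k} = myGpuGlmfit(X{k}, y{k});   % independent fits run on different GPUs
end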
Jacket tends to be more efficient, as it can combine multiple operations and run them more efficiently, and it has more features; but most departments already have the Parallel Computing Toolbox and not Jacket, so that can be an issue. You can try the demo to check.
No experience with gpumat.
The Parallel Computing Toolbox is getting better; what you need is some large matrix operations. GPUs are good at doing the same thing many times over, so you need to either combine your code somehow into one operation or make each operation big enough. We are talking about a need for ~10000 things in parallel at least, although it's not a set of 1e4 matrices but rather a large matrix with at least 1e4 elements.
I do find that with the Parallel Computing Toolbox you still need quite a bit of inline CUDA code to be effective (it's still pretty limited). It does, however, better allow you to inline kernels and transform MATLAB code into kernels.
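As an illustration of "making each operation big enough": in newer Parallel Computing Toolbox releases, pagefun lets you batch many small matrix products into a single GPU call (a sketch; the sizes are arbitrary):
A = gpuArray.rand(8, 8, 10000);    % 10000 small matrices, held on the GPU
B = gpuArray.rand(8, 8, 10000);
C = pagefun(@mtimes, A, B);        % one call performs all 10000 products
C = gather(C);                     % copy back to host memory once, at the end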
I am trying to do some text classification with SVMs in MATLAB, and I really would like to know if MATLAB has any methods for feature selection (chi-squared, mutual information, ...). I want to try various methods and keep the best one, and I don't have time to implement all of them myself. That's why I am looking for such methods in MATLAB. Does anyone know?
svmtrain
MATLAB has other utilities for classification like cluster analysis, random forests, etc.
If you don't have the required toolbox for svmtrain, I recommend LIBSVM. It's free and I've used it a lot with good results.
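For reference, a minimal sketch with the (since-deprecated, replaced by fitcsvm) svmtrain/svmclassify pair; Xtrain, ytrain, and Xtest are placeholders:
SVMStruct = svmtrain(Xtrain, ytrain, 'kernel_function', 'linear');  % train a linear SVM
labels = svmclassify(SVMStruct, Xtest);                             % predict on new data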
The Statistics Toolbox has sequentialfs. See also the documentation on feature selection.
A similar approach is dimensionality reduction. In MATLAB you can easily perform PCA or Factor analysis.
Alternatively, you can take a wrapper approach to feature selection. You would search through the space of features by taking a subset of features each time and evaluating that subset using any classification algorithm you decide on (LDA, decision tree, SVM, ...). You can do this exhaustively or use some kind of heuristic to guide the search (greedy, GA, SA, ...).
If you have access to the Bioinformatics Toolbox, it has a randfeatures function that does a similar thing. There's even a couple of cool demos of actual use cases.
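A minimal sketch of sequentialfs with a wrapper criterion (X and y are placeholders, assuming numeric class labels; the criterion counts misclassifications from a linear discriminant):
crit = @(Xtr, ytr, Xte, yte) sum(yte ~= classify(Xte, Xtr, ytr));  % errors on the held-out fold
[selected, history] = sequentialfs(crit, X, y, 'cv', 5);           % greedy forward selection, 5-fold CV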
Maybe this will help.
There are two ways of selecting the features in the classification:
Using fselect.py from libsvm tool directory (http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/#feature_selection_tool)
Using sequentialfs from statistics toolbox.
I would recommend using fselect.py, as it provides more options, like an automatic grid search for optimum parameters (using grid.py). It also provides an F-score based on the discriminative ability of the features (see http://www.csie.ntu.edu.tw/~cjlin/papers/features.pdf for details of the F-score).
Since fselect.py is written in Python, you can either use the Python interface or, as I prefer, use MATLAB to perform a system call to Python:
system('python fselect.py <training file name>')
It's important that you have Python installed and libsvm compiled (and that you are in libsvm's tools directory, which has grid.py and the other files).
It is necessary to have the training file in libsvm format (a sparse format). You can do that by using the sparse function in MATLAB and then libsvmwrite:
xtrain_sparse = sparse(xtrain);                     % convert the feature matrix to sparse form
libsvmwrite('filename.txt', ytrain, xtrain_sparse)  % write labels + features in libsvm format
Hope this helps.
For sequentialfs with libsvm, you can see this post:
Features selection with sequentialfs with libsvm