My MATLAB code uses the GPU to create raw data (25 seconds). Then, in MATLAB, this raw data gets processed into scalar quantities that can be fed into an objective function (15 seconds).
Is it possible for a MATLAB particle swarm optimization code to start retrieving the next batch of raw data while the current batch is being processed?
Thank you.
You mean, essentially, workload distribution and timing. MATLAB does have the ability to perform tasks in parallel using SPMD:
http://www.mathworks.nl/help/distcomp/spmd.html
You might be able to first retrieve a set from your GPU and then, inside an SPMD block, process that data on one worker while simultaneously retrieving the next set on another.
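A minimal sketch of that pipelining idea, assuming hypothetical helpers generateRawData (your ~25 s GPU step) and processToScalars (your ~15 s processing step) standing in for your own code:

    % Sketch: pipelined generation/processing with SPMD.
    % generateRawData and processToScalars are hypothetical stand-ins.
    matlabpool('open', 2);             % parpool(2) on newer releases
    nBatches = 10;                     % placeholder batch count
    results  = cell(1, nBatches);
    raw = generateRawData();           % prime the pipeline with batch 1
    for k = 1:nBatches
        spmd
            if labindex == 1
                nextRaw = generateRawData();      % fetch batch k+1
            else
                scalars = processToScalars(raw);  % process batch k
            end
        end
        raw        = nextRaw{1};       % pull Composite values back to the client
        results{k} = scalars{2};
    end
    matlabpool('close');

Since the generation step (25 s) dominates the processing step (15 s), overlapping the two would bring each cycle down toward the cost of the slower stage.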
I have a question regarding the MATLAB NN toolbox. As part of a research project I decided to create a MATLAB script that uses the NN toolbox for some fitting solutions.
I have a data stream that is being loaded into my system. The input data consists of 5 input channels and 1 output channel. I train the network on this configuration for a while and try to fit the output (for a certain period of time) as new data streams in. I retrain my network constantly to keep it updated.
So far everything works fine, but after a certain period of time the results get bad and no longer represent the desired output. I really can't explain why this happens, but I could imagine that there must be some kind of memory issue, since everything is OK while the data set is still small.
Only when it gets bigger does the quality of the simulation drop. Is there some kind of memory that gets full, or is the bad simulation just a result of the huge data sets? I'm a beginner with this tool and will really appreciate your feedback. Best regards and thanks in advance!
Please elaborate on your method of retraining with new data. Do you run further iterations? What do you consider as "time"? Do you mean epochs?
At first glance, assuming "time" means epochs, I would say that you're overfitting the data. Neural networks are supposed to be trained for a limited number of epochs, with early stopping. You could try regularization, different gradient descent methods (if you're using a GD method), or GD momentum. Also, depending on the values of your first few training data sets, you may have trained your network with an incorrect normalization range. You should check these issues out if my assumptions are correct.
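For example, early stopping and regularization can be set up roughly like this with the NN Toolbox fitting workflow (a sketch; the hidden layer size and split ratios are placeholders, x is the 5-channel input matrix and t the target vector from the question):

    % Sketch: early stopping + regularization with the NN Toolbox.
    % x is 5xN inputs, t is 1xN targets; 10 hidden neurons is a placeholder.
    net = fitnet(10);
    net.divideFcn = 'dividerand';           % hold out data for early stopping
    net.divideParam.trainRatio = 0.70;
    net.divideParam.valRatio   = 0.15;
    net.divideParam.testRatio  = 0.15;
    net.trainParam.max_fail = 6;            % stop after 6 validation failures
    net.performParam.regularization = 0.1;  % penalize large weights
    [net, tr] = train(net, x, t);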
I want to make a biometric identification system based on the ECG/EKG.
Given that MATLAB does not perform data acquisition in real time (for monitoring), is there any way to do the monitoring and data acquisition in LabVIEW and then work simultaneously with MATLAB for signal processing?
You could just get a MATLAB-compatible DAQ and run everything in MATLAB: http://www.mathworks.com/products/daq/
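A rough sketch of what that looks like with the Data Acquisition Toolbox session interface (the vendor 'ni', device 'Dev1', and channel 'ai0' are placeholders for your hardware):

    % Sketch: streaming acquisition with the Data Acquisition Toolbox.
    s = daq.createSession('ni');
    addAnalogInputChannel(s, 'Dev1', 'ai0', 'Voltage');
    s.Rate = 1000;                       % e.g. 1 kHz for an ECG channel
    s.IsContinuous = true;
    % Process/plot each chunk as it arrives:
    addlistener(s, 'DataAvailable', ...
        @(src, evt) plot(evt.TimeStamps, evt.Data));
    s.startBackground();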
You can indeed do the data acquisition in LabVIEW and work simultaneously with MATLAB for signal processing by calling the MATLAB script node, which executes MATLAB code during VI execution.
You may have some performance issues, though, because both LabVIEW and MATLAB have to run on your machine simultaneously.
Question:
is there any way to make the monitoring and data acquisition on LabView and then work simultaneously with Matlab for signal processing
Answers:
LabVIEW has a "MathScript" node, which is basic MATLAB built into an add-on. It is not the MATLAB toolboxes; it runs native MATLAB code, and it runs slightly faster once LabVIEW has compiled the code. If your code runs there, then LabVIEW will pass data natively to your code. This node does not have direct MATLAB toolbox access, so if you use any special toolbox calls, that can cause a problem.
If you have MATLAB on the box, then you can call the external MATLAB function/code using MathScript (link), and MATLAB will run the function.
Clarification:
Real time just means "bounded time" (link), not "instant". If your bounds are loose enough, then many systems can meet them. You do not state it in your question, but what do you consider an acceptable response time?
I've worked a lot with LabVIEW and MATLAB. Personally, I would not use the MathScript node; I would opt for the MATLAB Automation Server instead. You can call MATLAB from LabVIEW using the ActiveX palette in LabVIEW (see Functions>>Connectivity>>ActiveX>>Automation Open). A couple of reasons why I'd go for ActiveX and NOT the MathScript node:
The MathScript node does not allow you to change code dynamically. You must hardcode your code into the MathScript node, and any future change would require a change to LabVIEW's G code and therefore a recompile of your EXE.
The MathScript node does not support all functions when compiled to an executable, most notably graphing functions. See the help file here to read more on this.
Calling MATLAB via ActiveX is going to give you a lot more flexibility in how data is passed and processed.
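For reference, the MATLAB side of that setup just needs MATLAB running as a COM Automation server (Windows only); the LabVIEW side then opens the Matlab.Application class via Automation Open. A sketch:

    % Start MATLAB as a COM Automation server (Windows only).
    % From the command line:   matlab -automation
    % Or from a running MATLAB session:
    enableservice('AutomationServer', true);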
I have set up a model using SimMechanics. It outputs data at the time steps the solver happens to take. Is there any possibility of getting some kind of dense output, such that these data can be interpolated at arbitrary points without losing the high order of the integrator?
In MATLAB this is easily possible using the function deval after integrating with one of the built-in ODE solvers.
In SimMechanics I can select these integrators, too. Is there some analogue of deval?
Yes, it's possible, although it's a Simulink capability, not specific to SimMechanics. In the model's Configuration Parameters, under Data Import/Export, you can set Produce Specified Output Only (see http://www.mathworks.co.uk/help/simulink/gui/data-import-export-pane.html#bq9_fhw-1). This way, output is produced only at the times you specify, regardless of the time steps taken by the solver.
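The same setting can also be applied programmatically with set_param (a sketch; 'myModel' and the time vector are placeholders):

    % Ask the solver for output only at specified times ("dense output").
    set_param('myModel', 'OutputOption', 'SpecifiedOutputTimes', ...
                         'OutputTimes',  '[0:0.01:10]');
    sim('myModel');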
I'm trying to simulate in real time in Simulink (this part is solved) while plotting in real time in MATLAB.
Details:
I want to be able to run a Simulink simulation (which runs in real time) and turn manual switches on/off while the simulation is happening. This works well when I'm using the built-in Scopes in Simulink, but now I want to export that data to MATLAB in real time as well (to make a custom-looking graph).
So, is there a way to export this data (it can be sampled if necessary) to MATLAB and make a plot that is constantly updating, while I can still manipulate the switches in Simulink and influence the simulation manually?
Simulink is effectively running continuously until I stop it.
Thanks for the help!
There should be some kind of notification when Simulink updates the data to be visualized; maybe this is what the linkdata feature provides.
Another, worse, solution is the drawnow command to redraw the graphs continuously (the latter could be unnecessarily costly for your program).
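A minimal sketch of the drawnow approach, assuming a hypothetical getLatestData that pulls the signal out of the running simulation (e.g. from a To Workspace block or an event listener):

    % Keep a MATLAB figure updating while the simulation runs.
    hLine = plot(NaN, NaN);
    while ishandle(hLine)
        [t, y] = getLatestData();            % hypothetical helper
        set(hLine, 'XData', t, 'YData', y);  % update the line in place
        drawnow;                             % flush the graphics queue
        pause(0.05);                         % don't hog the CPU
    end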
I have a Simulink model which uses inputs from 6 webcams for live video processing. For that I use 6 'From Video Device' blocks. The outputs of these blocks are processed to generate output in the form of (x,y) coordinates. All 6 webcam outputs are processed in parallel. However, MATLAB hangs and stops the simulation if I use more than 3 webcams. I want to divide the 6 processing blocks between two processor cores using the Parallel Computing Toolbox, but I couldn't find suitable instructions anywhere for distributing a single Simulink model across multiple processor cores. I am using MATLAB R2011a.
Well, I cannot post my code or my model, but I can tell you what my model does. It takes input from 6 USB cameras, tracks a moving object in each frame from each camera, and gives me the location of the moving object in (x,y) coordinates. Thus, I get 6 (x,y) coordinates at a time as output. My model works well while I use up to 3 cameras and generate 3 (x,y) outputs; adding a fourth camera hangs MATLAB and stops the simulation.
I'm afraid the reason you haven't found instructions for spreading a Simulink model across multiple cores is that these instructions do not currently exist (up to and including R2012b). The Parallel Computing Toolbox only allows you to run multiple, separate simulations simultaneously across the different cores (e.g. to investigate the effects of parameter changes and the like).
For your application, you will likely be better off using MATLAB "proper" and writing everything as m-functions and/or scripts. That way you will be able to leverage your multi-core processor by using commands such as parfor.
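A rough outline of that m-code structure, assuming hypothetical helpers acquireFrame (which must open its camera on the worker itself, since device handles cannot be sent between workers) and trackObject for the per-frame tracking:

    % Sketch: one camera per worker via parfor (R2011a syntax).
    matlabpool('open', 2);                   % parpool(2) on newer releases
    coords = zeros(6, 2);
    parfor cam = 1:6
        frame = acquireFrame(cam);           % hypothetical: grab on the worker
        coords(cam, :) = trackObject(frame); % hypothetical tracker
    end
    matlabpool('close');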