I need to analyze just one minute of an EEG test which lasts around 50 minutes. The part to be analyzed is around minute 10, but I can't access only that part, because EEGLAB always loads from the beginning of the test by default.
When loading the EDF file, EEGLAB offers the option: Data range (in seconds) to read (default all [num num]). I have tried inserting only the time range needed here, but I always get the beginning of the test.
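If reading a sub-range at import time keeps failing, one workaround (a sketch, assuming the EEGLAB functions pop_biosig and pop_select are available; the file name is made up) is to load the full EDF and then trim it to the minute of interest:

```matlab
% Load the full EDF file (pop_biosig is EEGLAB's BIOSIG-based importer)
EEG = pop_biosig('recording.edf');

% Keep only the data between minute 10 and minute 11 (seconds 600-660)
EEG = pop_select(EEG, 'time', [600 660]);
```

Loading the whole 50-minute file once is slower, but after pop_select you work with just the 60 seconds you need.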
I have a simulation that takes 4 hours to run, and I don't want to repeat it when I have to measure timing. Is there a way to save the data so that I can review it on demand?
The simulation data is stored in a waveform database (.wdb) file. The waveform configuration file (.wcfg) contains only the selected signals and their display settings, such as radix, color, position, grouping, ...
So don't delete the *.wdb file; load/open it on your next run instead. Don't overwrite it :).
I have written code in MATLAB to read data from a CSV file and store it in an array. The file contains many rows and 4 main columns (time, x, y, z). The time is then divided into segments of 10 seconds. I have calculated the start and finish time for each segment; now I want to get the x-, y-, and z-data into each segment. Can you guys help me please?
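A minimal sketch of that segmentation (assuming the CSV really has the four columns time, x, y, z; the file name is a placeholder):

```matlab
M = readmatrix('data.csv');       % columns: time, x, y, z (hypothetical file name)
t = M(:,1);

segLen = 10;                       % segment length in seconds
nSeg = ceil((t(end) - t(1)) / segLen);
segments = cell(nSeg, 1);

for k = 1:nSeg
    t0 = t(1) + (k-1)*segLen;      % start time of segment k
    t1 = t0 + segLen;              % finish time of segment k
    idx = t >= t0 & t < t1;        % rows whose time falls in this segment
    segments{k} = M(idx, 2:4);     % the x, y, z data for segment k
end
```

Each cell of `segments` then holds the x/y/z rows of one 10-second window, so `segments{2}` is the data between 10 and 20 seconds after the first sample.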
I am getting some readings off an accelerometer connected to an Arduino, which is in turn connected to MATLAB through serial communication. I would like to write the readings into a text file. A 10-second reading will write around 1000 entries, which makes the text file around 1 kB.
I will be using the following code:
%%%%%// Communication %%%%%
arduino = serial('COM6','BaudRate',9600);
fopen(arduino);
fileID = fopen('Readings.txt','w');
vib = [];                          % accumulated readings
%%%%%// Reading from Serial %%%%%
for i = 1:Samples
    scan = fscanf(arduino,'%f');
    if isfloat(scan)
        vib = [vib; scan];
        fprintf(fileID,'%0.3f\r\n',scan);
    end
end
fclose(fileID);
fclose(arduino);
delete(arduino);
Any suggestions on improving this code? Will it hit a time or size limit? This code is to be run for 3 days.
Do not use text files; use binary files. 42718123229.123123 is 18 bytes in ASCII but 4 bytes in a binary file (as a single-precision float). Don't waste space unnecessarily. If your data is going to be used later in MATLAB, then I suggest you simply save .mat files.
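To illustrate the size difference with standard MATLAB file I/O (file names are placeholders):

```matlab
v = 42718123229.123123;

% Text: roughly 18 bytes for this one value
ft = fopen('val.txt','w');
fprintf(ft, '%0.6f\n', v);
fclose(ft);

% Binary: 8 bytes as a double, 4 if single precision is enough
fb = fopen('val.bin','w');
fwrite(fb, v, 'double');
fclose(fb);

% Read it back later with:
fb = fopen('val.bin','r');
v2 = fread(fb, 1, 'double');
fclose(fb);
```

Over 3 days of continuous logging, that factor of 2-4 in size adds up quickly.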
Do not use a single file! Choose a reasonable file size (e.g. 100 MB) and make sure that when you reach that much data you switch to another file. You could do this by, e.g., saving one file per hour. This way you minimize the possible damage if the software crashes 2 minutes before finishing.
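One simple way to get one file per hour (a sketch; `scan` stands for whatever value you are logging) is to put the current hour in the file name, so the rotation happens automatically:

```matlab
% One file per hour: the timestamp in the name makes rotation automatic
fname = ['readings_' datestr(now, 'yyyymmdd_HH') '.txt'];
fileID = fopen(fname, 'a');        % append, in case this hour's file exists
fprintf(fileID, '%0.3f\r\n', scan);
fclose(fileID);
```

Opening in append mode and closing after each batch also means a crash loses at most the samples still in the write buffer, not the whole file.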
Now, knowing the real dimensions of your problem, writing a text file is totally fine; nothing special is required to process such small data. But there is a problem with your code: you are appending to a variable vib that grows over time. That can hurt performance, because you are not preallocating, and it may consume a lot of memory. I strongly recommend not keeping this variable; if you need the data, read it back from the file afterwards.
Another thing you should consider is verification of your data. What do you do when you receive fewer samples than you expect? Include timestamps! Be aware that these timestamps are not precise, because you add them after reception, but they let you identify whether just some random samples are missing (these may be interpolated afterwards) or a consecutive series of maybe 100 samples is missing.
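Adding the timestamp is a one-line change to the write in the loop (again, `scan` is the value read from the serial port):

```matlab
% Write a millisecond timestamp next to each sample so gaps can be
% detected later; the timestamp reflects when MATLAB logged the value,
% not when the Arduino sampled it
fprintf(fileID, '%s %0.3f\r\n', datestr(now, 'HH:MM:SS.FFF'), scan);
```

When you analyze the file, a jump in consecutive timestamps much larger than the sample period marks a block of lost samples.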
How to control the duration of a VideoReader function in MATLAB?
I am using MATLAB code to record a video file using the VideoWriter function. I want to change the code to record only a certain part of the data rather than the whole video. What command can I use to record only the first 40 seconds of the data recorded at normal speed? I would also like to know whether there is a way to record only a small part from the middle of the data.
You can add conditional statements such as if to control the writing of the video file.
Alternatively, you can wrap the video writer function into a wrapper, which accepts your actual data, and a control boolean.
If you mean you want to record 40 seconds for each data set, with different frame rates, a wrapper function that takes the frame rate and time length and calculates the frame count itself may work.
If you mean you are frequently switching data sets, which will add up to one piece of video that you want to be 40 seconds long, then a 'global' variable storing how many seconds you have recorded, as well as a function to calculate time increments, is needed.
Edit:
Based on your refined details, you may find the following necessary and, hopefully, helpful:
- the exact frame rate for each data set;
- a variable (should be in your controlling function/script) to store how many milliseconds you already have;
- a wrapper function that does the following job (and takes arguments accordingly):
  - check whether the current time already exceeds 40,000 milliseconds (if so, do nothing and return);
  - calculate the time period in milliseconds of the data to be added = (number of frames to record) / (frame rate per second) * 1000;
  - call the video writer to record your data set, either frame by frame or all together;
  - add that time period to the current time.
You can make it fancier by making it able to cut a data series somewhere in the middle, so that, for example, a 10-second data set won't add an extra 4 seconds if you already have 34 seconds on file.
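A sketch of such a wrapper, including the mid-series cut (function and variable names are made up for illustration; frames are assumed to be a 4-D height x width x channels x count array, as accepted by writeVideo):

```matlab
function elapsedMs = writeCapped(vw, frames, frameRate, elapsedMs, capMs)
% vw        - an open VideoWriter object
% frames    - 4-D array of frames to append
% elapsedMs - milliseconds already recorded; returned updated
% capMs     - total budget, e.g. 40000 for 40 seconds
if elapsedMs >= capMs
    return;                                  % budget spent: do nothing
end
addMs = size(frames, 4) / frameRate * 1000;  % duration of this batch
if elapsedMs + addMs > capMs
    % Trim the batch so the total lands exactly on the cap
    nKeep = floor((capMs - elapsedMs) / 1000 * frameRate);
    frames = frames(:, :, :, 1:nKeep);
    addMs = nKeep / frameRate * 1000;
end
writeVideo(vw, frames);
elapsedMs = elapsedMs + addMs;
end
```

The caller keeps `elapsedMs` between calls (the 'global' variable mentioned above) and passes a different `frameRate` for each data set.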
I want to load a large set of MAT-files in a loop. I'm testing different ways to make the files load faster, and I have a subset of 10,000 files I'm working with, each containing about 50 variables of different sizes. I noticed an interesting detail:
1. If I load 10,000 files using load(filename) in a loop, one after another, it takes about 5 minutes.
2. If I load the same set of files a few more times (basically repeating the test), the time doesn't change.
3. If I load only one variable from each file using load(filename, 'varname'), it takes about the same amount of time.
4. If I repeat step 3, it takes about 15 seconds to complete the load. Same files, same variable being loaded.
5. If I now run step 1 and again repeat step 3, I'm back to the load taking about 5 minutes. But the second load after that takes a very short time again.
I'm puzzled. Is MATLAB somehow keeping the data in memory once it has loaded it from a file? The phenomenon survives MATLAB restarts and clear commands, however, so could it actually be Windows 7 keeping some of the data in a memory cache?
Needless to say, I would like to determine what's causing the unexpected improvement and, if possible, reproduce it to make the first load as fast as the subsequent ones.
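One way to check whether an OS-level file cache is responsible (a hypothesis, not a confirmed explanation) is to time the loads explicitly and compare a cold run against an immediate second run:

```matlab
files = dir('*.mat');              % the test subset of MAT-files
tic
for k = 1:numel(files)
    S = load(files(k).name, 'varname');   % 'varname' is a placeholder
end
toc                                 % compare against a second run of this loop
```

If the second run is much faster even after restarting MATLAB, the speedup lives outside MATLAB, which points at the operating system's disk cache rather than anything load itself does.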