I'm trying to design a storage system that excess energy goes into. There is a cap on the maximum storage size of the system. I am struggling to work out how to code this in MATLAB.
Currently I'm using a function similar to this:
max_storage = no_tanks*tank_size
if cumsum(excess) > 0
storage = cumsum(excess)
elseif cumsum(excess) < 0
After that I am confused about how to continue writing the code. Any help would be greatly appreciated.
Attempt at mind-reading, while awaiting an update to the question.
To limit the storage size to max_storage, you need to have some code like
storage = calc_storage(excess); % or whatever
storage = min(storage, max_storage);
Don't forget to finish your statements with ;, and if you need the value of cumsum(excess) in lots of places it is better to assign it to a variable once rather than calculating it over and over again.
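For example, a minimal sketch along those lines (the variable names excess, no_tanks and tank_size are taken from the question; clipping at zero as well as at the cap is an assumption based on the elseif branch in the question):

max_storage = no_tanks * tank_size;                 % capacity of the whole system
cum_excess  = cumsum(excess);                       % running energy balance, computed once
storage     = min(max(cum_excess, 0), max_storage); % clip between empty and full

Note that clipping the cumulative sum at the end is not the same as clipping the level at every time step; if energy lost while the store is full (or empty) must not carry over, loop over the samples and clip the level step by step instead.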
I want to get the current disk Read Speed and Write Speed (the live I/O rates, not the maximum I/O speed).
Like in the image above: read speed = 0, write speed = 8.8 MB/s.
Can anyone please show me a command that can get that info?
Just to begin with, this class will give you everything you need:
Get-WmiObject Win32_PerfFormattedData_PerfProc_Process
But my preference is
1) powershell-is-king-Measure-disk-performance-for-iops-and-transfer-rate
2) 2 Functions for IO
Hope it helps
I have a ".mat" file supposedly containing a [30720000x4 double] matrix (values from accelerometers). When I try to open this file with "Import data" in MATLAB I get the following error:
Error using load
Can't read file F:\vibration_exp_2\GR_UB50n\bearing1\GR_UB50n_1_2.mat.
Error using load
Unknown text on line number 1 of ASCII file
F:\vibration_exp_2\GR_UB50n\bearing1\GR_UB50n_1_2.mat
"MATLAB".
Error in uiimport/runImportdata (line 456)
datastruct = load('-ascii', fileAbsolutePath);
Error in uiimport/gatherFilePreviewData (line 424)
[datastruct, textDelimiter, headerLines]= runImportdata(fileAbsolutePath,
type);
Error in uiimport (line 240)
[ctorPreviewText, ctorHeaderLines, ctorDelim] = ...
The file size is 921 MB, which is the same as my other files that do open. I also tried opening the file using Python, but with no success. Any suggestions? I use MATLAB R2013b.
More info:
How the file was created:
%% acquisition of vibration data
% input:
% sample rate in Hz (max. 51200 Hz, should be used as bearing
% faults are high-frequent)
% time in seconds, stating the duration of the measurement
% (e.g. 600 seconds = 10 minutes)
% filename for the file to be saved
%
% examples:
% data = DAQ(51200, 600, 'NF1_1.mat');
% data = DAQ(51200, 600, 'NF1_2.mat');
function data = DAQ(samplerate,time,filename)
s = daq.createSession('ni'); % Creates the DAQ session
%%% Add the channels as accelerometer channels (meaning IEPE is turned on)
s.addAnalogInputChannel('cDAQ1Mod1','ai0','Accelerometer');
s.addAnalogInputChannel('cDAQ1Mod1','ai1','Accelerometer');
s.addAnalogInputChannel('cDAQ1Mod1','ai2','Accelerometer');
s.addAnalogInputChannel('cDAQ1Mod1','ai3','Accelerometer');
%s.addAnalogInputChannel('cDAQ1Mod2','ai0','Accelerometer');
s.Rate = samplerate;
s.NumberOfScans = samplerate*time;
%%% Defining the Sensitivities in V/g
s.Channels(1).Sensitivity = 0.09478; %31965, top outer
s.Channels(2).Sensitivity = 0.09531; %31966, back outer
s.Channels(3).Sensitivity = 0.09275; %31964, top inner
s.Channels(4).Sensitivity = 0.09363; %31963, back inner
data = s.startForeground(); %Acquiring the data
save(filename, 'data');
More info:
When I open the file in a simple text editor I can see a lot of characters that do not make sense, but also this first line:
MATLAB 5.0 MAT-FILE, Platform: PCWIN64, Created on: Thu Apr 30
16:29:07 2015
More info:
The file itself: https://www.dropbox.com/s/r7mavil79j47xa2/GR_UB50n_1_2.mat?dl=0
It is 921MB.
EDIT:
How can I recover my data?
I've tried this, but got memory errors.
I've also tried this, but it did not work.
I fear I can't add much good news to what you already know, but it hasn't been mentioned yet.
The reason the .mat file can't be loaded is that the data is corrupted. What makes it 'unrecoverable' is the way it is stored internally. The exact format is specified in the MAT-File Format documentation. So I decided to manually construct a simple reader to specifically read your .mat file.
It makes sense that splitmat.m can't recover anything, as it basically splits the data into chunks, one stored variable per chunk; however, in this case there is only one variable stored and thus only one chunk, which happens to be the corrupted one.
In this case, the data is stored as a miCOMPRESSED element, which is a normal MATLAB array compressed using gzip. (Which, as a side note, doesn't seem like a good fit for 'random' vibration data.) This might explain previous comments about the file size being smaller than the full data, as the file size matches exactly the internally stored value.
I extracted the compressed archive and tried to decompress it in a variety of ways. Basically it is a '.gz' file without the header, which can be appended manually. Unfortunately there seems to be a corrupted block near the start of the dataset. I am by no means an expert on gzip, but as far as I know the dictionary is built up dynamically, which makes all data useless from the point where the block is corrupted. If you are really eager, there seems to be a way to recover data even beyond the corrupted point, but that method is massively time-consuming. Also, the only way to validate data from those sections is manual inspection, which in your case might prove very difficult.
Below is the code that I used to extract the .gz file, so if you want to give it a try, this might get you started. If you manage to decompress the data, you can read it as described in the MAT-File Format documentation, page 13f.
corrupted_file_id = fopen('corrupt.mat','r');
%% some header data
% can be skipped by replacing this block with
% fread(corrupted_file_id,132);
% header of the .mat file
header_text = char(fread(corrupted_file_id,116,'char')');
subsystem_data_offset = fread(corrupted_file_id,8,'uint8');
version = fread(corrupted_file_id,1,'int16');
endian_indicator = char(fread(corrupted_file_id,2,'int8')');
data_type = fread(corrupted_file_id,4,'uint8');
% data_type is 15, so it is a compressed MATLAB array (miCOMPRESSED)
%% save the content
data_size = fread(corrupted_file_id,1,'uint32');
gz_file_id = fopen('compressed_array.gz','w');
% first write a valid gzip header (magic bytes, deflate method, no flags)
fwrite(gz_file_id,hex2dec('1f8b080000000000'),'uint64',0,'b');
% then copy the compressed payload sequentially
step = 1:1e3:data_size; % 1 kB steps
for idx = step
    fwrite(gz_file_id,fread(corrupted_file_id,1e3,'uint8'));
end
step = step(end):data_size; % 1 B steps for the remainder
for idx = step
    fwrite(gz_file_id,fread(corrupted_file_id,1,'uint8'));
end
fclose(gz_file_id);
fclose(corrupted_file_id);
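If you want to try the decompression from inside MATLAB as well, here is a rough, untested sketch using the built-in gunzip function; with a corrupted deflate stream it will most likely error out, and whether any partial output survives is not guaranteed.

% Untested sketch: try MATLAB's built-in gunzip on the extracted archive.
try
    files = gunzip('compressed_array.gz');   % writes 'compressed_array' on success
    fid = fopen(files{1}, 'r');
    raw = fread(fid, Inf, 'uint8=>uint8');   % raw bytes of the decompressed miMATRIX element
    fclose(fid);
catch err
    fprintf('Decompression failed: %s\n', err.message);
end

Any bytes that do come out would still have to be parsed by hand following the miMATRIX layout in the MAT-File Format documentation.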
To answer the question literally, my suggestion would be to first make sure that the file is okay. This tool on the File Exchange apparently knows how to diagnose corrupted .MAT files starting with version V5 (R8):
http://www.mathworks.com/matlabcentral/fileexchange/6893-matcat-mat-file-corruption-analysis-tool
The file's size (indices going out of range) seems to be a problem. Octave, which should be able to read .mat files, gives the error
memory exhausted or requested size too large for range of Octave's index type
To find out what is wrong you may need to write a test program outside MATLAB, where you have more control over memory management. Examples are here, including instructions on how to build them on your own platform. These stand-alone programs may not have the same memory issues. The program matdgns.c is specifically made to check .mat files for errors.
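Before reaching for external tools, a quick sanity check of the 128-byte Level 5 header can also be done from MATLAB itself; this is an untested sketch, and the file name is simply the one from the question.

fid = fopen('GR_UB50n_1_2.mat', 'r');
descriptive_text = char(fread(fid, 116, 'char')'); % should start with "MATLAB 5.0 MAT-file"
fseek(fid, 124, 'bof');
version = fread(fid, 1, 'uint16');                 % 0x0100 for a Level 5 MAT-file
endian  = char(fread(fid, 2, 'char')');            % 'IM' on little-endian platforms
fclose(fid);
fprintf('%s\nversion: 0x%04X, endian: %s\n', strtrim(descriptive_text), version, endian);

If the header reads back correctly, as the text-editor output quoted in the question suggests, the corruption sits further inside the file, which matches the gzip findings in the answer above.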
The following space allocation is giving me an SB37 JCL error. The COBOL record size of the output file is 100 bytes and the LRECL is 100 bytes. What do you think is causing this error? I have tried increasing the space to (500,100) and still get the same error.
Code:
//OUTPUT1 DD DSN=A.B.C,DISP=(NEW,CATLG,DELETE),
// DCB=(LRECL=100,BLKSIZE=,RECFM=FBM),
// SPACE=(CYL,(10,5),RLSE)
Try to increase not only the space, but the volume count as well.
Include VOL=(,,,#) in your DD, where # is the number of volumes you want to allow.
Ex: SPACE=(CYL,(10,5),RLSE),VOL=(,,,3) - allows 3 volumes.
Additionally, you can increase the size, but try to stay within reasonable limits :)
The documentation for B37 says the application programmer should respond as indicated for message IEC030I. The documentation for IEC030I says, in part...
Probable user error. For all cases, allocate as many units as volumes
required.
...as noted in another answer. However, be advised that the documentation for the VOL parameter of the DD statement says...
If you omit the volume count or if you specify 1 through 5, the system
allows up to five volumes; if you specify 6 through 20, the system
allows 20 volumes; if you specify a count greater than 20, the system
allows 5 plus a multiple of 15 volumes. You can override the maximum
volume count in data class by using the volume-count subparameter. The
maximum volume count for an SMS-managed mountable tape data set or a
Non-managed tape data set is 255.
...so for DASD allocations you are best served specifying a volume count greater than 5 (at least).
//OUTPUT1 DD DSN=A.B.C,DISP=(NEW,CATLG,DELETE),
// DCB=(LRECL=100,BLKSIZE=,RECFM=FBM),
// SPACE=(CYL,(10,5),RLSE)
Try the JCL below instead. Notice that the secondary allocation takes advantage of a large dataset, whereas without that parameter the largest secondary that makes any sense is under 300. Also, if this is indeed written from a COBOL program, make sure the FD says BLOCK 0. If it isn't BLOCK 0 you might not even need to change your JCL, because the dataset wasn't really fixed block (machine); it was merely fixed and unblocked, so the space would almost never be enough. You may also wish to revisit why you have the M in the RECFM to begin with. Finally, notice that I took out the LRECL, the BLKSIZE and the RECFM: the FD in the COBOL program is all you need, and putting them in the JCL is not only redundant but dangerous, because any change would then have to be made in multiple places.
//OUTPUT1 DD DSN=A.B.C,DISP=(NEW,CATLG,DELETE),
// DSNTYPE=LARGE,UNIT=(SYSALLDA,59),
// SPACE=(CYL,(10,1000),RLSE)
There is a limit of 65,535 tracks per volume. So if you specify a SPACE that exceeds that limit, the system will simply ignore it.
You can increase this limit to 16,777,215 tracks by adding the DSNTYPE=LARGE parameter.
Or you can specify that your dataset is multi-volume by adding VOL=(,,,3).
You can also use the DATACLAS=xxxx parameter here; however, first you need to find a suitable one. The easy way is to contact your local Storage Team and ask for one. Or, if you are familiar with ISPF navigation, you can enter the ISMF;4 command to open a panel and
use the parameters below before hitting Enter.
CDS Name . . . . . . 'ACTIVE'
Data Class Name . . *
It should produce a list of all available data classes. Find the one that suits you (it should have a large enough volume count and not limit the primary and secondary space).
I'm trying to track down all user-created functions and scripts within lengthy MATLAB code. The following code does this, but due to the MATLAB profiler's default history size of 1,000,000 I'm missing a good number of functions.
function endProfile(p)
profile off
diary('Diary_endProfile')
for k = 1:size(p.FunctionHistory, 2)
    if p.FunctionHistory(1, k) == 0
        str = 'entering function: ';
    else
        str = 'exiting function: ';
    end
    % skip toolbox/built-in functions and entries without a file name
    if isempty(strfind(p.FunctionTable(p.FunctionHistory(2, k)).FileName, 'C:\Program Files\MATLAB\')) && ...
            ~strcmp(p.FunctionTable(p.FunctionHistory(2, k)).FileName, '')
        disp([str p.FunctionTable(p.FunctionHistory(2, k)).FunctionName])
    end
end
diary off % close the diary so the captured list is flushed to the file
profile off
profile viewer
end
I initialize the profiler with the following code from the first script of the code under analysis:
profile clear
profile on -history -historysize 1000000000
The previous function is called at the end of the first script as follows:
endProfile(profile('info'))
Does anyone know what the maximum history size is and/or if there are alternate ways of increasing the size?
Thanks!
A much better method for looking at dependencies is to use either depfun, or the excellent fdep from the File Exchange.
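For instance, a quick, untested sketch using depfun (available in R2013b-era releases; newer releases replace it with matlab.codetools.requiredFilesAndProducts), where 'myTopLevelScript' is just a placeholder for your entry-point file:

% full dependency closure of the entry point (can be slow on large code bases)
deps = depfun('myTopLevelScript', '-quiet');
% keep only user files, i.e. anything outside the MATLAB installation
isUserFile = cellfun(@isempty, strfind(deps, matlabroot));
disp(deps(isUserFile))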
I have just recently started to use MATLAB to acquire data off of a data acquisition board and was in need of a function to acquire data continuously (i.e., until I Ctrl-C out of the function). To do this I am using the Data Acquisition Toolbox on a 32-bit Windows OS.
Based on the documentation in the MATLAB help and a few of the answers on this site, I found that after adding channels to my input handle I should:
1) set my 'SamplesPerTrigger' to Inf
2) set the 'TimerPeriod' to some value to trigger the 'TimerFcn'
3) set the 'TimerFcn' to some subfunction callback which appends data to a persistent variable
Is this a correct way to do this?
My code is as follows:
function acquire_arena_test(samprate, daq_device, device_ID, channels, saveroot)
setup.SampleRate = samprate;
setup.DAQdevice = {daq_device, device_ID};
setup.AIChannels = channels;
setup.SaveRoot = {saveroot};
ai = analoginput(setup.DAQdevice{1}, setup.DAQdevice{2});
addchannel(ai, [setup.AIChannels]);
set(ai, 'SamplesPerTrigger', Inf);
set(ai, 'TimerPeriod', 0.5);
set(ai, 'TimerFcn', {@AcquireData, ai});
start(ai);
while(strcmpi(get(ai, 'Running'), 'On'))
    pause(1)
end
stop(ai);
time = datestr(now, 30);
save([saveroot time], 'data');
delete(ai);
clear ai;
function AcquireData(hObject, ~)
persistent totalData;
data = getdata(hObject);
if isempty(totalData)
    totalData = data;
else
    totalData = [totalData; data];
end
The initial analog input is definitely working properly. I have tried many permutations of giving the AcquireData callback to 'TimerFcn'. The error I receive is
??? Error using ==> acquire_arena_test>AcquireData
Too many input arguments.
Warning: The TimerFcn callback is being disabled.
To enable the callback, set the TimerFcn property.
Thanks in advance for any help.
I think the syntax you use for setting up your TimerFcn is wrong. You write
set(ai,'TimerFcn',{@AcquireData,ai});
but this means that your function AcquireData will be called with three parameters: AcquireData(ai, event, ai), as explained here, which then of course triggers the error message, since your AcquireData function only accepts two parameters. Just change your code to
set(ai,'TimerFcn',@AcquireData);
and it should work; the ai object is automatically passed as the first parameter (see the link to the MATLAB documentation above).
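For completeness, a rough sketch of what a matching two-argument callback could look like (untested against the legacy toolbox; reading only SamplesAvailable samples instead of a plain getdata call is my assumption, to keep the callback from blocking):

function AcquireData(obj, ~)
persistent totalData;
nAvail = get(obj, 'SamplesAvailable');
if nAvail > 0
    newData = getdata(obj, nAvail);   % read only what has already been acquired
    totalData = [totalData; newData];
end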
Sorry about answering my own question, but I figured it out. The trigger was not needed after all. Using a National Instruments board (or a sound card, as it turns out) you can just change the LoggingMode to 'disk' and specify the name of the .daq (Data Acquisition Toolbox) file to save to with LogFileName. If you want to use the memory on your board as well, change the mode to 'disk&Memory'. Helpful document:
http://www.mathworks.com/help/toolbox/daq/f12-16658.html
The script below acquires data during the pause, which can be as long as you want it to be.
daqreset;                                % release any existing DAQ objects
clear all;
ai = analoginput('nidaq','Dev1');
chans = addchannel(ai,0:6);              % channels 0-6 of the NI board
set(ai,'SamplesPerTrigger',Inf);
set(ai,'SampleRate',1000)
set(ai,'LogToDiskMode','Overwrite')
set(ai,'LogFileName','log.daq')
set(ai,'LoggingMode', 'disk')            % stream the samples straight to disk
start(ai)
pause()                                  % acquisition runs until a key is pressed
stop(ai)
data = daqread('log.daq');               % read the logged samples back in
delete(ai);
Note that you still need to set 'SamplesPerTrigger' to Inf for this to work properly. Thank you to Jonas for his help as well.
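One further note that is not from the original answers: since the goal was to stop the acquisition by pressing Ctrl-C inside a function, an onCleanup guard (standard MATLAB) can make sure the hardware is still released when the function is interrupted. This is an untested sketch reusing the same legacy-toolbox calls as the script above.

function data = acquire_until_stopped()
% Untested sketch: same disk-logging approach as above, wrapped in a function
% so that Ctrl-C during pause() still triggers the cleanup.
ai = analoginput('nidaq','Dev1');
addchannel(ai,0:6);
set(ai,'SamplesPerTrigger',Inf);
set(ai,'SampleRate',1000);
set(ai,'LogToDiskMode','Overwrite');
set(ai,'LogFileName','log.daq');
set(ai,'LoggingMode','disk');
cleaner = onCleanup(@daqreset);  % runs even if the function is aborted with Ctrl-C
start(ai)
pause()                          % Ctrl-C (or any key press) ends the acquisition
stop(ai)
data = daqread('log.daq');
% after a Ctrl-C abort the logged data can still be read with daqread('log.daq')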