Open multiple folders which have about 500 files under them and extract vtk files under them - matlab

I am trying to open multiple folders which each contain about 500 files, and then use a function called vtkRead to read the files in those folders. I am not sure how to set that up.
So here is my function, but I am struggling with setting up the main script to select files from a folder:
function [Z_displacement,Pressure] = Processing_Code2_Results(filename, reduce_time, timestep_total)
fid = fopen(filename,'r');
Post_all = [];
vv=[1:500];
DANA0 = vtkRead('0_output_000000.vtk'); %extract all data from the vtk file including disp, pressure, points, times
C = [DANA0.points,reshape(DANA0.pointData.displacements,size(DANA0.points)),reshape(DANA0.pointData.pressure,[length(DANA0.points),1])];
disp0 = reshape(DANA0.pointData.displacements,[1,size(C,1),3]);
points = DANA0.points; % This is a matrix of the xyz points
for i = 1:reduce_time:timestep_total %34
    DANA = vtkRead(sprintf('0_output_%06d.vtk',i)); % read in each successive timestep
    disp(i,:,:) = DANA.pointData.displacements; % store displacement for multiple timesteps
    pressure(i,:) = DANA.pointData.pressure; % store pressure for multiple timesteps
    % press = pressure';
end
...
I have tried something like this:
clc; clear;
timestep_total = 500;
reduce_time = 100;
cd 'C:\Users\Admin\OneDrive - Kansas State University\PhD\Project\Modeling\SSGF_Model\New_Model_output'
for i = 1:3
    filename = sprintf("Gotherm_%d",i)
    [Z_displacement_{i},Pressure_{i}] = Processing_Code2_Results(filename, reduce_time, timestep_total);
end
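For reference, here is a minimal sketch of one way to set up the main script, assuming the output folders are named Gotherm_1, Gotherm_2, ... under the model output directory and that vtkRead is on the path. It passes each folder's path to the processing function, which would then build its file names with fullfile(folderPath, sprintf('0_output_%06d.vtk', i)) instead of relying on the current directory:
baseDir = 'C:\Users\Admin\OneDrive - Kansas State University\PhD\Project\Modeling\SSGF_Model\New_Model_output';
timestep_total = 500;
reduce_time = 100;
folders = dir(fullfile(baseDir,'Gotherm_*')); % list the output folders
folders = folders([folders.isdir]);           % keep directories only
Z_displacement = cell(1,numel(folders));
Pressure = cell(1,numel(folders));
for i = 1:numel(folders)
    folderPath = fullfile(baseDir,folders(i).name);
    % the function would then read each vtk file relative to folderPath
    [Z_displacement{i},Pressure{i}] = Processing_Code2_Results(folderPath, reduce_time, timestep_total);
end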

Related

Accessing parsim data in Matlab

Good evening, may I please get advice on the following Matlab code? Here it is:
%% CLEAR ALL
close all
clear all
clc
%% LOAD MODEL AND LHC FILE
tic %start the clock
idx=1;
model = 'PG_PN_basic_rev1'; %This is the simulink file you wish to run.
load_system(model);
load 'LHC_input.mat' %Call in the file created by LHC_Final.m
LHC = (LHC1_input);
k_dc = LHC((1:5),1);
k_r = LHC((1:5),2);
a_1 = LHC((1:5),3);
b_1 = LHC((1:5),4);
Kg_PG = LHC((1:5),5);
Kg_PN = LHC((1:5),6);
for i = length(k_dc):-1:1
    in(i) = Simulink.SimulationInput('PG_PN_basic_rev1');
    in(i) = in(i).setVariable('k_dc',k_dc(i));
    for j = length(k_r):-1:1
        in(j) = in(j).setVariable('k_r',k_r(j));
        for k = length(a_1):-1:1
            in(k) = in(k).setVariable('a_1',a_1(k));
            for l = length(b_1):-1:1
                in(l) = in(l).setVariable('b_1',b_1(l));
                for m = length(Kg_PG):-1:1
                    in(m) = in(m).setVariable('Kg_PG',Kg_PG(m));
                    for n = length(Kg_PN):-1:1
                        in(n) = in(n).setVariable('Kg_PN',Kg_PN(n));
                    end
                end
            end
        end
    end
end
out = parsim(in, 'ShowProgress', 'on');
% eval(['PN_batch', int2str(idx),' =PN;']);
% data = eval(['PN_batch', int2str(idx)]);
% a{idx} = data;
% idx=idx+1;
% run = idx
timeElapsed = toc %How long did you code run for?
I wish to be able to generate an output file per parsim run (PN_batch1, PN_batch2,...etc.). However, the data often falls under just 1 output, and isn't divided up into readable workspace objects that I can read later with another script. Any advice would be greatly appreciated. Thank you.
out is a vector whose length equals the number of simulations, with the data of each simulation stored in one entry. If you have To Workspace blocks in your model, you can access that data per simulation using out(10).NameOfToWorkspaceData, in case you want to get the data of the 10th simulation. More info on the out variable can be found on the MathWorks site.
Tip: run the model and check out the variable out, then you can explore its structure
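As a rough illustration of that tip, the loop below writes each simulation's logged data to its own MAT-file. Here PN is only a placeholder for whatever variable name your To Workspace block actually logs; substitute your own name:
for idx = 1:numel(out)
    PN = out(idx).PN;                           % data logged by the idx-th run (placeholder name)
    save(sprintf('PN_batch%d.mat', idx), 'PN'); % one output file per parsim run
end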

MATLAB - Read multiple *.log files, calculate and save as .*txt in new folder

I take different measurements, save them as *.log files, process them, and save the results as *.txt files.
%Filtering log files
l = dir('*.log');
%Array size detection
[rows cols] = size(l);
%Choose a last file
file_name = strcat(strcat(l(rows).folder,'/'),l(rows).name)
%Reading log last file
fileread = fopen(file_name);
%Convert to float
times = fread(fileread,'float32');
%Filtering times and set to 0 small values
times(times<1e-8)=0;
%Set right times values
times_s = times * 1.0e-06;
%Solve full rotation speed (Hz)
motorspeed_full = 1./(2.*times_s)
%Filtering inf values and set to 0
motorspeed_full(motorspeed_full>1e+10)=0;
%Solve half rotation speed (Hz)
motorspeed_half = 1./(times_s);
A = '.txt';
[filepath,name,ext] = fileparts(file_name);
Xfilename = cat(2,name,A);
dlmwrite(Xfilename,motorspeed_full,'precision','%.3f');
So it's possible to choose the last file, process it, and convert it to *.txt. Right now I have to run the calculation after every measurement.
My aim is:
Make the first 1...n measurements (1...n *.log and *.wav files)
Calculate and save the 1...n *.log files as *.txt files (see picture)
Create a folder named after file_name (e.g. 20181120_125713) and move file_name.txt and file_name.wav into that folder
Questions:
How can I convert all *.log files to *.txt files using dlmwrite?
How can I create a new folder named file_name, i.e.
mkdir(name);
for all files?
How do I move files with the same name into the folder of that name? The folder name changes every time, so I can't simply use
movefile source destination
Thank you very much for any help :*)
Here is my solution. Maybe it can help somebody:
clear all;
clc;
%Filtering log files
l = dir('*.log');
for k = 1:length(l)
    next_name = l(k).name
    %Current file
    file_name = next_name;
    %Read the current log file
    fileread = fopen(file_name);
    %Convert to float
    times = fread(fileread,'float32');
    fclose(fileread); %close the file handle before moving to the next one
    %Filtering times and set to 0 small values
    times(times<1e-8)=0;
    %Set right times values
    times_s = times * 1.0e-06;
    %Solve full rotation speed (Hz)
    motorspeed_full = 1./(2.*times_s);
    %Filtering inf values and set to 0
    motorspeed_full(motorspeed_full>1e+10)=0;
    %Solve half rotation speed (Hz)
    motorspeed_half = 1./(times_s);
    A = '.txt';
    [filepath,name,ext] = fileparts(file_name);
    Xfilename = [name,A];
    mkdir(name)
    dlmwrite([name,filesep,Xfilename],motorspeed_full,'precision','%.3f');
    movefile([name,'.wav'],[name, filesep, name, '.wav']);
end

How to add standard deviation and moving average

What I want to do is:
I have a folder with 32 txt files and 1 Excel file; each file contains some data in two columns: time, level.
I have already managed to pull the data from the folder, open each file in Matlab, and get the data from it. What I need to do is create a plot for each data file.
Each of the 32 plots should have:
Change in average over time
Standard deviation
I am struggling with both of these things and can't make them work.
I also need to make another plot; this one should show the average over each minute from all 32 files.
Here is my code so far:
clc,clear;
myDir = 'my path';
dirInfo = dir([myDir,'*.txt']);
filenames = {dirInfo.name};
N = numel(filenames);
data=cell(N,1);
for i=1:N
    fid = fopen([myDir,filenames{i}] );
    data{i} = textscan(fid,'%f %f','headerlines',2);
    fclose(fid);
    temp1=data{i,1};
    time=temp1{1};
    level=temp1{2};
    Average(i)=mean(level(1:find(time>60)));
    AverageVec=ones(length(time),1).*Average(i);
    Standard=std(level);
    figure(i);
    plot(time,level);
    xlim([0 60]);
    hold on
    plot(time, AverageVec);
    hold on
    plot(time, Standard);
    legend('Level','Average','Standard Deviation')
end
The main problem with this code is that I only get the average over the whole 60 s, not a moving average, and the standard deviation plot shows nothing.
A few things you need to know:
*temp1 is a 1x2 cell
*time and level are 22973x1 double.
Apparently you need an alternative to movmean and movstd, since they were only introduced in R2016a. I combined the suggestion from #bla with two loops that correct for the edge effects.
function [movmean,movstd] = moving_ms(vec,k)
if mod(k,2)==0,k=k+1;end
L = length(vec);
movmean=conv(vec,ones(k,1)./k,'same');
% correct edges
n=(k-1)/2;
movmean(1) = mean(vec(1:n+1));
N=n;
for ct = 2:n
    movmean(ct) = movmean(ct-1) + (vec(ct+n) - movmean(ct-1))/N;
    N=N+1;
end
movmean(L) = mean(vec((L-n):L));
N=n;
for ct = (L-1):-1:(L-n)
    movmean(ct) = movmean(ct+1) + (vec(ct-n) - movmean(ct+1))/N;
    N=N+1;
end
%mov variance
movstd = nan(size(vec));
for ct = 1:n
    movstd(ct) = sum((vec(1:n+ct)-movmean(ct)).^2);
    movstd(ct) = movstd(ct)/(n+ct-1);
end
for ct = n+1:(L-n)
    movstd(ct) = sum((vec((ct-n):(ct+n))-movmean(ct)).^2);
    movstd(ct) = movstd(ct)/(k-1);
end
for ct = (L-n):L
    movstd(ct) = sum((vec((ct-n):L)-movmean(ct)).^2);
    movstd(ct) = movstd(ct)/(L-ct+n);
end
movstd=sqrt(movstd);
Someone with Matlab >= R2016a can compare them using:
v=rand(1,1E3);m1 = movmean(v,101);s1=movstd(v,101);
[m2,s2] = moving_ms(v,101);
x=1:1E3;figure(1);clf;
subplot(1,2,1);plot(x,m1,x,m2);
subplot(1,2,2);plot(x,s1,x,s2);
It should show a single red line since the blue line is overlapped.
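To use it in the plotting loop from the question, something along these lines should work (the window length of 101 samples is an arbitrary choice here, not taken from the original post):
k = 101;                          % moving-window length in samples
[mm, ms] = moving_ms(level, k);   % moving mean and moving standard deviation
figure(i);
plot(time, level); hold on
plot(time, mm);                                   % moving average over time
plot(time, mm + ms, '--', time, mm - ms, '--');   % +/- one standard deviation band
xlim([0 60]);
legend('Level','Moving average','+1 std','-1 std')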

Read all .csv-files in folder and plot their content

Using an old post (https://stackoverflow.com/a/13744310/3900582) I have been able to read all the .csv files in my folder into a cell array. Each .csv file has the following structure:
0,1024
1,427
2,313
3,492
4,871
5,1376
6,1896
7,2408
8,2851
9,3191
Where the left column is the x-value and the right column is the y-value.
In total, there are almost 200 files and they are each up to 100 000 lines long. I would like to plot the contents of the files in one figure, to allow the data to be more closely inspected.
I was able to use the following code to solve my problem:
dd = dir('*.csv');
fileNames = {dd.name};
data = cell(numel(fileNames),2);
data(:,1) = regexprep(fileNames, '.csv','');
for i = 1:numel(fileNames)
    data{i,2} = dlmread(fileNames{i});
end
fig=figure();
hold on;
for j = 1:numel(fileNames)
    XY = data{j,2};
    X = XY(:,1);
    Y = XY(:,2);
    plot(X,Y);
end
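As a side note, on newer Matlab releases (R2019a and later) dlmread is no longer recommended; readmatrix is a drop-in replacement for the read loop above:
for i = 1:numel(fileNames)
    data{i,2} = readmatrix(fileNames{i}); % same result as dlmread for these numeric files
end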

fprintf Octave - Data corruption

I am trying to write data to .txt files. Each of the files is around 170MB (after writing data to it).
I am using Octave's fprintf function with '%.8f' to write floating-point values to a file. However, I am noticing a very weird error, in that a subset of entries in some of the files is getting corrupted. For example, one of the lines in a file is this:
0.43529412,0.}4313725,0.43137255,0.33233533,...
That "}" should have been a "4". How did Octave's fprintf write that "}" with the '%.8f' option in the first place? What is going wrong?
Another example is
0.73289\8B987,...
How did that "\8B" get there?
I have to process a very large data set with 360 million points in total. This error in a subset of rows in some files is becoming a big problem. What is causing it?
Also, the corruption doesn't occur at random. For example, if a file has 1.1 million rows, where each row corresponds to a vector representing a data instance, then the problem occurs in at most about 100 rows, and these 100 rows are clustered together. They might, say, be distributed from row 8000 to 8150; it is never the case that, out of 100 corrupted rows, the first 50 are located near the 10000th row and the remaining near the 20000th row. They always form a cluster.
Note: the code block below is responsible for extracting the data and writing it to files. Some variables in the code, like K_Cell, have been computed earlier and play virtually no role in the data-writing process.
mf = fspecial('gaussian',[5 5], 2);
fidM = fopen('14_01_2016_Go_AeossRight_ClustersM_wLAMRD.txt','w');
fidC = fopen('14_01_2016_Go_AeossRight_ClustersC_wLAMRD.txt','w');
fidW = fopen('14_01_2016_Go_AeossRight_ClustersW_wLAMRD.txt','w');
kIdx = 1;
featMat = [];
% - Generate file names to print the data to
featNo = 0;
fileNo = 1;
filePath = 'wLRD10_Data_Road/featMat_';
fileName = [filePath num2str(fileNo) '.txt'];
fidFeat = fopen(fileName, 'w');
% - Compute the global means and standard deviations
gMean = zeros(1,13); % - Global mean
gStds = zeros(1,13); % - Global variance
gNpts = 0; % - Total number of data points
fidStat = fopen('wLRD10_Data_Road/featStat.txt','w');
for i=1600:10:10000
    if (featNo > 1000000)
        % - If more than 1m points, close the file and open new one
        fclose(fidFeat);
        % - Get the new file name
        fileNo = fileNo + 1;
        fileName = [filePath num2str(fileNo) '.txt'];
        fidFeat = fopen(fileName, 'w');
        featNo = 0;
    end
    imgName = [fAddr num2str(i-1) '.jpg'];
    img = imread(imgName);
    Ir = im2double(img(:,:,1));
    Ig = im2double(img(:,:,2));
    Ib = im2double(img(:,:,3));
    imgR = filter2(mf, Ir);
    imgG = filter2(mf, Ig);
    imgB = filter2(mf, Ib);
    I = im2double(img);
    I(:,:,1) = imgR;
    I(:,:,2) = imgG;
    I(:,:,3) = imgB;
    I = im2uint8(I);
    [Feat1, Feat2] = funcFeatures1(I);
    [Feat3, Feat4] = funcFeatures2(I);
    [Feat5, Feat6, Feat7] = funcFeatures3(I);
    [Feat8, Feat9, Feat10] = funcFeatures4(I);
    ids = K_Cell{kIdx};
    pixVec = zeros(length(ids),13); % - Get the local image features
    for s = 1:length(ids) % - Extract features
        pixVec(s,:) = [Ir(ids(s,1),ids(s,2)) Ig(ids(s,1),ids(s,2)) Ib(ids(s,1),ids(s,2)) Feat1(ids(s,1),ids(s,2)) Feat2(ids(s,1),ids(s,2)) Feat3(ids(s,1),ids(s,2)) Feat4(ids(s,1),ids(s,2)) ...
            Feat5(ids(s,1),ids(s,2)) Feat6(ids(s,1),ids(s,2)) Feat7(ids(s,1),ids(s,2)) Feat8(ids(s,1),ids(s,2))/100 Feat9(ids(s,1),ids(s,2))/500 Feat10(ids(s,1),ids(s,2))/200];
    end
    kIdx = kIdx + 1;
    for s=1:length(ids)
        featNo = featNo + 1;
        fprintf(fidFeat,'%d,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f\n', featNo, pixVec(s,:));
    end
    % - Compute the mean and variances
    for s = 1:length(ids)
        gNpts = gNpts + 1;
        delta = pixVec(s,:) - gMean;
        gMean = gMean + delta./gNpts;
        gStds = gStds*(gNpts-1)/gNpts + delta.*(pixVec(s,:) - gMean)/gNpts;
    end
end
Note that the code block:
for s=1:length(ids)
    featNo = featNo + 1;
    fprintf(fidFeat,'%d,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f,%.8f\n', featNo, pixVec(s,:));
end
is the only part of the code that writes the data-points to the files.
The earlier code-block,
if (featNo > 1000000)
    % - If more than 1m points, close the file and open new one
    fclose(fidFeat);
    % - Get the new file name
    fileNo = fileNo + 1;
    fileName = [filePath num2str(fileNo) '.txt'];
    fidFeat = fopen(fileName, 'w');
    featNo = 0;
end
opens a new file for writing the data to it, when the currently opened file exceeds the limit of 1 million data-points.
Furthermore, note that the pixVec variable cannot contain anything other than float/double values, or Octave will throw an error.
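There is no accepted answer here, but one way to at least locate the corrupted rows for closer inspection is to scan a written file for characters that should never appear in it. A quick diagnostic sketch, assuming the output should contain only digits, signs, decimal points, the letter e, commas, and newlines:
fid = fopen('wLRD10_Data_Road/featMat_1.txt','r');
lineNo = 0;
while true
    tline = fgetl(fid);
    if ~ischar(tline), break; end % end of file
    lineNo = lineNo + 1;
    if any(~ismember(tline,'0123456789.,+-eE'))
        fprintf('Suspect characters on line %d\n', lineNo); % report corrupted rows
    end
end
fclose(fid);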