Using data from a txt file to make a plot - Matlab

I have a file with floating-point numbers (e.g. -1.22070E-02 -7.32420E-02) in two columns.
I want to read them and make a plot, something like the examples here: https://www.mathworks.com/help/signal/examples/signal-generation-and-visualization.html.
After some searching, this is my code:
fid = fopen('sample_b.txt', 'r');
a = fscanf(fid, '%f %f', [2 inf]);
simin = a';
plot(simin(:,1),simin(:,2));
fclose(fid);
But this is the plot I get:
And when I display simin using 'disp(simin)' I see:
81.4724 90.5792
12.6987 91.3376
63.2359 9.7540
27.8498 54.6882
I do not understand why my program uses these 8 numbers. My file contains many numbers, but the first 8 are these:
-1.22070E-02 -7.32420E-02
1.22070E-02 0.197750
4.88280E-03 9.27730E-02
-4.39450E-02 3.41800E-02
If I use 'plot(a(:,1),a(:,2));' the plot is just a line.
Edit: my data represents an EEG signal.
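Nothing in the snippet above checks whether fopen actually succeeded, and values like 81.4724 are typical of a stale workspace variable rather than of file contents. A defensive sketch of the same read:

```matlab
% Defensive version of the read (a sketch): clear any old variables and
% verify that fopen succeeded before calling fscanf.
clear a simin
fid = fopen('sample_b.txt', 'r');
if fid == -1
    error('Could not open sample_b.txt - check the file name and path.');
end
a = fscanf(fid, '%f %f', [2 Inf]);
fclose(fid);
simin = a.';                      % one (x, y) sample per row
plot(simin(:,1), simin(:,2));
```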


Matlab fwrite precision conversion

I want to write some data to a binary file in single precision. The data is originally in double precision. Is there any difference between converting the data to single by calling the single command before fwrite, and just letting Matlab do the conversion inside the fwrite call?
Case 1
data1 % double precision
fwrite(fid,data1,'single');
Case 2
data2=single(data1);
fwrite(fid,data2,'single');
In the 2nd case, is Matlab doing any modifications to data2 before writing it since it is already in single format ? Will there be any difference in data written to the two files ?
Let's try this:
data1 = 1.555555555555555555;
data2 = single(data1);
fid = fopen('C:\Some\Address\data1.bin', 'w');
fwrite(fid, data1, 'single');
fclose(fid);
fid = fopen('C:\Some\Address\data2.bin', 'w');
fwrite(fid, data2, 'single');
fclose(fid);
% Let's read them back (note that fread returns double-precision values by default)
fid = fopen('C:\Some\Address\data1.bin', 'r');
data1 = fread(fid, 'single');
fclose(fid);
fid = fopen('C:\Some\Address\data2.bin', 'r');
data2 = fread(fid, 'single');
fclose(fid);
format long;
[data1 data2] % or use fprintf to see the values
ans =
1.555555582046509 1.555555582046509
To your questions:
In the 2nd case, is Matlab doing any modifications to data2 before
writing it since it is already in single format ?
I don't think so, but I cannot be confident without knowing what is going on under the hood of fwrite.
Will there be any difference in data written to the two files ?
According to the test above, I don't believe so.
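Going one step further than comparing the read-back values, the two files can be compared byte for byte (a sketch, assuming data1.bin and data2.bin were written as above):

```matlab
% Read both files back as raw bytes and compare them directly.
fid = fopen('C:\Some\Address\data1.bin', 'r');
bytes1 = fread(fid, '*uint8');
fclose(fid);
fid = fopen('C:\Some\Address\data2.bin', 'r');
bytes2 = fread(fid, '*uint8');
fclose(fid);
isequal(bytes1, bytes2)   % true when the files are byte-identical
```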
You can use a formatSpec string to control how the values are printed, like this:
A = [6.6,1.11111];
formatSpec = '%4.5f\n'; % modify it accordingly
fprintf(formatSpec, A);

Fastest way to export a 2d matrix to a triples format CSV file in Matlab

I want to convert a 2d matrix, for example:
10 2
3 5
to a (row,col,value) CSV file, for example:
1,1,10
1,2,2
2,1,3
2,2,5
Is it possible to do it in a single Matlab command?
I didn't find a way with a single command, but the following code works:
[i1,i2] = ind2sub(size(A),1:numel(A));
csvwrite('test.csv',[i2',i1',reshape(A',numel(A),1)]);
The output is:
type test.csv
1,1,10
1,2,2
2,1,3
2,2,5
Assuming A to be the input matrix, two approaches can be suggested here.
fprintf based solution -
output_file = 'data.txt'; %// Edit if needed to be saved to a different path
At = A.'; % transpose so elements are traversed row by row
[y,x] = ndgrid(1:size(At,1),1:size(At,2));
fid = fopen(output_file, 'w+');
for ii=1:numel(At)
fprintf(fid, '%d,%d,%d\n',x(ii),y(ii),At(ii));
end
fclose(fid);
dlmwrite based approach -
At = A.'; % transpose so elements are traversed row by row
[y,x] = ndgrid(1:size(At,1),1:size(At,2));
dlmwrite(output_file,[x(:) y(:) At(:)]);
Some quick tests seem to suggest that fprintf performs better across varying input datasizes.
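The fprintf loop above can also be collapsed into one vectorized call, since fprintf cycles its format string over the columns of a matrix argument (a sketch reusing the same At, x, y setup):

```matlab
At = A.';                                     % traverse A row by row
[y, x] = ndgrid(1:size(At,1), 1:size(At,2));
fid = fopen('data.txt', 'w');
fprintf(fid, '%d,%d,%d\n', [x(:) y(:) At(:)].');  % each column is one triple
fclose(fid);
```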

Read a txt file into a matrix and a cell array - Matlab

I have a txt file with the entries below, and I would like to get the numerical values from the second column through the last column into a matrix, and the first column into a cell array.
I've tried importdata and fscanf, and I don't understand what's going on.
CP6 7,2 -2,7 6,6
P5 -5,8 -5,9 5,8
P6 5,8 -5,9 5,8
AF7 -5,0 7,2 3,6
AF8 5,0 7,2 3,6
FT7 -7,6 2,8 3,6
This should give you what you want based on the text sample you supplied.
fileID = fopen('x.txt'); %open file x.txt
m=textscan(fileID,'%s %d ,%d %d ,%d %d ,%d');
fclose(fileID); %close file
col1 = m{1,1}; %get first column into cell array col1
colRest = cell2mat(m(1,2:7)); %convert the remaining six numeric columns into matrix colRest
Look up textscan for more info on reading specially formatted data.
This function should do the trick. It reads your file and scans it according to your pattern. Then it puts the first column in a cell array and the others in a matrix.
function [ C1,A ] = scan_your_txt_file( filename )
fid = fopen(filename,'rt');
C = textscan(fid, '%s %d,%d %d,%d %d,%d');
fclose(fid);
C1 = C{1};
A = cell2mat(C(2:size(C,2)));
end
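A quick usage sketch (assuming the sample data is saved as x.txt). Note that because the format string splits each decimal-comma value into two integer fields, A comes back with six columns rather than three:

```matlab
[C1, A] = scan_your_txt_file('x.txt');
% C1 is a cell array of channel labels, e.g. 'CP6'
% A is an Nx6 integer matrix holding the integer and fractional
% parts of each decimal-comma value, column by column
```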
Have you tried xlsread? It returns a numeric array and two non-numeric arrays.
[N,T,R]=xlsread('yourfilename.txt')
But your data is not comma-delimited, and it looks like you are using a comma to represent a decimal point. Does this array have 7 columns or 4? Because I'm in the US, I'm going to assume you have paired coordinates and that the comma is one kind of delimiter while the space is a second one.
So here is something kludgy; it is a gross, ugly hack, but it works.
%housekeeping
clc
%get name of raw file
d=dir('*22202740*.txt')
%translate from comma-as-decimal to period-as-decimal
fid = fopen(d(1).name,'r') %source
fid2= fopen('myout.txt','w+') %sink
while 1
tline = fgetl(fid); %read
if ~ischar(tline), break, end %end loop
fprintf(fid2,'%s\r\n',strrep(tline,',','.')) %write updated line to output
end
fclose(fid)
fclose(fid2)
%open, gulp, parse/store, close
fid3 = fopen('myout.txt','r');
C=textscan(fid3,'%s %f %f %f ');
fclose(fid3);
%measure waist size and height
[n,m]=size(C);
n=length(C{1});
%put in slightly more friendly form
temp=zeros(n,m);
for i=2:m
t0=C{i};
temp(:,i)=t0;
end
%write to excel
xlswrite('myout_22202740.xlsx',temp(:,2:end),['b1:' char(96+m) num2str(n)]);
xlswrite('myout_22202740.xlsx',C{1},['a1:a' num2str(n)])
%read from excel
[N,T,R]=xlsread('myout_22202740.xlsx')
If you want those commas to be decimal points, then that is a different question.

Reading a CSV file and plotting graph using Matlab

I would like to create a script in Matlab that can read data from a CSV file and plot it. My data looks something like:
Time BPM(HeartRate)
5:55:26 0
5:55:26 0
5:55:27 66
5:55:27 70
5:55:27 71
5:55:27 74
...
I would like to plot time on the x axis and BPM on the y axis. I have tried the following:
clear, clc;
ftoread = 'data.csv';
fid = fopen(ftoread);
y=data(:,1);
x=data(:,2);
plot(x,y);
xlabel('Time');
ylabel('Heart Rate');
title('Heart Rate Vs. Time');
Unfortunately, I am getting an error at y=data(:,1).
Why don't you use csvread instead?
data = csvread('data.csv');
x=data(:,1);
y=data(:,2);
% etc...
It looks like you're missing some steps. csvread or dlmread may not work well since you have a string and a number on each line. textscan should be fast and easy:
ftoread = 'data.csv';
fid = fopen(ftoread);
data = textscan(fid,'%s%f'); % Read in a string and a double
fclose(fid); % If you call fopen, make sure you also call fclose
x = data{1};
y = data{2};
You may then want to use functions like datestr and datenum to convert your time strings to other values. I believe they even accept cell arrays (like x) as inputs.
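A sketch of that conversion, assuming the times are HH:MM:SS strings as in the sample:

```matlab
t = datenum(x, 'HH:MM:SS');   % cell array of time strings -> serial dates
plot(t, y);
datetick('x', 'HH:MM:SS');    % format the x-axis ticks back into times
xlabel('Time');
ylabel('Heart Rate');
title('Heart Rate Vs. Time');
```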

Matlab: how to handle abnormal data files

I am trying to import a large number of files into Matlab for processing. A typical file would look like this:
mass intensity
350.85777 238
350.89252 3094
350.98688 2762
351.87899 468
352.17712 569
352.28449 426
Some text and numbers here, describing the experimental setup, eg
Scan 3763 # 81.95, contains 1000 points:
The numbers in the two columns are separated by 8 spaces. However, sometimes the experiment will go wrong and the machine will produce a datafile like this one:
mass intensity
Some text and numbers here, describing the experimental setup, eg
Scan 3763 # 81.95, contains 1000 points:
I found that using space-separated files with a single header row, ie
importdata(path_to_file,' ', 1);
works best for the normal files. However, it totally fails on all the abnormal files. What would the easiest way to fix this be? Should I stick with importdata (already tried all possible settings, it just doesn't work) or should I try writing my own parser? Ideally, I would like to get those values in a Nx2 matrix for normal files and [0 0] for abnormal files.
Thanks.
I don't think you need to create your own parser, nor is this all that abnormal. Using textscan is your best option here.
fid = fopen('input.txt', 'rt');
data = textscan(fid, '%f %u', 'Headerlines', 1);
fclose(fid);
mass = data{1};
intensity = data{2};
Yields:
mass =
350.8578
350.8925
350.9869
351.8790
352.1771
352.2845
intensity =
238
3094
2762
468
569
426
For your 1st file and:
mass =
Empty matrix: 0-by-1
intensity =
Empty matrix: 0-by-1
For your empty one.
By default, textscan treats whitespace as a delimiter, and it only reads what you tell it to until it can no longer do so; thus it ignores the final lines in your file. You can also run a second textscan after this one if you want to pick up those additional fields:
fid = fopen('input.txt', 'rt');
data = textscan(fid, '%f %u', 'Headerlines', 1);
mass = data{1};
intensity = data{2};
data = textscan(fid, '%*s %u %*c %f %*c %*s %u %*s', 'Headerlines', 1);
scan = data{1};
level = data{2};
points = data{3};
fclose(fid);
Along with your mass and intensity data gives:
scan =
3763
level =
81.9500
points =
1000
What do you mean by 'totally fails on abnormal files'?
You can check whether importdata finds any data using e.g.
>> imported = importdata(path_to_file, ' ', 1);
>> isfield(imported, 'data')
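Putting that check together with the Nx2-or-[0 0] behavior requested in the question (a sketch, assuming importdata returns a struct without a data field for the abnormal files):

```matlab
imported = importdata(path_to_file, ' ', 1);
if isfield(imported, 'data')
    result = imported.data;   % normal file: Nx2 matrix
else
    result = [0 0];           % abnormal file fallback
end
```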