I am trying to export a double array from MATLAB into a txt file. I can do this easily, but the data is not structured how I need it. I need the data to be structured in the following way in the txt file:
-0.0195
-0.0217
-0.0260
-0.0274
-0.0258
-0.0246
-0.0244
-0.0233
-0.0209
-0.0221
Does anyone know how this would be done using dlmwrite?
Maybe something like this?
A = [-0.0195; -0.0217; -0.0260; -0.0274; -0.0258; -0.0246; -0.0244; -0.0233; -0.0209; -0.0221];
dlmwrite('example.txt', A, 'newline', 'pc')
The last two arguments determine the newline character used (LF or CR+LF), depending on the platform. Use 'pc' for Windows, and 'unix' for all others.
For full cross-platform compatibility, you can use the isunix function and put something like the following before your code:
if isunix
    platform = 'unix';
else
    platform = 'pc';
end
and then use the platform variable as the last argument in dlmwrite.
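For example, with the platform variable set as above:
dlmwrite('example.txt', A, 'newline', platform)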
If your data is in a row vector called A, this will write it into a column in afile.txt:
dlmwrite('afile.txt',A,'\n')
I have a bunch of automatically-generated CSV files with headers, which I'd like to import into Matlab 2016a as a table. I used code such as
T = readtable('d:\test.csv', 'readvariablenames', true);
However, even though the name of the CSV's first column is "runNr", the first column in the MATLAB table gets named "x___runNr".
This clearly has something to do with the CSV files being in a slightly different format from the one MATLAB expects. For instance, it could be that my CSVs have a Byte Order Mark at the beginning. Still, I am not sure what to do to fix this, since I cannot change the format of the CSVs. readtable, on the other hand, gives me the output format I am most comfortable with.
Upon calling readtable, the following warning is issued:
"Warning: Variable names were modified to make them valid MATLAB identifiers. "
However, some of my CSVs (perhaps produced by a different version of the software that outputs them) are still read OK, and for those CSVs the same warning is displayed, thus the warning alone is not indicative of the problem.
I think I found the source of the problem:
As you suspected, the encoding of your CSV file is "UTF-8-BOM" (I saw it using Notepad++).
The UTF-8 representation of the BOM is the (hexadecimal) byte sequence 0xEF,0xBB,0xBF
MATLAB R2019a knows to ignore the first 3 bytes, but R2016a is "confused" by the 3 characters and adds the x___ prefix to runNr.
A workaround is to create a temporary file without the first 3 bytes:
f = fopen('test.csv', 'r');
A = fread(f, '*uint8');   % read the raw bytes
fclose(f);
if all(A(1:3) == hex2dec(['EF'; 'BB'; 'BF']))   % UTF-8 BOM present?
    f = fopen('tmp.csv', 'w');
    fwrite(f, A(4:end));  % skip the first 3 bytes (the BOM)
    fclose(f);
    T = readtable('tmp.csv', 'readvariablenames', true);
else
    T = readtable('test.csv', 'readvariablenames', true);
end
There might be more efficient solutions (like simply removing the x___).
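For example, a hedged one-liner (assuming the only change needed is stripping the prefix from the affected variable names):
T.Properties.VariableNames = regexprep(T.Properties.VariableNames, '^x___', '');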
I have a text file with numbers that looks like this:
a.txt
0.001240242769
-0.000829468827
-0.0001689229831
0.0008228798977
-3.86881172e-05
In MATLAB I used to be able to use
x_in = importdata('a.txt');
and x_in was in fact a vector of 5 double numbers.
I don't know what I changed yesterday, but all of a sudden when I use the same function it comes back as char:
x_in='0.001240242769
-0.000829468827
-0.0001689229831
0.0008228798977
-3.86881172e-05'
What did I change and how can I fix it back?
The likely cause is that the text file has something that is not recognized as a number or delimiter. I would suggest using load as follows:
x_in = load('a.txt', '-ascii');
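If importdata keeps returning char, a hedged fallback (assuming the file is just a whitespace-separated list of numbers) is to convert the string yourself:
x_in = importdata('a.txt');
if ischar(x_in)
    x_in = str2double(strsplit(strtrim(x_in)))';   % column vector of doubles
end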
I am creating a MATLAB application that analyzes data on a daily basis.
The data is read in from a csv file using xlsread():
[num, weather, raw]=xlsread('weather.xlsx');
% weather.xlsx is a spreadsheet that holds a list of other files (csv) i
% want to process
for i = 1:length(weather)
fn = [char(weather(i)) '.csv'];
% now read in the weather file, get data from the local weather files
fnOpen = xlsread(fn);
% now process the file to save out the .mat file with the location name
% for example, one file is dallasTX, so I would like that file to be
% saved as dallasTx.mat
% the next is denverCO, and so denverCO.mat, and so on.
% but if I try...
fnSave=[char(weather(i)) '.mat'] ;
save(fnSave, fnOpen) % this doesn't work
% I will be doing quite a bit of processing of the data in another
% application that will open each individual .mat file
end
++++++++++++++
Sorry about not providing the full information.
The error I get when I do the above is:
Error using save
Argument must contain a string.
And Xiangru and Wolfie, the save(fnSave, 'fnOpen') works as you suggested it would. Now I have a dallasTX.mat file, and the variable name inside is fnOpen. I can work with this now.
Thanks for the quick response.
It would be helpful if you provide the error message when it doesn't work.
For this case, I think the problem is the syntax for save. You will need to do:
save(fnSave, 'fnOpen'); % note the quotes
Also, you may use weather{i} instead of char(weather(i)).
From the documentation, when using the command
save(filename, variables)
variables should be as described:
Names of variables to save, specified as one or more character vectors or strings. When using the command form of save, you do not need to enclose the input in single or double quotes. variables can be in one of the following forms.
This means you should use
save(fnSave, 'fnOpen');
Since you want to also use a file name stored in a variable, command syntax isn't ideal as you'd have to use eval. In this case the alternative option would be
eval(['save ', fnSave, ' fnOpen']);
If you had a fixed file name (for future reference), this would be simpler
save C:/User/Docs/MyFile.mat fnOpen
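Putting the pieces together, a hedged sketch of the corrected loop (assuming weather is the cell array of location names read from weather.xlsx):
[~, weather] = xlsread('weather.xlsx');
for i = 1:length(weather)
    fn = [weather{i} '.csv'];       % e.g. 'dallasTX.csv'
    fnOpen = xlsread(fn);           % numeric weather data
    fnSave = [weather{i} '.mat'];   % e.g. 'dallasTX.mat'
    save(fnSave, 'fnOpen');         % note the quotes around the variable name
end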
I have several tables in MATLAB that I would like to write to one .csv file, vertically concatenated. I would like to keep the column names from each table as the top row, and would like to use a loop to write the csv. The ultimate goal is to read the data into R, but R.matlab did not work well. Suggestions about how to do this?
Alternatively how can I change filenames in a for loop using the iterator?
e.g. along the lines of
for i=1:10
writecsv('mydatai.csv',data(i))
end
So I must have at the end 10 csv files as output.
You can change the filename within the loop by using the sprintf string formatting function, for example:
dlmwrite(sprintf('mydata%i.csv', i), data(i))
Note that the %i portion of the string is the sprintf formatting operator for an integer; it is just a coincidence that you also decided to name your iterator variable 'i'.
You can append extra data to an existing CSV by using the dlmwrite function, which uses a comma delimiter as the default, and including the '-append' flag.
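For instance, a hedged sketch of appending the numeric part of a table T to an existing CSV (note that table2array drops the column names, so the header row has to be handled separately):
M = table2array(T);                      % numeric body of the table
dlmwrite('alldata.csv', M, '-append');   % comma-delimited by default, appended below existing rows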
Another way would be to use
writetable(Table,filename )
and to change the file name on every iteration you can use
filename = ['mydata' num2str(i) '.csv']
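Putting that together, a minimal sketch (assuming your tables live in a cell array called tables):
for i = 1:numel(tables)
    filename = ['mydata' num2str(i) '.csv'];
    writetable(tables{i}, filename);   % column names become the header row
end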
I have a file full of ascii data. How would I append a string to the first line of the file? I cannot find that sort of functionality using fopen (it seems to only append at the end and nothing else.)
The following is a pure MATLAB solution:
% write first line
dlmwrite('output.txt', 'string 1st line', 'delimiter', '')
% append rest of file
dlmwrite('output.txt', fileread('input.txt'), '-append', 'delimiter', '')
% overwrite on original file
movefile('output.txt', 'input.txt')
Option 1:
I would suggest calling some system commands from within MATLAB. One possibility on Windows is to write your new line of text to its own file and then use the DOS for command to concatenate the two files. Here's what the call would look like in MATLAB:
!for %f in ("file1.txt", "file2.txt") do type "%f" >> "new.txt"
I used the ! (bang) operator to invoke the command from within MATLAB. The command above sequentially pipes the contents of "file1.txt" and "file2.txt" to the file "new.txt". Keep in mind that you will probably have to end the first file with a new line character to get things to append correctly.
Another alternative to the above command would be:
!for %f in ("file2.txt") do type "%f" >> "file1.txt"
which appends the contents of "file2.txt" to "file1.txt", resulting in "file1.txt" containing the concatenated text instead of creating a new file.
If you have your file names in strings, you can create the command as a string and use the SYSTEM command instead of the ! operator. For example:
a = 'file1.txt';
b = 'file2.txt';
system(['for %f in ("' b '") do type "%f" >> "' a '"']);
Option 2:
One MATLAB only solution, in addition to Amro's, is:
dlmwrite('file.txt',['first line' 13 10 fileread('file.txt')],'delimiter','');
This uses FILEREAD to read the text file contents into a string, concatenates the new line you want to add (along with the ASCII codes for a carriage return and a line feed/new line), then overwrites the original file using DLMWRITE.
I get the feeling Option #1 might perform faster than this pure MATLAB solution for huge text files, but I don't know that for sure. ;)
How about using the frewind(fid) function to take the pointer to the beginning of the file?
I had a similar requirement and tried frewind() followed by the necessary fprintf() statement.
But, warning: it will overwrite whatever is on the 1st line. Since in my case I was the one writing the file, I put dummy data at the start of the file and then, at the end, let that be overwritten by the operations specified above.
BTW, I am facing one problem with this solution: depending on the length (/size) of the dummy data and the actual data, the program either leaves part of the dummy data on the same line, or pushes my new data to the 2nd line.
Any tip in this regard is highly appreciated.
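A hedged sketch of that padding idea (the dummy first line is written with a fixed width, so the final overwrite fits exactly and nothing spills onto the next line; the file name, the data, and the 40-character width are just placeholders):
data = rand(5, 1);                        % placeholder data for the example
fid = fopen('out.txt', 'w');
fprintf(fid, '%-40s\n', '');              % reserve a fixed-width dummy first line
fprintf(fid, '%f\n', data);               % write the rest of the file
frewind(fid);                             % move the pointer back to the start
fprintf(fid, '%-40s', 'my header text');  % overwrite the dummy line, same width, no newline
fclose(fid);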