knitr chapter bibliographies from .Rnw files?

One can create per-chapter bibliographies using BibTeX with the natbib and chapterbib packages and plain .tex files, as shown in this MWE:
% main.tex
\documentclass{report}
\usepackage{natbib}
\usepackage{chapterbib}
\begin{document}
\include{chap1}
\include{chap2}
\end{document}
% chap1.tex
\chapter{one chapter}
text~\cite{paper1}
text~\cite{paper2}
\bibliographystyle{plainnat}
\bibliography{biblio}
% chap2.tex
\chapter{another chapter}
text~\cite{paper2, paper3}
\bibliographystyle{plainnat}
\bibliography{biblio}
% biblio.bib
@Article{paper1,
author = {John Smith},
title = {A title},
journal = {A Journal},
year = {2010}
}
@Article{paper2,
author = {John Doe},
title = {A paper},
journal = {Another journal},
year = {2009}
}
@Article{paper3,
author = {Yuppie Networking},
title = {My paper},
journal = {The best journal},
year = {2000}
}
The above files can be successfully compiled with the following script:
% compile.bash
#!/bin/bash
pdflatex main.tex
for auxfile in chap*.aux
do
bibtex `basename $auxfile .aux`
done
pdflatex main.tex
pdflatex main.tex
I wish to re-create the above functionality using knitr, BibTeX, and .Rnw files (not .tex files). Here's a small non-working example. (I know this example lacks R code chunks, but my use case will have R code chunks.)
% main.Rnw
\documentclass{report}
\usepackage{natbib}
\usepackage{chapterbib}
\begin{document}
\Sexpr{knitr::knit_child("chap1.Rnw")}
\Sexpr{knitr::knit_child("chap2.Rnw")}
\end{document}
% chap1.Rnw
\chapter{one chapter}
text~\cite{paper1}
text~\cite{paper2}
\bibliographystyle{plainnat}
\bibliography{biblio}
% chap2.Rnw
\chapter{another chapter}
text~\cite{paper2, paper3}
\bibliographystyle{plainnat}
\bibliography{biblio}
% knit_script.R
knitr::knit(input = "main.Rnw")
% compileRnw.bash
/usr/local/bin/Rscript knit_script.R
pdflatex main.tex
for auxfile in chap*.aux
do
bibtex `basename $auxfile .aux`
done
pdflatex main.tex
pdflatex main.tex
When I run the compileRnw.bash script, the resulting document lacks bibliographies.
What should main.Rnw and compileRnw.bash look like to create a document with chapter bibliographies from the .Rnw files?
Thanks in advance for any help!

\Sexpr{knitr::knit_child("chap1.Rnw")} knits chap1.Rnw and returns the knitted content in place (what would otherwise be chap1.tex) instead of an \include{chap1} command. That is the difference between using .Rnw (your non-working example) and plain .tex (your first working example): there is one and only one .tex output file, main.tex, so there is no chap1.tex and therefore no chap1.aux. You will only get one .aux file, main.aux, so you need to run bibtex on main.aux instead of chap*.aux. Your shell script should be:
pdflatex main.tex
bibtex main.aux
pdflatex main.tex
pdflatex main.tex
In fact, you do not really need this shell script, because knitr::knit2pdf('main.Rnw') does all these steps automatically. Or equivalently, you can click the button "Compile PDF" on the RStudio toolbar if you use RStudio.
BTW, I don't know why you had to declare
\bibliographystyle{plainnat}
\bibliography{biblio}
in each chapter. That seems to be unnecessary. You can declare the bib style and database once in main.Rnw. If you duplicate them in each chapter, bibtex will complain, although it doesn't really hurt.

Thanks for this response!
A point of clarification.
In the examples, I use
\bibliographystyle{plainnat}
\bibliography{biblio}
in every chapter file, because I want a bibliography at the end of every chapter.
To be specific, I want 2 bibliographies, one after Chapter 1 (containing only references from Chapter 1)
and another after Chapter 2 (containing only references from Chapter 2).
Thanks to the answer from Yihui,
I found that the following changes produce the output that I want.
In main.Rnw:
\documentclass{report}
\usepackage{natbib}
\usepackage{chapterbib}
\begin{document}
% \Sexpr{knitr::knit_child("chap1.Rnw")}
% \Sexpr{knitr::knit_child("chap2.Rnw")}
\include{chap1}
\include{chap2}
\end{document}
In knit_script.R:
# knit_script.R
# knitr::knit(input = "main.Rnw")
knitr::knit(input = "chap1.Rnw")
knitr::knit(input = "chap2.Rnw")
knitr::knit(input = "main.Rnw")
With those changes, I obtain a bibliography for each chapter.

Related

Edit an Abaqus input file and Run it from Matlab

I need to perform 50 Abaqus simulations; each simulation analyses a certain material property, and each differs by one parameter. The idea is to write a MATLAB script that:
opens the .inp file
edits the material parameter of interest
prints it into a new file which will be the new .inp file
runs it to perform the simulation
This is what I accomplished so far in a very simplified version:
f = fopen('PRD8_30s.inp');
c = textscan(f, '%s %s %s %s %s ', 'delimiter', ',');
fclose(f);
S = [c{1}];
A = {'5e-08'};
S(12496) = A;
fid = fopen('file.inp', 'w');
fprintf(fid, '%s \n', S{:});
fclose(fid);
I manually found that the parameter of interest is at position 12496 in S (just below the line *Viscoelastic). The code does change the parameter I need, but there are major problems: it prints a new file with more lines than the original .inp (12552 vs 8737), and it only prints the first column rather than the entire .inp.
How can I edit the .inp changing the parameter and obtaining a new .inp with the edited parameter that can be used to run the new simulation?
Thank you in advance for your help!
If your input file is not multiple gigabytes in size, the following might help.
Create a template input file and mark the parameter you want to change with a placeholder, for example para_xxxx.
Then use the following script:
text = fileread('template.inp');
newtext = replace(text, 'para_xxxx', newParameter);  % newParameter must be a char/string, e.g. '5e-08'
fid = fopen('newcase.inp', 'w');
fprintf(fid, '%s', newtext);   % use '%s' so % and \ in the file are not treated as format specifiers
fclose(fid);
The file name 'newcase.inp' should be updated each time in the loop.
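For instance, a hedged sketch of such a loop (the parameter values and file names are just placeholders) could look like this:
% Sketch: write one .inp per parameter value from a template (names are placeholders).
paramValues = [1e-8 2e-8 5e-8];           % assumed list of values to sweep
text = fileread('template.inp');          % template containing the marker para_xxxx
for k = 1:numel(paramValues)
    newtext = replace(text, 'para_xxxx', sprintf('%g', paramValues(k)));
    newname = sprintf('case_%02d.inp', k);    % a different file name each iteration
    fid = fopen(newname, 'w');
    fprintf(fid, '%s', newtext);
    fclose(fid);
end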

Kill Excel process created in Matlab

Because I write to a workbook, I currently kill all Excel processes so that my code works when I call it in a loop.
xlswrite(path,values);
system('taskkill /F /IM EXCEL.EXE');
This makes it impossible to run the code while I am working in another Excel file. How do I make MATLAB terminate only the Excel processes that it created itself?
This was a "feature" introduced somewhere around R2015b to speed up multiple writes to Excel... not very user/memory friendly!
The xlswrite documentation links to this MathWorks Support Answer on writing to Excel manually using actxserver; you can then delete the COM object referencing Excel yourself.
You can actually edit xlswrite and see that it uses matlab.io.internal.getExcelInstance, which does the same thing and creates a COM interface with actxserver.
A cheeky option would be to copy xlswrite, add the Excel variable it creates as an output, and Quit and delete it afterwards, as shown below. I don't advocate breaking any of The MathWorks' copyright ownership of that function.
A less cheeky option would be to create a comparable function based on the answer I linked above; for writing data only, it would look something like this:
function xlswriteClean( File, Data, Range )
% XLSWRITECLEAN writes data like XLSWRITE, but deletes the Excel instance!
%   XLSWRITECLEAN(FILE, DATA, RANGE) writes the variable in DATA to FILE,
%   in the range specified by RANGE. RANGE is optional and defaults to 'A1'.

% Obtain the full path name of the file.
% Could handle this more elegantly: this always assumes the current
% directory, but the user might give a full path.
file = fullfile(pwd, File);
% Open an ActiveX connection to Excel
h = actxserver('excel.application');
% Create a new workbook (Excel file)
wb = h.WorkBooks.Add();
% Select the appropriate range
if nargin < 3
    Range = 'A1';
end
rng = h.Activesheet.get('Range', Range);
% Write the data to the range
rng.value = Data;
% Save the file with the full path name, close Excel
wb.SaveAs( file );
% Clean up - the point of this function
wb.Close;
h.Quit;
h.delete;
end
You can customise basically everything within the new Excel workbook using the COM object h, so you could add any functionality which you use in xlswrite like sheet naming etc.
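As a quick usage sketch (the file names here are arbitrary), calling the helper in a loop leaves no EXCEL.EXE processes behind, because each call quits and deletes its own instance:
% Each call opens, saves, and fully closes its own Excel instance.
for k = 1:3
    % range sized to match the 4x4 data being written
    xlswriteClean(sprintf('results_%d.xlsx', k), magic(4) * k, 'A1:D4');
end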
You can start the Excel process via PowerShell and capture its process ID, then use that ID to kill only that process:
[~, pid] = system('powershell (Start-Process excel.exe -passthru).Id');
% Do your work
% ...
system(['powershell Stop-Process -Id ' pid])
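If you go this route, it may be worth guarding the kill with onCleanup so the process is stopped even if the intervening work errors; a hedged sketch:
% Start a dedicated Excel process and make sure it is stopped even on error.
[~, pid] = system('powershell (Start-Process excel.exe -passthru).Id');
pid = strtrim(pid);                                    % drop the trailing newline
killer = onCleanup(@() system(['powershell Stop-Process -Id ' pid]));
% ... do your Excel work here ...
clear killer   % runs the Stop-Process command now (or when killer goes out of scope)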

How do you view the plots saved by the print() function in Matlab?

This is probably a simple question, but I have some data files. After printing a plot of each one to a '-dpng' file, I just want to view the plots and copy them elsewhere. However, when I open one of the output files, all I see is the "Import Wizard"; I click "Finish" and nothing happens. This is my code:
files = dir('*.csv');
for file = files'
    lab5 = csvread(file.name, 9)
    lab5(:,1) = log10(lab5(:,1))
    plot(lab5(:,1),lab5(:,2))
    print(strcat(file.name,'plot'), '-dpng')
end
I tried to avoid print() by using savefig, but for some reason savefig was giving me a vague error. Only print works, but I'm not sure how to view the output.
You are saving your image as filename.csvplot, which Preview does not accept as a valid image file.
For example:
% Generate dummy file
fID = fopen('blah.csv', 'w');
fclose(fID);
% Recreate print job
files = dir('*.csv');
plot(1:10)
fname = strcat(files(1).name, 'plot');
print(fname, '-dpng');
Which gives us:
fname =
blah.csvplot
Why isn't .png appended to the filename? Per the documentation for print:
If the file name does not include an extension, then print appends the appropriate one.
Filename inputs are parsed for an extension (the text following a final .), and print does not append an extension if one is found. In this case, the filename passed to print already has the .csvplot extension. This might be unexpected, but it does make sense: file extensions don't actually control anything about the file itself; you could save your file as image.finderpleaseopen and it would still be a valid PNG file. Finder is just too stubborn to open it without being forced, because it's not a known, supported file extension.
To fix this, you should save your file with the correct file extension. There are two ways to do this: append the correct extension yourself, or remove the undesired extension with something like fileparts or regexprep and let print handle it for you.
For example:
% blah.csvplot.png
fname = strcat(files(1).name, 'plot', '.png');
print(fname, '-dpng');
% blahplot.png
[~, filename] = fileparts(files(1).name);
fname = strcat(filename, 'plot');
print(fname, '-dpng');
savefig does not produce output that Finder can preview either, because it refuses to produce anything without a .fig extension:
If the specified file name does not include a .fig file extension, then MATLAB appends the extension. savefig does not accept other file extensions.
*.fig files are not image files and cannot be opened natively by Finder.
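That said, if you do end up with .fig files, you can open them in MATLAB and export them to an image format yourself; a small sketch (the file name is hypothetical):
% Re-open a saved figure and export it as a PNG that Finder/Preview can display.
fig = openfig('blahplot.fig');      % hypothetical .fig produced by savefig
print(fig, 'blahplot', '-dpng');    % writes blahplot.png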

Display TODO/FIXME report in Matlab's command window

Using the command
checkcode('function.m')
you can run the code analyzer on an m-file and output the report in the command window.
Is there any way to do this for TODO/FIXME reports? (without having to cd to the folder that contains the function and manually run it on the whole directory)
Bonus: If so, is it also possible to create custom tags? In eclipse, you can create custom TODO tags like "MTODO" and "JTODO" for different purposes/different people and have them displayed separately. Is this possible in Matlab?
Thanks in advance for any help! I will be continuing my google searches and will post the results if I find something.
You can use the internal function dofixrpt. This returns the HTML displayed in the report, rather than displaying the information at the command line, however.
% Run the report and show it
cd('myfolder')
dofixrpt;
% Alternatively, get the HTML of the report directly
html = dofixrpt;
% Write the HTML to a file
filename = tempname;
fid = fopen(filename, 'w');
fprintf(fid, '%s', html);
fclose(fid);
% View the HTML file
web(filename)
Type which dofixrpt or edit dofixrpt to see more details about what it does (it's basically a regular expression search for %.*TODO and %.*FIXME).
In the HTML report, you can find markers other than TODO and FIXME by specifying a custom marker (the default is NOTE). Unfortunately you can specify only one. If you're up to looking within dofixrpt and modifying it very slightly though, it would be very easy to make it look for more.
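Alternatively, since the report boils down to a regular-expression search, you can roll a small standalone search for your own tags without touching dofixrpt at all; a sketch (the MTODO/JTODO tags are just examples, and splitlines needs R2016b or later):
% Search every .m file in the current folder for custom comment markers.
markers = {'TODO', 'FIXME', 'MTODO', 'JTODO'};   % example custom tags
files = dir(fullfile(pwd, '*.m'));
for f = 1:numel(files)
    txt   = fileread(fullfile(pwd, files(f).name));
    lines = splitlines(txt);
    for k = 1:numel(lines)
        for m = 1:numel(markers)
            if ~isempty(regexp(lines{k}, ['%.*' markers{m}], 'once'))
                fprintf('%s (line %d): %s\n', files(f).name, k, strtrim(lines{k}));
                break   % report each line at most once
            end
        end
    end
end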
Finally you could also put in an enhancement request with MathWorks to provide a command similar to checkcode that will just do this for you and return the results at the command line. It seems like that would be very easy for them to do, and I'm surprised they haven't already done it, given that they've done something similar for helprpt, coveragerpt, deprpt etc.
Hope that helps!
I ended up writing my own code checker that calls checkcode on every m file in the specified folders.
fld_list = {pwd, 'folder', 'other_folder'};
nProblems = 0;
for iFld = 1:length(fld_list)
    % fprintf('Checking %s...\n', fld_list{iFld});
    files = dir(fullfile(fld_list{iFld}, '*.m'));
    for f = 1:length(files)
        filename = fullfile(fld_list{iFld}, files(f).name);
        customCodeCheck(filename); % custom function
        % check code analyzer
        codeWarnings = checkcode(filename);
        if not(isempty(codeWarnings))
            fprintf('Problem found in %s\n', files(f).name);
            for iData = 1:length(codeWarnings)
                nProblems = nProblems + 1;
                % print out a clickable link to the problem line
                fprintf('<a href="matlab: opentoline(''%s'',%d)">line %d</a>: %s\n', ...
                    filename, ...
                    codeWarnings(iData).line, codeWarnings(iData).line, ...
                    codeWarnings(iData).message);
            end
        end
    end
end
You can add to this a customCodeCheck function that searches for TODO and FIXME and alerts you to their existence
function customCodeCheck(filename)
fileContents = fileread(filename);
toDos = strfind(fileContents, 'TODO');
fixMes = strfind(fileContents, 'FIXME');
% report the findings (or do other stuff with them)
if ~isempty(toDos) || ~isempty(fixMes)
    fprintf('%s: %d TODO(s), %d FIXME(s)\n', filename, numel(toDos), numel(fixMes));
end
end

Abaqus *.inp file created using Matlab

I was trying to do parametric studies in ABAQUS. I created a master *.inp file using the Abaqus GUI, then wrote MATLAB code to create a new *.inp file from the master. The master *.inp file can be found here and is required to run the code.
In the new *.inp file everything is the same as in the master except a few specific lines that I change for the parametric studies; the code is given below. The files are generated nicely, but ABAQUS can't read them and gives error messages. By visual inspection I don't find any faults, so I guess MATLAB is writing the *.inp file in some format that ABAQUS can't interpret.
clc;
% Number of lines to be copied
total_lines = 4538;    % total number of lines
lines_b4_RP1 = 4406;   % lines before reference point 1
% creating new files
for A = 0
    for R = [20 30 40 50 100 200 300 400 500]
        fileroot = sprintf('P_SHS_120X120X1_NLA_I15_A%dR%d.inp', A, R);
        main_inp = fopen('P_SHS_120X120X1_NLA_I15_A0R10.inp', 'r'); % the main inp file to be copied
        wfile = fopen(fileroot, 'w+');                              % wfile = the new file being written
        for i = 1:total_lines
            data = fgets(main_inp);
            if i < lines_b4_RP1
                fprintf(wfile, '%s\n', data);
            elseif i == lines_b4_RP1
                formatline1 = '%s\n';
                txtline = '*Node';
                fprintf(wfile, formatline1, txtline);
            elseif i == (lines_b4_RP1+1)
                formatline2 = '%d%s%d%s%d%s%d\r\n';
                comma = ',';
                refpt1 = 1;
                xcoord1 = R*cosd(A);
                ycoord1 = R*sind(A);
                zcoord1 = -20;
                fprintf(wfile, formatline2, refpt1, comma, xcoord1, comma, ycoord1, comma, zcoord1);
            elseif i == (lines_b4_RP1+2)
                fprintf(wfile, formatline1, txtline);
            elseif i == (lines_b4_RP1+3)
                refpt2 = 2;
                xcoord2 = R*cosd(A);
                ycoord2 = R*sind(A);
                zcoord2 = 420;
                fprintf(wfile, formatline2, refpt2, comma, xcoord2, comma, ycoord2, comma, zcoord2);
            elseif i > (lines_b4_RP1+3)
                fprintf(wfile, '%s\n', data);
            else
                break;
            end
        end
        fclose(main_inp);
        fclose(wfile);
    end
end
Thanks in advance.
N.B. A sample *.dat file containing the error message is given here.
You are using fgets to get each line of the input file. From the MATLAB help:
fgets: Read line from file, keeping newline characters
You then print each line using
fprintf(wfile,'%s\n', data);
This writes two newlines at the end of each data line in the output file. A second problem is that you use \r\n in your format specifier; in MATLAB (unlike C) this also gives you two newlines, e.g.:
>> fprintf('Hello\rWorld\nFoo\r\nBar\n')
Hello
World
Foo
Bar
>>
I would suggest testing this approach on a much simpler input file first. Also, there is a
*preprint
option that allows you to echo the contents of the input file back into the .dat file. This creates big .dat files, but it is useful for debugging.
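Putting the two fixes together, a minimal sketch of the corrected copy loop (reusing the file names, line counts, and variable names from the question; not tested against the actual model) might look like this:
% Corrected write pattern: fgets keeps the trailing newline, so copy lines with
% '%s' only, and write the inserted node lines with a single '\n'.
total_lines  = 4538;
lines_b4_RP1 = 4406;
A = 0; R = 20;
main_inp = fopen('P_SHS_120X120X1_NLA_I15_A0R10.inp', 'r');
wfile    = fopen(sprintf('P_SHS_120X120X1_NLA_I15_A%dR%d.inp', A, R), 'w');
for i = 1:total_lines
    data = fgets(main_inp);             % data already ends with a newline
    if i == lines_b4_RP1 || i == lines_b4_RP1 + 2
        fprintf(wfile, '*Node\n');      % replacement keyword line
    elseif i == lines_b4_RP1 + 1
        fprintf(wfile, '%d, %g, %g, %g\n', 1, R*cosd(A), R*sind(A), -20);
    elseif i == lines_b4_RP1 + 3
        fprintf(wfile, '%d, %g, %g, %g\n', 2, R*cosd(A), R*sind(A), 420);
    else
        fprintf(wfile, '%s', data);     % '%s' only: no extra newline added
    end
end
fclose(main_inp);
fclose(wfile);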