ccache configuration

I have a question about ccache configuration. In our development environment we have hundreds of makefiles that build objects using absolute paths.
I wanted to speed up the process and use ccache. Unfortunately, when compiling from different locations I see cache misses. Below is an example of a
simplified situation where identical source files are placed in different directories. How do I have to set up ccache to get a proper hit ratio?
I tried playing with the CCACHE_BASEDIR variable, with no success:
developer@crunchbang:~$ pwd
/home/developer
developer@crunchbang:~$ ccache -s
cache directory /home/developer/.ccache
cache hit (direct) 0
cache hit (preprocessed) 0
cache miss 0
files in cache 0
cache size 0 Kbytes
max cache size 1.0 Gbytes
developer@crunchbang:~$ ccache g++ -c /home/developer/unique_name1/contest.cpp
developer@crunchbang:~$ ccache g++ -c /home/developer/unique_name2/contest.cpp
developer@crunchbang:~$ ccache -s
cache directory /home/developer/.ccache
cache hit (direct) 0
cache hit (preprocessed) 0
cache miss 2
files in cache 4
cache size 16 Kbytes
max cache size 1.0 Gbytes
developer@crunchbang:~$ ccache g++ -c /home/developer/unique_name1/contest.cpp
developer@crunchbang:~$ ccache g++ -c /home/developer/unique_name2/contest.cpp
developer@crunchbang:~$ ccache -s
cache directory /home/developer/.ccache
cache hit (direct) 2
cache hit (preprocessed) 0
cache miss 2
files in cache 4
cache size 16 Kbytes
max cache size 1.0 Gbytes
developer@crunchbang:~$ ccache --version
ccache version 3.1.7
Copyright (C) 2002-2007 Andrew Tridgell
Copyright (C) 2009-2011 Joel Rosdahl
This program is free software; you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by the Free Software
Foundation; either version 3 of the License, or (at your option) any later
version.

Have you considered changing your Makefiles to use relative paths? You could use a technique like the one mentioned in this post to do this without having to make too many changes.
Note additionally: CCACHE_BASEDIR makes paths relative to the current working directory (something which could perhaps be specified a bit more clearly in the manpage). This means your two compilation commands will effectively become (with CCACHE_BASEDIR=/home/developer):
developer@crunchbang:~$ ccache g++ -c unique_name1/contest.cpp
developer@crunchbang:~$ ccache g++ -c unique_name2/contest.cpp
In other words: they will still be different.
This issue will only be resolved if you compile inside the unique_name directories.
For example
developer@crunchbang:~$ cd /home/developer/unique_name1 && ccache g++ -c /home/developer/unique_name1/contest.cpp
developer@crunchbang:~$ cd /home/developer/unique_name2 && ccache g++ -c /home/developer/unique_name2/contest.cpp
Will result in:
developer@crunchbang:~$ ccache g++ -c contest.cpp
developer@crunchbang:~$ ccache g++ -c contest.cpp
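A minimal sketch of the whole recipe, runnable end to end (assuming ccache and g++ are installed; the directory layout and the trivial source file are made up for the demo):

```shell
# Compile the same relative path from inside each directory, so the
# second compilation can hit the cache.
command -v ccache >/dev/null 2>&1 || exit 0   # skip the demo if ccache is missing
command -v g++ >/dev/null 2>&1 || exit 0
work=$(mktemp -d)
export CCACHE_DIR="$work/cache"               # throwaway cache just for this demo
mkdir -p "$work/unique_name1" "$work/unique_name2"
echo 'int main() { return 0; }' > "$work/unique_name1/contest.cpp"
cp "$work/unique_name1/contest.cpp" "$work/unique_name2/contest.cpp"
(cd "$work/unique_name1" && ccache g++ -c contest.cpp)   # miss: fills the cache
(cd "$work/unique_name2" && ccache g++ -c contest.cpp)   # same relative path: should hit
ccache -s                                                # statistics should now show a hit
```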

The 2 cache misses still shown after the second pair of compilations are just the old statistics from the first run.
You can run "ccache -z" to zero the statistics before you compile again.

Related

wget --warc-file --recursive, prevent writing individual files

I run wget to create a warc archive as follows:
$ wget --warc-file=/tmp/epfl --recursive --level=1 http://www.epfl.ch/
$ l -h /tmp/epfl.warc.gz
-rw-r--r-- 1 david wheel 657K Sep 2 15:18 /tmp/epfl.warc.gz
$ find .
./www.epfl.ch/index.html
./www.epfl.ch/public/hp2013/css/homepage.70a623197f74.css
[...]
I only need the epfl.warc.gz file. How do I prevent wget from creating all the individual files?
I tried as follows:
$ wget --warc-file=/tmp/epfl --recursive --level=1 --output-document=/dev/null http://www.epfl.ch/
ERROR: -k or -r can be used together with -O only if outputting to a regular file.
tl;dr Add the options --delete-after and --no-directories.
Option --delete-after instructs wget to delete each downloaded file immediately after its download is complete. As a consequence, the maximum disk usage during execution will be the size of the WARC file plus the size of the single largest downloaded file.
Option --no-directories prevents wget from leaving behind a useless tree of empty directories. By default wget creates a directory tree that mirrors the one on the host, and downloads each file into the appropriate directory of the mirrored tree. wget does this even when the downloaded file is temporary due to --delete-after. To prevent that, use option --no-directories.
The below demonstrates the result, using your given example (slightly altered).
$ cd $(mktemp -d)
$ wget --delete-after --no-directories \
--warc-file=epfl --recursive --level=1 http://www.epfl.ch/
...
Total wall clock time: 12s
Downloaded: 22 files, 1.4M in 5.9s (239 KB/s)
$ ls -lhA
-rw-rw-r--. 1 chadv chadv 1.5M Aug 31 07:55 epfl.warc
If you forget to use --no-directories, you can easily clean up the tree of empty directories with find -type d -delete.
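A quick way to see that cleanup in action, with no network access needed (the mirror tree is simulated here with mkdir; the directory names are made up):

```shell
# Simulate the empty mirror tree wget leaves behind, then delete it.
tmp=$(mktemp -d)
mkdir -p "$tmp/www.example.org/public/css" "$tmp/www.example.org/public/js"
find "$tmp" -mindepth 1 -type d -delete   # -delete implies depth-first order
ls -A "$tmp"                              # prints nothing: the tree is gone
```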
For individual files (without --recursive), the option -O /dev/null makes wget not create a file for the output. For recursive fetches, /dev/null is not accepted (I don't know why). But why not just write all the output concatenated into one single file via -O tmpfile and delete that file afterwards?

Eclipse Makefile: Make Variables are skipped

I have a project with a Makefile in it; on a Unix console it works fine: it compiles, builds, and I can run the binary at the end.
I imported the project into an Eclipse workspace, and somehow the Makefile module of Eclipse cannot build the project now. It gives the following error:
g++: error: /src/main: No such file or directory
Whereas there should have been
g++ -I $(APR_INCLUDE) -I $(CMS_HOME)/src/main
which uses two make variables. I already put them before this line and defined them as:
export APR_INCLUDE=/usr/include/apr-1
export CMS_HOME=~/Desktop/activemq-cpp-library-3.8.4
The same Makefile works fine from the console, but not in Eclipse, which is weird.
Any thoughts?
Here is where I put my export lines:
obstacleDetection_cpp: src/obstacleDetection.cpp protoc_middleman
export APR_INCLUDE=/usr/include/apr-1
export CMS_HOME=~/Desktop/activemq-cpp-library-3.8.4
g++ -I $(APR_INCLUDE) -I $(CMS_HOME)/src/main -g -o src/obstacleDetection.o -c src/obstacleDetection.cpp
cd libs && cp $(CMS_HOME)/src/main/.libs/libactivemq-cpp.so.18.0.4 . && ln -sf libactivemq-cpp.so.18.0.4 libactivemq-cpp.so.18
g++ -L $(CMS_HOME)/src/main/.libs/ -g -o bin/obstacleDetection src/obstacleDetection.o src-gen/Point.pb.cc src-gen/Point.pb.h -lactivemq-cpp -lssl -lprotobuf -pthread
#echo "Success. Run the executable from the binary directory with: LD_LIBRARY_PATH=../libs/ ./obstacleDetection"
This is not right:
obstacleDetection_cpp: src/obstacleDetection.cpp protoc_middleman
export APR_INCLUDE=/usr/include/apr-1
export CMS_HOME=~/Desktop/activemq-cpp-library-3.8.4
g++ -I $(APR_INCLUDE) -I $(CMS_HOME)/src/main ...
All lines in the recipe (that is, lines that are indented with a TAB in a target context like this) are passed to the shell. These are not make variable assignments. There are two things wrong with that:
First, each logical line in the recipe is passed to a new shell. That means any changes to the process context (such as the environment or the working directory) are present only for the duration of that logical line; once the shell processing that line exits, all those changes are lost. So, these lines have no impact: they set an environment variable in the shell, then the shell exits and that setting is gone.
Second, the variable references you make in your compile line, such as $(APR_INCLUDE), are make variable references, not environment variable references. So even if those environment variable assignments still had effect, they would not be used because you're not referring to environment variables here.
You want to create make variable assignments. That can only be done outside of a recipe. Also, you don't need to export them because only make needs to see them (make will expand them before invoking the shell). So, your makefile should look like this:
APR_INCLUDE = /usr/include/apr-1
CMS_HOME = $(HOME)/Desktop/activemq-cpp-library-3.8.4
obstacleDetection_cpp: src/obstacleDetection.cpp protoc_middleman
g++ -I $(APR_INCLUDE) -I $(CMS_HOME)/src/main -g -o src/obstacleDetection.o -c src/obstacleDetection.cpp
cd libs && cp $(CMS_HOME)/src/main/.libs/libactivemq-cpp.so.18.0.4 . && ln -sf libactivemq-cpp.so.18.0.4 libactivemq-cpp.so.18
g++ -L $(CMS_HOME)/src/main/.libs/ -g -o bin/obstacleDetection src/obstacleDetection.o src-gen/Point.pb.cc src-gen/Point.pb.h -lactivemq-cpp -lssl -lprotobuf -pthread
#echo "Success. Run the executable from the binary directory with: LD_LIBRARY_PATH=../libs/ ./obstacleDetection"
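The first point above (each logical recipe line gets its own shell) is easy to demonstrate with a throwaway two-line recipe; DEMO_VAR is a made-up name, and make must be installed:

```shell
# The export on the first recipe line is already gone when the second
# line runs, because each line is executed by a fresh shell.
tmp=$(mktemp -d)
printf 'demo:\n\t@export DEMO_VAR=bar\n\t@echo "DEMO_VAR is [$$DEMO_VAR]"\n' > "$tmp/Makefile"
(cd "$tmp" && make -s demo)   # prints: DEMO_VAR is []  (assuming DEMO_VAR is not in your environment)
```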

raspberry pi darkice Buffer overrun

I'm trying to do streaming with Darkice. The console shows:
root@raspberrypi:~# darkice
Using config file: /etc/darkice.cfg
Using ALSA DSP input device: hw:1,0
Using POSIX real-time scheduling, priority 98
Buffer overrun!
Buffer overrun!
...
Buffer overrun!
Buffer overrun!
DarkIce: AlsaDspSource.cpp:265: Input/output error [0]
root@raspberrypi:~#
I use:
Raspberry Pi B
Raspbian soft-float Debian “wheezy” (I need to use Java for other things)
Darkice 1.2
I compiled the sources with these commands:
codec vorbis
cd /opt/
tar -zxvf libvorbis-1.3.3.tar.gz
tar -zxvf libogg-1.3.1.tar.gz
cd ./libogg-1.3.1/
./configure
make
make install
cd ../libvorbis-1.3.3
./configure
make
make install
codec aacplus
cd ..
tar -zxvf libaacplus-2.0.2.tar.gz
cd ./libaacplus-2.0.2/
apt-get install autoconf #I need the utility "autoconf" for the installation
apt-get install libtool #and the library "libtool"
./autogen.sh
./configure
make
make install
codec lame / mp3
cd ..
tar -zxvf lame-3.99.5.tar.gz
cd ./lame-3.99.5/
./configure
make
make install
I'm not sure I need this:
wget http://www.mega-nerd.com/SRC/libsamplerate-0.1.8.tar.gz
tar -zxvf darkice-1.2.tar.gz
cd ./libsamplerate-0.1.8
./configure
make
make check
make install
ldconfig -v
sndfile-resample
darkice
cd ..
tar -zxvf darkice-1.2.tar.gz
cd ./darkice-1.2/
apt-get install libasound2-dev
./configure --with-vorbis-prefix=/usr/local/lib/ --with-lame-prefix=/usr/local/lib/ --with-aacplus-prefix=/usr/local/lib/ --with-alsa-prefix=/usr/lib/arm-linux-gnueabi/ --with-samplerate-prefix=/usr/local/lib --with-jack-prefix=/usr/lib/arm-linux-gnueabi/
make
make install
The installation ends fine for all packages.
The parameters of the file /etc/darkice.cfg are:
[general]
duration = 0
bufferSecs = 1
reconnect = yes
[input]
device = hw:1,0
sampleRate = 44100
bitsPerSample = 16
channel = 2
[shoutcast-0]
bitrateMode = cbr
bitrate = 56
format = mp3
quality = 1.0
...
The device hw:1,0 is correct, because I can record fine with Audacity.
I followed these instructions: http://www.t3node.com/blog/live-streami ... pberry-pi/
But the problem persists...
I saw other posts talking about this problem, but nothing helped.
Any ideas? What can I do?

Best way to package a binary which has two sources for different architectures

I'm trying to create an RPM of some software we get from an external entity. They provide us with tarballs of 32-bit binaries and of 64-bit binaries.
I'm wondering what the best way is to create a spec file that can handle both types of binaries.
I tried something like :
%prep
%ifarch i686
# Use Source0 (32bit)
%setup -c -T -a 0
%endif
%ifarch x86_64
# Use Source1 (64bit)
%setup -c -T -a 1
%endif
But this is giving me back :
+ %setup -c -T -a 1
/var/tmp/rpm-tmp.67731: line 25: fg: no job control
error: Bad exit status from /var/tmp/rpm-tmp.67731 (%prep)
I'm guessing this is due to the -a option given to %setup, which I believe means "change directory first, then extract source $arg1".
Is there a better way to do this?
I am not sure what is contained in your Source0 or Source1, but likely they are not tarballs, so I see no reason to call %setup. Instead, work with them like this:
%prep
#no %setup
%ifarch i686
#use %{SOURCE0}
%endif
%ifarch x86_64
#use %{SOURCE1}
%endif
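Fleshed out a bit, this could look as follows. This is a hedged sketch only: the tar invocations assume the sources really are gzipped tarballs (as the question suggests), and "thebinary" and the install path are hypothetical placeholders for whatever the tarballs actually contain:

```spec
%prep
# no %setup; unpack the right source ourselves
%ifarch i686
tar -xzf %{SOURCE0}
%endif
%ifarch x86_64
tar -xzf %{SOURCE1}
%endif

%install
mkdir -p %{buildroot}%{_bindir}
install -m 0755 thebinary %{buildroot}%{_bindir}/thebinary
```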

solaris tar for files > 8G

I made a 19 GB archive in Solaris 10 with the tar E option. But now neither tar tvf nor tar xvf works on the tarball! How can I extract the files?
Have you tried GNU tar (gtar)? There is a Solaris SFW package for it (SUNWgtar), or try SunFreeware.
From the tar(1) man page:
See largefile(5) for the description of the behavior of tar when encountering files greater than or equal to 2 Gbyte (2^31 bytes).
On my Solaris 10 system largefile(5) says that tar is largefile-aware.
Perhaps truss can help:
truss -a -f -o /tmp/truss.out tar xf foo.tar
(please post truss.out if it's not too long, or perhaps just the tail of it otherwise).
EDIT: I just stumbled across patches 138621-02/138622-02, "SunOS 5.10: tar patch" from June 2010. In particular, it fixes bug "6578528 /usr/bin/tar dumps core when extracting large files". (This is not a Recommended or Security patch, so it could have been missed.)