Installing library on dev server without touching anything - server

I want to install the wkhtmltopdf library inside the /home/dev directory, and I can't touch anything outside of this directory, because it's not my server.
The file has a .deb extension. I have run, in /home/dev:
$ wget "http://file-to-install.com/"
$ dpkg -x my_file.deb .
So the file exists. Now I want to run:
$ dpkg -i my_file.deb
This will install it, but my question is: does it install the library only inside this dev directory, without touching anything else?

You should refer to How to extract RPM or DEB packages, which is linked to from the FAQ on the downloads page:
ar p wkhtmltox.deb data.tar.xz | tar Jx
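If the goal is to keep everything under /home/dev, here is a minimal sketch of extracting the payload into a local prefix rather than installing it (the destination directory and the binary's location inside the package are assumptions; dpkg -i, by contrast, needs root and writes to system-wide paths):
# Extract the .deb payload into a local directory instead of installing it
mkdir -p /home/dev/wkhtmltox
ar p wkhtmltox.deb data.tar.xz | tar xJf - -C /home/dev/wkhtmltox
# Add the extracted bin directory to PATH (the exact path depends on the package layout)
export PATH="$PATH:/home/dev/wkhtmltox/usr/local/bin"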


Steps to install perl version 5.26.3 in Solaris 11 Host

I would like to install perl version 5.26.3 on a host running Solaris 11, using a service account. The installation has to be done in the application file system or in a user directory.
Could someone please share the steps to install it?
Please note that perl 5.26 is already installed at the OS level, but we want our own perl installation in the application folder rather than using the OS-level perl interpreter.
Thanks.
Here is an example of installing from source using the defaults. If you want to modify the defaults, have a look at the INSTALL document.
$ wget https://www.cpan.org/src/5.0/perl-5.26.3.tar.bz2
$ bunzip2 perl-5.26.3.tar.bz2
$ tar xvf perl-5.26.3.tar
$ cd perl-5.26.3
$ sh Configure -de -Dprefix='/some/dir' # Where to install
$ make
$ make test
$ make install
Then edit the PATH environment variable to include /some/dir/bin such that the shell can find the new perl.
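For example, in a bash-compatible shell (using the /some/dir placeholder from above):
$ export PATH=/some/dir/bin:$PATH
$ perl -v    # should now report v5.26.3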

How to install file in include directory yocto

I'm trying to install files extracted from a tar file, but none of my files get installed under the usr/include directory on the target board. I do see my files under temp/work/aarch64/recipedir/image/usr/include/mydir/ and include/myfile.h. While building I don't get any errors.
do_install() {
install -d ${D}${includedir}
mkdir -p ${D}${includedir}/mydir
install -m 0644 ${S}/include/myfile.h ${D}${includedir}
install -m 0644 ${S}/include/mydir/*.h ${D}${includedir}/mydir/
}
FILES_${PN} += "${includedir}/mydir"
Everything in ${includedir} is put into ${PN}-dev by default.
c.f.: https://git.yoctoproject.org/cgit/cgit.cgi/poky/tree/meta/conf/bitbake.conf#n316
You have to remember that a file can only be in one package. Determining which package a file ends up in is pretty simple: starting from the leftmost package in PACKAGES, the first package whose FILES_<pkg> contains a path matching the file gets it.
By default, ${PN}-dev appears before ${PN} in PACKAGES.
c.f.: http://git.yoctoproject.org/cgit/cgit.cgi/poky/tree/meta/conf/bitbake.conf#n294
You can check which package has your file without "reverse-engineering" the whole thing by running oe-pkgdata-util find-path '/usr/include/mydir'.
If you really want this header file on your target system (why?), you can either add ${PN}-dev to your image or hack things (remove the -dev package from PACKAGES, move ${PN} before ${PN}-dev if you only have one file in ${includedir}, etc.).
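For example (myrecipe is a hypothetical recipe name; the header path is the one from the question):
# On the build host, check which package actually contains the header:
oe-pkgdata-util find-path /usr/include/mydir/myfile.h
# If you want the header on the target, pull the -dev package into the image,
# e.g. in local.conf or in your image recipe:
IMAGE_INSTALL_append = " myrecipe-dev"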

How to download packages from pypi using Wget?

Where should I download virtualenv from in order to use it locally from source with wget?
I'm having trouble downloading virtualenv from the command line.
Info: if you search for virtualenv you will find the site for the stable version, and its installation guide is the same as the latest version's installation guide.
To install it, the guide describes this:
To install version X.X globally from source:
$ curl -O https://pypi.python.org/packages/source/v/virtualenv/virtualenv-X.X.tar.gz
$ tar xvfz virtualenv-X.X.tar.gz
$ cd virtualenv-X.X
$ [sudo] python setup.py install
To use locally from source:
$ curl -O https://pypi.python.org/packages/source/v/virtualenv/virtualenv-X.X.tar.gz
$ tar xvfz virtualenv-X.X.tar.gz
$ cd virtualenv-X.X
$ python virtualenv.py myVE
I'm using wget instead of curl, but that shouldn't be a problem at all. Am I wrong?
The only place where I can download it (and not from the command line) is from here.
I'm typing the url correctly.
Different virtualenv versions at that URL return the same Not Found error:
HTTP request sent, awaiting response... 404 Not Found
2017-07-21 17:53:09 ERROR 404: Not Found.
Please note that I have already downloaded the tar.gz, so I don't need it right now, but I'm not sure whether this is a broken-link issue or I've forgotten something in the download command. I won't open an issue against virtualenv just because of a broken link, but I need to know why this is not working.
EDIT: I can't download it using wget from ..python..packages/source/v/virtualenv etc.
TARGET="https://pypi.python.org/simple/virtualenv/"
PATTERN="virtualenv-15.1.0.tar.gz"
wget --recursive --no-directories --accept=$PATTERN $TARGET
Yes, curl and wget are equivalent for what you're trying.
No, you can download it from anywhere, e.g. from PyPI.
There is no functional difference between pip install virtualenv and what you're trying to do, but the former is simpler and less error-prone, so why bother with the manual labour?
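For comparison, the pip route (assuming pip is available; pip download needs a reasonably recent pip):
$ pip install virtualenv
or, to only fetch the sdist from PyPI into the current directory without installing:
$ pip download virtualenv -d .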

Installing LuaMongo on Ubuntu 11.10

I have researched and viewed the post on installing luamongo: http://groups.google.com/group/luamongo/browse_thread/thread/1eaa56974614dc90/c91c842e241aa4de#c91c842e241aa4de
But the installation will not work. I already have mongodb-10gen version 2.0.3 and lua5.1 version 5.1.4.10 installed.
How do I download luamongo from https://github.com/moai/luamongo, install it, and get it working as an import statement in a Lua script so I can write to a Mongo DB? Any suggestions would be helpful; nothing I have tried or read so far has helped. If more information is needed I will post it. Thanks in advance.
I got this script from a friend of mine which should be helpful:
# Download mongodb and driver
wget http://downloads.mongodb.org/cxx-driver/mongodb-linux-x86_64-v2.0-latest.tgz
wget http://fastdl.mongodb.org/linux/mongodb-linux-x86_64-2.0.2.tgz
# Extract each
tar xvzf mongodb-linux-x86_64-2.0.2.tgz
tar xvzf mongodb-linux-x86_64-v2.0-latest.tgz
# Add mongo bin to PATH
export PATH=$PATH:~/mongodb-linux-x86_64-2.0.2/bin
# Grab dev tools and dependencies (May need to run apt-get update to download all)
sudo apt-get -y install tcsh scons libpcre++-dev libboost-dev libreadline-dev libboost-program-options-dev libboost-thread-dev libboost-filesystem-dev libboost-date-time-dev gcc g++ git lua5.1-dev make
# Grab latest luamongo (will need to add your github ssh key)
git clone git@github.com:moai/luamongo
# Compile mongo driver
cd mongo-cxx-driver-v2.0
sudo scons install
# Install where lua can load it
sudo cp libmongoclient.* /usr/lib
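The script stops after installing the C++ driver. A rough sketch of the remaining steps, based on the luamongo README (the module name mongo.so and the Lua 5.1 cpath directory are assumptions, so check the README of the fork you cloned):
# Build luamongo itself
cd ../luamongo
make
# Copy the resulting module somewhere on Lua's package.cpath so require("mongo") finds it
sudo cp mongo.so /usr/lib/lua/5.1/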

MongoDB SpiderMonkey doesn't understand UTF-8

If I add non-ASCII characters to a MongoDB database, then every db.find() fails with "non ascii character detected".
It's a problem with SpiderMonkey; I have to rebuild it with UTF-8 support.
I've tried to do it like in
http://www.mongodb.org/display/DOCS/Building+Spider+Monkey
but it doesn't work (SpiderMonkey is not installed after I've completed all steps).
I've got Ubuntu 11.04. Does anybody have instructions on how to make it work there?
Working instructions on how to make MongoDB work with Google V8 would also help.
I'm using MongoDB on Ubuntu Server 11.04; I installed it on a fresh OS install using these instructions: http://www.mongodb.org/display/DOCS/Ubuntu+and+Debian+packages
Everything is working fine out of the box. Is it critical for you to build MongoDB from scratch?
Using the 10gen-published packages works fine, but if you actually want to compile SpiderMonkey from source with UTF-8 support:
curl -O ftp://ftp.mozilla.org/pub/mozilla.org/js/js185-1.0.0.tar.gz
tar xvzf js185-1.0.0.tar.gz
cd js-1.8.5/js/src
export CFLAGS="-DJS_C_STRINGS_ARE_UTF8"
export CXXFLAGS="-DJS_C_STRINGS_ARE_UTF8"
And then follow the instructions from https://developer.mozilla.org/En/SpiderMonkey/Build_Documentation
autoconf-2.13
./configure
make
make install
cp js /usr/local/bin/
This will install into /usr/local/lib; however, the mongodb package looks for the library in /usr/lib (where the spidermonkey package installs it). So we symlink the files installed in /usr/local/lib from /usr/lib:
ln -s /usr/local/lib/libmozjs185.so /usr/lib/libmozjs185.so
ln -s /usr/local/lib/libmozjs185.so.1.0 /usr/lib/libmozjs185.so.1.0
ln -s /usr/local/lib/libmozjs185.so.1.0.0 /usr/lib/libmozjs185.so.1.0.0
ln -s /usr/local/lib/libmozjs185-1.0.a /usr/lib/libmozjs185-1.0.a
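After creating the symlinks it's usually worth refreshing the dynamic linker cache so the new libraries are picked up (a standard step, not part of the original instructions):
sudo ldconfig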
Of course you could just move them into /usr/lib instead of symlinking, but I wanted to keep the UTF-enabled libs away from the default location, to prevent conflicts with the default spidermonkey package. Without the libmozjs package installed, apt complains that dependencies for mongodb are not satisfied, so I've left it installed.
Keep in mind that if the spidermonkey package gets upgraded, it can overwrite the symlinks to our new libs (or the libs themselves, if you've moved them into /usr/lib). The ideal solution would be to build your own package to solve the dependency issues for good.