Installing and updating packages with wget on Arch Linux takes a long time - wget

I don't know why, but when I download, install, or update packages, the download speed is very slow and it takes a very long time.
When I'm working in Windows my download speed is 1M, but in Arch it varies between 50k and 100k.
I'm using wget in pacman.conf.
I used reflector, but nothing changed and my package download speed is still too slow.
Can I create a list of the packages pacman wants to update, download them from Windows, and then install them all in Arch?

Please consider reading the documentation about pacman performance here.
You can manually force pacman to refresh the package databases with:
pacman -Syyu
To answer your question about downloading packages outside of Arch: the first part below uses laverna, a package I maintain on the AUR, and the second part uses a package from the official repositories.
AUR
Download
We have 2 solutions:
Use git
Download snapshot
Using Git
Clone the repository into the current folder with:
git clone https://aur.archlinux.org/laverna.git .
Using Snapshot
Just wget the snapshot or download it directly with a browser.
wget https://aur.archlinux.org/cgit/aur.git/snapshot/laverna.tar.gz
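If you fetched the snapshot tarball, extract it first; a small sketch (the folder name is assumed to match the package name):
tar xzf laverna.tar.gz
cd laverna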
Official Repositories
You can easily get a download link from the package's web page.
For example with sqlite-doc the web page link is here and the download link (from a mirror) is here.
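If what you actually want is to fetch a whole system upgrade on another machine, one possible approach (a sketch; the file names are just examples) is to let pacman print the download URLs, download them on Windows, copy the packages back, and install them with pacman -U:
sudo pacman -Sy
pacman -Sup | grep '^http' > pkg-urls.txt
# download the listed URLs on the other machine, copy the files back, then:
sudo pacman -U /path/to/the/downloaded/*.pkg.tar.*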
Back to Arch
Back on Arch, we have to build and install the packages. To do so, navigate to the folder where you cloned/downloaded the package (cf. cd), then run the following to build it.
(Only if packageName-version.pkg.tar.xz doesn't exist)
makepkg
After makepkg finishes, you get a package file in the current directory, which in our AUR example is laverna-0.7.4-RC1-linux-x64.pkg.tar.xz.
To install the packageName-version.pkg.tar.xz file we just have to run the following and answer the questions as usual.
sudo pacman -U laverna-0.7.4-RC1-linux-x64.pkg.tar.xz

Check your network settings (IP, DNS, routes, ...); sometimes this will solve the download speed problem.
Find the fastest mirror.
$ cd /etc/pacman.d/
$ cp mirrorlist mirrorlist.bak
$ rankmirrors -n 6 mirrorlist.bak > mirrorlist
Check the mirror status at https://www.archlinux.org/mirrors/status/.
You can read more on the ArchWiki Mirrors page.
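Since you already use reflector, here is a sketch of regenerating the mirror list with it as an alternative to rankmirrors (the option values are just examples):
$ sudo reflector --latest 20 --protocol https --sort rate --save /etc/pacman.d/mirrorlist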
Hope this helps.

Related

Conda: How to install latest version of `pandoc-crossref` from Github in `conda` environment?

pandoc-crossref must match the pandoc version, and only the 0.3.10.0 release works on macOS Big Sur. Thus, it is not possible to get pandoc and pandoc-crossref running in a conda environment from the official channel or from conda-forge.
I could easily download the matching binaries from https://github.com/lierdakil/pandoc-crossref/releases/tag/v0.3.10.0 and copy them e.g. to the binpath:
$ which pandoc-crossref
/usr/local/bin/pandoc-crossref
$ curl -OL https://github.com/lierdakil/pandoc-crossref/releases/download/v0.3.10.0/pandoc-crossref-macOS.tar.xz
$ tar -xzvf pandoc-crossref-macOS.tar.xz
$ mv pandoc-crossref /usr/local/bin/pandoc-crossref
But I think that is not a clean approach, because conda will not know that I updated the version for pandoc-crossref.
What is a clean approach for updating a package managed by conda from a binary available on Github?
Update Feedstock
I updated it on the Conda Forge feedstock, which is what I regard as the "cleanest" solution.
How does one do that? First, OP had posted a comment on the feedstock in the PR that they wanted merged. This was the appropriate first step and hopefully in future cases that should be sufficient to prompt maintainers to act. In this case, it was not sufficient. So, as a follow up, I chatted on the Conda Forge Gitter to point out that the feedstock had gone stale and had non-responding maintainer(s). One of the core Conda Forge members suggested I make a PR bumping the version and adding myself as maintainer, and they merged it for me. In all, this took about 10 mins of work and ~2 hours from start to having an updated package on Anaconda Cloud.
Custom Conda Build
Otherwise, there isn't really a clean solution for non-Python packages outside of building a Conda package. That is, clone the feedstock or write a new recipe, modify it to build from the GitHub reference, then install that build into your environment. It may also be worth uploading to an Anaconda Cloud user account, so there is some non-local reference for it.
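A rough sketch of that route (the feedstock URL and recipe layout are assumptions based on the usual conda-forge conventions; conda-build must be installed):
conda install conda-build
git clone https://github.com/conda-forge/pandoc-crossref-feedstock.git
cd pandoc-crossref-feedstock
# edit recipe/meta.yaml to point at the desired GitHub tag/version, then:
conda build recipe/
conda install --use-local pandoc-crossref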
Pip Install (Python Packages Only)
In the special case that it is a Python package, one could dump the environment to YAML, edit to install the package through pip, then recreate the environment.
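A minimal sketch of that workflow (the environment name is a placeholder):
conda env export -n myenv > environment.yml
# edit environment.yml: move the package under a pip: subsection with the desired version
conda env remove -n myenv
conda env create -n myenv -f environment.yml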

Install a package to a docker container (managed by dokku)

I have a hard time understanding where the right place is to put the code that installs the needed packages for a given Docker container managed by dokku.
We have a Scala application and, unfortunately, we need one shell call that depends on the environment. I would like to install the given package for the given container using "apt-get install". Right now I am using a custom plugin with a file named "post-release-build". However, I don't have permission to install anything in that phase.
Basically, the script that should be invoked looks like this (based on a Dockerfile that is available online):
apt-get update
apt-get install -y build-essential xorg libssl-dev libxrender-dev wget gdebi
wget http://download.gna.org/wkhtmltopdf/0.12/0.12.2.1/wkhtmltox-0.12.2.1_linux-trusty-amd64.deb
gdebi -n wkhtmltox-0.12.2.1_linux-trusty-amd64.deb
echo "-----> wkhtmltox installed!"
Is there a way to make this work? I would also prefer to have such a file somewhere in the application so I don't need to set up the environment before pushing the app (in the future).
EDIT:
I have found a plugin that should be capable of installing packages using apt-get (https://github.com/F4-Group/dokku-apt); however, I am a bit unlucky because it pulls in a package that does not work properly.
Since just downloading with apt-get will fetch a package that fails, I investigated dokku more deeply and came up with a new plugin that should install the package for you.
I have created a script, documented how to use it, and licensed it under the MIT license, so feel free to use it. Hopefully it will save you the time I had to spend figuring out what is going on.
URL: https://github.com/mbriskar/dokku-wkhtmltopdf

Download RPMs for all dependencies for package using yum

I'm attempting to create a local yum repo on my system containing various packages from, chiefly, the CentOS base repos. The server which is hosting the yum repo will not necessarily have the same base packages installed by default as the servers which will be using the yum repo. For this reason, I need to ensure that my repos contain the packages that I want and every single one of their dependencies.
I'm creating my repos using the yumdownloader tool provided in the yum-utils package to try to download an RPM file for a package using yum from the standard CentOS mirrors. Helpfully it provides a command line option, --resolve, which also downloads dependencies. However, because it's built on yum itself, yumdownloader will only download dependencies for the package that are not already present on the system.
For example, I wish to download package A, which depends on Packages B, C and D. If package D is already installed on the system, yumdownloader --resolve A will only download A, B and C, but not D.
Is there a way to download the RPMs for all dependencies on a package from a yum repo?
There's this bash script, which the maintainer of rpm kindly shared with me and which I put on GitHub. Hope you find it useful!
You can also read the original SO question, where the issue was discussed.
The script works on Fedora 23+ as it uses dnf's download plugin. It's probably very easy to make it work on Fedora 22 and earlier, as yum surely has a similar plugin.
Additionally, it's valuable since repotrack does not work on Fedora 23 (at least it doesn't work for me).
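For reference, a minimal sketch of using dnf's download plugin directly on Fedora 23+ (assumes a dnf-plugins-core version that supports --alldeps; packageA is a placeholder):
sudo dnf install dnf-plugins-core
dnf download --resolve --alldeps packageA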
After a lot of frustration looking for a solution, I wrote a simple script that uses repotrack and wget. I found that yumdownloader (even with the --resolve flag) does not resolve all dependencies.
If you have a long list of packages you are bound to run into duplicates; downloading just the URLs first with the repotrack -u flag and then keeping only unique records avoids downloading the same RPM multiple times.
#!/bin/bash
# Collect the download URLs for each package and all of its dependencies
while read -r i; do
repotrack -u "$i" >> dep_rpm_urls_02.txt
done < list_of_packages_01.txt
# Keep only unique URLs so each RPM is downloaded once
awk '!seen[$0]++' dep_rpm_urls_02.txt > dep_rpm_urls_clean_03.txt
# Download every unique RPM
while read -r j; do
wget "$j"
echo "downloaded $j"
done < dep_rpm_urls_clean_03.txt
happy rpming

How to build gstreamer ugly plugins from source

I would like to change some code in one element X in the gstreamer ugly plugins, rebuild it, and use it.
How can I do that?
I have gstreamer-0.10 and the gstreamer-ugly plugins installed.
I would like to download only the gstreamer0.10 ugly plugins code, change it, and then use the new lib file. How can I do that?
Unfortunately, gstreamer-ugly depends on a lot of stuff, at least libgstreamer and plugins-base (if you're using Linux and your distro provides *-dev packages, as Debian/Ubuntu do).
If you're on Debian you could use dpkg-buildpackage after checking out the source using apt-get source. The big advantage here is that all the build dependencies can be easily installed.
The manual way will probably require you to first build all the other gstreamer packages; have a close look at what ./configure tells you.
I'm working on Debian and have already built gstreamer+plugins to backport the recent ones to Ubuntu (although I'm not sure I did it in a best-practice way ;) ).
/edit: I'll try to cover the basic steps for Ubuntu here:
add the source repositories to apt (check the "source code" checkbox in the Ubuntu Software Center's "Software Sources" tool)
sudo apt-get install dpkg-dev devscripts
sudo apt-get build-dep gst-plugins-ugly0.10
apt-get source gst-plugins-ugly0.10
change to the newly created gst-plugins-ugly* folder
dpkg-buildpackage (and make sure it works)
change the source to your needs
you can rebuild it any time using dpkg-buildpackage (to simply check that it compiles, make might be faster, though). This creates a .deb file in the parent folder that you can install using dpkg -i; see the consolidated sketch below.
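Putting those steps together, a rough sketch (package and folder names are assumptions for the 0.10 series on Ubuntu; the resulting .deb names may differ):
sudo apt-get install dpkg-dev devscripts
sudo apt-get build-dep gst-plugins-ugly0.10
apt-get source gst-plugins-ugly0.10
cd gst-plugins-ugly0.10-*/
# edit the element's source, then rebuild without signing the package
dpkg-buildpackage -us -uc
# install the rebuilt binary package(s) from the parent folder
sudo dpkg -i ../gstreamer0.10-plugins-ugly_*.deb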
If it's a useful change you might want to get in touch with the gstreamer-devs ;)
On a Debian system, run apt-get build-dep gstreamer0.10-plugins-ugly to get all the build dependencies for that package. After that you can build the package from git, from a source tarball, or even rebuild the Debian package (using dpkg-buildpackage).

Using downloaded wget

I downloaded the source code of wget using apt-get source wget. I want to modify it a little, then use this wget rather than the one in /usr/bin/wget. How can I do that?
apt-get source wget retrieves your distribution's source code for wget.
You may want to work on the genuine upstream wget source instead, which you can get (with wget itself or a browser) by following the links from http://www.gnu.org/software/wget/
Then you configure, build, and install, usually with ./configure; make; sudo make install, but the details may vary from package to package. You should look at the files named README and INSTALL.
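For example, a sketch that installs your modified wget under your home directory so it does not clash with the one in /usr/bin (the prefix is just an example):
cd wget-*/
./configure --prefix="$HOME/.local"
make
make install
# run it explicitly, or put $HOME/.local/bin early in your PATH
"$HOME/.local/bin/wget" --version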
You might also be interested in libcurl.
Note that the GPL license requires, more or less, that you publish your patch (in source form) if you redistribute a binary of your patched, improved wget.