I am a massive fan of Homebrew and have taken to using it to manage all my applications. One very useful feature is brew switch, which lets me switch between different versions of a formula such as Ansible - something I need in order to build some of my websites that run older software.
However, I have noticed that whenever I run brew cleanup, it deletes all old versions, including version 2.3.2.0 of Ansible, which I still require alongside the most current version.
After sifting through numerous forums and sites, I have been unable to find a way to keep both this old version of Ansible and the most current one when using the brew cleanup command, other than cleaning up everything manually.
Does anyone have a workaround or solution? I thought brew pin might be a possibility, but it seems to apply only to the currently linked version.
I don't see a clean built-in way to do this with brew cleanup, but here is a workaround: since brew cleanup optionally takes a list of formulae to clean up, we can build a list that contains everything but Ansible.
This is how I can get that list:
brew list | grep -v ansible
And this is how I can call cleanup to ignore Ansible:
brew cleanup $(brew list | grep -v ansible)
Maybe I want that as an alias somewhere, like bca for "brew cleanup (but not) ansible":
alias bca='brew cleanup $(brew list | grep -v ansible)'
and add that line to my ~/.bashrc.
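As a quick sanity check (a hedged sketch; brew list --versions shows the installed versions of a formula, and 2.3.2.0 is simply the version mentioned in the question):
brew list --versions ansible   # before: 2.3.2.0 and the current version
bca                            # the alias defined above
brew list --versions ansible   # after: both versions should still be listed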
I am still able to launch files with the code command, but if I try using sudo along with it I get sudo: code: command not found. It worked fine in the past; I'm not sure how long it's been broken for me. It was nice being able to edit .rc files in VS Code instead of nano, but I need root privileges to save those files.
I have tried uninstalling/reinstalling the WSL extensions in VS Code, adding export PATH="/usr/share/code/bin:$PATH" to my .zshrc, and adding new aliases per this guide.
sudo likely resets your environment, including PATH, for safety purposes (I believe this is the default on Ubuntu and possibly other distros). Even if you extend PATH to include VS Code in your .zshrc, it will be reset when you use sudo. To verify this, you can run sudo zsh and then type echo $PATH.
To keep the environment, you can either use the sudo -E switch:
-E, --preserve-env
Indicates to the security policy that the user wishes to preserve their
existing environment variables. The security policy may return an error
if the user does not have permission to preserve the environment.
or run visudo and add the following configuration to your sudoers file, which makes this the default behavior, limited to the PATH environment variable:
Defaults env_keep += "PATH"
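As a concrete example of both approaches (a hedged sketch; ~/.zshrc is just a sample file to edit, and some distributions also set secure_path in sudoers, in which case only the env form or the sudoers change above will actually affect PATH):
sudo -E code ~/.zshrc
# or pass the current PATH explicitly for a single invocation:
sudo env "PATH=$PATH" code ~/.zshrc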
After updating to macOS Big Sur 11.3 (20E232), I can no longer launch mytop from the terminal.
When launching mytop - which is installed via brew - I get this error:
> mytop
ListUtil.c: loadable library and perl binaries are mismatched (got handshake key 0xc500080, needed 0xc400080)
> which mytop
/usr/local/bin/mytop
> ls -la /usr/local/bin/mytop
lrwxr-xr-x 1 username admin 33 9 Dec 10:24 /usr/local/bin/mytop -> ../Cellar/mytop/1.9.1_8/bin/mytop
So far, to attempt a fix, I have run:
brew update
xcode-select --install (wait 5 hours)
brew upgrade
brew remove mytop; brew install mytop
That still hasn't resolved it.
I imagine this would affect a number of binaries. Has anyone seen something similar and/or found a fix?
Solution: brew reinstall -s mytop
Details from the GitHub conversation:
This was caused by Big Sur 11.3 switching the default perl to 5.30. It used to be 5.28, and that's the version that mytop expects to find at /usr/bin/perl. See Homebrew/brew#10127.
In the meantime, try brew reinstall -s mytop to rebuild mytop against the new version of the system perl.
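A hedged sketch of checking the mismatch and applying the fix (the version numbers come from the discussion above; mytop is simply run again to confirm the error is gone):
/usr/bin/perl -v          # reports the system perl, 5.30 after the 11.3 update
brew reinstall -s mytop   # -s (--build-from-source) rebuilds mytop against that perl
mytop                     # should now start without the ListUtil.c handshake error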
I'm using Eclipse with Perl (ActivePerl) on a PC without an Internet connection. It was quite tricky to add the EPIC Perl plugin to Eclipse, but it works fine.
Now I'd like to add PadWalker, which the debugger requires, to my Perl installation - but I need an offline installer.
I found some information at:
http://perlmaven.com/padwalker
It describes how to install PadWalker using CPAN (cpan PadWalker) or PPM (ppm install PadWalker), but only for an online installation.
Even the hint about setting the proxy environment variable (including username and password) doesn't help, as there isn't any Internet connection on this PC.
So where can I get an offline installer for PadWalker? Or where can I download an archive that I can put into a local repository defined within PPM (the Perl package manager)?
Here's a quick version.
Go to any facility that has an Internet connection, and search CPAN for PadWalker.
The latest version is v2.2 and is documented here.
On the right of that page is a link to the latest gzipped release, currently PadWalker-2.2.tar.gz.
Copy that file to your target system, then follow the directions in perldoc perlmodinstall, which are essentially:
Unzip the compressed file
Unpack the tar contents
cd to the unpacked directory, and do
perl Makefile.PL
make
make test
And, if the tests were successful
make install
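Put together, the sequence looks roughly like this (a sketch assuming the 2.2 tarball mentioned above and a make-style toolchain; with ActivePerl on Windows the make program may be dmake or gmake instead of make):
tar xzf PadWalker-2.2.tar.gz   # decompress and unpack in one step
cd PadWalker-2.2
perl Makefile.PL               # generate the Makefile
make
make test
make install                   # may require elevated privileges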
I have unfortunately deleted some important files and folders using the rm -R command on a Linux server.
Is there any way to recover them?
Since the existing answers are disappointing, I would like to suggest a way in which I got my deleted files back.
I code in an IDE, and I accidentally used rm -rf from the terminal to remove a complete folder. Thanks to the IDE, I recovered it by reverting the change from the IDE's local history.
(My IDE is IntelliJ, but most IDEs keep some form of local history backup.)
Short answer: You can't. rm removes files blindly, with no concept of 'trash'.
Some Unix and Linux systems try to limit its destructive ability by aliasing it to rm -i by default, but not all do.
Long answer: Depending on your filesystem, disk activity, and how long ago the deletion occurred, you may be able to recover some or all of what you deleted. If you're using an ext3- or ext4-formatted drive, you can check out extundelete.
In the future, use rm with caution. Either create a del alias that provides interactivity, or use a file manager.
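A minimal sketch of the interactive-alias idea (the alias names are just suggestions; rm -I is a GNU coreutils option that prompts once before removing more than three files or recursing):
alias del='rm -i'   # prompt before every single removal
alias rm='rm -I'    # prompt once for large or recursive removals
Adding either line to ~/.bashrc makes it the default for interactive shells.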
Not possible with standard Unix commands. You might have luck with a file recovery utility. Also be aware that rm only updates the filesystem's metadata to mark those blocks as available to be overwritten, so simply using your computer right now risks those blocks being overwritten permanently. If it's critical data, you should shut the machine down before the file's sectors get overwritten. Good luck!
A restore utility:
http://www.ubuntugeek.com/recover-deleted-files-with-foremostscalpel-in-ubuntu.html
Forum where this was previously answered:
http://webcache.googleusercontent.com/search?q=cache:m4hiPw-_GekJ:ubuntuforums.org/archive/index.php/t-1134955.html+&cd=1&hl=en&ct=clnk&gl=us
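For reference, a hedged sketch of how a carving tool like foremost from the link above is typically invoked (the device name and output directory are placeholders; ideally run this from a live system so the affected disk stays unmounted):
sudo foremost -t jpg,pdf,doc -i /dev/sda1 -o /mnt/usb/recovered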
I used Maven to download Mahout and Hadoop recently. Because I could not seem to do that without using sudo mvn commands, Eclipse could not use anything I had downloaded (there were lots of "permission denied" errors on things like the parent directories of pom.xml). More recently, I was trying out Mahout (with local jars downloaded directly from one of Apache's mirrors, not from Maven), and although I could run the class the first time, I couldn't do it again because my Eclipse instance could not overwrite the file it had already written.
These are just examples of times when I feel it would have been good to run Eclipse as the superuser by doing
sudo eclipse
instead of just launching it normally. The only problem I can think of is that, when run as root, Eclipse suggests using root's workspace - but is it OK to just tell it to use my own user's workspace?
In general, no. It's tempting, but it is not good practice to do all of your development as the superuser. If you're running Eclipse as root, then you're also launching Java processes as root when you run your software. (You could change your Java run settings to sudo back to a regular user before running, but I wouldn't recommend that as a solution.)
In addition to being a security risk, you are also making it difficult to track down bugs if you want to distribute the software to others to run as non-root (e.g. doing root-only things like reading a protected file or binding to a well-known port might work for you, but not for the average user).
I recommend finding the files that are causing issues and doing chmod o+r on them.
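For the Maven situation described in the question, the usual fix is to take back ownership of whatever sudo mvn created instead of running Eclipse as root (a hedged sketch; the paths are typical defaults, not taken from the question):
sudo chown -R "$USER" ~/.m2                  # local Maven repository written by sudo mvn
sudo chown -R "$USER" ~/workspace/myproject  # hypothetical project path; adjust to yours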