How to recover files deleted with the "rm -R" command on a Linux server? [closed]

I have unfortunately deleted some important files and folders on a Linux server using the 'rm -R' command.
Is there any way to recover them?

Since the existing answers are disappointing, I would like to suggest the way I got my deleted files back.
I use an IDE to code, and I accidentally ran rm -rf from the terminal to remove a complete folder. Thanks to the IDE, I recovered it by reverting the change from the IDE's local history.
(My IDE is IntelliJ, but most IDEs keep this kind of local history.)

Short answer: You can't. rm removes files blindly, with no concept of 'trash'.
Some Unix and Linux systems try to limit its destructive ability by aliasing it to rm -i by default, but not all do.
Long answer: Depending on your filesystem, disk activity, and how long ago the deletion occurred, you may be able to recover some or all of what you deleted. If you're using an ext3- or ext4-formatted drive, you can check out extundelete.
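If you go the extundelete route, a session might look roughly like the sketch below; the device name /dev/sda1 and the file path are placeholders, and the partition should be unmounted (or mounted read-only) before you scan it.
# Stop writing to the affected partition first
umount /dev/sda1
# Recover everything extundelete can find into a RECOVERED_FILES directory
extundelete /dev/sda1 --restore-all
# Or target a single file, with the path given relative to the filesystem root
extundelete /dev/sda1 --restore-file home/user/important.txt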
In the future, use rm with caution. Either create a del alias that provides interactivity, or use a file manager.
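For example, either of these lines in ~/.bashrc gives you an interactive safety net (the name del is just a suggestion):
# Ask for confirmation before every removal
alias rm='rm -i'
# Or leave rm untouched and train yourself to use a safer name instead
alias del='rm -i'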

Not possible with standard Unix commands. You might have luck with a file recovery utility. Also be aware that rm only updates the filesystem's bookkeeping to mark those blocks as available for reuse, so simply continuing to use your computer right now risks those blocks being overwritten permanently. If it's critical data, you should shut the machine down before the file's sectors get overwritten. Good luck!
Some restore utilities:
http://www.ubuntugeek.com/recover-deleted-files-with-foremostscalpel-in-ubuntu.html
Forum where this was previously answered:
http://webcache.googleusercontent.com/search?q=cache:m4hiPw-_GekJ:ubuntuforums.org/archive/index.php/t-1134955.html+&cd=1&hl=en&ct=clnk&gl=us
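As a rough illustration of the carving tools described in the first link (the device, file types, and output directory are placeholders):
# Carve recoverable files of the listed types from the raw device into ./recovered
foremost -t jpg,pdf,doc -i /dev/sda1 -o recovered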

Related

Preventing brew cleanup from deleting a specific old version of software [closed]

I am a massive fan of Homebrew and have taken to using it to manage all my applications. One very useful feature is brew switch, which enables switching between different installed versions of Ansible - something I require to compile some of my websites running older software.
However, I have noticed that whenever I run brew cleanup, it deletes all old versions, including version 2.3.2.0 of Ansible, which I still require alongside the most current version.
After sifting through numerous forums and sites, I have been unable to find a solution that lets me keep both this old version of Ansible and the most current one when using the brew cleanup command, other than cleaning everything up manually.
Does anyone have a workaround or solution? I thought brew pin might be a possibility, but it seems to only work with the version that is currently linked.
I don't see a clean built-in way to do this with brew cleanup, but here is a workaround: since brew cleanup optionally takes a list of formulae to clean up, we can build such a list that contains everything but Ansible.
This is how I can get that list:
brew list | grep -v ansible
And this is how I can call cleanup to ignore Ansible:
brew cleanup $(brew list | grep -v ansible)
Maybe I want that as an alias somewhere, like bca for "brew cleanup (but not) ansible":
alias bca='brew cleanup $(brew list | grep -v ansible)'
and add that line to my ~/.bashrc.
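If you later need to spare a different formula, the same idea generalizes to a small shell function (the name brew-cleanup-except is just a suggestion):
# Clean up everything except the formula named in the first argument
brew-cleanup-except() {
  brew cleanup $(brew list | grep -v -e "$1")
}
# Usage: brew-cleanup-except ansible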

Windows 7 64-bit / Safe Mode: Renaming msi.dll not possible [closed]

Due to problems with Windows Installer 5, I must rename msi.dll for a reinstallation. That is not possible in Safe Mode or as administrator (access denied). The service is stopped, of course. Any hints?
Edit: My problem is that I can't install MSI files anymore. Every time, after some dialogs, I get an error message that the corresponding MSI file can't be read. I have tried everything I found on the web and have already lost a lot of time. For example, I replaced the registry settings and ran sfc /scannow, without success.
In Windows 7 there is no dllcache, so I really don't know what is preventing the rename.
My problem is that I can't install MSI files anymore. Every time, after some dialogs,
I get an error message that the corresponding MSI file can't be read.
This sounds a little strange. If you see MSI dialogs and the install then fails, there must be something else wrong. I assume you have, but have you verified that the problem exists with multiple MSI files? Try a fresh MSI file, preferably one you have just downloaded from the Internet. Try running it from the local disk and from a network share.
Have you enabled logging for the install? Try to do so with flush to log enabled (the ! character enables continuous flushing to log so that an msiexec.exe crash doesn't leave an empty log file):
msiexec.exe /i C:\Path\Your.msi /L*vx! C:\Your.log
See msifaq.com for more details (logging FAQ entry). Search for "value 3" in the log file to find errors, as explained by Rob Mensching (the WiX author).
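For instance, to pull those "value 3" lines out of a large log (using the log path from the command above):
findstr /i /c:"value 3" C:\Your.log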
Also try disabling any antivirus or desktop security software that may be interfering with the file extraction from the MSI's cab file. Is there plenty of disk space? Are there any errors found during a disk scan?
Are you trying to revert to a previous version of Windows Installer? Here is some information: http://support.microsoft.com/?kbid=315346.
I suppose you could use system restore as well, but that would have other side effects.
What is the overall problem? Windows Installer 5 does not seem to introduce anything very controversial: http://msdn.microsoft.com/en-us/library/windows/desktop/dd408114(v=vs.85).aspx

Version controlling .bashrc, etc. [closed]

I want to version control all the configuration files I have in my home directory on Linux machines. Files like
.bashrc
.bash_aliases
.bash_functions
.emacs
.gitconfig
.profile
Then I could just clone the repo into my home directory on any computer I have to work on, and keep the nifty Emacs macros or bash functions I create up to date on all my servers. GitHub has a lot of features that make it an attractive solution for this, but I can't clone a repo into an existing directory, which is a problem.
What is a good way to manage these files across all the computers I use?
The approach I use, and that a lot of others use, is to keep a dotfiles folder. In it you keep your .bashrc, .vimrc, etc., and you make that folder a repository. Clone the folder to all your machines, and soft link to the files using the ln command.
cd ~
mkdir dotfiles
mv .bashrc dotfiles/
#move other files
ln -s dotfiles/.bashrc .bashrc
#link other files
#do the git stuff
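The "git stuff" could look roughly like this (the remote URL and branch name are placeholders):
cd ~/dotfiles
git init
git add .
git commit -m "Initial dotfiles"
git remote add origin git@github.com:yourname/dotfiles.git
git push -u origin master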

Restore PostgreSQL from files [closed]

I have a big problem - I managed to accidentally uninstall the whole PostgreSQL DBMS from my hard drive. I also lost my database and hadn't made any dumps of the data it contained. I do, however, have a backup of all files from the server. Is it possible to somehow restore the database from these files?
The OS I am using is Debian 6, and the DBMS version is PostgreSQL 8.4.
If it is indeed possible, how should I go about it?
P.S. Sorry for my English.
Make sure your backup is safe. As long as you have that, you can start again.
Restore the PostgreSQL server software (check the exact package names):
apt-get install postgresql-8.4 postgresql-client-8.4 postgresql-contrib-8.4
Stop the server
/etc/init.d/postgresql stop
Restore all your data files. Make sure the ownership is correct:
cd /var/lib/postgresql/8.4/
mv main main.OLD
cp -a /path/to/backup/main .
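# Make sure ownership is correct (assumes Debian's standard postgres:postgres user and group)
chown -R postgres:postgres main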
/etc/init.d/postgresql start
Check the logs (/var/log/postgresql/...) - if your backup occurred while the database was idle you are probably in luck.
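For example (the exact log file name is an assumption based on Debian's default naming):
# Show the most recent server messages
tail -n 100 /var/log/postgresql/postgresql-8.4-main.log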
Note that you need everything in .../main/ - the database files are in main/base, but the transaction logs and other assorted bits and pieces are needed too.
If you run into problems, check your permissions and check your postgresql.conf (restore that from backup too if you have it, and pg_hba.conf etc. as well). There might also be some other packages you need to install if you were using pl/perl or some such earlier.
Now, if you get errors complaining about missing log files or bad blocks, that means the backup happened while the database was writing to the disk and there may be corruption. However, let's be optimistic and hope for the best.
If it works, check that everything looks OK and take a pg_dump of any databases you want straight away.
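For example, a minimal dump on Debian's default setup (the database name mydb is a placeholder):
# Dump a single database to a plain-SQL file, running as the postgres system user
sudo -u postgres pg_dump mydb > mydb.sql
# Or dump every database in the cluster at once
sudo -u postgres pg_dumpall > all_databases.sql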

In Ubuntu, is it a good idea to launch Eclipse with sudo? [closed]

I recently used Maven to download Mahout and Hadoop. Because I could not seem to do that without using sudo mvn commands, Eclipse could not use anything I had downloaded (there were lots of errors, e.g. the parent directories of things like pom.xml being permission denied). More recently I was trying out Mahout (with local jars downloaded directly from one of Apache's mirrors, not from Maven), and although I could run the class the first time, I couldn't do it again because my Eclipse instance could not overwrite the file I had already written.
These are just examples of times when I feel it would have been good to be running Eclipse as the superuser by doing
sudo eclipse
instead of just launching it normally. The only problem I can think of is that, as root, Eclipse suggests you use root's workspace, but is it OK to just tell it to use yourusername/workspace?
In general, no. It's tempting, but it is not good practice to do all of your development as the superuser. If you're running Eclipse as root, then you're also launching Java processes as root when you run your software. (You could change your Java run settings to sudo back to a regular user before running, but I wouldn't recommend that as a solution.)
In addition to being a security risk, you are also making it difficult to track down bugs if you want to distribute the software to others to run as non-root (e.g. doing root-only things like reading a protected file or using a well-known port might work for you, but not for the average user).
I recommend finding the files that are causing issues and doing chmod o+r on them.
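As a rough sketch of that last suggestion (the ~/.m2/repository path is an assumption based on the Maven scenario above, and chown is offered as an alternative to the chmod approach):
# Make root-owned files under the local Maven repository readable to everyone
sudo find ~/.m2/repository -user root -exec chmod o+r {} +
# Or simply hand ownership back to your own user
sudo chown -R "$USER":"$USER" ~/.m2/repository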