Restore deleted files on GCS, versioning off? - google-cloud-storage

We accidentally deleted some image files from our GCS bucket, which didn't have versioning activated (it was OFF).
Is there still a way to restore these deleted files?

Unfortunately, no.
GCS objects generally cannot be recovered after being permanently deleted by a user.

There is no way to restore a deleted object unless versioning was enabled on the bucket at the time of deletion.
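
To guard against this in the future, you can turn object versioning on so that deletes and overwrites keep a noncurrent version around. A minimal sketch with gsutil (the bucket name my-bucket is a placeholder):

    # my-bucket is a placeholder; substitute your bucket name
    # Check the current versioning state of the bucket
    gsutil versioning get gs://my-bucket

    # Enable versioning; later deletes/overwrites keep noncurrent versions
    gsutil versioning set on gs://my-bucket

    # List all versions of the objects, including noncurrent ones
    gsutil ls -a gs://my-bucket

With versioning on, a deleted object can be restored by copying its noncurrent generation back into place.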

Related

Is it possible to delete a file from GitHub and also erase all evidence of its existence

Is it possible to delete a file from github.com and also erase all evidence that it ever existed?
When I click on History, I can see all modifications of the GitHub project, including deletions of files and directories. Is it possible to commit a data file, then delete it, and then delete the evidence of its existence from the history?
Yes, it is possible.
However, note that Git is a versioning tool that was not actually designed for changing history and permanently removing files, so this has to be done with dedicated tools.
But be aware: the selected file will be removed from the entire Git history, so you can no longer access any previous version of it; it is gone from the .git folder unless it is tracked again at some point.
With git-filter-repo you can rewrite your Git history and remove a file from every commit it was involved in. Read its documentation for details.
Another alternative is the BFG Repo-Cleaner.
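
For illustration, here is a minimal git-filter-repo sketch (the file path path/to/secret.dat and the repository URL are placeholders):

    # Work on a fresh clone; git-filter-repo refuses to rewrite an already-used clone by default
    git clone https://github.com/user/repo.git
    cd repo

    # path/to/secret.dat is a placeholder path
    # Rewrite every commit so the file never appears in history
    git filter-repo --path path/to/secret.dat --invert-paths

    # filter-repo removes the origin remote as a safety measure; re-add it and force-push
    git remote add origin https://github.com/user/repo.git
    git push --force --all
    git push --force --tags

Anyone who has already cloned the repository still has the old history, so collaborators need to re-clone after the rewrite.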

Deleting files and folders from a helix server while keeping them on the pc

I am new to Perforce. I submitted my previous project to it as asked, and added the p4ignore later. But now I don't want the files from the p4ignore, like the .vscode folder, on the server; however, when I try to mark them for delete, it also deletes them from my machine. How can I remove them on the server but keep them on the local machines?
You probably want to use the p4 obliterate command; this is used to permanently remove files from the server (including all their history), which will leave your local files in an untracked state. Note that this requires admin level permission since file history is normally considered immutable.
If you can't get an admin to help with this, you can use the p4 delete -k command to open the files for delete while keeping the local files. This is a little tricky because it still results in a deleted revision, and if you're not careful you might end up getting surprised at some point by having a sync operation delete your local files (e.g. a force sync may delete your local files to force them into agreement with the head depot revision even though they aren't on the client have list).
To avoid that potential problem, after you delete the files, exclude them from your client view. That will not only prevent them from being added (similar to .p4ignore) but will also firmly exclude them from any operation that touches client files, including sync. (I usually recommend using the client view to exclude files in the first place instead of p4ignore -- it has the advantage of being tracked on the server, and it also prevents you from syncing down "ignored" files submitted by other workspaces whose settings don't match yours.)
tl;dr: use obliterate for mistakenly added files if you can, otherwise use a combination of delete -k and client view tuning to make sure the depot and client files are hidden from each other.
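
As a rough sketch of both approaches (the depot path //depot/project/.vscode/... and the client name my-client are placeholders):

    # //depot/project/.vscode/... is a placeholder depot path
    # Option 1 (requires admin): permanently erase the files and their history.
    # Without -y, obliterate only reports what it would delete.
    p4 obliterate -y //depot/project/.vscode/...

    # Option 2: open the files for delete while keeping the workspace copies (-k)
    p4 delete -k //depot/project/.vscode/...
    p4 submit -d "Remove .vscode from the depot"

    # Then exclude the path in the client view (edit with p4 client), e.g.:
    #   //depot/project/...          //my-client/project/...
    #   -//depot/project/.vscode/... //my-client/project/.vscode/...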

Restore corrupted TFS files from $tf\ .gz files?

My hard disk had errors and I lost a number of files before they could be checked in. But I'm wondering if there's a way to look in the $tf folder's .gz files to restore the files that were lost and not checked in? Any chance a diff version was saved in there that I can restore from?
I am afraid you cannot restore the files from those .gz files.
The .gz files are generated when mapping sources; they keep a hash and some additional information on every file in the workspace so that TFS can do change tracking for local workspaces and quickly detect changes in the files. They do not contain the source files themselves. If you did not check the source files into source control, TFS cannot restore them.
To restore those files, you need to find a way to recover the hard disk data.
Hope this helps.

How to recover files on cpanel after deleting from trash folder?

All files were deleted from the trash; it looks empty. Is there any way to recover my files and folders?
You need to check whether your hosting provider offers backups for your account; if they do, you can restore from there. The most commonly used options are R1Soft and JetBackup.
Unless you have your own backups, those files are lost permanently.

best practice for backing up cvs repository?

Some of our projects are still on CVS. We currently use tar to back up the repository nightly.
Here's the question:
best practice for backing up a cvs repository?
Context: We're combining several servers from across the country onto one central server. The combined repository size is 14 GB (yes, this is high, most likely due to lots of binary files, many branches, and the age of the repositories).
A straight tar of the CVS repository yields a ~5 GB .tar.gz file. Restoring files from 5 GB tar files will be unwieldy, plus we fill up tapes quickly.
How well does a full-plus-incremental approach work, i.e. weekly full backups with nightly incremental backups? What open source tools solve this problem well (e.g. Amanda, Bacula)?
thanks,
bill
You can use rsync to create a backup copy of your repo on another machine if you don't need a history of backups. rsync works in incremental mode, so bandwidth is consumed only for sending changed files.
I don't think you need a full history of backups, as the VCS provides its own history management and you need backups ONLY as a failure-protection measure.
Moreover, if you worry about the backed-up repository being in a consistent state, you MAY want to use filesystem snapshots; e.g. LVM can produce them on Linux. As far as I know, ZFS on Solaris also has a snapshot feature.
The only case where you don't need snapshots is when you run the backup procedure in the dead of night, when no one touches your repo and your VCS daemon is stopped during the backup :-)
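
A rough sketch of that snapshot-plus-rsync combination on Linux (the volume names, mount points, and backup host are all assumptions):

    # /dev/vg0/cvs, /mnt/cvs-snap, and backup-host are placeholders
    # Take an LVM snapshot so rsync sees a consistent view of the repository
    lvcreate --size 1G --snapshot --name cvs-snap /dev/vg0/cvs
    mount -o ro /dev/vg0/cvs-snap /mnt/cvs-snap

    # Mirror the snapshot to the backup host; only changed files cross the wire
    rsync -az --delete /mnt/cvs-snap/ backup-host:/backups/cvs/

    # Drop the snapshot once the copy is done
    umount /mnt/cvs-snap
    lvremove -f /dev/vg0/cvs-snap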
As Darkk mentioned, rsync makes for good backups since only changed things are copied. Dirvish is a nice backup system based on rsync. Backups run quickly, restores are extremely simple since all you have to do is copy things, and multiple versions of the backups are stored efficiently.