Restore corrupted TFS files from $tf .gz files? - azure-devops

My hard disk had errors and I lost a number of files before they could be checked in. But I'm wondering if there's a way to look in the $tf folder's .gz files to restore the files that were lost and not checked in? Any chance a diff version was saved in there that I can restore from?

But I'm wondering if there's a way to look in the $tf folder's .gz files to restore the files that were lost and not checked in?
I'm afraid you cannot restore the files from those .gz files.
The .gz files are generated when mapping sources; they keep a hash and some additional information about every file in the workspace so that TFS can do change tracking for local workspaces and quickly detect changes in the files. They do not contain the source files themselves. If you never checked the files into source control, TFS cannot restore them.
To restore those files, you will need to find a way to recover the data from the hard disk.
Hope this helps.

Related

How to clean up and create an archiving strategy for a folder on GitHub

I have
A git repository
A folder in this repository
I upload .SQL files to this folder. Each new DDL is a new .SQL file, uploaded to the same folder, since this is the place from which a CI/CD process kicks off to act on the new file. I do change the SQL code now and then, but I have no use for the files after a certain point, as they get executed against the target database via Liquibase.
The Problem
Over time this folder has grown to close to 5000 .SQL files, and it is growing every day
It's getting cumbersome to navigate and find anything in this folder
The CI/CD build out of this folder is taking a lot of time because it zips the entire folder
I want to
Archive/Remove everything more than 3 months old from the main folder
Move the old files to an Archived location so that I can refer to them
Get the file count down to a manageable level
Possibly do the archiving in an automated way without manual intervention
I Do not want to
Delete the files as I have to maintain a history
Change the process, e.g. to having only one SQL file and keep changing it.
I do not want to delete the files as I have to maintain a history
And yet, this is the simplest solution, as you can still list the history of a deleted file.
# search the log for deleted files matching a pattern:
git log --diff-filter=D --summary | grep pattern_to_search
# show the full log of a deleted file:
git log --all -- FILEPATH
That means your process would simply:
list all files older than a certain date
add their name to a "catalog" file (in order to query them easily later on)
delete them with git rm (their history is still there)
For any file present in the "catalog" file, you still can check their log with:
git log -- a/deleted/file
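Automating those steps could look like the minimal sketch below. It assumes the .SQL files live in a folder named ddl and the catalog is a file named archived-files.txt (both names are placeholders for your layout), and that GNU date is available:
# archive every .SQL file whose last commit is older than 3 months
cutoff=$(date -d '3 months ago' +%s)   # GNU date; unix timestamp of the cutoff
for f in ddl/*.sql; do
    # use the last commit date rather than mtime (a fresh clone resets mtimes)
    last=$(git log -1 --format=%ct -- "$f")
    if [ "$last" -lt "$cutoff" ]; then
        echo "$f" >> archived-files.txt    # add the name to the catalog
        git rm --quiet "$f"                # remove it; history stays in git
    fi
done
git add archived-files.txt
git commit -m "Archive SQL files older than 3 months"
Run from the repository root on a schedule (cron or a scheduled CI job), this keeps the live folder small while git log keeps every archived file's history reachable.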

Perforce verify checksum before flush?

The Perforce manual describes p4 flush as a dangerous operation, because it does not actually transfer files. I was wondering whether it's possible to make it less dangerous by first verifying the files' checksums and only performing the flush if the checksums match.
My friend and I have identical files in our workspaces, but the files are big and take a long time to upload to the server. So when the upload is finished, I want to make sure the files are still identical.
We could calculate SHA-1 hashes of our workspaces and manually make sure the files are still identical. (We are working on an Unreal Engine project with lots of binary files, and some files might have changed by now.)
The project also contains some ignored files, and we have to make sure to exclude them from the checksum.
Does Perforce perform this verification by itself, or is there a command (or script) for this?
The p4 diff -se command will tell you whether your unopened workspace files match the corresponding depot revisions.
For your particular use case, I'd recommend skipping that step; if you follow p4 flush with p4 clean, it will force a re-sync of everything that doesn't match the depot (but only those files).
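For example (p4 clean -n is a preview, so nothing is touched until you run it without the flag):
# list unopened workspace files that differ from their depot revisions
p4 diff -se
# update the server's metadata without transferring file content
p4 flush
# preview what would be re-synced, then actually fix the mismatches
p4 clean -n
p4 clean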

Perforce - Recover deleted file

I opened a file (p4 edit) and made a few changes to it. I then deleted (rm -rf) the directory which contained this file, followed by a p4 sync -f to bring back the depot files (in hopes of getting rid of a lot of untracked/generated files in the directory).
However, it only helped partially. While I was able to get rid of the undesired files, the sync step could not bring back the edited file.
I can see that the file is in the opened state with p4 opened, but I can't seem to find a way to bring back this opened file along with my changes.
Any clues?
Edited files are not stored on the server; they are only stored locally. Since you removed the modified file with rm -rf you cannot get it back (unless the file was backed up by another process, such as a netapp .snapshot directory).
The server keeps track of the state of files but the changes are not stored until you submit.
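For future reference, shelving your open files now and then stores a copy of their content on the server and protects against exactly this kind of local loss (the changelist number below is a placeholder for the one p4 shelve prints):
# shelve the files open in the default changelist; the server keeps a copy
p4 shelve
# later, restore the shelved content into the workspace
p4 unshelve -s CHANGELIST_NUMBER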

Renaming .nupkg to .zip

We have recently changed our build process to output only .nupkg files, and one of our clients doesn't like this idea. I have renamed the .nupkg files to .zip and I can access the files.
My question is: is it acceptable to rename the file extension, or will it damage any of the files? Compression isn't an issue, so that doesn't matter; we just need to be able to give them a .zip version.
No, the contents of a file are not changed when the filename is changed, and the files inside the .nupkg won't get damaged by renaming it to .zip. A .nupkg is in fact a standard ZIP archive, so any ZIP tool can open it after the rename.
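If you need to produce the .zip copies in bulk while keeping the originals, a small shell loop is enough (assuming the packages sit in the current directory):
# copy each package to a .zip twin, leaving the .nupkg intact
for f in *.nupkg; do cp "$f" "${f%.nupkg}.zip"; done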

How can I prevent version control from tracking temporary files?

I am using Mercurial to version my source code. I am fairly new to it and still learning about its abilities. I added my source to a repository by adding the entire folder and all subdirectories.
The problem is that now I have temporary files which show up as 'changed'. I realized I don't need to track these temporary files at all. Is there some way I can tell Mercurial to forget all files with a specific extension, such as .~temp?
Use an hgignore file: http://www.selenic.com/mercurial/hgignore.5.html
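A minimal .hgignore at the root of the repository, matching the .~temp extension from the question, could look like this:
syntax: glob
*.~temp
Note that ignore rules only apply to untracked files. Since the temporary files already show up as 'changed', they are tracked, so you also need to stop tracking them first:
hg forget "glob:**.~temp"
hg commit -m "Stop tracking temporary files"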