How to mirror/synchronize a local workspace folder with the server folder? - azure-devops

I have a folder that regularly gets updated. This folder is a part of a TFS workspace. I need to commit all those changes to the workspace once they occur (add what isn't there, delete what was removed and update what was changed).
Currently, I have a script that runs tf vc folderdiff command on the folder and its server counterpart, parses out the output to get three lists - files that need to be added, deleted and updated. It then manually adds, deletes and updates those files by invoking tf add/delete/checkout on batches of files (trying to add/delete/checkout in one go can cause an error if there are too many files in the list).
There has to be some better way. Is there some kind of tf command where I can tell it, look at this local folder, look at the server folder that is mapped to it and make the server folder look exactly the same? Bonus points if I can specify some kind of filter to exclude certain paths or file names/extensions.

Apparently there exists a tf reconcile command. You can learn more about the syntax here: https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/reconcile-command
Off the top of my head, the following command should do what I want:
tf reconcile [path to folder] /promote /adds /deletes /recursive /noprompt
There is also /exclude that can be used to filter unwanted files, so I get the bonus points too.
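For example, assuming the folder is C:\Work\MyFolder (the path and masks below are placeholders, and the exact /exclude syntax may vary by client version), a single call that promotes local adds, edits and deletes while skipping log files and the bin folder might look like:
tf reconcile C:\Work\MyFolder /promote /adds /deletes /recursive /noprompt /exclude:*.log,bin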
Of course I had to stumble upon it right after asking a question...

Related

Deleting files and folders from a Helix server while keeping them on the PC

I am new to Perforce. I submitted my previous project to it as asked and added the p4ignore later, but now I don't want the files covered by the p4ignore, such as the .vscode folder, on the server. However, when I try to mark them for delete it also deletes them from my machine. How can I remove them from the server but keep them on the local machines?
You probably want to use the p4 obliterate command; this is used to permanently remove files from the server (including all their history), which will leave your local files in an untracked state. Note that this requires admin level permission since file history is normally considered immutable.
If you can't get an admin to help with this, you can use the p4 delete -k command to open the files for delete while keeping the local files. This is a little tricky because it still results in a deleted revision, and if you're not careful you might end up getting surprised at some point by having a sync operation delete your local files (e.g. a force sync may delete your local files to force them into agreement with the head depot revision even though they aren't on the client have list).
To avoid that potential problem, after you delete the files, exclude them from your client view. That will not only prevent them from being added (similar to .p4ignore) but will also firmly exclude them from any operation that touches client files, including sync. (I usually recommend using the client view to exclude files in the first place instead of p4ignore -- it has the advantage of being tracked on the server, and it also prevents you from syncing down "ignored" files submitted by other workspaces whose settings don't match yours.)
tl;dr: use obliterate for mistakenly added files if you can, otherwise use a combination of delete -k and client view tuning to make sure the depot and client files are hidden from each other.
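As a rough sketch, assuming a depot path of //depot/project/.vscode and a client named myclient (both placeholders): obliterate previews by default and only executes with -y, while the non-admin route opens the files for delete but keeps the local copies:
p4 obliterate //depot/project/.vscode/...
p4 obliterate -y //depot/project/.vscode/...
p4 delete -k //depot/project/.vscode/...
p4 submit -d "Remove .vscode from the depot, keep local copies"
After that, add an exclusion line to the client view (p4 client) so sync never touches those files again:
-//depot/project/.vscode/... //myclient/project/.vscode/...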

How can I add a directory tree to my github repo?

I've been working on a project that's fairly far along now and I decided it's time to use some sort of version control. I decided to go with GitHub. Before I get in too deep, let me state explicitly that I am new to GitHub.
My project resides in a directory that contains myriad subdirectories and files of all different kinds. I'd like to take my project directory as is (structure and all) and put it in my github repo.
I followed the tutorials on GitHub's web page, created the repo, and manually added some files. Obviously I don't want to manually add every file (there are several hundred). I'd like to know how I can add the root directory, or for that matter any parent directory, and all files/folders in said directory. In other words, I'm looking for a recursive add.
I read on this SO page (How to create folder in github repository?) that you can just use
git add directory/
That works fine for me when I'm dealing with the lowest level directory, but when I try the same command on a directory with subdirectories my terminal just sits there and I have to ctrl-c. I can't tell if it's just taking a long time (as I mentioned there are lots of files) or if this is just the wrong way to add a directory with subdirectories.
Apologies in advance if this is a super ignorant question -- I have looked at a lot of blogs/posts/etc and I cannot find a solution that seems to work.
Use the Current Working Directory
Assuming you're on Linux or OS X, from the command line you would do the following:
git add .
from the root of your repository tree. That will add all non-ignored files, including the contents of subdirectories, to the repository (Git does not track empty directories).
From the root directory (the one with all the subdirectories), use git add -A.
If you have a ton of subdirectories and files, it may take a long while, so just let it sit there until it's done.
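For completeness, a typical end-to-end sequence from the repository root might look like the following; the commit message is just an example, and the branch name main is an assumption (older repos often use master):
git add -A
git status
git commit -m "Add existing project tree"
git push origin main
The git status step is optional; it simply lets you review what is about to be committed before you commit and push.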

History with diffs embedded

To see the history of each check-in a user has made in a directory tree, I can type:
tf history . /recursive /user:name /noprompt /format:detailed
It displays all check-ins "name" has performed, with check-in comments and paths to the changed files. I want to display, in addition to that, the diff of each affected file, something like a hypothetical /format:extraverbose. Is there a way to have tf do that? If not, how could I create a PowerShell script that does it for me?
You can disregard things like branches and merges - there are none in the directory tree.
I don't think there's a command-line option for that right now; maybe you can create a PowerShell script using the TFS Power Tools cmdlets.
Otherwise you can still make a command-line exe using the TFS API; it's easier than one might think. Look at this answer to get the source files of a command-line tool that I made for someone.
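For what it's worth, here is a minimal PowerShell sketch of the scripted approach using plain tf.exe rather than the Power Tools. It assumes tf.exe is on the PATH, the current directory is mapped in a workspace, and that diffing each changeset against the previous changeset number is acceptable (fine when there are no branches or merges, as you say). The user name and the /format choices are placeholders, and the exact versionspec/diff options may need tweaking for your server version:
# List the user's changesets under the current directory, then print each one's diff
$user = "name"
$history = tf history . /recursive /user:$user /noprompt /format:brief
foreach ($line in $history) {
    # Data lines in brief format start with the changeset number
    if ($line -match '^\s*(\d+)\s') {
        $cs = [int]$Matches[1]
        $prev = $cs - 1
        Write-Output "===== Changeset $cs ====="
        tf difference . /recursive /noprompt /version:C$prev~C$cs /format:unified
    }
}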

How to export the files under a label from TFS in a script using tf?

I need a batch script that uses tf to retrieve the directory structure for a label in TFS, something like the equivalent of svn export, without messing up my current workspace.
This is what I managed to come up with:
tf workspace /new TemporaryWorkspace /noprompt
This will create a new workspace, but with the following working folder:
$/: C:\
(considering that I ran the command from C:)
This is not what I want, but "tf workspace /new" doesn't seem to allow specifying the mapping, so I ran this to remove the default mapping:
tf workfold /unmap $/ /workspace:TemporaryWorkspace
then this one to create my desired mapping.
tf workfold /workspace:TemporaryWorkspace /map $/Project/Path C:\Temp\Path
Change the current directory to the local working folder (I don't know of another way to select the current workspace)
PUSHD C:\Temp\Path
Now I can finally retrieve the label and do my stuff with it.
tf get /version:LMyBeautifulLabel
Now the clean up.
tf workspace /delete TemporaryWorkspace /noprompt
Go back
POPD
All this seems a bit too cumbersome for my humble purpose. Is there a simpler way?
Thanks.
Unfortunately, you will need to create a workspace with the proper working folder mappings and then run the get. There's no one-liner alias to set this up for you.
You may be able to get by with creating a longer-lived workspace with the proper working folder mappings that you need not delete, but certainly if you're using this workflow frequently but with different labels or in different locations, creating a new temporary workspace each time probably does make the most sense.
Your best solution here is to either create a command script that executes this workflow or use the little known script functionality of the tf command line client. You can run a tf script by using:
tf @<filename>
or simply using:
tf @
to read from standard input.
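If you go the script route, a minimal PowerShell wrapper around the exact steps you listed might look like this; the server path, local path, label and workspace name are all placeholders to replace with your own:
# Sketch: export the contents of a label into a scratch folder via a throwaway workspace
$ws = "TemporaryWorkspace"
tf workspace /new $ws /noprompt
tf workfold /unmap '$/' /workspace:$ws
tf workfold /workspace:$ws /map '$/Project/Path' C:\Temp\Path
Push-Location C:\Temp\Path        # tf picks up the workspace from the current directory
tf get /version:LMyBeautifulLabel
Pop-Location
tf workspace /delete $ws /noprompt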

How to clear TFS server knowledge of my local version?

Our build person was having issues compiling some source code that is checked into our TFS instance.
I was working on some changes that I was not ready to check in so I made a manual backup of my local folder and deleted the contents of my local folder. Then I did a "Get Latest - Specific Version, with overwrite" to ensure I got the latest. And made sure it compiled (it did, the issue was a setup issue on the build machine).
So now if I manually rename folders locally to go back to my version I have the problem that TFS thinks I have all the latest source ... which I don't. Files were changed by another developer but since I did a "Get Latest - Specific Version, with overwrite" it considers my code to be completely up to date.
Questions:
Can I somehow 'tell' TFS that my local versions are not the latest?
(I'm thinking that I might to do this with a TFS cmd line util but not really sure which one)
Was there a different way I should have done this?
Thanks.
You could delete/remove your local workspace.
Source Control Explorer -> Workspace dropdown -> Workspaces -> Remove
If you do a Get Specific Version to changeset 1 of your source code, TFS will delete the local files and will believe that you no longer have the latest code in your workspace. Then, when you do a Get Latest, it will actually get the latest.
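From the command line, the equivalent is roughly the following; the server path is a placeholder, and /overwrite will clobber writable local files, so make sure your backup copy is safe first:
tf get $/Project/Path /version:C1 /recursive /overwrite
tf get $/Project/Path /version:T /recursive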
In future, instead of making a manual copy, create a shelveset. In the "pending changes" window, click "Shelve" and follow the dialogue (in this case you'd not want to keep your pending changes locally). This puts your work on the server in a secure, recoverable place, but without checking it in.
Alternatively, in the workspace dropdown, you can create a second workspace. That gives you two separate copies of the code locally, but also two separate sets of checkouts. This is really useful if you often find yourself interrupting one piece of work to look at something else.
If you do another "get specific" with overwrite, this should still fix your problem.
Do you know which files are changed? Are we talking a lot of files? Or just a few?
If it is just a few, then you should just copy your changed versions back in and then re-check out the files. TFS will then register that you have changed those files.
If you have a lot of changed files then I recommend you give the Team Foundation Power Tools (tfpt) Online "Command Line" command a try.
The Command Line Help can be seen here.
Here some more info from Buck Hodges:
Online
With Team Foundation, a server connection is necessary to check files in or out, to delete files, to rename files, etc. The TFPT online tool makes it easier to work without a server connection for a period of time by providing functionality that informs the server about changes made in the local workspace.
Non-checked-out files in the local workspace are by default read-only; the user is expected to check out a file with the tf checkout command before editing it.
When working offline with the intent to sync up later by using the TFPT online tool, users must adhere to a strict workflow:
* Users without a server connection manually remove the read-only flag from files they want to edit. Non-checked-out files in the local workspace are by default read-only, and when a server connection is available the user must check out the file with the tf checkout command before editing the file. When working offline, the DOS command "attrib -r" should be used.
* Users without a server connection add and delete files they want to add and delete. If not checked out, files selected for deletion will be read-only and must be marked as writable with "attrib -r" before deleting. Files which are added are new and will not be read-only.
* Users must not rename files while offline, as the TFPT online tool cannot distinguish a rename from a deletion at the old name paired with an add at the new name.
* When connectivity is re-acquired, users run the TFPT online tool, which scans the directory structure and detects which files have been added, edited, and deleted. The TFPT online tool pends changes on these files to inform the server what has happened.
To invoke the TFPT online tool, execute
tfpt online
at the command line. The online tool will begin to scan your workspace for writable files and will determine what changes should be pended on the server.
By default, the TFPT online tool does not detect deleted files in your local workspace, because to detect deleted files the tool must transfer significantly more data from the server. To enable the detection of deleted files, pass the /deletes command line option.
When the online tool has determined what changes to pend, the Online window is displayed.
Individual changes may be deselected here if they are not desired. When the Pend Changes button is pressed, the changes are actually pended in the workspace.
Important Note: If a file is edited while offline (by marking the file writable and editing it), and the TFPT online tool pends an edit change on it, a subsequent undo will result in the changes to the file being lost. It is therefore not a good idea to try pending a set of changes to go online, decide to discard them (by doing an undo), and then try again, as the changes will be lost in the undo. Instead, make liberal use of the /preview command line option (see below), and pend changes only once.
Preview Mode
The Online window displayed above is a graphical preview of the changes that will be pended to bring the workspace online, but a command-line version of this functionality is also available. By passing the /preview and /noprompt options on the command line, a textual representation of the changes that the TFPT online tool thinks should be pended can be displayed.
tfpt online /noprompt /preview
Inclusions
The TFPT online tool by default operates on every file in the workspace. Its focus can be more directed (and its speed improved) by including only certain files and folders in the set of items to inspect for changes. Filespecs (such as *.c, or folder/subfolder) may be passed on the command line to limit the scope of the operation, as in the following example:
tfpt online *.c folder\subfolder
This command instructs the online tool to process all files with the .c extension in the current folder, as well as all files in the folder\subfolder folder. No recursion is specified. With the /r (or /recursive) option, all files matching *.c in the current folder and below, as well as all files in the folder\subfolder folder and below will be checked. To process only the current folder and below, use
tfpt online . /r
Exclusions
Many build systems create log files and/or object files in the same directory as source code which is checked in. It may become necessary to filter out these files to prevent changes from being pended on them. This can be achieved through the /exclude:filespec1,filespec2,… option.
With the /exclude option, certain filemasks may be filtered out, and any directory name specified will not be entered by the TFPT online tool. For example, there may be a need to filter out log files and any files in object directories named “obj”.
tfpt online /exclude:*.log,obj
This will skip any file matching *.log, and any file or directory named obj.
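Putting the pieces together for the original scenario (the filespecs are placeholders), you could preview first and then pend the changes for real:
tfpt online . /r /deletes /exclude:*.log,obj /preview /noprompt
tfpt online . /r /deletes /exclude:*.log,obj
Every option used here (/r, /deletes, /exclude, /preview, /noprompt) is described above; only the combination and the masks are illustrative.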
I'm using a hack: open the solution without a network connection (unplug the cable, turn off Wi-Fi) and the solution will open in offline mode.
There is also a plugin called "Go Offline" for that.
Then you click "Go Online", which is displayed automatically when the solution is offline.
After this, VS will check all your local files against TFS and automatically checkout files which were changed.
But for your case, I would also suggest to use shelvesets.
In TFS 2013+ and VS 2015+ you have the Cloak option, which deletes local files and cloaks those branches from getting downloaded to your local workspace (basically it unmaps specific branches).