I need to rename a TFS folder full of files to remove a specific block of text - powershell

I'm using VS 2015, so the TFSCMDLets add-in for 2015. TFS server is 2013.
The old "New-TfsPendingChange -Rename" syntax that I found here on SO is no longer supported, both per the docs and per the error messages I get when I try.
I can't just check them out and rename them using the filesystem rename command, because that causes TFS to lose track of the file.
I've also tried this using tf.exe, where the commands I need are available, but it can't seem to figure out which workspace to use, even though the containing folder maps to only a single workspace. It works fine when passed a literal filename, but fails when the path comes through foreach, like this:
tf workspaces /collection:devtfs\DeltaWA_ITA_BI
dir "C:\TFS\BusinessIntelligence\Database\Reporting_Prod\Test Views\" -filter "*.View.sql" | foreach { $newname = $.Name -replace ".View.sql", ".sql"; tf rename $ $newname }
Does anyone have a sample script that works in VS2015, TFS 2013?

You should avoid renaming items managed by TFVC through the operating system (for example, with Windows File Explorer or the rename command in the Windows command prompt). Instead, do it in Source Control Explorer:
1. Select the file that you want to rename, open its shortcut menu, and choose Rename.
2. Type the desired name for the item.
If you want to do this from the command prompt or a script, you could use the tf rename command. You didn't mention the exact error message for your workspace error.
However, you need to make sure that you are running the commands from a mapped folder; you can run tf workfold to double-check whether the current folder is mapped or not.
If it's mapped and the error still occurs, you might have a problem with your workspace cache. Try removing the cache with the tf workspaces command:
tf workspaces /remove:(*|workspace1[,workspace2,...])
/collection:(*|TeamProjectCollectionUrl)
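For example, clearing the cache for every workspace in the collection from the question might look something like this (the collection URL is a guess at the server's URL form; adjust it to your environment):
tf workspaces /remove:* /collection:http://devtfs:8080/tfs/DeltaWA_ITA_BI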

The overall answer turned out to be in two parts:
1. You really must execute tf.exe in the folder where you want the work done. I saw this in the code I looked at, but didn't understand it was a requirement.
2. This means you have to get tf.exe into your path. Got help from a coworker on that.
After that, my tiny script worked as desired.
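For reference, a minimal sketch of the kind of script that ended up working, assuming tf.exe is on the PATH and the script is run from inside the mapped folder (paths and filter taken from the question; the regex is escaped so only the literal ".View.sql" suffix matches):
# Run from inside the mapped workspace folder, with tf.exe on the PATH
cd "C:\TFS\BusinessIntelligence\Database\Reporting_Prod\Test Views"
Get-ChildItem -Filter "*.View.sql" | ForEach-Object {
    $newName = $_.Name -replace '\.View\.sql$', '.sql'
    tf rename $_.Name $newName
}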
I still don't see a way to do this with the TFS CMDLets. This is kind of a pity, but not really important.
Thanks for the assistance!


opening vscode to latest directory in specified path

Is it possible to have vscode automatically open the newest folder within a specific path?
For example, with this configuration:
{
  "folders": [
    {
      "path": "\\\\FromABC\\Archive",
      "name": "From ABC"
    },
    {
      "path": "\\\\FromXYZ\\Archive",
      "name": "From XYZ"
    }
  ]
}
I would expect these folders in the workspace to be pointing to the 2021\07\07 subfolders because those were created today:
\\\\FromABC\\Archive\\2021\\07\\07
\\\\FromXYZ\\Archive\\2021\\07\\07
Is it possible to create a workspace where the folders are opened to the latest folder within each configured path?
There's not enough information in the original question to fully answer it; however, I can suggest a few avenues of attack:
Custom Command (error-prone and picky)
Modify Upstream Process (likely the best overall)
Combining Both (perhaps the best for your immediate case)
Creating a Custom Command
Create a new command per https://code.visualstudio.com/api/extension-guides/command#creating-new-commands
VSCode Commands Listing: https://code.visualstudio.com/api/references/commands
new command
detect latest folder through whatever logic you like
call vscode.openFolder to navigate to it
call your custom command through Activation Events (activationEvents) at either onStartupFinished or * (Start Up; less-preferable, but may be required to avoid confusing the editor)
https://code.visualstudio.com/api/references/activation-events#Start-up
Check out Start app when opening project in VS Code? for a few answers related to this
Modifying the Upstream Process
Cutting the Gordian knot: it's likely some process (perhaps a human) is creating the directories for you
Change the upstream process so when it creates the directories, it also creates/updates a link to the directory labeled something like latest
/FromABC/Archive/2021/06/03
/FromABC/Archive/2021/07/05
/FromABC/Archive/2021/07/07
/FromABC/Archive/latest --> /FromABC/Archive/2021/07/07
/FromXYZ/Archive/2020/04/12
/FromXYZ/Archive/2021/08/18
/FromXYZ/Archive/latest --> /FromXYZ/Archive/2021/08/18
Then you can always refer to the latest directory and it will always be correct
This is quite common when something can change frequently, but another process is expecting a static path and/or has no way to know the schedule of change
I don't have any Windows systems to try this out with, and you may be able to create a regular shortcut for this... however, you may need a Junction (soft link) or Hard Link to convince VSCode that the directory is a real directory
https://learn.microsoft.com/en-us/windows/win32/fileio/hard-links-and-junctions
This also provides an opportunity to include more files, such as beta versions of some software, that you want packaged into the same directory structure but that aren't truly the latest stable!
Combining Both
If your upstream process is either not modifiable (or is some manual process that it's annoying or error-prone to add extra steps to), you can likely combine both solutions to get what you really want
Use the * Activation Event to call a script that detects the new directory and creates/updates your link - a small binary or PowerShell script will do
With this in place, as with the upstream change, just point VS Code to the latest directory and it shouldn't mind
I'm not sure which platform you are on; I assume Windows, but the approach is essentially the same.
Instead of trying to get VSCode to open the latest folder, I would create a script that updates a softlink for each folder to the latest subfolder in it. Then you can point VSCode to the softlink, which can be updated whenever needed to the latest subfolders.
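For example, a rough PowerShell sketch of such a script, assuming the dated year\month\day layout shown above and that it runs on the machine hosting the shares (junctions can't target UNC paths, so the local root paths below are illustrative):
# Hypothetical helper: repoint a 'latest' junction at the newest dated leaf folder
$roots = 'D:\FromABC\Archive', 'D:\FromXYZ\Archive'
foreach ($root in $roots) {
    # Keep only leaf folders (no subdirectories), newest first by creation time
    $newest = Get-ChildItem $root -Directory -Recurse |
        Where-Object { -not (Get-ChildItem $_.FullName -Directory) } |
        Sort-Object CreationTime -Descending |
        Select-Object -First 1
    if ($newest) {
        $link = Join-Path $root 'latest'
        if (Test-Path $link) { cmd /c rmdir "$link" }   # removes the junction, not its target
        New-Item -ItemType Junction -Path $link -Target $newest.FullName | Out-Null
    }
}
Run it from a scheduled task (or from the upstream process itself) whenever a new dated folder appears, and point the VS Code workspace at ...\Archive\latest.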

Not able to check out files with long names in ClearCase Explorer

I am not able to find the checkout option for a file when I right-click on it in ClearCase Explorer, while the option is available for the rest of the files.
The only difference is that the file I want to check out has a very long name (I am using Windows).
While I am able to check it out via cleartool commands, it isn't possible from the front end (ClearCase Explorer). Is there any way to check out such files from ClearCase Explorer?
There are two possible causes for the checkout option being unavailable:
the path of the file itself is too long (combined with the long filename, more than 256 characters): a subst command can help shorten the path.
or the file is already checked out.
A cleartool status can help distinguish between those two cases.
And a cleartool lsvtree -graph aFile (replace aFile with your long filename) can help see the file history and see if it is checked out (reserved or not) in another view.
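If the long path is the suspected culprit, a subst mapping is a quick way to test it (the drive letter and path below are hypothetical; point it at a folder inside your view):
subst X: "C:\views\my_view\some\very\deep\folder"
Then browse to X: in ClearCase Explorer and retry the checkout; subst X: /d removes the mapping afterwards.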
The Windows 255-character restriction on file names can't be bypassed or sidestepped.
You have the following options:
1. Try to shorten the name, including all the folders that form the prefix of the view path, as VonC suggested.
2. Try creating the view on Linux and mounting the folder to some shared area that can be seen from Windows.
3. Use a dynamic view, but I'm not sure it will work.
Keep in mind that other apps, like MSBuild or Visual Studio, can also fail even after you solve it for ClearCase, since it is a Windows problem.

How to use Pentaho Spoon to rename files that do not have an extension

I am new to using Pentaho Spoon. I have about 100 text files in a folder, none of which have file extensions. I have found that if I create a job and move a file, one at a time, that I can simply rename that file, adding a .txt extension to the end. What I'd like to do is create a job that goes through and renames each file and adds the .txt extension. I've tried using the regex, but can't seem to get it to work because there's no file extension.
Any help would be greatly appreciated.
It's a pretty straightforward solution but you need to use a Transformation, as Job steps won't do it, ok?
You need the following steps:
Get File Names: just add your folder and the RegExp ".*" (without the double quotes), so everything is listed. Check if it's ok with "Show filename(s)..." button.
Modified Java Script Value: declare a new_filename var concatenating the desired extension. Remember to click "Get Variables" after adding the script to output the new field.
var new_filename = filename + '.txt';
Process Files: select Operation = Move and filename/new_filename as your source/target filenames.
That's it!
Renaming a group of files is one thing I wouldn't use Kettle for. Why not let the shell do what the shell does best?
rem example for Windows CMD shell
ren absolute-path-to-folder\*. *.txt
This can be done using a Shell job entry, if you find reason to do it in Kettle at all.
I've seen "just use a shell script" answers for this before. Works great if you can guarantee your Kettle server is on the same OS as the developer workstation. I'm in an environment where the Dev/Spoon instance is Windows, but the Prod/Kettle environment is Linux, so you can't write one script file to rule them all.
As for "Why on earth would you do this?", my scenario is an integration scenario. We're using Pentaho for Data Integration, but a different tool for Enterprise Integration. I want a Pentaho Job to produce an output file, and I want my Enterprise Integration tool to pick up the file and do something with it, but not before Pentaho is done writing the file. Renaming helps avoid a race condition when the Enterprise Integration solution recognizes the file is there, but Pentaho isn't done writing it yet.
If I could rename a set of files, for example change from test.*.csv.processing to test.*.csv, then Pentaho would create the file initially with the .processing extension, and then remove the extension once it's done. The Enterprise Integration solution that's looking for test.*.csv won't start processing the file until Pentaho renames it. Bingo, no race condition.

"tf get" from command line doesn't get latest, UI does

Did I get the syntax correct?
tf get .\Web\project.root /recursive
All files are up to date.
tf get /version:T .\Web\project.root /recursive
All files are up to date.
Getting latest using the command line will report that all files are up to date when they are not! However, when I get latest using the TFS UI within Visual Studio, the latest code does actually download.
Until this gets resolved, my super fancy MSBuild script can't be used without opening Visual Studio to get latest first!
<Target Name="GetLatestCoreLibraries" Condition="'$(GetLatest)' == 'true'">
<Exec Command='tf get /version:T "$(CoreLibPath)\Source\Libraries /recursive' ContinueOnError="false" />
</Target>
The $(CoreLibPath) is a relative path passed into the script. Something like...
<PropertyGroup>
<CoreLibraryPath>..\..\Core\Release\xx.xx.xx.xx</CoreLibraryPath>
</PropertyGroup>
Is using relative paths to the local file system less reliable than using SCS paths? i.e. "$/Core/Release/xx.xx.xx.xx/Source/Libraries"?
Could it be that we're sometimes using Dev Studio UI, and other times using the command line that is confusing the command line version of TFS?
So, I realized my mistake. It was very simple.
Command =
'tf get
/version:T
/recursive
"$(CoreLibPath)\Source\Libraries <<<-- Missing closing quote.
'
When you miss the closing double quote on a tf get, there is no error thrown. It simply reports that "All files are up to date."
c:\Web\Release\x.x\x.x.xxxx>tf get /version:T "..\..\..\..\..\Core\Release\x.x\x.x.xxxx.xxxxx(xxxx xx xx - xxx)\Source\Libraries /recursive
All files are up to date.
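For comparison, the corrected command simply closes the quote around the itemspec:
tf get /version:T "..\..\..\..\..\Core\Release\x.x\x.x.xxxx.xxxxx(xxxx xx xx - xxx)\Source\Libraries" /recursive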
Q: Is using relative paths to the local file system less reliable than using SCS paths? i.e. "$/Core/Release/xx.xx.xx.xx/Source/Libraries"?
A: No, it doesn't seem to be any less reliable.
Q: Could it be that we're sometimes using Dev Studio UI, and other times using the command line that is confusing the command line version of TFS?
A: No, this was a case of user confusion, not SCS confusion.
Try using the /force parameter. That'll force everything to be retrieved, which maybe you don't want.
Alternatively you could get the MSBuild Extension Pack from CodePlex - they have MSBuild tasks that wrap these calls and work with IntelliSense if you're using Visual Studio to manage your build scripts.
Your itemspec looks odd to me, but I don't have any specific corrections to offer. I explicitly call out the workspace when using command line calls, e.g. tf get "$/<our product>/<branch>[/<project>]" /force /recursive. Otherwise pathing is relative to the current active workspace mapping.
Another option is that the user does not have access to the project collection. In my case I could somehow see the directories but did not have access to pull down the code. I had to have a TFS admin add me as "developer" access.

Eclipse's local history...where are files saved?

Can someone explain how Eclipse's local history works?
I accidentally overwrote a file in a project but need to revert to an earlier version.
Is there a chance that Eclipse has the older file cached somewhere?
To complete CurtainDog's answer, from the Eclipse FAQ:
Every time you modify a file in Eclipse, a copy of the old contents is kept in the local history. At any time, you can compare or replace a file with any older version from the history.
Although this is no replacement for a real code repository, it can help you out when you change or delete a file by accident.
Local history also has an advantage that it wasn’t really designed for: The history can also help you out when your workspace has a catastrophic problem or if you get disk errors that corrupt your workspace files.
As a last resort, you can manually browse the local history folder to find copies of the files you lost, which is a bit like using Google’s cache to browse Web pages that no longer exist.
Each file revision is stored in a separate file with a random file name inside the history folder. The path of the history folder inside your workspace is
.metadata/.plugins/org.eclipse.core.resources/.history/
You can use your operating system’s search tool to locate the files you are looking for.
Note: if you need to import your local history into a new workspace, you will need both:
.metadata/.plugins/org.eclipse.core.resources/.history
.metadata/.plugins/org.eclipse.core.resources/.project
to have a functional local history in that new workspace.
Try right-clicking on the file in eclipse, and choose Replace With->Local History.
If there's history available, it'll show up as a list of edit times.
But more importantly, as pointed out in other answers, be sure to put your files in version control! SVN is pretty easy to set up (you don't need a server; it can just use the file system); use it even if you aren't sharing with others.
A tip: whenever you hear yourself say "yes!", check in all of your code. 10 minutes later, you'll be saying "how did I mess that up?"
If you have lost a full package structure due to accidental deletion or an SVN/CVS override, select the project > right-click > Restore from Local History => select the files.
VonC's answer has all the information you need for finding the location of your code backups. I would simply add that if you are on a Mac or Linux, you can do something like this:
$ cd [WORKSPACE]/.metadata/.plugins/org.eclipse.core.resources/.history/
$ grep -rl "class Foo" . | xargs ls -lt
This will find all the versions of a file that contains a particular string (ie. "class Foo"), and sort them by date/time to easily find the most recent version.
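On Windows, a rough PowerShell equivalent would be (assuming $workspace holds your workspace path and "class Foo" is just the example search string):
cd "$workspace\.metadata\.plugins\org.eclipse.core.resources\.history"
Get-ChildItem -Recurse -File |
    Where-Object { Select-String -Path $_.FullName -Pattern 'class Foo' -Quiet } |
    Sort-Object LastWriteTime -Descending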
The link http://wiki.eclipse.org/FAQ_Where_is_the_workspace_local_history_stored%3F is very helpful.
Open the CVS view and you should see a filter for local history. You should then be able to right-click on the correct version and Get Contents or do a manual compare and merge. I'm not sure what the eclipse defaults are for keeping local history but there is a decent chance you'll be able to get your stuff back if you act quickly.