Here is my problem. I have a build script which adds mappings to a certain workspace dynamically, then unmaps them when it is through. I am worried that if (when) my script fails before the unmapping is done, the mappings will hold over until the next run and screw things up.
So I would like to unmap the entire workspace at the start of the script and recreate it, but the problem is I don't know specifically what might be there. Through the TFS command line I can unmap easily enough, but you have to know exactly what the mapping is. My question is: what is the easiest, best way to get this done?
Thanks for your help!
If you're sure there won't be any pending changes in the workspace you need to preserve, it's probably easier to delete it entirely and create a new one.
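For example, something along these lines with tf.exe (a rough sketch: the workspace name and mapping paths are placeholders, you may also need /collection or /server depending on your TFS version, and the exact switches vary a little between versions):
# Sketch only: names and paths are placeholders.
$tf = "tf.exe"                      # or the full path to tf.exe
$workspace_name = "BuildWorkspace"  # hypothetical workspace name

# Delete the old workspace, mappings and all (this just prints an error if it doesn't exist).
&$tf workspace /delete $workspace_name /noprompt

# Recreate it empty, then add only the mappings the build needs.
&$tf workspace /new $workspace_name /noprompt
&$tf workfold /map '$/TeamProject/Some/Path' 'C:\build\src' /workspace:$workspace_name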
Or if you do want to stick with your algorithm, the Power Tools make it much easier:
$ws = get-tfsworkspace .
$ws.Folders | % { $ws.DeleteMapping($_) }
My solution was to save the output of this...
$workspace_info = [String[]] (&$tfs_cli workfold /workspace:$workspace_name)
...into a string array, then iterate over it, looking for "$/", which indicates a mapping, and unmap it:
foreach($wi in $workspace_info)
{
    if($wi.Contains("$/"))
    {
        $mapping = $wi
        # minor string manipulation code left out for brevity
        &$tfs_cli workfold /unmap $mapping /workspace:$workspace_name
    }
}
I want to build a simple script that may be useful for others as well, but I have only very basic programming knowledge and can't do it myself without learning how to write PowerShell scripts from scratch.
What this script is supposed to do is open an INI file (really just a txt), look for a variable with an assigned value, replace that value with the contents of a txt hosted on GitHub, save, and then run a program.
This is for the tracker list of qBittorrent, since that feature still hasn't been implemented. The only other script I could find that does this is for Linux and Mac; there seems to be none for Windows.
The basic idea is this:
get-content "c:\users\[user]\appdata\roaming\qbittorrent\qbittorrent.ini"
# This is where pseudo code starts
get file from "[github-link.txt]"
save file to cache # keeping it is useless as it gets updated daily
find variable "Session\AdditionalTrackers=" in qbittorrent.ini
replace value of variable with content of cached file # this is what I struggle with most when looking for example code. Everything I could find specified the exact string that needed replacing, which in this case is quite long and may change with every update of the file.
overwrite original file
launch program qbittorrent.exe
end script
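In PowerShell terms, the above would be something along these lines (a rough sketch: the GitHub URL, the .ini path, and the qbittorrent.exe path are placeholders, and the assumption that the whole tracker list lives on the single Session\AdditionalTrackers= line, joined with literal \n separators, may need adjusting):
# Sketch only: URL and paths are placeholders; qBittorrent must be closed while the file is edited.
$iniPath    = "$env:APPDATA\qBittorrent\qBittorrent.ini"
$trackerUrl = "https://raw.githubusercontent.com/user/repo/master/trackers.txt"   # placeholder

# Download the tracker list and collapse it onto one line (assumed storage format).
$raw  = (Invoke-WebRequest -Uri $trackerUrl -UseBasicParsing).Content
$flat = (($raw -split '\r?\n') | Where-Object { $_ }) -join '\n'

# Replace the value of the Session\AdditionalTrackers= line and write the file back.
(Get-Content $iniPath) | ForEach-Object {
    if ($_ -match '^Session\\AdditionalTrackers=') { "Session\AdditionalTrackers=$flat" } else { $_ }
} | Set-Content $iniPath

# Finally, launch qBittorrent (install path is an assumption).
Start-Process "C:\Program Files\qBittorrent\qbittorrent.exe"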
Conveniently, or most likely deliberately, all (most) of the tracker lists on GitHub are already formatted in a way that they can be directly pasted into the file without having to worry about formatting. Example.
I can totally understand if nobody wants to do the work, but I would greatly appreciate it and possibly others that are looking for a stopgap for the lacking feature.
If this already exists, go ahead and call me an idiot and while you're at it drop a link ;)
I just found a little tool called Power Automate, and it pretty much does what I was looking for. It's not quite as elegant as a single-click script, but it does the job. Sadly, I can't share the "flow" I built because, well, there is no option for it - thanks, Microsoft. So I'll try my best to write it out.
Not quite a "solution" but pretty close to it.
Here is the "flow":
get file from web // from github for example
read text from file // read downloaded .txt file
read text from file // read qBittorrent.ini
crop text // crop between flags in qBittorrent.ini use "Session\AdditionalTrackers=" as start and "Session\GlobalMaxRatio=" as end and save to cropVar2
crop text // crop before flag use "Session\AdditionalTrackers=" as flag and save to cropVar1
crop text // crop after flag use cropVar2 as flag and save to cropVar3
replace text // replace cropVar2 with content of downloaded file and save to cropVar2
write text to file // write cropVar1,cropVar2,cropVar3
end flow
Keep in mind that any changes to the qBittorrent.ini may change the order of the entries, which means you have to check that it's still correct after every update and after every change you make in the options. This is a massive kludge, after all...
You can add fail-safes so that you won't break anything if the order changes.
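For anyone who would rather have a script than a flow, a rough PowerShell equivalent of the crop/replace steps above (the URL and .ini path are placeholders, and it assumes the two flag keys appear in that order, just as the flow does):
# Sketch only: replaces everything between the two flag keys with the downloaded list.
$iniPath = "$env:APPDATA\qBittorrent\qBittorrent.ini"
$new = (Invoke-WebRequest -Uri "https://example.com/trackers.txt" -UseBasicParsing).Content.Trim()

$ini = Get-Content $iniPath -Raw
# (?s) lets . match newlines; the two keys themselves are kept via ${1} and ${2}.
$pattern     = '(?s)(Session\\AdditionalTrackers=).*?(Session\\GlobalMaxRatio=)'
$replacement = '${1}' + $new + "`r`n" + '${2}'
Set-Content -Path $iniPath -Value ($ini -replace $pattern, $replacement)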
I have a bunch of archives that I want to extract. Problem is, there are a lot of them, and it's a lot of info to move around. I'd like to do it all at once. It's probably taken more time to research than to do it manually, but research is more interesting.
TL;DR: Would like help with the 7-zip command line to extract multiple archives into their own directories. AutoHotkey, PowerShell, and batch file answers would also be nice if you are feeling extra helpful.
Win10, latest update and all that. I've been using 7-zip, so if there's a better extractor for this it might be a helpful suggestion. I have a little experience with coding, so I can usually parse an example and apply it to my project, but I can't come up with code on my own. So with that said, I'm comfortable using cmd, AutoHotkey, PowerShell, batch files, and a few others, but I need an example before I can do anything. haha
So, in my research, I found
(7z x -o"...\Stellaris\mod\Examples\" "...\content\281990\*")
for cmd, which works, except that it extracts everything to the same dir since the archive files are in the root archive dir (I think that's why; if they were one folder down, it should work like I want, right?). I don't think you can use environment variables in the path(?). Not sure what would make it work here...
PowerShell: I only recently started tinkering with it, so the one script I found didn't make any sense to me. And I never found anyone using AutoHotkey for this.
And finally, a batch file I found here seemed to come closest (normally I'd comment on that thread because apparently it's still active, but I don't have 50 rep), but I wasn't sure how to modify it for my purposes:
@echo off
SET "filename=%~1"             & REM Where does the working dir path go?
SET "dirName=%filename:~0,-4%" & REM How/where would you put in wildcards?
7z x -o"%dirName%" "%filename%"
I don't mind using any method, though I might prefer AHK? I'm probably most experienced there.
If you made it this far, wow, I'm impressed! I hope it was coherent enough to understand (probably not at first?). And maybe a little entertaining? I think I'm funny. Let me know if I should add or remove anything for the future. I know it's probably way too much context, but I would rather have too much than not enough, and I'm never sure what would be relevant and what would not. I'm not happy with my code format here, but I didn't quite understand what the help was saying about whitespace and I'm not familiar enough with Markdown yet (I wanted comments to be in line). Also, I'm honestly not sure about the tags.
EDIT: Added TL;DR at the top, and...
Found an answer via a program that does this. I'll post it in an answer as well: ExtractNow seems to be a bit outdated, last update was in '17, but it did what I wanted it to.
For interactive use at the command prompt:
for %z in ("\path\to\dir\subdir\*.zip") do @echo 7z x "-o\path\to\extracted\%~nz" "%~z"
This won't run 7z, but it will print out the commands. Once you are satisfied that the printed commands look fine, remove the @echo to execute them.
In a batch script you must of course duplicate the % signs.
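If you'd rather do it from PowerShell, a rough equivalent (7z.exe is assumed to be on the PATH, and the paths are placeholders):
# Sketch: extract every archive in the source folder into its own subfolder named after the archive.
$source = "\path\to\dir\subdir"   # placeholder
$target = "\path\to\extracted"    # placeholder

Get-ChildItem -Path $source -Filter *.zip | ForEach-Object {
    $dest = Join-Path $target $_.BaseName
    & 7z x $_.FullName "-o$dest"
}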
Found an answer via a program that does this. ExtractNow seems to be a bit outdated, last update was in '17, but it did what I wanted it to with only a few settings changes.
So, in my research, I found
(7z x -o"...\Stellaris\mod\Examples" "...\content\281990\*")
for cmd, which works, except that it extracts everything to the same dir...
Assuming you were using Windows, 7-zip would have worked fine to do what you wanted. The only thing you were missing is the * character, which 7-zip expands to be the archive name when used with the -o switch:
7z x "dir\subdir\*.*" -o"dir\*"
So 7z x -o"...\Stellaris\mod\Examples" "...\content\281990\*"
becomes:
7z x -o"...\Stellaris\mod\Examples\*" "...\content\281990\*"
Also be aware that *.* does not mean any file under 7-Zip. 7-Zip takes *.* to be the name of any file that has an extension. To process all files, just use "dir\subdir\*" without the extra .*.
I'm trying to copy several files with changing filenames. It seems very easy, but I can't work out how to do it without actually listing out the filenames in their entirety. The first few letters of the filenames correspond to the subject names, which I'm looping through one by one. In each folder there are 2 files: one is something like subj1_load1_vs_load2.img, and the other is subj1_load1_vs_load2.hdr. I want both of them copied. Below is what I have:
subj = {'subj1','subj2','subj3','subj4','subj5'};
for i = 1:length(subj)
    source = fullfile(filedir, subj{i}, sprintf('^%s_.*\.*', subj{i})); % this doesn't seem to work
    destination = fullfile(destdir, subj{i});
    copyfile(source, destination);
end
I've also tried:
source=dir([filedir subj{i} strcat(subj{i},'*')]);
This appears to needlessly complicate things, since I will need to deal with .name. But perhaps I don't know how to use this well.
Anyway, the problem is with source, as I'm trying to find the files I want to copy.
I'd appreciate any suggestions.
Below is Daniel's answer (which solved the issue for me):
source=fullfile(filedir,subj{i},strcat(subj{i},'*'))
I currently have a mess of Perl code that includes something like a configuration.pm file, which exports a large number of variables that other modules are using. The same module uses at least one module, call it Foo, which we wrote, in some of the helper methods provided by configuration.pm (they should be in a different module, but I'm not ready to change this yet).
Currently it loads the module with something like this right near the top of the file:
BEGIN { push @INC, 'hard/coded/directory' }
use Module::Foo;
I'm trying to get rid of this hard coded directory. I've already added a default configuration file for it to read data from. I moved the import down some and replaced the use with a require, something like this...
$script_directory = $config_data_from_file{'script_directory'};
push @INC, $script_directory;
require Module::Foo;
However, I want to add a command-line argument to Main.pl to point to a different configuration file if I don't want to use the default one. My problem is that all the other modules expect configuration.pm to have loaded its configuration data and required Foo as soon as they include it. So I can't have configuration.pm wait to initialize until main.pl is ready. The closest I can come up with is something like this:
package Configuration;

load_config_file('default/file/location');

sub load_config_file($) {
    %config_data_from_file = read_file($_[0]);
    $script_directory = $config_data_from_file{'script_directory'};
    push @INC, $script_directory;
    require Module::Foo;
    # load the rest
}
and have Main.pl call load_config_file again if a command-line option changes the configuration file.
But this is a problem for two reasons. First, if my default script location doesn't exist, I still explode when I try to do the first import. Second, I'm requiring Foo twice, overwriting it, which could lead to issues if there are differences between the files. For that matter, adding the default script_directory to @INC should be avoided.
There are a few ways to fix the problem that I could see: a way to more cleanly load a different version of a module to replace the old one, a way to make Foo delay its attempt to load until the first time it's used in the file, or a way to delay the load_config_file call until after I read the configuration file, for example. However, as a Perl newbie I don't know how to do any of them, and I haven't had much luck finding out how online.
I actually can do this now, with a fragile order of loading data that makes presumptions, or by skipping ahead to a more thorough refactor of dozens of scripts to implement the long-term solution sooner (but I'm really afraid to touch that much code before I have a way to test it on my computer). However, I'm asking partially in hopes of learning more features of Perl I may find useful later; how would this be solved if I couldn't do the refactoring?
If you want to give the configuration file as the first parameter you can do something like this:
Main script:
#!perl
BEGIN {
use Configuration;
}
use Module::Foo;
... rest of script ...
Configuration.pm:
package Configuration;
load_config_file($ARGV[0] || 'default/file/location');
sub load_config_file($) {
    %config_data_from_file = read_file($_[0]);
    $script_directory = $config_data_from_file{'script_directory'};
    push @INC, $script_directory;
}
My solution in general was to look for my -f argument for a configuration file in my configuration.pm as soon as it is loaded, and load the configuration file then if possible, while leaving the @ARGV variable untouched so that others could still parse it. This means we end up parsing command-line args twice (actually 3 times), but that doesn't do any real harm. I am enforcing that the -f argument be predefined in any module that uses my configuration.pm, and sort of requiring configuration.pm to be the very first module we include, but I consider that a minor expense. Anyone using our configuration.pm file for configuration arguments should want that behavior.
I found AppConfig was the best module for handling this. My solution could be done without it, but AppConfig made it cleaner because it combines the means of loading variables from a config file and from the command line. In fact, by pure accident, I ended up adding the ability to modify any single variable directly from the command line because of the way I did it.
My configuration.pm looks something like this (rewriting from memory, not exact):
use AppConfig;

$conf = AppConfig->new({
    GLOBAL => {
        EXPAND   => AppConfig::EXPAND_VAR,
        ARGCOUNT => AppConfig::ARGCOUNT_ONE,
    },
});

$conf->define("script_dir", { DEFAULT => "/default/location" });
$conf->define("f", { ALIAS => "file|conf_file" });
# ...other defines here

# read config file if -f arg exists
parse_commandline_args();
$conf->file($conf->conf_file()) if defined $conf->conf_file();

# reread command line so that arguments on it override those in conf file
parse_commandline_args();

# at this point script_dir should be correct, so safely include it
push @INC, $conf->script_dir();

sub parse_commandline_args {
    $copy_of_args = [@ARGV];
    $conf->args($copy_of_args);
}
My main.pl is practically untouched. I use configuration.pm near the top of the module and everything else just works. I still need to go through and change all the scripts that use a script to require it instead, so that configuration.pm has time to update @INC before it runs, but other than that the rest just works. Anywhere I want content from the configuration file, I can now just call $conf->variable().
The parse_commandline_args is important. Just using $conf->args() will erase the contents of @ARGV, making them unavailable for later modules, like my main.pl. By copying the array first, we leave the original @ARGV untouched for later use.
Not sure if I would recommend this from scratch; it feels wrong the way configuration.pm is automagically doing everything. But for updating our ugly prototype so it functions long enough to maintain it until we're funded to write the proper version (which I will not be doing in Perl), it will do.
I was wondering if there is some way to call (or load) a function located lower in the script execution path.
I wrote a script to run a deployment, and as one of the last steps, the script parses web.config, making a ton of changes based on a configuration file. A feature request came in asking for a switch to generate the web.config without an actual deployment.
The only way I can think of doing it is making all the parsing logic into a gigantic function and loading it at the start of the script. However, that approach will make the script horribly ugly. Nor do I want to carve out all the logic into another script and dot-source it.
Any suggestions?
Thank you.
Make it two functions: one for deploy, one for web.config. Use a separate function to check for switches and call the functions based on those variables.
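A minimal sketch of that layout (the function and switch names here are made up):
# Sketch only: names are hypothetical; the real parsing/deploy logic goes inside the functions.
param(
    [switch]$ConfigOnly   # e.g. .\deploy.ps1 -ConfigOnly to generate web.config without deploying
)

function Update-WebConfig {
    # ...parse web.config and apply the configuration-driven changes here...
}

function Invoke-Deployment {
    # ...actual deployment steps here...
}

# Functions only need to be defined above the point where they are called,
# so the dispatch logic can sit at the bottom of the script.
if ($ConfigOnly) {
    Update-WebConfig
} else {
    Invoke-Deployment
    Update-WebConfig
}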
I ended up reading another 30 or so articles and decided to simply bracket the functionality, move it up higher in the script, and then dot-source the function from inside the script.
Thanks.