I need to navigate files from an SVN location instead of a Windows location using a PowerShell script. If I substitute the Windows path with an SVN path it does not work. Below is the working code I have for a Windows location:
function CheckPath($path)
{
    Test-Path $path -PathType Container
}
function Navigate()
{
    $directory = "C:\Test"
    # Instead of the above Windows path I want to give an SVN path, as below:
    # https://svn.test.local/svn/Files/
    $component = "SomeValue"
    if (CheckPath -path $directory)
    {
        Write-Host "Executing scripts from $directory"
        Get-ChildItem $directory -Filter LTR*.sql | ForEach-Object -Process { RunScript -comp $component -file $_.FullName }
    }
    else
    {
        Write-Host "Invalid component: the path does not exist!"
    }
}
You can't execute anything that is on a remote SVN server. Any way of doing this will have to include downloading at least that file locally.
That said, to make this reasonably transparent, you could create a script that relies on the svn ls command.
One example is to change the prompt function so that when you cd into a folder it fetches the SVN file list (ideally you would proxy Set-Location, but prompt is easier). Suppose you have a .svn folder but no files (for this, set the depth to empty when doing the checkout):
function prompt() {
    # Look at the last command entered; only react to "cd ..."
    $lastCmd = h | select -Last 1 -expand CommandLine
    if ($lastCmd -match '^cd ') {
        # svn info succeeds only inside a working copy
        $x = svn info
        if ($x -ne $null) {
            # List the files that exist in the remote repository
            svn ls | % {Write-Host $_}
        }
    }
    "PS $pwd>"
}
This prompt function will list all the files in the remote SVN repository when you enter the folder. You could create a simple PowerShell function that downloads the desired file and executes it (svn can check out single files).
It will feel more PowerShell-like if you proxy cd or ls, as you can then actually return an array of items that you can pipe, for instance, to an execution function, something like:
ls *.exe | execute
where ls is proxied and executed inside the SVN repository's local path, and execute uses svn checkout for the piped files, perhaps to TEMP, and runs them.
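For illustration, a rough sketch of such an execute helper (the name Invoke-SvnFile is just a placeholder of mine; it assumes the current directory is a sparse checkout so that names piped in from svn ls resolve against its repository URL):
filter Invoke-SvnFile {
    # Repository URL of the piped file name, relative to this working copy.
    $url  = (svn info --show-item url) + '/' + $_
    $dest = Join-Path $env:TEMP (Split-Path $_ -Leaf)
    # svn export downloads a single file without creating a working copy.
    svn export --force $url $dest | Out-Null
    # Execute the local copy.
    & $dest
}
svn ls | Where-Object { $_ -like '*.exe' } | Invoke-SvnFile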
When I use Set-Location (aka cd) to change the current directory in a PowerShell window, but deliberately avoid auto-complete and type the name in the "wrong" case...
PS C:\> Set-Location winDOWs
...then Get-Location (aka pwd) will return that "wrong" path name:
PS C:\winDOWs> Get-Location
Path
----
C:\winDOWs
This causes problems with svn info:
PS C:\svn\myDir> svn info --show-item last-changed-revision
2168
PS C:\svn\myDir> cd ..\MYDIR
PS C:\svn\MYDIR> svn info --show-item last-changed-revision
svn: warning: W155010: The node 'C:\svn\MYDIR' was not found.
svn: E200009: Could not display info for all targets because some targets don't exist
As you can see, svn info fails when the user doesn't type the name of the working copy directory "myDir" with the correct letter case when cd'ing into it.
Is there a way to solve this? I could not find a suitable parameter of svn info.
Another option could be to override PowerShell's cd alias and make sure the letter case of the typed path is fixed before actually cd'ing, but how can that be accomplished? Resolve-Path, for example, also returns the "wrong" directory name.
Something like this might work for you:
Set-Location C:\winDOWs\sysTEm32
$currentLocation = (Get-Location).Path
$folder = Split-Path $currentLocation -Leaf
$casedPath = ([System.IO.DirectoryInfo]$currentLocation).Parent.GetFileSystemInfos($folder).FullName
# If the original path and the resolved path are equal case-insensitively
# but differ case-sensitively, cd to the properly cased path.
if($currentLocation -ieq $casedPath -and $currentLocation -cne $casedPath)
{
    Set-Location -LiteralPath $casedPath
}
This will give you the proper casing for the "System32" portion of the path. You will need to recursively call this piece of code for all pieces of the path, e.g. C:\Windows, C:\Windows\System32, etc.
Final recursive function
Here you go:
function Get-CaseSensitivePath
{
    param([System.IO.DirectoryInfo]$currentPath)
    $parent = ([System.IO.DirectoryInfo]$currentPath).Parent
    if($null -eq $parent)
    {
        # Reached the drive root, e.g. "C:\"
        return $currentPath.Name
    }
    # Ask the parent directory for the child's name as stored on disk.
    return Join-Path (Get-CaseSensitivePath $parent) $parent.GetDirectories($currentPath.Name).Name
}
Example:
Set-Location (Get-CaseSensitivePath C:\winDOWs\sysTEm32)
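If you want the correction applied automatically whenever you change directories, one possible approach (my own sketch; the cdx name and the alias override are not part of the answer above) is to wrap Set-Location around Get-CaseSensitivePath:
function cdx([string]$path) {
    Set-Location $path
    # Re-resolve the new location with the casing stored on disk.
    $fixed = Get-CaseSensitivePath (Get-Location).Path
    if ($fixed -cne (Get-Location).Path) {
        Set-Location -LiteralPath $fixed
    }
}
Set-Alias cd cdx -Option AllScope -Force   # optional: shadow the built-in cd alias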
I have made a script that builds the .vbp files in the company where I work. I want to build only the most recently changed files. Using timestamps I would build more unnecessary files, since some people test the files. So I thought the best solution is to store the latest commits from SVN in a log file (which I already do), search that file for the .vbp files, and then build them. How can I search for the *.vbp wildcard and build only those files?
Below is the part of the code that stores the log of the client trunk in $outputFileClientCustom2 and builds $factory, but I want $factory to be taken from the XML in which I have stored the last commits.
$clientTrunk | % {
svn log -l 1 -v --xml $_ 2>&1 |
ft -AutoSize -Wrap |
Out-File $outputFileClientCustom2
}
& Get-ChildItem $factory -Include *.vbp -recurse | foreach ($_) {
Write-Host 'Building ------->' $_.FullName;
& $vb6 /out $outputFileClient $_.FullName /make | Out-Null
}
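I guess something like the following could work for the last step (not tested; it keeps the raw XML instead of piping it through Format-Table, and the mapping from repository path to a local file via its leaf name under $factory is just a guess):
$clientTrunk | ForEach-Object {
    # Parse the verbose log of the last commit directly as XML.
    [xml]$log = svn log -l 1 -v --xml $_
    # Every changed path in that commit that ends in .vbp
    $vbpPaths = $log.log.logentry.paths.path |
        Where-Object { $_.'#text' -like '*.vbp' } |
        ForEach-Object { $_.'#text' }
    foreach ($p in $vbpPaths) {
        # Assumed mapping: repository path -> file of the same name under $factory.
        $local = Get-ChildItem $factory -Recurse -Filter (Split-Path $p -Leaf) | Select-Object -First 1
        if ($local) {
            Write-Host 'Building ------->' $local.FullName
            & $vb6 /out $outputFileClient $local.FullName /make | Out-Null
        }
    }
}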
First I will give a brief overview of what I'm trying to achieve. I want to go through a series of HTML files, replace code, and then re-save those HTML files. This all works; however, the PowerShell command only does this for HTML files on the default PowerShell path (for me this is the H drive).
I want to be able to have a separate folder that contains my PowerShell script and HTML files, and convert the files in that folder, NOT the ones on the H drive.
The code I have is as follows:
PowerShell script
$HTMLfiles=get-childitem . *.html -rec
foreach ($files in $HTMLfiles)
{
(Get-Content $files.PSPath) | ForEach-Object { $_ -replace "this text", "TEST" } | Set-Content $files.PSPath
}
This successfully changes all HTML files on the H drive that contain the words 'this text', replacing them with 'TEST'. How can I change the HTML files located where the PowerShell script is, NOT the ones on the H drive?
I appreciate any help.
Thanks
Use the automatic variable $PSScriptRoot to retrieve the files from the same folder where the PowerShell script resides (note that -Include only takes effect together with -Recurse or a wildcard path, so -Filter is used here):
Get-ChildItem -Path $PSScriptRoot -Filter *.html -Recurse
In your script, you ask the Get-ChildItem cmdlet to look for items in the current directory. To make the script look for files in another directory, you just have to pass that directory to Get-ChildItem:
$HTMLpath="C:\path\to\your\html\files"
$HTMLfiles=get-childitem $HTMLpath *.html -rec
foreach ($files in $HTMLfiles)
{
(Get-Content $files.PSPath) | ForEach-Object { $_ -replace "this text", "TEST" } | Set-Content $files.PSPath
}
Edit:
If you want the path to be passed as an argument to your script, just do the following:
param($HTMLpath)
$HTMLfiles=get-childitem $HTMLpath *.html -rec
foreach ($files in $HTMLfiles)
{
(Get-Content $files.PSPath) | ForEach-Object { $_ -replace "this text", "TEST" } | Set-Content $files.PSPath
}
Then you can call your script in the console (assuming you are in the directory where your script is): ./myscript "C:\path\to\your\files"
Calling Get-ChildItem . *.html -Rec will get all files under the current working directory. If you happen to be in the same folder as your script when you call it, I'd expect it to work as you want. If you call the script from another path, e.g. by setting up a scheduled task to run powershell.exe <path_to_script> then it may not pick up the files you want. Maybe H: is the root of your Windows user profile?
As per other answers, using $PSScriptRoot or passing the path under which the .html files reside as a parameter would be good. To combine both, you can add a parameter to your script AND set that parameter's default value to $PSScriptRoot:
param($HTMLpath = $PSScriptRoot)
This will (1) allow you to specify a remote path if necessary and (2) otherwise default to the path where the script is saved.
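Putting the two together, a minimal version of the whole script might look like this (a sketch; parameter name as above):
param($HTMLpath = $PSScriptRoot)   # defaults to the folder containing this script
$HTMLfiles = Get-ChildItem $HTMLpath -Filter *.html -Recurse
foreach ($file in $HTMLfiles)
{
    (Get-Content $file.PSPath) |
        ForEach-Object { $_ -replace "this text", "TEST" } |
        Set-Content $file.PSPath
}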
I have created a PowerShell script for copying files to a directory. The script first creates a folder, or forces a new folder even if it exists, and then copies a directory from another location. After copying the files, I need to copy the correct web config depending on a value given by the user executing the script. The issue I am having is that I can copy the files, but they are all set to read-only, meaning that when I try to copy the correct web.config, the script fails as access is denied.
This is a cut-down version of the script for simplicity.
$WebApp_Root = 'C:\Documents and Settings\user\Desktop\Dummy.Website'
$Preview_WebApp_Root = 'c:\applications\Preview\'
$Choice = read-host("enter 'preview' to deploy to preview, enter Dummy to deploy to Dummy, or enter test to deploy to the test environment")
if (($Choice -eq 'Preview') -or ($Choice -eq 'preview'))
{
$Choice = 'Preview'
$Final_WebApp_Root = $Preview_WebApp_Root
}
write-host("Releasing Build to " + $Choice +'...')
write-host("Emptying web folders or creating them if they don't exist... ")
New-Item $Final_WebApp_Root -type directory -force
write-host("Copying Files... ")
Copy-Item $WebApp_Root $Final_WebApp_Root -recurse
write-host("Copy the correct config file over the top of the dev web config...")
Copy-Item $Final_WebApp_Root\Config\$Choice\Web.configX $Final_WebApp_Root\web.config
write-host("Copying correct nhibernate config over")
Copy-Item $Final_WebApp_Root\Config\$Choice\NHibernate.config $Final_WebApp_Root\NHibernate.config
write-host("Deployed full application to environment")
Try using the -Force parameter to replace read-only files. From the documentation:
PS> help Copy-Item -Par force
-Force [<SwitchParameter>]
Allows the cmdlet to copy items that cannot otherwise be changed,
such as copying over a read-only file or alias.
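Applied to the script above, the two config copies would simply get -Force appended:
Copy-Item $Final_WebApp_Root\Config\$Choice\Web.configX $Final_WebApp_Root\web.config -Force
Copy-Item $Final_WebApp_Root\Config\$Choice\NHibernate.config $Final_WebApp_Root\NHibernate.config -Force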
Is there a simple way in PowerShell (I imagine using gacutil.exe) to read a path\assembly from a text document and register it in the GAC? For example, a .txt file that looks like:
c:\test\myfile.dll
c:\myfile2.dll
d:\gac\gacthisfile.dll
The PowerShell script would read that into a stream and then run gacutil on each of those assemblies found? I guess it would be something like:
#read files into array?
foreach ($file in Get-ChildItem -Filter "*.dll" )
{
Write-Host $file.Name
C:\WINDOWS\Microsoft.NET\Framework\v1.1.4322\gacutil.exe /nologo /i $file.Name
}
How about letting .NET worry about gacutil?
# load System.EnterpriseServices assembly
[Reflection.Assembly]::LoadWithPartialName("System.EnterpriseServices") > $null
# create an instance of publish class
[System.EnterpriseServices.Internal.Publish] $publish = new-object System.EnterpriseServices.Internal.Publish
# load and add to gac :)
get-content fileOfDlls.txt | ?{$_ -like "*.dll"} | Foreach-Object {$publish.GacInstall($_)}
If you sort out your text file such that each DLL is on a separate line, you could use the Get-Content command and pipe each line to a filter that runs your command:
filter gac-item { C:\WINDOWS\Microsoft.NET\Framework\v1.1.4322\gacutil.exe /nologo /i $_}
get-content fileOfDlls.txt | ?{$_ -like "*.dll"} | gac-item
I would suggest naming the function that adds an assembly to the GAC something following PowerShell guidelines, like Add-GacItem. Also, the location of gacutil.exe varies based on your system. If you have VS 2008 installed, it should be at the location shown below.
function Add-GacItem([string]$path) {
    Begin {
        $gacutil = "$env:ProgramFiles\Microsoft SDKs\Windows\v6.0A\bin\gacutil.exe"
        function AddGacItemImpl([string]$path) {
            & $gacutil /nologo /i $path
        }
    }
    Process {
        # Pipeline input
        if ($_) { AddGacItemImpl $_ }
    }
    End {
        # Direct invocation: Add-GacItem c:\test\myfile.dll
        if ($path) { AddGacItemImpl $path }
    }
}
Get-Content .\dlls.txt | Split-String | Add-GacItem
Note that the Split-String cmdlet comes from PSCX. The function isn't super robust (no wildcard support, no checking for weird types like DateTime), but at least it can handle both regular invocation and pipeline invocation.
Do you want to replace gacutil.exe? If not, why not use gacutil's included /il switch?
From the gacutil /h:
/il <assembly_path_list_file> [ /r <...> ] [ /f ]
Installs one or more assemblies to the global assembly cache.
<assembly_list_file> is the path to a text file that contains a list of
assembly manifest file paths. Individual paths in the text file must be
separated by CR/LF.
Example: /il MyAssemblyList.txt /r FILEPATH c:\projects\myapp.exe "My App"
myAssemblyList.txt content:
myAsm1.dll
myAsm2.dll
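From PowerShell that boils down to a single call (the gacutil.exe path below is the VS 2008 SDK location mentioned in another answer and may differ on your machine):
& "$env:ProgramFiles\Microsoft SDKs\Windows\v6.0A\bin\gacutil.exe" /nologo /il .\fileOfDlls.txt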
If you create an alias in your profile (just type $profile at a PS prompt to determine that file's location) like so: new-alias "gac" ($env:ProgramFiles+"\Microsoft Visual Studio 8\SDK\v2.0\Bin\gacutil.exe"), then you can use gac like so:
get-childitem $basedirectory "*$filter.dll" | foreach-object -process{ WRITE-HOST -FOREGROUND GREEN "Processing $_"; gac /i $_.FullName /f}
The last part is the most important: it calls gacutil with the switches you want.
Hope this helps.
This PowerShell script will add assemblies to the GAC without using GacUtil. http://blog.goverco.com/2012/04/use-powershell-to-put-your-assemblies.html
After downloading Add-AssemblyToGlobalAssemblyCache.ps1 you can deploy to the GAC.
Usage example for adding multiple assemblies: Dir C:\MyWorkflowAssemblies | % {$_.Fullname} | .\Add-AssemblyToGlobalAssemblyCache.ps1
See the full documentation by running Get-Help .\Add-AssemblyToGlobalAssemblyCache.ps1 -Detailed
Not wanting to install the Windows 8 SDK on all the machines where I needed to put assemblies in the GAC just to get gacutil, I've written a PowerShell module that uses the GAC API. It works with any .NET version. With PowerShell GAC you can do it like so:
Get-Content ListOfAssemblies.txt | Add-GacAssembly