Executing XMLA with AMO PowerShell does not create measures

I wrote a PowerShell script to automate deployment of SSAS cubes. I use the Deployment Wizard to generate an XMLA file and then the PowerShell AMO command to deploy it. However, when I run it the tabular database gets created but all measures are missing. Running the same XMLA from SQL Server Management Studio or via Invoke-ASCmd produces the correct database with all measures in it. Am I missing an option or something in the "Invoke" command?
[CmdletBinding()]
Param(
    # InputDir is required
    [Parameter(Mandatory=$True,Position=1)]
    [string]$InputDir,
    # Server is optional
    [Parameter(Mandatory=$False,Position=2)]
    [string]$Server=$env:computername
)
# Output execution parameters.
"Executing with the following parameters:"
" InputDir: $InputDir"
" AS Database Server: $Server`n"
$XmlaDir = Resolve-Path($InputDir)
$Xmla = Join-Path -Path $XmlaDir -ChildPath '\Model.xmla'
$ASFiles = Get-ChildItem -Recurse -Path $InputDir -Filter *.asdatabase
$Count = $ASFiles.Count
If ($Count -ne 1)
{
    Write-Host("`nERROR: $Count .asdatabase file(s) found at $InputDir")
    Exit 1
}
$ASDatabase = $ASFiles[0].FullName
Write-Host("`nUsing $ASDatabase for deployment.")
Write-Host("`nAttempting to create $Xmla ...`n")
# Use the Analysis Services Deployment Utility to generate the XMLA file
# from the .asdatabase file.
$Script:ASDeployWizard = "E:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe"
$Arguments = @("`"$ASDatabase`"", "/s", "/o:`"$Xmla`"")
Start-Process -FilePath $Script:ASDeployWizard -ArgumentList $Arguments -Wait
If ((-Not $?) -Or -Not (Test-Path $Xmla))
{
    "Cannot generate deployment descriptor. Deployment aborted."
    Exit 1
}
Write-Host("XMLA deployment descriptor generated.`n")
Try {
    Import-Module SQLPS -DisableNameChecking
    # Deploy Cube
    "Invoking deployment script. This may take several minutes...`n"
    $AS = New-Object Microsoft.AnalysisServices.Server
    $AS.Connect($Server)
    $CubeDescriptor = [string](Get-Content $Xmla)
    $Results = $AS.Execute($CubeDescriptor)
    Foreach ($r in $Results) {
        $r.Messages.Description
    }
    "Done.`n"
} Catch {
    Write-Host($_.Exception.GetType().FullName + "`n" + $_.Exception.Message + "`n")
    Write-Host("Deployment FAILED.`n")
}
Exit 0
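Since the question itself notes that Invoke-ASCmd deploys the same XMLA correctly, one hedged workaround is to replace the AMO Execute block with Invoke-ASCmd from the same SQLPS module. A minimal sketch, reusing the $Server and $Xmla variables from the script above:

Import-Module SQLPS -DisableNameChecking
# Invoke-ASCmd executes the generated deployment descriptor and returns
# the XMLA response as a string, which can be inspected for error elements.
$Result = Invoke-ASCmd -Server $Server -InputFile $Xmla
$Result

This sidesteps whatever the AMO Server.Execute call is dropping, at the cost of losing the per-message $Results loop.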

Related

PowerShell and WinSCP script to check if there are more than 4 files in an SFTP remote directory

With PowerShell and WinSCP I'm trying to create a script that checks an SFTP remote directory daily to see if there are more than 4 files in it.
If there are 4 or fewer files it's okay, but if there are more than 4 it should output an error message.
Thanks to WinSCP, the connection is created automatically and I can connect to the SFTP server as shown below:
& "C:\Program Files (x86)\WinSCP\WinSCP.com"
/log="C:\Users\scripts\WinSCP.log" /ini=nul
/command
"open sftp://..."
"cd" `
"cd ./my remote directory"
#"ls *.csv"
#"exit"
$winscpResult = $LastExitCode
if ($winscpResult -eq 0)
{
Write-Host "Success"
}
else
{
Write-Host "Error"
}
exit $winscpResult
I don't know whether I have to do this through the scripting interface or the .NET assembly:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Set up session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Sftp
HostName = ""
UserName = ""
Password = ""
SshHostKeyFingerprint = ""
TimeoutInMilliseconds = 60000
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Your code
}
finally
{
$session.Dispose()
}
However, I currently don't know how to write the condition.
When the script is done, the goal is to run it daily from Jenkins.
Could you help me build the check condition with an if and an else? Thanks!
With the WinSCP .NET assembly, it's trivial:
$files =
$session.ListDirectory($remotePath).Files |
Where-Object { -Not $_.IsDirectory }
$count = $files.Count
if ($count -gt 4)
{
Write-Host "There are more than 4 files in $remotePath"
}
I finally modified the script from Martin and now it can successfully check whether there are more than 4 files in my directory:
$remotePath = "/my-directory"
$files = $session.ListDirectory($remotePath).Files | Where-Object { -Not $_.IsDirectory }
$count = $files.Count
if ($count -le 4)
{
    Write-Host "There are 4 or fewer files in the $remotePath directory. All good!"
}
else
{
    Write-Host "There are more than 4 files in the $remotePath directory. Please check!"
}
I actually needed the .Files property after $session.ListDirectory.
Thanks!
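A note for the Jenkins part: Jenkins marks a build as failed when the script exits with a non-zero code, so a minimal sketch of the same check wired up for that (variable names as above; the specific exit codes are an assumption about how the job should signal failure):

if ($count -gt 4)
{
    Write-Host "There are more than 4 files in the $remotePath directory. Please check!"
    exit 1  # non-zero exit code fails the Jenkins build
}
else
{
    Write-Host "There are 4 or fewer files in the $remotePath directory. All good!"
    exit 0
}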

Powershell script to install multiple msi on remote machines

Good day. Could you help me find a way to copy each MSI package to the remote machines using a link to the NAS storage?
# Get list of servers
param(
[ValidateSet('STUDENT_LAB', 'LIBRARY_LAB', 'TEACHER_LAB')]
[Parameter(Mandatory = $true,
HelpMessage = 'Select one of the valid servers by typing one of these names: STUDENT_LAB, LIBRARY_LAB, TEACHER_LAB')]
[string]$ServerGroup
)
$servers = @{
STUDENT_LAB = ('192.168.1.1','192.168.1.2','192.168.1.3')
LIBRARY_LAB = ('192.168.10.1','192.168.10.2','192.168.10.3')
TEACHER_LAB = ('192.168.15.1','192.168.15.2','192.168.15.3')
}[$ServerGroup]
Write-Output "The user chose $ServerGroup"
#this is what I don't know how to implement - download file from nas storage on remote machine
$sourcefiles = @(
    '\\NASSTORAGE\MSI\MICROSOFT\Microsoft-ODBCDriver-11-SQLServer-x64\msodbcsql.msi',
    '\\NASSTORAGE\MSI\MICROSOFT\Microsoft-ODBCDriver-17-SQLServr-x64\msodbcsql_17.2.0.1_x64.msi',
    '\\NASSTORAGE\MSI\MICROSOFT\Microsoft-OLEDBDriver-SQL Server-x64\msoledbsql_18.1.0.0_x64.msi'
)
foreach($server in $servers) {
# Destination UNC path changes based on server name
$destinationPath = "\\$server\D$\tmp\"
# Check that full folder structure exists and create if it doesn't
if(!(Test-Path $destinationPath)) {
New-Item -ItemType Directory -Force -Path $destinationPath
}
# Copy the file across
Copy-Item $sourcefiles $destinationPath
#list of packages to install
$msiList = @(
'Microsoft-ODBCDriver-11-SQLServer-x64\msodbcsql.msi'
'Microsoft-ODBCDriver-17-SQLServr-x64\msodbcsql_17.2.0.1_x64.msi'
'Microsoft-OLEDBDriver-SQL Server-x64\msoledbsql_18.1.0.0_x64.msi'
)
#now I'm trying to install on remote machine
foreach ($msi in $msiList) {
$install = Join-Path -Path $destinationPath -ChildPath $msi
Start-Process "msiexec.exe" -ArgumentList "/I $install",'/qn' -Wait
}
}
And is there any way to check whether the MSI was installed properly?
Thank you for your time.
You can add this to the installation section:
$LaunchMsi = Start-Process "msiexec.exe" -ArgumentList "/I $install",'/qn' -Wait -PassThru
$ReturnCode = $LaunchMsi.ExitCode
if (($ReturnCode -eq "0") -OR ($ReturnCode -eq "3010")) {Write-Host "installation OK, return code : $ReturnCode"}
Else {Write-host "installation KO, return code : $ReturnCode"}
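Note also that Start-Process launches msiexec on the machine running the script, not on the remote server. To execute the install remotely, a sketch using PowerShell remoting (assuming WinRM is enabled on the targets; D:\tmp mirrors the \\$server\D$\tmp\ destination used above):

foreach ($msi in $msiList) {
    # Copy-Item above puts the files flat into \tmp, so take just the file name
    $msiPath = Join-Path -Path 'D:\tmp' -ChildPath (Split-Path -Leaf $msi)
    # Run msiexec on the remote machine and capture its exit code
    $code = Invoke-Command -ComputerName $server -ScriptBlock {
        param($m)
        (Start-Process "msiexec.exe" -ArgumentList "/I `"$m`"", '/qn' -Wait -PassThru).ExitCode
    } -ArgumentList $msiPath
    if (($code -eq 0) -or ($code -eq 3010)) { Write-Host "$msi on ${server}: OK (code $code)" }
    else { Write-Host "$msi on ${server}: FAILED (code $code)" }
}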

Powershell will not start or stop Windows service with nssm

I have a script made in PowerShell, and I am using NSSM to install it as a service so it executes every "x" amount of time. However, when starting the service it raises an error and does not execute.
I have full administrator rights, and I even tried to run PowerShell as an administrator, without success.
If I run the script directly it works; however, under NSSM it does not.
The error that happens is this:
Start-Service : Service 'nice (nice)' start failed.
At C:\Program Files\NICE Systems\nssm.ps1:10 char:14
+ Start-Service <<<< $serviceName
    + CategoryInfo          : OpenError: (System.ServiceProcess.ServiceController:ServiceController) [Start-Service], ServiceCommandException
    + FullyQualifiedErrorId : StartServiceFailed,Microsoft.PowerShell.Commands.StartServiceCommand
nssm.ps1
$nssm = (Get-Command nssm.exe).Definition
$serviceName = 'nice'
$powershell = (Get-Command powershell.exe).Definition
$scriptPath = 'C:\Program Files\NICE Systems\script_delecao.ps1'
$arguments = '-ExecutionPolicy Bypass -NoProfile -File "{0}"' -f $scriptPath
& $nssm install $serviceName $powershell $arguments
& $nssm status $serviceName
Start-Service $serviceName
Get-Service $serviceName
script_delecao.ps1
$logPath = "C:\Program Files\NICE Systems\Logs\*\Archive\*"
# -------------------------------------------------------------------------------------------
# SET $NDAYS WITH THE NUMBER OF DAYS TO KEEP IN LOG FOLDER.
$nDays = 180
# -------------------------------------------------------------------------------------------
# SET $EXTENSIONS WITH THE FILE EXTENSION TO DELETE.
# YOU CAN COMBINE MORE THAN ONE EXTENSION: "*.LOG, *.TXT,"
$Extensions = "*.log*"
# -------------------------------------------------------------------------------------------
# PAY ATTENTION! IF YOU COMBINE MORE THAN ONE LOG PATH AND EXTENSIONS,
# MAKE SURE THAT YOU ARE NOT REMOVING FILES THAT CANNOT BE DELETED
# -------------------------------------------------------------------------------------------
$PathDelete = "C:\Program Files\NICE Systems\Delecoes"
while ($true) {
If(!(test-path $PathDelete))
{
New-Item -ItemType Directory -Force -Path $PathDelete
}
$LogDate = (Get-Date).ToString("dd_MM_yyyy")
$DateTime = (Get-Date).ToString("yyyy-MM-ddTHH:mm:ss")
$Files = Get-Childitem $LogPath -Include $Extensions -Recurse | Where `
{$_.LastWriteTime -le (Get-Date).AddDays(-$nDays)}
foreach ($File in $Files)
{
if ($File -ne $NULL)
{
$Log = $DateTime + " - O arquivo " + $File + " foi deletado "
$Log | Out-File -Append $PathDelete\DeleteLogFile_$LogDate.log
Remove-Item $File.FullName| out-null
}
}
# Add a sleep at the end of the loop to prevent the script from eating
# too much CPU time
$Log = $DateTime + " FINAL DO ARQUIVO "
$Log | Out-File -Append $PathDelete\DeleteLogFile_$LogDate.log
Start-Sleep -Seconds 300
}
I believe I have a similar scenario, where I cannot back up the Bamboo file system while it's running. My backup executes from a Rundeck server via remote PowerShell, and even though the user has local admin rights it cannot stop and start services using NSSM. So I use this function to run the command elevated:
ELEVAT "nssm stop bamboo"
tar --exclude=./logs --exclude=./temp --exclude=*.log --exclude=*.jar --verbose -czf E:\dropfolder\bamboo-home.tar.gz --directory=E:\bamboo-home .
ELEVAT "nssm start bamboo"
the function itself...
function ELEVAT ($command) {
$scriptBlock = [scriptblock]::Create($command)
configuration elevated {
Import-DscResource -ModuleName 'PSDesiredStateConfiguration'
Set-StrictMode -Off
Node localhost {
Script execute {
SetScript = $scriptBlock
TestScript = {
if (([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator")) {
Write-Verbose "Verified Elevated Session"
return $false
} else {
Write-Verbose "Not an Elevated Session!"
exit 9996
}
}
GetScript = { return @{ 'Result' = 'RUN' } }
}
}
}
$mof = elevated
Start-DscConfiguration ./elevated -Wait -Verbose -Force
if ( $error ) { Write-Host "[ELEVAT][WARN] `$Error[] = $Error" ; $Error.Clear() }
}
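If the service still refuses to start, it may also be worth checking what command line NSSM actually registered, since quoting around paths with spaces (like C:\Program Files\NICE Systems\script_delecao.ps1) is a common casualty. A quick sketch using NSSM's get subcommand, with the variables from nssm.ps1 above:

# Dump what the service will actually run; mangled quoting here
# would explain a start failure
& $nssm get $serviceName Application
& $nssm get $serviceName AppParameters
& $nssm get $serviceName AppDirectory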

Getting error output from a powershell 2.0 script running as a task

TL;DR: the actual question is at the bottom.
I'm trying to troubleshoot a PowerShell v1.0 script issue. The script basically downloads a file from an FTP site, puts it on a remote server via UNC, and emails the success or failure of the task.
The script runs as a task with a generic ID that is a Domain Admin but is not used to log into systems, so the server it runs from does not contain a profile for it.
If I do a runas for that user and execute the script via the command line, it works flawlessly. However, if I try to run it as a task, it runs and then exits instantly. If I open a runas command prompt and run the scheduled task via the command line, all I get back is:
SUCCESS: Attempted to run the scheduled task "Task Name".
I've tried writing variable values to a text file to see what is going on, but it never writes, even when I do it as the very first step of execution.
What I want to do is capture any script error messages you would normally see when trying to run the script, and/or write the variable information to a text file.
Is there any way to do this? BTW, I'm doing this by calling powershell with the following arguments:
-file -ExecutionPolicy Bypass "d:\datscript\myscript.ps1"
-I've tried -command instead of -file.
-I've tried "d:\datscript\myscript.ps1 5>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 9>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 | out-file d:\datscript\test.txt"
Nothing worked. I'm sure I can fix whatever bug I have but I'm banging my head against the wall trying to get some kind of failure info.
--Update: Here is a copy of the script minus details--
#-------------------------------------------------------------------------------------------------------------------------------------------------------------
#
#Variable Declaration
#
#$path = Path on local server to download DAT to
#$olddat = Old/last DAT downloaded
#$currentdat = Next DAT number
#$ftpsite = McAfee FTP site. Update if path changes
#$ftpuser = FTP user (anon login)
#$ftppass = FTP password (anon login)
#$tempstring = Manipulation variable
#$gotdat = Boolean if updated DAT exists
#$success = Status if a new DAT exists and has been downloaded (used for email notification).
#$thetime = Variable used to hold time-of-day manipulation.
$path = "\\myservername\ftproot\pub\mcafee\datfiles\"
$olddat = ""
$currentdat =""
$ftpsite = "ftp://ftp.nai.com/virusdefs/4.x/"
$ftpuser = "something"
$ftppass = "anything"
$tempstring =""
$gotdat = "False"
$success = ""
$thetime = ""
#
#Normalized functions handles UNC paths
#
function Get-NormalizedFileSystemPath
{
<#
.Synopsis
Normalizes file system paths.
.DESCRIPTION
Normalizes file system paths. This is similar to what the Resolve-Path cmdlet does, except Get-NormalizedFileSystemPath also properly handles UNC paths and converts 8.3 short names to long paths.
.PARAMETER Path
The path or paths to be normalized.
.PARAMETER IncludeProviderPrefix
If this switch is passed, normalized paths will be prefixed with 'FileSystem::'. This allows them to be reliably passed to cmdlets such as Get-Content, Get-Item, etc, regardless of Powershell's current location.
.EXAMPLE
Get-NormalizedFileSystemPath -Path '\\server\share\.\SomeFolder\..\SomeOtherFolder\File.txt'
Returns '\\server\share\SomeOtherFolder\File.txt'
.EXAMPLE
'\\server\c$\.\SomeFolder\..\PROGRA~1' | Get-NormalizedFileSystemPath -IncludeProviderPrefix
Assuming you can access the c$ share on \\server, and PROGRA~1 is the short name for "Program Files" (which is common), returns:
'FileSystem::\\server\c$\Program Files'
.INPUTS
String
.OUTPUTS
String
.NOTES
Paths passed to this command cannot contain wildcards; these will be treated as invalid characters by the .NET Framework classes which do the work of validating and normalizing the path.
.LINK
Resolve-Path
#>
[CmdletBinding()]
param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
[Alias('PSPath', 'FullName')]
[string[]]
$Path,
[switch]
$IncludeProviderPrefix
)
process
{
foreach ($_path in $Path)
{
$_resolved = $_path
if ($_resolved -match '^([^:]+)::')
{
$providerName = $matches[1]
if ($providerName -ne 'FileSystem')
{
Write-Error "Only FileSystem paths may be passed to Get-NormalizedFileSystemPath. Value '$_path' is for provider '$providerName'."
continue
}
$_resolved = $_resolved.Substring($matches[0].Length)
}
if (-not [System.IO.Path]::IsPathRooted($_resolved))
{
$_resolved = Join-Path -Path $PSCmdlet.SessionState.Path.CurrentFileSystemLocation -ChildPath $_resolved
}
try
{
$dirInfo = New-Object System.IO.DirectoryInfo($_resolved)
}
catch
{
$exception = $_.Exception
while ($null -ne $exception.InnerException)
{
$exception = $exception.InnerException
}
Write-Error "Value '$_path' could not be parsed as a FileSystem path: $($exception.Message)"
continue
}
$_resolved = $dirInfo.FullName
if ($IncludeProviderPrefix)
{
$_resolved = "FileSystem::$_resolved"
}
Write-Output $_resolved
}
} # process
} # function Get-NormalizedFileSystemPath
#
#Get the number of the existing DAT file and increment for the next DAT if the DAT's age is older than today.
# Otherwise, exit the program if the DAT's age is today.
#
$tempstring = "xdat.exe"
$env:Path = $env:Path + ";d:\datscript"
$path2 ="d:\datscript\debug.txt"
add-content $path2 $path
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
$path = Get-NormalizedFileSystemPath -Path $path
Set-Location -Path $path
$olddat = dir $path | %{$_.Name.substring(0, 4) }
$olddatfull = "$olddat" + "$tempstring"
if ( ((get-date) - (ls $olddatfull).LastWriteTime).day -lt 1)
{
#***** Commented out for testing!
# exit
}
$currentdat = [INT] $olddat
$currentdat++
$currentdat = "$currentdat" + "$tempstring"
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
#
#Connect to FTP site and get a current directory listing.
#
[System.Net.FtpWebRequest]$ftp = [System.Net.WebRequest]::Create($ftpsite)
$ftp.Method = [System.Net.WebRequestMethods+FTP]::ListDirectoryDetails
$response = $ftp.getresponse()
$stream = $response.getresponsestream()
$buffer = new-object System.Byte[] 1024
$encoding = new-object System.Text.AsciiEncoding
$outputBuffer = ""
$foundMore = $false
#
# Read all the data available from the ftp directory stream, writing it to the
# output buffer when done. After that the buffer is searched to see if it contains
# the expected latest DAT.
#
do
{
## Allow data to buffer for a bit
start-sleep -m 1000
## Read what data is available
$foundmore = $false
$stream.ReadTimeout = 1000
do
{
try
{
$read = $stream.Read($buffer, 0, 1024)
if($read -gt 0)
{
$foundmore = $true
$outputBuffer += ($encoding.GetString($buffer, 0, $read))
}
} catch { $foundMore = $false; $read = 0 }
} while($read -gt 0)
} while($foundmore)
$gotdat = $outputbuffer.Contains($currentdat)
$target = $path + $currentdat
#
# Downloads DATs and cleans up old DAT file. Returns status of the operation.
# Return 1 = success
# Return 2 = Latest DAT not found and 4pm or later
# Return 3 = DAT available but did not download or is 0 bytes
# Return 4 = Latest DAT not found and before 4pm
#
$success = 0
if ($gotdat -eq "True")
{
$ftpfile = $ftpsite + $ftppath + $currentdat
write-host $ftpfile
write-host $target
$ftpclient = New-Object system.Net.WebClient
$uri = New-Object System.Uri($ftpfile)
$ftpclient.DownloadFile($uri, $target)
Start-Sleep -s 30
if ( ((get-date) - (ls $target).LastWriteTime).days -ge 1)
{
$success = 3
}
else
{
$testlength = (get-item $target).length
if( (get-item $target).length -gt 0)
{
Remove-Item "$olddatfull"
$success = 1
}
else
{
$success = 3
}
}
}
else
{
$thetime = Get-Date
$thetime = $thetime.Hour
if ($thetime -ge 16)
{
$success = 2
}
else
{
$success = 4
exit
}
}
#
# If successful download (success = 1) run push bat
#
if ($success -eq 1)
{
Start-Process "cmd.exe" "/c c:\scripts\mcafeepush.bat"
}
#Email structure
#
#Sends result email based on previous determination
#
#SMTP server name
$smtpServer = "emailserver.domain.com"
#Creating a Mail object
$msg = new-object Net.Mail.MailMessage
#Creating SMTP server object
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg.From = "email1#domain.com"
$msg.ReplyTo = "email2#domain.com"
$msg.To.Add("email2#domain.com")
switch ($success)
{
1 {
$msg.subject = "McAfee Dats $currentdat successful"
$msg.body = ("DAT download completed successfully. Automaton v1.0")
}
2 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("Looking for DAT $currentdat on the FTP site but I coud not find it. Human intervention may be required. Automaton v1.0")
}
3 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("$currentdat is available for download but download has failed. Human intervention will be required. Automaton v1.0")
}
default {
$msg.subject = "DAT Automaton Error"
$msg.body = ("Something broke with the McAfee automation script. Human intervention will be required. Automaton v1.0")
}
}
#Sending email
$smtp.Send($msg)
#Needed to keep the program from exiting too fast.
Start-Sleep -s 30
#debugging stuff
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
Apparently you have an error in starting PowerShell, either because the execution policy is different on the PowerShell version you start, or on the account, or there is an access error on the scheduled task. To gather the actual error, you can launch the task like so:
cmd /c "powershell.exe -file d:\datscript\myscript.ps1 test.txt 2>&1" >c:\windows\temp\test.log 2>&1
This way if there would be an error on starting Powershell, it will be logged in the c:\windows\temp\test.log file. If the issue is in execution policy, you can create and run (once) a task with the following:
powershell -command "Get-ExecutionPolicy -List | out-file c:/windows/temp/policy.txt; Set-ExecutionPolicy RemoteSigned -Scope LocalMachine -Force"
Running a task under the account you plan to run your main task under will first get the policies in effect (so that if setting the machine-level policy doesn't help, you'll know what scope to alter) and then set the machine-level policy to "RemoteSigned", the least restrictive level short of allowing every script (which is highly discouraged; there are encoder scripts written in PowerShell that can ruin your data).
Hope this helps.
UPDATE: If it's not policy, there might be some errors in how the parameters for the task are written. You can do this: create a .bat file with the string that launches your script and redirects output to, say, test1.txt, then change the scheduled task to cmd.exe /c launcher.bat >test2.txt, properly specifying the home folder. Run the task and review both files; at least one of them should contain an error that prevents your script from launching.
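For instance, a minimal launcher.bat along those lines might look like this (the script path is the one from the question; test1.txt is the hypothetical log name from the paragraph above):

@echo off
rem Launch the script and capture both its output and its errors in test1.txt
powershell.exe -ExecutionPolicy Bypass -File "d:\datscript\myscript.ps1" > d:\datscript\test1.txt 2>&1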

Powershell: Update-TfsWorkspace cmdlet how to update two workspaces

I want to update two workspaces from two different TFS servers in one script using PowerShell.
The first workspace updates without any problems. After the update is finished, PowerShell connects to the second workspace but doesn't update the local data like the first time.
I guess the old connection might still block the pipeline or something like that, but I haven't found any cmdlet to clean it up. My code looks like this:
param(
[string]$TestTFS = "http://TestTFS",
[string]$ProdTFS = "http://ProdTFS",
[string]$Teamproject="$\TeamprojectPath",
[string]$LocalTestWorkspace="C:\LocalTestWorkspacePath",
[string]$LocalProdWorkspace="C:\LocalProdWorkspacePath"
)
# Import Microsoft.TeamFoundation.PowerShell Snapin
Add-PSSnapin Microsoft.TeamFoundation.PowerShell
# Connect to production-TFS
$ProdEnvServer = Get-TfsServer -Name $ProdTFS
Write-Host "tfsConnect ="$ProdEnvServer
# Get prod team project
Get-TfsChildItem $Teamproject -Server $ProdEnvServer
# Update files in local prod workspace
Update-TfsWorkspace -Force -Recurse $LocalProdWorkspace
# Connect to test-TFS
$TestEnvServer = Get-TfsServer -Name $TestTFS
Write-Host "tfsConnect ="$TestEnvServer
# Get test team project
Get-TfsChildItem $Teamproject -Server $TestEnvServer
# Update files in local test workspace
Update-TfsWorkspace -Force -Recurse $LocalTestWorkspace
After 3 months with no one coming up with an answer, I just assume that the cmdlets don't work as they should. The only option here seems to be a workaround.
# Copy Team Project from Prod to Test TFS
param([string]$TestTFS = "http://TestTFS",
[string]$ProdTFS = "http://ProdTFS",
[String]$Teamproject="$/Teamproject",
[String]$LocalTestWorkspace="C:\LocalTestWorkspacePath",
[String]$LocalProdWorkspace="C:\LocalProdWorkspacePath")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.VersionControl.Client")
try
{
clear
$LocalTestProjectPath = $LocalTestWorkspace + $Teamproject.Substring(1)
$LocalProdProjectPath = $LocalProdWorkspace + $Teamproject.Substring(1)
# Connect to production-TFS
Write-Host "Getting latest of $ProdTFS"
$tfsColProd = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($ProdTFS)
[Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer] $vcsProd = $tfsColProd.GetService([type] "Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer")
# TryGetWorkspace is sometimes buggy and doesn't return an existing workspace
# Delete existing workspace manually before if that happens
$workspaceProd = $vcsProd.TryGetWorkspace($LocalProdWorkspace)
$isProdTempWorkspace = $false
# create Workspace if it doesn't exists
if (-not $workspaceProd) {
Write-Host "No workspace found, creating temporary for prod"
$workspaceProd = $vcsProd.CreateWorkspace("Temp_" + [System.Guid]::NewGuid().ToString())
$workspaceProd.Map($Teamproject, $LocalProdProjectPath)
$isProdTempWorkspace = $true
}
$itemSpecFullTeamProj = New-Object Microsoft.TeamFoundation.VersionControl.Client.ItemSpec($Teamproject, "Full")
$fileRequest = New-Object Microsoft.TeamFoundation.VersionControl.Client.GetRequest($itemSpecFullTeamProj,
[Microsoft.TeamFoundation.VersionControl.Client.VersionSpec]::Latest)
$workspaceProd.Get($fileRequest, [Microsoft.TeamFoundation.VersionControl.Client.GetOptions]::GetAll)
if ($isProdTempWorkspace) {
Write-Host "Deleting temporary workspace for prod"
$workspaceProd.Delete()
}
Write-Host "Getting latest of $TestTFS"
$tfsColTest = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($TestTFS)
$vcsTest = $tfsColTest.GetService([type] "Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer")
# TryGetWorkspace is sometimes buggy and doesn't return an existing workspace
# Delete existing workspace manually before if that happens
[Microsoft.TeamFoundation.VersionControl.Client.Workspace] $workspaceTest = $vcsTest.TryGetWorkspace($LocalTestWorkspace)
$isTestTempWorkspace = $false
# create Workspace if it doesn't exists
if (-not $workspaceTest) {
Write-Host "No workspace found, creating temporary for test"
$workspaceTest = $vcsTest.CreateWorkspace("Temp_" + [System.Guid]::NewGuid().ToString())
$workspaceTest.Map($Teamproject, $LocalTestProjectPath)
$isTestTempWorkspace = $true
}
$workspaceTest.Get($fileRequest, [Microsoft.TeamFoundation.VersionControl.Client.GetOptions]::GetAll)
# Remove local test folder and copy prod folder into test workspace
Write-Host "Copying over Prod to Test"
# Delete updated test project folder
Remove-Item -Path $LocalTestProjectPath -Force -Recurse
# Copy prod folder to test workspace
Copy-Item -Path $LocalProdProjectPath -Destination $LocalTestProjectPath -Force -Recurse
# Calling tfpt is the only thing that works
Write-Host "Comparing for changes"
$ps = new-object System.Diagnostics.Process
$ps.StartInfo.Filename = $env:TFSPowerToolDir + "tfpt.exe"
$ps.StartInfo.Arguments = "online /adds /deletes /diff /noprompt /recursive $LocalTestProjectPath"
$ps.StartInfo.RedirectStandardOutput = $false # careful, only output works, has hanging problems (2k Buffer limit)
$ps.StartInfo.RedirectStandardError = $false
$ps.StartInfo.UseShellExecute = $false
$ps.Start()
$ps.WaitForExit()
# Check in new test project folder into test environment
$wsCheckinParams = New-Object Microsoft.TeamFoundation.VersionControl.Client.WorkspaceCheckInParameters(
    @($itemSpecFullTeamProj), "Update project to production environment version")
# CheckIn better manually to check for errors
$workspaceTest.CheckIn($wsCheckinParams)
if ($isTestTempWorkspace) {
Write-Host "Deleting temporary workspace for test"
$workspaceTest.Delete()
Remove-Item -Path D:\Development -Force -Recurse
}
}
catch [System.Exception]
{
Write-Host "Exception: " ($Error[0]).Exception
EXIT $LASTEXITCODE
}
My approach is very similar to Zittelrittel. Just send the path and it will automatically figure out the workspace.
This will not work in PowerShell ISE (x86), I had to use the 64-bit version!
Add-PSSnapin Microsoft.TeamFoundation.PowerShell
Write-Host "Updating Workspace1, please wait..."
Update-TfsWorkspace -item C:\dev\Workspace1\code -Recurse | Format-Table
Write-Host "Updating Workspace2, please wait..."
Update-TfsWorkspace -item C:\dev\Workspace2\code -Recurse | Format-Table
In your calls to update TFS workspace, pipe the result to out-null. This should effectively remove any data that would otherwise be stored in the pipeline.
Update-TfsWorkspace -Force -Recurse $LocalProdWorkspace | Out-Null
Update-TfsWorkspace -Force -Recurse $LocalTestWorkspace | Out-Null