I am working on an Azure DevOps build pipeline, and one of the tasks is to copy my dll and pdb files into a staging folder, for example:
Code
  MyProject
    Bin
      Debug
        MyProject.dll
        MyProject.pdb
Staging
  Client
    Libraries
I want to use a PowerShell script task with an inline script.
When I use the command below, it does not work:
Copy-Item $(Build.Repository.LocalPath)\Code\MyProject\Bin\$(DebugBuildConfiguration) -Destination $(ClientLibrariesFolder)
Below are my variables:

Variable Name              Variable Value
StagingFolder              $(Build.Repository.LocalPath)\Staging
DebugBuildConfiguration    Debug
ClientLibrariesFolder      $(StagingFolder)\Client\Libraries
I don't get any error, but nothing happens.
SOLUTION:
I solved my issue as follows.
I added a new variable:
CodeLocalPath : $(Build.Repository.LocalPath)
I added a PowerShell task to my Azure DevOps build pipeline.
I set Type to Inline.
In Script I entered the following:
$destination = "{0}" -f $env:ClientLibrariesFolder
# Copy MyProject.dll to Staging\Client\Libraries
$sourcefolder = "{0}\Code\MyProject\Bin\{1}\MyProject.dll" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
"Source : {0} and Destination : {1} " -f $($sourcefolder), $($destination)
Copy-Item $($sourcefolder) -Destination $($destination)
# Copy MyProject.pdb to Staging\Client\Libraries
$sourcefolder = "{0}\Code\MyProject\Bin\{1}\MyProject.pdb" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
"Source : {0} and Destination : {1} " -f $($sourcefolder), $($destination)
Copy-Item $($sourcefolder) -Destination $($destination)
"I don't get any error, but nothing happens."
What do you mean by "nothing happens"? Do you mean that no files have been copied into your repo?
If so, that is the expected behavior of DevOps, because uploading files back to your repo is not recommended.
If you set system.debug=true in the Variables tab, you will find a log entry like:
##[debug]Copy-Item C:\VS2017Agent\_work\8\s\TestSample\TestSample\Bin\Debug\*.* -Destination C:\VS2017Agent\_work\8\s\TestSample\Staging\Client\Libraries'
It will not copy the files back to the repo, which should be why it looks like nothing happens.
Besides, looking at Microsoft's documentation, the description is as follows:
$(Build.Repository.LocalPath): The local path on the agent where your source code files are downloaded. For example: c:\agent_work\1\s
By default, new build definitions update only the changed files. You can modify how files are downloaded on the Repository tab.
So the value of this variable points to a path on the build agent, not to your repo.
Note: Copy-Item in PowerShell copies files rather than a whole folder; use *.* to include all the files in the folder.
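Putting those two points together, a minimal sketch of the inline script using a wildcard (assuming the same CodeLocalPath, DebugBuildConfiguration and ClientLibrariesFolder variables as above) would be:

# Sketch: copy every file from the build output folder to the staging folder
$source      = "{0}\Code\MyProject\Bin\{1}\*.*" -f $env:CodeLocalPath, $env:DebugBuildConfiguration
$destination = $env:ClientLibrariesFolder

"Copying {0} to {1}" -f $source, $destination
Copy-Item -Path $source -Destination $destination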
Related
I have a self-hosted GitHub Actions runner that builds a Visual Studio solution and then launches a PowerShell script that lives inside the repo. This script modifies some files in the repo. As the last step of my action, I would like these modified files to also be copied to a folder outside the GitHub Actions runner directory on my PC (Windows 10). This is what the launched script looks like:
$scriptFile = Get-Item ".\SFDScripts\Internal\Hardcore\Hardcore.cs"
$maps = Get-ChildItem ".\SFDScripts\Internal\Hardcore\Maps\"
foreach ($map in $maps) {
    if ($map.PSIsContainer) { continue }

    # This modifies the file at $map.FullName
    SFDScriptInjector $scriptFile.FullName $map.FullName

    # Then I want to copy that file to a specific directory on the device where the GitHub Actions runner runs
    Copy-Item $map.FullName "C:\Users\juans\OneDrive\Documents\Superfighters Deluxe\Maps\Custom"
}
I am currently getting an error at the line Copy-Item $map.FullName "C:\Users\juans\OneDrive\Documents\Superfighters Deluxe\Maps\Custom". The error is the following:
Copy-Item : Access is denied
At E:\actions-runner\_work\SFDScripts\SFDScripts\injectScriptToAllMaps.ps1:7 char:5
+ Copy-Item $map.FullName "C:\Users\juans\OneDrive\Documents\Superf ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Copy-Item], UnauthorizedAccessException
+ FullyQualifiedErrorId : System.UnauthorizedAccessException,Microsoft.PowerShell.Commands.CopyItemCommand
Error: Process completed with exit code 1.
I also tried running the script using this run_as.ps1 script. This allows the action to finish without errors, but the files are still not copied (I'm guessing it fails silently inside the new instance of PowerShell).
Here is the .yml action file in its current state.
I can't figure out how to give the GitHub runner access to modify external directories. So, how can I make it copy the file to an external directory?
What actually needs to change is the security settings of the target folder. In this case I went to the folder at "C:\Users\juans\OneDrive\Documents\Superfighters Deluxe\Maps\Custom" and did the following:
Right-click on the folder > Properties > Security tab > Advanced > Add > Select a principal > Advanced... > Find Now
In the list under Search Results, look for an entity with a name like GITHUB_ActionsRunner_[short-id].
OK > OK
Choose the permissions you want the GitHub Actions runner to have on this folder, in my case: Full Control.
OK > OK > OK
Then I re-ran the GitHub Action, and the script was launched without any problems.
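For reference, the same permission grant can also be scripted instead of going through the GUI, for example with icacls (a rough sketch; the account name is a placeholder for the one found under Search Results):

# Sketch: grant the runner's account full control (F) on the folder and its children ((OI)(CI))
# "GITHUB_ActionsRunner_<short-id>" is a placeholder; use the exact name found in the Security dialog
$folder  = "C:\Users\juans\OneDrive\Documents\Superfighters Deluxe\Maps\Custom"
$account = "GITHUB_ActionsRunner_<short-id>"
icacls $folder /grant "$($account):(OI)(CI)F"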
I am working on a VSTS release task for deploying a web application along with its database. Unfortunately, we are not creating any build definition to produce the drop folder; my client will provide the drop folder for this project. What I need is to copy the files to the VM along with the creation of a system-timed folder, at release level. For that I created the folder with the help of a PowerShell task:
$FileName = (Get-Date).tostring("dd-MM-yyyy-hh-mm-ss")
$Fname = New-Item -itemType Directory -Path C:\Database -Name ("Test "+ $FileName)
Write-Host "##vso[task.setvariable variable=$Fname;]$Fname"
Write-Output ("##vso[task.setvariable variable=$Fname;]UpdatedValueInScript")
But I'm not able to use the above PowerShell script's output variable in the next "Copy Files" task.
Note: For creating the folder in the VM, I followed this link.
Your variable name should be static; only the value should change:
Write-Host "##vso[task.setvariable variable=Fname;]$Fname"
I have an Azure Cloud Service worker role which needs a separate Windows service installed to redirect application tracing to a centralized server. I've placed the installation binaries for this Windows service in a storage account's file share, as shown below. I then have my startup task call a batch file, which in turn executes a PowerShell script to retrieve the file and install the service.
When Azure deploys a new instance of the role, the script execution fails with the following error:
Cannot find path '\\{name}.file.core.windows.net\utilities\slab1-1.zip' because it does not exist
However, when I run the script after connecting through RDP, everything works fine. Does anybody know why this might be happening? Here is the script:
cmdkey /add:$storageAccountName.file.core.windows.net /user:$shareUser /pass:$shareAccessKey
net use * \\$storageAccountName.file.core.windows.net\utilities
mkdir slab
copy \\$storageAccountName.file.core.windows.net\utilities\$package .\slab\$package
I have always had problems here and there using a script to access a mounted Azure Files drive. I believe this is more or less related to the fact that the drive is mounted only for the current user, so it may not work the same way when called from a script.
I ended up pulling files from Azure Files the hard way, without a network drive.
$source    = $storageAccountName
$sourceKey = $shareAccessKey
$sharename = "utilities"
$package   = "slab1-1.zip"
$dest      = ".\slab\" + $package

# Define the Azure file share context
$ctx = New-AzureStorageContext $source $sourceKey
$share = Get-AzureStorageShare $sharename -Context $ctx
Get-AzureStorageFileContent -Share $share -Destination $dest -Path $package -Confirm:$false
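One small caveat: Get-AzureStorageFileContent writes to a local path, so the .\slab destination folder (created with mkdir slab in the original startup script) still has to exist; a tiny guard could look like:

# Make sure the local destination folder exists before downloading the package
New-Item -ItemType Directory -Path ".\slab" -Force | Out-Null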
Code example here will get you a good start:
https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/
It would be harder to manage if you have a more complex folder structure, but the objects there are CloudFileDirectory and CloudFile, and their properties and methods work seamlessly for me in PowerShell 4.0.
*The Azure PowerShell module is required for the 'Get-AzureStorageFileContent' cmdlet.
I've been toying around with DSC and I think it's an awesome platform. I made a few tests to automate the deployment of our TFS build outputs and to automatically install the web applications and configure an environment.
This was relatively easy, as I could pass my drop folder path to the DSC script using a file share on our internal network, and use relative paths inside the configuration to select each of our modules.
My problem now is how to extend this to Azure virtual machines. We wanted to create these scripts to automatically deploy to our QA and production servers, which are hosted on Azure. Since they are not in our domain, I can't use the File resource anymore to transfer the files, but at the same time I want exactly the same functionality: I'd like to somehow point the configuration to our build output folder and copy the files from there to the virtual machines.
Is there some way I can copy the drop folder files easily from inside a configuration that is run on these remote computers, without sharing the same network and domain? I successfully configured the VMs to accept DSC calls over https using certificates, and I just found out that the Azure PowerShell cmdlets enable you to upload a configuration to Azure storage and run it in the VMs automatically (which seems a lot better than what I did) but I still don't know how I'd get access to my build outputs from inside the virtual machine when the configuration script is run.
I ended up using the Publish-AzureVMDscConfiguration cmdlet to create a local zip file, appending my build outputs to the zip, and then publishing the zip, something along these lines:
function Publish-AzureDscConfiguration
{
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory)]
        [string] $ConfigurationPath
    )

    Begin{}
    Process
    {
        $zippedConfigurationPath = "$ConfigurationPath.zip";
        Publish-AzureVMDscConfiguration -ConfigurationPath:$ConfigurationPath -ConfigurationArchivePath:$zippedConfigurationPath -Force

        $tempFolderName = [System.Guid]::NewGuid().ToString();
        $tempFolderPath = "$env:TEMP\$tempFolderName";
        $dropFolderPath = "$tempFolderPath\BuildDrop";

        try{
            Write-Verbose "Creating temporary folder and symbolic link to build outputs at '$tempFolderPath' ...";
            New-Item -ItemType:Directory -Path:$tempFolderPath;
            New-Symlink -LiteralPath:$dropFolderPath -TargetPath:$PWD;

            Invoke-Expression ".\7za a $tempFolderPath\BuildDrop.zip $dropFolderPath -r -x!'7za.exe' -x!'DscDeployment.ps1'";

            Write-Verbose "Adding component files to DSC package in '$zippedConfigurationPath'...";
            Invoke-Expression ".\7za a $zippedConfigurationPath $dropFolderPath.zip";
        }
        finally{
            Write-Verbose "Removing symbolic link and temporary folder at '$tempFolderPath'...";
            Remove-ReparsePoint -Path:$dropFolderPath;
            Remove-Item -Path:$tempFolderPath -Recurse -Force;
        }

        Publish-AzureVMDscConfiguration -ConfigurationPath:$zippedConfigurationPath -Force
    }
    End{}
}
By using a zip inside the zip used by Azure, I can access the inner contents in the working directory of the PowerShell DSC Extension (in the DSCWork folder). I tried just adding the drop folder directly to the zip (without zipping it first), but then the DSC Extension copies the folder to the modules path, thinking it is a module.
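For illustration only, a rough sketch (with hypothetical paths) of how the inner BuildDrop.zip might then be consumed from inside the configuration, using the built-in Archive resource:

Configuration DeployBuildDrop
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node localhost
    {
        # The inner BuildDrop.zip ends up next to the configuration script in the
        # extension's working directory, so a script-relative path can point at it.
        Archive BuildDrop
        {
            Ensure      = "Present"
            Path        = "$PSScriptRoot\BuildDrop.zip"
            Destination = "C:\Deploy\BuildDrop"   # hypothetical target folder on the VM
        }
    }
}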
I'm not completely happy with this solution yet, and I'm having a few problems already, but it makes sense in my mind and should work fine.
Good morning friends,
I've been writing a PowerShell script to replace our current manual process of deploying our application to Azure Blob Storage as a ZIP archive during the build process in VS. I'm about done, but I've run into this issue:
When the ZIP that I upload to Azure is downloaded by anyone, it cannot be manipulated without extracting the files first. The current process is able to accomplish this and I don't know how (the current process was written in C# and is done through a GUI). It needs to be editable as a ZIP because the current updater is set to manipulate the ZIP without extracting it first.
So the initial question is: how do I set permissions on a ZIP archive that will follow it to Azure Blob Storage, so that when it's downloaded on a client's machine its contents can be manipulated (the error itself at this time is that it cannot delete a file in a child folder) without extraction?
Currently, to ZIP my folder up, I use this process:
$src = "$TEMPFOL\$testBuildDrop"
$dst = "$TEMPFOL\LobbyGuard.zip"
[Reflection.Assembly]::LoadWithPartialName( "System.IO.Compression.FileSystem" )
[System.IO.Compression.ZipFile]::CreateFromDirectory($src, $dst)
and then push it to blob with:
Set-AzureStorageBlobContent -Container test -Blob "LobbyGuard.zip" -File "$TEMPFOL\LobbyGuard.zip" -Context $storageCreds -Force
I've tried to set permissions on the folder prior to upload using
$getTEMPFOLACL = Get-ACL $TEMPFOL
$accessRule = New-Object System.Security.AccessControl.FileSystemAccessRule("Everyone", "FullControl", "Allow")
$getTEMPFOLACL.SetAccessRule($accessRule)
This works on the current local file, but once downloaded, the permissions on the file are set as:
Owner: BUILTIN\Administrators Access: NT AUTHORITY\SYSTEM Allow FullControl
These are exactly the same permissions as on the file downloaded from the current process. I don't understand what I'm missing here to make this work.
If necessary I can provide the download link to our blob to show that the folder from the current manual process can be manipulated IN the ZIP, whereas my script's ZIP cannot.
Try unblocking the file after downloading, i.e.
Unblock-File C:\path\yourDownloaded.zip
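If the archive's contents have already been extracted, the same cmdlet can be applied recursively (a small sketch, assuming PowerShell 3.0 or later and a hypothetical extraction folder):

# Clear the "downloaded from the internet" block on every extracted file
Get-ChildItem "C:\path\extracted" -Recurse -File | Unblock-File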