I am using the following code to download files from JFrog Artifactory using PowerShell. It works perfectly. I now have a repository on Artifactory with a couple of nested folders containing files, and I need to download the entire contents of that repository. Can anyone suggest what I need to change in the following code?
#example Artifactory url
$artifactory_url = "https://artifactory.company.com/artifactory/"
#example Artifactory Key
$ArtifactoryKey = "AKCp2VpEfLuMVkxpmH9rSiZT3RPoWCucL8kEiq4SjbEuuuCFdNf5t5E6dom32TCE3efy2RCyg"
$wc = New-Object System.Net.WebClient
$wc.Headers.Add("X-JFrog-Art-Api", $ArtifactoryKey)
$files = @("test1.zip", "test.zip")
try {
    foreach ($file in $files) {
        $wc.DownloadFile("$artifactory_url/$file", "D:\download\$file")
    }
}
catch {
    $Host.UI.WriteErrorLine("Error while Trying to download Artifacts.")
    $Host.UI.WriteErrorLine($_.Exception.Message)
    exit
}
To download an entire folder under a repository you can use the JFrog CLI. First, configure Artifactory with the JFrog CLI, then download the folder, for example:
$ jfrog rt dl "my-local-repo/*.jar" all-my-frogs/
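If the goal is everything under a folder (or the whole repository), downloads are recursive by default, so a source path ending in "/" pulls the full tree. A rough sketch, where my-local-repo/ and D:/download/ are placeholders and the exact configuration command depends on the CLI version:
$ jfrog rt config                              # one-time: supply the Artifactory URL and API key
$ jfrog rt dl "my-local-repo/" "D:/download/"  # downloads the entire repository tree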
Is it possible to use a REST API to download a file from a TFVC repository in Azure DevOps Services?
I find a lot of topics about downloading from Git repositories, but not from TFVC.
You can use the Items - Get REST API to get a file from a TFVC repository.
GET https://dev.azure.com/{organization}/{project}/_apis/tfvc/items?path={path}&download=true&api-version=6.0
Here are a few points that need your attention:
The path parameter omits the name of the TFVC repo. For example, if I want to get a file $/{name}/A.txt, then I need to set path=A.txt.
You need to specify the file path instead of the folder path. Otherwise only the folder information will be returned and the file will not be downloaded. If you want to download multiple files, you need to use the REST API Items - Get Items Batch.
Set download parameter to true to download the file. Otherwise only the file information will be returned and the file will not be downloaded.
Using PowerShell and Items - Get API:
$ADOHeaders = @{
    Authorization = 'Basic ' + [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes(":ado_personalaccesstoken"))
    Accept = 'application/zip'
}
Invoke-WebRequest `
    -Uri "https://dev.azure.com/{organization}/{project}/_apis/tfvc/items?path=%24/full/path/to/folder/&download=true&api-version=7.1-preview.1" `
    -Headers $ADOHeaders `
    -OutFile ./files.zip
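For a single file, the same Items - Get call can write straight to disk; a minimal sketch, where the organization, project, and file path are placeholders and the PAT is passed the same way as above:
$headers = @{
    Authorization = 'Basic ' + [Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes(":ado_personalaccesstoken"))
}
Invoke-WebRequest `
    -Uri "https://dev.azure.com/{organization}/{project}/_apis/tfvc/items?path=%24/full/path/to/file.txt&download=true&api-version=6.0" `
    -Headers $headers `
    -OutFile ./file.txt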
I am tasked with writing a PowerShell script to download the latest source code for a given branch, rebuild it, and deploy it. The script I've written is able to download projects and in most cases has been able to rebuild them. But with one web project I get this error:
error : This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them. For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is ..\packages\Microsoft.Net.Compilers.2.0.1\build\Microsoft.Net.Compilers.props.
I've researched whether PowerShell has an Update-Package command like the one available in the Package Manager Console in Visual Studio, but have been unable to find an equivalent.
I know there are cmdlets for packages, but from what I've seen they're used to update a specific package... what I need is a way to download/update all packages referenced in the project.
Some points that might be of interest...
When I get latest on the solution to a new empty folder, the only thing in the packages folder is Modernizr.2.6.2. This is the same whether I'm getting latest in VS or in my PowerShell script.
If I open the solution within VS 2017, I am able to rebuild the solution with no problems. It downloads/installs over a dozen other packages... one of which is the Microsoft.Net.Compilers package whose .props file is referred to in the error message.
But if I delete everything, re-download the source code, and then call MSBuild from my PowerShell script to rebuild the solution, I get the error mentioned above. It never seems to download/install the missing packages.
Anyone have any ideas how I can use MSBuild within my PowerShell script to rebuild the project and have it automatically update/install any packages it needs?
Thanks
I was able to find a solution to my problem on this page: Quickly Restore NuGet Packages With PowerShell
On that page is a script that uses Nuget.exe to download the packages based on the packages.config:
# This will be the root folder of all your solutions - we will search all children of this folder
$SOLUTIONROOT = "C:\Projects\"

# This is where your NuGet.exe is located
$NUGETLOCATION = "C:\Projects\NuGet\NuGet.exe"

Function RestoreAllPackages ($BaseDirectory)
{
    Write-Host "Starting Package Restore - This may take a few minutes ..."
    $PACKAGECONFIGS = Get-ChildItem -Recurse -Force $BaseDirectory -ErrorAction SilentlyContinue | Where-Object { ($_.PSIsContainer -eq $false) -and ($_.Name -eq "packages.config") }
    ForEach ($PACKAGECONFIG in $PACKAGECONFIGS)
    {
        Write-Host $PACKAGECONFIG.FullName
        $NugetRestore = $NUGETLOCATION + " install " + " '" + $PACKAGECONFIG.FullName + "' -OutputDirectory '" + $PACKAGECONFIG.Directory.parent.FullName + "\packages'"
        Write-Host $NugetRestore
        Invoke-Expression $NugetRestore
    }
}

RestoreAllPackages $SOLUTIONROOT

Write-Host "Press any key to continue ..."
$x = $host.UI.RawUI.ReadKey("NoEcho,IncludeKeyDown")
I modified this function, added it to my PS script, and call it first to download all the packages; that does the job!
You need to call the restore target of MSBuild to download NuGet packages. You can do that by running something like:
git clone [your repo]
cd [your repo]
msbuild /target:Restore [Your Solution]
msbuild [Your Solution]
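Note that the Restore target restores PackageReference-style references; projects that still use packages.config (which the Microsoft.Net.Compilers error above points to) are typically restored with nuget.exe instead, for example:
nuget restore [Your Solution]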
function buildVS
{
    param
    (
        [parameter(Mandatory=$true)]
        [String] $path,
        [parameter(Mandatory=$false)]
        [bool] $nuget = $true,
        [parameter(Mandatory=$false)]
        [bool] $clean = $true
    )
    process
    {
        $msBuildExe = 'C:\Program Files (x86)\Microsoft Visual Studio\2019\Professional\MSBuild\Current\Bin\MSBuild.exe'

        if ($nuget) {
            Write-Host "Restoring NuGet packages" -ForegroundColor Green
            & "$($msBuildExe)" "$($path)" /p:Configuration=Release /p:platform=x64 /t:restore
        }

        if ($clean) {
            Write-Host "Cleaning $($path)" -ForegroundColor Green
            & "$($msBuildExe)" "$($path)" /t:Clean /m
        }

        Write-Host "Building $($path)" -ForegroundColor Green
        & "$($msBuildExe)" "$($path)" /t:Build /p:Configuration=Release /p:platform=x64
    }
}
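A call would then look something like this (the solution path is a placeholder):
buildVS -path "C:\Projects\MySolution.sln" -nuget $true -clean $true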
I am trying to write a PowerShell script that downloads multiple files from my Artifactory repo. I can use logic like the following by passing file names:
$files = @("test1.zip", "test.zip")
foreach ($file in $files)
{
    Invoke-WebRequest -Uri "$artifactory_url/$file" -OutFile "D:\download\$file"
}
But is there any way to download all the files without passing names? I tried wildcards like (*zip), but it looks like Invoke-WebRequest doesn't accept wildcards. I also had no luck with the Start-BitsTransfer cmdlet, as described in the article https://blogs.technet.microsoft.com/heyscriptingguy/2012/08/17/use-powershell-3-0-to-easily-download-60-spanned-files .
I was able to pull up a list of files in the repo using the command below:
((Invoke-WebRequest $url).links | Where href -match "zip$").href
How can I use this command to download the files? Is there a better way to download multiple files from an Artifactory repo or HTTP endpoint? I have to perform this action on multiple servers, so I was not looking at using the JFrog CLI.
Thanks in advance
You may be missing the credentials that need to be sent with the request.
If you are using an Artifactory API key, you can use the WebClient object like the following:
#example Artifactory url
$artifactory_url = "https://artifactory.company.com/artifactory/"
#example Artifactory Key
$ArtifactoryKey = "AKCp2VpEfLuMVkxpmH9rSiZT3RPoWCucL8kEiq4SjbEuuuCFdNf5t5E6dom32TCE3efy2RCyg"
$wc = New-Object System.Net.WebClient
$wc.Headers.Add("X-JFrog-Art-Api", $ArtifactoryKey)
$files = @("test1.zip", "test.zip")
try {
    foreach ($file in $files) {
        $wc.DownloadFile("$artifactory_url/$file", "D:\download\$file")
    }
}
catch {
    $Host.UI.WriteErrorLine("Error while Trying to download Artifacts.")
    $Host.UI.WriteErrorLine($_.Exception.Message)
    exit
}
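If the file names are not known up front, the directory listing approach from the question can be combined with the same API-key header; a rough sketch, assuming the repository URL returns an HTML index whose links end in .zip and the hrefs are plain file names:
$headers = @{ "X-JFrog-Art-Api" = $ArtifactoryKey }

# List the repository index once and keep only the zip links
$listing  = Invoke-WebRequest -Uri $artifactory_url -Headers $headers
$zipFiles = ($listing.Links | Where-Object { $_.href -match "zip$" }).href

foreach ($file in $zipFiles) {
    Invoke-WebRequest -Uri "$artifactory_url/$file" -Headers $headers -OutFile "D:\download\$file"
}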
I've been toying around with DSC and I think it's an awesome platform. I made a few tests to automate the deployment of our TFS build outputs and to automatically install the web applications and configure an environment.
This was relatively easy, as I could pass my drop folder path to the DSC script using a file share on our internal network, and use relative paths inside the configuration to select each of our modules.
My problem now is how to extend this to Azure virtual machines. We want these scripts to deploy automatically to our QA and Production servers, which are hosted on Azure. Since they are not in our domain, I can't use the File resource anymore to transfer the files, but I want exactly the same functionality: I'd like to somehow point the configuration to our build output folder and copy the files from there to the virtual machines.
Is there some way I can copy the drop folder files easily from inside a configuration that is run on these remote computers, without sharing the same network and domain? I successfully configured the VMs to accept DSC calls over https using certificates, and I just found out that the Azure PowerShell cmdlets enable you to upload a configuration to Azure storage and run it in the VMs automatically (which seems a lot better than what I did) but I still don't know how I'd get access to my build outputs from inside the virtual machine when the configuration script is run.
I ended up using the Publish-AzureVMDscExtension cmdlet to create a local zip file, appending my build outputs to the zip, and then publishing the zip, something along these lines:
function Publish-AzureDscConfiguration
{
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory)]
        [string] $ConfigurationPath
    )

    Begin{}
    Process
    {
        $zippedConfigurationPath = "$ConfigurationPath.zip";
        Publish-AzureVMDscConfiguration -ConfigurationPath:$ConfigurationPath -ConfigurationArchivePath:$zippedConfigurationPath -Force

        $tempFolderName = [System.Guid]::NewGuid().ToString();
        $tempFolderPath = "$env:TEMP\$tempFolderName";
        $dropFolderPath = "$tempFolderPath\BuildDrop";

        try{
            Write-Verbose "Creating temporary folder and symbolic link to build outputs at '$tempFolderPath' ...";
            New-Item -ItemType:Directory -Path:$tempFolderPath;
            New-Symlink -LiteralPath:$dropFolderPath -TargetPath:$PWD;

            Invoke-Expression ".\7za a $tempFolderPath\BuildDrop.zip $dropFolderPath -r -x!'7za.exe' -x!'DscDeployment.ps1'";

            Write-Verbose "Adding component files to DSC package in '$zippedConfigurationPath'...";
            Invoke-Expression ".\7za a $zippedConfigurationPath $dropFolderPath.zip";
        }
        finally{
            Write-Verbose "Removing symbolic link and temporary folder at '$tempFolderPath'...";
            Remove-ReparsePoint -Path:$dropFolderPath;
            Remove-Item -Path:$tempFolderPath -Recurse -Force;
        }

        Publish-AzureVMDscConfiguration -ConfigurationPath:$zippedConfigurationPath -Force
    }
    End{}
}
By using a zip inside the zip used by Azure, I can access the inner contents in the working directory of the PowerShell DSC Extension (in the DSCWork folder). I tried just adding the drop folder directly to the zip (without zipping it first), but then the DSC Extension copies the folder to the modules path, thinking it is a module.
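To consume that inner zip from inside the configuration, the built-in Archive resource can unpack it on the node. A sketch, under the assumption that the DSC extension compiles the configuration from the folder the outer zip was extracted to (so the nested BuildDrop.zip sits next to the configuration script):
Configuration DeployBuildDrop
{
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node "localhost"
    {
        Archive BuildDrop
        {
            # Assumption: $PSScriptRoot resolves to the extension's working folder when the configuration is compiled
            Path        = (Join-Path $PSScriptRoot "BuildDrop.zip")
            Destination = "C:\BuildDrop"
            Ensure      = "Present"
        }
    }
}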
I'm not completely happy with this solution yet, and I'm having a few problems already, but it makes sense in my mind and should work fine.
I'm creating a NuGet package that has native dependencies. I put them inside the package without problems by specifying additional file entries in the .nuspec file.
However, I also want to copy these to the output folder of the project that is going to use my package, so that the dependencies can be found at runtime.
My idea is to add the native dependencies to the project and set them to copy to the output directory. I have managed this with the PowerShell script below:
param($installPath, $toolsPath, $package, $project)

Function add_file($file)
{
    $do_add = 1
    foreach ($item in $project.DTE.ActiveSolutionProjects[0].ProjectItems)
    {
        if ($item -eq $file)
        { $do_add = 0 }
    }
    if ($do_add -eq 1)
    {
        $added = $project.DTE.ItemOperations.AddExistingItem($file)
        $added.Properties.Item("CopyToOutputDirectory").Value = 2
        $added.Properties.Item("BuildAction").Value = 0
    }
}

add_file(<dependency1>)
add_file(<dependency2>)
...
add_file(<dependencyN>)
So far so good.
But now my project becomes completely polluted with these dependencies.
Is there a way to add files to a project using PowerShell and put them inside a folder?
Or is there another way to achieve what I want: adding native dependencies to a NuGet package and having them copied to the bin folder of the project that uses my package?
The SqlServerCompact package did something similar, copying the relevant DLLs to the bin folder in a post-build event. Here is the relevant code:
File: install.ps1
param($installPath, $toolsPath, $package, $project)
. (Join-Path $toolsPath "GetSqlCEPostBuildCmd.ps1")
# Get the current Post Build Event cmd
$currentPostBuildCmd = $project.Properties.Item("PostBuildEvent").Value

# Append our post build command if it's not already there
if (!$currentPostBuildCmd.Contains($SqlCEPostBuildCmd)) {
    $project.Properties.Item("PostBuildEvent").Value += $SqlCEPostBuildCmd
}
File: GetSqlCEPostBuildCmd.ps1
$solutionDir = [System.IO.Path]::GetDirectoryName($dte.Solution.FullName) + "\"
$path = $installPath.Replace($solutionDir, "`$(SolutionDir)")
$NativeAssembliesDir = Join-Path $path "NativeBinaries"
$x86 = $(Join-Path $NativeAssembliesDir "x86\*.*")
$x64 = $(Join-Path $NativeAssembliesDir "amd64\*.*")
$SqlCEPostBuildCmd = "
if not exist `"`$(TargetDir)x86`" md `"`$(TargetDir)x86`"
xcopy /s /y `"$x86`" `"`$(TargetDir)x86`"
if not exist `"`$(TargetDir)amd64`" md `"`$(TargetDir)amd64`"
xcopy /s /y `"$x64`" `"`$(TargetDir)amd64`""
I'd suggest you open the 4.0.8852.1 version of the SqlServerCompact NuGet package with NuGet Package Explorer (Microsoft Store, GitHub) and use it as a template. It worked for me.