How can I generate the same checksum as artifactory? - powershell

As art.exe (the Artifactory CLI) hangs whenever I call it from my build script, I am rewriting the bash script from their page on the topic in PowerShell 2.0. I'm using this code to generate my checksum:
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
$path = Resolve-Path "CatVideos.zip"
$bytes = [System.IO.File]::ReadAllBytes($path.Path)
$sha1.ComputeHash($bytes) | foreach { $hash = $hash + $_.ToString("X2") }
$wc=new-object System.Net.WebClient
$wc.Credentials= new-object System.Net.NetworkCredential("svnintegration","orange#5")
$wc.Headers.Add("X-Checksum-Deploy", "true")
$wc.Headers.Add("X-Checksum-Sha1", $hash)
The problem is that it consistently produces a different checksum on my local machine than the one Artifactory generates. It's not a 'corruption during transmission' error, because I'm generating the checksum locally and I've manually deployed several times.
How can I generate a checksum in the same manner as Artifactory (2.6.5) so our checksums will match when I send mine and the deploy won't fail? Am I doing something obviously wrong when generating my checksum?
Thanks!

Artifactory expects the SHA-1 digest as lowercase hex, but ToString("X2") produces uppercase digits. You have to use
$wc.Headers.Add("X-Checksum-Sha1", $hash.ToLower())
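Putting it together, a minimal end-to-end sketch (PowerShell 2.0 compatible). The repository URL and credentials are placeholders, not values from the question; a checksum deploy is a PUT with the checksum headers and an empty body:
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
$path = (Resolve-Path "CatVideos.zip").Path
$bytes = [System.IO.File]::ReadAllBytes($path)
# "x2" (lowercase hex) makes a later ToLower() call unnecessary
$hash = ($sha1.ComputeHash($bytes) | ForEach-Object { $_.ToString("x2") }) -join ""
$wc = New-Object System.Net.WebClient
$wc.Credentials = New-Object System.Net.NetworkCredential("user", "password") # placeholder
$wc.Headers.Add("X-Checksum-Deploy", "true")
$wc.Headers.Add("X-Checksum-Sha1", $hash)
# Placeholder URL; with X-Checksum-Deploy, Artifactory links an already-known binary by checksum
$response = $wc.UploadData("http://artifactory.example.com/repo/CatVideos.zip", "PUT", [byte[]]@())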

Related

Powershell - Download the latest FTP files from Ftp server [duplicate]

I am working on a PowerShell script, which will pull files from an FTP site. The files are uploaded to the FTP site every hour so I need to download the most recent one. The code I currently have downloads all the files from today instead of just one file. How do I make it download only the most recent file?
Here is the code that I am currently using
$ftpPath = 'ftp://***.***.*.*'
$ftpUser = '******'
$ftpPass = '******'
$localPath = 'C:\Temp'
$Date = get-date -Format "ddMMyyyy"
$Files = 'File1', 'File2'
function Get-FtpDir ($url, $credentials)
{
$request = [Net.FtpWebRequest]::Create($url)
if ($credentials) { $request.Credentials = $credentials }
$request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
(New-Object IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd() -split "`r`n"
}
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($ftpUser,$ftpPass)
$webclient.BaseAddress = $ftpPath
Foreach ( $item in $Files )
{
Get-FTPDir $ftpPath $webclient.Credentials |
? { $_ -Like $item+$Date+'*' } |
% {
$webClient.DownloadFile($_, (Join-Path $localPath $_))
}
}
It's not easy with FtpWebRequest. For your task, you need to know file timestamps.
Unfortunately, there's no really reliable and efficient way to retrieve timestamps using the features offered by FtpWebRequest/.NET Framework/PowerShell, as they do not support the FTP MLSD command. The MLSD command provides a listing of a remote directory in a standardized machine-readable format. The command and the format are standardized by RFC 3659.
Alternatives supported by the .NET Framework:
ListDirectoryDetails method (an FTP LIST command) to retrieve details of all files in a directory, leaving you to parse the server-specific format of the details (a *nix format similar to the output of the *nix ls command is the most common; a drawback is that the format may change over time, as newer files use the "May 8 17:48" format while older files use the "Oct 18 2009" format)
GetDateTimestamp method (an FTP MDTM command) to individually retrieve timestamps for each file. The advantage is that the response is standardized by RFC 3659 to YYYYMMDDHHMMSS[.sss]. The disadvantage is that you have to send a separate request for each file, which can be quite inefficient; see the sketch below.
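For illustration, a rough sketch of the second approach, reusing the Get-FtpDir function from the question. It assumes the server supports MDTM; the URL and credentials are placeholders:
$ftpUrl = "ftp://example.com/upload/" # placeholder
$cred = New-Object System.Net.NetworkCredential("user", "password") # placeholder
function Get-FtpTimestamp ($fileUrl, $credentials)
{
    $request = [System.Net.FtpWebRequest]::Create($fileUrl)
    $request.Credentials = $credentials
    $request.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
    $response = $request.GetResponse()
    $timestamp = $response.LastModified # .NET parses the MDTM reply into a DateTime
    $response.Close()
    $timestamp
}
# One MDTM round trip per file, so this gets slow on large directories
$latest =
    Get-FtpDir $ftpUrl $cred |
    Where-Object { $_ } |
    ForEach-Object {
        New-Object PSObject -Property @{
            Name = $_
            Modified = (Get-FtpTimestamp ($ftpUrl + $_) $cred)
        }
    } |
    Sort-Object Modified -Descending |
    Select-Object -First 1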
Some references:
C# class to parse WebRequestMethods.Ftp.ListDirectoryDetails FTP response
Parsing FtpWebRequest ListDirectoryDetails line
Retrieving creation date of file (FTP)
Alternatively, use a 3rd party FTP library that supports the MLSD command, and/or supports parsing of the proprietary listing format.
For example WinSCP .NET assembly supports both.
An example code:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Ftp
HostName = "example.com"
UserName = "user"
Password = "mypassword"
}
$session = New-Object WinSCP.Session
# Connect
$session.Open($sessionOptions)
# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)
# Select the most recent file
$latest =
$directoryInfo.Files |
Where-Object { -Not $_.IsDirectory } |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
# Any file at all?
if ($latest -eq $Null)
{
Write-Host "No file found"
exit 1
}
# Download the selected file
$sourcePath = [WinSCP.RemotePath]::EscapeFileMask($remotePath + $latest.Name)
$session.GetFiles($sourcePath, $localPath).Check()
For a full code, see Downloading the most recent file (PowerShell).
(I'm the author of WinSCP)
I tried this, but I get an error:
Error: Exception calling "ListDirectory" with "1" argument(s): "Error listing directory '/path/'.
Could not retrieve directory listing
Can't open data connection for transfer of "/path/"
I read a lot about this problem on the internet, but could not find a solution that seemed fairly simple, and I am not a network setup wizard. So I chose a different approach. In our case the filename of the file whose download I want to automate has the date in it: backup_2018_08_03_020003_1048387.bak
So we can get the file by using mget *2018_08_03* in a command-line ftp session.
Our backup procedure runs every morning at 01:00 AM, so we have a backup each day that we can fetch.
Of course it would have been prettier and nicer to have a script that fetched the latest backup file based on the backup file timestamps, just in case something went wrong with the latest backup or the backup file naming format changes. The script only fetches the backup for internal development purposes, so it's not a big deal if it breaks. I will look into this later and check whether I can make a cleaner solution.
I made a batch script which just asks for today's backup file with ordinary ftp command-prompt scripting.
It is important to get the formatting of today's date right: it must match the formatting of the date in the filename exactly.
If you want to use the script you should replace the variables with your own information. You should also have write access to the directory where you run it from.
This is the script that I made:
@Echo Off
Set _FTPServerName=xxx.xxx.xx.xxx
Set _UserName=Username
Set _Password=Password
Set _LocalFolder=C:\Temp
Set _RemoteFolder="/path/"
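:: Builds *yyyy_MM_dd* from %date%; the substring offsets below assume a
:: dd-MM-yyyy style %date% format, so adjust them if your locale differs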
Set _Filename=*%date:~-4,4%_%date:~-7,2%_%date:~-10,2%*
Set _ScriptFile=ftptempscript
:: Create script
>"%_ScriptFile%" Echo open %_FTPServerName%
>>"%_ScriptFile%" Echo %_UserName%
>>"%_ScriptFile%" Echo %_Password%
>>"%_ScriptFile%" Echo lcd %_LocalFolder%
>>"%_ScriptFile%" Echo cd %_RemoteFolder%
>>"%_ScriptFile%" Echo binary
>>"%_ScriptFile%" Echo mget -i %_Filename%
>>"%_ScriptFile%" Echo quit
:: Run script
ftp -s:"%_ScriptFile%"
del "%_ScriptFile%"

How do I query a file on FTP server in PowerShell to determine if an upload is required?

The project is an MVC website coded and built using VS2017 and (on premises) TFS2017. The Build Definition is currently working and publishing to the staging location upon check-in.
The PowerShell script below, derived from David Kittle's website, is being used but it uploads all files every time. I abbreviated the listing using comments to focus on the part of the script for which I'd like to ask for help/guidance.
# Setup the FTP connection, destination URL and local source directory
# Put the folders and files to upload into $Srcfolders and $SrcFiles
# Create destination folders as required
# start file uploads
foreach($entry in $SrcFiles)
{
    #Create full destination filename from $entry and put it into $DesFile
    $uri = New-Object System.Uri($DesFile)
    #NEED TO GET THE REMOTE FILE DATA HERE TO TEST AGAINST THE LOCAL FILE
If (#perform a test to see if the file needs to be uploaded)
{ $webclient.UploadFile($uri, $SrcFullname) }
}
In the last few lines of the script above I need to determine if a source file requires upload. I am assuming I can check the time stamp to determine this. So:
If my assumption is wrong, please advise the best way to check for a required upload.
If my assumption is correct, how do I (1) retrieve the time stamp from the remote server and then (2) make the check against the local file?
You can use the FtpWebRequest class with its GetDateTimestamp FTP "method" and parse the UTC timestamp string it returns. The format is specified by RFC 3659 to be YYYYMMDDHHMMSS[.sss].
That would work only if the FTP server supports the MDTM command that the method uses under the hood (most servers do, but not all).
$url = "ftp://ftp.example.com/remote/folder/file.txt"
$ftprequest = [System.Net.FtpWebRequest]::Create($url)
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$response = $ftprequest.GetResponse().StatusDescription
$tokens = $response.Split(" ")
$code = $tokens[0]
if ($code -eq 213)
{
Write-Host "Timestamp is $($tokens[1])"
}
else
{
Write-Host "Error $response"
}
It would output something like:
Timestamp is 20171019230712
Now you parse it, and compare against a UTC timestamp of a local file:
(Get-Item "file.txt").LastWriteTimeUtc
Or save yourself some time and use an FTP library/tool that can do this for you.
For example with WinSCP .NET assembly, you can synchronize a whole local folder with a remote folder with one call to Session.SynchronizeDirectories. Or you can limit the synchronization to a set of files only.
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::Ftp
$sessionOptions.HostName = "ftpsite.com"
$session = New-Object WinSCP.Session
# Connect
$session.Open($sessionOptions)
$result = $session.SynchronizeDirectories(
[WinSCP.SynchronizationMode]::Remote, "C:\local\folder", "/remote/folder", $False)
$result.Check()
To use the assembly, just extract the contents of the .NET assembly package to your script folder. No other installation is needed.
The assembly supports not only MDTM, but also other alternative methods to retrieve the timestamp.
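To limit the synchronization to a set of files only, a sketch using a WinSCP file mask (the mask and paths are placeholders):
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = "*.dll; *.aspx" # placeholder mask
$result = $session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Remote, "C:\local\folder", "/remote/folder",
    $False, $False, [WinSCP.SynchronizationCriteria]::Time, $transferOptions)
$result.Check()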
(I'm the author of WinSCP)

Download multiple files from the artifactory repo using powershell

I was trying to write a PowerShell script which downloads multiple files from my Artifactory repo. I could use some logic like the below, passing file names.
$files = @("test1.zip", "test.zip")
foreach($file in $files)
{
Invoke-WebRequest -Uri "$artifactory_url/$file" -OutFile "D:\download\$file"
}
But is there any way to download all the files without passing names? I tried wildcards like (*zip), but it looks like Invoke-WebRequest doesn't accept wildcards. And I found no luck with the Start-BitsTransfer cmdlet either, as described in the article https://blogs.technet.microsoft.com/heyscriptingguy/2012/08/17/use-powershell-3-0-to-easily-download-60-spanned-files .
I was able to pull up a list of files in the repo using the below command
((Invoke-WebRequest $url).links | Where href -match "zip$").href
How can I use this command to download the files? Is there a better way to download multiple files from the Artifactory repo or an HTTP endpoint? I have to perform this action on multiple servers, so I am not looking at using the JFrog CLI.
Thanks in advance
You may be missing the credentials to be sent with the request.
If you are using an Artifactory API key, you could use the WebClient object like the following:
#example Artifactory url
$artifactory_url = "https://artifactory.company.com/artifactory/"
#example Artifactory Key
$ArtifactoryKey = "AKCp2VpEfLuMVkxpmH9rSiZT3RPoWCucL8kEiq4SjbEuuuCFdNf5t5E6dom32TCE3efy2RCyg"
$wc = New-Object System.Net.WebClient
$wc.Headers.Add("X-JFrog-Art-Api", $ArtifactoryKey)
$files = @("test1.zip", "test.zip")
try {
foreach($file in $files) {
$wc.DownloadFile("$artifactory_url/$file", "D:\download\$file")
}
}
catch {
$Host.UI.WriteErrorLine("Error while Trying to download Artifacts.")
$Host.UI.WriteErrorLine($_.Exception.Message)
exit
}
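To avoid passing names at all, you can feed the listing command from the question into the same WebClient. A hedged sketch, assuming a placeholder repo path and that the hrefs are plain file names relative to the listed URL:
$repoUrl = "$artifactory_url/myrepo" # placeholder repo path
$listing = Invoke-WebRequest $repoUrl -Headers @{ "X-JFrog-Art-Api" = $ArtifactoryKey }
$zips = ($listing.Links | Where-Object { $_.href -match "zip$" }).href
foreach ($file in $zips) {
    # Assumes relative hrefs; absolute hrefs would have to be used as-is
    $wc.DownloadFile("$repoUrl/$file", "D:\download\$file")
}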

DSC problems with Credentials and build 10586

The latest windows 10 build pushed out an updated version of powershell (10586).
In addition to the change required for the certificate documented at https://dscottraynsford.wordpress.com/2015/11/15/windows-10-build-10586-powershell-problems/ I seem to have an additional problem while trying to apply the configuration:
WarningMessage An error occured while applying the partial configuration [PartialConfiguration]ExternalIntegrationConfiguration. The error message is :
Decryption failed..
Using the same certificate I can successfully create a MOF with build 10.0.10240.16384, and successfully apply it. So, looking at the difference between the two MOFs, I see that the MOF built by build 10586 looks like:
instance of MSFT_Credential as $MSFT_Credential6ref
{
Password = "-----BEGIN CMS-----
\nBase64 encrypted
\n-----END CMS-----";
UserName = "SomeUser";
};
instead of what it used to be like in build (10.0.10240.16384):
instance of MSFT_Credential as $MSFT_Credential6ref
{
Password = "Base64 encrypted";
UserName = "SomeUser";
};
So the content is different. I did check whether I could decrypt the credential using Get-CmsMessage and Unprotect-CmsMessage, and I could. So the public/private key stuff appears to be good.
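For reference, a sketch of one way to verify the key pair like that; the certificate subject below is a placeholder for whatever DSC encryption certificate you configured:
# Encrypt with the public key, then decrypt with the private key from the local store
$encrypted = Protect-CmsMessage -To "CN=DscEncryptionCert" -Content "test-secret"
Unprotect-CmsMessage -Content $encrypted # should print test-secret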
Should there be an update to the machine that the configuration is being applied to? I don't see any new powershell build.
Any ideas would be appreciated.
Update 2015-12-18: Installing Windows Management Framework (WMF) 5.0 RTM edition that was released 2015-12-17 on the nodes being configured will resolve this error. WMF 5.0 can be downloaded here.
MS has changed the Get-EncryptedPassword function in PSDesiredStateConfiguration to generate the new format for the Password field in a MOF. This prevents DSC nodes from decrypting the password if their WMF has not been upgraded to support it. And as MS has not released an update that allows WMF to read this new format, this should be considered a completely broken release.
I have managed to find a work around:
Copy the PSDesiredStateConfiguration module from a pre-10586 machine (e.g. Windows Server 2012 R2 with the latest WMF 5.0) to the PowerShell modules folder on the build 10586 machine.
E.g.
Replace the C:\Windows\System32\WindowsPowerShell\v1.0\Modules\PSDesiredStateConfiguration folder with an older version
Note: You'll need to take ownership of this folder and give yourself permission to write into it because by default only TrustedInstaller can write to this folder.
As far as I'm concerned this version of PSDesiredStateConfiguration is completely broken and you're better off rolling it back until MS can fix it. This will also fix some other reported problems (module versions, new certificate Policy requirements).
FYI, here is the changed code that changes the credential encryption:
Old code in PSDesiredStateConfiguration.psm1:
# Cast the public key correctly
$rsaProvider = [System.Security.Cryptography.RSACryptoServiceProvider]$cert.PublicKey.Key
# Convert to a byte array
$keybytes = [System.Text.Encoding]::UNICODE.GetBytes($Value)
# Add a null terminator to the byte array
$keybytes += 0
$keybytes += 0
try
{
# Encrypt using the public key
$encbytes = $rsaProvider.Encrypt($keybytes, $false)
# Reverse bytes for unmanaged decryption
[Array]::Reverse($encbytes)
# Return a string
[Convert]::ToBase64String($encbytes)
}
catch
{
if($node)
{
$errorMessage = $LocalizedData.PasswordTooLong -f $node
}
else
{
$errorMessage = $LocalizedData.PasswordTooLong -f 'localhost'
}
$exception = New-Object -TypeName System.InvalidOperationException -ArgumentList $errorMessage
Write-Error -Exception $exception -Message $errorMessage -Category InvalidOperation -ErrorId PasswordTooLong
Update-ConfigurationErrorCount
}
New code in PSDesiredStateConfiguration.psm1:
# Encrypt using the public key
$encMsg = Protect-CmsMessage -To $CmsMessageRecipient -Content $Value
# Reverse bytes for unmanaged decryption
#[Array]::Reverse($encbytes)
#$encMsg = $encMsg -replace '-----BEGIN CMS-----',''
#$encMsg = $encMsg -replace "`n",''
#$encMsg = $encMsg -replace '-----END CMS-----',''
return $encMsg