PowerShell download and save wav file from API

I am trying to download WAV files via a web API using PowerShell. The request succeeds, but when I write the content to a file using Out-File, it will not play in Windows Media Player. It looks like this (opened in a text editor):
Whereas a sample WAV file that does play looks like this in a text editor:
I am guessing it is some sort of encoding problem, but I don't know how to output to a WAV file with the proper encoding in PowerShell...
Here is my script, I am currently saving out a substring of "RawContent" to skip all the HTTP headers:
(Invoke-WebRequest -URI 'https://developer.fuze.com/api/v1/call-recordings/recording-id/media').RawContent.Substring(488) | Out-File c:\stuff\recording.wav

I downloaded a file in raw format. Then I converted it to wav using ffmpeg.
This is how I solved this problem:
# $client is assumed to be a System.Net.Http.HttpClient and $con the HTTP content for the
# TTS request; neither is shown in the original snippet.
$response = $client.PostAsync("https://tts.api.cloud.yandex.net/speech/v1/tts:synthesize", $con)
$responseBytes = $response.Result.Content.ReadAsByteArrayAsync()
# Save the raw audio bytes, convert them to WAV with ffmpeg, play the result, then clean up.
[io.file]::WriteAllBytes('.\test.raw', $responseBytes.Result)
.\ffmpeg\bin\ffmpeg.exe -i test.raw file.wav 2> $null
$Player = New-Object System.Media.SoundPlayer '.\file.wav'
$Player.Play()
Remove-Item .\test.raw
Remove-Item .\file.wav
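For the endpoint in the original question, which presumably already returns complete WAV data, a simpler route is to let Invoke-WebRequest write the raw bytes to disk instead of piping text through Out-File. A minimal sketch, under the untested assumption that the API returns a ready-to-play WAV:
# -OutFile writes the response body as-is, with no text re-encoding.
Invoke-WebRequest -Uri 'https://developer.fuze.com/api/v1/call-recordings/recording-id/media' -OutFile 'C:\stuff\recording.wav'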


Find the last modified file on FTP site using powershell [duplicate]

I am working on a PowerShell script, which will pull files from an FTP site. The files are uploaded to the FTP site every hour so I need to download the most recent one. The code I currently have downloads all the files from today instead of just one file. How do I make it download only the most recent file?
Here is the code that I am currently using
$ftpPath = 'ftp://***.***.*.*'
$ftpUser = '******'
$ftpPass = '******'
$localPath = 'C:\Temp'
$Date = get-date -Format "ddMMyyyy"
$Files = 'File1', 'File2'
function Get-FtpDir ($url, $credentials)
{
$request = [Net.FtpWebRequest]::Create($url)
if ($credentials) { $request.Credentials = $credentials }
$request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
(New-Object IO.StreamReader $request.GetResponse().GetResponseStream()).ReadToEnd() -split "`r`n"
}
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($ftpUser,$ftpPass)
$webclient.BaseAddress = $ftpPath
Foreach ( $item in $Files )
{
Get-FTPDir $ftpPath $webclient.Credentials |
? { $_ -Like $item+$Date+'*' } |
% {
$webClient.DownloadFile($_, (Join-Path $localPath $_))
}
}
It's not easy with the FtpWebRequest. For your task, you need to know file timestamps.
Unfortunately, there's no really reliable and efficient way to retrieve timestamps using the features offered by FtpWebRequest/.NET Framework/PowerShell, as they do not support the FTP MLSD command. The MLSD command provides a listing of a remote directory in a standardized machine-readable format. The command and the format are standardized by RFC 3659.
Alternatives you can use that are supported by the .NET Framework:
ListDirectoryDetails method (the FTP LIST command) to retrieve details of all files in a directory and then deal with the FTP server-specific format of the details (a *nix format similar to the output of the *nix ls command is the most common; a drawback is that the format may change over time, as for newer files a "May 8 17:48" format is used and for older files an "Oct 18 2009" format is used)
GetDateTimestamp method (the FTP MDTM command) to individually retrieve timestamps for each file. The advantage is that the response is standardized by RFC 3659 to YYYYMMDDHHMMSS[.sss]. The disadvantage is that you have to send a separate request for each file, which can be quite inefficient; a sketch of this approach follows.
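A minimal sketch of that MDTM approach using only FtpWebRequest (the host, credentials, and path below are placeholders, not values from the question):
# List the file names in the directory.
$credentials = New-Object System.Net.NetworkCredential('user', 'password')
$base = 'ftp://ftp.example.com/path/'
$listRequest = [System.Net.FtpWebRequest]::Create($base)
$listRequest.Credentials = $credentials
$listRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$reader = New-Object System.IO.StreamReader ($listRequest.GetResponse().GetResponseStream())
$names = $reader.ReadToEnd() -split "`r`n" | Where-Object { $_ }
# Ask the server for each file's timestamp and remember the newest one.
$latestName = $null
$latestTime = [DateTime]::MinValue
foreach ($name in $names)
{
    $mdtmRequest = [System.Net.FtpWebRequest]::Create($base + $name)
    $mdtmRequest.Credentials = $credentials
    $mdtmRequest.Method = [System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
    $timestamp = ([System.Net.FtpWebResponse]$mdtmRequest.GetResponse()).LastModified
    if ($timestamp -gt $latestTime) { $latestTime = $timestamp; $latestName = $name }
}
# $latestName now holds the most recently modified file; download it, e.g. with WebClient.DownloadFile.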
Some references:
C# class to parse WebRequestMethods.Ftp.ListDirectoryDetails FTP response
Parsing FtpWebRequest ListDirectoryDetails line
Retrieving creation date of file (FTP)
Alternatively, use a 3rd party FTP library that supports the MLSD command, and/or supports parsing of the proprietary listing format.
For example, the WinSCP .NET assembly supports both.
An example code:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Ftp
HostName = "example.com"
UserName = "user"
Password = "mypassword"
}
$session = New-Object WinSCP.Session
# Connect
$session.Open($sessionOptions)
# Get list of files in the directory
$directoryInfo = $session.ListDirectory($remotePath)
# Select the most recent file
$latest =
$directoryInfo.Files |
Where-Object { -Not $_.IsDirectory } |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
# Any file at all?
if ($latest -eq $Null)
{
Write-Host "No file found"
exit 1
}
# Download the selected file
$sourcePath = [WinSCP.RemotePath]::EscapeFileMask($remotePath + $latest.Name)
$session.GetFiles($sourcePath, $localPath).Check()
For a full code, see Downloading the most recent file (PowerShell).
(I'm the author of WinSCP)
I tried this, but I get an error:
Error: Exception calling "ListDirectory" with "1" argument(s): "Error listing directory '/path/'.
Could not retrieve directory listing
Can't open data connection for transfer of "/path/"
I read a lot about this problem on the internet but could not find a solution that seemed fairly simple, and I am not a network setup wizard, so I chose a different approach. In our case, the filename of the file I want to automate the download for has the date specified in it: backup_2018_08_03_020003_1048387.bak
So we can get the file by using mget *2018_08_03* in a command-line ftp session.
Our backup procedure runs every morning at 01.00 AM, so we have a backup each day that we can fetch.
Of course it would have been prettier to have a script that fetched the latest backup file based on the backup file timestamps, in case something went wrong with the latest backup or the naming format changes. The script only fetches the backup for internal development purposes, so it's not a big deal if it breaks. I will look into this later and check whether I can make a cleaner solution.
I made a batch script which simply asks for today's backup file using ordinary ftp command-prompt scripting.
It is important to get the formatting of today's date right: it must match the formatting of the date in the filename.
If you want to use the script you should replace the variables with your own information. You should also have write access to the directory where you run it from.
This is the script that I made:
@Echo Off
Set _FTPServerName=xxx.xxx.xx.xxx
Set _UserName=Username
Set _Password=Password
Set _LocalFolder=C:\Temp
Set _RemoteFolder="/path/"
Set _Filename=*%date:~-4,4%_%date:~-7,2%_%date:~-10,2%*
Set _ScriptFile=ftptempscript
:: Create script
>"%_ScriptFile%" Echo open %_FTPServerName%
>>"%_ScriptFile%" Echo %_UserName%
>>"%_ScriptFile%" Echo %_Password%
>>"%_ScriptFile%" Echo lcd %_LocalFolder%
>>"%_ScriptFile%" Echo cd %_RemoteFolder%
>>"%_ScriptFile%" Echo binary
>>"%_ScriptFile%" Echo mget -i %_Filename%
>>"%_ScriptFile%" Echo quit
:: Run script
ftp -s:"%_ScriptFile%"
del "%_ScriptFile%"

Expand-Archive odd errors

I am trying to get at some data in an Autodesk Revit file, which is just a ZIP under the skin. I can use 7zip to extract but I am hoping to automate things with all native PS or Windows. I tried Expand-Archive after I renamed the RVT file to ZIP, but Expand-Archive has an odd error. The code is
Expand-Archive -path:'C:\RevitVersionTest\22-PLUMB-CLR-RECTANGULAR.zip' -destinationPath:'C:\Revit Fam'
And the error is
New-Object : Exception calling ".ctor" with "3" argument(s): "End of Central Directory record could not be found."
The file is corrupt. Re-download (or obtain) the ZIP file.
Background: I landed here having experienced the same error downloading a ZIP from Google Drive through a private link:
Invoke-WebRequest -Uri $zipFile -OutFile "$destPath\myZip.zip"
...then using the command:
Expand-Archive c:\a.zip -DestinationPath c:\a
The file downloaded, but it wouldn't extract. I then downloaded it through the browser UI and compared the file sizes. Sure enough, the scripted download was corrupted. When I updated the URL to a Google Docs link pointing directly to the file (public, with permission), the ZIP downloaded and extracted correctly.
Hence, comments under the question alluding to the file being corrupt are correct.
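A quick way to confirm this kind of corruption is to compare the scripted download against a copy saved manually through the browser; a small sketch (both paths are placeholders):
# Compare sizes and hashes of the scripted download vs. a known-good browser download.
Get-Item 'C:\downloads\scripted.zip', 'C:\downloads\browser.zip' | Select-Object Name, Length
Get-FileHash 'C:\downloads\scripted.zip', 'C:\downloads\browser.zip' -Algorithm SHA256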

Powershell download to specific folder without knowing file name

I'm creating a Powershell script to download the latest version of a bunch of utilities. For some I know the URL and file name so this works:
$url = "https://download.sysinternals.com/files/SysinternalsSuite.zip"
$file = "SysinternalsSuite.zip"
$webclient.DownloadFile("$url","$storageDir\$file")
For some I only know the URL. For these, I only know how to open the URL in the browser, and the file is downloaded to the browser's download location.
$url = "https://toolslib.net/downloads/finish/1-adwcleaner/"
Start-Process "chrome.exe" "$url"
Is there any way to find the name of the file that was downloaded so it can be moved to $storageDir?
Alternatively, is there a way to temporarily change a browser's default download location from inside the script and change it back at the end of the script? I'm not locked to any specific browser.
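One possible direction, sketched here as an assumption rather than a tested answer: skip the browser entirely and take the file name the server suggests in its Content-Disposition header, falling back to the last URL segment when that header is absent.
# Hypothetical sketch; the URL and storage directory are placeholders from the question.
$url        = 'https://toolslib.net/downloads/finish/1-adwcleaner/'
$storageDir = 'C:\Tools'
$response    = Invoke-WebRequest -Uri $url
$disposition = $response.Headers['Content-Disposition']
if ($disposition -and ($disposition -match 'filename="?([^";]+)"?')) {
    $file = $Matches[1]                                   # name suggested by the server
} else {
    $file = Split-Path -Leaf ([Uri]$url).AbsolutePath     # fall back to the URL's last segment
}
[System.IO.File]::WriteAllBytes((Join-Path $storageDir $file), $response.RawContentStream.ToArray())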

Calling a powershell script from another powershell script and guaranteeing it is UTF8

I assembled a Powershell script that is designed to grab other scripts that are hosted on Azure blobs, and execute them.
The relevant code blocks:
Obtaining the script:
$resp = (Invoke-WebRequest -Uri $scriptUri -Method GET -ContentType "application/octet-stream;charset=utf-8")
$migrationScript = [system.Text.Encoding]::UTF8.GetString($resp.RawContentStream.ToArray());
$tempPath = Get-ScriptDirectory
$fileLocation = CreateTempFile $tempPath "migrationScript.ps1" $migrationScript
Creating the file:
$newFile = "$tempFolder"+"\"+"$fileName"
Write-Host "Creating temporary file $newFile"
[System.IO.File]::WriteAllText($newFile, $fileContents)
And then I invoke the downloaded file with
Invoke-Expression "& `"$fileLocation`" $migrationArgs"
This works well for what I need. However, Invoke-Expression is not correctly reading the encoding of the file. It opens correctly in Notepad or Notepad++, but not in ISE (where I am executing the script right now).
Is there a way I can ensure the script is read correctly? It is necessary to support UTF8, as there is a possibility that the scripts will need to perform operations such as setting an AppSetting to a value that contains special characters.
EDIT: Behaviour is the same on "vanilla" non-ISE Powershell invocation.
As per @lit and @PetSerAI, the BOM is required for Powershell to work correctly.
My first attempt had not been successful, so I switched back to non-BOM, but with the following steps it worked:
Perform the Invoke-WebRequest with -ContentType "application/octet-stream;charset=utf-8"
Grab the raw content (you will see it in Powershell as a series of numbers, which I assume are the byte values) and convert its bytes with [system.Text.Encoding]::UTF8.GetString($resp.RawContentStream.ToArray()); to a string containing the characters you want.
When saving the file via .NET's WriteAllText, ensure you use UTF8:
[System.IO.File]::WriteAllText($newFile, $fileContents, [System.Text.Encoding]::UTF8). In this case, UTF8 is understood to be UTF8 with a byte order mark, which is what Powershell needs.
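Put together, the steps above look roughly like this (a sketch only; $scriptUri, $migrationArgs, and the temp path are placeholders from the surrounding snippets):
# Download the script as raw bytes, decode them as UTF-8, re-save with a BOM, then invoke.
$resp   = Invoke-WebRequest -Uri $scriptUri -Method GET -ContentType "application/octet-stream;charset=utf-8"
$script = [System.Text.Encoding]::UTF8.GetString($resp.RawContentStream.ToArray())
$fileLocation = Join-Path $env:TEMP "migrationScript.ps1"
[System.IO.File]::WriteAllText($fileLocation, $script, [System.Text.Encoding]::UTF8)   # UTF-8 with BOM
Invoke-Expression "& `"$fileLocation`" $migrationArgs"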

Trying to download a zip file from a weblink with powershell

OK, I am trying to use PowerShell to download a file from a web link that we use. I am downloading a zip file where the beginning of the name is always the same, but the middle part changes based on the version number of the zip. I have been able to get the file to download when I use the fully qualified web address and have the file name hard-coded into the script. I have tried every variation of wildcards to get the most common version of the zip, but it errors out saying that it can't find the file on their server. This is the code that I have already, and any help would be greatly appreciated since I feel like I am at a wall with it.
$url = 'http://blah/blah/blah/My File Name 11.1111.11.zip'
$localFileName = 'C:\temp\MYzip.zip'
Invoke-WebRequest $url -UseDefaultCredentials -OutFile $localFileName
If the site has directory browsing enabled (unlikely unless you have control of the site and can turn it on), you can do this:
$url = 'http://blah/blah/blah/'
$wr = iwr $url
$filename = $wr.Links.href | Where {$_ -match 'My File Name.*?\.zip'}
$wr = iwr "$url/$filename"
If the site doesn't have directory browsing enabled, then surely it has a page with a link to the ZIP file on it. Download that page and use the same $wr.Links.href trick to get all the links and look for the one that matches "My File Name.*?\.zip".
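For that second case, a rough sketch of the page-scraping approach (the page URL and name pattern are placeholders; the matched href may be relative, so it is resolved against the page URI):
# Scrape a download page for the first link whose href matches the file-name pattern, then save it.
$pageUrl = 'http://blah/blah/downloads.html'
$page    = Invoke-WebRequest -Uri $pageUrl -UseDefaultCredentials
$href    = $page.Links.href | Where-Object { $_ -match 'My File Name.*?\.zip' } | Select-Object -First 1
$fileUrl = [Uri]::new([Uri]$pageUrl, $href)     # resolve a relative link against the page URL
Invoke-WebRequest -Uri $fileUrl -UseDefaultCredentials -OutFile 'C:\temp\MYzip.zip'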