Download a CSV to a file containing the current timestamp - powershell

I am trying to download a CSV file from an API. I would like to save it in a folder with a timestamp in the filename to distinguish it from each of the previous files within the folder.
The script below is as far as I have got, but I am hitting a block: the file doesn't appear to be saving. Is there something fundamentally wrong with the script?
$client = new-object System.Net.WebClient
$client.DownloadFile("https://www.url.com/csvdownload"
((Get-Date)).ToString('MM-dd-yyyy_hh-mm-ss')) export-csv C:\Users\me\Documents\folder\sub-folder\test_output_file_((Get-Date)).csv

You probably don't need to use the Export-CSV cmdlet, since you are already downloading a CSV. Just format the filename as <timestamp>.csv and save it:
$fileName = '{0:MM-dd-yyyy_hh-mm-ss}.csv' -f (Get-Date)
$client = new-object System.Net.WebClient
$client.DownloadFile("https://www.url.com/csvdownload", $fileName)
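One thing to watch: WebClient.DownloadFile saves relative to the process's working directory unless you pass a full path. A minimal sketch that builds the full destination path, reusing the folder and filename prefix from the question (note that HH gives a 24-hour clock, while hh is 12-hour):
$folder = 'C:\Users\me\Documents\folder\sub-folder'    # destination folder from the question
$timestamp = '{0:MM-dd-yyyy_HH-mm-ss}' -f (Get-Date)   # HH = 24-hour clock (hh would be 12-hour)
$fileName = Join-Path $folder "test_output_file_$timestamp.csv"
$client = New-Object System.Net.WebClient
$client.DownloadFile('https://www.url.com/csvdownload', $fileName)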

Related

powershell download file preserving metadata

I am having difficulty figuring this one out. I am trying to download a file using powershell (via batch, so it must be on one line), but preserve the original creation date/time and modified date/time. The code I am currently using writes the date/time that the file was downloaded as the created & modified date.
(new-object System.Net.WebClient).DownloadFile('https://file-examples.com/wp-content/uploads/2017/11/file_example_MP3_700KB.mp3','%USERPROFILE%\Downloads\file_example_MP3_700KB.mp3')
I've been able to accomplish this with a download manager, but I would like to be able to get this done via powershell so I can schedule the download on system start-up. I've searched for code to suit my needs, but can't find any that fit the criteria.
I've found these, but I'm not sure how to incorporate them:
.TimeLastModified .LastModified .LastWriteTime .CreationTime
any help would be greatly appreciated.
Just use BITS; it copies the remote file time by default and will even draw a nice progress bar when running interactively.
Start-BitsTransfer -Source 'https://google.com/favicon.ico' -Destination 'c:\Temp\2.ico'
My previous answer for history:
# Request the file and keep the response so we can read its Last-Modified header
$request = [System.Net.WebRequest]::Create('https://google.com/favicon.ico')
$response = $request.GetResponse()
$stream = $response.GetResponseStream()
# Write the response stream to disk
$file = New-Object System.IO.FileInfo 'c:\Temp\1.ico'
$fileStream = $file.OpenWrite()
$stream.CopyTo($fileStream)
$stream.Close()
$fileStream.Close()
# Stamp the local file with the server-reported modification time
$file.CreationTime = $file.LastWriteTime = $response.LastModified
If the server does not report a file time, it will be set to the current time.
If you need a one-liner, combine the lines with ;.
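For example, the same code joined into a single line (nothing new, just the statements above separated by semicolons):
$request = [System.Net.WebRequest]::Create('https://google.com/favicon.ico'); $response = $request.GetResponse(); $stream = $response.GetResponseStream(); $file = New-Object System.IO.FileInfo 'c:\Temp\1.ico'; $fileStream = $file.OpenWrite(); $stream.CopyTo($fileStream); $stream.Close(); $fileStream.Close(); $file.CreationTime = $file.LastWriteTime = $response.LastModified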
In PowerShell 3+ you can use a simpler alternative:
$fileName = 'c:\Temp\1.ico'
$response = Invoke-WebRequest 'https://google.com/favicon.ico' -OutFile $fileName -PassThru
$file = Get-ChildItem $fileName
$file.CreationTime = $file.LastWriteTime = $response.BaseResponse.LastModified
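To confirm the timestamps took effect, a quick check afterwards:
Get-Item $fileName | Select-Object Name, CreationTime, LastWriteTime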

Why does my program not work? I need a program that checks the file's creation date and downloads only the new one

Why does my program not work? I need a program that checks the file's creation date and downloads only the new one. Here is what I have so far:
$url = "http://www.automo.com/prices.php?uid=0d85f09deeabc9f7473512ee368ed321&opt=Acura&type=csv"
$output = "c:\download\price.csv"
$start_time = Get-Date
Invoke-WebRequest -Uri $url -OutFile $output
Write-Output "Time taken: $((Get-Date).Subtract($start_time).Seconds) second(s)"
I think you cannot use foreach on your $RemotePath, since it's only a string and foreach requires an array. Try changing your first line to:
$RemotePath = @("http://www.automo.ru/prices.php?uid=0d85f09deeabc9f7473512ee368ed321&opt=Acura&type=csv","http://www.automo.ru/prices.php?uid=0d85f09deeabc9f7473512ee368ed456&opt=Acura&type=csv")
(don't forget to wrap each path in its own quotation marks)
And then, I'm not sure that Get-ChildItem can read files on remote websites.
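Since Get-ChildItem cannot list files on a web server, one way to approximate "download only new files" is to compare the server's Last-Modified header with the local copy before downloading. This is only a sketch: it assumes Windows PowerShell 5.1 (where the header comes back as a single string) and a server that actually sends Last-Modified, which a dynamic prices.php endpoint may not.
$RemotePath = @(
    "http://www.automo.ru/prices.php?uid=0d85f09deeabc9f7473512ee368ed321&opt=Acura&type=csv",
    "http://www.automo.ru/prices.php?uid=0d85f09deeabc9f7473512ee368ed456&opt=Acura&type=csv"
)
$i = 0
foreach ($url in $RemotePath) {
    $output = "c:\download\price$i.csv"
    # HEAD request reads the headers without downloading the body
    $head = Invoke-WebRequest -Uri $url -Method Head
    $remoteTime = [datetime]$head.Headers['Last-Modified']
    # Download only if we have no local copy yet, or the remote file is newer
    if (-not (Test-Path $output) -or (Get-Item $output).LastWriteTime -lt $remoteTime) {
        Invoke-WebRequest -Uri $url -OutFile $output
        (Get-Item $output).LastWriteTime = $remoteTime   # remember what we already have
    }
    $i++
}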

PowerShell SFTP Download without Writing Files to Disk

I am trying to use PowerShell to sync payroll files stored on SFTP to SharePoint. I have most of the code written; the only thing I can't figure out is whether there is a way to avoid temporarily downloading the file to disk. Given the sensitivity of these files, I would rather hold the file in a variable, not unlike how Get-Content works, so no one on the Jenkins slave would be able to see its content or undelete temp files.
Here is my working code that does download the file:
$session = New-Object WinSCP.Session
$session.Open($sessionOptions)
# Take the first regular (non-directory) file in the SFTP root
$file = $session.ListDirectory("/") | select -ExpandProperty Files | ? {$_.FileType -ne "D"} | select -First 1
$session.GetFiles($file, "c:\temp\$($file.Name)", $False, $transferOptions)
$session.Close()
Is there something I can use in place of the second parameter of WinSCP.Session.GetFiles [C:\temp\$($file.name)] that would allow me to drop the file directly into memory, so I can turn around and dump it to SharePoint?
If you were wondering how I would then get it into SharePoint, I have used this with Get-Content without issue:
$fileToUpload = get-content c:\temp\test.csv -Encoding Byte
$FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
$FileCreationInfo.Overwrite = $true
$FileCreationInfo.Content = $fileToUpload
$FileCreationInfo.Url = "test.csv"
$Upload = $Folder.Files.Add($FileCreationInfo)
$Ctx.Load($Upload)
$Ctx.ExecuteQuery()
WinSCP simply doesn't do it. I had been hoping for a downstream object to take the place of a file path, but that does not seem to be possible. However, I did figure this out. Moving to the Posh-SSH module, I was able to use the Get-SFTPContent command, which allows me to read the file into memory.
install-module posh-ssh
import-module posh-ssh
$Session = New-SFTPSession -ComputerName $SFTPHostName -Credential $SFTPcredential
Get-SFTPContent -SessionId $session.SessionId -Path $file.FullName -Encoding unicode -ContentType byte
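Capturing that output gives you the byte array the SharePoint snippet from the question needs (variable names assumed from the earlier code):
$fileToUpload = Get-SFTPContent -SessionId $Session.SessionId -Path $file.FullName -Encoding unicode -ContentType byte
$FileCreationInfo.Content = $fileToUpload   # bytes go straight to SharePoint; no temp file on disk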
Streaming the contents of a remote file has been supported since WinSCP 5.18 beta, using the Session.GetFile method:
$stream = $session.GetFile($file)
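From there you can copy the stream into memory and hand the bytes to the same SharePoint upload code. A minimal sketch, assuming the WinSCP $session and the CSOM objects from the question:
$stream = $session.GetFile($file.FullName)
$memory = New-Object System.IO.MemoryStream
$stream.CopyTo($memory)                          # contents now live only in memory
$stream.Dispose()
$FileCreationInfo.Content = $memory.ToArray()    # byte[] for $Folder.Files.Add
$memory.Dispose()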

Jenkins: Convert .docx to .pdf issues

We're trying to use Jenkins to convert a set of "Working Documents" into "Release Documents" when it builds our project. This involves taking the .docx files and saving them as .pdf files, which we accomplish with the following PowerShell script:
$documents_path = "E:\Documentation\Working Documents\"
$word_app = New-Object -ComObject Word.Application
echo $word_app
# This filter will find .doc as well as .docx documents
$files = Get-ChildItem -Path $documents_path -Filter *.doc?
ForEach ($file in $files) {
    echo "Converting Document to PDF: $($file.FullName)"
    $document = $word_app.Documents.Open($file.FullName)
    $pdf_filename = "$($file.DirectoryName)\..\Release Documents\$($file.BaseName).pdf"
    $document.SaveAs([ref] $pdf_filename, [ref] 17)
    $document.Close()
}
$word_app.Quit()
Now, that script works 100% the way we expect when I log in to the Jenkins PC and run it myself in PowerShell. However, when Jenkins tries running it, we get "You cannot call a method on a null-valued expression" at $document.SaveAs and $document.Close.
I assume this is because the user Jenkins runs as, SYSTEM, does not have permission to access the .docx files, or can't find the Word installation, or something of that nature, but I can't think of how I should try to debug it further than this. Any tips are very much appreciated!
I had the same problem and found a simple workaround.
Create an empty directory "Desktop" in
C:\Windows\SysWOW64\config\systemprofile\
(On 64-bit Windows with 64-bit Office, Word may look under C:\Windows\System32\config\systemprofile\ instead, so it can't hurt to create the folder in both locations.)
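If you'd rather script that one-time setup (the second path is the 64-bit variant mentioned above):
New-Item -ItemType Directory -Force -Path 'C:\Windows\SysWOW64\config\systemprofile\Desktop'
New-Item -ItemType Directory -Force -Path 'C:\Windows\System32\config\systemprofile\Desktop'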

How to copy file for one time a day - elegant solution

I have a remote server where one file will be uploaded per day. I don't know when the file will be uploaded. I need to COPY this file to another server for processing, and I need to do this just once per file (once a day). When the file is uploaded to the remote server, I need to copy it within an hour, so I have to run this script at least once per hour. I'm using this script:
# Get yesterday's date
$date = (Get-Date).AddDays(-1) | Get-Date -Format yyyyMMdd
$check = ""
$check = Get-Content c:\checkiftransfered.txt
# Test if file checkiftransfered.txt contains True or False. If it contains True, the file for this day was already copied
if ($check -ne "True") {
    # Test if the file exists - it has a specific name and yesterday's date
    if (Test-Path \\remoteserver\folder\abc_$date.xls) {
        Copy-Item \\remoteserver\folder\abc_$date.xls \\remoteserver2\folder\abc_$date.xls
        # Write down the information that the file was already copied
        "True" | Out-File c:\checkiftransfered.txt
    } else { Write-Host "File has not been uploaded." }
} else { Write-Host "File has been copied." }
# + I will need another script that will delete the checkiftransfered.txt at 0:00
It will work fine, I think, but I'm looking for a more elegant solution - the best way to solve it. Thank you.
In PowerShell V3, Test-Path has handy -NewerThan and -OlderThan parameters, so you could simplify to this:
$yesterday = (Get-Date).AddDays(-1)
$date = $yesterday | Get-Date -Format yyyyMMdd
$path = "\\remoteserver\folder\abc_$date.xls"
if (Test-Path $path -NewerThan $yesterday)
{
    Copy-Item $path \\remoteserver2\folder\abc_$date.xls -Verbose
    (Get-Item $path).LastWriteTime = $yesterday
}
This eliminates the need to track the copy status in a separate file by using the LastWriteTime. One note about -NewerThan and -OlderThan: don't use them together; it doesn't work as expected.
And lest we forget about some great native tools, here's a solution using robocopy:
robocopy $srcdir $destdir /maxage:1 /mot:60
The /mot:n option will cause robocopy to continuously monitor the source dir - every 60 minutes as specified above.
There is a much, much easier and more reliable way. You can use the FileSystemWatcher class.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'C:\Uploads'
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
$created = Register-ObjectEvent $watcher "Created" -Action {
    Sleep (30*60)
    Copy-Item $($eventArgs.FullPath) '\\remoteserver2\folder\'
}
So let's take a look at what we're doing here: we create a new watcher and tell it to watch C:\Uploads. When a new file is uploaded there, the file system sends a notification through the framework to our program, which in turn fires the Created event. When that happens, we tell our program to sleep for 30 minutes to allow the upload to finish (that may be too long, depending on the size of the upload), and then we call Copy-Item on the event arguments, which contain the full path to our new file.
By the way, you would need to paste this into a PowerShell window and leave it open on the server; alternatively, you could use the ISE and leave that open. Either way, it is far more reliable than what you currently have.
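When you no longer need it, you can stop watching by unregistering the event subscription (using the $created variable from the snippet above):
Unregister-Event -SourceIdentifier $created.Name
$watcher.Dispose()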