PowerShell SFTP Download without Writing Files to Disk

I am trying to use PowerShell to sync payroll files stored on SFTP to SharePoint. I have most of the code written; the only thing I can't figure out is whether there is a way to avoid temporarily downloading the file to disk. Given the sensitivity of these files, I would rather hold the contents in a variable (much like Get-Content does), so no one on the Jenkins slave could see the content or undelete temp files.
Here is my working code that does download the file:
$session = New-Object WinSCP.Session
$session.Open($sessionOptions)
$file = $session.ListDirectory("/") | select -ExpandProperty files | ? {$_.FileType -ne "D"} | select -First 1
$session.GetFiles($file, "c:\temp\$($file.name)", $False, $transferOptions)
$session.Close()
Is there something I can use in place of the second parameter of WinSCP.Session.GetFiles (currently C:\temp\$($file.name)) that would let me drop the file directly into memory, so I can turn around and push it to SharePoint?
If you were wondering how I would then get it into SharePoint, I have used this with Get-Content without issue:
$fileToUpload = get-content c:\temp\test.csv -Encoding Byte
$FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
$FileCreationInfo.Overwrite = $true
$FileCreationInfo.Content = $fileToUpload
$FileCreationInfo.Overwrite = $true
$FileCreationInfo.Url = "test.csv"
$Upload = $Folder.Files.Add($FileCreationInfo)
$Ctx.Load($Upload)
$Ctx.ExecuteQuery()

WinSCP simply doesn't do it. I had been hoping for a downstream object that could take the place of a file path, but that does not seem to be possible. However, I did figure this out: moving to the Posh-SSH module, I was able to use the Get-SFTPContent command, which lets me read the file into memory.
Install-Module Posh-SSH
Import-Module Posh-SSH
$Session = New-SFTPSession -ComputerName $SFTPHostName -Credential $SFTPcredential
Get-SFTPContent -SessionId $session.SessionId -Path $file.FullName -Encoding unicode -ContentType byte
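Because Get-SFTPContent hands the file contents straight back to the pipeline, you can feed them to the SharePoint upload from the question without a temp file. A rough sketch, assuming $file, $Folder and $Ctx are already set up the same way as in the question's code:
# sketch: $file, $Folder and $Ctx are assumed to exist as in the question's upload code
$bytes = Get-SFTPContent -SessionId $Session.SessionId -Path $file.FullName -ContentType byte
$FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
$FileCreationInfo.Overwrite = $true
$FileCreationInfo.Content = $bytes
$FileCreationInfo.Url = $file.Name
$Upload = $Folder.Files.Add($FileCreationInfo)
$Ctx.Load($Upload)
$Ctx.ExecuteQuery()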

Streaming the contents of a remote file is supported since WinSCP 5.18 beta, using the Session.GetFile method.
$stream = $session.GetFile($file)
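The stream that Session.GetFile returns can then be copied into a MemoryStream to get a byte array, so the contents never touch the disk. A minimal sketch continuing from the line above (the SharePoint part works the same as in the question):
# sketch: $stream comes from $session.GetFile above (WinSCP 5.18 or newer)
$memory = New-Object System.IO.MemoryStream
$stream.CopyTo($memory)
$stream.Dispose()
$bytes = $memory.ToArray()   # assign this to $FileCreationInfo.Content for the upload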

Adding each item from for loop to csv file

I am trying to use PowerShell to determine whether a server has a particular patch installed, based on the KB number, and if not, append the server name to a CSV. My input file has system names, so I want to export the system name if the patch is not found.
Here is what I have so far. The export-to-CSV part does not seem to work.
ForEach-Object {
    try {
        $status = wmic /node:@sys.csv qfe list full /format:table | findstr /i $kb_number
        if (!$status) {
            $output_file = New-Item C:\temp\$kb_number.csv -ItemType File
            Export-Csv $output_file -Append -Force
        }
        else {
            Write-Output $status
        }
    }
    catch {
        $error_message = $_.Exception.Message
        #Write-Output "the error message is" $error_message
        Write-Output "Could not find any system with this patch installed."
    }
}
Why your code might be failing
We don't see where @sys.csv or $kb_number get their values in the code you shared; either of those could be throwing you off.
But the real issue is Export-Csv. First, you're creating a new CSV on every iteration of the loop. Second, you have to pass the cmdlet some object to export as CSV. Right now, these lines provide neither:
$output_file = New-Item C:\temp\$kb_number.csv -ItemType File
Export-csv -Path $output_file -append -Force
Export-Csv requires an input object. You're not giving it one now.
What do you want to export? If you just want a list of computers without a patch, do this instead.
$output_file = "C:\temp\$kb_number.csv"
if(-not (Test-Path $output_file)){
    #if file doesn't exist, make it
    New-Item $output_file -ItemType File | Out-Null
}
#adds computer name if it doesn't have the patch
Add-Content -Path $output_file -Value $computer
General Suggestions
Instead of using ForEach-Object, you might find it's easier to debug if you use a ForEach loop like this.
$computers = Get-Content C:\pathTo\Your\ComputerList.txt
ForEach($computer in $computers){
}
One additional source of trouble is that your code shells out to the older DOS-style tools (wmic and findstr) and then tries to use PowerShell to store the records. You don't need to do this, and you can make it easier on yourself by swapping the wmic call for Get-WmiObject or Get-CimInstance, the PowerShell-native equivalents.
To do that, replace this line:
wmic /node:@sys.csv qfe list full /format:table | findstr /i $kb_number
with:
$kb_number = "KB4576484"
Get-CimInstance Win32_QuickFixEngineering -Filter "HotfixID = '$kb_number'" -ComputerName $computer
Source  Description  HotFixID   InstalledBy          InstalledOn
------  -----------  --------   -----------          -----------
        Update       KB4576484  NT AUTHORITY\SYSTEM  9/14/2020 12:00:00 AM
You can store the output of that in a variable and then call Export-Csv on it and that should work.
When in doubt, remove the filter part and just get it working to export all patches to a csv. Then add complexity by adding back the filtering statements.
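Putting those pieces together, a rough end-to-end sketch could look like this (the KB number and file paths are placeholders, not values from the question):
$kb_number = "KB4576484"                                  # placeholder
$computers = Get-Content C:\pathTo\Your\ComputerList.txt  # placeholder path
$missing = foreach ($computer in $computers) {
    $patch = Get-CimInstance Win32_QuickFixEngineering -Filter "HotfixID = '$kb_number'" `
                 -ComputerName $computer -ErrorAction SilentlyContinue
    if (-not $patch) {
        # emit an object so Export-Csv has something to write
        [pscustomobject]@{ ComputerName = $computer; MissingPatch = $kb_number }
    }
}
$missing | Export-Csv "C:\temp\$kb_number.csv" -NoTypeInformation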

powershell download file preserving metadata

I am having difficulty figuring this one out. I am trying to download a file using PowerShell (via batch, so it must be on one line), but preserve the original creation date/time and modified date/time. The code I am currently using writes the date/time the file was downloaded as both the created and modified date.
(new-object System.Net.WebClient).DownloadFile('https://file-examples.com/wp-content/uploads/2017/11/file_example_MP3_700KB.mp3','%USERPROFILE%\Downloads\file_example_MP3_700KB.mp3')
I've been able to accomplish this with a download manager, but I would like to get it done via PowerShell so I can schedule the download at system start-up. I've searched for code to suit my needs, but can't find any that fits the criteria.
I've found these, but I'm not sure how to incorporate them:
.TimeLastModified .LastModified .LastWriteTime .CreationTime
Any help would be greatly appreciated.
Just use BITS; it copies the remote file time by default and will even draw a nice progress bar when running interactively.
Start-BitsTransfer -Source 'https://google.com/favicon.ico' -Destination 'c:\Temp\2.ico'
My previous answer for history:
$request = [System.Net.WebRequest]::Create('https://google.com/favicon.ico')
$response = $request.GetResponse()
$stream = $response.GetResponseStream()
$file = New-Object System.IO.FileInfo 'c:\Temp\1.ico'
$fileStream = $file.OpenWrite()
$stream.CopyTo($fileStream)
$stream.Close()
$fileStream.Close()
$file.CreationTime = $file.LastWriteTime = $response.LastModified
If the server does not report file time, it will be set to current time.
If you need a one-liner, join the lines with semicolons (;).
In PowerShell 3+ you can use a simpler alternative:
$fileName = 'c:\Temp\1.ico'
$response = Invoke-WebRequest 'https://google.com/favicon.ico' -OutFile $fileName -PassThru
$file = Get-ChildItem $fileName
$file.CreationTime = $file.LastWriteTime = $response.BaseResponse.LastModified
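Since the download has to be callable from batch on a single line, any of the snippets above can be wrapped in one powershell -Command invocation; here is a sketch using the BITS variant (the URL and destination are just the examples from the question). Note that BITS generally wants a logged-on user context, which may matter for a task scheduled at start-up:
powershell -NoProfile -Command "Start-BitsTransfer -Source 'https://file-examples.com/wp-content/uploads/2017/11/file_example_MP3_700KB.mp3' -Destination ($env:USERPROFILE + '\Downloads\file_example_MP3_700KB.mp3')"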

Powershell add header record

I have a process in SSIS where I create three files.
Header.txt
work.txt
Trailer.txt
Then I use an Execute Process Task to call my Powershell script. I basically need to take the work.txt file and prepend the header record to it (while maintaining integrity of original values in work.txt) and then append the trailer record (which is generated with total row counts, etc.).
Currently I have:
Set-Location "H:\Documentation\Projects\CVS\StageCVS"
Clear-Content "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility"
Get-Content Header.txt, work.txt, Trailer.txt|out-file "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility" -Confirm
This is fine in testing where I only had 1000 rows, but now that I have 67,000 rows the process takes forever.
I was looking at the Add-Content cmdlet but I can't find an example where it adds the header. Can someone assist with the syntax on going to the first line in the file and then adding the content before that first line?
many thanks in advance!
Just to clarify: I would like to build off the work.txt file. This is where the majority of the data already is, so instead of rewriting it all to a new file, I think a copy makes more sense. So in theory I would create all three files, copy the work file to, say, workfile.txt, prepend the header to workfile, append the trailer to workfile, and rename workfile.
UPDATE
This seems to work for the trailer.
Set-Location "H:\Documentation\Projects\CVS\StageCVS"
#Clear-Content "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility"
Copy-Item work.txt workfile.txt
#Get-Content Header.txt, work.txt, Trailer.txt|out-file "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility"
Add-Content workfile.txt -value (get-content Trailer.txt)
UPDATE
Also tried:
Set-Location "H:\Documentation\Projects\CVS\StageCVS"
$header = "H:\Documentation\Projects\CVS\StageCVS\Header.txt"
#Clear-Content "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility.txt"
Copy-Item work.txt workfile.txt
#(Get-Content Header.txt, work.txt, Trailer.txt -readcount 1000)|Set-Content "H:\Documentation\Projects\CVS\StageCVS\CVSMemberEligibility"
Add-Content workfile.txt -value (get-content Trailer.txt)
.\workfile.txt = $header + (gc workfile.txt)
This is something that seems so easy, but the reality is that it is not, due to the underlying filesystem. You are going to need a file buffer or a temp file, or if you are really brave you can look at extending the file and transposing the characters, as this C# article does:
Insert Text into Existing Files in C#, Without Temp Files or Memory Buffers
http://www.codeproject.com/Articles/17716/Insert-Text-into-Existing-Files-in-C-Without-Temp
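If you go the temp-file route, raw .NET streams keep the concatenation fast even for large files. Here is a rough sketch, assuming the same file names and folder used elsewhere in this thread:
# concatenate header + work + trailer through a temp file using .NET streams
# (.NET resolves relative paths against the process working directory, so set it explicitly)
[IO.Directory]::SetCurrentDirectory("H:\Documentation\Projects\CVS\StageCVS")
$tmp = [IO.Path]::GetTempFileName()
$out = [IO.File]::Create($tmp)
foreach ($part in 'Header.txt', 'work.txt', 'Trailer.txt') {
    $in = [IO.File]::OpenRead($part)
    $in.CopyTo($out)
    $in.Close()
}
$out.Close()
Move-Item $tmp 'H:\Documentation\Projects\CVS\StageCVS\workfile.txt' -Force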
So, as it turns out, Out-File and Get-Content are not very performant. I found it was taking over five minutes to write and read a 5,000-record result set.
When I researched different performance options for PowerShell, I found the .NET StreamWriter class. The same process ran in under 15 seconds.
Since my result set in the production environment would be 70,000-90,000 records, this is the approach I took.
Here is what I did:
[IO.Directory]::SetCurrentDirectory("H:\Documentation\Projects\CVS\StageCVS")
Set-Location "H:\Documentation\Projects\CVS\StageCVS"
# archive the previous output file, then start fresh
Copy-Item ".\ELIGFINAL.txt" "H:\Documentation\Projects\CVS\StageCVS\archive\ELIGFINAL$(Get-Date -f yyyyMMdd).txt"
Clear-Content "H:\Documentation\Projects\CVS\StageCVS\ELIGFINAL.txt"
# work on a copy of the data file and append the trailer to it
Copy-Item work.txt workfile.txt
Add-Content workfile.txt -Value (Get-Content Trailer.txt)
$output = "H:\Documentation\Projects\CVS\StageCVS\ELIGFINAL.txt"
$readerwork = [IO.File]::OpenText("H:\Documentation\Projects\CVS\StageCVS\workfile.txt")
$readerheader = [IO.File]::OpenText("H:\Documentation\Projects\CVS\StageCVS\Header.txt")
try
{
    $wStream = New-Object IO.FileStream $output, 'Append', 'Write', 'Read'
    $writer = New-Object System.IO.StreamWriter $wStream
    # header first, then the body (work file plus trailer)
    $writer.WriteLine($readerheader.ReadToEnd())
    $writer.Flush()
    $writer.WriteLine($readerwork.ReadToEnd())
}
finally
{
    $readerheader.Close()
    $readerwork.Close()
    $writer.Flush()
    $writer.Close()
}

Powershell: NTFS paths in file metadata with New-ItemProperty, Set-ItemProperty?

I'm interested in adding a property to my files under a certain scope that contains their current locations in my file system, in order to track file movement. I would think that this could be done with New-ItemProperty, with a command similar to the following:
Get-ChildItem -Recurse | foreach { New-ItemProperty -Path $_.FullName -Name "OriginalLocation" -PropertyType String -Value $_.FullName }
However, when I try this, I'm spammed with the following error:
New-ItemProperty : Cannot use interface. The IDynamicPropertyCmdletProvider interface is not implemented by this provider.
After some searching, it appears that New-ItemProperty is all but useless except for working with the registry. Fine. Windows has myriad other file properties I should be able to hijack in order to get this done. "Label" and "Tags" come to mind. So let's try setting those via Set-ItemProperty instead.
Set-ItemProperty : Property System.String Label=D:\test\file.txt does not exist.
It appears I need to create these properties after all. Is this a shortcoming of New-ItemProperty? Maybe setting properties such as this on arbitrary items is some WMI thing I don't know about?
Here is my solution using the redirection operators ('<' and '>') that let you manipulate alternate data streams in CMD.EXE. It works in PowerShell without any extensions.
# AlternateDataStream.ps1
$scriptBlockSetStream = {cmd /C `"echo $($Args[0])`>$($Args[1]):$($Args[2])`"}
$scriptBlockGetStream = {cmd /C `"more `<$($Args[0]):$($Args[1])`"}
$streamName = "NativeFilePath"
$File = "C:\Temp\ADSTest\toto.txt"
$streamContent = Split-Path -Path $File -Parent
# Set the data stream
Invoke-Command -ScriptBlock $scriptBlockSetStream -ArgumentList $streamContent,$File,$streamName
# Get the Data Stream
$res = Invoke-Command -ScriptBlock $scriptBlockGetStream -ArgumentList $File,$streamName
$res
Another option might be to use alternate data streams to store your path. If you are running PowerShell 3.0, you can manipulate them quite easily. Based on the first article, you would have something resembling:
"echo test" | out-file c:\powershell\test.ps1
$fs = new NTFS.FileStreams('c:\powershell\test.ps1')
$fs.add('OriginalPath')
$stream = $fs.Item('OriginalPath').open()
$sw = [System.IO.streamwriter]$stream
$sw.writeline('<path>')
$sw.close()
$stream.close()
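On PowerShell 3.0 and later you can also skip the external library entirely: Get-Content and Set-Content understand NTFS alternate data streams through the -Stream parameter. A minimal sketch (the path is just an example):
# write the original location into an alternate data stream, then read it back
$file = 'C:\Temp\ADSTest\toto.txt'
Set-Content -Path $file -Stream 'OriginalLocation' -Value (Split-Path -Path $file -Parent)
Get-Content -Path $file -Stream 'OriginalLocation'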

Powershell, ftp, get-childitem

I'm a little new to PowerShell. I am trying to find a Get-ChildItem-like command that will work against an FTP site.
Here is some pseudo-code:
$target = "c:\file.txt"
$username = "username"
$password = "password"
$ftp = "ftp://${username}:${password}@myftpsite"
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri($ftp)
#below is the code that does not work, get-childitem needs a local path
$name = get-childitem -path $ftp
The get-childitem only works with a local path. Does anyone know how I could access the filenames in this manner when on an ftp site?
Thanks
What you would need is a PowerShell provider for FTP if you wanted Get-ChildItem to work on a remote filesystem accessed by FTP. This forum post mentions work being done by Nick Howell on an FTP provider. Other than that, I haven't heard of any other FTP providers for PowerShell.