I have an API which returns a file as byte[].
I am trying to download this file onto a local machine using PowerShell (needs to be PowerShell for other automation reasons).
I am using WriteAllBytes; however, it throws an error for files larger than roughly 100 MB (the exact threshold might be different).
Are there any other ways to download these files and convert byte[] into an actual file?
Here is what I have at the moment:
$fileInfo = New-Object ($namespace + ".fileInfoRequest")
$fileInfo.Filename = "$($File)"
$fileInfo.Hash = "e0d123e5f316bef78bfdf5a008837577" #random hash so ignore this.
$FileDetails = $WebService.GetFileInfo($fileInfo)
if ($FileDetails.Exists -eq "True") {
    [IO.File]::WriteAllBytes("$($InstallPath)\$($File)", $WebService.GetFileData($FileDetails))
} else {
    Write-Host -ForegroundColor Red "File $($File.FileName) could not be found in the system"
}
$WebService.GetFileData($FileDetails) returns the file data as byte[], so this is the call whose result I need to handle differently.
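For context, a stream-based alternative to WriteAllBytes would look like the sketch below, built from the objects in the snippet above; whether it helps depends on whether the OutOfMemory is raised during the write or inside the GetFileData call itself (in the latter case the WSMan memory setting mentioned in the answer below is the more likely fix).

if ($FileDetails.Exists -eq "True") {
    $bytes  = $WebService.GetFileData($FileDetails)
    $stream = [System.IO.File]::Create("$($InstallPath)\$($File)")
    try {
        # write the buffer in 1 MB chunks rather than one large write
        $chunkSize = 1MB
        for ($offset = 0; $offset -lt $bytes.Length; $offset += $chunkSize) {
            $count = [Math]::Min($chunkSize, $bytes.Length - $offset)
            $stream.Write($bytes, $offset, $count)
        }
    }
    finally {
        $stream.Close()
    }
}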
I faced the same error message just this morning.
Oddly, in my case the problem was triggered only when using a remote PowerShell session, so I can see the parallel with an API call, which also passes through the network.
The same command run from a "standard" PowerShell session opened directly on the server console did not raise the error.
I was able to avoid it by running the following in an admin PowerShell session on the server console:
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 2048
After that, the remote PowerShell sessions stopped throwing OutOfMemory.
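For reference, here is how you could check the current quota and apply the change from an elevated session on the server; restarting WinRM afterwards is my assumption about how to make new remote sessions pick the value up immediately.

# check the current per-shell memory quota
Get-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB
# raise it to 2 GB
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 2048
# restart WinRM so new remote sessions use the new quota (assumption)
Restart-Service WinRM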
Currently, I run the following command to fetch the files to my local system.
Get-SCPFile `
    -ComputerName $server `
    -Credential $credential `
    -RemoteFile ($origin + $target + ".csv") `
    -LocalFile ($destination + $target + ".csv")
It works as I'd like (although it sucks that I can't copy multiple files by regex and/or wildcard). However, after the operation has been carried out, I'd like to move the remote files to another directory on the remote server so instead of residing in $origin at $server, I want them to be placed in $origin + "/done" at the same server. Today, I have to use PuTTY for that but it would be so much more convenient to do that from PS.
Googling gave me a lot of material but I couldn't make it work. At the moment, I'm not sure if I'm specifying the path incorrectly somehow or if it's simply not possible to use the plain commands when working against an external, secured Unix server.
For copying files, I can't use Copy-Item, hence the function Get-SCPFile. I can imagine that remote moving, renaming and listing of items isn't possible either, for the same reason (whatever that reason is).
This example as well as this one produce the error "cannot find path", despite the same value being used successfully for copying the file with the script at the top. I'm pretty sure it's a misleading error message (not being entirely sure, though).
$file = "\\" + $server + "" + $origin + "" + $target + ".csv"
# \\L234231.vds.afm.se/var/trans/ut/drish/sxx/meta001.csv
Remove-Item $file -force
Many answers (like this) are very simple, which supports my theory that the combination of Unix and a secured connection raises an extra challenge. Perhaps I'm wording the question insufficiently well.
There are also more advanced examples, still not working, just hanging the window with no error messages. I feel my competence prevents me from estimating how badly screwed up this approach is.
In PowerShell you can create a PowerShell session (PSSession) from your system remotely on another system (and into another session on your own system, but those are details...) and execute your commands there.
You can create a PSSession with New-PSSession, but a lot of cmdlets have a -ComputerName parameter (or something similar) so that they can be executed remotely without creating a PSSession first.
A PSSession can be used with Enter-PSSession to get an interactive Session or with Invoke-Command to execute a ScriptBlock. That way you could test your Remove-Item command directly on the target server. Depending on the setup you might need to use Linux syntax within the remote session.
Here is some more info on about_PSSessions and on using PSSessions with SSH to connect to Linux.
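As a sketch of that last point, assuming PowerShell 6+ with SSH remoting available on your machine and PowerShell installed on the Unix server (the variables are the ones from your question):

# open an SSH-based remote session to the Unix server
$session = New-PSSession -HostName $server -UserName $credential.UserName -SSHTransport
# move the processed file into the done directory on the remote side
Invoke-Command -Session $session -ScriptBlock {
    param($origin, $target)
    Move-Item -Path "$origin/$target.csv" -Destination "$origin/done/$target.csv"
} -ArgumentList $origin, $target
Remove-PSSession $session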
I wrote a small PowerShell script that I am using to query the server log, clean the return values and use some of the results to perform some server maintenance. However, when I schedule it, the save-to-file piece does not write the whole content to the file; the output gets truncated, exactly as shown below. As you can observe, the end of the line is cut off and three dots replace the missing values:
Login failed for user 'sa'. Reason: An error occurred while evaluating the password. [CLIENT: 2...
However, if I run the code manually with local admin access, the content gets saved to the local file in full, exactly like this:
Login failed for user 'sa'. Reason: An error occurred while evaluating the password. [CLIENT: 112.103.198.2]
Why is this the case when I run the process or PS file as a scheduled task? BTW, I tried running it under the SYSTEM context with full/highest privileges, and even scheduled it with the same admin account that I use to run it manually, and I still do not get the full content of the event that I save.
This is creating an issue and I am not able to use the content to process the IP.
Here is the PS code that I am using to query and save the content to file:
$SQL = 'C:\SQL.txt'
Remove-Item $SQL -ErrorAction Ignore
Get-EventLog -LogName Application | Where-Object {$_.EventID -eq 18456} |
Select-Object -Property Message | Out-File $SQL
The problem lies with Out-File, because it applies a default character limit of 80 per line and truncates anything longer.
You can change this with the -Width parameter and give it a value of, say, 200. Set-Content, however, doesn't have this limit built in, so it might be a more suitable option.
All that being said, I am not sure why it behaves one way when run manually and another way when the system runs it.
Out-File defaults to Unicode when writing files.
Set-Content defaults to ASCII when writing files.
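For example, either of these variants of the snippet in the question should avoid the truncation; the width of 500 is just an arbitrary value larger than the longest message, and the Set-Content variant expands the Message property to plain strings, since Set-Content writes strings rather than formatted objects.

$SQL = 'C:\SQL.txt'
Remove-Item $SQL -ErrorAction Ignore

# Option 1: widen Out-File's line limit
Get-EventLog -LogName Application | Where-Object {$_.EventID -eq 18456} |
    Select-Object -Property Message | Out-File $SQL -Width 500

# Option 2: use Set-Content, which does not apply a line width
Get-EventLog -LogName Application | Where-Object {$_.EventID -eq 18456} |
    Select-Object -ExpandProperty Message | Set-Content $SQL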
I copy the contents of an S3 bucket to a local directory; however, I get an error output from PowerShell.
Copy-S3Object : The requested range is not satisfiable
It is pointing to this command:
Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -Region $region
Why do I get this error? Note that the files that need to be copied do indeed get copied locally.
I can't say why you are getting that error returned from S3, but I can tell you that if you are copying multiple objects you probably want to use the -LocalFolder parameter, not -LocalFile. -LocalFolder will preserve the prefixes as subpaths.
When downloading one or more objects from S3, the Read-S3Object cmdlet works the same as Copy-S3Object, but uses -KeyPrefix to specify the common prefix the objects share, and -Folder to indicate the folder they should be downloaded to.
This also reminds me I need to check why we used -LocalFolder on Copy-, and -Folder on Read- although I suspect aliases may also be available to make them consistent.
HTH
(Edit): I spent some time this morning reviewing the cmdlet code and it doesn't appear to me the cmdlet would work as-is on a multi-object download, even though it has a -LocalFolder parameter. If you have a single object to download, then using -Key/-LocalFile is the correct parameter combination. If -LocalFolder is passed, the cmdlet sets up internally to do a single file download instead of treating -Key as a common key prefix to a set of objects. So, I think we have a bug here that I'm looking into.
In the meantime, I would use Read-S3Object to do your downloads. It supports both single (-Key) or multi-object download (-KeyPrefix) modes. https://docs.aws.amazon.com/powershell/latest/reference/index.html?page=Read-S3Object.html&tocid=Read-S3Object
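A sketch of what that would look like, reusing the variables from the original command; the local folder path is a placeholder:

# single object: equivalent of the original Copy-S3Object call
Read-S3Object -BucketName $bucket -Key $object.Key -File $localFilePath -Region $region

# multiple objects sharing a common prefix: download them all into a local folder
Read-S3Object -BucketName $bucket -KeyPrefix $object.Key -Folder 'C:\s3-downloads' -Region $region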
This seems to occur with folders (key prefixes) that do not contain files, since the copy wants to copy files.
I accepted this error and trapped it:
try {
    Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -Region $region
}
catch [Amazon.S3.AmazonS3Exception]
{
    # get error record
    [Management.Automation.ErrorRecord]$e = $_
    # retrieve information about runtime error
    $info = [PSCustomObject]@{
        Exception = $e.Exception.Message
        Reason    = $e.CategoryInfo.Reason
        Target    = $e.CategoryInfo.TargetName
        Script    = $e.InvocationInfo.ScriptName
        Line      = $e.InvocationInfo.ScriptLineNumber
        Column    = $e.InvocationInfo.OffsetInLine
        ErrorCode = $e.Exception.ErrorCode
    }
    if ($info.ErrorCode -eq "InvalidRange") {
        # do nothing
    } else {
        # output information. Post-process collected info, and log info (optional)
        Write-Host $info -ForegroundColor Red
    }
}
This happened to me when I tried to download a file which had more than one dot in it. Simplifying the file name fixed the error.
File name that gave me error: myfile-18.10.exe
File name that worked: myfile-1810.exe
I have a script that monitors the filesystem using IO.FileSystemWatcher in PowerShell.
Currently it finds the user that made the file with:
$owner = (Get-Acl $path).Owner
And it finds the computer that the file was made on with:
$Computer = get-content env:computername
But I'd also like to obtain what machine the file was created from. For instance, if a user is logged into a terminal server, I can see the file is made on the terminal server. But I want to know the host name of the local machine that made the file on the terminal server.
Is this possible? I've been searching the msdn PSEventArgs Class page without much success.
That information is not going to be stored in the file or its metadata, so no there's no straightforward way to get at it.
By the way, you can just use $env:computername directly as a variable; there's no need to use Get-Content.
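For example, the $Computer assignment from the question simplifies to:

$Computer = $env:COMPUTERNAME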
We have a program running on about 400 PCs (All W7). This program is called Wisa.
We receive regular updates for this program, named something like wisa_update1.0.exe, wisa_update1.1.exe, wisa_update2.0.exe, etc. The users cannot do the update themselves due to account restrictions.
We managed to do the update once and distribute the installer with Copy-Item to all PCs. Then with Enter-PSSession I can go to each PC and update the program with the following command:
wisa_update3.0 /verysilent
(with the argument /verysilent no questions are asked)
This is already a major gain in time, but I want to do the update more automatically.
I have a file "pc.txt" with all 400 PCs in it. I use this file already for the Copy-Item via Get-Content. Now I want to use this file to do the updates with the above command, but I can't find a good way to use a remote executable with a parameter in PowerShell.
What you want to do is load the list with Get-Content -Path $PClist and then run your script actions in a foreach loop. You'll want to adapt this example to your own script (a fuller sketch with the actual command filled in follows below):
$PClist = 'c:\pc.txt'
$aComputers = Get-Content -Path $PClist
foreach ($Computer in $aComputers)
{
    # code actions to perform
}
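As a concrete sketch, the body of that loop could run the installer remotely with Invoke-Command; the installer path below is an assumption about where you copied the update to on each PC.

$PClist = 'c:\pc.txt'
$aComputers = Get-Content -Path $PClist
foreach ($Computer in $aComputers)
{
    Invoke-Command -ComputerName $Computer -ScriptBlock {
        # run the already-copied installer silently on the remote machine
        & 'C:\Temp\wisa_update3.0.exe' /verysilent
    }
}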
Also, you can use multithreading and get it done in a fraction of the time (provided you have a good machine). The link below explains how to do it well.
http://www.get-blog.com/?p=22