Is Start-BitsTransfer compatible with Pester? - powershell

When running Start-BitsTransfer in a script that I'm trying to apply Pester to, it doesn't seem to work.
I just tried downloading a random HTTP page (while preparing to play some Tropico 5 ;) ) to the Pester TestDrive, but this doesn't work either.
Do .NET things not work with Pester, or something like that?
set-content "testdrive:\bla.txt" -value "bla"
Start-BitsTransfer -source "http://www.tropicostrategy.com/p/blog-page.html" -destination 'testdrive:\' -Description "Downloading test"
Get-ChildItem 'TestDrive:\' | out-file c:\temp\b.txt
output:
Describing get-wallhavens
[-] downloads a file 591ms
ArgumentException: Value does not fall within the expected range.
ArgumentException: An incorrect value is specified in the Source parameter or in the Destination parameter. Verify that the directory and file names in the Source and Destination parameters are correct.
at <ScriptBlock>, D:\stack\Projects\Personal\wallchange\download-wallpaper.Tests.ps1: line 7

The documentation on GitHub indicates that you need to use a variable that holds the real location of the temporary storage:
Working with .NET Objects
When working directly with .NET objects, it's not possible to use the convenient TestDrive:\ PSDrive. Instead you need to use the $TestDrive variable, which holds the actual path in a format that .NET understands. For example, instead of using TestDrive:\somefile.txt, use $TestDrive\somefile.txt.
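Applied to the test in the question, that would look roughly like this (a sketch only; Pester v4+ assertion syntax is assumed, and BITS needs the real filesystem path rather than the TestDrive:\ PSDrive):
Describe 'get-wallhavens' {
    It 'downloads a file' {
        Set-Content "$TestDrive\bla.txt" -Value 'bla'
        # BITS runs outside PowerShell, so pass the real path, not TestDrive:\
        Start-BitsTransfer -Source 'http://www.tropicostrategy.com/p/blog-page.html' `
                           -Destination "$TestDrive\blog-page.html" -Description 'Downloading test'
        Test-Path "$TestDrive\blog-page.html" | Should -Be $true
    }
}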

Related

Error "Could not find a part of the path" while setting attributes on an existing file

I wrote a PowerShell script to strip the R/H/S attributes off all files in a specified set of root paths. The relevant code is:
$Mask = [System.IO.FileAttributes]::ReadOnly.value__ -bor [System.IO.FileAttributes]::Hidden.value__ -bor [System.IO.FileAttributes]::System.value__
Get-ChildItem -Path $Paths -Force -Recurse -ErrorAction SilentlyContinue | ForEach-Object {
    $Value = $_.Attributes.value__
    if ($Value -band $Mask) {
        $Value = $Value -band -bnot $Mask
        if ($PSCmdlet.ShouldProcess($_.FullName, "Set $([System.IO.FileAttributes] $Value)")) {
            $_.Attributes = $Value
        }
    }
}
This works fine, but when processing one very large folder structure, I got a few errors like this:
Exception setting "Attributes": "Could not find a part of the path 'XXXXXXXXXX'."
At YYYYYYYYYY\Grant-FullAccess.ps1:77 char:17
+ $_.Attributes = $Value
+ ~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], SetValueInvocationException
+ FullyQualifiedErrorId : ExceptionWhenSetting
I find this strange because the FileInfo object being manipulated is guaranteed to exist, since it comes from a file search.
I can't give the file names because they are confidential, but I can say:
they are 113-116 characters long
the unique set of characters involved is %()+-.0123456789ABCDEFGIKLNOPRSTUVWX, none of which are illegal in a file name
the % character is there due to URL-encoded spaces (%20)
Do you have any suggestions as to what may be causing this? I assume that if the full path was too long, or I didn't have write permissions to the file, then a more appropriate error would be thrown.
As stated in your own answer, the problem turned out to be an overly long path (longer than the legacy limit of 259 chars.)
In addition to enabling long-path support via Group Policy, you can enable it on a per-computer basis via the registry as follows, which requires running with elevation (as admin):
# NOTE: Must be run elevated (as admin).
# Change will take effect in FUTURE sessions.
Set-ItemProperty HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem LongPathsEnabled 1
Pass 0 to turn support off.
However, even with long-path support turned OFF (as is invariably the case on pre-Windows 10 versions), it is possible to handle long paths:
In Windows PowerShell (PowerShell up to version 5.1), you must use the long-path opt-in prefix, \\?\, as discussed below.
In PowerShell [Core] v6+, no extra work is needed, because it always supports long paths - you neither need to turn on support system-wide nor do you need the long-path prefix discussed below.
Caveat: While you may use \\?\ in PowerShell [Core] as well in principle, support for it is inconsistent as of v7.0.0-rc.2; see GitHub issue #10805.
Important: Prefix \\?\ only works under the following conditions:
The prefixed path must be a full (absolute), normalized path (must not contain . or .. components).
E.g., \\?\C:\path\to\foo.txt works, but \\?\.\foo.txt does not.
Furthermore, if the path is a UNC path, it requires a different form: \\?\UNC\<server>\<share>\...
E.g., \\server1\share2 must be represented as \\?\UNC\server1\share2 (see the helper sketch below for one way to build such paths).
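For illustration, a hedged sketch of a small helper (hypothetical, not from the answer) that applies these rules:
function Add-LongPathPrefix([string]$Path) {
    if ($Path.StartsWith('\\?\')) { return $Path }                           # already prefixed
    if ($Path.StartsWith('\\'))   { return '\\?\UNC\' + $Path.Substring(2) } # UNC path
    return "\\?\$Path"                                                       # drive-qualified path, e.g. C:\path\to\foo.txt
}

Add-LongPathPrefix 'C:\path\to\foo.txt'      # -> \\?\C:\path\to\foo.txt
Add-LongPathPrefix '\\server1\share2\a.txt'  # -> \\?\UNC\server1\share2\a.txt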
It did turn out to be a long path issue after all, despite the wording of the error messages. A simple Get-ChildItem search for the files produced the same errors. I finally tracked down the files mentioned in the error messages and measured their total path lengths. They were exceeding 260 characters.
I experimented with adding a \\?\ prefix to the paths, but PowerShell doesn't seem to like that syntax.
Fortunately, the script is being used on Windows 2016, so I tried enabling long path support in group policy. That made the whole problem go away.

Make a .json-file more flexible with Variables for automation deploy

I've got a PowerShell script that creates a VM from an image in Azure, and the script uses a .json file (parameters for the VM, etc.). But if I want to create more than one VM, the names of the VM, VNet, etc. cannot be the same for every execution (they have to be in the same resource group).
So my question: how can I insert variables into the .json file to change the name of the VM, etc. for every execution? Or perhaps I have to rethink my approach?
A very basic approach could be something like this:
# Grab the file contents
$contents = Get-Content -Path $templateFile
# Update some tokens in the file contents
$contents = $contents.replace("original value", "new value")
# Push the updated contents to a new file
Set-Content -Path $updatedFile -Value $contents
If you have a value that changes with every deployment, you could also consider using the -TemplateParameterObject parameter with the New-AzureRmResourceGroupDeployment cmdlet. That way, you can generate the values in your PowerShell script without having to write them out to a .json file first.
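For example, a hedged sketch (the parameter names vmName and vnetName are placeholders for whatever your template actually defines):
# Generate per-deployment values in the script and pass them straight to the template
$i = 3   # e.g. an incrementing VM number
$params = @{
    vmName   = "myvm$i"
    vnetName = "myvnet$i"
}
New-AzureRmResourceGroupDeployment -ResourceGroupName 'MyResourceGroup' `
    -TemplateFile $templateFile `
    -TemplateParameterObject $params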
For more details, have a look at the cmdlet specs

Error in PowerShell due to copying the content of an S3 bucket

I'm copying the contents of an S3 bucket to a local directory; however, I get an error in PowerShell.
Copy-S3Object : The requested range is not satisfiable
It is pointing to this command:
Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -Region $region
Why do I get this error? Note that the files that need to be copied do indeed get copied locally.
I can't say why you are getting that error returned from S3, but I can tell you that if you are copying multiple objects you probably want to use the -LocalFolder parameter, not -LocalFile. -LocalFolder will preserve the prefixes as subpaths.
When downloading one or more objects from S3, the Read-S3Object cmdlet works the same as Copy-S3Object, but uses -KeyPrefix to specify the common prefix the objects share, and -Folder to indicate the folder they should be downloaded to.
This also reminds me I need to check why we used -LocalFolder on Copy-, and -Folder on Read- although I suspect aliases may also be available to make them consistent.
HTH
(Edit): I spent some time this morning reviewing the cmdlet code and it doesn't appear to me the cmdlet would work as-is on a multi-object download, even though it has a -LocalFolder parameter. If you have a single object to download, then using -Key/-LocalFile is the correct parameter combination. If -LocalFolder is passed, the cmdlet sets up internally to do a single file download instead of treating -Key as a common key prefix to a set of objects. So, I think we have a bug here that I'm looking into.
In the meantime, I would use Read-S3Object to do your downloads. It supports both single (-Key) or multi-object download (-KeyPrefix) modes. https://docs.aws.amazon.com/powershell/latest/reference/index.html?page=Read-S3Object.html&tocid=Read-S3Object
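For example (a hedged sketch; the key prefix and target folder are placeholders, while $bucket and $region come from your script):
Read-S3Object -BucketName $bucket -KeyPrefix 'some/prefix/' -Folder 'C:\Temp\s3-download' -Region $region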
This seems to occur with folders that do not contain files, since the copy wants to copy files.
I accepted this error and trapped it:
catch [Amazon.S3.AmazonS3Exception]
{
    # get error record
    [Management.Automation.ErrorRecord]$e = $_

    # retrieve information about the runtime error
    $info = [PSCustomObject]@{
        Exception = $e.Exception.Message
        Reason    = $e.CategoryInfo.Reason
        Target    = $e.CategoryInfo.TargetName
        Script    = $e.InvocationInfo.ScriptName
        Line      = $e.InvocationInfo.ScriptLineNumber
        Column    = $e.InvocationInfo.OffsetInLine
        ErrorCode = $e.Exception.ErrorCode
    }

    if ($info.ErrorCode -eq 'InvalidRange') {
        # do nothing; this is the expected error
    } else {
        # output information; post-process collected info and log it (optional)
        Write-Host $info -ForegroundColor Red
    }
}
This happened to me when I tried to download a file that had more than one dot in its name. Simplifying the file name fixed the error.
File name that gave me error: myfile-18.10.exe
File name that worked: myfile-1810.exe

How to change extended windows file attributes via Powershell without using attrib.exe?

This seems to be a quite simple question, yet googling gave me nothing.
Here is the error (PS 5.1, win 10.0.14393 x64):
Set-ItemProperty $myFileInfo -Name Attributes -Value ([System.IO.FileAttributes]::Temporary)
The attribute cannot be set because attributes are not supported. Only the following attributes can be set: Archive, Hidden, Normal, ReadOnly, or System.
attrib.exe seems to support most of System.IO.FileAttributes. Unfortunately it does not seem to work with files referenced using FileSystem PSDrives, which is what I am using extensively.
Making a wrapper for the SetFileAttributes kernel API call would be a last resort (a rough sketch of that follows below).
Am I missing any other, simpler ways of setting these extended file attributes?
PS. Apart from [System.IO.FileAttributes]::Temporary I am interested in setting [System.IO.FileAttributes]::NotContentIndexed.
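For reference, a hedged sketch of what that last-resort P/Invoke wrapper might look like (the file path is hypothetical; 0x100 and 0x2000 are the documented Temporary and NotContentIndexed flag values):
Add-Type -Namespace Win32 -Name NativeMethods -MemberDefinition @'
    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
    public static extern bool SetFileAttributes(string lpFileName, uint dwFileAttributes);
'@
# 0x100 = Temporary, 0x2000 = NotContentIndexed
[Win32.NativeMethods]::SetFileAttributes('C:\Temp\some-file.txt', 0x100 -bor 0x2000)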
You can edit the Attributes property of a [FileInfo] object directly. For example, if you wanted to exclude all files in the C:\Temp folder from being content indexed, you could do this:
Get-ChildItem C:\Temp | ForEach-Object {
    $_.Attributes = $_.Attributes + [System.IO.FileAttributes]::NotContentIndexed
}
That would get each file, and then add the [System.IO.FileAttributes]::NotContentIndexed attribute to the existing attributes. You could probably filter the files to make sure that the attribute doesn't already exist before trying to add it, since that may throw errors (I don't know, I didn't try).
Edit: As noted by @grunge, this does not work in Windows Server 2012 R2. Instead, what you have to do is reference the value__ property, which is the bitwise flag value, and add the bitwise flag for NotContentIndexed. This should work for you on any Windows OS:
Get-ChildItem C:\Temp | ForEach-Object {
    # 8192 is the numeric value of [System.IO.FileAttributes]::NotContentIndexed
    $_.Attributes = [System.IO.FileAttributes]($_.Attributes.value__ + 8192)
}
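For completeness, a hedged sketch (not from either answer) that combines the two ideas above: skip files that already carry the flag, and set it through its bitwise value:
Get-ChildItem C:\Temp | Where-Object {
    # 8192 is NotContentIndexed; only keep files that don't have it yet
    ($_.Attributes.value__ -band 8192) -eq 0
} | ForEach-Object {
    $_.Attributes = [System.IO.FileAttributes]($_.Attributes.value__ -bor 8192)
}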

Get the path of the importing script from within a module?

Is there a way to get the path of a script that imported a module from within that module?
The script module I'm writing is meant to load settings from files relative to the importing script. I plan on reusing the module for a number of projects, so I would prefer that the module make no assumptions about where it's being imported from.
This is a nice-to-have; it would be great if the module could be as implicit as possible. If all else fails, though, I can just have the caller pass in its location.
Unfortunately everything I've attempted so far returns the path to the module (not what imported it). Here's a simple demonstration:
Test-RelativeModule.ps1, Stored at: c:\test\
import-module "$PSScriptRoot\mod\Test.psm1"
Test.psm1, Stored at: c:\test\mod\
# returns 'c:\test\mod'
write-host "`$PSScriptRoot: $PSScriptRoot"
# returns 'c:\test\mod'
# The value of $MyInvocation.MyCommand.Path is 'c:\test\mod\Test.psm1'
write-host "Split Invoation: $(Split-Path $MyInvocation.MyCommand.Path)"
# returns whatever path the console is currently residing
write-host "Resolve Path: $((resolve-path '.\').path)"
# what I'm looking for is something to return 'c:\test' from within the module
# without making any assumptions about the folder structure
Try this:
Write-Host "My invoker's PSScriptRoot: $($MyInvocation.PSScriptRoot)"