I have been hopelessly trying to change the sector size of a freshly created Windows 10 VHDX file to 4096 bytes per sector.
After searching the net and trying things, this comes closest (using Windows 10):
Install Hyper-V
Click the Start button and type PowerShell
Right-click it and choose "Run as Administrator"
Type the command: Set-Vhd -Path 4Kn.vhdx -PhysicalSectorSizeBytes 4096
I have tried the latter command in every possible flavor: with the full path, with just the filename (after cd'ing to the folder), and so on.
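For example, the full-path flavor, run from an elevated PowerShell prompt (this is the invocation shown in the error below):
Set-VHD -Path C:\Users\Peter\Desktop\4Kn.vhdx -PhysicalSectorSizeBytes 4096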
I always get errors:
Set-Vhd : The operation on computer 'DESKTOP-JCMNHRV' failed: Invalid class
At line:1 char:1
+ Set-Vhd -Path C:\Users\Peter\Desktop\4Kn.vhdx –PhysicalSectorSizeByte ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Set-VHD], VirtualizationException
+ FullyQualifiedErrorId : Unspecified,Microsoft.Vhd.PowerShell.Cmdlets.SetVhd
Any idea what to do or where to start? I'm out of ideas.
The .vhdx is not attached, and I tried it both without any formatting inside and later with an NTFS-formatted volume.
Related
I have a utility provided by a developer which takes a firmware package as input and decrypts its contents. After decrypting, it asks for a physical device to flash the decrypted content onto. But before flashing, it creates a locked folder on my machine.
I am trying to unlock that locked Windows folder.
Nothing happens when I try to open it by double-clicking the folder icon, and I get a write error when I try to rename it using PowerShell:
PS C:\Users\RajNegi\AppData\Roaming> rename-item
cmdlet Rename-Item at command pipeline position 1
Supply values for the following parameters:
Path: C:\Users\abc\AppData\Roaming\vsc.{2559a1f2-21d7-11d4-bdaf-00c04f60b9f0}
NewName: C:\Users\abc\AppData\Roaming\vsc
rename-item : Access to the path 'C:\Users\abc\AppData\Roaming\vsc.{2559a1f2-21d7-11d4-bdaf-00c04f60b9f0}' is denied.
At line:1 char:1
+ rename-item
+ ~~~~~~~~~~~
+ CategoryInfo : WriteError: (C:\Users\abc...f-00c04f60b9f0}:String) [Rename-Item], IOException
+ FullyQualifiedErrorId : RenameItemIOError,Microsoft.PowerShell.Commands.RenameItemCommand
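For reference, the same rename as a one-liner, which I'd expect to fail with the same Access denied error (-LiteralPath just avoids any wildcard interpretation of the braces in the name):
Rename-Item -LiteralPath 'C:\Users\abc\AppData\Roaming\vsc.{2559a1f2-21d7-11d4-bdaf-00c04f60b9f0}' -NewName 'vsc'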
Note: I have already set Full control in the folder permissions.
I have this script:
$UserFile = Read-Host "Drag your file here"
Copy-Item -Path $UserFile -Destination .\input
I want the user to drag their file onto the console so that the script knows the exact path of the file and can copy it to the input folder, but I get this error:
Copy-Item : Cannot find drive. A drive with the name '"C' does not exist.
At line:1 char:1
+ Copy-Item -Path $UserFile -Destination .\input
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: ("C:String) [Copy-Item], DriveNotFoundException
+ FullyQualifiedErrorId : DriveNotFound,Microsoft.PowerShell.Commands.CopyItemCommand
UPDATE:
It's getting weirder: I think there is something wrong with my machine. I tried running the code on a VM and it ran with zero problems. There is one difference between my machine and the VM: when I drag the file on my machine it writes "C:\path\to\my\file.something", but when I do the same on the VM it writes C:\path\to\my\file.something without the quotes.
UPDATE:
I just realized that I used different files. The file I tried on my machine has white space in its name (file one.something), but the file I used on the VM does not (file.something). I have tried running the code like this:
Copy-Item -Path "C:\path\to\my\file one.something" -Destination .\input
using " and it work. but that not what i want, i want user to drag their file no matter there is white space or not.
Try adding the -Prompt parameter:
$UserFile = Read-Host -Prompt "Drag your file here"
*** drags file ***
Write-Host $UserFile ## should display the file path
My guess is that this isn't an issue for you anymore and that an extra " character was entered into the terminal before you dragged your file. For me, the code you've posted works without issue.
You might want to check out "Cannot find drive. A drive with the name '"C' does not exist", where the issue was exactly the same.
Depending on how this script is used, you can trim/remove any such characters before using the path. Also, in my experience, it is almost always better to use absolute paths instead of relative ones (here the .\input) unless you explicitly want the script to be relative.
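If you need it to work whether or not the dragged path comes wrapped in quotes, here is a minimal sketch of that trimming approach (reusing the .\input destination from the question):
$UserFile = (Read-Host -Prompt "Drag your file here").Trim('"')  # strip surrounding quotes added for paths with spaces
Copy-Item -LiteralPath $UserFile -Destination .\input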
In my ongoing war with WSL 2/VS Code I'm trying to completely remove the .vscode-insiders directory from my machine.
rm -rf won't work, and neither will deleting it from Windows. Does anyone know how I can destroy this thing?
I'm trying to get a dev environment set up with WSL 2, VS Code and Docker on Windows 10 Insiders (build 18965).
I ran into problems trying to install the Remote Development extension pack, specifically the ms-vscode-remote.remote-wsl-0.39.4 extension, which is causing no end of problems.
However, when I try to remove all VS Code Insiders residue I get permission problems with:
:/mnt/c/Users/micro/.vscode-insiders/extensions/ms-vscode-remote.remote-wsl-0.39.4/scripts/wslServer.sh
I can't delete it from the Ubuntu side with rm -rf:
cannot remove 'wslServer.sh': No such file or directory
I can't delete it from Windows Explorer (no permissions).
I can't delete it from PowerShell running Remove-Item as admin.
I've looked at all the existing answers on Stack Overflow and none of them has a solution as far as I can see.
Would love some help.
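For completeness, the attempts from an elevated PowerShell prompt look roughly like this:
# cd into the scripts folder shown above, then try to delete the file
Set-Location 'C:\Users\micro\.vscode-insiders\extensions\ms-vscode-remote.remote-wsl-0.39.4\scripts'
Remove-Item wslServer.sh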
Remove-Item : Access is denied
At line:1 char:1
+ Remove-Item wslServer.sh
+ ~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : PermissionDenied: (C:\Users\micro...ts\wslServer.sh:String) [Remove-Item], UnauthorizedAccessException
+ FullyQualifiedErrorId : ItemExistsUnauthorizedAccessError,Microsoft.PowerShell.Commands.RemoveItemCommand
Remove-Item : Cannot find path 'C:\Users\micro\.vscode-insiders\extensions\ms-vscode-remote.remote-wsl-0.39.4\scripts\wslServer.sh' because it does not exist.
At line:1 char:1
+ Remove-Item wslServer.sh
+ ~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (C:\Users\micro...ts\wslServer.sh:String) [Remove-Item], ItemNotFoundException
+ FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.RemoveItemCommand
Press Ctrl+Alt+Delete, open Task Manager, right-click 'Windows Subsystem for Linux' and choose End task.
Now try deleting the folder, or let VS Code do it for you.
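If you'd rather do it entirely from an elevated PowerShell prompt, something like this should be equivalent (wsl --shutdown is available on recent WSL 2 builds; the path assumes the default location of the folder from the question):
# stop all running WSL distros so nothing keeps the files locked
wsl --shutdown
Remove-Item -LiteralPath "$env:USERPROFILE\.vscode-insiders" -Recurse -Force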
I need to compress a folder with PowerShell.
Here is my code:
Get-ChildItem $YourDirToCompress -Directory |
where { $_.Name -notin $DirToExclude} |
Compress-Archive -DestinationPath $ZipFileResult -Update
Move-Item -Path $ZipFileResult -Destination $ZipFileDest
I get:
Exception calling "Write" with "3" argument(s): "Exception of type 'System.OutOfMemoryException' was thrown."
At C:\Windows\system32\WindowsPowerShell\v1.0\Modules\Microsoft.PowerShell.Archive\Microsoft.PowerShell.Archive.psm1:820 char:29
+ ... $destStream.Write($buffer, 0, $numberOfBytesRead)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : OutOfMemoryException
I have set :
Set-Item WSMan:\localhost\Plugin\Microsoft.PowerShell\Quotas\MaxMemoryPerShellMB 8000
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 8000
Restart-Service WinRM
The whole folder is about 1.9 GiB and the compressed file is about 500 MiB.
I find it hard to believe it is really a memory problem.
Also, once or twice it succeeded in creating the file (when MaxMemoryPerShellMB was set to 4000), but most of the time it fails.
What can I do?
I was getting the exact same error in 2022:
Exception calling "Write" with "3" argument(s): "Exception of type
'System.OutOfMemoryException' was thrown."
At C:\WINDOWS\system32\WindowsPowerShell\v1.0\Modules\Microsoft.PowerShell.Archive\Microsoft.PowerShell.Archive.psm1:820 char:29
+ ... $destStream.Write($buffer, 0, $numberOfBytesRead)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : OutOfMemoryException
I was able to resolve it very simply after a bit of thinking.
I too doubted that it was really a memory problem, since I have 16 GiB of RAM and it was showing only around 9 GiB in use.
Microsoft Scripting Guy Ed Wilson has written an article Learn How to Configure PowerShell Memory, and for some this may provide the solution.
In my case it did not, as PS appeared to be already configured to use the maximum available memory.
I was getting the issue on a folder I normally zip up every few days. This folder is over 11 GiB and zips down to around 4 GiB. This had previously been working for years.
However, thinking about it, to create a zip of this size I think it's likely that PS holds a significant amount of data in memory, more and more right up to the point when it commits it to the file system at the end. And I realized that although the computer has 16 GiB of RAM and was showing only around 9 GiB in use, 9 GiB is an unusually large amount of memory to be in use on that computer. Subtracting memory used by Windows, graphics etc still left several GiB free—but perhaps not enough for PS to create a large zipfile.
Sure enough, after a reboot to release the unusually large amount of RAM that had been in use, the process ran successfully and created my zipfile as usual.
In summary:
Check whether you can configure PS to use more RAM;
Check in Task Manager to see if you have less RAM free than usual;
Reboot to make maximum RAM available to PS.
It seems that the built-in zip support in PowerShell is quite limited and designed for simple, straightforward tasks. One of those limitations is a maximum file size of 2 GB; also, the way it allocates memory does not seem very efficient. For comparison, zipping the same files with WinZip or WinRAR through the Windows UI will probably work. More info about the underlying library: https://learn.microsoft.com/en-us/dotnet/api/system.io.compression.ziparchive
So I recommend the 7Zip4Powershell library, which worked perfectly for me with the same files, Windows, memory, etc.
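For what it's worth, a minimal sketch of that approach, assuming the module's usual Compress-7Zip parameters and reusing the variable names from the question:
# one-time install from the PowerShell Gallery
Install-Module -Name 7Zip4Powershell -Scope CurrentUser
# compress with the 7-Zip engine instead of Compress-Archive
Compress-7Zip -Path $YourDirToCompress -ArchiveFileName $ZipFileResult -Format Zip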
I am currently using the following sequence of commands in a Windows 7 PowerShell window.
Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\Net35\AWSSDK.dll"
$secretKeyID="ABCDEFGHIJKLMNOPQRS"
$secretAccessKeyID="LONGKEYBLAHBLAHBLAHBLAHBLABLAH"
$AWSclient=[Amazon.AWSClientFactory]::CreateAmazonS3Client($secretKeyID,$secretAccessKeyID)
However, I get the exception below. Please let me know what I'm doing wrong. This is what I see most people using successfully.
Exception calling "CreateAmazonS3Client" with "2" argument(s): "No RegionEndpoint or ServiceURL configured"
At line:1 char:1
+ $oAWSclient=[Amazon.AWSClientFactory]::CreateAmazonS3Client($secretKeyID,$secret ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : AmazonClientException
One thing that I think might be a problem is that my bucket does not have 'enable website hosting' checked. Does this need to be set to true? I just want to upload a file to the bucket via a PowerShell script.
Duh... I realized that I need to use the AWS PowerShell tools, because they have all the settings built in. I'm not exactly sure how to properly refer to this version of PowerShell, but it is available after installing the AWS SDK.
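If anyone else goes this route, the upload itself is roughly a one-liner. This is a sketch assuming the Set-AWSCredentials and Write-S3Object cmdlets from those tools; the bucket name, file path and region are placeholders:
Set-AWSCredentials -AccessKey $secretKeyID -SecretKey $secretAccessKeyID
Write-S3Object -BucketName my-bucket -File C:\temp\myfile.txt -Region us-east-1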
Since the error specifically states that you're missing the RegionEndpoint or ServiceURL, you could also just include one of them. Since you're connecting to S3, all you need is to include the RegionEndpoint.
Add-Type -Path "C:\Program Files (x86)\AWS SDK for .NET\bin\Net35\AWSSDK.dll"
$secretKeyID="ABCDEFGHIJKLMNOPQRS"
$secretAccessKeyID="LONGKEYBLAHBLAHBLAHBLAHBLABLAH"
$client = [Amazon.AWSClientFactory]::CreateAmazonS3Client($secretKeyID, $secretAccessKeyID, [Amazon.RegionEndpoint]::USEast1)
(Replace USEast1 with the region configured for your S3 bucket.)
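If you then want to upload a file with that client, here is a sketch using the SDK's PutObjectRequest type (the bucket name, key and local path are placeholders):
$request = New-Object Amazon.S3.Model.PutObjectRequest
$request.BucketName = "my-bucket"          # placeholder bucket name
$request.Key = "myfile.txt"                # placeholder object key
$request.FilePath = "C:\temp\myfile.txt"   # placeholder local file to upload
$response = $client.PutObject($request)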