How to easily fix VirtualBox removable drives, used as physical-drive VMDKs, when they get unplugged/replugged? - powershell

I am using removable Thunderbolt (NVMe) drives that are partitioned/configured as bare-metal boot drives in VirtualBox. The idea is that I can boot each drive either on bare metal or in VirtualBox.
When unplugging/replugging removable drives in Windows, my VirtualBox raw-VMDK files consistently break. The reason, I found, is that there is no guarantee of how Windows will path your physical drive each time, or where you plug it in. And you can't just create raw VMDKs for all the likely paths, because VirtualBox machines can't have a broken VMDK file. Each time it breaks, I have to get exactly right the particulars of re-creating the raw VMDK file and the order in which the virtual machine is reconfigured. It takes a lot of time and frustration to figure this out each time. I need an easy and consistent way to start the machine.
The following PowerShell script looks up the Windows path for a hard-coded drive serial number and does everything necessary to replace the broken raw VMDK file with a working one. The script lists the serials of all the host machine's physical drives each time it runs:
echo "*** This PowerShell script finds a raw drive by its serial number and attaches it to a desired VirtualBox VM ***"
echo "*** When VirtualBox loses track of the drive between unplug/replug, this will fix it for that serial. ***"
echo "*** Must be run as administrator ***"
echo "Here are all the drives we can find:"
Get-Disk | Format-Table -AutoSize -Property Number, Model, Path, Partitions, SerialNumber
#This is the Serial number we're looking for:
$SERIAL = "6479_110F."
echo "Attempting to attach drive serial number: $SERIAL"
#Get the Drive Path for that Serial
$DrivePath = Get-Disk | Where-Object -Property SerialNumber -eq $SERIAL | Select-Object -ExpandProperty Path
if ( $null -eq $DrivePath )
{
echo "Could not find your hardcoded serial number"
} else
{
# Where we want to put the rawVMDK file
$VMDKStorage = "C:\VirtualBox VMs\LaCie_Drive\"
cd "$VMDKStorage"
# The Target Vbox VM to attach the rawvmdk
$MachineName = "Ubuntu"
# Arbitrary name for controller.
$ControllerName = "NVMe"
#The Filename to call the rawvmdk
$FileName = "Firecuda.vmdk"
#Clean up the Target Machine of the non-working image.
echo "Attempting to remove specified vmdk hdd from target machine"
Start-Process -FilePath "C:\Program Files\Oracle\VirtualBox\VBoxManage.exe" -ArgumentList "storageattach $MachineName --storagectl $ControllerName --port 0 --device 0 --medium none" -NoNewWindow -Wait
# Remove the Virtualbox traces of the File
Start-Process -FilePath "C:\Program Files\Oracle\VirtualBox\VBoxManage.exe" -ArgumentList "closemedium $FileName --delete" -NoNewWindow -Wait
echo "Creating the new RawVMDK File"
Start-Process -FilePath "C:\Program Files\Oracle\VirtualBox\VBoxManage.exe" -ArgumentList "internalcommands createrawvmdk -filename $FileName -rawdisk $DrivePath" -NoNewWindow -Wait
echo "Adding the now supposedly working RawVMDK to Target Machine"
Start-Process -FilePath "C:\Program Files\Oracle\VirtualBox\VBoxManage.exe" -ArgumentList "storageattach $MachineName --storagectl $ControllerName --port 0 --device 0 --type hdd --medium $FileName" -NoNewWindow -Wait
echo "You now can start the Machine up normally from VirtualBox."
# I found I don't need to start VirtualBox with admin privileges for my VM to work.
}
I did find that when plugging my drive into one particular port on my machine, Windows wanted to report an entirely different serial number for the drive. After changing this code's serial number to correct for that, I found I had other problems as well: the kernel did not recognize any of the LVM configuration that contained my root (/) volume. It appears that on this port Linux cannot identify these partitions as NVMe partitions, and that this breaks the boot configuration pretty badly (I was stuck in initramfs). I believe this is a USB-C port and that the drive has some fallback hardware to make it "work" there. I really want the drive to operate at NVMe speeds, so I'm just going to avoid this port.
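A follow-up thought on the port issue: one way to tolerate a drive that reports different serial numbers on different ports is to search a list of known candidates instead of one hard-coded value. A minimal sketch, reusing the script's Get-Disk lookup (the candidate list itself is hypothetical, you would fill in the serials your ports actually report):

```powershell
# Hypothetical list of serials the same physical drive has reported on different ports
$CandidateSerials = @("6479_110F.", "ALTERNATE_SERIAL_HERE")

# Take the Path of the first disk whose serial matches any candidate, else $null
$DrivePath = Get-Disk |
    Where-Object { $CandidateSerials -contains $_.SerialNumber } |
    Select-Object -First 1 -ExpandProperty Path

if ($null -eq $DrivePath) {
    Write-Warning "None of the candidate serial numbers were found."
}
```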
Hope this helps someone.

Related

Running hyperV inside a windows docker container

I have a process which runs primarily in PowerShell and relies heavily on the Hyper-V pwsh module to build (at script runtime) and launch a Hyper-V instance. This programmatically builds a Windows machine with specific features, updates, and applications, then captures an image of that machine for deployment to physical devices later.
We want to containerize this process so it can be run more dynamically than on a physical box, as it is today.
Critical to this is that we need to be able to build and turn on a Hyper-V instance inside the container. Currently experimenting on Win 1803 with the Dockerfile below.
# note it doesn't necessarily need to be this image, I just picked it because it was easy
FROM microsoft/powershell:nanoserver-1803 AS powershell
COPY ./mainContainer/ c:/app/
and then need pwsh like the lines below to work (primary issue is the lack of the hyperV pwsh module):
New-VHD -SizeBytes 100GB -Path $vhdPath
New-VM -Name $VmName -Generation 2 -Path "$TempDirectory\$VmName" -VHDPath $vhdPath -Switch $switchName
Set-VMMemory -VMName $VmName -StartupBytes 4096MB -DynamicMemoryEnabled $false
Add-VMScsiController -VMName $VmName
Add-VMDvdDrive -VMName $VmName -ControllerNumber 1 -ControllerLocation 0 -Path $WindowsIsoPath
Set-VMFirmware -VMName $VmName -FirstBootDevice $dvdDrive
Start-VM -VMName $VmName
$vm = Get-Vm $VMName
# then there's little loop waiting for the machine to turn off before continuing
I have tried Install-WindowsFeature and similar, but always get an error that:
"The term 'Install-WindowsFeature' is not recognized as the name of a cmdlet, function, script file, or operable program."
I have tried import-module servermanager, but that also gives "...not loaded because no valid module file was..."
It seems perhaps the main (current) hurdle may be to get a new module imported in pwsh (within the container) so I can enable the windows feature?
any advice?
Update: I found part of the problem is that Nano Server doesn't ship tools like DISM, so I've updated the Dockerfile to:
FROM microsoft/powershell:windowsservercore-1803 AS powershell
COPY ./mainContainer/ c:/app/
RUN DISM /Online /Enable-Feature /All /FeatureName:Microsoft-Hyper-V
but now I get an error:
The source files could not be found.
Use the "Source" option to specify the location of the files that are required to restore the feature. For more information on specifying a source location, see http://go.microsoft.com/fwlink/?LinkId=243077.
Seems I need a different base image; I'm not sure there is one that can enable Hyper-V.
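Before fighting DISM further, it may help to confirm whether the Hyper-V PowerShell module exists in the image at all. A small diagnostic sketch, assuming it is run inside the container (on most container base images the module will simply be absent):

```powershell
# Check whether the Hyper-V PowerShell module is available in this image
$hyperV = Get-Module -ListAvailable -Name Hyper-V

if ($null -eq $hyperV) {
    Write-Warning "Hyper-V module not found; this base image likely cannot host VMs."
} else {
    Import-Module Hyper-V
    Get-VMHost   # basic smoke test that the virtualization stack responds
}
```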

How to bypass security warning when running EXE from network location?

I am trying to write a complex unattended install script that installs from a network directory. I'm running PowerShell as administrator with the Bypass execution policy.
When I run:
Start-Process "\\192.168.5.7\MSChart.exe" -ArgumentList "/q" -Wait
I get an "Open File - Security Warning" prompt (screenshot omitted).
How can I bypass this without adding the network location as a trusted server, ideally using only PowerShell? I've tried Unblock-File, with no luck.
The network share is not trusted by your computer, hence it warns you. You would have to add the share to the trusted zone in the system's Internet settings and allow "launching programs and unsafe files".
You cannot bypass it as such, but you can either add the required configuration to the registry, or copy the files locally and run them from there using PowerShell.
You can bypass the warning by adding -NoNewWindow as in Start-Process "\\192.168.5.7\MSChart.exe" -ArgumentList "/q" -Wait -NoNewWindow.
You should, however, use DNS for your path (e.g. \\share.domain.com\file.exe) and ensure the URI (share.domain.com) is in your system's 'Trusted Sites' or 'Intranet Sites' list, or you may still be blocked. Copying the file to the local system first may also fix the problem.
Reference: https://social.technet.microsoft.com/Forums/en-US/92eab96d-fe1a-4119-a5bc-f171d517466a/getting-open-file-security-warning-using-startprocess?forum=winserverpowershell
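The registry route mentioned above can be scripted. A minimal sketch, assuming the per-user zone map is honored on this machine; the `Range1` key name is illustrative (it just has to be unique), and IP addresses go under `ZoneMap\Ranges` rather than `ZoneMap\Domains`:

```powershell
# Add 192.168.5.7 to the Local Intranet zone (zone 1) for the current user
$ranges = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Ranges'
$key    = Join-Path $ranges 'Range1'   # illustrative key name; must not collide

New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name ':Range' -Value '192.168.5.7'
# Value name is the protocol; data is the zone number (1 = Local Intranet)
New-ItemProperty -Path $key -Name 'file' -Value 1 -PropertyType DWord -Force | Out-Null
```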
Maybe you want to Unblock-File and accept all of the risks that come with that and then try to execute it?
I don't recommend anyone EVER run a script like this:
function Unblock-Dir
{
    gci -Directory | % {
        Push-Location $_
        gci | % {
            Write-Host "Unblocking $_"
            Unblock-File $_
        }
        Unblock-Dir
        Pop-Location
    }
    Unblock-File -Path .\*
}
It's just too dangerous.

Why will PsExec not execute a .ps1 script remotely?

I am trying to write a script in PowerShell that gathers free space on the C: drive and then appends it to a .csv file on a network drive.
Our company does not have PowerShell enabled by default. Remote PowerShell commands are restricted. Because of this, I have to script things out, then kick them off using PsExec.
I have two scripts for this process. The first is the one I run from my machine; it asks for my credentials and then uses a foreach loop to cycle through a .txt file of IP addresses, telling each IP to launch the second script from a network drive. The second script gathers the free space, performs a math function on it, then appends the hostname and free space to a .csv file on the network.
However, the second script never runs. After the connection is established using PsExec, my console just sits there. It should kick off the script and then loop again to the next IP. I have tested the second script manually on some of the target machines with no issues.
Locally executed script:
$username = Read-Host "Username"
$outputPath = "\\network\directory\sub-directory\FileName.csv"
$IPpath = Read-Host "Enter the path for the list of IP addresses"
$IPs = Get-Content -Path $IPpath
Add-Content -Path $outputPath -Value """IP"",""Free Space"""
foreach ($IP in $IPs)
{
if (Test-Connection -Count 1 -ComputerName $IP -Quiet)
{
\\network\directory\subdirectory\psexec.exe -accepteula \\$IP -u domain\$username -h PowerShell -ExecutionPolicy Bypass -Command \\network\directory\sub-directory\diskSpaceCheck.ps1
}
else
{
Add-Content -Path $outputPath -Value """$IP"",""Offline"""
}
}
Remotely executed script (diskSpaceCheck.ps1):
$outputPath = "\\network\directory\sub-directory\FreeSpace.csv"
$hostname = hostname
$disk = Get-WmiObject Win32_LogicalDisk -Filter "DeviceID='C:'"
$freeSpace = [System.Math]::Round((($disk.FreeSpace) / 1GB))
Add-Content -Path $outputPath -Value "$($hostname), $($freeSpace)"
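As a side note, the remote script's two steps (query free space, append a CSV row) can also be sketched with the newer CIM cmdlets and quoted fields, so an odd hostname can't break the CSV (the output path is the same placeholder network share as above):

```powershell
$outputPath = "\\network\directory\sub-directory\FreeSpace.csv"

# Get-CimInstance is the successor to Get-WmiObject; the query is otherwise identical
$disk = Get-CimInstance Win32_LogicalDisk -Filter "DeviceID='C:'"
$freeSpace = [math]::Round($disk.FreeSpace / 1GB)

# Quote both fields so the appended row stays valid CSV
Add-Content -Path $outputPath -Value ('"{0}","{1}"' -f $env:COMPUTERNAME, $freeSpace)
```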
I wasn't sure you could have an if statement inside a foreach either, but I've tried it without the if test and got the same results. I can't figure out why the remote machine won't execute this script. I have another script that is copied to these machines and executed in the same manner, and it works correctly.
Also, the target machines are locked down to limit their use for only specific purposes. These machines use a generic auto logon and generally have access to a few applications. Everything else is inaccessible, including Windows Explorer, Control Panel, etc. I tested this script on some of the machines local to my office, and it worked fine.
I was thinking maybe a networking latency issue? Can that impact PsExec or .ps1 scripts? I can remote onto the machine, log out, log in with my credentials and run the script while PsExec just sits at running the command I gave it.
Any advice would be appreciated.
This can occur if you are using PSExec to directly execute the Ps1 file. Instead use PSexec to execute PowerShell.exe and pass your ps1 file as a parameter.
Also check this question: Run PowerShell on remote PC
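Applying that suggestion to the loop in the question, the PsExec call might look like this (the paths and domain are the question's placeholders; -File tells powershell.exe to run the script and return when it finishes):

```powershell
# Launch PowerShell.exe remotely and hand it the script via -File,
# instead of asking PsExec to execute the .ps1 directly
& \\network\directory\subdirectory\psexec.exe -accepteula \\$IP -u domain\$username -h `
    powershell.exe -ExecutionPolicy Bypass -NoProfile `
    -File \\network\directory\sub-directory\diskSpaceCheck.ps1
```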
I figured it out. The script was running, but the remote PowerShell session was just sitting there waiting for another command. I added a Stop-Process command to the end of the script that runs remotely:
Stop-Process -Name powershell
Now it runs and then exits with error code -1.

Struggling with foreach

I'm rather new and, well, awful at this whole scripting thing, so any help would be appreciated.
Basically I am trying to create a PowerShell script that installs an undefined number of printers on an undefined number of computers. The computer names and printer names will come from local text files.
This is what I have so far:
$credentials = Get-Credential
$printerlist = Get-Content c:\setup\printers.txt
Get-Content c:\setup\names.txt | ForEach-Object {
    foreach ($printer in $printerlist) {
        rundll32 printui.dll PrintUIEntry /ge /c $_ /n $printer
    }
}
EDIT: I am getting the error "unable to enumerate per machine printer connections, operation could not be completed (error 0x0000007b)". I have tried modifying the script every way I can come up with, which is probably fewer ways than it should be.
I don't think you have an issue with your foreach loop here.
I think it's just the usage of rundll32 printui.dll PrintUIEntry
Install printer:
rundll32 printui.dll,PrintUIEntry /in /c "\\COMPUTER_NAME" /n "\\PRINT_SERVER_NAME\PRINTER_NAME"
Sets default printer:
rundll32 printui.dll,PrintUIEntry /y /c "\\COMPUTER_NAME" /n "\\PRINT_SERVER_NAME\PRINTER_NAME"
Try installing with /in individually, for one computer, from the PowerShell console outside your script, to see if you still get the same error. It could be a permissions issue, but I don't think so.
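Putting the /in form from this answer back into the original loop might look like the sketch below (the file paths are the question's, and PRINT_SERVER_NAME is a placeholder for your actual print server):

```powershell
$printerlist = Get-Content c:\setup\printers.txt

Get-Content c:\setup\names.txt | ForEach-Object {
    $computer = $_
    foreach ($printer in $printerlist) {
        # Note the comma between printui.dll and PrintUIEntry
        rundll32 printui.dll,PrintUIEntry /in /c "\\$computer" /n "\\PRINT_SERVER_NAME\$printer"
    }
}
```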

PowerShell script runs Get-CDDrive | Set-CDDrive but won't continue after that line

In one of my scripts, PowerShell gets to this point and actually does set the CD drive, but then it doesn't progress from there. I have other Write-Host commands above this code, and everything progresses and sets the CD drive, but nothing below it ever happens. Any ideas? Has anyone seen this kind of issue before?
The Get/Set-CDDrive commands are coming from the VMware PowerCLI snap-in for PowerShell.
Get-CDDrive -VM $vmname | Set-CDDrive -IsoPath $isopath -StartConnected:$true -Confirm:$false
Write-host "ISO attached successfully"
Write-host "[INFO] VM $($vmname) deployed"