Moving a VM to a folder within a datacenter - PowerShell

There are two datastores within the vCenter I'm working on. In both datastores, I want to have a folder named simply "Test".
I figured that if I ran Move-VM -VM VmName -Destination Test, the VM would be moved to the "Test" folder found within the datastore it resides on. This is proving not to be the case: I'm getting an error about multiple values for my Destination.
Is there a way to accomplish this without having to move VMs to another datastore, or without naming the two folders differently?
Thanks for your help

The following line worked for what I wanted to do:
Move-VM -VM $vmObject -Destination (($vmObject | Get-Datacenter) | Get-Folder -Name "Test" )

If there are duplicate folders in the datacenter, you can use the following syntax:
$folder = Get-Datacenter -Name "Prod" | Get-Folder -Name "MST" | Get-Folder -Name "Application" | Get-Folder -Name Linux
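If the duplicate names exist even inside a single datacenter, PowerCLI's Get-Folder can also be scoped with -Location to a parent container, which avoids walking the whole chain by hand. A minimal sketch (the datacenter and folder names here are assumptions):

```powershell
# Narrow the search to one datacenter (or any parent folder) so that
# duplicate "Test" folder names no longer collide.
$dc     = Get-Datacenter -Name "Prod"            # assumed datacenter name
$folder = Get-Folder -Name "Test" -Location $dc  # only folders under $dc
Move-VM -VM $vmObject -Destination $folder
```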

Related

Setting the VirtualHardDiskPath when adding a virtual machine through powershell to Hyper-V

I'm adding a VM through PowerShell to Hyper-V. The add works, but it's placing the config/XML files on the same drive as the vhdx file.
I am setting $config and then running my New-VM.
$config= Get-VMHost | Select-Object VirtualMachinePath
I end up with this:
@{VirtualMachinePath=F:\vmconfigs}
This is how I'm adding the vm:
New-VM -Name $name -MemoryStartupBytes 8192MB -VirtualHardDiskPath $config -Path $driv\vm -Generation 2 -SwitchName (Get-VMSwitch).Name
If I run it without -VirtualHardDiskPath, it places the configs in a folder on the same drive as the vhdx file. Of course, it will not run as written with the path added, since the value is not formatted correctly.
You can see here that my default is f:\vmconfigs but it's not using that folder when I manually add it.
So, I have two questions. First, how do I get the VirtualMachinePath correctly? Second, why isn't it putting the configs in the default folder (f:\vmconfigs) if I do not set it with PowerShell at the command line? If I add the VM through the interface, it is correct.
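For the first question: Select-Object VirtualMachinePath returns a one-property wrapper object rather than the string itself, which is why the value prints as @{VirtualMachinePath=F:\vmconfigs}. Expanding the property (or using dot notation) yields the plain path string that a path parameter expects. A minimal sketch:

```powershell
# -ExpandProperty unwraps the value instead of returning a wrapper object
$config = Get-VMHost | Select-Object -ExpandProperty VirtualMachinePath
# equivalent dot-notation form:
$config = (Get-VMHost).VirtualMachinePath
$config   # now a plain string such as F:\vmconfigs
```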
Thanks!
EDIT
This is what happens, even though the virtual machine path is f:\vmconfigs:
My current command:
New-VM -Name $name -MemoryStartupBytes 8192MB -Path $driv\vm -Generation 2 -SwitchName (Get-VMSwitch).Name
I wasn't using -Path correctly. I ended up with this:
# ---------------- vhdx File Locations -----------------
# the main virtual machine to be added
$sysprep= "C:\SysPrep\sysprep2019.vhdx"
# the 'd: drive' that is to be added
$sysprep2= "C:\SysPrep\sysprep2019_2Drive.vhdx"
# ---------------- Hardware Settings -----------------
# number of processors to allocate
$numprocs= 6
# startup memory, defaults to 8gb
$startupmem= 8192MB
Write-Output "Creating the new virtual machine $name."
New-VM -Name $name -MemoryStartupBytes $startupmem -Generation 2 -SwitchName (Get-VMSwitch).Name
Write-Output "Adding $sysprep to the new virtual machine."
Add-VMHardDiskDrive -VMName $name -Path $newfile
if($secdrive -match 'y')
{
Write-Output "Adding second drive to guest operating system."
Add-VMHardDiskDrive -VMName $name -Path $newfile2
Write-Output "$sysprep2 has been attached."
}
# set number of processors
Write-Output "Setting the number of processors to $numprocs"
Set-VMProcessor $name -Count $numprocs
Granted, this is only part of my script: I create the VM first in Hyper-V and then add the drives to it.

Powershell copy all folders and files with certain extension

I have one package on my Windows machine and another package on a remote server.
The first one is -> C:\Users\One. It contains the following files:
adapter.jsx
result.js
system.jsx
moment.js
readme.txt
a package called info that contains two files -> logger.jsx and date.js.
Another one is a remote target directory -> /mnt/media/Two. It is currently empty. The user and host for it are: $userAndHost = "user@foo.bar"
I want to copy all the packages and files of extensions .jsx and .js from package One to package Two. It's required to use scp here since this is a copy between two different platforms.
What I tried:
get all the items within the package:
Get-ChildItem -Path "C:\Users\One" -Recurse
filter the items by extension, in my case .jsx and .js:
Get-ChildItem -Path "C:\Users\One" -Recurse | Where-Object {$_.extension -in ".js",".jsx"}
do the secure copy (scp) - I didn't come up with the script here.
Please, help me finish the script.
Hi, I think you need something like this.
I wrote the code below for you; it's tested and working.
#Set execution policy to bypass
Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope Process -Force
#install Posh-SSH from powershell gallery
#https://www.powershellgallery.com/packages/Posh-SSH/2.0.2
Install-Module -Name Posh-SSH -RequiredVersion 2.0.2
#import module
Import-Module Posh-SSH
#get all the items within the package in the path:
$path = 'C:\Users\One'
$items = (Get-ChildItem -Path $path -Name -File -Include ( '*.jsx', '*.js') -Recurse)
#Need destination credential
$credential = Get-Credential
#copy selected items to destination via scp
$items | ForEach-Object {
Set-SCPFile -ComputerName 'SCP-SERVER-HOST-HERE' -Credential $credential -RemotePath '/mnt/media/Two' -LocalFile "$path\$_"
}
Hope this helps you
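If installing a module isn't an option, recent Windows 10 / Server 2019 builds ship an OpenSSH scp.exe that can be driven from PowerShell directly. A minimal sketch along the same lines (key-based or interactive authentication assumed):

```powershell
$path  = 'C:\Users\One'
# -File skips directories; -Include filters on the two extensions
$items = Get-ChildItem -Path $path -Recurse -File -Include '*.jsx', '*.js'
foreach ($item in $items) {
    # scp.exe prompts for a password unless key authentication is set up
    scp.exe $item.FullName 'user@foo.bar:/mnt/media/Two/'
}
```

Note that this flattens the directory structure on the remote side; preserving the info subfolder would require creating it over ssh first.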

PowerShell Only get files within folders > move to root > delete folders only > upload to ShareFile

I have the below PowerShell script, which was developed to copy files from a local UNC path to the application Citrix ShareFile. Whilst all is good on that front, one issue we are facing is that we strictly cannot support folders: Citrix ShareFile does not accept a copied folder as a new upload; instead it creates it as a new folder and does not trigger the workflow correctly.
One thing I think will make our life easier is simply not supporting folders, which will work for our environment.
What I am thinking of is a script that pulls all files and moves them to the root directory, deletes all folders, then uploads the files to ShareFile.
The below script will copy the folder and all its contents.
I have had a look around and am struggling to get it to do as I wish.
## Add ShareFile PowerShell Snap-in
Add-PSSnapin ShareFile
## Create new authentication file
#New-SfClient -Name "C:\Sharefile\SVCACC.sfps" -Account aws
## Variables ##
$OutputAppReqFID = "fo4a3b58-bdd6-44c8-ba11-763e211c183f"
$Project = 'A001'
$LocalPath = "\\file.server.au\$project\DATA\DATA CUSTODIAN\OUTPUT\"
$sfClient = Get-SfClient -Name C:\sharefile\SVCACC.sfps
$OutputAppReqFID_URL = (Send-SfRequest $sfClient -Entity Items -id $OutputAppReqFID).Url
## Create PS Drive ##
New-PSDrive -Name "sfDrive-$($project)" -PSProvider ShareFile -Client $sfClient -Root "\" -RootUri $OutputAppReqFID_URL
## Copy all files from specified path to ShareFile, followed by moving files to another folder ##
foreach ($object in Get-ChildItem -Path $LocalPath) {
Copy-SfItem -Path $object.FullName -Destination "sfDrive-$($project):"
remove-item $object.FullName -Recurse
}
## Remove PS Drive ##
Remove-PSDrive "sfdrive-$($project)"
Answered!
I managed to apply a simple Where-Object that excludes directories (mode d-----) from the upload:
Get-ChildItem -Path $LocalPath -Recurse | Where-Object {$_.Mode -ne "d-----"} | Select-Object -ExpandProperty FullName
Seems to have worked a treat!
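An equivalent and slightly more robust filter is Get-ChildItem's -File switch (PowerShell 3.0 and later), which excludes directories without depending on the exact mode string:

```powershell
# returns only files, never directories, at any depth
$files = Get-ChildItem -Path $LocalPath -Recurse -File |
    Select-Object -ExpandProperty FullName
```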

Files Could Not be Deleted Because They are Being Used by a Process

I tried to execute the following command in PowerShell in order to remove a virtual machine together with its associated files.
Get-VM "VM Name" | %{
Stop-VM -VM $_ -Force;
Remove-VM -VM $_ -Force;
Remove-Item -Path $_.Path -Recurse -Force
}
My problem, however, is that the script resulted in an error because some of the files (snapshots) were still being used by a different process. In addition, the .vhdx files were not deleted. Could anyone help me out with how to solve this problem?
Try this (it merges the VM's snapshots first; the merging needs some time to complete):
Get-VM "VM Name" | %{
    Stop-VM -VM $_ -Force
    Remove-VMSnapshot -VM $_
    Remove-VM -VM $_ -Force
    Remove-Item -Path $_.Path -Recurse -Force
}
Or (stop the VMM service to remove the VM's files without merging):
Get-VM "VM Name" | %{
    Stop-VM -VM $_ -Force
    Stop-Service -Name vmms -Force
    Remove-Item -Path $_.Path -Recurse -Force
    Start-Service -Name vmms
    Start-Service -Name vmhostagent
    Start-Sleep 3
    Remove-VM -VM $_ -Force
}
In addition, my test lab is a 2012 R2 Hyper-V host. Please ensure that the VM's Path folder doesn't contain other VMs' files.
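Remove-VMSnapshot only starts the .avhdx merge, which runs in the background, so a short wait loop before deleting the files can help. A sketch of the first variant with such a loop (the status text assumes an English-locale host):

```powershell
Get-VM "VM Name" | %{
    Stop-VM -VM $_ -Force
    Remove-VMSnapshot -VM $_
    # wait until the background disk merge has finished
    while ((Get-VM -Name $_.Name).Status -ne 'Operating normally') {
        Start-Sleep -Seconds 2
    }
    Remove-VM -VM $_ -Force
    Remove-Item -Path $_.Path -Recurse -Force
}
```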

Powershell, how to add permission to shared folder

I have the following code, which creates a shared folder:
if (!(Test-Path c:\myFolder))
{
New-Item -Path 'c:\myFolder' -ItemType Directory
}
If (!(Get-WmiObject Win32_Share -Filter "Name='myFolder'"))
{
$Shares = [wmiclass]"Win32_Share"
$Shares.Create("c:\myFolder", "myFolder", 0)
}
How can I add Read/Write permission for 'Everyone' to the shared folder?
I prefer not to add an external DLL.
Thanks
The Carbon module has an Install-Share function that will do what you need:
Install-Share -Name myFolder -Path C:\myFolder -Permissions "EVERYONE,FULL"
Internally, Install-Share uses the net share console application. If you run net share /?, you'll get the syntax for creating a share from the command line.
Disclaimer: I am the owner/maintainer of Carbon.
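For reference, the underlying net share call for this question's folder would look roughly like this (FULL can be swapped for CHANGE or READ):

```powershell
net share myFolder=C:\myFolder /GRANT:Everyone,FULL
```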
Try the Set-SharePermission function from the ShareUtils module (http://en-us.sysadmins.lv/Lists/Posts/Post.aspx?ID=28):
Import-Module ShareUtils
Get-Share -Name myFolder |
Set-SharePermission -User Everyone -AccessType Allow -Permission Change |
Set-Share
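On Windows 8 / Server 2012 and later, the built-in SmbShare module also covers this without any external download; a minimal sketch:

```powershell
# create the share with change (read/write) access for Everyone
New-SmbShare -Name myFolder -Path C:\myFolder -ChangeAccess Everyone
# or adjust an existing share:
Grant-SmbShareAccess -Name myFolder -AccountName Everyone -AccessRight Change -Force
```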