Compare txt data with variable - powershell

This is the script that I want to use:
$path = split-path $MyInvocation.MyCommand.path
$vcenter = Read-Host "Please enter the vCenter name where You want to connect"
Import-Module -Name VMware.VimAutomation.Core
Connect-VIserver $vcenter
$folderName = 'Datacenters'
$folder = Get-Folder -Name $folderName
$patches = Get-Content $path\patches.txt -Raw
$baseline = New-PatchBaseline -Name "Baseline$(Get-Random)" -Static -IncludePatch $patches
Attach-Baseline -Entity $folder -Baseline $baseline -Confirm:$false
Scan-Inventory -Entity $folder
Get-Compliance -baseline $baseline -entity $folder | select Entity, Status
Detach-Baseline -Entity $folder -Baseline $baseline -Confirm:$false
Remove-Baseline -Baseline $baseline -Confirm:$false
I tried writing multiple patch numbers into the txt file in the following formats:
ESXi670-201912001,ESXi670-201905001
ESXi670-201905001ESXi670-201912001
I also tried separating the entries with newlines instead of commas, but the script is still not able to feed them into the $baseline variable.
The desired result: write the patch numbers into the text file, attach a new baseline to the VMware environment, and compare the patches installed on the hosts with the patches I wrote into the text file.
Many thanks for the help!

The -IncludePatch parameter expects an array of items. First, I recommend removing the -Raw switch from Get-Content, because that reads the file contents in as one long string. Second, I recommend listing the patches one per line. That combination causes the file to be read as an array of strings, with each string being a patch name.
# patches.txt Contents
ESXi670-201912001
ESXi670-201905001
# Update This Line
$patches = Get-Content $path\patches.txt
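If you would rather keep the patches comma-separated on a single line, a small split works too. This is just a sketch; it assumes the names are separated only by commas and optional whitespace:

```powershell
# Alternative: keep patches.txt as one comma-separated line and split it
# into an array before passing it to -IncludePatch
$patches = (Get-Content $path\patches.txt -Raw) -split ',' |
    ForEach-Object { $_.Trim() } |
    Where-Object { $_ }
```

Either way, $patches ends up as an array of patch-name strings, which is what -IncludePatch expects.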

Related

Change a value of an .ini files in multiple directories

I need to change a value in the DefaultPrint.ini file in multiple users "home" directory.
I've managed to make a PowerShell script to change the value for a single user, but I am struggling to make the script accept multiple users.
$User = "Max"
$IniPath = "\\Network1\Home\$User\"
foreach ($User in $User)
{
$IniFile = "$IniPath\DefaultPrint.ini"
$IniText = Get-Content -Path $IniFile
$NewText = $IniText.Replace("Old_Priter","New_Printer")
Set-Content -Path $IniFile -Value $NewText -Force
}
The directories I need to enter to find the files is based on the usernames, so I need to change the file in:
\Network1\Home\Max
\Network1\Home\John
\Network1\Home\Sophia
etc
The script above works for a single user, and I am trying to adapt it to work with multiple users through a .csv file.
I tried the following
$User = "Max","John","Sophia"
But it does not separate the user directories; it gathers them together (\Network1\Home\Max John Sophia).
I also tried to import via csv, as I need to do this on 200+ users
$User = (Import-Csv -Path .\Users.csv).User
But it ends up doing the same, what am I doing wrong?
You need to move this statement:
$IniPath = "\\Network1\Home\$User\"
inside the loop, so that the path is updated with the correct $user every time, and then rename the $User variable used outside the loop ($Users seems an appropriate name here):
$Users = -split "Max John Sophia"
foreach ($User in $Users)
{
$IniPath = "\\Network1\Home\$User\"
$IniFile = "$IniPath\DefaultPrint.ini"
$IniText = Get-Content -Path $IniFile
$NewText = $IniText.Replace("Old_Priter","New_Printer")
Set-Content -Path $IniFile -Value $NewText -Force
}
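The CSV approach from the question works the same way once the loop variable is distinct from the collection. A sketch, assuming Users.csv has a User column header, with a Test-Path guard added so one missing home directory doesn't abort the run across 200+ users:

```powershell
# Read the user names from the CSV (assumes a 'User' column header)
$Users = (Import-Csv -Path .\Users.csv).User
foreach ($User in $Users)
{
    # Build the per-user path inside the loop
    $IniFile = "\\Network1\Home\$User\DefaultPrint.ini"
    if (Test-Path $IniFile)
    {
        $IniText = Get-Content -Path $IniFile
        $NewText = $IniText.Replace("Old_Priter","New_Printer")
        Set-Content -Path $IniFile -Value $NewText -Force
    }
}
```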

Powershell dropping characters while creating folder names

I am having a strange problem in PowerShell (Version 2021.8.0) while creating and naming folders. I start with a number of individual ebook files in a folder that I set using Set-Location. I use the file name minus the extension to create a new folder with the same name as the e-book file. The code works fine the majority of the time with the various file extensions I have stored in an array at the beginning of the code.
What's happening is that the code creates the proper folder name the majority of the time and moves the source file into the folder after it's created.
The problem is that if the last letter of the source file name, on files with the extension ".epub", is an "e", then the "e" is missing from the end of the created folder name. I thought I also saw it drop "r" and "p", but I have been unable to reproduce that recently.
Below is my code. It is set up to run against file extensions for e-books and audiobooks. Please ignore the error messages that are being generated when files of a specific type don't exist in the working folder. I am just using the array for testing and it will be filled automatically later by reading the folder contents.
This Code Creates a Folder for Each File and moves the file into that Folder:
Clear-Host
$SourceFileFolder = 'N:\- Books\- - BMS\- Books Needing Folders'
Set-Location $SourceFileFolder
$MyArray = ( "*.azw3", "*.cbz", "*.doc", "*.docx", "*.djvu", "*.epub", "*.mobi", "*.mp3", "*.pdf", "*.txt" )
Foreach ($FileExtension in $MyArray) {
Get-ChildItem -Include $FileExtension -Name -Recurse | Sort-Object | ForEach-Object { $SourceFileName = $_
$NewDirectoryName = $SourceFileName.TrimEnd($FileExtension)
New-Item -Name $NewDirectoryName -ItemType "directory"
$OriginalFileName = Join-Path -Path $SourceFileFolder -ChildPath $SourceFileName
$DestinationFilename = Join-Path -Path $NewDirectoryName -ChildPath $SourceFileName
$DestinationFilename = Join-Path -Path $SourceFileFolder -ChildPath $DestinationFilename
Move-Item $OriginalFileName -Destination $DestinationFilename
}
}
Thanks for any help you can give. It's driving me nuts, and I am pretty sure it's something I am doing wrong, as always.
String.TrimEnd()
Removes all the trailing occurrences of a set of characters specified in an array from the current string.
The TrimEnd method removes all characters that match any character in the array you provide. It does not check whether .epub is at the end of the string; rather, it trims any of the supplied characters from the end of the string, one at a time. In your case, the dot, e, p, u and b are removed from the end until no more of those characters remain, so you eventually (and you do) remove more than you intended.
I'd suggest using EndsWith to match your extensions and taking a substring instead, as below. If you deal only with single extensions (e.g. not .tar.gz or other double extensions), you can also use the .NET [System.IO.Path]::GetFileNameWithoutExtension($MyFileName) method.
$MyFileName = "Teste.epub"
$FileExt = '.epub'
# Wrong approach
$output = $MyFileName.TrimEnd($FileExt)
write-host $output -ForegroundColor Yellow
#Output returns Test
# Proper method
if ($MyFileName.EndsWith($FileExt)) {
$output = $MyFileName.Substring(0,$MyFileName.Length - $FileExt.Length)
Write-Host $output -ForegroundColor Cyan
}
# Output returns Teste
#Alternative method. Won't work if you want to trim out double extensions (eg. tar.gz)
if ($MyFileName.EndsWith($FileExt)) {
$Output = [System.IO.Path]::GetFileNameWithoutExtension($MyFileName)
Write-Host $output -ForegroundColor Cyan
}
You're making this too hard on yourself. Use the .BaseName to get the filename without extension.
Your code simplified:
$SourceFileFolder = 'N:\- Books\- - BMS\- Books Needing Folders'
$MyArray = "*.azw3", "*.cbz", "*.doc", "*.docx", "*.djvu", "*.epub", "*.mobi", "*.mp3", "*.pdf", "*.txt"
(Get-ChildItem -Path $SourceFileFolder -Include $MyArray -File -Recurse) | Sort-Object Name | ForEach-Object {
# BaseName is the filename without extension
$NewDirectory = Join-Path -Path $SourceFileFolder -ChildPath $_.BaseName
$null = New-Item -Path $NewDirectory -ItemType Directory -Force
$_ | Move-Item -Destination $NewDirectory
}

Append an array using powershell

I have an array of features in my code. Currently I have declared them as $feature = "System","Battery","Signal","Current";
But in the future there can be more features, so I thought of giving my code an option to add a new feature (implemented as a GUI) using the $feature.Add("$new_feature") command.
This works perfectly for that particular run of the script, but when I run the script again the newly added feature is gone. How can I solve this, so that whenever a new feature is added it stays in the script permanently?
Is this possible?
The simplest approach would be to store the array data in a file:
# read array from file
$feature = @(Get-Content 'features.txt')
# write array back to file
$feature | Set-Content 'features.txt'
You can use $PSScriptRoot to get the location of the script file (so you can store the data file in the same folder). Prior to PowerShell v3 use the following command to determine the folder containing the script:
$PSScriptRoot = Split-Path $MyInvocation.MyCommand.Path -Parent
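Putting the pieces together, a sketch of the persistence pattern: $new_feature stands in for whatever the GUI collects, and the default list comes from the question.

```powershell
# Persist the feature list in a file next to the script
$featureFile = Join-Path $PSScriptRoot 'features.txt'

# Read the array from the file, falling back to the defaults on first run
$feature = @(Get-Content $featureFile -ErrorAction SilentlyContinue)
if ($feature.Count -eq 0) {
    $feature = 'System','Battery','Signal','Current'
}

# $new_feature is assumed to come from the GUI
$feature += $new_feature

# Write the array back so the addition survives the next run
$feature | Set-Content $featureFile
```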
Another option is to store the data in the registry (easier to locate the data, but a little more complex to handle):
$key = 'HKCU:\some\key'
$name = 'features'
# read array from registry
$feature = @(Get-ItemProperty -Path $key -Name $name -EA SilentlyContinue | Select-Object -Expand $name)
# create registry value if it didn't exist before
if (-not $?) {
New-ItemProperty -Path $key -Name $name -Type MultiString -Value @()
}
# write array back to registry
Set-ItemProperty -Path $key -Name $name -Value $feature

Powershell script to find all backups on all drives and print file path, file name, and file size

I found scripts that all circle around the answer I need but I cannot figure out how to combine them.
Here is a script to find all the backups on all drives but it moves them; I just want to print the details (to file preferably).
foreach ($server in Get-Content c:\scripts\sl.txt){
foreach ($root in 'c$','d$','e$','f$'){
cmd /c dir "\\$server\$root\*.bak" /B /S /A-D |%{
Move-Item $_ -destination C:\users\Scripts
}
}
}
And I found others that will print all files with particular extensions found in a single drive.
$Extensions = @(".bak",".csv",".txt")
Foreach ( $Extension in $Extensions )
{
[System.IO.Directory]::EnumerateFiles("C:\","*$Extension","AllDirectories")
}
I am having trouble combining the two and under tons of pressure. Please help!
That first example uses cmd to call dir, which is unnecessary since Get-ChildItem can do a directory listing. Get-ChildItem actually returns much more information, in object form, which is very usable in further scripting. There are even aliases (Get-Help alias) for Get-ChildItem: dir, ls and gci. (Save those for the command line; scripts should use the long form for readability.)
The second example uses a roundabout .NET method to enumerate files. It is much easier to use dot notation, or Select-Object -Property, directly on the PowerShell objects. Use Get-Member to see the list of properties and methods of an object, e.g. gci | gm.
PS M:\> $file = gci c:\windows\notepad.exe
PS M:\> $file.DirectoryName
C:\windows
Or
PS M:\> (gci c:\windows\notepad.exe).DirectoryName
C:\windows
If you wanted to do a oneliner, set $server beforehand or insert actual name, and change the output file name each time:
"C$","D$","E$","F$" | %{gci "\\$server\$_\*.bak" -recurse} | %{export-csv -notypeinformation -append c:\temp\filelist.csv}
Another thing to consider would be modifying the objects returned by 'Get-ChildItem' to add a property to hold the 'server' property. Since the DirectoryName property already includes the root drive letter, you could then output all servers and drives .bak file lists into one file.
Bottom Line, use this modified version of what arco444 wrote:
function List-Backups {
foreach ($server in Get-Content c:\scripts\serverlist.txt){
foreach ($root in 'c','d','e','f'){
$outfile = "C:\Temp\FileList-$server-$root.csv"
Get-ChildItem "\\$server\$root`$\*.bak" |
Add-Member -MemberType NoteProperty -Name ServerName -Value $server -PassThru |
Export-Csv $outfile -NoTypeInformation -Append
}
}
}
This gives you a CSV file with all the files remaining as objects. You can then do what you want with the CSV.
import-csv c:\temp\filelist.csv | select Name, DirectoryName
Later, you can create function(s) to pull information from the text files output by this function.
Try the below:
foreach ($server in Get-Content c:\scripts\sl.txt){
foreach ($root in 'c$','d$','e$','f$'){
$files = Get-ChildItem \\$server\$root\*.bak
foreach($f in $files) {
Write-Output "$($f.directoryname) $($f.name) $($f.length)" | Tee-Object -Append C:\output.log
}
}
}
For each root volume you can use the Get-ChildItem command to get a list of *.bak files. This will return a list of FileInfo objects which contain properties such as length (size), name, LastWriteTime etc...
You can access these properties by looping over the list and accessing using the . notation. Use Write-Output to print the results to the screen, you can optionally pipe to Tee-Object to print to both the screen and a file.
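The same idea without the string formatting: select just the wanted properties and export them, which keeps the output usable as objects later. A sketch, where the server name and output path are placeholders:

```powershell
# List *.bak files, keep only the three wanted properties, write to CSV
Get-ChildItem "\\$server\c$\*.bak" -Recurse |
    Select-Object DirectoryName, Name, Length |
    Export-Csv C:\Temp\backups.csv -NoTypeInformation
```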

Get Folder NTFS ACL on long path name

I have a PS script that will return NTFS ACLs if an individual user is assigned, works well until I hit a path exceeding 260 characters. I've found a lot of information on the path too long problem and some work-arounds but I'm struggling to integrate a solution into my script. Any suggestions?
Thanks!
$DateStart = Get-Date
$Path = "E:\"
$PermittedOU1 = "OU=Groups,dc=chiba,dc=local"
$PermittedOU3 = "OU=System Accounts,OU=Accounts,dc=chiba,dc=local"
$PermittedACL1 = get-adgroup -Filter * -SearchBase $PermittedOU1
$PermittedACL3 = get-aduser -Filter * -SearchBase $PermittedOU3
$ObjectPathItem = Get-ChildItem -Path $Path -Recurse | Where-Object {$_.PsIsContainer} | ForEach-Object -Process { $_.FullName }
$howmany=0
$Logfilename = "C:\Users\administrator\Documents\$(get-date -f yyyy-MM-dd-hh-mm).csv"
Add-Content $Logfilename "$DateStart`n"
$totalfolders=0
$i=0
ForEach ($Folder in $ObjectPathItem)
{
$totalfolders++
}
Foreach ($Folder in $ObjectPathItem)
{
$ObjectACL = Get-ACL -Path $Folder
$i++
$howmany=0
Write-Progress -id 1 -Activity "Folder Recursion" -status "Folders Traversed: " -PercentComplete (($i / $totalfolders) * 100)
Foreach ($ACL in $ObjectACL.access)
{
$ACLstring = $ACL.identityreference.Value
$ACLstring = $ACLstring.Replace("CHIBA\","")
if (($ACLstring -notin $PermittedACL1.name)`
-and ($ACLstring -notin $PermittedACL3.SamAccountName)`
-and ($ACLstring -notin "NT AUTHORITY\SYSTEM") `
-and ($ACLstring -notin "BUILTIN\Administrators") `
-and ($ACLstring -notin "CREATOR OWNER"))
{
$newline = "`"$Folder`"" + "," + "$ACLString"
Add-Content $Logfilename "$newline"
$howmany+=1
}
else {
$howmany+=1
}
}
}
$DateEnd = Get-Date
Add-Content $Logfilename "`n`n$DateEnd"
One option you can usually use is to create a mapped drive using New-PSDrive. Something like:
Try{
$ObjectACL = Get-ACL -Path $Folder
}
Catch{
$SubPathLength = $Folder.Substring(0,200).LastIndexOf('\')
$NewTempPath = $Folder.Substring(0,$SubPathLength)
New-PSDrive -Name Temp4ACL -Provider FileSystem -Root $NewTempPath
$ObjectACL = Get-ACL "Temp4ACL:$($Folder.Substring($SubPathLength))"
}
That will find the last \ before the 200th character in the path, grab a substring of the full path up to the end of that folder's name and create a temp drive of it, then get the ACL based off the temp drive and the remaining path. So this path:
C:\Temp\Subfolder\Really Long Folder Name\Another Subfolder\ABCDEFGHIJKLMNOPQRSTUVWXYZ\We Are Really Pushing It Now\Im Running Out Of Folder Name Ideas\Hello My Name Is Inigo Montoya\You Killed My Father Prepare To Die\ReadMe.txt
Gets cut at the second to last backslash. I would end up getting the ACL from:
Temp4ACL:\You Killed My Father Prepare To Die\ReadMe.txt
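One detail to add: PSDrive names must be unique, so the temporary drive should be removed once the ACL has been read, or the next long path that hits the Catch block will fail because the name Temp4ACL is already taken:

```powershell
# Clean up the temporary drive so the name can be reused on the next long path
Remove-PSDrive -Name Temp4ACL
```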
The easy way is to use the "\\?\" prefix, which supports paths up to 32,767 characters.
$folder = "C:\MyFolder"
icacls "\\?\$folder"
https://msdn.microsoft.com/en-us/library/windows/desktop/aa364963(v=vs.85).aspx
In the ANSI version of this function, the name is limited to MAX_PATH characters. To extend this limit to 32,767 wide characters, call the Unicode version of the function (GetFullPathNameW), and prepend "\\?\" to the path.
Okay, this question is quite old, but for those coming here today like myself, here is some information I found through Google:
Microsoft TechNet Script Center lists a "File System Security PowerShell Module" which claims that since version 3.0 it "leverages the AlphaFS (http://alphafs.codeplex.com) to work around the MAX_PATH limitation of 260 characters". At the time of this writing the module is at version 4.2.3.
The general idea of the module is described as: "PowerShell only offers Get-Acl and Set-Acl but everything in between getting and setting the ACL is missing. This module closes the gap."
So without having tried it myself, I suppose it should help solve the OP's problem.
The module is also featured in a post by the "Hey, Scripting Guy! Blog".