I have two registry files (.reg) exported using PowerShell. I would like to compare the two files and apply their differences to the registry, ideally using PowerShell. I have been using Compare-Object, but that compares the files at the text level. I want to "pre-load" the files into memory and compare them at the key/property level to determine which keys have changed. I would then create a third .reg file with the changes and apply it to the registry.
Is this possible, e.g. using Compare-Object?
Multiple ideas, none of them what I would call good. I cannot find a better way, even using .NET APIs.
(Compare-Object): Use reg.exe to export the target tree to a file, then Get-Content both files and run Compare-Object on their contents.
(Manual): Use Get-Content on new.reg, then parse each line with either -split or a regex. For each item, run Get-ItemProperty on the target and validate the values of the properties and child keys (a rough sketch of this idea follows below).
(Compare-Object): Import new.reg into a temporary registry location, then use Get-ChildItem on both trees and compare all the objects.
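A rough sketch of the manual idea; it assumes the simple one-line "name"=value layout of a .reg export and ignores multi-line and default (@=) values:
# Parse a .reg export into a hashtable of key -> @{ valueName = rawValue }
$reg = @{}
switch -Regex (Get-Content C:\Temp\new.reg) {
    '^\[(.+)\]$'     { $key = $Matches[1]; $reg[$key] = @{} }
    '^"(.+?)"=(.*)$' { if ($key) { $reg[$key][$Matches[1]] = $Matches[2] } }
}
# Build the same structure for old.reg, then compare the two hashtables key by key.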
Related
I am looking to do a hash compare of a variety of things; files, folders, registry properties and registry keys.
File hash is easy...
$md5 = [System.Security.Cryptography.MD5]::Create()
$Hash1 = [System.BitConverter]::ToString($md5.ComputeHash([System.IO.File]::ReadAllBytes($filePath1)))
$Hash2 = [System.BitConverter]::ToString($md5.ComputeHash([System.IO.File]::ReadAllBytes($filePath2)))
if ($Hash1 -eq $Hash2) {}
And a folder is easy because I can do a Get-ChildItem: if the resulting lists aren't identical then the compare is false, and if the lists are identical but any individual file hash differs then the compare is false.
However, I am trying to add registry stuff to this, and I am not clear on how to do ReadAllBytes() on a registry property. Or am I overthinking this, since a property is not a complex file? Should I just check the property kind first with $sourceKey.GetValueKind($name) and then compare the actual values if the kinds match? Does -eq work with every possible data type in the registry, or are there gotchas I need to worry about?
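For reference, a sketch of that kind-then-value check, assuming $sourceKey and $targetKey are open Microsoft.Win32.RegistryKey objects and $name is the value name. The main gotcha with -eq: MultiString and Binary values come back as arrays, where -eq acts as a filter rather than returning a Boolean:
$kind1 = $sourceKey.GetValueKind($name)
$kind2 = $targetKey.GetValueKind($name)
if ($kind1 -ne $kind2) {
    $same = $false
}
elseif ($kind1 -in 'MultiString', 'Binary') {
    # Array-valued kinds: Compare-Object returns nothing when both arrays match
    $same = -not (Compare-Object $sourceKey.GetValue($name) $targetKey.GetValue($name))
}
else {
    $same = $sourceKey.GetValue($name) -eq $targetKey.GetValue($name)
}
One more gotcha: GetValue() expands REG_EXPAND_SZ values by default; pass [Microsoft.Win32.RegistryValueOptions]::DoNotExpandEnvironmentNames as the third argument to compare the raw data instead.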
I have a zip archive that I need to inspect. I need to find the file name of a specific file with the file extension .serverrevision using Powershell.
There will only be one file with this file extension in the zip archive. The file name will be something like "2.1.4.serverrevision". I need to extract the version number, i.e. 2.1.4 in this example.
I know that I can use the following method to list the contents of the zip archive:
[IO.Compression.ZipFile]::OpenRead($ziparchive.FullName).Entries.FullName | %{ "$ziparchive`:$_" }
But I cannot figure out how to search that list for the file extension and then extract the file name.
Any suggestions?
Use a combination of the .Where() and .ForEach() array methods: .Where() to filter and .ForEach() to transform (extract the name part of interest):
[IO.Compression.ZipFile]::OpenRead($ziparchive.FullName).Entries.FullName.
Where({ [IO.Path]::GetExtension($_) -eq '.serverrevision' }, 'First').
ForEach({ [IO.Path]::GetFileNameWithoutExtension($_) })
Note:
You could achieve the same with the analogous Where-Object and ForEach-Object cmdlets, although with collections that are already in memory or easily fit into memory, the methods are faster.
However, the 'First' argument, which stops processing once the first match is found - an important performance optimization - is currently not available with Where-Object; a GitHub issue suggests bringing the features currently exclusive to the .Where() method to the cmdlet as well.
In PowerShell (Core) 7+, an alternative to calling the [IO.Path] methods is to use the Split-Path cmdlet, which now offers -Extension and -LeafBase switches (not available in Windows PowerShell).
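For instance, a sketch of the PowerShell 7+ variant, using the same $ziparchive as above:
[IO.Compression.ZipFile]::OpenRead($ziparchive.FullName).Entries.FullName.
Where({ (Split-Path $_ -Extension) -eq '.serverrevision' }, 'First').
ForEach({ Split-Path $_ -LeafBase })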
First, thanks to those who stop here and try to help!
It's my first time touching PowerShell.
I'm looking for a way to list and export unattached or orphaned resources in Azure (IPs, RGs, Disks ...).
Get-AzDisk | Select-Object -Property Name,AttachedTo,Location | Export-Csv -Path C:\Users\David\test.csv -Delimiter ";" -NoTypeInformation
With this I only have the Az disks in a single .csv file, but how can I add, for example, "Get-AzPublicIpAddress" output into the same .csv?
You want to use a script to do this, not a one-liner.
First, create a function to build a custom object with as many properties as you need, using as many cmdlets as you wish. Get this to work for one object.
Learn about PSCustomObject.
Learn about functions.
Then build a loop that runs the function against all of the objects creating an array holding items which are instances of your custom object.
At the end of the loop, output to CSV (a sketch follows below).
Objects, especially custom objects, are one of the things that set PowerShell apart.
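A minimal sketch of that pattern, assuming the Az module is installed and you are signed in. Treating a disk with no ManagedBy owner and a public IP with no IpConfiguration as "orphaned" is my assumption, as is the Get-OrphanResource function name:
function Get-OrphanResource {
    # Unattached disks have no ManagedBy owner
    Get-AzDisk | Where-Object { -not $_.ManagedBy } | ForEach-Object {
        [PSCustomObject]@{ Type = 'Disk'; Name = $_.Name; Location = $_.Location }
    }
    # Unassociated public IPs have no IpConfiguration
    Get-AzPublicIpAddress | Where-Object { -not $_.IpConfiguration } | ForEach-Object {
        [PSCustomObject]@{ Type = 'PublicIp'; Name = $_.Name; Location = $_.Location }
    }
}

Get-OrphanResource | Export-Csv -Path C:\Users\David\test.csv -Delimiter ";" -NoTypeInformation
Emitting the objects straight to the pipeline, rather than accumulating an array first, works the same when piped to Export-Csv.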
Anyone have any ideas on how to rename files by finding an association with an index file?
I have a file/folder structure like the following:
Folder name = "Doe, John EO11-123"
Several files under this folder
The index file (MS Excel) has several columns. It contains the names in 2 columns (First and Last). It also has a column containing the number EO11-123.
What I would like to do is write a script to look at the folder names in a directory, compare/find an associated value in the index file (like that number EO11-123), and then rename all the files under the folder using a 4th column value in the index.
So,
Folder name = "Doe, John EO11-123", index column1 contains same value "EO11-123", use column2 value "111111_000000" and rename all the files under that directory folder to "111111_000000_0", "111111_000000_1", "111111_000000_2" and so on.
Is this possible with PowerShell or VBScript?
Ok, I'll answer the questions in your comment first. Importing the data into PowerShell allows you to make an array in PowerShell that you can match against, or better yet a hashtable to reference for your renaming purposes. I'll get into that later, but it's way better than trying to have PowerShell talk to Excel and use Excel's search functions, because this way it's all in PowerShell and there are no third-party application dependencies.
As for importing, that script is a function that you can load into your current session, so you run that function and it will automatically take care of the import for you (it opens Excel, then opens the XLS(x) file, saves it as a temp CSV file, closes Excel, imports that CSV file into PowerShell, and then deletes the temp file).
Now, you did not state what your XLS file looks like, so I'm going to assume it's got a header row, and looks something like this:
FirstName | Last Name | Identifier | FileCode
Joe | Shmoe | XA22-573 | JS573
John | Doe | EO11-123 | JD123
If that's not your format, you'll need to either adapt my code, or your file, or both.
So, how do we do this? First, download, save, and (if needed) unblock the Import-XLS script. Then we will dot-source that file to load the function into the current PowerShell session. Once we have the function, we will run it and assign the results to a variable. Then we can make an empty hashtable, and for each record in the imported array create an entry in the hashtable where the 'Identifier' property (in your example above, the one that has the value "EO11-123" in it) is the key and the entire record is the value. So far we have this:
#Load function into current session
. C:\Path\To\Import-XLS.ps1
$RefArray = Import-XLS C:\Path\To\file.xls
$RefHash = @{}
$RefArray | ForEach{ $RefHash.Add($_.Identifier, $_) }
Now you should be able to reference the identifier to access any of the properties for the associated record such as:
PS C:\> $RefHash['EO11-123'].FileCode
JD123
Now, we just need to extract that name from the folder, and rename all the files in it. Pretty straight forward from here.
Get-ChildItem c:\Path\to\Folders -directory | Where{$_.Name -match "(?<= )(\S+)$"}|
ForEach{
$Files = Get-ChildItem $_.FullName
$NewName = $RefHash['$($Matches[1])'].FileCode
For($i = 1;$i -lt $files.count;$i++){
$Files[$i] | Rename-Item -New "$NewName_$i"
}
}
Edit: Ok, let's break down the rename process here. There is a lot of piping, so I'll take it step by step. First off we have Get-ChildItem, which gets a list of folders for the path you specify. That part's straightforward enough. Then it pipes to a Where statement that filters the results, checking each one's name to see if it matches the regular expression "(?<= )(\S+)$". If you are unfamiliar with how regular expressions work, you can see a fairly good breakdown of it at https://regex101.com/r/zW8sW1/1. What that does is match any folder that has more than one "word" in the name, and capture the last "word". It saves that in the automatic variable $Matches, and since it captured text, that gets assigned to $Matches[1]. Now the code breaks down here, because your CSV isn't laid out like I had assumed and you want the files named differently. We'll have to make some adjustments on the fly.
So, those folders that pass the filter get piped into a ForEach loop (which I had a typo in previously, with a ( instead of a {; that's fixed now). For each of those folders it starts off by getting a list of files within that folder and assigning them to the variable $Files. It also sets up the $NewName variable, but since you don't have a column in your CSV named 'FileCode', that line won't work for you. It uses the $Matches automatic variable that I mentioned earlier to reference the hashtable we set up with all of the Identifier codes, and then looks at a property of that specific record to build the new name to assign to files. Since what you want and what I assumed are different, and your CSV has different properties, we'll rework both the previous Where statement and this line a little bit. Here's how that bit of the script will now read:
Get-ChildItem c:\Path\to\Folders -directory | Where{$_.Name -match "^(.+?), .*? (\S+)$"}|
ForEach{
$Files = Get-ChildItem $_.FullName
$NewName = $Matches[2] + "_" + $Matches[1]
That now matches the folder name in the Where statement and captures 2 things. The first thing it grabs is everything at the beginning of the name before the comma. Then it skips everything until it gets to the last piece of text at the end of the name and captures everything after the last space. New breakdown on RegEx101: https://regex101.com/r/zW8sW1/2
So you want ID_LName, which can be gotten from the folder name, so there's really no need to even use your CSV file at this point, I don't think. We build the new name of the files off the automatic $Matches variable, using the second capture group and the first capture group with an underscore between them. Then we just iterate through the files with a For loop based on how many files were found: we start with the first file in the array $Files (record 0), append an underscore and the index to $NewName, and use that to rename the file.
I'm trying to write a script that would go through 1.6 million files in a folder and move them to the correct folder based on the file name.
The reason is that NTFS can't handle a large number of files within a single folder without performance degrading.
The script calls "Get-ChildItem" to get all the items within that folder, and as you might expect, this consumes a lot of memory (about 3.8 GB).
I'm curious if there are any other ways to iterate through all the files in a directory without using up so much memory.
If you do
$files = Get-ChildItem $dirWithMillionsOfFiles
#Now, process with $files
you WILL face memory issues.
Use PowerShell piping to process the files:
Get-ChildItem $dirWithMillionsOfFiles | %{
#process here
}
The second way will consume less memory, and its memory usage should ideally not grow beyond a certain point.
If you need to reduce the memory footprint, you can skip using Get-ChildItem and instead use a .NET API directly. I'm assuming you are on Powershell v2, if so first follow the steps here to enable .NET 4 to load in Powershell v2.
In .NET 4 there are some nice APIs for enumerating files and directories, as opposed to returning them in arrays.
# e.g. move each file; $destDir is a placeholder for your target folder
[IO.Directory]::EnumerateFiles("C:\logs") | %{ Move-Item -LiteralPath $_ -Destination $destDir }
By using this API instead of [IO.Directory]::GetFiles(), only one file name is held in memory at a time, so the memory consumption should be relatively small.
Edit
I was also assuming you had tried a simple pipelined approach like Get-ChildItem |ForEach { process }. If this is enough, I agree it's the way to go.
But I want to clear up a common misconception: In v2, Get-ChildItem (or really, the FileSystem provider) does not truly stream. The implementation uses the APIs Directory.GetDirectories and Directory.GetFiles, which in your case will generate a 1.6M-element array before any processing can occur. Once this is done, then yes, the remainder of the pipeline is streaming. And yes, this initial low-level piece has relatively minimal impact, since it is simply a string array, not an array of rich FileInfo objects. But it is incorrect to claim that O(1) memory is used in this pattern.
Powershell v3, in contrast, is built on .NET 4, and thus takes advantage of the streaming APIs I mention above (Directory.EnumerateDirectories and Directory.EnumerateFiles). This is a nice change, and helps in scenarios just like yours.
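A quick way to see the difference, assuming a very large directory: in v3+ the first name comes back almost immediately (and Select-Object -First stops the pipeline), while in v2 the command blocks until the whole array has been built:
Get-ChildItem $dirWithMillionsOfFiles | Select-Object -First 1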
This is how I implemented it without using .NET 4.0, with only PowerShell 2.0 and the old-fashioned DIR command:
It's just 2 lines of (easy) code:
cd <source_path>
cmd /c "dir /B" | % { Move-Item $_ -Destination "<dest_folder>" }
My PowerShell process only uses 15 MB. No changes needed on the old Windows 2008 server!
Cheers!