I am creating a backup and restore tool with a PowerShell script, and I want the restore step to pick the most recently created folder and restore from that directory structure. The script starts by creating a backup directory with a date/time stamp, like so:
$CurrentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
$CurrentDomainName = $CurrentUser.split("\")[0]
$CurrentUserName = $CurrentUser.split("\")[1]
$folderdate = Get-Date -f MMddyyyy_Hm
$homedir = Get-Aduser $CurrentUserName -prop HomeDirectory | select -ExpandProperty HomeDirectory
New-Item $homedir -Name "TXBackup\$folderdate" -ItemType Directory
$cbookmarks = "$env:userprofile\Appdata\Local\Google\Chrome\User Data\Default\Bookmarks"
md $homedir\TXBackup\$folderdate\Chrome
Copy-Item $cbookmarks "$homedir\TXBackup\$folderdate\Chrome" -Recurse
Backup Folder Structure
Basically, every time someone runs the backup tool it creates a subfolder under the backup directory with the date/time name, so the latest one can be tracked. The problem comes when I want to restore from the last folder created: I can no longer use the $folderdate variable, since it will pick up whatever the time is while the tool is being run. Here is the code without taking the last folder into account. I tried using Sort-Object, but that doesn't appear to give me a clear path to select the last one created, or I am just such a noob that I didn't use it right :(
##Restoring Files from Backup
$CurrentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
$CurrentDomainName = $CurrentUser.split("\")[0]
$CurrentUserName = $CurrentUser.split("\")[1]
$homedir = get-aduser $CurrentUserName -prop HomeDirectory | select -ExpandProperty HomeDirectory
##Restoring Chrome Bookmarks
Sort-Object -Property LastWriteTime |
Select-Object -Last 1
$rbookmarks = "$homedir\TXBackup\$folderdate\Chrome\Bookmarks"
Copy-Item $rbookmarks "C:\Test\"
I know I didn't use that correctly but any direction would be awesome for this newbie :)
You can use Sort-Object with a script block and use [DateTime] methods to parse the date from the folder name, using the same format string you used to create them.
# Sort directory names descending
Get-ChildItem -Directory | Sort-Object -Desc {
    # Try to parse the long format first
    $dt = [DateTime]::new( 0 )
    if( [DateTime]::TryParseExact( $_.Name, 'MMddyyyy_HHmm', [CultureInfo]::InvariantCulture, [Globalization.DateTimeStyles]::None, [ref] $dt ) ) {
        return $dt
    }
    # Fall back to the short format
    [DateTime]::ParseExact( $_.Name, 'MMddyyyy', [CultureInfo]::InvariantCulture )
} | Select-Object -First 1 | ForEach-Object Name
Note: I've changed the time format from Hm to HHmm, because Hm would cause a parsing ambiguity; e.g. 01:46 would be formatted as 146, but parsed back as 14:06.
Also, I would move the year to the beginning, e.g. 20220821_1406, so you could simply sort by name without having to use a script block. But that is not a problem, just an (in)convenience, and you might have a reason to put the year after the day.
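For instance, if you created the folder names year-first (a hypothetical variant of your $folderdate line), a plain name sort would already give you the newest one:
# Hypothetical: year-first names such as 20220821_1406 sort correctly as plain strings
$folderdate = Get-Date -Format 'yyyyMMdd_HHmm'

# The newest backup folder is then simply the first name in descending order
Get-ChildItem "$homedir\TXBackup" -Directory | Sort-Object Name -Descending | Select-Object -First 1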
Given these folders:
08212022
08222022
08222022_1406
08222022_1322
08222022_1324
08222022_1325
08222022_1343
The code above produces the following output:
08222022_1406
To confirm the ordering is correct, I've removed the Select-Object call:
08222022_1406
08222022_1343
08222022_1325
08222022_1324
08222022_1322
08222022
08212022
Note that the ordering is descending (-Desc), so Select-Object -First 1 can be used to select the latest folder.
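To plug this into your restore step, you could keep the selected folder object and build the bookmark path from it. This is only a sketch; it assumes $homedir is populated the same way as in your backup script and that TXBackup contains only the date-stamped folders:
# Pick the newest backup folder by parsing the date from its name (same logic as above)
$latest = Get-ChildItem "$homedir\TXBackup" -Directory | Sort-Object -Descending {
    $dt = [DateTime]::new( 0 )
    if( [DateTime]::TryParseExact( $_.Name, 'MMddyyyy_HHmm', [CultureInfo]::InvariantCulture, [Globalization.DateTimeStyles]::None, [ref] $dt ) ) {
        return $dt
    }
    [DateTime]::ParseExact( $_.Name, 'MMddyyyy', [CultureInfo]::InvariantCulture )
} | Select-Object -First 1

# Restore the Chrome bookmarks from that folder
$rbookmarks = Join-Path $latest.FullName 'Chrome\Bookmarks'
Copy-Item $rbookmarks 'C:\Test\'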
Related
Every day I check whether a CSV file has been exported to a specific folder (path). At the moment there are 14 different paths with 14 different files to check. The files are kept in the folder and are not deleted, so I have to distinguish between a lot of files using LastWriteTime. I would like the code to display the results in table format. I would be happy with something like this:
Name        LastWriteTime      Length
ExportCSV1  21.09.2022 00:50   185
ExportCSV2  21.09.2022 00:51   155
My code looks like this:
$Paths = @('Path1', 'Path2', 'Path3', 'Path4', 'Path5', 'Path6', 'Path7', 'Path8', 'Path9', 'Path10', 'Path11', 'Path12', 'Path13', 'Path13')
foreach ($Path in $Paths){
Get-ChildItem $path | Where-Object {$_.LastWriteTime}|
select -last 1
Write-host $Path
}
pause
This way I want to make sure that the files are being sent each day.
I get the results that I want, but it is not easy to look at them individually.
I am new to PowerShell and would very much appreciate your help. Thank you in advance.
Continuing from my comments, here is how you could do this:
$Paths = @('Path1', 'Path2', 'Path3', 'Path4', 'Path5', 'Path6', 'Path7', 'Path8', 'Path9', 'Path10', 'Path11', 'Path12', 'Path13', 'Path13')
$Paths | ForEach-Object {
    Get-ChildItem $_ | Where-Object {$_.LastWriteTime} | Select-Object -Last 1
} | Format-Table -Property Name, LastWriteTime, Length
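If "last" should mean the newest file by write time rather than the last item in the default (name) order, you could sort explicitly; a sketch, assuming PowerShell 3+ for the -File switch:
$Paths | ForEach-Object {
    Get-ChildItem $_ -File | Sort-Object LastWriteTime | Select-Object -Last 1
} | Format-Table -Property Name, LastWriteTime, Length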
If you want to keep using foreach() instead, you have to wrap it in a scriptblock {…} to be able to chain everything to Format-Table:
. {
    foreach ($Path in $Paths){
        Get-ChildItem $path | Where-Object {$_.LastWriteTime} | Select-Object -Last 1
    }
} | Format-Table -Property Name, LastWriteTime, Length
Here the . operator is used to run the scriptblock immediately, without creating a new scope. If you want to create a new scope (e.g. to define temporary variables that exist only within the scriptblock), you could use the call operator & instead.
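A minimal illustration of the scoping difference (hypothetical variable names):
. { $dotSourced = 'still visible here' }
$dotSourced      # prints: still visible here

& { $callScoped = 'only visible inside' }
$callScoped      # prints nothing; the variable existed only in the scriptblock's own scope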
I have a script configured in my GPO that tracks the logon times of a certain user group and exports them to a CSV.
Now I've been asked to make it show only the LAST logon time.
At the moment it writes down every single logon time, but they would like only the last one.
Is there a way to make it overwrite instead?
Let's say user1 logs in 3 times; I would like to show only the last one, and that for every user in this group.
The script I have at the moment is a very simple one:
"logon {0} {1} {2:DD-MM-YYYY HH:mm:ss}" -f (Get-Date),$env:username, $env:computername >> \\server\folder\file.csv
Much appreciated if somebody could tell me if it is possible!
First of all, you are appending a line to a text file which is not a CSV file because the values aren't separated by a delimiter character.
Then also, you use the wrong order of values for the -f format operator: while the template string clearly has the date as the last placeholder in {2:DD-MM-YYYY HH:mm:ss}, you feed it as the first value.
Please also note that date format strings are case-sensitive, so you need to change DD-MM-YYYY HH:mm:ss into dd-MM-yyyy HH:mm:ss.
I'd advise changing the script you have now to something like:
[PsCustomObject]@{
    User      = $env:USERNAME
    Computer  = $env:COMPUTERNAME
    LogonDate = '{0:dd-MM-yyyy HH:mm:ss}' -f (Get-Date)
} | Export-Csv -Path '\\server\folder\UserLogon.csv' -NoTypeInformation -Append
Then, when it comes down to reading that file back and preserving only the latest logons, you can do
$data = Import-Csv -Path '\\server\folder\UserLogon.csv' | Group-Object User | ForEach-Object {
    $_.Group |
        Sort-Object {[datetime]::ParseExact($_.LogonDate, 'dd-MM-yyyy HH:mm:ss', $null)} |
        Select-Object -Last 1
}
# output on screen
$data
# overwrite the CSV file to keep only the last logons
$data | Export-Csv -Path '\\server\folder\UserLogon.csv' -NoTypeInformation
A single redirection arrow (>) overwrites a file, while >> appends.
There is also the Out-File cmdlet, which overwrites by default and has an -Append switch.
More information is in the PowerShell documentation:
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/out-file?view=powershell-7.1
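As a minimal illustration of the operators only (this reuses your path and the corrected format string; it does not by itself solve the per-user requirement):
$line = 'logon {0:dd-MM-yyyy HH:mm:ss} {1} {2}' -f (Get-Date), $env:USERNAME, $env:COMPUTERNAME

$line >> '\\server\folder\file.csv'           # appends (current behaviour)
$line >  '\\server\folder\file.csv'           # overwrites the whole file
$line | Out-File '\\server\folder\file.csv'   # also overwrites; add -Append to append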
I must reiterate what others have commented about this not being the best approach to the problem; however, given the output file that you have, you could process it like this to get the last record for a given user:
$last = @{} ; Get-Content \\server\folder\file.csv |% { $token = $_ -split ' ' ; $last.$($token[3]) = $_ }
$last
It creates a hash keyed on the username and updates it with the last line from the file for that user. You can also access $last.foo to get the last entry for user foo.
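If you then want to write just those last lines back, a short sketch that overwrites the file with one line per user:
$last.Values | Set-Content '\\server\folder\file.csv'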
I'd also note that your CSV isn't really a CSV, which makes it more difficult to process. You'd be better off using Export-CSV, or at least putting the commas in. Also, while still not the best approach, you could create a file per user which you could just overwrite each time they log in, like this:
new-object PSObject -Property @{ 'date'= (Get-Date); 'username'= $env:username; 'Computer' = $env:computername } | Export-CSV -Path "\\server\folder\$($env:username).csv" -NoTypeInformation
You could import everything for processing by doing:
gci \\server\folder\*.csv |% { import-csv $_ }
I'm looking to create an array of files (PDFs specifically) based on file names in PowerShell. All files are in the same directory. I've spent a couple of days looking and can't find anything that has examples of this, or something close that could be changed. Here is an example of my file names:
AR - HELLO.pdf
AF - HELLO.pdf
RT - HELLO.pdf
MH - HELLO.pdf
AR - WORLD.pdf
AF - WORLD.pdf
RT - WORLD.pdf
HT - WORLD.pdf
....
I would like to combine all files ending in 'HELLO' into an array and 'WORLD' into another array and so on.
I'm stuck pretty early on in the process as I'm brand new to creating scripts, but here is my sad start:
Get-ChildItem *.pdf
Where BaseName -match '(.*) - (\w+)'
Updated Info...
I do not know the name after the " - " so using regex is working.
My ultimate goal is to combine PDF's based on the matching text after the " - " in the filename and the most basic code for this is:
$file1 = "1 - HELLO.pdf"
$file2 = "2 - HELLO.PDF"
$mergedfile = "HELLO.PDF"
Merge-PDF -InputFile $file1, $file2 -OutputFile $mergedfile
I have also gotten the Merge-PDF to work using this code which merges all PDF's in the directory:
$Files = Get-ChildItem *.pdf
$mergedfiles = "merged.pdf"
Merge-PDF -InputFile $Files -OutputFile $mergedfiles
Using this code from @Mathias, the $suffix portion of the -OutputFile works, but the -InputFile portion is returning an error: "Exception calling "Close" with "0" argument(s)"
$groups = Get-ChildItem *.pdf | Group-Object {$_.BaseName -replace '^.*\b(\w+)$','$1'} -AsHashTable
foreach($suffix in $groups.Keys) {Merge-PDF -InputFile $(@($groups[$suffix])) -OutputFile "$suffix.pdf"}
For the -InputFile I've tried a lot of different varieties and I keep getting the "0" arguments error. The values in the Hashtable seem to be correct so I'm not sure why this isn't working.
Thanks
This should do the trick:
$HELLO = Get-ChildItem *HELLO.pdf |Select -Expand Name
$WORLD = Get-ChildItem *WORLD.pdf |Select -Expand Name
If you want to group file names by the last word in the base name and you don't know them up front, regex is indeed an option:
$groups = Get-ChildItem *.pdf |Group-Object {$_.BaseName -replace '^.*\b(\w+)$','$1'} -AsHashTable
And then you can do:
$groups['HELLO'].Name
for all the file names ending with the word HELLO, or, to iterate over all of them:
foreach($suffixGroup in $groups.GetEnumerator()){
    Write-Host "There are $($suffixGroup.Value.Count) files ending in $($suffixGroup.Key)"
}
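If the end goal is to merge each group (per your update), one approach, sketched here on the assumption that Merge-PDF accepts an array of path strings for -InputFile just as in your working all-files example, is to expand each group to full paths first:
$groups = Get-ChildItem *.pdf | Group-Object { $_.BaseName -replace '^.*\b(\w+)$', '$1' } -AsHashTable

foreach ($suffix in $groups.Keys) {
    # pass plain path strings rather than FileInfo objects, in case the module expects strings
    $inputFiles = @($groups[$suffix]) | Select-Object -ExpandProperty FullName
    Merge-PDF -InputFile $inputFiles -OutputFile "$suffix.pdf"
}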
Another option is to get all items with Get-ChildItem and use Where-Object to filter.
$fileNames = Get-ChildItem | Select-Object -ExpandProperty FullName
#then filter
$fileNames | Where-Object {$_.EndsWith("HELLO.PDF")}
#or use the aliases if you want to do less typing:
$fileNames = gci | select -exp FullName
$fileNames | ? {$_.EndsWith("HELLO.PDF")}
Just wanted to show more options, especially the Where-Object cmdlet, which comes in useful when you're calling cmdlets that don't have parameters to filter.
Side note:
You may be asking what -ExpandProperty does.
If you call gci | select FullName (without -ExpandProperty), you get back an array of objects, each with a single property called FullName, rather than plain strings.
This can be confusing, because the type of the objects isn't visible just from reading the PowerShell script.
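A quick way to see the difference in a console (sketch; assumes the current directory contains at least one item):
(Get-ChildItem | Select-Object FullName)[0].GetType().Name                  # PSCustomObject
(Get-ChildItem | Select-Object -ExpandProperty FullName)[0].GetType().Name  # String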
I have 2 folders: 'Old' and 'New'. Most of the files from 'Old' have been copied to 'New'. However, the structure of the sub-folders in 'Old' and 'New' is different, so the file path for a file in 'Old' is very different from that of its copy in 'New'.
I need to loop through each file in 'Old', search for that file in 'New', and write the old and new file-paths for each file to a text file.
I have been assigned to do this manually, but it will take a long time due to the number of files. So I want to write a script. I am new to Powershell and am having difficulty figuring out which cmdlets can help me with my task.
I will appreciate any kind of guidance. Thank you.
Try something like this:
# list old files
$patholdfile = "c:\temp"
$listoldfile = Get-ChildItem $patholdfile -File -Recurse | select Name, FullName

# list new files
$pathnewfile = "c:\temp2"
$listnewfile = Get-ChildItem $pathnewfile -File -Recurse | select Name, FullName

# for each old file, search the new list for files with the same name
$resultsearch = @()
foreach ($currentfile in $listoldfile)
{
    $resultsearch += New-Object psobject -Property @{
        Name    = $currentfile.Name
        OldPath = $currentfile.FullName
        # if you only want the first match, uncomment this line and comment out the next one
        #NewPaths = ($listnewfile | Where {$_.Name -eq $currentfile.Name} | select FullName -First 1).FullName
        NewPaths = ($listnewfile | Where {$_.Name -eq $currentfile.Name} | select FullName).FullName -join "~"
    }
}

# export the result to a CSV file
$resultsearch | select Name, OldPath, NewPaths | export-csv -Path "c:\temp\result.txt" -NoTypeInformation
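If the folder trees are large, indexing the new files by name first avoids re-scanning $listnewfile for every old file; a sketch building on the same variables:
# index new files by name for fast lookups (names occurring more than once map to a group)
$newByName = $listnewfile | Group-Object Name -AsHashTable -AsString

$resultsearch = foreach ($currentfile in $listoldfile) {
    [PSCustomObject]@{
        Name     = $currentfile.Name
        OldPath  = $currentfile.FullName
        NewPaths = ($newByName[$currentfile.Name].FullName) -join '~'
    }
}
$resultsearch | export-csv -Path "c:\temp\result.txt" -NoTypeInformation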
This is a simplified version of what I'd like to achieve... I think it's called 'variable referencing'.
I have created an array containing the content of the folder 'foo'
$myDirectory(folder1, folder2)
Using the following code:
$myDirectory= Get-ChildItem ".\foo" | ForEach-Object {$_.BaseName}
I'd like to create 2 arrays named as each folders, with the contained files.
folder1(file1, file2)
folder2(file1, file2, file3)
I tried the following code:
foreach ($myFolder in $myDirectory) {
${myFolder} = Get-ChildItem ".\$myFolders" | forEach-Object {$_.BaseName}
}
But obviously didn't work.
In Bash it's possible to create an array and refer to it through a variable's name, like this:
"${myFolder[@]}"
I tried to search on Google, but I couldn't find how to do this in PowerShell.
$myDirectory = "c:\temp"
Get-ChildItem $myDirectory | Where-Object{$_.PSIsContainer} | ForEach-Object{
    Remove-Variable -Name $_.BaseName
    New-Variable -Name $_.BaseName -Value (Get-ChildItem $_.FullName | Where-Object{!$_.PSIsContainer} | Select -ExpandProperty Name)
}
I think what you are looking for is New-Variable. Cycle through all the folders under C:\temp and, for each folder, create a new variable. New-Variable would throw an error if the variable already exists, so you can remove a pre-existing variable first. Populate the variable with the current folder's contents in the pipeline using Get-ChildItem. The following is a small explanation of how the -Value of the new variable is generated. Caveat: Remove-Variable has the potential to delete unintended variables, depending on your folder names; I'm not sure of the implications of that.
Get-ChildItem $_.FullName | Where-Object{!$_.PSIsContainer} | Select -ExpandProperty Name
The value of each custom variable is every file (not folder). Use -ExpandProperty to get just the names as strings, as opposed to objects with a Name property.
Aside
What do you plan on using this data for? It might just be easier to pipe the output from the Get-ChildItem into another cmdlet. Or perhaps create a custom object with the data you desire.
Update from comments
$myDirectory = "c:\temp"
Get-ChildItem $myDirectory | Where-Object{$_.PSIsContainer} | ForEach-Object{
    [PSCustomObject] @{
        Hotel = $_.BaseName
        Rooms = (Get-ChildItem $_.FullName | Where-Object{!$_.PSIsContainer} | Select -ExpandProperty Name)
    }
}
You need at least PowerShell 3.0 for the above to work; changing it for 2.0 is easy if need be. This creates an object with hotel names and "rooms", which are the file names from inside the folder. If you don't want the extension, just use BaseName instead of Name in the select.
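For reference, a rough PowerShell 2.0 equivalent of the same idea would swap [PSCustomObject] for New-Object (sketch):
Get-ChildItem $myDirectory | Where-Object{$_.PSIsContainer} | ForEach-Object{
    New-Object PSObject -Property @{
        Hotel = $_.BaseName
        Rooms = (Get-ChildItem $_.FullName | Where-Object{!$_.PSIsContainer} | Select -ExpandProperty Name)
    }
}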
This is how I did it at the end:
# Create an array containing all the folder names
$ToursArray = Get-ChildItem -Directory '.\.src\panos' | Foreach-Object {$_.Name}
# For each folder...
$ToursArray | ForEach-Object {
    # Remove any variable named after the folder. Check if it exists first to avoid errors
    if (Test-Path "variable:$_") { Remove-Variable -Name $_ }
    $SceneName = Get-ChildItem ".\.src\panos\$_\*.jpg"
    # Create an array named after the main folder, containing the names of all the jpgs inside
    New-Variable -Name $_ -Value ($SceneName | Select -ExpandProperty BaseName)
}
And here it goes some code to check the content of all the arrays:
# Print Tours information
Write-Verbose "Virtual tours list: ($($ToursArray.count))"
$ToursArray | ForEach-Object {
    Write-Verbose " Name: $_"
    Write-Verbose " Scenes: $($(Get-Variable $_).Value)"
}
Output:
VERBOSE: Name: tour1
VERBOSE: Scenes: scene1 scene2
VERBOSE: Name: tour2
VERBOSE: Scenes: scene1