I needed to write a script that searches a list of computers for a list of files. The script below works fine for that.
Now I need to add functionality so that the script goes through ALL drives on each computer, not just C$. The problem is that I don't know which computers have which drives.
The current script:
$computers = Get-Content .\computers.txt
$filenames = Get-Content .\filenamelist.txt

foreach ($computer in $computers) {
    foreach ($filename in $filenames) {
        Get-ChildItem -Recurse -Force \\$computer\c$ -ErrorAction SilentlyContinue |
            Where-Object { ($_.PSIsContainer -eq $false) -and ($_.Name -eq "$filename") } |
            Select-Object Name, Directory |
            Export-Csv .\FoundFiles.csv -NoTypeInformation -Append
    }
}
So I somehow need to incorporate this command:
$Drives = Get-PSDrive -PSProvider 'FileSystem'
so that it runs on each computer, and the Get-ChildItem line then runs once per drive that actually exists on that remote computer.
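Something like this rough, untested sketch is what I have in mind (it assumes PowerShell remoting is enabled and that the admin shares such as C$ and D$ are reachable):

# Untested sketch: enumerate the filesystem drives on each remote computer
# via Invoke-Command, then search each drive's admin share.
$computers = Get-Content .\computers.txt
$filenames = Get-Content .\filenamelist.txt

foreach ($computer in $computers) {
    # Get the drive letters that actually exist on the remote machine
    $driveLetters = Invoke-Command -ComputerName $computer -ScriptBlock {
        (Get-PSDrive -PSProvider 'FileSystem').Name
    }

    foreach ($drive in $driveLetters) {
        # Compare against the whole filename list in one pass per drive;
        # drives without an admin share are skipped silently.
        Get-ChildItem -Recurse -Force "\\$computer\$drive`$" -ErrorAction SilentlyContinue |
            Where-Object { -not $_.PSIsContainer -and $_.Name -in $filenames } |
            Select-Object Name, Directory |
            Export-Csv .\FoundFiles.csv -NoTypeInformation -Append
    }
}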
Any ideas please?
I am trying to check for the presence of a particular folder, "appdata\Local\Packages\ActiveSync", in each of the profile folders returned for each computer by the script below. Searching through various forums I got this script, and I need further assistance to eventually write the results of Test-Path against each computer name and its corresponding profile path to a file.
e.g. \\Computer1\C:\users\John\appdata\Local\packages\ActiveSync False
Invoke-Command -ComputerName (Get-Content c:\temp\servers.txt) -ScriptBlock {
    Get-ChildItem 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList' |
        ForEach-Object { Get-ItemProperty $_.PSPath }
} | Select-Object PSComputerName, ProfileImagePath |
    Where-Object { $_.ProfileImagePath -like "C:\users*" } |
    Out-File c:\temp\profiles.csv
For this, I think I would use a loop to go through all user path strings like below:
Invoke-Command -ComputerName (Get-Content -Path 'c:\temp\servers.txt') -ScriptBlock {
    $regPath = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\ProfileList\*'
    Get-ItemPropertyValue -Path $regPath -Name 'ProfileImagePath' -ErrorAction SilentlyContinue |
        Where-Object { $_ -like 'C:\Users*' } | ForEach-Object {
            # combine the value with the rest of the path to form a LOCAL path
            $path = Join-Path -Path $_ -ChildPath 'AppData\Local\Packages\ActiveSync'
            [PsCustomObject]@{
                ComputerName = $env:COMPUTERNAME
                Path         = '\\{0}\{1}' -f $env:COMPUTERNAME, ($path.Replace(":", "$"))  # UNC format
                Exists       = Test-Path -Path $path -PathType Container
            }
        }
} | Export-Csv -Path 'c:\temp\profiles.csv' -NoTypeInformation
Please note that if the output should be a structured CSV file, you need to use Export-Csv on the resulting objects instead of Out-File.
You may also need to add the -Credential parameter to the Invoke-Command call so you can give it administrative credentials, for example:
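This is only a minimal sketch of that (it prompts interactively and assumes the account has admin rights on the remote machines):

# Prompt once for an account with admin rights on the remote machines,
# then pass it to Invoke-Command.
$cred = Get-Credential
Invoke-Command -ComputerName (Get-Content -Path 'c:\temp\servers.txt') -Credential $cred -ScriptBlock {
    # ... same script block as above ...
} | Export-Csv -Path 'c:\temp\profiles.csv' -NoTypeInformation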
I want to compress a directory in a specific place.
The source path is: \\$Computers\Users\$Names
What I want is, for each computer, a copy of each user's directory from that computer's source path.
I tried to use a foreach loop like this:
$Computers = Get-ADComputer -Filter "Name -like 'PC*'" | Select-Object -ExpandProperty Name
$Names = Get-ADUser -Filter * | Select-Object -ExpandProperty GivenName

foreach ($Computer in $Computers)
{
    Compress-Archive -Path \\$Computer\Users\* -DestinationPath C:\Saves\$Computer\Test.zip -Force
}
This actually works, but I don't know how I can add a second loop inside the loop.
If anyone can explain the approach, or just offer some advice on how to do that, I would appreciate it.
Thank you for your time.
You're approaching the problem with the wrong logic. You do need an inner loop; however, instead of attempting to compress user profiles that you don't know for sure are there, you can query the remote computer's Users folder to see which ones exist and compress only those:
$Computers = (Get-ADComputer -Filter "Name -like 'PC*'").Name

# Add the profiles you want to exclude here:
$toExclude = 'Administrator', 'Public'

$params = @{
    Force            = $true
    CompressionLevel = 'Optimal'
}

foreach ($Computer in $Computers)
{
    $source = "\\$Computer\Users"
    Get-ChildItem $source -Exclude $toExclude -Directory | ForEach-Object {
        $params.LiteralPath = $_.FullName
        # Name of the zipped file would be "ComputerExample - UserExample.zip"
        $params.DestinationPath = "C:\Saves\$Computer - {0}.zip" -f $_.Name
        Compress-Archive @params
    }
}
I need to delete all archived files and folders older than 15 days.
I have implemented a solution using a PowerShell script, but it takes more than a day to delete all the files. The total size of the folder is less than 100 GB.
$StartFolder = "\\Guru\Archive\"
$deletefilesolderthan = "15"

# Get folder names for the ForEach loop
$SubFolders = Get-ChildItem -Path $StartFolder |
    Where-Object {$_.PSIsContainer -eq $true} |
    Select-Object Name

# Loop through folders
foreach ($Subfolder in $SubFolders) {
    Write-Host "Processing Folder:" $Subfolder

    # For each folder, recurse and delete files older than the specified number of days
    # while the folder structure is left intact.
    Get-ChildItem -Path $StartFolder$($Subfolder.name) -Include *.* -File -Recurse |
        Where-Object LastWriteTime -lt (Get-Date).AddDays(-$deletefilesolderthan) |
        ForEach-Object {$_.Delete()}

    # $dirs will be an array of empty directories returned after filtering;
    # loop until $dirs is empty, while excluding the "Inbound" and "Outbound" folders.
    do {
        $dirs = Get-ChildItem $StartFolder$($Subfolder.name) -Exclude Inbound,Outbound -Directory -Recurse |
            Where-Object {(Get-ChildItem $_.FullName).Count -eq 0} |
            Select-Object -ExpandProperty FullName
        $dirs | ForEach-Object {Remove-Item $_}
    } while ($dirs.Count -gt 0)
}
Write-Host "Completed" -ForegroundColor Green
#Read-Host -Prompt "Press Enter to exit"
Please suggest some way to optimise the performance.
If you have many small files, the long delete time is not abnormal, because every single file descriptor has to be processed. Some improvements can be made depending on your PowerShell version; I'm going to assume you're on at least v4.
#requires -Version 4
param(
    [string]
    $start = '\\Guru\Archive',

    [int]
    $thresholdDays = 15
)

# getting the name wasn't useful. keep objects as objects
foreach ($folder in Get-ChildItem -Path $start -Directory) {
    "Processing Folder: $folder"

    # get all items once
    $folders, $files = ($folder | Get-ChildItem -Recurse).
        Where({ $_.PSIsContainer }, 'Split')

    # process files
    $files.Where{
        $_.LastWriteTime -lt (Get-Date).AddDays(-$thresholdDays)
    } | Remove-Item -Force

    # process folders
    $folders.Where{
        $_.Name -notin 'Inbound', 'Outbound' -and
        ($_ | Get-ChildItem).Count -eq 0
    } | Remove-Item -Force
}
"Complete!"
The reason it takes so much time is that you are deleting files/folders over the network, which requires extra network communication for every single file and folder. You can easily verify that with a network analyzer. The best approach here is to use a method that lets you run the file operations directly on the remote machine, for example:
WinRM
psexec (first copy the script to the remote machine, then execute it with psexec)
remote WMI (using CIM_Datafile)
or even adding the task to the Task Scheduler
I would prefer WinRM, but psexec is also a good choice (if you don't want to do the additional WinRM configuration); a rough sketch of the WinRM approach is shown below.
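For instance, an untested sketch of the WinRM route; it assumes remoting is enabled on the file server behind \\Guru\Archive and that the share maps to a local path such as D:\Archive on that server (adjust both to your environment):

# Untested sketch: run the cleanup locally on the file server over WinRM,
# so no per-file traffic crosses the network.
# Assumptions: remoting is enabled on 'Guru' and the share \\Guru\Archive
# corresponds to the local folder 'D:\Archive' on that server.
Invoke-Command -ComputerName 'Guru' -ScriptBlock {
    $start         = 'D:\Archive'    # local path of the share on the server
    $thresholdDays = 15

    # delete old files locally
    Get-ChildItem -Path $start -File -Recurse |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$thresholdDays) } |
        Remove-Item -Force

    # then remove empty folders, keeping Inbound/Outbound
    Get-ChildItem -Path $start -Directory -Recurse |
        Where-Object {
            $_.Name -notin 'Inbound', 'Outbound' -and
            (Get-ChildItem -LiteralPath $_.FullName).Count -eq 0
        } |
        Remove-Item -Force
}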
I am trying to copy machine names to a text file if certain programs are found in the per-user installations.
Machine names are being copied to the text file, however it is also picking up machines where the programs are not installed.
$computers = Get-Content C:\temp\MY_TEST.txt | ForEach-Object {$_.Trim()}

foreach ($computer in $computers) {
    Invoke-Command -ComputerName $computer {
        $UserHives = Get-ChildItem Registry::HKEY_USERS\ | Where-Object {$_.Name -match '^HKEY_USERS\\S-1-5-[\d\-]+$'}

        foreach ($Hive in $UserHives)
        {
            # Construct path from base key
            $Path = Join-Path $Hive.PSPath "Software\Microsoft\Windows\CurrentVersion\Uninstall\*"

            # Attempt to retrieve item property
            $one   = Get-ItemProperty -Path $Path -ErrorAction SilentlyContinue | Where-Object {$_.DisplayName -eq 'ABC'}
            $two   = Get-ItemProperty -Path $Path -ErrorAction SilentlyContinue | Where-Object {$_.DisplayName -eq 'DEF'}
            $three = Get-ItemProperty -Path $Path -ErrorAction SilentlyContinue | Where-Object {$_.DisplayName -eq '123'}

            if ($one -or $two -or $three) {
                $computer | Out-File C:\temp\MY.txt -Append
            }
        } #end foreach
    }
    # only copies here, outside of the loop
    #$computer | Out-File C:\temp\MY.txt -Append
} #end main foreach loop
I expect only the machines where one of these programs is found in a user registry hive to be copied to the text file.
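One thing I noticed while writing this up: because Out-File runs inside the Invoke-Command script block, it writes C:\temp\MY.txt on the remote machine, not on mine. A rough, untested sketch of what I may need instead (return the name from the script block, collect locally, then write the file once):

# Untested sketch: emit the computer name from the remote script block when a
# match is found, collect the results locally, then write the file once.
$computers = Get-Content C:\temp\MY_TEST.txt | ForEach-Object { $_.Trim() }

$found = Invoke-Command -ComputerName $computers -ScriptBlock {
    $UserHives = Get-ChildItem Registry::HKEY_USERS\ |
        Where-Object { $_.Name -match '^HKEY_USERS\\S-1-5-[\d\-]+$' }

    foreach ($Hive in $UserHives) {
        $Path = Join-Path $Hive.PSPath "Software\Microsoft\Windows\CurrentVersion\Uninstall\*"
        $apps = Get-ItemProperty -Path $Path -ErrorAction SilentlyContinue |
            Where-Object { $_.DisplayName -in 'ABC', 'DEF', '123' }
        if ($apps) {
            $env:COMPUTERNAME   # returned to the local session
            break
        }
    }
}

$found | Select-Object -Unique | Out-File C:\temp\MY.txt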
I am having some difficulty creating a script to delete a particular file, located in 'appdata\roaming\templates', for each user on a PC, across a list of about 75 PCs. Research indicated that I should create a PowerShell script for this; however, I am fairly new to PowerShell. Can anyone assist?
This is what I have thus far. If I run the body only, it works on the local PC.
foreach ($line in Get-Content .\file.txt) {
    if ($line -match $regex) {
        $users = Get-ChildItem -Path "C:\Users"
        $users | ForEach-Object {
            Remove-Item -Path "C:\Users\$($_.Name)\AppData\Local\File.exe" -Force
        }
    }
}
Remove-Item can work with wildcards, so you can remove the file in question from all given folders like this:
Remove-Item -Path 'C:\Users\*\AppData\Local\File.exe' -Force
To run this command against a list of remote hosts, first read the list from a file (one IP address or FQDN per line):
$servers = Get-Content 'serverlist.txt'
then run Remove-Item on the remote hosts via Invoke-Command:
Invoke-Command -ComputerName $servers -ScriptBlock {
    Remove-Item -Path 'C:\Users\*\AppData\Local\File.exe' -Force
}
Try something like this:
# For each profile folder under C:\Users, build the full path to the file and remove it;
# the script block delay-binds the -Path parameter for each piped directory.
Get-ChildItem "C:\Users" -Directory | Remove-Item -Path {("{0}\AppData\Local\File.exe" -f $_.FullName)} -Force -ErrorAction SilentlyContinue