PowerShell: find files by extension on multiple servers and export

I wrote a script that finds a particular file on Windows servers, searching the disks listed in a file and exporting the results to a .txt file for each server and drive.
I want to run the script against multiple servers and export all the details to a single CSV or Excel file, with the server name and the file location path as output.
#Clear the host
Clear-Host
#Get the list of servers
$Machines = Get-Content "C:\Scripts\Servers.txt"
#Get the list of disks to search
$Disks = Get-Content "C:\Scripts\Disks.txt"
#Looping through specified servers
foreach ($Machine in $Machines)
{
    #Looping through each disk
    foreach ($Disk in $Disks)
    {
        if (Test-Path \\$Machine\$Disk$)
        {
            Write-Host Checking $Machine Disk $Disk -BackgroundColor DarkRed
            Get-ChildItem -Path \\$Machine\$Disk$\ -Filter log4j.jar -Recurse -Name -Force | Out-File "C:\Scripts\Output\$Machine $Disk.txt"
        }
    }
}
Thanks in Advance

Don't send output from the script until you have the information you need.
Using a custom object will allow you to store the values you want:
#Clear host
Clear-Host
#Get the list of servers
$Machines = Get-Content "C:\Scripts\Servers.txt"
#Get the list of disks to search
$Disks = @('D','G')
# Looping through specified servers
foreach ($Machine in $Machines) {
    # Looping through each disk
    $FileResults = $Disks | ForEach-Object {
        If (Test-Path \\$Machine\$_$) {
            Write-Host Checking $Machine Disk $_ -BackgroundColor DarkRed
            $DiskResults = Get-ChildItem -Path \\$Machine\$_$\ -Filter log4j.jar -Name -Recurse -Force
            #
            # Send the results of each disk to a custom object, to be stored in $FileResults
            #
            [pscustomobject]@{Disk=$_;Files=$DiskResults}
        }
    }
    If (-not $FileResults) {
        $FileResults = 'None found'
    }
    #
    # Send the results of each completed server
    #
    [pscustomobject]@{ComputerName=$Machine;Found=$FileResults}
}
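To get everything into a single CSV as asked, the simplest variant is to emit one flat object per file found and pipe the collected results to Export-Csv at the end. A minimal sketch along the same lines (the drive list and output path are only examples, not from the original script):
$Machines = Get-Content "C:\Scripts\Servers.txt"
$Disks = @('D','G')
$Results = foreach ($Machine in $Machines) {
    foreach ($Disk in $Disks) {
        if (Test-Path "\\$Machine\$Disk`$") {
            Get-ChildItem -Path "\\$Machine\$Disk`$\" -Filter log4j.jar -Recurse -Force -ErrorAction SilentlyContinue |
                ForEach-Object {
                    # one flat row per file keeps the CSV simple to read and filter
                    [pscustomobject]@{ComputerName = $Machine; Path = $_.FullName}
                }
        }
    }
}
$Results | Export-Csv "C:\Scripts\Output\log4j_results.csv" -NoTypeInformation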

Related

Get location of specific SCCM device collection in Powershell

I am writing a script to export the names of all computers in a device collection to a txt file. My script works as expected, but I would like to preserve the folder structure in the exported file structure. For this I need to get the location of the device collection.
My Question:
Is there a way to get the location of a SCCM Device Collection in PowerShell?
I've stumbled across a few posts like this and this that use WMI and WQL for this, but I wasn't able to get those working in my script, and I would like to do everything in PowerShell whenever possible.
$collections = (Get-CMDeviceCollection | Select -ExpandProperty "Name")
$totalCollections = $collections.length
"Number of Collections: $totalCollections"
$i = 0
foreach($name in $collections){
    ForEach-Object -Process {
        $i++
        "Writing File $i of $totalCollections"
        $SanitizedName = $name -replace '/','(slash)' -replace '\\','(backslash)' -replace ':','(colon)' -replace '\*','(asterisk)' -replace '\?','(questionmark)' -replace '"','(quote)' -replace '<','(less)' -replace '>','(more)' -replace '\|','(pipe)'
        $file = New-Item -Path "C:\Temp\exporte\$SanitizedName.txt"
        Add-Content -Path $file.FullName -Value (Get-CMCollectionMember -CollectionName $name | Select -ExpandProperty "Name")
    }
}
I would like to expand this code so that the txt files are placed in the corresponding subfolder, analogous to the SCCM folder structure, e.g. rootFolder/rooms/.
I was using this module until now but wasn't able to find anything that gives me back the specific location of a collection.
Thanks in advance
I wasn't able to find a way to do this in plain PowerShell and the SCCM module. In the end I did it like @FoxDeploy suggested: I run a SQL query for each collection (performance isn't an issue in my case) against our SCCM database to get the folder path, and then use that path to place the export file in the appropriate location.
This is my working example with some confidential lines removed:
## Parameter ##
$exportLocation = [removed]
$sqlServer = [removed]
$db = [removed]
$query = "SELECT [ObjectPath] FROM [removed].[v_Collections] WHERE CollectionName ="
$SiteCode = [removed] # Site code
$ProviderMachineName = [removed] # SMS Provider machine name
# Customizations
$initParams = @{}
# Import the ConfigurationManager.psd1 module
if((Get-Module ConfigurationManager) -eq $null) {
    Import-Module [removed]\..\ConfigurationManager.psd1" @initParams
}
# Connect to the site's drive if it is not already present
if((Get-PSDrive -Name $SiteCode -PSProvider CMSite -ErrorAction SilentlyContinue) -eq $null) {
    New-PSDrive -Name $SiteCode -PSProvider CMSite -Root $ProviderMachineName @initParams
}
Set-Location "$($SiteCode):\" @initParams
# get all collections and save them to an array
$collections = (Get-CMDeviceCollection | Select -ExpandProperty "Name")
# total number of collections
$totalCollections = $collections.length
# output to console
"Number of Collections: $totalCollections"
# empty output directory
Set-Location [removed]
Remove-Item $exportLocation\* -Recurse -Force
Set-Location [removed]
# loop through all collections
$i = 0
foreach($name in $collections){
    ForEach-Object -Process {
        # print progress
        $i++
        "Writing File $i of $totalCollections"
        # remove all characters that aren't compatible with the Windows file naming scheme (/\:*?"<>|)
        $SanitizedName = $name -replace '/','(slash)' -replace '\\','(backslash)' -replace ':','(colon)' -replace '\*','(asterisk)' -replace '\?','(questionmark)' -replace '"','(quote)' -replace '<','(less)' -replace '>','(more)' -replace '\|','(pipe)'
        # get members of collection
        $collectionMembers = (Get-CMCollectionMember -CollectionName $name | Select -ExpandProperty "Name")
        # look up the collection's folder path in the SCCM database and write the file there
        Set-Location [removed]
        $path = (Invoke-Sqlcmd -ServerInstance $sqlServer -Database $db -Query "$query '$name'").Item("ObjectPath")
        New-Item -ItemType Directory -Force -Path "$exportLocation$path"
        $file = New-Item -Path "$exportLocation$path\$SanitizedName.txt"
        Add-Content -Path $file.FullName -Value $collectionMembers
        Set-Location [removed]
    }
}
Hope this helps someone. Thanks @FoxDeploy.

Powershell find multiple files from multiple remote computers, from all drives

I needed to make a script that finds a list of files on a list of multiple computers. The script below works fine for that.
Now I need to add functionality so that the script goes through ALL drives on each computer, not just C$. The problem is I don't know which computers have which drives.
The current script:
$computers = Get-Content .\computers.txt
$filenames = Get-Content .\filenamelist.txt
foreach ($computer in $computers) {
    foreach ($filename in $filenames) {
        Get-ChildItem -Recurse -Force \\$computer\c$ -ErrorAction SilentlyContinue |
            Where-Object { ($_.PSIsContainer -eq $false) -and ( $_.Name -eq "$filename") } |
            Select-Object Name, Directory |
            Export-Csv .\FoundFiles.csv -nti -append
    }
}
So I should somehow incorporate this command:
$Drives = Get-PSDrive -PSProvider 'FileSystem'
so that it is run on each computer, and the Get-ChildItem line is then run based on the result, for each existing drive on each remote computer.
Any ideas please?
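One way to do that (a sketch, assuming PowerShell remoting is enabled on the targets and you have access to the admin shares) is to ask each computer for its filesystem drives with Invoke-Command, then reuse the UNC search once per drive. Checking the whole filename list in a single pass also avoids recursing each drive once per filename:
$computers = Get-Content .\computers.txt
$filenames = Get-Content .\filenamelist.txt
foreach ($computer in $computers) {
    # ask the remote machine which filesystem drives it actually has
    $drives = Invoke-Command -ComputerName $computer -ScriptBlock {
        (Get-PSDrive -PSProvider 'FileSystem').Name
    } -ErrorAction SilentlyContinue
    foreach ($drive in $drives) {
        Get-ChildItem -Recurse -Force "\\$computer\$drive`$" -ErrorAction SilentlyContinue |
            Where-Object { ($_.PSIsContainer -eq $false) -and ($filenames -contains $_.Name) } |
            Select-Object @{n='ComputerName';e={$computer}}, Name, Directory |
            Export-Csv .\FoundFiles.csv -NoTypeInformation -Append
    }
}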

Issues with Copy-item script

I have a script that needs to copy a list of files to certain directories and locations on target servers.
I was able to understand that I need to create a CSV file as follows:
I need to understand how to map each file from its source location to the matching file in the target location. Any ideas?
My code looks like this:
# customize log file
$date = (Get-Date -Format d).ToString() | foreach {$_ -replace "/", "_"}
$time = (Get-Date)
$scriptDir = "D:\Scripts\ServerBuildToolkitT1\SingleFileUpfate\"
$logDir = "D:\Scripts\ServerBuildToolkitT1\Logs\SingleFileUpfate\"
$logFileName = "SingleFileUpfate $date.log"
$sources = @()
$destinations = @()
function CSV {
    Import-Csv D:\Scripts_PS\SD.csv | ForEach-Object {
        $sources += $_."Source Location"
        $destinations += $_."Destination Location"
    }
}
# this file contains the list of destination servers that you want to copy
# files/folders to
$computers = Get-Content "D:\Scripts_PS\ServerList.txt"
function main
{
    foreach ($computer in $computers) {
        foreach ($destination in $destinations) {
            Write-Output "$([DateTime]::Now) Copying files update to $computer now" |
                Out-File -FilePath "$logDir\$logFileName" -Append
            Copy-Item $sources -Destination "\\$computer\$destination" -Force -Recurse -Verbose:$false
        }
    }
}
csv
main
Write-Output "$([DateTime]::Now) the operation SingleFileUpdate completed successfully" |
    Out-File -FilePath "$logDir\$logFileName" -Append
I have updated the script (as seen above) and now I am getting the following error:
"WARNING: One or more headers were not specified. Default names starting with "H" have been used in place of any missing headers."

When using Get-ADUser -Filter {}, process is slow

I am trying to clear out some orphaned user shares on a DFS share. I want to compare the full name of each folder to the HomeDirectory property of a user returned by Get-ADUser -Filter.
If I use, for instance, (Get-ADUser $variable -Properties * | Select HomeDirectory), an error is displayed when the account is not found. So I used -Filter to hide the error when no account is found. However, this is much slower than the -Properties * | Select method.
Script:
$path = Read-Host -Prompt "Share path...."
$Dirs = Get-ChildItem $Path
foreach ($D in $Dirs) {
    $Login = Get-ADUser -Filter {HomeDirectory -eq $d.FullName}
    if ($d.FullName -ne $Login."HomeDirectory") {
        $host.UI.RawUI.WindowTitle = "Checking $d..."
        $choice = ""
        Write-Host "Comparing $($d.FullName)......." -ForegroundColor Yellow
        $prompt = Write-Host "An account with matching Home Directory to $($d.FullName) could not be found. Purge $($d.fullname)?" -ForegroundColor Red
        $choice = Read-Host -Prompt $prompt
        if ($choice -eq "y") {
            function Remove-PathToLongDirectory {
                Param([string]$directory)
                # create a temporary (empty) directory
                $parent = [System.IO.Path]::GetTempPath()
                [string] $name = [System.Guid]::NewGuid()
                $tempDirectory = New-Item -ItemType Directory -Path (Join-Path $parent $name)
                robocopy /MIR $tempDirectory.FullName $directory
                Remove-Item $directory -Force
                Remove-Item $tempDirectory -Force
            }
            # Start of script to delete folders from user input and confirm
            # the specified folder deletion
            Remove-PathToLongDirectory $d.FullName
        }
    } else {
        Write-Host "Done!" -ForegroundColor Cyan
    }
    Write-Host "Done!" -ForegroundColor Cyan
}
You have a couple of suboptimal things in your code (like (re-)defining a function inside a loop, or creating and deleting an empty directory over and over), but your biggest bottleneck is probably that you do an AD query for each directory. You should be able to speed up your code considerably by making a single AD query and storing its result in an array:
$qry = '(&(objectclass=user)(objectcategory=user)(homeDirectory=*))'
$homes = Get-ADUser -LDAPFilter $qry -Properties HomeDirectory |
    Select-Object -Expand HomeDirectory
so that you can check if there is an account with a given home directory like this:
foreach ($d in $dirs) {
    if ($homes -notcontains $d.FullName) {
        # ... prompt for deletion ...
    } else {
        # ...
    }
}
Performance-wise I didn't notice a difference between the robocopy /mir approach and Remove-Item when deleting a 2 GB test folder with each command. However, if you have paths whose length exceeds 260 characters you should probably stick with robocopy, because Remove-Item can't handle those. I would recommend adding a comment explaining what you're using the command for, though, because the next person reading your script is probably as confused about it as I was.
$empty = Join-Path ([IO.Path]::GetTempPath()) ([Guid]::NewGuid())
New-Item -Type Directory -Path $empty | Out-Null
foreach ($d in $dirs) {
    if ($homes -notcontains $d.FullName) {
        # ... prompt for confirmation ...
        if ($choice -eq 'y') {
            # "mirror" an empty directory into the orphaned home directory to
            # delete its content. This is used b/c regular PowerShell cmdlets
            # can't handle paths longer than 260 characters.
            robocopy $empty $d.FullName /mir
            Remove-Item $d -Recurse -Force
        }
    } else {
        # ...
    }
}
Remove-Item $empty -Force
There's also a PowerShell module built on top of the AlphaFS library that supposedly can handle long paths. I haven't used it myself, but it might be worth a try.

get script to work on remote servers

I want the code below to work on multiple computers. Any idea how to do this? I have the code below, but I think it fails because I do not currently target the servers in question.
Thanks,
CODE:
Write-Host "Script to check Storage Write, Read and Delete Times"
Write-Host "`n"
$computer = Get-Content -path d:\temp\servers.txt
$path = "f:\temp\test.txt"
Foreach ($storage in $computer)
{
$date = Get-Date
Write-Host "Script being run on $date"
$write = Measure-Command { new-item -Path $path -ItemType File -Force } | select TotalMilliseconds
Write-Host "Writing file on $storage took $write"
$read = Measure-Command { Get-Content -Path $path } | select TotalMilliseconds
Write-Host "Reading file on $storage took $read"
$delete = Measure-Command {Remove-Item -Path $path -Force } | select TotalMilliseconds
Write-Host "Deleting file on $storage took $delete"
Write-Host "`n"
}
You need to step back for a second and rethink the approach. You are issuing the filesystem commands every time against f:\temp, which is on your local system.
There are two ways to make remote computers perform filesystem tasks. The easiest way is to use UNC paths. That is, \\server\share format. Assuming you have local admin access:
Foreach ($storage in $computer) {
    $uncpath = $("\\{0}\f`$\temp\text.txt" -f $storage)
    $write = Measure-Command { new-item -Path $uncpath -ItemType #...
    # rest of code uses $uncpath for access
}
Mind you, using UNC paths puts some stress on the LAN, so this type of testing might or might not be accurate enough.
The second way would be using Powershell remoting to connect on remote systems and issuing the commands there. Take a look at New-PSSession, Enter-PSSession and Exit-PSSession cmdlets.
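For example, a minimal sketch of the remoting route using Invoke-Command (the one-shot counterpart of the PSSession cmdlets mentioned above), assuming WinRM is enabled on the target servers and f:\temp exists on each of them; the timings then measure each server's local disk rather than the network path:
$computers = Get-Content -Path d:\temp\servers.txt
foreach ($storage in $computers) {
    Invoke-Command -ComputerName $storage -ScriptBlock {
        # this block runs on the remote server, so the path is local to that machine
        $path = "f:\temp\test.txt"
        $write  = (Measure-Command { New-Item -Path $path -ItemType File -Force }).TotalMilliseconds
        $read   = (Measure-Command { Get-Content -Path $path }).TotalMilliseconds
        $delete = (Measure-Command { Remove-Item -Path $path -Force }).TotalMilliseconds
        Write-Host "Write on $env:COMPUTERNAME took $write ms"
        Write-Host "Read on $env:COMPUTERNAME took $read ms"
        Write-Host "Delete on $env:COMPUTERNAME took $delete ms"
    }
}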