I want to improve my script so that it can accomplish the following:
Scan servers based on get-adcomputer on specific OUs.
Scan each server based on whatever drive letter it has.
Scan each server for log4j.
Export all results to a CSV that identifies the folder path, name of file, and the server that the file was found on.
I have been using the following code to start with:
$Servers = Get-ADComputer -Filter * -SearchBase "OU=..." | Select-Object -ExpandProperty Name
foreach ($server in $Servers){
    Invoke-Command -ComputerName $server -ScriptBlock {
        $Drives = (Get-PSDrive -PSProvider FileSystem).Root
        foreach ($drive in $Drives){
            Get-ChildItem -Path $drive -Force -Filter *log4j* -ErrorAction SilentlyContinue |
            ForEach-Object {
                $Item   = $_
                $Type   = $_.Extension
                $Path   = $_.FullName
                $Folder = $_.PSIsContainer
                $Age    = $_.CreationTime
                $Path | Select-Object `
                    @{n="Name";e={$Item}}, `
                    @{n="Created";e={$Age}}, `
                    @{n="FilePath";e={$Path}}, `
                    @{n="Extension";e={if($Folder){"Folder"}else{$Type}}}
            } | Export-Csv C:\Results.csv -NoTypeInformation
        }
    }
}
I am running into the following issues and would like to address them so I can learn.
How can I get the CSV to appear the way I want, but have it collect the information and store it on my machine instead of leaving it on each remote server?
I have noticed extreme performance issues on the remote hosts when running this. WinRM takes 100% of the processor while it is running. I have tried -Include first, then -Filter, but to no avail. How can this be improved so that at worst, it's solely my workstation that's eating the performance hit?
What exactly do the ` marks do?
I agree with @SantiagoSquarzon - that's going to be a performance hit.
Consider writing a function to run Get-ChildItem recursively with the -Depth parameter, including a Start-Sleep command to pause occasionally. Also, you may want to note this link.
You'd also want to Export-CSV to a shared network drive to collect all the machines' results.
The backticks indicate a continuation of the line, like \ in bash.
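For example, these two commands are identical as far as PowerShell is concerned; the backtick escapes the line break so the command continues on the next line:
# one line
Get-ChildItem -Path C:\Temp -Filter *.log
# same command, split across two lines with a backtick
Get-ChildItem -Path C:\Temp `
    -Filter *.log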
Finally, consider using a Scheduled Task or starting a PowerShell sub-process with a lowered process priority; maybe that will help.
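Pulling those suggestions together, a rough sketch (untested; the OU, the 2-second pause, and the output path are placeholders): lower the remote session's priority, pause between drives, and let Invoke-Command stream the results back so the CSV is written once on your workstation instead of on each server (Export-Csv to a network share would work the same way). Invoke-Command adds a PSComputerName property automatically, which gives you the "which server" column:
$Servers = Get-ADComputer -Filter * -SearchBase "OU=..." | Select-Object -ExpandProperty Name

$results = Invoke-Command -ComputerName $Servers -ScriptBlock {
    # drop this remote session's priority so the scan doesn't starve the server
    (Get-Process -Id $PID).PriorityClass = 'BelowNormal'

    foreach ($drive in (Get-PSDrive -PSProvider FileSystem).Root) {
        Get-ChildItem -Path $drive -Recurse -Force -Filter '*log4j*' -ErrorAction SilentlyContinue |
            Select-Object Name, CreationTime, FullName,
                          @{n='Extension'; e={ if ($_.PSIsContainer) { 'Folder' } else { $_.Extension } }}
        Start-Sleep -Seconds 2   # brief pause between drives
    }
}

# PSComputerName identifies which server each hit came from
$results |
    Select-Object PSComputerName, Name, CreationTime, FullName, Extension |
    Export-Csv -Path 'C:\Results.csv' -NoTypeInformation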
Good morning,
Hopefully this will be a quick and easy one to answer.
I am trying to run a PS script and have it export to csv based on a list of IP addresses from a text file. At the moment, it will run but only produce one csv.
Code Revision 1
$computers = get-content "pathway.txt"
$source = "\\$computer\c$"
foreach ($computer in $computers) {
Get-ChildItem -Path "\\$Source\c$" -Recurse -Force -ErrorAction SilentlyContinue |
Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
Export-CSV -Path "C:\path\$computer.csv" -NoTypeInformation
}
Edit
The script is now creating the individual server files as needed, and I did change the source .txt file to list the servers by hostname rather than IP. The issue now is that no data is populating in the .csv files: the script creates them, but nothing populates. I have tried different source file paths to see if maybe it's due to folder permissions or an empty folder, but nothing seems to populate in the files.
The $computer file lists a number of server IP addresses so the script should run against each IP and then write out to a separate csv file with the results, naming the csv file the individual IP address accordingly.
Does anyone see any errors in the script that I provided, that would prevent it from writing out to a separate csv with each run? I feel like it has something to do with the foreach loop but I cannot seem to isolate where I am going wrong.
Also, I cannot use any third-party software, as this is a closed network with very strict firewall rules, so I am left with PowerShell (which is okay). And yes, this will be a very long run for each of the servers, but I am okay with that.
Edit
I did forget to mention that when I run the script, I get an error indicating that the Export-Csv path is too long, which doesn't make any sense unless it is trying to write all of the IP addresses to a single file name.
"Export-CSV : The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.
At line:14 char:1
TIA
Running the script against the C: drive of each computer, especially with the -Recurse option, is strongly inadvisable. But for your understanding, this is how you should pass the values to the variables. I haven't tested this code.
$computer = get-content "pathway.txt"
foreach ($Source in $computer) {
    Get-ChildItem -Path "\\$Source\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-Csv -Path "C:\Path\$Source.csv" -NoTypeInformation
}
$computer will hold the whole content, and foreach will loop over it, with $Source getting one IP at a time. I also suggest using hostnames instead of IPs, so that your output is a servername.csv file for each server.
In hopes that this helps someone else: I have finally gotten the script to run and create the individual .csv files for each server hostname.
$servers = Get-Content "path"
foreach ($server in $servers)
{
    Get-ChildItem -Path "\\$server\c$" -Recurse -Force -ErrorAction SilentlyContinue |
        Select-Object Name,Extension,FullName,CreationTime,LastAccessTime,LastWriteTime,Length |
        Export-Csv -Path "path\$server.csv" -NoTypeInformation
}
I am using PowerShell 2.0 on Windows 7.
I would like to copy a file from my USB stick to a directory on my main hard drive using cmd or powershell. However, I need this to function on any PC without any input of the USB's current drive letter. In case that didn't make sense, let me rephrase it. I need a powershell or cmd command/ batch script to copy a file from my USB stick to my hard drive without any input.
The ideal command would assign the drive letter to a variable, mydrive, and allow me to run something like this in cmd:
copy mydrive:/path/fileToCopy.txt C:/path/of/target/directory/
I would really appreciate it if I could just use my USB stick's name ('DD') to copy like this:
copy DD:/path/fileToCopy.txt C:/path/of/target/directory/
I've done well over an hour's worth of research trying to find a way to pull this off and can't. Any help is greatly appreciated, especially if it is clear how to use it. I am very new to PowerShell and cmd commands and don't understand the syntax, so stuff like [put drive name here] to show me how to use it would be amazing; that is where a lot of forums are missing out.
You can do this like below:
$destination = 'C:\path\of\target\directory'
$sourceFile = 'path\fileToCopy.txt' # the path to the file without drive letter
# get (an array of) USB disk drives currently connected to the pc
$wmiQuery1 = 'ASSOCIATORS OF {{Win32_DiskDrive.DeviceID="{0}"}} WHERE AssocClass = Win32_DiskDriveToDiskPartition'
$wmiQuery2 = 'ASSOCIATORS OF {{Win32_DiskPartition.DeviceID="{0}"}} WHERE AssocClass = Win32_LogicalDiskToPartition'
$usb = Get-WmiObject Win32_DiskDrive | Where-Object { $_.InterfaceType -eq 'USB' } |
    ForEach-Object {
        Get-WmiObject -Query ($wmiQuery1 -f $_.DeviceID.Replace('\', '\\'))  # double up the backslash(es)
    } |
    ForEach-Object {
        Get-WmiObject -Query ($wmiQuery2 -f $_.DeviceID)
    }
# loop through these disk(s) and test if the file to copy is on it
foreach ($disk in $usb) {
    # join the DeviceID (like 'H:') with the file path you need to copy
    $file = Join-Path -Path $disk.DeviceID -ChildPath $sourceFile
    if (Test-Path -Path $file -PathType Leaf) {
        Copy-Item -Path $file -Destination $destination
        break # exit the loop because you're done
    }
}
Hope that helps
If you upgrade your version of PowerShell, you can replace Get-WmiObject with Get-CimInstance for better performance. See this and that.
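For the "copy by the stick's name" part of the question, a shorter sketch (untested) that simply looks the drive letter up from the volume label 'DD' also works on PowerShell 2.0. Note it matches any disk with that label, not only USB ones, and the paths are the question's placeholders:
# find the logical disk whose volume label is 'DD'
$drive = Get-WmiObject Win32_LogicalDisk -Filter "VolumeName = 'DD'" | Select-Object -First 1
if ($drive) {
    # DeviceID is the drive letter, e.g. 'E:'
    Copy-Item -Path (Join-Path -Path $drive.DeviceID -ChildPath 'path\fileToCopy.txt') -Destination 'C:\path\of\target\directory'
}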
My organization requires the filtering and removal of all .PFX and .P12 files from our computers and servers. The script we currently run every week does not go deep enough or far enough per higher guidance. What I'm trying to do is take my current working script and filter for both file extensions. The person who wrote the script is all but gone from this world, so I'm working with a script I didn't write but am trying to become familiar with.
I've already tried changing some of the variables inside the Get-ChildItem cmdlet to apply the filtering there instead of in a variable. This includes attempts like:
$Files = Get-ChildItem -Path \\$client\c$\Users -Filter -filter {(Description -eq "school") -or (Description -eq "college")} -Recurse -ErrorAction SilentlyContinue
Here is a portion of the Code, not the entire thing. There is logging and notes and other administrative tasks that are done other than this, I've only included the portion of the code that is creating errors.
$computers = Get-ADComputer -Filter * -SearchBase "AD OU PATH OMITTED"
$destination = "****\Software\aPatching_Tools\Log Files\Soft Cert\Workstations\AUG 19\WEEK 2"
$ext = "*.pfx"
foreach ($computer in $computers)
{
    $client = $computer.name
    if (Test-Connection -ComputerName $client -Count 1 -ErrorAction SilentlyContinue)
    {
        $outputdir = "$destination\$client"
        $filerepo = "$outputdir\Files"
        $files = Get-ChildItem -Path \\$client\c$\Users -Filter $ext -Recurse -ErrorAction SilentlyContinue
        if (!$files)
        {
            Write-Host -ForegroundColor Green "There are no .pfx files on $client."
        }
        else
        {
            Write-Host -ForegroundColor Cyan "PFX files found on $client"
The expected and normal operation of the script is that it goes through each machine and tests it: if it's offline, it moves on; if it's online, there is a 4-5 minute pause while it searches, and then it moves on.
The error I get when I make changes such as $ext = "*.p12", ".pfx" is that -Filter does not support this operation. Or if I try the above-mentioned change to the filtering, the script takes 1-2 seconds per machine, and with, at times, 15-20 users in the C:\Users folder, it's nearly impossible to search that fast over the network.
Instead of passing your extensions as the -filter, pass them using -include - that is, $files = Get-ChildItem -Path \\$client\c$\Users\* -include $ext -Recurse -ErrorAction SilentlyContinue. You can then define $ext as an array of strings, e.g., $ext = "*.pfx","*.p12", and Get-ChildItem will return only those files with the indicated extensions.
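Slotted into the existing script, that change would look roughly like this (a sketch, untested; the Write-Host message is adjusted to mention both extensions, and note the trailing \* on the Users path so -Include applies to the items underneath):
$ext = "*.pfx", "*.p12"
# these lines replace the corresponding ones inside the if (Test-Connection ...) block
$files = Get-ChildItem -Path "\\$client\c$\Users\*" -Include $ext -Recurse -ErrorAction SilentlyContinue
if (!$files)
{
    Write-Host -ForegroundColor Green "There are no .pfx or .p12 files on $client."
}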
I am trying to collect a list of the viewers installed on a set of servers. I am trying to loop through that list, run a WMI query, store the results, and export a table with the WMI result and the server name next to it.
I am running this on Server 2012.
$computers = Get-Content C:\computers.txt
$WMIQuery = foreach ($computer in $computers){
    Get-WmiObject -Class Win32_Product | Where-Object {$_.name -match "Microsoft Viewer*"}
}
$WMIQuery
$WMIQuery | Out-File c:\Viewers.txt
Desired results:

Server Name    Object1                   Object2
Server1        Microsoft Excel Viewer    Microsoft Visio Viewer
I output the file and get a blank txt file.
foreach ($computer in (Get-Content -Path "C:\computers.txt")) {
    Get-WmiObject -ComputerName $computer -Class Win32_Product |
        Where-Object { $_.Name -match "Microsoft Viewer" } |
        Out-File -Append -FilePath "C:\viewers.txt"
}
Your original code wasn't identifying the computer to perform Get-WMIObject against, so it was looking at only the computer that you were running the script on.
If there are many products on the remote computer, you may want to consider filtering on the remote computer instead of locally, so as to avoid transferring large amounts of data over what may be a slower-than-ideal network:
foreach ($computer in (Get-Content -Path "C:\computers.txt")) {
    Get-WmiObject -ComputerName $computer -Class Win32_Product -Filter "Name LIKE '%Microsoft Viewer%'" |
        Out-File -Append -FilePath "C:\viewers.txt"
}
(I think I have the filter syntax correct; I seem to have to hack at it every time I write a new filter...)
I don't have enough rep to add a comment, but Jeff is correct. However, there are still issues with the original poster's query. The following piece of code will yield no results based on the examples provided by the poster:
{$_.name -match "Microsoft Viewer*"}
That needs to either be changed to
{$_.name -like "*Microsoft*Viewer*"}
or
{$_.name -match "Microsoft.*?Viewer"}
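To also get the server name next to each product, as in the desired output above, one option (a sketch, untested; the WQL pattern is just an example) is to keep the filtering remote, add the server name as a calculated property, and export to CSV instead of a plain text file:
$results = foreach ($computer in (Get-Content -Path "C:\computers.txt")) {
    Get-WmiObject -ComputerName $computer -Class Win32_Product -Filter "Name LIKE '%Viewer%'" |
        Select-Object @{n='ServerName'; e={$computer}}, Name, Version
}
$results | Export-Csv -Path "C:\viewers.csv" -NoTypeInformation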
I wrote a script to delete temporary files/folders older than 90 days on remote servers. The server.txt file is loaded with Get-Content, and I use 'net use' to map to the IPC$ share. I'm worried that I'm not using Best Practices to delete the old temp files. Here is the meat of my script:
net use \\$server\IPC$ /user:$Authname $pw /persistent:yes
Get-ChildItem -Path "\\$($server)\C$\Temp" -Recurse | Where-Object {!$_.PSIsContainer -and $_.LastAccessTime -lt $cutoffdate} | Remove-Item -Recurse
(Get-ChildItem -Path "\\$($server)\C$\Temp" -recurse | Where-Object {$_.PSIsContainer -eq $True}) | Where-Object {$_.GetFiles().Count -eq 0} | Remove-Item -Recurse
net use \\$Server\IPC$ /delete
The first gci deletes old files, the second deletes empty folders.
The reason I'm concerned is that in my initial tests, it's taking about half an hour to delete approximately 4 GB from one server. And I work in a big shop; my script needs to be run against about 10,000 servers. At that rate my script won't finish for more than six months, and I was hoping to run it on a quarterly basis.
Am I doing something the hard way?
Get a list of your servers.
Cycle through the list and use Invoke-Command -ComputerName.
Your command will be executed on the remote server rather than pulling all the data across the network, which is very slow.
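A rough sketch of that approach (untested; the servers.txt path is an assumption, while the cutoff and temp path come from the question), with the deletion logic moved inside the remote script block so it runs against each server's local C:\Temp instead of the administrative share:
$cutoffdate = (Get-Date).AddDays(-90)

foreach ($server in (Get-Content -Path "C:\servers.txt")) {
    Invoke-Command -ComputerName $server -ArgumentList $cutoffdate -ScriptBlock {
        param($cutoffdate)

        # delete old files, working against the server's local disk
        Get-ChildItem -Path "C:\Temp" -Recurse -Force |
            Where-Object { !$_.PSIsContainer -and $_.LastAccessTime -lt $cutoffdate } |
            Remove-Item -Force

        # then remove any folders left empty
        Get-ChildItem -Path "C:\Temp" -Recurse -Force |
            Where-Object { $_.PSIsContainer -and $_.GetFiles().Count -eq 0 } |
            Remove-Item -Recurse -Force
    }
}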