Need to scan all domain computers for .pst files - powershell

I am new to PowerShell scripting, like brand new. I have some experience using Exchange PowerShell, but that has always been for very specific items like adjusting calendar permissions and such. Nothing too robust.
Currently I am working on a PowerShell script, pushed out via Group Policy, that will run a search on each domain PC. I've been getting help from a co-worker, but he isn't available right now and I have a hard time following him sometimes. I am hoping this site and its users might be able to assist me. What I am trying to do (and I believe I am close) is pull a list of drives for each computer on the domain. Once I pull that list, I pipe it into a variable and then search that variable for any files that end in .pst. Once the search is complete, if there were results, a file should be created with the FullName (path) of each file, and the computer name should be used for naming the file. If there are no results from the search, the file would be empty, but it should still be named after the computer. I believe I have gotten everything correct except that I do not know how to name the file based on the computer name. Thank you for your time and help with this.
Here is my code so far:
# Get all local fixed drives (DriveType 3)
$drives = Get-WmiObject -Query "SELECT * FROM win32_logicaldisk WHERE DriveType = '3'" | Select-Object deviceid

$pst = @()
foreach ($drive in $drives) {
    # Search each drive for .pst files and keep the full path of each hit
    $pstfound = Get-ChildItem $drive.deviceid *.pst -Recurse | Select-Object fullname
    $pst += $pstfound
}

if ($pst.Count -eq 0) {
    $pst | Out-File "\\Servername\Searchresults\Null"
} else {
    $pst | Out-File "\\Servername\Searchresults\HasItems"
}

Thank you. I wasn't initially planning on using the UNC path, but I changed it up anyway, and I think that will make it easier to go through later. I also figured out my issue with naming the generated file after the computer it ran on: I just set a variable $hostname = hostname and then named the files \$hostname.csv.
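For reference, here is a minimal sketch of how the finished script might look with the hostname-based file name (the server and share names are placeholders):

# Get all local fixed drives (DriveType 3)
$drives = Get-WmiObject -Query "SELECT * FROM win32_logicaldisk WHERE DriveType = '3'" | Select-Object deviceid

$pst = @()
foreach ($drive in $drives) {
    # Collect the full path of every .pst found on each drive
    $pst += Get-ChildItem $drive.deviceid *.pst -Recurse -ErrorAction SilentlyContinue | Select-Object FullName
}

# The file is created (even if empty) and named after the computer the script ran on
$hostname = hostname
$pst | Out-File "\\Servername\Searchresults\$hostname.csv"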


Choose which CSV to import when running a PowerShell script

I get a CSV every week that our finance team puts in a shared drive. I have a script for that CSV that I run once I get it.
The first command of the script is of course Import-Csv.
The problem is, the finance team insists on naming the file differently each time plus they don't always put it in the same location within the drive.
As a result, I have to first hunt for the file, put it into the directory that the script points to and then rename the file.
I've tried talking to the team about putting it in the same location and making sure the filename is the same but they only follow the instructions for a couple of weeks before just doing whatever.
Ideally, I'd like it so that when I run the script, a popup asks me to pick a CSV (similar to the dialog you get when you do "Save As" on an Office document).
Is there any way to do this within PowerShell?
You can access .Net classes and interface with the forms library to instantiate and take input from the standard FileOpen dialog. Something like below:
Using Namespace System.Windows.Forms
# Load the Windows Forms assembly so the OpenFileDialog type is available
Add-Type -AssemblyName System.Windows.Forms

$FileBrowser = [OpenFileDialog]::new()
$FileBrowser.InitialDirectory = 'c:\temp'
$FileBrowser.Filter = 'Comma Separated Values (*.csv)|*.csv'
[Void]$FileBrowser.ShowDialog()
$CsvFile = $FileBrowser.FileName
Then use $CsvFile in the Import-Csv command.
You can change the .InitialDirectory property to make navigating a little more convenient.
Use the .Filter property to limit the file open display to CSV files, to make things that much more convenient.
Also, use the [Void] cast to prevent the dialog's status return (usually 'OK' or 'Cancel') from echoing to the screen.
Note: A simple Google search will turn up many examples. I refined some of the work from here. That will also document some of the other properties if you want to explore etc.
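As a small usage sketch building on the code above (nothing here beyond the variables already defined): you can capture the dialog result instead of voiding it, so a Cancel doesn't feed an empty path to Import-Csv:

# Show the dialog; only import if the user clicked OK rather than Cancel
if ($FileBrowser.ShowDialog() -eq [DialogResult]::OK) {
    $Data = Import-Csv -Path $FileBrowser.FileName
    $Data | Format-Table
}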
If you are willing to settle for a selection box that doesn't look as nice as the Save As dialog, you can use Out-Gridview. Something along these lines might help.
$filenames =
@(Get-ChildItem -Path C:\temp -Recurse -Filter *.csv |
Sort-Object LastWriteTime -Descending |
Out-GridView -Title 'Choose a file' -PassThru)
$csvfile = $filenames[0].FullName
Import-Csv $csvfile | More
The -Path specifies a directory that contains all the locations where your CSV file might be delivered. The sort just puts the most recently written files at the top of the grid, which should make selection easier. The @() wrapper merely makes sure the result stored in $filenames is an array.
You would do something else with the results of Import-Csv.
Steven's response certainly satisfies your original question, but an alternative would be to let PowerShell do the work. If you know the drive and you know the name of the file this week, you can pass the name to your script and let it search the drive, filtering on the specific CSV file you need. Make it recursive and open the only file that matches. Sorry, I didn't have time yesterday to include code. Here's a function that returns the full file path when provided with a top-level search path and a filename with possible wildcards.
function gfp { $result=gci $args[0] -recurse -include $args[1]; return ($result.DirectoryName + "\" + $result.Name) }
Example: gfp "d:\rootfolder" "thisweeksfilename.csv"
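A minimal sketch of wiring that into the weekly script (the search root, script parameter, and default file name below are placeholders, not anything from the original post):

param([string]$FileName = "thisweeksfilename.csv")

# Same helper as above: returns the full path of whatever matches $args[1] under $args[0]
function gfp { $result = gci $args[0] -recurse -include $args[1]; return ($result.DirectoryName + "\" + $result.Name) }

# Search the shared drive root (placeholder) for this week's file, then import it
$csvPath = gfp "S:\FinanceShare" $FileName
Import-Csv $csvPath | Format-Table

Saved as, say, Import-Weekly.ps1 (a hypothetical name), it would be called like: .\Import-Weekly.ps1 -FileName "thisweeksfilename.csv"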

Need assistance with PowerShell Script to clean up computers in AD that meet criteria

Hello & thanks in advance for the help!
Looking to delete computers (Workstations OU) in AD if they meet certain criteria.
I need to make sure they have the "LOCATION" part of the Canonical name in common before proceeding to delete. If they are not at my location, that could be reason to investigate, and I do not want to delete them. This is an example of one PC (caps are fields I changed):
ORGANIZATION.COM/Workstations/BUSINESS UNIT/Desktops/LOCATION/COMPUTER NAME
I currently have the following script, which will print them to a .csv, which is helpful, but to take it one step further, it would be nice to print this to the screen, review it quickly, and then proceed with the delete. Any tips??
Get-Content C:\Temp\Powershell\hosts.txt | ForEach-Object {
    Get-ADComputer $_ -Properties Name,CanonicalName | Select-Object Name,CanonicalName
} -ErrorAction Ignore | Export-Csv C:\Temp\Powershell\Output.csv
Or even a second line of code I could use with the Output.csv; I'm not sure where to go from here...
Again, Thanks!
Added -Recursive and it seems to be working as expected.
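For the review-then-delete step, here is a minimal sketch (the location string and file path are placeholders) that shows the matching computers on screen with Out-GridView and removes only the rows you select:

$location = 'LOCATION'   # the part of the CanonicalName that identifies your site

Get-Content C:\Temp\Powershell\hosts.txt | ForEach-Object {
    Get-ADComputer $_ -Properties Name,CanonicalName
} |
    # Keep only computers whose CanonicalName contains the expected location
    Where-Object { $_.CanonicalName -like "*/$location/*" } |
    # Review on screen; only the rows you highlight and OK are passed through
    Out-GridView -Title 'Computers to delete' -PassThru |
    Remove-ADComputer -Confirm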

How to find a file location by file name, and file name only, using Get-CimInstance in PowerShell?

I'm trying to write a script that will retrieve a specific file's properties across multiple computers. I was using Get-ChildItem to do this until I realized that it only retrieves locally. I've read that Get-CimInstance can be used to do this for remote machines; however, all the examples I've seen use full paths to find the files. My script assumes the location could be anywhere on the C drive, so it only looks for the location based on the file's name. So far I've tried several variations of code using Get-CimInstance, but they all either produce nothing or have the wrong query structure.
Here's what I have right now, and it's what I believe is the closest to being correct, but I'm not sure:
Get-CimInstance -ComputerName $PC -ClassName CIM_DataFile | select Name | Where-Object { $_.Name -like "install.properties"}
If anyone can point me in the right direction, it would be greatly appreciated. Thank you.
Here's a nice link: https://community.idera.com/database-tools/powershell/powertips/b/tips/posts/accessing-individual-files-and-folders-remotely-via-wmi. Note that Name is the full path, with double backslashes, and the filter language is like SQL. Can you not use Invoke-Command instead?
Get-CimInstance CIM_DataFile -Filter 'name = "c:\\users\\admin\\foo\\file.json"'
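Since the question is about matching on the file name alone, a hedged sketch of a server-side filter on the CIM_DataFile name parts (drive, base name, and extension taken from the question) might look like this:

# Filter on Drive/FileName/Extension so the match happens inside the WMI query itself
Get-CimInstance -ComputerName $PC -ClassName CIM_DataFile `
    -Filter "Drive = 'c:' AND FileName = 'install' AND Extension = 'properties'" |
    Select-Object -ExpandProperty Name

Be aware that enumerating CIM_DataFile across a whole drive is slow even with a filter, so expect the query to take a while per machine.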

PowerShell: Compare CSV to AD

I'm fairly new to PowerShell and I'm posting this on many forums, but I've had success with programming assistance from here before, and although this isn't strictly programming, I was hoping someone might know the answer.
My organization had about 5,300 users we needed to disable for a client. Someone decided the best use of our time was to have people go through AD and disable them one at a time. As soon as I got wind of this I put a stop to it and used PowerShell to take the CSV list we already had and ran a cmdlet to disable all of the users in the CSV list.
This appeared to work, but I wanted to run a comparison. I want to compare the users from the CSV file to the users in AD and confirm that they are all disabled, without having to check all 5,300 individually. We checked about 60 random ones to verify my run worked, but I want to make sure none slipped through the cracks.
I've tried a couple of scripts and some variations of cmdlets. None of the scripts I tried even worked; I just got spammed with errors. When I try to run a search of AD using either Get-Content or Import-Csv against the CSV file, the export gives me about 7,600 disabled users (if I search by disabled). There were only 5,300 users in total, so it must be giving me all of the disabled users in AD. Other cmdlets I've run appear to do the same thing: they export an entire AD list instead of comparing against my CSV file.
Any assistance anyone can provide would be helpful.
Without knowing the exact structure of your CSV, I'm going to assume it is as such:
"CN=","OU=","DC="
"JSmith","Accounting","Foo.com"
"BAnderson","HR","Foo.com"
"JAustin","IT","Foo.com"
That said, if your first field actually has CN= included (i.e. "CN=JSmith","OU=Accounting","Foo.com") you will want to trim that with .TrimStart("CN=").
$ToRemove = Import-CSV UserList.csv
$UserList=@()
ForEach($User in $ToRemove){
$Temp = ""|Select "User","Disabled"
$Temp.User = $User.'CN='
If((Get-aduser $Temp.User -Prop Enabled).Enabled){$Temp.Disabled='False'}else{$Temp.Disabled='True'}
$UserList+=$Temp}
$UserList|?{$_.Disabled -eq 'False'}
That loads the CSV into a variable, runs each listing through a loop that checks the 'CN=' property, creates a custom object for each user containing just their name and whether they are disabled, and then adds that object to an array for ease of use later. In the end you are left with $UserList, which lists everybody in the original CSV and whether they are disabled. You can output it to a file, filter it for just those that are still enabled, or whatever you want. As noted before, if your CSV actually has CN=JSmith on each line, you will want to update line 5 to look as such:
$Temp.User = $User.'CN='.TrimStart("CN=")
If you don't have any headers in the CSV file you may want to inject them. Just put a line at the top that looks like:
CN=,OU=,DC=
Or, if you have varying OU depths you may be better off doing a GC and then running each line through a split, taking the first part, trimming the CN= off the beginning, and checking to see if they are disabled like:
GC SomeFile.CSV|%{$_.split(",")[0].trimstart("CN=")|%{If((get-aduser $_ -prop enabled).enabled){"$_ is Enabled"}else{"$_ is Disabled"}}}
Assuming your CSV has a column called DN, you can run the following, which will return all users from your spreadsheet that are still enabled:
Import-Csv YourUsersCSV.csv |
    Get-ADUser -Filter { DistinguishedName -eq $_.DN } |
    Where-Object { $_.Enabled -eq $true } |
    Select-Object -Property DistinguishedName, SamAccountName, Enabled

Powershell: Show what groups are added to a set of folders?

I spent some time searching through similar questions on here to see if I could find some answers, but I'm so clueless about AD that I'm not even sure how to tell if I'd found what I was looking for...
I have a number of folders in one place. All these folders are similarly named:
Reports_January_2011
Reports_March_2012
Reports_March_2012
All of these folders have a pair of identically named subfolders:
Export
Raw
I need to see all groups that have any permissions configured in these folders. Basically I have a Reports folder for every month for the past 5 years, each of those with the two subfolders. I need to make sure they all have the right groups added to them.
I started trying to figure out the regex to pick out only the right reports folders, but I'm totally lost on where to start for the "Get groups" part of the script. My experience with PS is limited to batch renaming, moving, etc. Simple one line stuff.
There is a nice PowerShell module (File System Security PowerShell Module 1.3) that could make your life easier. With that module in place, you can use the Get-Ace cmdlet to list permissions for files using a command like the one below:
Get-Item F:\backup | Get-Ace | Where-Object { $_.ID -like "*users*" }
Have a look at it.
You can try something like this:
dir c:\ | ? { $_.PSIsContainer } |
    Get-Acl | fl -Property @{N="Path"; E={ Convert-Path $_.PSPath }}, `
        @{N="AccessList"; E={ $_.AccessToString -split '\n' | ? { $_.StartsWith("MyDomain") } }}
to get a list with the path and access list for each folder. If you remove the last pipe (the StartsWith filter), local users are listed as well.
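For the folder layout in the question, a minimal sketch along those lines (the root path and the Reports_<Month>_<Year> name pattern are assumptions) that uses the built-in Get-Acl to show which identities have rights on each Export and Raw subfolder:

$root = 'D:\Reports'   # placeholder root that holds the Reports_<Month>_<Year> folders

Get-ChildItem $root -Directory |
    # Keep only the monthly Reports folders, e.g. Reports_January_2011
    Where-Object { $_.Name -match '^Reports_[A-Za-z]+_\d{4}$' } |
    ForEach-Object {
        foreach ($sub in 'Export','Raw') {
            $path = Join-Path $_.FullName $sub
            if (Test-Path $path) {
                # One row per access rule: which identity has what rights on the subfolder
                foreach ($ace in (Get-Acl $path).Access) {
                    [pscustomobject]@{
                        Folder   = $path
                        Identity = $ace.IdentityReference
                        Rights   = $ace.FileSystemRights
                    }
                }
            }
        }
    } | Format-Table -AutoSize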