I am using PowerShell to extract all users from an OU who have not signed into their account in 365 days.
Import-Module ActiveDirectory
Get-ADUser -SearchBase 'ou=staff,ou=brummitt,dc=DUNELAND,dc=LOCAL' -Filter 'enabled -eq $true' -Properties samaccountname,lastlogondate |
Where-Object {$_.lastlogondate -lt (Get-Date).AddDays(-365)} |
Select-Object -ExpandProperty samaccountname >> 'C:\stale\brummitt.txt'
To keep things organized, I created a folder on my server's C: drive called stale for the output, and a separate folder called scripts where the PowerShell scripts are stored.
When I run the script with the output path set to C:\stale\brummitt.txt, it outputs all users in that OU. When the output path is C:\brummitt.txt, it returns the correct users who have not signed in for over a year. Why would the results change based on the save location, and how can this be fixed?
Added:
I am running the PowerShell script from within the scripts folder.
Did you try using Tee-Object as part of the pipeline? That will give you the opportunity to check the stream going to the file on the console as well.
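A rough sketch of that, reusing your SearchBase and filter (and assuming C:\stale already exists on the server):

Import-Module ActiveDirectory

# Tee-Object writes the matching accounts to the file and also passes them along the
# pipeline, so they appear on the console at the same time. Note that -FilePath
# overwrites the target file on each run, whereas the >> redirection appends to
# whatever is already in the file.
Get-ADUser -SearchBase 'ou=staff,ou=brummitt,dc=DUNELAND,dc=LOCAL' -Filter 'enabled -eq $true' -Properties samaccountname,lastlogondate |
    Where-Object { $_.lastlogondate -lt (Get-Date).AddDays(-365) } |
    Select-Object -ExpandProperty samaccountname |
    Tee-Object -FilePath 'C:\stale\brummitt.txt'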
Related
I'm trying to find a PowerShell script that updates the title attribute in AD for a large number of users. I was hoping to find a script that imports the changes from a CSV file and updates the attribute only for the users in the list. I found the below script, but apparently it only works for Azure AD, and I need it for the local AD. Perhaps someone more switched on than me can help me amend the below script.
#Import Active Directory module
Import-Module ActiveDirectory
#Import CSV file to set variable for the user's logon name + update data + delimiter
$Users = Import-Csv -Delimiter ";" -Path "c:\psscripts\users.csv"
#Loop through the AD SAM accounts listed in the CSV with the information you wish to update
Foreach($user in $users){
    #Filter AD SAM accounts based on the samaccountname column in the csv file
    Get-ADUser -Filter "SamAccountName -eq '$($user.samaccountname)'" | Set-ADUser `
        -Title $($User.Title)
}
That code is fine, beyond some variable consistency and a lack of checks, and it does target local AD, though using that delimiter would be unusual if you're working with a standard CSV file. If you have the data in an Excel document with the column headers "SamAccountName" (the users' logon names) and "Title", and then save the file as a CSV, the amended code below should work for you. I added logic to test for a blank Title, as you can't assign a blank value to an attribute.
#Import Active Directory module
Import-Module ActiveDirectory
#Import CSV file with AD SAM account and Title data from users.csv in the C:\psscripts directory of your computer
$Users = Import-Csv -Path "c:\psscripts\users.csv" | Where {$_}
#Filter AD SAM accounts listed in the CSV and update the Title for the listed accounts
Foreach ($user in $Users) {
    #Check whether $user.Title has a value in case of a null value
    If ($user.Title) {
        #Filter AD accounts based on the SamAccountName column in the csv file and update the account's Title field
        Get-ADUser -Filter "SamAccountName -eq '$($user.SamAccountName)'" | Set-ADUser -Title $user.Title
    }
    else {
        #Filter AD accounts based on the SamAccountName column in the csv file and clear the account's Title field
        Get-ADUser -Filter "SamAccountName -eq '$($user.SamAccountName)'" | Set-ADUser -Clear Title
    }
}
I'd recommend testing it on a test user account or two before going whole hog on your actual list. It goes without saying that you need to be logged into a PowerShell session as a domain account with adequate privileges to change the accounts when running the script. VS Code is a good environment to work in, and you can launch the program as the elevated account (Shift + right-click the program icon, choose Run as a different user) within your normal account environment, to sandbox the privileges to just what you're working on in VS Code.
If you are trying to work in Azure AD, you'd need to add these lines and approve your account access request within Azure, depending on your tenant setup, to actually run the script successfully. Depending on the tenant configuration, this may be required in a hybrid AD/Azure AD environment regardless of your intent to apply to local AD.
Connect-MgGraph -Scopes "User.ReadWrite.All", "Directory.ReadWrite.All"
Select-MgProfile -Name "beta"
Best regards, no warranties given or implied, please accept as answer if this works for you.
I am running this script in PowerShell with a passwords.txt file containing the default password that is being used for accounts.
But when I run it, it outputs the quality report but with no numbers.
Also, once the script runs, it asks for a filter, to which I input: samAccountName -like '*,OU=Mgmt,DC=balrok,DC=edu'
$Passwords = Get-Content -Path C:\Users\Administrator\Desktop\passwords.txt
Get-ADUser | Test-PasswordQuality
Note: The Test-PasswordQuality cmdlet is part of the third-party DSInternals module.
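For reference, DSInternals' Test-PasswordQuality is normally fed replicated account data (which includes password hashes) rather than Get-ADUser output, and the weak-password list is passed in explicitly instead of being read into an unused variable. A minimal sketch of that pattern, where the server name "DC01" is a placeholder and the naming context is assumed from the domain in your filter:

# Requires the DSInternals module and rights to replicate directory secrets
Import-Module DSInternals

# Pull account data from a domain controller via replication, then test it
# against the list of known default/weak passwords in passwords.txt
Get-ADReplAccount -All -Server "DC01" -NamingContext "DC=balrok,DC=edu" |
    Test-PasswordQuality -WeakPasswordsFile "C:\Users\Administrator\Desktop\passwords.txt"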
I need a PowerShell script to move computer objects out of the Computers container in Active Directory according to the following conditions:
It will check the username written in the Description of the computer object (a username is written on each computer).
It moves the computer to the OU where that user is located.
I used the -SearchBase parameter while preparing the script, but without success;
$ComputerObject = Read-Host "CN=$ComputerObject,OU=Computers,DC=domain,DC=com"
Get-ADComputer -Filter -SearchBase "OU=Computers,DC=domain,DC=com"
Move-ADObject -Identity (I don't know what to write here.)
Is there anyone who can help?
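As a starting point, here is a rough sketch of that logic. It assumes the computer's Description holds the user's sAMAccountName, that the computers sit in the default CN=Computers container (adjust the path and DC components for your domain), and that the target is simply the OU containing the user object; keep -WhatIf until the output looks right.

Import-Module ActiveDirectory

# Get every computer in the container, including the Description attribute
$computers = Get-ADComputer -Filter * -SearchBase "CN=Computers,DC=domain,DC=com" -Properties Description

foreach ($computer in $computers) {
    # Skip computers with no username in the Description
    if ([string]::IsNullOrWhiteSpace($computer.Description)) { continue }

    # Look up the user whose sAMAccountName matches the Description
    $user = Get-ADUser -Filter "SamAccountName -eq '$($computer.Description)'"
    if ($user) {
        # The user's OU is the distinguished name minus the leading CN=<user> component
        $targetOu = ($user.DistinguishedName -split ',', 2)[1]

        # Move the computer object into that OU (remove -WhatIf once verified)
        Move-ADObject -Identity $computer.DistinguishedName -TargetPath $targetOu -WhatIf
    }
}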
I have a script that creates a CSV file that tells me which users haven't logged in in 90 days. It needs to include the manager for each user, but when pulling the manager from Active Directory, I'm getting the full DN for that manager rather than just the display name. Here's my script...
Import-Module ActiveDirectory
Get-ADUser -Filter * -SearchScope Subtree -SearchBase "ou=user departments ou,dc=acr,dc=org" -Properties DisplayName,manager,lastlogontimestamp |
? {(((Get-Date) - ([datetime]::FromFileTime($_.lastlogontimestamp))).TotalDays -gt 90)} |
select DisplayName,samaccountname,manager,Userprincipalname,@{Expression={([datetime]::FromFileTime($_.lastlogontimestamp))};Label="Last logon time stamp"} |
Export-Csv "c:\scripts\reston_sharepoint_users_120_days.csv" -NoTypeInformation -Delimiter ","
Any suggestions? The rest of the script works perfectly. Thanks.
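One common way to handle this, sketched below, is to resolve the manager DN with a second Get-ADUser call inside a calculated property. This is an illustration of that approach rather than your exact script; the OU path and output file are copied from your post.

Import-Module ActiveDirectory

# The calculated 'Manager' property looks up the manager object by its DN
# and returns its DisplayName instead of the raw distinguished name.
Get-ADUser -Filter * -SearchScope Subtree -SearchBase "ou=user departments ou,dc=acr,dc=org" -Properties DisplayName,Manager,lastlogontimestamp |
    Where-Object { ((Get-Date) - [datetime]::FromFileTime($_.lastlogontimestamp)).TotalDays -gt 90 } |
    Select-Object DisplayName, SamAccountName, UserPrincipalName,
        @{Label = 'Manager'; Expression = { if ($_.Manager) { (Get-ADUser $_.Manager -Properties DisplayName).DisplayName } }},
        @{Label = 'Last logon time stamp'; Expression = { [datetime]::FromFileTime($_.lastlogontimestamp) }} |
    Export-Csv "c:\scripts\reston_sharepoint_users_120_days.csv" -NoTypeInformation -Delimiter ","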
I have a script that creates a CSV file, which I then open to get the information I am looking for. Is there a way to add to the script so that the file opens immediately as well as saving in the specified location?
Get-ADComputer -Filter { Enabled -eq $true } -Properties Name -SearchBase "DC=REMOVED,DC=com" |
Select Name |
? {$_.Name -like "X*" } |
Export-Csv "C:\scripts\Computers.csv"
Add this quick snippet to the end of your code:
Invoke-Item "C:\scripts\Computers.csv"
If you have Excel configured to open .csv files by default, the report should open automatically.
EDIT: Note that this isn't available in PowerShell 2.0 - only 3.0+