PowerShell way to reschedule tasks inside a Group Policy Object

I can't find any PowerShell cmdlet that allows me to edit the date/time of a scheduled task inside a GPO. I'm asking for confirmation that such cmdlets don't exist, or (I hope) that someone knows an automated way to achieve this. Many thanks.

I finally found a way to amend a scheduled task inside a GPO using PowerShell. The idea is to open the XML file in which the GPO stores its scheduled tasks under the SYSVOL folder, edit it with the help of the XML LINQ library (selecting elements by local name, so the namespace doesn't get in the way), and save the XML back to its folder:
Add-Type -AssemblyName 'System.Xml.Linq'
#vars
$domain = 'domainName'
$ADcreds = Get-Credential -Message "Admin credentials for $domain"
$gpo = Get-GPO -Name 'GPO-Name' -Domain $domain
New-PSDrive -Name "S" -Root "\\$domain\SYSVOL\$domain\Policies\{$($gpo.Id)}\Machine\Preferences\ScheduledTasks" -PSProvider "FileSystem" -Credential $ADcreds
$inputFilename = "$((Get-PSDrive 'S').Root)\ScheduledTasks.xml"
$outputFilename = "$((Get-PSDrive 'S').Root)\ScheduledTasksNew.xml"
$reader = [System.IO.StreamReader]::new($inputFilename)
# skip the first line (the XML declaration) before handing the stream to XDocument
$reader.ReadLine()
$xDoc = [System.Xml.Linq.XDocument]::Load($reader)
$reader.Close() # release the file so it can be removed later
$today = [DateTime]::Now
$tasksV2 = $xDoc.Descendants().Where({$_.Name.LocalName -eq "TaskV2"})
foreach($taskV2 in $tasksV2.Where({$_.Attribute('name').Value -ne 'wsus-extraMaintenance'}))
{
    # select the StartBoundary element(s) of this task only, not of the whole document
    $todayStr = $today.ToString("yyyy-MM-ddTHH:mm:ss")
    foreach($startBoundary in $taskV2.Descendants().Where({$_.Name.LocalName -eq "StartBoundary"}))
    {
        $startBoundary.SetValue($todayStr)
    }
}
$xDoc.Save($outputFilename)
Get-ChildItem -Path 'S:\ScheduledTasks.xml' | Remove-Item -Confirm:$false
Get-ChildItem -Path 'S:\ScheduledTasksNew.xml' | Rename-Item -NewName 'ScheduledTasks.xml' -Confirm:$false
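To sanity-check the edit before clients pick it up, you could re-read the saved file and print each task's new start time. A small verification sketch (my addition; it reuses the same local-name matching as above):
# Re-load the saved XML and list each task with its new StartBoundary
$check = [System.Xml.Linq.XDocument]::Load("$((Get-PSDrive 'S').Root)\ScheduledTasks.xml")
foreach ($task in $check.Descendants().Where({$_.Name.LocalName -eq 'TaskV2'})) {
    $sb = $task.Descendants().Where({$_.Name.LocalName -eq 'StartBoundary'}) | Select-Object -First 1
    '{0} -> {1}' -f $task.Attribute('name').Value, $sb.Value
}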

Related

Powershell Script to Move Files to new server by last access date

I'm new to PowerShell. I have 80 servers that I need to connect to and run a script on remotely, to find files in one share recursively by last access date and move them to another \\server\share for archiving purposes. I also need the file creation, last accessed, etc. timestamps to be preserved.
I would welcome any help, please.
Thank you.
You need to test this thoroughly before actually using it on all 80 servers!
What you could do if you want to use PowerShell on this is to use Invoke-Command on the servers adding admin credentials so the script can both access the files to move as well as the destination Archive folder.
I would suggest using ROBOCOPY to do the heavy lifting:
$servers = 'Server1', 'Server2', 'Server3' # etcetera
$cred = Get-Credential -Message "Please supply admin credentials for archiving"
$scriptBlock = {
$SourcePath = 'D:\StuffToArchive' # this is the LOCAL path on the server
$TargetPath = '\\NewServer\ArchiveShare' # this is the REMOTE path to where the files should be moved
$LogFile = 'D:\ArchivedFiles.txt' # write a text file with the full names of all archived files
$DaysAgo = 130
# from a cmd box, type 'robocopy /?' to see all possible switches you might want to use
# /MINAGE:days specifies the LastWriteTime
# /MINLAD:days specifies the LastAccessDate
robocopy $SourcePath $TargetPath /MOVE /MINLAD:$DaysAgo /COPYALL /E /FP /NP /XJ /XA:H /R:5 /W:5 /LOG+:$logFile
}
Invoke-Command -ComputerName $servers -ScriptBlock $scriptBlock -Credential $cred
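Given the warning above about testing thoroughly, one low-risk option is robocopy's /L (list-only) switch: run the exact same command with /L added and it only logs what it would move, without touching anything. A hedged sketch of that dry run:
# Same command with /L added: logs what WOULD be moved, touches nothing
robocopy $SourcePath $TargetPath /MOVE /MINLAD:$DaysAgo /COPYALL /E /FP /NP /XJ /XA:H /R:5 /W:5 /L /LOG+:$LogFile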
If you want to do all using just PowerShell, try something like this:
$servers = 'Server1', 'Server2', 'Server3' # etcetera
$cred = Get-Credential -Message "Please supply admin credentials for archiving"
$scriptBlock = {
$SourcePath = 'D:\StuffToArchive' # this is the LOCAL path on the server
$TargetPath = '\\NewServer\ArchiveShare' # this is the REMOTE path to where the files should be moved
$LogFile = 'D:\ArchivedFiles.txt' # write a text file with the full names of all archived files
$refDate = (Get-Date).AddDays(-130).Date # the reference date set to midnight
# set the ErrorActionPreference to Stop, so exceptions are caught in the catch block
$OldErrorAction = $ErrorActionPreference
$ErrorActionPreference = 'Stop'
# loop through the servers LOCAL path to find old files and move them to the remote archive
Get-ChildItem -Path $SourcePath -File -Recurse |
    Where-Object { $_.LastAccessTime -le $refDate } |
    ForEach-Object {
        try {
            # rebuild the source subfolder path under the archive share
            # (files directly in $SourcePath map to $TargetPath itself)
            $subPath = $_.DirectoryName.Substring($SourcePath.Length).TrimStart('\')
            $target = if ($subPath) { Join-Path -Path $TargetPath -ChildPath $subPath } else { $TargetPath }
            # create the folder in the archive if it does not already exist
            $null = New-Item -Path $target -ItemType Directory -Force
            $_ | Move-Item -Destination $target -Force
            Add-Content -Path $LogFile -Value "File '$($_.FullName)' moved to '$target'"
        }
        catch {
            Add-Content -Path $LogFile -Value $_.Exception.Message
        }
    }
$ErrorActionPreference = $OldErrorAction
}
Invoke-Command -ComputerName $servers -ScriptBlock $scriptBlock -Credential $cred
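One caveat on this pure-PowerShell version: the question asks for creation and last-access timestamps to be preserved, and a move across volumes is really a copy plus delete, so it's worth verifying the stamps survive. If they don't, a minimal sketch of a workaround (my addition, inside the ForEach-Object block) is to capture the stamps before the move and reapply them afterwards:
# Capture timestamps, move, then reapply them to the moved file
$stamps = [ordered]@{
    CreationTime   = $_.CreationTime
    LastWriteTime  = $_.LastWriteTime
    LastAccessTime = $_.LastAccessTime
}
$_ | Move-Item -Destination $target -Force
$moved = Get-Item -LiteralPath (Join-Path -Path $target -ChildPath $_.Name)
foreach ($name in $stamps.Keys) { $moved.$name = $stamps[$name] }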

Compact and Repair all Access databases in a directory using Powershell

I am looking for a way to compact and repair all the Access databases in a certain directory using a PowerShell script.
The VBA code below works, but I need a PowerShell equivalent:
Find all Access databases, and Compact and Repair
I am new to PowerShell, so I would be grateful for any assistance.
Thanks
You may try this.
Add-Type -AssemblyName Microsoft.Office.Interop.Access
$rootfolder = 'c:\some\folder'
$createlog = $true # change to false if no log desired
$access = New-Object -ComObject access.application
$access.Visible = $false
$access.AutomationSecurity = 1 # msoAutomationSecurityLow: suppress security prompts while opening
Get-ChildItem -Path $rootfolder -File -Filter *.accdb -Recurse -PipelineVariable file | ForEach-Object {
$newname = Join-Path $file.Directory ("{0}_compacted{1}" -f $file.BaseName,$file.Extension)
$message = @"
Current file: {0}
Output file: {1}
"@ -f $file.FullName,$newname
Write-Host $message -ForegroundColor Cyan
$access.CompactRepair($file.fullname,$newname,$createlog)
}
$access.Quit()
This will output each compacted database under the name of the original file with _compacted appended (before the extension). I have tested this in every way except actually compacting databases.
Edit
Regarding your comment, a few minor changes should achieve the desired result. Keep in mind that this will put all new files in the same folder. This may not be an issue for your case but if there are duplicate file names you will have problems.
$rootfolder = 'c:\some\folder'
$destination = 'c:\some\other\folder'
$todaysdate = get-date -format '_dd_MM_yyyy'
Add-Type -AssemblyName Microsoft.Office.Interop.Access
$createlog = $true # change to false if no log desired
$access = New-Object -ComObject access.application
$access.Visible = $false
$access.AutomationSecurity = 1
Get-ChildItem -Path $rootfolder -File -Filter *.accdb -Recurse -PipelineVariable file | ForEach-Object {
$newname = Join-Path $destination ("{0}$todaysdate{1}" -f $file.BaseName,$file.Extension)
$message = @"
Current file: {0}
Output file: {1}
"@ -f $file.FullName,$newname
Write-Host $message -ForegroundColor Cyan
$access.CompactRepair($file.fullname,$newname,$createlog)
}
$access.Quit()
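If the MSACCESS.EXE process lingers after Quit(), releasing the COM reference usually lets it exit. This cleanup step is my addition, not part of the tested answer:
# Release the COM object so the Access process can actually exit
$null = [System.Runtime.InteropServices.Marshal]::ReleaseComObject($access)
[System.GC]::Collect()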

PowerShell Compress and Read-Only using Send-To

I am pretty new to PowerShell scripting, so if what I am asking is not possible, by all means tell me.
I would like to create a PowerShell script that works as a Send To target.
The purpose of the script is to change the files to read-only and then compress them. I'd like to be able to select multiple files in File Explorer, then right-click, Send to (Script).
Is this something that is possible? Thanks!
Update 1
Alright, I have it where it will select files using File Explorer and then pass them into the script. The read-only part is functioning correctly; I just need to sort out the compression.
Add-Type -AssemblyName System.Windows.Forms
$FileBrowser = New-Object System.Windows.Forms.OpenFileDialog -Property @{
    Multiselect = $true # Multiple files can be chosen
    Filter = 'Images (*.jpg, *.png)|*.jpg;*.png' # Specified file types
}
[void]$FileBrowser.ShowDialog()
$path = $FileBrowser.FileNames;
If ($FileBrowser.FileNames -like "*\*") {
    # Do something before work on individual files commences
    $FileBrowser.FileNames # Lists selected files (optional)
    foreach ($file in Get-ChildItem $path) {
        # set each selected file read-only (no need for a second nested loop)
        Set-ItemProperty -Path $file.FullName -Name IsReadOnly -Value $true
    }
    # Do something when work on individual files is complete
}
else {
    Write-Host "Cancelled by user"
}
Update 2
Alright, I got it all working; it does what I need it to. Here it is if anyone has any interest in it.
Add-Type -AssemblyName System.Windows.Forms
$FileBrowser = New-Object System.Windows.Forms.OpenFileDialog -Property @{
    Multiselect = $true # Multiple files can be chosen
}
[void]$FileBrowser.ShowDialog()
$path = $FileBrowser.FileNames;
If ($FileBrowser.FileNames -like "*\*") {
    # Do something before work on individual files commences
    $FileBrowser.FileNames # Lists selected files (optional)
    foreach ($file in Get-ChildItem $path) {
        Set-ItemProperty -Path $file.FullName -Name IsReadOnly -Value $true
        compact /C $file.FullName # NTFS-compress the file in place
    }
    # Do something when work on individual files is complete
}
else {
    Write-Host "Cancelled by user"
}
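For completeness, the original Send To idea is also workable: Explorer passes every selected path to the target as a command-line argument, so a script launched from a shortcut in the shell:sendto folder sees them in $args. A hedged sketch (the script name and location are mine):
# SendTo-Compress.ps1 -- place a shortcut to this in the shell:sendto folder, e.g.:
#   powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\SendTo-Compress.ps1"
# Explorer appends every selected file's full path as an argument.
foreach ($path in $args) {
    Set-ItemProperty -LiteralPath $path -Name IsReadOnly -Value $true
    compact /C $path # NTFS-compress the file in place
}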

Powershell Script to Remotely Install Software (Microsoft Office)

I am trying to figure out how to write a PowerShell script that will automatically install Office 2010 on multiple PCs. I am struggling with the portion where we create the text file that we loop through, listing the computer name and the user's login. I have researched this all over the web, but for some reason I am unable to get it to work.
Function Get-FileName {
    [CmdletBinding()]
    Param(
        [String]$Filter = "|*.*",
        [String]$InitialDirectory = "C:\")
    [void][System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
    $OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
    $OpenFileDialog.InitialDirectory = $InitialDirectory
    $OpenFileDialog.Filter = $Filter
    [void]$OpenFileDialog.ShowDialog()
    $OpenFileDialog.FileName
}
ForEach ($computer in (GC (Get-FileName -InitialDirectory $env:USERPROFILE\Desktop -Filter "Text files (*.txt)|*.txt|All files (*.*)|*.*"))) {
$filepath = Test-Path -Path "\\$computer\C:\Program Files (x86)\Microsoft Office"
If ($filepath -eq $false)
{
Get-Service remoteregistry -ComputerName $computer | Start-Service
Copy-Item -Path "\\server\Orig\Install\Office2010" -Destination "\\$computer\c$\windows\temp\" -Container -Recurse -Force
# $InstallString = '"C:\windows\temp\Office 2010\setup.exe"'
# ([WMICLASS]"\\$computer\ROOT\CIMV2:Win32_Process").Create($InstallString)
# "$computer" + "-" + "(Get-Date)" | Out-File -FilePath "\\server\Orig\Install\RemoteInstallfile.txt" -Append
# }
# Else
# {
# "$computer" + "_Already_Had_Software_" + "(Get-Date)" | Out-File -FilePath "\\server\Orig\Install\RemoteInstallfile.txt" -Append
}
}
ComputerList.txt
IT-Tech | David
IT-Tech would be the computer name and David would be the user. Then I would have a list like this line by line in the txt file.
So I was thinking I could do something like this, listing the computer name and then the user name to install for. This part confuses me, though; I'm just trying to learn and see what this PowerShell stuff is all about!
Any help with this would be greatly appreciated!
A line of your file, as you've said, will contain something like "IT-Tech | David", so when you iterate through that file, the whole string becomes the value of $computer. You then attempt to use it as the computer name, which will of course fail, because you first need to split it apart.
I will also point out that it is extremely bad form to abbreviate and use aliases in scripts; you should only use them at the console. Also, for readability, it helps to split complex expressions into separate steps.
$file = Get-FileName -InitialDirectory $env:USERPROFILE\Desktop -Filter "Text files (*.txt)|*.txt|All files (*.*)|*.*"
ForEach ($item in (Get-Content $file)) {
$sitem = $item.Split("|")
$computer = $sitem[0].Trim()
$user = $sitem[1].Trim()
$filepath = Test-Path -Path "\\$computer\c$\Program Files (x86)\Microsoft Office" # use the c$ admin share to test the remote path
If ($filepath -eq $false)
{
Get-Service remoteregistry -ComputerName $computer | Start-Service
Copy-Item -Path "\\server\Orig\Install\Office2010" -Destination "\\$computer\c$\windows\temp\" -Container -Recurse -Force
<#
$InstallString = '"C:\windows\temp\Office 2010\setup.exe"'
([WMICLASS]"\\$computer\ROOT\CIMV2:Win32_Process").Create($InstallString)
"$computer" + "-" + "(Get-Date)" | Out-File -FilePath "\\server\Orig\Install\RemoteInstallfile.txt" -Append
}
Else
{
"$computer" + "_Already_Had_Software_" + "(Get-Date)" | Out-File -FilePath "\\server\Orig\Install\RemoteInstallfile.txt" -Append
#>
}
}
Note that this will NOT copy or install anything if Microsoft Office is already present on the target machine; not sure if that is the intended behaviour or not.
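As a side note, since the list is really pipe-delimited data, Import-Csv can do the splitting for you. A hedged alternative sketch (the header names are mine):
# ComputerList.txt lines look like: IT-Tech | David
# -Header supplies column names because the file has no header row
Import-Csv -Path .\ComputerList.txt -Delimiter '|' -Header Computer, User |
    ForEach-Object {
        # trim because the file has spaces around the pipe
        '{0} -> {1}' -f $_.Computer.Trim(), $_.User.Trim()
    }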

PowerShell SQL Job Step Move-Item not working on 1 server

This identical code has been used on 3 servers, and only on one of them does it silently fail to move the items (it still REMOVES them, but they never appear in the share).
Azure-MapShare.ps1
param (
[string]$DriveLetter,
[string]$StorageLocation,
[string]$StorageKey,
[string]$StorageUser
)
if (!(Test-Path "${DriveLetter}:"))
{
cmd.exe /c "net use ${DriveLetter}: ${StorageLocation} /u:${StorageUser} ""${StorageKey}"""
}
Get-Exclusion-Days.ps1
param (
[datetime]$startDate,
[int]$daysBack
)
$date = $startDate
$endDate = (Get-Date).AddDays(-$daysBack)
$allDays =
do {
"*"+$date.ToString("yyyyMMdd")+"*"
$date = $date.AddDays(-1)
} until ($date -lt $endDate)
return $allDays
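(For illustration, this helper emits one wildcard pattern per day, from the start date back to the cutoff; a hypothetical run on 2023-06-10 would look like this:)
PS> .\Get-Exclusion-Days.ps1 -startDate (Get-Date) -daysBack 2
*20230610*
*20230609*
*20230608*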
Migrate-Files.ps1
param(
[string]$Source,
[string]$Filter,
[string]$Destination,
[switch]$Remove=$False
)
#Test if source path exist
if((Test-Path -Path $Source.trim()) -ne $True) {
throw 'Source did not exist'
}
#Test if destination path exist
if ((Test-Path -Path $Destination.trim()) -ne $True) {
throw 'Destination did not exist'
}
#Test if no files in source
if((Get-ChildItem -Path $Source).Length -eq 0) {
throw 'No files at source'
}
if($Remove)
{
#Move-Item removes the source files
Move-Item -Path $Source -Filter $Filter -Destination $Destination -Force
} else {
#Copy-Item keeps a local copy
Copy-Item -Path $Source -Filter $Filter -Destination $Destination -Force
}
return $True
The job step is type "PowerShell" on all 3 servers and contains this identical code:
#Create mapping if missing
D:\Scripts\Azure-MapShare.ps1 -DriveLetter 'M' -StorageKey "[AzureStorageKey]" -StorageLocation "[AzureStorageAccountLocation]\backup" -StorageUser "[AzureStorageUser]"
#Copy files to Archive
D:\Scripts\Migrate-Files.ps1 -Source "D:\Databases\Backup\*.bak" -Destination "D:\Databases\BackupArchive"
#Get date range to exclude
$exclusion = D:\Scripts\Get-Exclusion-Days.ps1 -startDate (Get-Date) -daysBack 7
#Remove items that are not included in exclusion range
Remove-Item -Path "D:\Databases\BackupArchive\*.bak" -exclude $exclusion
#Move files to storage account. They will be destroyed
D:\Scripts\Migrate-Files.ps1 -Source "D:\Databases\Backup\*.bak" -Destination "M:\" -Remove
#Remove remote backups that are not from today's backup
Remove-Item -Path "M:\*.bak" -exclude $exclusion
If I run the job step using SQL then the files get removed but do not appear in the storage account. If I run this code block manually, they get moved.
When I start up PowerShell on the server, I get an error message: "Attempting to perform the InitializeDefaultDrives operation on the 'FileSystem' provider failed." However, this does not really impact the rest of the operations (copying the backup files to BackupArchive folder, for instance).
I should mention that Copy-Item also fails to copy across to the share, but succeeds in copying to the BackupArchive folder.
Not sure if this will help you, but you could try the New-PSDrive cmdlet instead of net use to map your shares:
param (
[string]$DriveLetter,
[string]$StorageLocation,
[string]$StorageKey,
[string]$StorageUser
)
# test the drive with its trailing colon; New-PSDrive itself takes the bare letter
if (!(Test-Path "${DriveLetter}:"))
{
$securedKey = $StorageKey | ConvertTo-SecureString -AsPlainText -Force
$credentials = New-Object System.Management.Automation.PSCredential ($StorageUser, $securedKey)
New-PSDrive -Name $DriveLetter -PSProvider FileSystem -Root $StorageLocation -Credential $credentials -Persist
}
Apparently I tricked myself on this one. During testing I must have run the net use command in an elevated command prompt. That hid the mapped drive from non-elevated OS features such as Windows Explorer, and from attempts to view its existence in non-elevated command prompt sessions. I suppose it was also automatically reconnecting after reboots, because rebooting did not fix it.
The solution was as easy as running the net use m: /delete command from an elevated command prompt.
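To see this split-token behaviour for yourself, compare an elevated and a non-elevated session; a quick check along these lines (my sketch, not from the original post):
# Run in both an elevated and a non-elevated PowerShell session:
net use         # the M: mapping only appears under the token that created it
Test-Path 'M:\' # returns $false where the mapping is not visible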