Using a PowerShell script with different parameters - powershell

I have a script that deletes anything older than a set time. I want to replicate this for other delete jobs with different times and different folders.
I am new to PowerShell; this script was written with a lot of Google assistance.
$Minutes = [DateTime]::Now.AddMinutes(-5)
$Timestamp = Get-Date -Format "yyyy-MM-ddTHH-mm-ss"
$Log = "C:\test\logs\_" + $Timestamp + ".log"
Start-Transcript -Path $Log -Append -Force -NoClobber
try {
    function Write-Log($string)
    {
        $outStr = "" + $Timestamp + " " + $string
        Write-Output $outStr
    }
    Write-Log "------------ Start of Log ------------"
    #Write-Log ""
    # get all file objects to use in erasing
    $files = Get-ChildItem -Path 'c:\test\*' -Include *.* -Recurse |
        Where-Object { $_.LastWriteTime -lt $Minutes }
    # Remove the file and its folder.
    $files |
        ForEach-Object {
            Write-Log " Deleting File --> $_."; Remove-Item $_.FullName
        }
    # output statistics
    Write-Output "**********************"
    Write-Output "Number of old files deleted: $($files.Count)"
    Write-Log "------------- End of Log -------------"
}
catch {
    Write-Error -Message "Something bad happened!" -ErrorAction Stop
}
Stop-Transcript

Welcome to PowerShell, and good for you on the web-search approach. Being new to this, though, it is vital that you take some time to ramp up on the basics before diving in, to avoid as much undue confusion and frustration as possible.
You also need that grounding to understand what you actually need, and to avoid causing catastrophic issues to your system or your enterprise. Never run code you do not fully understand, and always list out your goals and address them one at a time to make sure you are getting the results you'd expect.
Spend time on YouTube, Microsoft Virtual Academy, Microsoft Learn, TechNet Virtual Labs, and MS Channel9, consuming all the videos you can; then hit the documentation/help files and the many no-cost eBooks all over the web.
As for ...
I want to replicate this for other delete jobs with different times
and different folders
… this is why functions and parameters exist.
Function Start-DeleteJob
{
    [CmdletBinding()]
    [Alias('sdj')]
    Param
    (
        $JobTime,
        $JobFolder
    )
    # Code begins here
}
So, spend time researching PowerShell functions, advanced functions, and parameters.
Get-Help -Name About_*Functions*
Get-Help -Name About_*Parameters*
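For example, a sketch of your delete script wrapped in such a function might look like this (the parameter names and defaults here are just illustrative; keep your transcript/logging logic however you prefer):
Function Start-DeleteJob
{
    [CmdletBinding()]
    [Alias('sdj')]
    Param
    (
        # Age threshold in minutes; files older than this are removed
        [int]$JobMinutes = 5,
        # Folder to clean up
        [string]$JobFolder = 'C:\test'
    )
    $cutoff = [DateTime]::Now.AddMinutes(-$JobMinutes)
    Get-ChildItem -Path (Join-Path $JobFolder '*') -Include *.* -Recurse |
        Where-Object { $_.LastWriteTime -lt $cutoff } |
        ForEach-Object {
            Write-Output "Deleting File --> $($_.FullName)"
            Remove-Item -Path $_.FullName
        }
}

# Different jobs, different times and folders:
Start-DeleteJob -JobMinutes 5 -JobFolder 'C:\test'
Start-DeleteJob -JobMinutes 60 -JobFolder 'D:\archive\temp'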

Related

PowerShell through Group Policy

I have two separate PowerShell (.ps1) files that I'd like to run one after the other when a user logs on to a PC. They're fairly straightforward tasks. The first copies a shortcut from a network location to each user's AppData folder.
Copy-Item -Path "\\Server\Share\*.lnk" -Destination "$env:APPDATA\Microsoft\Windows\Start Menu\Programs"
The second .ps1 file removes a load of bloatware from Windows 10, I won't put all the code in here as it's quite repetitive but it basically lists a load of apps and finally removes them.
$AppList = @(
    "*Microsoft.3dbuilder*"
    "*AdobeSystemsIncorporated.AdobePhotoshopExpress*"
    "*Microsoft.WindowsAlarms*"
    "*Microsoft.Asphalt8Airborne*"
)
foreach ($App in $AppList) {
    Get-AppxPackage -Name $App | Remove-AppxPackage -ErrorAction SilentlyContinue
}
If I place the two files into the same logon policy, the first script will run but the second one doesn't until the user logs off and back on again (I'd like them both to run at the same time).
I've tried placing them both in the same file and separating them with a semicolon (;); this didn't work, so I tried "and", again with no joy. I've also tried creating a master file (with the two .ps1 files in the same location) and running the following; again, this didn't work.
&"$PSScriptroot\Copy Devices and Printers Shortcut.ps1" &"$PSScriptroot\BloatwareRemoval.ps1"
I've also tried separating the above with a semicolon and with "and", with no joy.
Edit: I've resolved this with the following .ps1 file:
Get-ChildItem \\File\Location | ForEach-Object {
    & $_.FullName
}
As per the comments, you should save this script as a .ps1 file and call it however you want; it will perform both operations together. I have added error handling, but ideally you should also log the failures to a file so you can refer back to them in case anything goes wrong.
If ((Test-Path -Path "\\Server\Share\*.lnk") -and (Test-Path -Path "$env:APPDATA\Microsoft\Windows\Start Menu\Programs"))
{
    Copy-Item -Path "\\Server\Share\*.lnk" -Destination "$env:APPDATA\Microsoft\Windows\Start Menu\Programs"
}
else
{
    "Invalid path. Kindly validate."
}
@("*Microsoft.3dbuilder*","*AdobeSystemsIncorporated.AdobePhotoshopExpress*","*Microsoft.WindowsAlarms*","*Microsoft.Asphalt8Airborne*") | % {
    try
    {
        Get-AppxPackage -Name $_ | Remove-AppxPackage -ErrorAction Stop
    }
    catch
    {
        $_.Exception.Message
    }
}
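If you do want the failures captured somewhere you can refer back to later, a minimal variation is to append the caught message to a log file (the path below is just an example; use whatever location suits you):
$LogFile = "C:\Logs\AppxRemoval.log"   # example path, adjust as needed
@("*Microsoft.3dbuilder*","*AdobeSystemsIncorporated.AdobePhotoshopExpress*","*Microsoft.WindowsAlarms*","*Microsoft.Asphalt8Airborne*") | % {
    try
    {
        Get-AppxPackage -Name $_ | Remove-AppxPackage -ErrorAction Stop
    }
    catch
    {
        # Record the timestamp and the error message for later review
        "$(Get-Date -Format s)  $($_.Exception.Message)" | Add-Content -Path $LogFile
    }
}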
Hope it helps.

Print PDFs to specific printers based on filename

I would just like to preface this by saying I am brand new to Powershell and have been trying to learn by picking things up here and there. I'm currently trying to automate a process within my company using strictly powershell and Adobe reader.
Our company is currently manually printing individual sets of records and a separate cover page, binding them, and sending them off. An idea to automate this process was to fill a folder with a zipped set of PDFs for the day. This zip file would then be extracted and its contents moved to another folder, with the normal set of records named "WO-xxxxxx Set" and the cover page named "WO-xxxxxx Cover". All I would need to do is create a simple script that prints these out in order, so that "WO-000001 Cover" is on top of "WO-000001 Set", and then prints the next set in order.
The complication I've run into is that Start-Process -FilePath $File.Fullname -Verb Print only allows me to target a default printer. Our Covers will need to be printed on thicker paper, and as such I thought the best course of action would be to create two printers on the network with the required printer settings. If I could have the script swap between the two printers based on file name then it would solve my issue.
This script is sending the documents to the printer in order but not actually swapping the default printer. I'm sure this is something I've done wrong in my if/else block and would appreciate an expert's eye on this.
Function UnZipEverything($src, $dest)
{
    [System.Reflection.Assembly]::LoadWithPartialName("System.IO.Compression.FileSystem") | Out-Null
    $zps = Get-ChildItem $src -Filter *.zip
    foreach ($zp IN $zps)
    {
        $all = $src + $zp
        [System.IO.Compression.ZipFile]::ExtractToDirectory($all, $dest)
    }
}
UnZipEverything -src "C:\Users\admin\Desktop\Zip Test\" -dest 'C:\Users\admin\Desktop\UnZip Test\'
Remove-Item "C:\Users\admin\Desktop\Zip Test\*.zip"
$files = Get-ChildItem "C:\Users\admin\Desktop\UnZip Test\*.*" -Recurse
ForEach ($file in $files){
    If ($files -eq '*Cover*') {
        (New-Object -ComObject WScript.Network).SetDefaultPrinter('Test')
        Start-Process -FilePath $File.FullName -Verb Print -PassThru | %{ sleep 10; $_ } | kill
        (New-Object -ComObject WScript.Network).SetDefaultPrinter('\\RFC-Print01\Collections Tray 6')
    }
    Else {
        Start-Process -FilePath $File.FullName -Verb Print -PassThru | %{ sleep 10; $_ } | kill
    }
}
Any help would be greatly appreciated.
If you use the verb PrintTo instead of Print, you can specify the printer:
Start-Process -FilePath $File.FullName -Verb PrintTo '\\RFC-Print01\Collections Tray 6' -PassThru
This would allow you to remove the SetDefaultPrinter calls from the script.
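Putting that together with the loop from the question, the printing section might look something like this (a sketch only; it also tests each file's name with -like, since comparing the whole $files collection with -eq will never match, and the printer names are simply the placeholders from the post):
ForEach ($file in $files){
    If ($file.Name -like '*Cover*') {
        # Covers go to the printer/queue set up for thicker paper
        Start-Process -FilePath $file.FullName -Verb PrintTo 'Test' -PassThru | %{ sleep 10; $_ } | kill
    }
    Else {
        Start-Process -FilePath $file.FullName -Verb PrintTo '\\RFC-Print01\Collections Tray 6' -PassThru | %{ sleep 10; $_ } | kill
    }
}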

PowerShell Memory leak misunderstanding

New to PowerShell, so kind of learning by doing.
The process I have created works, but it ends up locking down my machine until it is completed, eating up all memory. I thought I had this fixed by looking into forcing the garbage collector, and also moving from a for-each statement to using %() to loop through everything.
Quick synopsis of the process: I need to merge multiple SharePoint log files into single ones to track usage across all of the company's different SharePoint sites. PowerShell loops through all log directories on the SP server and checks whether each file in the directory already exists on my local machine. If it does exist it appends the file text, otherwise it does a straight copy. Rinse and repeat for each file and directory on the SharePoint log server. Between each loop, I'm forcing the GC because... well, because my basic understanding is that the looped variables are held in memory, and I want to flush them. I'm probably looking at this all wrong. So here is the script in question.
$FinFiles = 'F:\Monthly Logging\Logs'
dir -path '\\SP-Log-Server\Log-Directory' | ?{$_.PSIsContainer} | %{
    $CurrentDir = $_
    dir $CurrentDir.FullName | ?{-not $_.PSIsContainer} | %{
        if($_.Extension -eq ".log"){
            $DestinationFile = $FinFiles + '\' + $_.Name
            if((Test-Path $DestinationFile) -eq $false){
                New-Item -ItemType file -path $DestinationFile -Force
                Copy-Item $_.FullName $DestinationFile
            }
            else{
                $A = Get-Content $_.FullName ; Add-Content $DestinationFile $A
                Write-Host "Log File"$_.FullName"merged."
            }
            [GC]::Collect()
        }
        [GC]::Collect()
    }
}
Granted the completed/appended log files get very very large (min 300 MB, max 1GB). Am I not closing something I should be, or keeping something open in memory? (It is currently sitting at 7.5 of my 8 Gig memory total.)
Thanks in advance.
Don't nest Get-ChildItem commands like that. Use wildcards instead. Try: dir "\\SP-Log-Server\Log-Directory\*\*.log" instead. That should improve things to start with. Then move this to a ForEach($X in $Y){} loop instead of a ForEach-Object{} loop (what you're using now). I'm betting that takes care of your problem.
So, re-written just off the top of my head:
$FinFiles = 'F:\Monthly Logging\Logs'
ForEach($LogFile in (dir -path '\\SP-Log-Server\Log-Directory\*\*.log')){
    $DestinationFile = $FinFiles + '\' + $LogFile.Name
    if((Test-Path $DestinationFile) -eq $false){
        New-Item -ItemType file -path $DestinationFile -Force
        Copy-Item $LogFile.FullName $DestinationFile
    }
    else{
        $A = Get-Content $LogFile.FullName ; Add-Content $DestinationFile $A
        Write-Host "Log File"$LogFile.FullName"merged."
    }
}
Edit: Oh, right, Alexander Obersht may well be right too; you could benefit from a StreamReader approach. At the very least you should use the -ReadCount argument to Get-Content, and there's no reason to save the content to a variable; just pipe it straight to the Add-Content cmdlet.
Get-Content $LogFile.FullName -ReadCount 5000 | Add-Content $DestinationFile
To explain my answer a little more, if you use ForEach-Object in the pipeline it keeps everything in memory (regardless of your GC call). Using a ForEach loop does not do this, and should take care of your issue.
You might find this and this helpful.
In short: Add-Content, Get-Content and Out-File are convenient but notoriously slow when you need to deal with large amounts of data or I/O operations. You want to fall back to StreamReader and StreamWriter .NET classes for performance and/or memory usage optimization in cases like yours.
Code sample:
$sInFile = "infile.txt"
$sOutFile = "outfile.txt"
$oStreamReader = New-Object -TypeName System.IO.StreamReader -ArgumentList @($sInFile)
# $true sets append mode.
$oStreamWriter = New-Object -TypeName System.IO.StreamWriter -ArgumentList @($sOutFile, $true)
# Read one line at a time so only a single line is ever held in memory.
while (($sLine = $oStreamReader.ReadLine()) -ne $null) {
    $oStreamWriter.WriteLine($sLine)
}
$oStreamReader.Close()
$oStreamWriter.Close()
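Applied to the log-merge loop above, the same stream-based pattern might look something like this (just a sketch combining the two answers, not tested against the original environment):
$FinFiles = 'F:\Monthly Logging\Logs'
ForEach($LogFile in (dir -path '\\SP-Log-Server\Log-Directory\*\*.log')){
    $DestinationFile = Join-Path $FinFiles $LogFile.Name
    if(-not (Test-Path $DestinationFile)){
        Copy-Item $LogFile.FullName $DestinationFile
    }
    else{
        # Stream the source log into the destination instead of loading it all with Get-Content
        $oReader = New-Object System.IO.StreamReader -ArgumentList @($LogFile.FullName)
        $oWriter = New-Object System.IO.StreamWriter -ArgumentList @($DestinationFile, $true)  # $true = append
        while (($sLine = $oReader.ReadLine()) -ne $null) {
            $oWriter.WriteLine($sLine)
        }
        $oReader.Close()
        $oWriter.Close()
        Write-Host "Log File" $LogFile.FullName "merged."
    }
}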

Why is this PowerShell script so slow? How can I speed it up?

I developed this script to apply Sitecore workflows to whole swaths of items without having to manually click through the GUI. I'm pretty pleased with how well it works, but it's just slow. Here is the script:
Import-Module 'C:\Subversion\CMS Build\branches\1.2\sitecorepowershell\Sitecore.psd1'
# Hardcoded IDs of workflows and states
$ContentApprovalWfId = "{7005647C-2DAC-4C32-8A09-318000556325}";
$ContentNoApprovalWfId = "{BCBE4080-496F-4DCB-8A3F-6682F303F3B4}";
$SettingsWfId = "{7D2BA7BE-6A0A-445D-AED7-686385145340}";
#new-psdrive *REDACTED*
set-location m-rocks:
function ApplyWorkflows([string]$path, [string]$WfId) {
    Write-Host "ApplyWorkflows called: " $path " - " $wfId;
    $items = Get-ChildItem -Path $path;
    $items | foreach-object {
        if($_ -and $_.Name) {
            $newPath = $path + '\' + $_.Name;
            $newPath;
        } else {
            Write-Host "Name is empty.";
            return;
        }
        if($_.TemplateName -eq "Folder" -or $_.TemplateName -eq "Template Folder") {
            # don't apply workflows to pure folders, just recurse
            Write-Host $_.Name " is a folder, recursing.";
            ApplyWorkflows $newPath $wfId;
        }
        elseif($_.TemplateName -eq "Siteroot" -or $_.TemplateName -eq "InboundSiteroot") {
            # Apply content-approval workflow
            Set-ItemProperty $newPath -name "__Workflow" $ContentApprovalWfId;
            Set-ItemProperty $newPath -name "__Default workflow" $ContentApprovalWfId;
            # Apply content-no-approval workflow to children
            Write-Host $_.Name " is a siteroot, applying approval workflow and recursing.";
            ApplyWorkflows $newPath $ContentNoApprovalWfId;
        }
        elseif($_.TemplateName -eq "QuotesHomePage") {
            # Apply settings workflow to item and children
            Write-Host $_.Name " is a quotes item, applying settings workflow and recursing.";
            Set-ItemProperty $newPath -name "__Workflow" $SettingsWfId;
            Set-ItemProperty $newPath -name "__Default workflow" $SettingsWfId;
            ApplyWorkflows $newPath $SettingsWfId;
        }
        elseif($_.TemplateName -eq "Wildcard")
        {
            Write-Host $_.Name " is a wildcard, applying workflow (and halting).";
            Set-ItemProperty $newPath -name "__Workflow" $ContentApprovalWfId;
            Set-ItemProperty $newPath -name "__Default workflow" $ContentApprovalWfId;
        }
        elseif($_ -and $_.Name) {
            # Apply passed in workflow and recurse with passed in workflow
            Write-Host $_.Name " is something else, applying workflow and recursing.";
            Set-ItemProperty $newPath -name "__Workflow" $WfId;
            Set-ItemProperty $newPath -name "__Default workflow" $WfId;
            ApplyWorkflows $newPath $wfId;
        }
    }
}
ApplyWorkflows "sitecore\Content\" $ContentNoApprovalWfId;
It processes one item in a little less than a second. There are some pauses in its progress - evidence suggests that this is when Get-ChildItem returns a lot of items. There are a number of things I would like to try, but it's still running against one of our sites. It's been about 50 minutes and looks to be maybe 50% done, maybe less. It looks like it's working breadth-first, so it's hard to get a handle on exactly what's done and what's not.
So what's slowing me down?
Is it the path construction and retrieval? I tried to just get the children on the current item via $_ or $_.Name, but it always looks in the current working directory, which is the root, and can't find the item. Would changing the directory on every recursion be any faster?
Is it the output that's bogging it down? Without the output, I have no idea where it is or that it's still working. Is there some other way I could get indication of where it is, how many it has done, etc.?
Is there a better approach where I just use Get-ChildItem -r with filter sets and loop through those? If so, a first attempt at incorporating some of my conditionals in the first script into a filter set would be very much appreciated. I am new to PowerShell, so I'm sure there's more than one or two improvements to be made in my code.
Is it that I always call the recursive bit even if there aren't any children? The content tree here is very broad with a very many leaves with no children. What would be a good check whether or not child items exist?
Finally, the PowerShell provider (PSP) we have is not complete. It does not seem to have a working implementation of Get-Item, which is why everything is almost completely written with Get-ChildItem instead. Our Sitecore.Powershell.dll says it is version 0.1.0.0. Would upgrading that help? Is there a newer one?
Edit: it finally finished. I did a count on the output and came up with 1857 items and it took ~85 minutes to run, for an average of 21 items per minute. Slower than I thought, even...
Edit: My first run was on PowerShell 2.0, using the Windows PowerShell ISE. I've not tried the Sitecore PowerShell plugin module or the community console. I didn't even know it existed until yesterday :-)
I tried another run after upgrading to PowerShell 3.0. Starting locally - running the script from my laptop, connecting to the remote server - there was no noticeable difference. I installed PowerShell 3.0 on the hosting box and ran the script from there and saw maybe a 20-30% increase in speed. So that's not the silver bullet I was hoping it would be - I need an order of magnitude or two's worth of improvement to make this something I won't have to babysit and run in batches. I am now playing around with some of the actual script improvements suggested by the fine answers below. I will post back with what works for me.
Personally I think the biggest boost you would get if you started using the community PowerShell implementation over the Rocks one.
Let me explain why.
You're traversing the whole tree which means you have to visit every node in your branch, which means it has to be read and travel over the Rocks web service at least once.
Then every property save is another webservice call.
I have run your script in the community console and it took me around 25 seconds for 3724 items.
(I've removed the modifications as the values didn't relate to my system).
A Simple
Get-ChildItem -recurse
On my 3724 item tree took 11 seconds in the community console vs 48 seconds in the Rocks implementation.
Additional tweaks you could use in the Community implementation for your script would be using the Sitecore query like:
get-item . -Query '/sitecore/content/wireframe//*[@@TemplateName="Template Folder"]'
and only send those items into your function
None of this means the Rocks console is not written right; it just means the design choices and target of the Rocks console are different.
You can find the community console here:
http://bit.ly/PsConScMplc
See this blog post, which explains the differences between the ForEach-Object cmdlet and the foreach statement.
You could speed it up when you pipe the get-childitem with foreach object:
get-childitem . | foreach-object { ApplyWorkflow($_) }
This causes each object returned by Get-ChildItem to be passed immediately to the next step in the pipeline, so you process items only once. It should also prevent the long pauses caused by reading all the children first.
Additionally you can get all the items recursively and filter them by template and then apply appropriate workflows, eg:
get-childitem -recurse . | where-object { $_.TemplateName -eq "MyTemplate"} | foreach-object { ApplyWorkflowForMyTemplate($_) }
get-childitem -recurse . | where-object { $_.TemplateName -eq "MySecondTemplate"} | foreach-object { ApplyWorkflowForMySecondTemplate($_) }
Still, I would not expect this script to run in seconds anyway; in the end you are walking the whole content tree.
And finally, what library are you using? Is this the Sitecore PowerShell Console (the name of the DLL sounds familiar)? There is a newer version which has a lot of new features added.
The most obvious problem I see is that you are iterating over the files twice per invocation of the function:
function ApplyWorkflows([string]$path, [string]$WfId) {
    Write-Host "ApplyWorkflows called: " $path " - " $wfId;
    # this iterates through all files and assigns to $items
    $items = Get-ChildItem -Path $path;
    # this iterates through $items for a second time
    $items | foreach-object { # foreach-object is slower than for (...)
Also, piping to foreach-object is slower than using the traditional for keyword.
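A sketch of how the body might look with the foreach statement instead (only the loop construct changes; the template checks and Set-ItemProperty calls stay as in the question, with $item in place of $_):
function ApplyWorkflows([string]$path, [string]$WfId) {
    Write-Host "ApplyWorkflows called: " $path " - " $wfId;
    # Enumerate the children once, then loop with the foreach statement
    foreach ($item in (Get-ChildItem -Path $path)) {
        if (-not ($item -and $item.Name)) {
            Write-Host "Name is empty.";
            continue;   # 'continue', since 'return' inside foreach would exit the whole function
        }
        $newPath = $path + '\' + $item.Name;
        # ... template checks, Set-ItemProperty calls and recursion as before, using $item instead of $_ ...
    }
}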

PowerShell script to echo files from folders and subfolders and then delete files over X days old

I am curious to understand the possible ways to echo the files in folders and subfolders and generate an output stating the filenames which are picked up for deletion when older than X days.
I wanted to write this script in two different levels
Level1:
A PowerShell script that only echoes filenames and gives me the output of the files which have been identified for deletion. This should include files in folders and subfolders.
Level2:
Extend the Level 1 script by adding delete functionality, which would delete the files in folders and subfolders.
I have a move script and a direct script to delete but I want to ensure the correct files are picked and I want to know the file names which are being deleted.
Any help is highly appreciated.
EDIT Added from comment
I have been trying something like this in a very simple fashion
Get-ChildItem -Path c:\test | where {$_.lastWriteTime -lt (Get-Date).addDays(-60)}
I would like to add some parameter, which would generate an output of filenames in a different folder location.
I think this is something along the lines of what you need. I have introduced a few concepts which you might not be aware of, such as CmdletBinding, which allows you to dry-run your script using the -WhatIf parameter. You can also supply -Verbose to see what is happening along the way, and you could append to a log at that point using the Add-Content cmdlet.
So you might run it like this:
.\DeleteOldFiles.ps1 -Path c:\test -Age 50 -WhatIf -Verbose
Then when you are ready to delete the files you can run it without the -WhatIf parameter:
.\DeleteOldFiles.ps1 -Path c:\test -Age 50 -Verbose
This doesn't answer all your questions, but should help you get started. I've put plenty of comments in the code, so you should be able to follow it all.
# Add CmdletBinding to support -Verbose and -WhatIf
[CmdletBinding(SupportsShouldProcess=$True)]
param
(
    # Mandatory parameter including a test that the folder exists
    [Parameter(Mandatory=$true)]
    [ValidateScript({Test-Path $_ -PathType 'Container'})]
    [string]
    $Path,

    # Optional parameter with a default of 60
    [int]
    $Age = 60
)

# Identify the items, and loop around each one
Get-ChildItem -Path $Path | where {$_.lastWriteTime -lt (Get-Date).addDays(-$Age)} | ForEach-Object {
    # display what is happening
    Write-Verbose "Deleting $_ [$($_.lastWriteTime)]"
    # delete the item (whatif will do a dry run)
    $_ | Remove-Item
}
The question is a little vague, but I assume this is something like what you want.
I like David Martin's answer, but it may be a little too complex depending on your skill level and needs.
param(
    [string]$Path,
    [switch]$LogDeletions
)
foreach($Item in $(Get-ChildItem -Path $Path | where {$_.lastWriteTime -lt (Get-Date).addDays(-60)}))
{
    if($LogDeletions)
    {
        $Item | Out-File "C:\Deleted.Log" -Append
    }
    rm $Item
}
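Saved as, say, DeleteOldFiles2.ps1 (the file name is arbitrary), it could be run like this:
# Delete files in C:\test older than 60 days, appending each deleted item to C:\Deleted.Log
.\DeleteOldFiles2.ps1 -Path C:\test -LogDeletions

# Delete without logging
.\DeleteOldFiles2.ps1 -Path C:\test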