Hello and thank you in advance. I currently have a script running that downloads files from one location to another. I'm trying to clean up the script so it only downloads files that have _Lthumb in the file name. Any help would be greatly appreciated.
The following line, which I added, is causing the problem. When I remove it, the script runs fine, but it doesn't filter the files by _Lthumb:
| Where-Object {$_.Name -Match '*_Lthumb'}
Here is the script, including the portion above that breaks it:
if ((Get-PSSnapin | Where {$_.Name -eq "Microsoft.SharePoint.PowerShell"}) -eq $null) {
    Add-PSSnapin Microsoft.SharePoint.PowerShell;
}
######################## Start Variables ########################
$destination = "\\pulse-dev.hinshawad.com\C$\ProfilePhotos\ProfilePictures"
$webUrl = "http://mysites.hinshawad.com"
$listUrl = "http://mysites.hinshawad.com/user photos/profile pictures/"
##############################################################
$web = Get-SPWeb -Identity $webUrl
$list = $web.GetList($listUrl)
function ProcessFolder {
    param($folderUrl)
    $folder = $web.GetFolder($folderUrl)
    foreach ($file in $folder.Files | Where-Object {$_.Name -Match '*_Lthumb'}) {
        #Ensure destination directory
        $destinationfolder = $destination + "/" + $folder.Url
        if (!(Test-Path -path $destinationfolder))
        {
            $dest = New-Item $destinationfolder -type directory
        }
        #Download file
        $binary = $file.OpenBinary()
        #$stream = New-Object System.IO.FileStream($destinationfolder + "/" + $file.Name), Create
        $stream = New-Object System.IO.FileStream($destinationfolder + "/" + ($file.Name -replace "_Lthumb")), Create
        $writer = New-Object System.IO.BinaryWriter($stream)
        $writer.write($binary)
        $writer.Close()
    }
}
#Download root files
ProcessFolder($list.RootFolder.Url)
#Download files in folders
#foreach ($folder in $list.Folders) {
#ProcessFolder($folder.URL)
#}
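(For reference, the added filter is the likely culprit: -Match takes a regular expression, and '*_Lthumb' is not a valid regex because the * quantifier has nothing to repeat. Two possible fixes, assuming the goal is "name contains _Lthumb":)

$folder.Files | Where-Object {$_.Name -match '_Lthumb'}    # -match is regex: drop the leading *
$folder.Files | Where-Object {$_.Name -like '*_Lthumb*'}   # or -like, where * is a wildcard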
I'm BRAND new to PowerShell scripting and am looking for some advice, please.
We replace a data share server every couple of years, and creating the complete folder structure and permissions by hand is very tedious, so I'm trying to automate it with a PowerShell script. Since I'm new, I've been googling for examples and snippets and compiling what I need from them.
My export script reads the folder structure and writes it to a text file, and my import script recreates it once I move the folder over to the new server, no problem.
The problem comes with the access rights.
The export reads the rights and writes them to a CSV, but when I try to import them I get an error:
new-object : Cannot convert argument "2", with value: "TRUE", for
"FileSystemAccessRule" to type
"System.Security.AccessControl.AccessControlType": "Cannot convert
value "TRUE" to type
"System.Security.AccessControl.AccessControlType". Error: "Unable to
match the identifier name TRUE to a valid enumerator name. Specify one
of the following enumerator names and try again: Allow, Deny"" At
line:1 char:23
... ccessRule = new-object System.Security.AccessControl.FileSystemAccess ...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
CategoryInfo : InvalidOperation: (:) [New-Object], MethodException
FullyQualifiedErrorId : ConstructorInvokedThrowException,Microsoft.PowerShell.Commands.NewObjectCommand
As I understand it, it's looking for an Allow/Deny and not a True/False, but the export gives a True/False. So I'm guessing there's something wrong with my export...
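(For reference, the three-argument FileSystemAccessRule constructor is (IdentityReference, FileSystemRights, AccessControlType), so the exported True/False IsInherited value is landing where Allow/Deny belongs. A minimal sketch with an explicit AccessControlType; the identity is illustrative:)

$accessRule = New-Object System.Security.AccessControl.FileSystemAccessRule -ArgumentList @(
    'DOMAIN\UserA',   # IdentityReference
    'FullControl',    # FileSystemRights
    'Allow'           # AccessControlType: must be Allow or Deny
)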
Here is my code; if anyone could point me in the right direction I would greatly appreciate it!!
(Let me know if I should post ALL the code; I just don't want to clutter things more than I already do :D)
Export:
$FolderPath = dir -Directory -Path $DriveLetter -Force
$Report = @()
Foreach ($Folder in $FolderPath)
{
    if ($Folder.Name -notlike '*$RECYCLE.BIN*')
    {
        if ($Folder.Name -notlike '*System Volume Information*')
        {
            $Acl = Get-Acl -Path $Folder.FullName
            foreach ($Access in $acl.Access)
            {
                $Properties = [ordered]@{'FolderName'=$Folder.Name;'IDRef'=$Access.IdentityReference;'FSRights'=$Access.FileSystemRights;'Inherited'=$Access.IsInherited}
                $Report += New-Object -TypeName PSObject -Property $Properties
            }
        }
    }
}
$Report | Export-Csv -path $ExportACL -NoTypeInformation
Import:
foreach ($LItem in $ACL_Imp)
{
    $path_full = $Drivepath.ToString() + $LItem.FolderName
    $ACL_Set = Get-Acl $path_full
    $permission = $LItem.IDRef, $LItem.FSRights, $LItem.Inherited
    $accessRule = new-object System.Security.AccessControl.FileSystemAccessRule $permission   # <<<--- Error occurs here
    $ACL_Set.SetAccessRule($accessRule)
    $ACL_Set | Set-Acl $path_full
}
Example of one user in the exported CSV (I remove the drive letter because it isn't always the same drive letter):
#TYPE System.Management.Automation.PSCustomObject
FolderName;IDRef;FSRights;Inherited
Data\UserA;Domain\UserA;FullControl;FALSE
Data\UserA;NT AUTHORITY\SYSTEM;FullControl;TRUE
Data\UserA;DOMAIN\UserB;FullControl;TRUE
Data\UserA;BUILTIN\Administrators;FullControl;TRUE
Data\UserA;DOMAIN\GRP_A;ReadAndExecute, Synchronize;TRUE
Data\UserA;Domain\GRP_A;ReadAndExecute, Synchronize;TRUE
Once again, thanks in advance for any assistance!
And if you can't provide any, thanks for taking the time to check it out anyway!! :)
I've changed the number of properties I export and import, and that seemed to do the trick (I now export everything and use only five of the values on import).
I'm posting my full code in case someone else wants to use this, or modify it for their needs :)
Hope this helps someone in the future, and that my comments make sense.
Our Directory structure:
ImportExport <-- Location of scripts and output files (whole folder to be copied to new server)
Shared
Software
Users/UserA/
Users/UserB/
....
Users/UserZ/
Export:
#Variables
$drivepath = Get-Location #Get working drive's letter
$DriveLetter = Split-Path -qualifier $drivepath
$ExportACL = $DriveLetter.ToString() + "\ImportExport\export_acl.csv" #ACL location
$ExportFolders = $DriveLetter.ToString() + "\ImportExport\export_folders.txt" #File with list of folders
$UsersPath = $DriveLetter.ToString() + "\Users" #"Users" folders location on working drive, located in the Data folder
#Read user access levels on each folder on working drive and write to file.
cd .. #<-- add this if the script is not run from within the PS environment.
$FolderPath = dir -Directory -Path $DriveLetter -Force
$Report = @()
Foreach ($Folder in $FolderPath)
{
    if ($Folder.Name -notlike '*$RECYCLE.BIN*')
    {
        if ($Folder.Name -notlike '*ImportExport*')
        {
            if ($Folder.Name -notlike '*System Volume Information*')
            {
                $Acl = Get-Acl -Path $Folder.FullName
                foreach ($Access in $acl.Access)
                {
                    $Properties = [ordered]@{'FolderName'=$Folder.Name;'FSRights'=$Access.FileSystemRights;'ACType'=$Access.AccessControlType;'IDRef'=$Access.IdentityReference;'Inherited'=$Access.IsInherited;'IFlags'=$Access.InheritanceFlags;'PFlags'=$Access.PropagationFlags}
                    $Report += New-Object -TypeName PSObject -Property $Properties
                }
            }
        }
    }
}
$Report | Export-Csv -path $ExportACL -NoTypeInformation
#Read user access levels on each child folder of the Users folders on working drive and add to file.
$FolderPath = dir -Directory -Path $UsersPath -Force
$UserReport = @()
Foreach ($Folder in $FolderPath)
{
    if ($Folder.Name -notlike '*$RECYCLE.BIN*')
    {
        if ($Folder.Name -notlike '*ImportExport*')
        {
            if ($Folder.Name -notlike '*System Volume Information*')
            {
                $Acl = Get-Acl -Path $Folder.FullName
                foreach ($Access in $acl.Access)
                {
                    $StrFolderPath = $Folder.Parent.ToString() + "\" + $Folder.BaseName
                    $Properties = [ordered]@{'FolderName'=$StrFolderPath;'FSRights'=$Access.FileSystemRights;'ACType'=$Access.AccessControlType;'IDRef'=$Access.IdentityReference;'Inherited'=$Access.IsInherited;'IFlags'=$Access.InheritanceFlags;'PFlags'=$Access.PropagationFlags}
                    $UserReport += New-Object -TypeName PSObject -Property $Properties
                }
            }
        }
    }
}
$UserReport | Export-Csv -Append $ExportACL
#Read directory structure and export to file
$Dirs = Dir -Directory -Path $DriveLetter -Force
foreach ($Dir in $Dirs)
{
    if ($Dir.Name -notlike '*$RECYCLE.BIN*')
    {
        if ($Dir.Name -notlike '*ImportExport*')
        {
            if ($Dir.Name -notlike '*System Volume Information*')
            {
                $Dir.Name | out-file -Append $ExportFolders
            }
        }
    }
}
$Dirs = Get-ChildItem -Path $UsersPath -Force
foreach ($Dir in $Dirs)
{
    $DirName = "Users\" + $Dir.Name
    $DirName | out-file -Append $ExportFolders
}
Before importing, I open the CSV file in Excel and split it into separate columns, which is why I use ";" as the delimiter. I struggle if I don't edit it first: properties with multiple values are separated by ",", and that messes up the import.
Also, because admin rights are needed to apply ACLs, the script has to run in an elevated PowerShell session; I can't figure out how to do it from outside one. I could probably run a batch file to do a RunAs, but I've had enough of these scripts for the time being. :p
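(As an aside: the Excel step could probably be avoided by exporting with a semicolon delimiter in the first place, since Export-Csv supports -Delimiter. A sketch using the variable names from the export script above; -Append on Export-Csv needs PowerShell 3.0+.)

$Report | Export-Csv -Path $ExportACL -Delimiter ';' -NoTypeInformation
$UserReport | Export-Csv -Path $ExportACL -Delimiter ';' -NoTypeInformation -Append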
Import:
#Variables
$drivepath = Get-Location #Get working drive's letter
$DriveLetter = Split-Path -qualifier $drivepath
$ImportACL = $DriveLetter.ToString() + "\ImportExport\export_acl.csv" #ACL location
$ACL_Imp = Import-Csv $ImportACL -Delimiter ';' #Import ACL
$ImportFolders = $DriveLetter.ToString() + "\ImportExport\export_folders.txt" #File with list of folders
$UsersPath = $DriveLetter.ToString() + "\Users" #Users' folder location on working drive
#Create folders from text file.
Get-Content $ImportFolders | %{mkdir "$DriveLetter\$_"}
#Apply ACL to created folder structure
foreach ($LItem in $ACL_Imp)
{
    $path_full = $Drivepath.ToString() + $LItem.FolderName
    $ACL_Set = Get-Acl $path_full
    $permission = $LItem.IDRef, $LItem.FSRights, $LItem.IFlags, $LItem.PFlags, $LItem.ACType
    $accessRule = new-object System.Security.AccessControl.FileSystemAccessRule $permission
    $ACL_Set.SetAccessRule($accessRule)
    $ACL_Set | Set-Acl $path_full
}
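(On the elevation point: a script can relaunch itself as administrator, which might replace the batch-file RunAs idea. A minimal sketch; $PSCommandPath requires PowerShell 3.0+.)

# Relaunch this script elevated if it isn't already running as administrator
$id = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object Security.Principal.WindowsPrincipal($id)
if (-not $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    Start-Process powershell.exe -Verb RunAs -ArgumentList "-File `"$PSCommandPath`""
    exit
}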
I'm wondering if someone can help me? I've butchered a few PowerShell scripts I found online that make shortcuts from $source to $destination. However, it appears to overwrite each time, and I only want it to create a .lnk for new files.
The original source of the script is here, and this is my current "non-working" script. I added the following, but it doesn't seem to work. I think I need to somehow get it to check $destination and only continue if $file.lnk doesn't exist:
If ($status -eq $false) {($WshShell.fso.FileExists("$Destination") + "*.lnk")
Full script:
function Create-ShortcutForEachFile {
    Param(
        [ValidateNotNullOrEmpty()][string]$Source,
        [ValidateNotNullOrEmpty()][string]$Destination,
        [switch]$Recurse
    )
    # set recurse if present
    if ($Recurse.IsPresent) { $splat = @{ Recurse = $true } }
    # Getting all the source files and source folders
    $gci = gci $Source @splat
    $Files = $gci | ? { !$_.PSisContainer }
    $Folders = $gci | ? { $_.PsisContainer }
    # Creating all the folders
    if (!(Test-Path $Destination)) { mkdir $Destination -ea SilentlyContinue > $null }
    $Folders | % {
        $Target = $_.FullName -replace [regex]::escape($Source), $Destination
        mkdir $Target -ea SilentlyContinue > $null
    }
    # Creating Wscript object
    $WshShell = New-Object -comObject WScript.Shell
    # Creating all the links
    If ($status -eq $false) {($WshShell.fso.FileExists("$Destination") + "*.lnk")
    $Files | % {
        $InkName = "{0}.lnk" -f $_.BaseName
        $Target = ($_.DirectoryName -replace [regex]::escape($Source), $Destination) + "\" + $InkName
        $Shortcut = $WshShell.CreateShortcut($Target)
        $Shortcut.TargetPath = $_.FullName
        $Shortcut.Save()
    }
    }
}
Create-ShortcutForEachFile -Source \\myserver.domain.local\Folder1\Folder2\Test -Destination \\myserver2.domain.local\Folder1\Folder2\Test -Recurse
Hoping anyone can help me out; apologies for being a PowerShell/scripting noob.
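(A minimal guard, for reference: test for the .lnk before creating it, using the same $Target naming as the script above.)

$Files | % {
    $InkName = "{0}.lnk" -f $_.BaseName
    $Target = ($_.DirectoryName -replace [regex]::escape($Source), $Destination) + "\" + $InkName
    if (!(Test-Path $Target)) {   # only create the shortcut if it doesn't already exist
        $Shortcut = $WshShell.CreateShortcut($Target)
        $Shortcut.TargetPath = $_.FullName
        $Shortcut.Save()
    }
}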
My brother kindly reworked the script to better suit my needs.
Here it is:
#################################################
<#
CREATE-SHORTCUT - creates shortcut for all files from a source folder
version : 1.0
Author :
Creation Date :
Modified Date :
#>
#------------------------------------------------------------[ variables ]----------------------------------------------------------
$sourceDir="D:\scripts\create-shortcut\source"
$targetDir="D:\scripts\create-shortcut\dest"
#-------------------------------------------------------------[ Script ]-----------------------------------------------------------
# get files/folders from the source folder
$src_gci = Get-Childitem -path $sourceDir -Recurse
$src_files = $src_gci | ? { !$_.PSisContainer }
$src_folders = $src_gci | ? { $_.PSisContainer }
# create subfolders
$src_folders | Copy-Item -Destination { Join-Path $targetDir $_.Parent.FullName.Substring($sourceDir.Length) } -Force
# create shortcuts
$WshShell = New-Object -comObject WScript.Shell
$src_files | % {
    $lnkName = "{0}.lnk" -f $_.BaseName
    $Target = ($_.DirectoryName -replace [regex]::escape($sourceDir), $targetDir) + "\" + $lnkName
    $Shortcut = $WshShell.CreateShortcut($Target)
    $Shortcut.TargetPath = $_.FullName
    $Shortcut.Save()
    # set the shortcut's LastWriteTime to the source file's modified date
    $src_date = $_.LastWriteTime
    Get-ChildItem $Target | % { $_.LastWriteTime = "$src_date" }
}
...and thank you in advance!!! I was able to successfully move documents from one server (SharePoint) to another (Pulse). However, each document name begins with HC_ (they are .jpg files).
How can I move the documents and rename them at the same time? Currently I am using two separate scripts and two separate scheduled tasks, which is less than ideal.
First script, which moves the documents from the SharePoint server:
######################## Start Variables ########################
######################## Adam's Script ########################
$destination = "\\pulse-dev.domain.com\C$\ProfilePhotos\"
$webUrl = "http://mysites-dev.domain.com"
$listUrl = "http://mysites-dev.domain.com/user photos/"
##############################################################
$web = Get-SPWeb -Identity $webUrl
$list = $web.GetList($listUrl)
function ProcessFolder {
    param($folderUrl)
    $folder = $web.GetFolder($folderUrl)
    foreach ($file in $folder.Files) {
        #Ensure destination directory
        $destinationfolder = $destination + "/" + $folder.Url
        if (!(Test-Path -path $destinationfolder))
        {
            $dest = New-Item $destinationfolder -type directory
        }
        #Download file
        $binary = $file.OpenBinary()
        $stream = New-Object System.IO.FileStream($destinationfolder + "/" + $file.Name), Create
        $writer = New-Object System.IO.BinaryWriter($stream)
        $writer.write($binary)
        $writer.Close()
    }
}
#Download root files
ProcessFolder($list.RootFolder.Url)
#Download files in folders
#foreach ($folder in $list.Folders) {
#ProcessFolder($folder.URL)
#}
Second script, run on the Pulse server after the first script completes:
$dir = "C:\ProfilePhotos\"
CD $dir
Get-ChildItem -Recurse |
    Where-Object {$_.Name -match 'HC_'} |
    Rename-Item -NewName {$_.Name -replace 'HC_', ''}
Replace
$stream = New-Object System.IO.FileStream($destinationfolder + "/" + $file.Name), Create
with
$stream = New-Object System.IO.FileStream($destinationfolder + "/" + ($file.Name -replace "^HC_")), Create
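(The ^ anchors the pattern to the start of the file name, so only a leading HC_ is stripped; renaming during the download also removes the need for the second script and scheduled task. Illustrative only:)

"HC_photo.jpg" -replace "^HC_"      # -> photo.jpg
"my_HC_photo.jpg" -replace "^HC_"   # -> my_HC_photo.jpg (no leading HC_, unchanged)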
I want to use PowerShell to automate the:
1. compression of log files (.xml and .dat extensions) older than 7 days,
2. copying of these compressed archives elsewhere, and
3. deletion of the raw log files from the source.
I am using the following PowerShell script, which I pieced together from various resources.
function New-Zip
{
    param([string]$zipfilename)
    # write the 22-byte header of an empty zip file so Explorer treats it as a zip
    set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
    (dir $zipfilename).IsReadOnly = $false
}
function Add-Zip
{
    param([string]$zipfilename)
    if (-not (test-path($zipfilename)))
    {
        set-content $zipfilename ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        (dir $zipfilename).IsReadOnly = $false
    }
    $shellApplication = new-object -com shell.application
    $zipPackage = $shellApplication.NameSpace($zipfilename)
    foreach ($file in $input)
    {
        # CopyHere is asynchronous, hence the short sleep after each file
        $zipPackage.CopyHere($file.FullName)
        Start-sleep -milliseconds 500
    }
}
$targetFolder = 'C:\source'
$destinationFolder = 'D:\destination\'
$now = Get-Date
$days = 7
$lastWrite = $now.AddDays(-$days)
Get-ChildItem $targetFolder -Recurse | Where-Object { $_ -is [System.IO.FileInfo] } | ForEach-Object {
    If ($_.LastWriteTime -lt $lastWrite)
    {
        $_ | New-Zip $($destinationFolder + $_.BaseName + ".zip")
        $_ | Add-Zip $($destinationFolder + $_.BaseName + ".zip")
    }
}
Get-ChildItem $targetFolder -Recurse -Include "*.dat", "*.xml" | WHERE {($_.CreationTime -le $(Get-Date).AddDays(-$days))} | Remove-Item -Force
This script works reasonably well: it archives only the files and copies the archives to the destination folder.
However, if I have a structure of C:\source\bigfolder\logfile.dat, the resulting zip file does not preserve the folder structure as I would like:
logfile.zip>bigfolder>logfile.dat
Instead, it just gets: logfile.zip>logfile.dat
Can someone help me figure this out?
To fine-tune it even further, I would like, if possible, to build in some logic so the files are compressed only when a specific criterion is met.
The raw log files that I compress have the following naming routine:
Folders:
emstg#12_list\randomstring.xml
Individual log files:
emstg#12_query_data.xml
emstg#12_events_cache.dat etc...
As you can see, these names all start with emstg#number.
How can I implement a "name-detection" mechanism in the script above?
Thanks
You could zip a folder by using [System.IO.Compression].
I wrote this based on your script.
My idea is to copy the whole folder structure of each file you need to compress into a temp folder, and then zip that temp folder.
For the name detection, you just need another Where-Object (modify the code as you want; see the sketch after the script).
function Zip
{
    param(
        [string]$source,
        [string]$des
    )
    add-type -AssemblyName System.IO.Compression.FileSystem
    # $true keeps the base directory in the archive, preserving the folder structure
    [System.IO.Compression.ZipFile]::CreateFromDirectory($source, $des, 'Optimal', $true)
    Start-sleep -s 1
}
$targetFolder = "C:\source"
$destinationFolder = "C:\destination"
$temp = "C:\temp"
$now = Get-Date
$days = 7
$lastWrite = $now.AddDays(-$days)
$i = 1
Get-ChildItem $targetFolder -Recurse | Where-Object { $_ -is [System.IO.FileInfo] } | Where-Object {$_ -like "*.*"} | ForEach-Object {
    If ($_.LastWriteTime -lt $lastWrite) {
        # Substring, not Trim: Trim() removes a *set of characters*, not a prefix string
        $halfDir = $_.DirectoryName.Substring($targetFolder.Length).TrimStart('\')
        $s = $temp + "\" + $i + "\" + $halfDir
        $d = $destinationFolder + "\" + $_.BaseName + ".zip"
        Copy-Item $_.DirectoryName -Destination $s
        Copy-Item $_.FullName -Destination $s
        Zip -source $s -des $d
        $i++
    }
}
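(A sketch of the extra name-detection filter, assuming the target files are those whose names start with emstg#number:)

# Only pick up files named like emstg#12_..., emstg#7_..., and so on
Get-ChildItem $targetFolder -Recurse |
    Where-Object { $_ -is [System.IO.FileInfo] } |
    Where-Object { $_.Name -match '^emstg#\d+' } |
    ForEach-Object { $_.FullName }   # replace with the zip logic above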
I've got this script running and have nearly completed my mission to print attachments that land in a specific subfolder of Outlook:
$OutputFolder = 'C:\tests';
$outlook = New-Object -ComObject Outlook.Application;
$olFolderInbox = 6;
$ns = $outlook.GetNameSpace("MAPI");
$inbox = $ns.GetDefaultFolder($olFolderInbox);
$inbox.Folders `
| ? Name -eq 'colour' `
| % Items `
| % Attachments `
| % {
$OutputFileName = Join-Path -Path $OutputFolder -ChildPath $_.FileName;
if (Test-Path $OutputFileName) {
$FileDirectoryName = [System.IO.Path]::GetDirectoryName($OutputFileName);
$FileNameWithoutExtension = [System.IO.Path]::GetFileNameWithoutExtension($OutputFileName);
$FileExtension = [System.IO.Path]::GetExtension($OutputFileName);
for ($i = 2; Test-Path $OutputFileName; $i++) {
$OutputFileName = "{0} ({1}){2}" -f (Join-Path -Path $FileDirectoryName -ChildPath $FileNameWithoutExtension), $i, $FileExtension;
}
}
Write-Host $OutputFileName;
$_.SaveAsFile($OutputFileName)
}
Remove-Item -Path C:\tests\*.jpg
Dir C:\tests\ | Out-Printer -name xerox-b8
Remove-Item -Path C:\tests\*.*
When I try to pipe the objects to print, I get XML printing out, or just the directory contents.
I have tried:
select-object (wrong)
get-childitem (wrong)
DIR C:\tests\*.* (only returns a directory listing printout)
These either return a load of XML rubbish or just a directory listing.
How can I pipe the contents of a folder to a printer using PowerShell? Surely this can be done.
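(A note on why this happens: Out-Printer prints the text rendering of the objects it receives, so piping FileInfo objects prints a directory listing, and serialized objects come out as XML. Printing file contents needs Get-Content for text files, or the shell's registered Print verb for formats like .jpg. A sketch, assuming the printer name and paths from the script above:)

# Text files: print the contents, not the FileInfo objects
Get-ChildItem C:\tests\*.txt | ForEach-Object {
    Get-Content $_.FullName | Out-Printer -Name xerox-b8
}
# Images: hand the file to the application registered for the Print verb (default printer)
Get-ChildItem C:\tests\*.jpg | ForEach-Object {
    Start-Process -FilePath $_.FullName -Verb Print
}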