I'm planning to delete my OneDrive subfolders using PowerShell, but I'm unable to because the subfolders are pretty large, with too much nesting depth and too many items. How should I modify my script so the deletion succeeds?
Under the parent folder I used to have 5 subfolders. I was able to delete 3 of them already, but the remaining 2 fail for the reason above.
Add-Type -Path "C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell\Microsoft.SharePoint.Client.Runtime.dll"
Add-Type -Path "C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell\Microsoft.Online.SharePoint.Client.Tenant.dll"
#Key File information for secure connection
$Global:adminUPN = "upn"
$Global:PasswordFile = "C:\scripts\LitHold-OneDrive\key-file\ODpw.txt"
$Global:KeyFile = "C:\scripts\LitHold-OneDrive\key-file\OD.key"
$Global:adminPwd = ""
$CurDate = Get-Date -Format "yyyy-MM-dd"
#Pwd Key Encryption File
$key = Get-Content $Global:KeyFile
$Global:adminPwd = Get-Content $Global:PasswordFile | ConvertTo-SecureString -Key $key
$credential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $Global:adminUPN, $Global:adminPwd
#Variables
$SiteURL = "OD URL"
$ServerRelativeUrl= "Documents/parentFolder"
Try {
#Get Credentials to connect
$Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($Global:adminUPN, $Global:adminPwd)
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
$ctx.Credentials = $Credentials
#Get the web from URL
$Web = $Ctx.web
$Ctx.Load($Web)
$Ctx.executeQuery()
#Get the Folder object by Server Relative URL
$Folder = $Web.GetFolderByServerRelativeUrl($ServerRelativeUrl)
$Ctx.Load($Folder)
$Ctx.ExecuteQuery()
#Call the function to empty the folder (note: when run top-to-bottom as a script, the Empty-SPOFolder definition below must be loaded before this line executes)
Empty-SPOFolder $Folder
#Delete the given Folder itself
Write-host -f Green "Deleting Folder:"$Folder.ServerRelativeUrl
$Folder.Recycle() | Out-Null
$Ctx.ExecuteQuery()
}
Catch {
write-host -f Red "Error:" $_.Exception.Message
}
#Function to Delete all files and Sub-folders of a given Folder
Function Empty-SPOFolder([Microsoft.SharePoint.Client.Folder]$Folder)
{
Try {
#Get All Files from the Folder
$Ctx = $Folder.Context
$Files = $Folder.Files
$Ctx.Load($Files)
$Ctx.ExecuteQuery()
#Iterate through each File in the Root folder
Foreach($File in $Files)
{
#Delete the file
Write-Host -f Green "$($File.Name)"
$Folder.Files.GetByUrl($File.ServerRelativeUrl).Recycle() | Out-Null
#Use DeleteObject() instead of Recycle() to delete permanently - calling both on the same file fails
#$Folder.Files.GetByUrl($File.ServerRelativeUrl).DeleteObject() | Out-Null
Write-host -f Green "Deleted File '$($File.Name)' from '$($File.ServerRelativeUrl)'"
}
$Ctx.ExecuteQuery()
#Process all Sub Folders of the given folder
$SubFolders = $Folder.Folders
$Ctx.Load($SubFolders)
$Ctx.ExecuteQuery()
#delete all subfolders
Foreach($SubFolder in $SubFolders)
{
#Exclude "Forms" and Hidden folders
#Call the function recursively to empty the subfolder
Empty-SPOFolder -Folder $SubFolder
#Delete the subfolder itself
Write-Host -f Green "$($SubFolder.UniqueId)"
#$Ctx.Web.GetFolderById($SubFolder.UniqueId).Recycle() | Out-Null
$Ctx.Web.GetFolderById($SubFolder.UniqueId).DeleteObject() | Out-Null
$Ctx.ExecuteQuery()
Write-host -f Green "Deleted Folder:"$SubFolder.ServerRelativeUrl
}
}
Catch {
write-host -f Red "Error: $($Folder.UniqueId) - $($File.Name)" $_.Exception.Message
}
}
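If the two remaining folders fail simply because of the number of items, one thing worth trying is batching: CSOM only sends queued operations when ExecuteQuery() runs, so recycling files in chunks of, say, 100 per round trip keeps each request small. Below is a minimal, untested sketch of that idea (the batch size of 100 is an arbitrary assumption, not a documented limit); note that if a single folder holds more items than the list view threshold (around 5,000), loading the collection can still fail and you would need to page it instead.
#Sketch: recycle a folder's files in batches of 100 queued operations
Function Empty-SPOFolderBatched([Microsoft.SharePoint.Client.Folder]$Folder)
{
    $Ctx = $Folder.Context
    $Files = $Folder.Files
    $Ctx.Load($Files)
    $Ctx.ExecuteQuery()
    #Snapshot the URLs so the loop does not enumerate a collection being mutated
    $FileUrls = @($Files | ForEach-Object { $_.ServerRelativeUrl })
    $Queued = 0
    Foreach($Url in $FileUrls)
    {
        #Recycle() only queues the operation client-side; nothing is sent yet
        $Folder.Files.GetByUrl($Url).Recycle() | Out-Null
        $Queued++
        If($Queued -ge 100)
        {
            #One round trip flushes the whole batch of queued deletions
            $Ctx.ExecuteQuery()
            $Queued = 0
        }
    }
    If($Queued -gt 0) { $Ctx.ExecuteQuery() }
}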
Fairly new to both SharePoint Online and PowerShell; I thought this would be a pretty simple task, but I'm reaching out for help.
I have a client who has photos stored in multiple folders on a file share, and they want to move them to SharePoint. They want to use the name of the folder where each file exists as metadata, to make searching easier.
This is the script I am using, without much luck.
$connection = Connect-PnPOnline https://somecompany.sharepoint.com -Credentials $me -ReturnConnection
$files = Get-ChildItem "F:\some data" -Recurse
foreach ($file in $files)
{Add-PnPFile -Path $file.FullName -Folder Photos -Values @{"Title" = $file.Name;} -Connection $connection}
The issue I am having is that this does not recurse the folders, and it comes back with "Local file not found".
If I can get that working, I can move on to getting the current folder name into metadata as a variable.
I'm pretty sure this will be a simple task for the experts here, but alas, I am not one. Any help will be greatly appreciated.
Thanks
Jassen
This seems to work for me, so I will answer it myself. Happy to hear comments if there is an easier or cleaner way, and also if anyone knows how to go one more layer deeper.
$connection = Connect-PnPOnline https://somecompany.sharepoint.com -Credentials $me -ReturnConnection
$LocalFolders = get-childitem -path "c:\test" | where-object {$_.Psiscontainer} | select-object FullName
foreach ($folder in $localfolders) {
$files = get-childitem -Path $folder.FullName -Recurse
foreach ($file in $files) {
$value1 = $file.Directory.Name
Add-PnPFile -Path $file.FullName -Folder Photos -Values @{"Title" = $file.Name;"SubCat" = $value1;} -Connection $connection
}
}
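A hedged suggestion for going arbitrarily deep: skip the outer folder loop, let Get-ChildItem -Recurse -File do all the walking, and rebuild each file's subfolder path relative to the local root. The -File switch also keeps directories out of the pipeline, which is the usual cause of the "Local file not found" error from the original script. The Photos library and SubCat column names are carried over from above; this is a sketch, not tested against your tenant.
$connection = Connect-PnPOnline https://somecompany.sharepoint.com -Credentials $me -ReturnConnection
$root = "C:\test"
$files = Get-ChildItem -Path $root -Recurse -File
foreach ($file in $files) {
    # Folder path relative to the local root, e.g. "Cats\2019"
    $relative = $file.DirectoryName.Substring($root.Length).TrimStart('\')
    # Mirror the same subfolder tree under the Photos library
    $target = if ($relative) { "Photos/" + ($relative -replace '\\','/') } else { "Photos" }
    Add-PnPFile -Path $file.FullName -Folder $target -Values @{"Title" = $file.Name; "SubCat" = $file.Directory.Name} -Connection $connection
}
If your PnP PowerShell version does not create missing target folders on upload, run Resolve-PnPFolder on $target before the Add-PnPFile call.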
You could try the script below; you need to install PnP PowerShell first.
function UploadDocuments(){
Param(
[ValidateScript({If(Test-Path $_){$true}else{Throw "Invalid path given: $_"}})]
$LocalFolderLocation,
[String]
$siteUrl,
[String]
$documentLibraryName
)
Process{
$path = $LocalFolderLocation.TrimEnd('\')
Write-Host "Provided Site :"$siteUrl -ForegroundColor Green
Write-Host "Provided Path :"$path -ForegroundColor Green
Write-Host "Provided Document Library name :"$documentLibraryName -ForegroundColor Green
try{
$credentials = Get-Credential
Connect-PnPOnline -Url $siteUrl -CreateDrive -Credentials $credentials
$file = Get-ChildItem -Path $LocalFolderLocation -Recurse
$i = 0;
Write-Host "Uploading documents to Site.." -ForegroundColor Cyan
(dir $path -Recurse) | %{
try{
$i++
if($_.GetType().Name -eq "FileInfo"){
$SPFolderName = $documentLibraryName + $_.DirectoryName.Substring($path.Length);
$status = "Uploading Files :'" + $_.Name + "' to Location :" + $SPFolderName
Write-Progress -activity "Uploading Documents.." -status $status -PercentComplete (($i / $file.length) * 100)
$te = Add-PnPFile -Path $_.FullName -Folder $SPFolderName
}
}
catch{
#Surface per-file failures instead of silently swallowing them
Write-Warning $_.Exception.Message
}
}
}
catch{
Write-Host $_.Exception.Message -ForegroundColor Red
}
}
}
UploadDocuments -LocalFolderLocation C:\Lee\Share -siteUrl https://tenant.sharepoint.com/sites/Developer -documentLibraryName MyDOc4
Hello, and thank you in advance. I currently have a script that downloads files from one location to another. I'm trying to clean up the script to download only files that have _Lthumb in the file name. Any help would be greatly appreciated.
The following line, which I added, is causing the problems. When I remove it, the script runs fine, but it doesn't filter the data by _Lthumb:
| Where-Object {$_.Name -Match '*_Lthumb'}
Here is the script, including the portion above that breaks it:
if((Get-PSSnapin | Where {$_.Name -eq "Microsoft.SharePoint.PowerShell"}) -eq $null) {
Add-PSSnapin Microsoft.SharePoint.PowerShell;
}
######################## Start Variables ########################
$destination = "\\pulse-dev.hinshawad.com\C$\ProfilePhotos\ProfilePictures"
$webUrl = "http://mysites.hinshawad.com"
$listUrl = "http://mysites.hinshawad.com/user photos/profile pictures/"
##############################################################
$web = Get-SPWeb -Identity $webUrl
$list = $web.GetList($listUrl)
function ProcessFolder {
param($folderUrl)
$folder = $web.GetFolder($folderUrl)
foreach ($file in $folder.Files | Where-Object {$_.Name -Match '*_Lthumb'} ) {
#Ensure destination directory
$destinationfolder = $destination + "/" + $folder.Url
if (!(Test-Path -path $destinationfolder))
{
$dest = New-Item $destinationfolder -type directory
}
#Download file
$binary = $file.OpenBinary()
#$stream = New-Object System.IO.FileStream(($destinationfolder + "/" + $file.Name), [System.IO.FileMode]::Create)
$stream = New-Object System.IO.FileStream(($destinationfolder + "/" + ($file.Name -replace "_Lthumb")), [System.IO.FileMode]::Create)
$writer = New-Object System.IO.BinaryWriter($stream)
$writer.write($binary)
$writer.Close()
}
}
#Download root files
ProcessFolder($list.RootFolder.Url)
#Download files in folders
#foreach ($folder in $list.Folders) {
#ProcessFolder($folder.URL)
#}
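For what it's worth, the filter line most likely breaks because -Match takes a regular expression, and a leading * is an invalid regex quantifier ("quantifier following nothing"), so the comparison throws instead of filtering. Two hedged alternatives, either of which should slot into the foreach above:
# -like uses wildcard matching, so the asterisks are legal
$folder.Files | Where-Object { $_.Name -like '*_Lthumb*' }
# -match uses regex; a plain substring is itself a valid pattern
$folder.Files | Where-Object { $_.Name -match '_Lthumb' }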
My objective is to write a PowerShell script that recursively checks a file server for any directories that are "x" days old or older.
I ran into a few issues initially, and I think I've worked most of them out. One of the issues was the 248-character path limitation; I found a custom function that I'm implementing in my code to bypass it.
As the end result, I would like to output the path and LastAccessTime of each folder and export the information to an easy-to-read CSV file.
Currently everything is working, but for some reason some paths are output several times (duplicates, triples, even four times). I just want each directory and subdirectory output once.
I'd appreciate any guidance I can get. Thanks in advance.
Here's my code
#Add the import and snapin in order to perform AD functions
Add-PSSnapin Quest.ActiveRoles.ADManagement -ea SilentlyContinue
Import-Module ActiveDirectory
#Clear Screen
CLS
Function Get-FolderItem
{
[cmdletbinding(DefaultParameterSetName='Filter')]
Param (
[parameter(Position=0,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
[Alias('FullName')]
[string[]]$Path = $PWD,
[parameter(ParameterSetName='Filter')]
[string[]]$Filter = '*.*',
[parameter(ParameterSetName='Exclude')]
[string[]]$ExcludeFile,
[parameter()]
[int]$MaxAge,
[parameter()]
[int]$MinAge
)
Begin
{
$params = New-Object System.Collections.Arraylist
$params.AddRange(@("/L","/S","/NJH","/BYTES","/FP","/NC","/NFL","/TS","/XJ","/R:0","/W:0"))
If ($PSBoundParameters['MaxAge'])
{
$params.Add("/MaxAge:$MaxAge") | Out-Null
}
If ($PSBoundParameters['MinAge'])
{
$params.Add("/MinAge:$MinAge") | Out-Null
}
}
Process
{
ForEach ($item in $Path)
{
Try
{
$item = (Resolve-Path -LiteralPath $item -ErrorAction Stop).ProviderPath
If (-Not (Test-Path -LiteralPath $item -Type Container -ErrorAction Stop))
{
Write-Warning ("{0} is not a directory and will be skipped" -f $item)
Return
}
If ($PSBoundParameters['ExcludeFile'])
{
$Script = "robocopy `"$item`" NULL $Filter $params /XF $($ExcludeFile -join ',')"
}
Else
{
$Script = "robocopy `"$item`" NULL $Filter $params"
}
Write-Verbose ("Scanning {0}" -f $item)
Invoke-Expression $Script | ForEach {
Try
{
If ($_.Trim() -match "^(?<Children>\d+)\s+(?<FullName>.*)")
{
$object = New-Object PSObject -Property @{
ParentFolder = $matches.fullname -replace '(.*\\).*','$1'
FullName = $matches.FullName
Name = $matches.fullname -replace '.*\\(.*)','$1'
}
$object.pstypenames.insert(0,'System.IO.RobocopyDirectoryInfo')
Write-Output $object
}
Else
{
Write-Verbose ("Not matched: {0}" -f $_)
}
}
Catch
{
Write-Warning ("{0}" -f $_.Exception.Message)
Return
}
}
}
Catch
{
Write-Warning ("{0}" -f $_.Exception.Message)
Return
}
}
}
}
Function ExportFolders
{
#================ Global Variables ================
#Path to folders
$Dir = "\\myFileServer\somedir\blah"
#Get all folders
$ParentDir = Get-ChildItem $Dir | Where-Object {$_.PSIsContainer -eq $True}
#Export file to our destination
$ExportedFile = "c:\temp\dirFolders.csv"
#Duration in Days+ the file hasn't triggered "LastAccessTime"
$duration = 800
$cutOffDate = (Get-Date).AddDays(-$duration)
#Used to hold our information
$results = @()
#=============== Done with Variables ===============
ForEach ($SubDir in $ParentDir)
{
$FolderPath = $SubDir.FullName
$folders = Get-ChildItem -Recurse $FolderPath -force -directory| Where-Object { ($_.LastAccessTimeUtc -le $cutOffDate)} | Select-Object FullName, LastAccessTime
ForEach ($folder in $folders)
{
$folderPath = $folder.fullname
$fixedFolderPaths = ($folderPath | Get-FolderItem).fullname
ForEach ($fixedFolderPath in $fixedFolderPaths)
{
#$fixedFolderPath
$getLastAccessTime = $(Get-Item $fixedFolderPath -force).lastaccesstime
#$getLastAccessTime
$details = #{ "Folder Path" = $fixedFolderPath; "LastAccessTime" = $getLastAccessTime}
$results += New-Object PSObject -Property $details
$results
}
}
}
}
ExportFolders
I updated my code a bit and simplified it. Here is the new code.
#Add the import and snapin in order to perform AD functions
Add-PSSnapin Quest.ActiveRoles.ADManagement -ea SilentlyContinue
Import-Module ActiveDirectory
#Clear Screen
CLS
Function ExportFolders
{
#================ Global Variables ================
#Path to user profiles in Barrington
$Dir = "\\myFileServer\somedir\blah"
#Get all user folders
$ParentDir = Get-ChildItem $Dir | Where-Object {$_.PSIsContainer -eq $True} | where {$_.GetFileSystemInfos().Count -eq 0 -or $_.GetFileSystemInfos().Count -gt 0}
#Export file to our destination
$ExportedFile = "c:\temp\dirFolders.csv"
#Duration in Days+ the file hasn't triggered "LastAccessTime"
$duration = 1
$cutOffDate = (Get-Date).AddDays(-$duration)
#Used to hold our information
$results = @()
$details = $null
#=============== Done with Variables ===============
ForEach ($SubDir in $ParentDir)
{
$FolderName = $SubDir.FullName
$FolderInfo = $(Get-Item $FolderName -force) | Select-Object FullName, LastAccessTime #| ft -HideTableHeaders
$FolderLeafs = gci -Recurse $FolderName -force -directory | Where-Object {$_.PSIsContainer -eq $True} | where {$_.GetFileSystemInfos().Count -eq 0 -or $_.GetFileSystemInfos().Count -gt 0} | Select-Object FullName, LastAccessTime #| ft -HideTableHeaders
$details = #{ "LastAccessTime" = $FolderInfo.LastAccessTime; "Folder Path" = $FolderInfo.FullName}
$results += New-Object PSObject -Property $details
ForEach ($FolderLeaf in $FolderLeafs.fullname)
{
$details = #{ "LastAccessTime" = $(Get-Item $FolderLeaf -force).LastAccessTime; "Folder Path" = $FolderLeaf}
$results += New-Object PSObject -Property $details
}
$results
}
}
ExportFolders
The $FolderInfo variable sometimes prints out multiple times, but from what I can see the $FolderLeaf variable prints out only once. The problem is that if I move or remove the $results variable from under the $details that print out the FolderInfo, then the parent directories don't get printed; only the subdirectories are shown. Also, some directories are empty and don't get printed, and I want all directories printed, including empty ones.
The updated code seems to print all directories fine, but as I mentioned, I am still getting some duplicate $FolderInfo entries.
I think I need a condition to check whether a path has already been processed, but I'm not sure what condition would prevent it from printing multiple times.
In your ExportFolders you Get-ChildItem -Recurse and then loop over all of the subfolders calling Get-FolderItem. Then, in Get-FolderItem, you hand Robocopy the /S flag in $params.AddRange(@("/L", "/S", "/NJH", "/BYTES", "/FP", "/NC", "/NFL", "/TS", "/XJ", "/R:0", "/W:0")). The /S flag means "copy subdirectories, but not empty ones", so you are recursing a second time. Likely you just need to remove the /S flag, so that all of your recursion happens in ExportFolders.
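If that is the cause, the fix is a one-line change to the Begin block; a sketch with /S removed:
#Robocopy now lists only the folder it is given; ExportFolders' own
#Get-ChildItem -Recurse supplies all of the recursion
$params.AddRange(@("/L","/NJH","/BYTES","/FP","/NC","/NFL","/TS","/XJ","/R:0","/W:0"))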
In response to the edit:
Your $results is inside of the loop, so you will have n duplicates for the first $SubDir, then n-1 duplicates for the second, and so forth.
ForEach ($SubDir in $ParentDir) {
#skipped code
ForEach ($FolderLeaf in $FolderLeafs.fullname) {
#skipped code
}
$results
}
should be
ForEach ($SubDir in $ParentDir) {
#skipped code
ForEach ($FolderLeaf in $FolderLeafs.fullname) {
#skipped code
}
}
$results
I have this code:
$Count=0
Function DryRun-UploadFile($DestinationFolder, $File, $FileSource, $Count)
{
if($FileSource -eq $null){
$FileSource = $Folder
}
$path= [String]$FileSource+'\'+$File
$Size = get-item $Path
$Size = $Size.length
if($Size -lt 160000){
Write-Host "Passed"
}else{
$Count=$Count+1
}
}
function DryRun-PopulateFolder($ListRootFolder, $FolderRelativePath, $Count)
{
Write-Host "Uploading file " $file.Name "to" $WorkingFolder.name -ForegroundColor Cyan
if(!($File -like '*.txt')){
#Upload the file
DryRun-UploadFile $WorkingFolder $File $FileSource $Count
}else{
$Count=$Count+1
}
}
}
Function DryRun-Copy-Files{
$AllFolders = Get-ChildItem -Recurse -Path $Folder |? {$_.psIsContainer -eq $True}
#Get a list of all files that exist directly at the root of the folder supplied by the operator
$FilesInRoot = Get-ChildItem -Path $Folder | ? {$_.psIsContainer -eq $False}
#Upload all files in the root of the folder supplied by the operator
Foreach ($File in ($FilesInRoot))
{
#Notify the operator that the file is being uploaded to a specific location
Write-Host "Uploading file " $File.Name "to" $DocLibName -ForegroundColor Cyan
if(!($File -like '*.txt')){
#Upload the file
DryRun-UploadFile($list.RootFolder) $File $null $Count
}else{
$Count=$Count+1
}
}
#Loop through all folders (recursive) that exist within the folder supplied by the operator
foreach($CurrentFolder in $AllFolders)
{
DryRun-PopulateFolder ($list.RootFolder) $FolderRelativePath $Count
}
Write-output "Number of files excluded is: "$Count | Out-file DryRun.txt -append
}
I have removed some of my code for simplicity's sake, as it has nothing to do with my problem. My code goes through a file structure and counts a file when it is above 160000 bytes or is a .txt file; it is run by calling DryRun-Copy-Files.
I have a variable called $Count which I want to use in all the functions, and then output the final count to a file.
The problem is it only counts in the first function, DryRun-Copy-Files, not in the others.
Define the variable with the global scope modifier:
$global:count=0
and use it in the functions (don't pass it explicitly).
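Applied to the code in the question, that means dropping the $Count parameters and touching one shared counter everywhere; a trimmed sketch of the first function (the same one-line change goes in the other two):
# One counter in the global scope; every function reads and writes the same variable
$global:Count = 0
Function DryRun-UploadFile($DestinationFolder, $File, $FileSource)
{
    $path = [String]$FileSource + '\' + $File
    $size = (Get-Item $path).Length
    if($size -lt 160000){
        Write-Host "Passed"
    }else{
        # Replaces "$Count = $Count + 1"; without the scope modifier each
        # function was incrementing its own local copy of $Count
        $global:Count++
    }
}
Write-Output "Number of files excluded is: $global:Count" | Out-File DryRun.txt -Append
A script-scoped variable ($script:Count) behaves the same way while staying out of the global session state, which is often the tidier choice.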