Remotely access \\server\c$\users\user\My Documents - powershell

I'm trying to remotely get the size of a user's 'My Documents' folder using the built-in C$ share.
I can browse the share and I can 'Set-Location' to it, but as soon as I try to 'Get-ChildItem' I get a permission denied error.
I can't figure out whether this is some built-in limitation of PowerShell.
Tried on PS2 and PS3 so far, same result.
(The user has full access on both the share and NTFS.)
I've tried providing credentials using 'Get-Credential', and I have also tried 'New-PSDrive' mappings as well; same issue: the location is fine, but as soon as I GCI it spits out 'PermissionDenied'.
$compList = [LIST OF COMPUTERS]
$exclude = [LIST OF EXCLUDED USERS]
$userSizes = @()
foreach ($computer in $compList){
    # Enumerate profile folders on the admin share, skipping excluded users
    gci "\\$computer\c$\users\" | where {$exclude -notcontains $_.Name} | ForEach-Object {
        $curUser = $_.Name
        New-PSDrive -Name "Map" -PSProvider FileSystem -Root "\\$computer\c$\users\$curUser\My Documents"
        $size = "{0:N2}" -f ((gci "Map:\" -Recurse | Measure-Object -Property Length -Sum).Sum / 1MB)
        $properties = @{'Computer'=$computer;'User'=$curUser;'Size (MB)'=$size}
        $curObject = New-Object -TypeName PSObject -Property $properties
        $userSizes += $curObject
        Remove-PSDrive -Name "Map"
    }
}
$userSizes | Out-GridView
$userSizes = $null
Keep in mind that GCI in PS2 doesn't accept credentials, and the 'FileSystem' provider doesn't support them either!

You might need credentials to use Get-ChildItem on a remote share. I've had full access to my NAS and still gotten the same "Permission Denied" error from PowerShell; it seems weird and I can't say why it failed when I had full permissions, but it worked when I gave PowerShell my credentials.
Try declaring credentials first:
$creds = Get-Credential
then use the credentials like so:
Get-ChildItem "\\server\c$\users\user\My Documents" -Credential $creds

Uggh, what a disgrace.
The reason was that the path is actually
\\[server]\c$\users\[user]\Documents
For some unknown, god-forsaken reason, Windows Explorer displays the folder as 'My Documents', but the actual path is 'Documents'.
I have no idea why they would do this, but there it is. Working fine now, another few hours wasted...
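For anyone hitting the same wall, a minimal sketch of the corrected mapping from the loop in the question ($computer and $curUser come from that loop); only the root path changes:
# The on-disk folder is 'Documents', not 'My Documents'
New-PSDrive -Name "Map" -PSProvider FileSystem -Root "\\$computer\c$\users\$curUser\Documents" | Out-Null
$size = "{0:N2}" -f ((Get-ChildItem "Map:\" -Recurse | Measure-Object -Property Length -Sum).Sum / 1MB)
Remove-PSDrive -Name "Map"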

Powershell script searching files on domain

Very new to PowerShell and AD, so apologies if this post has an obvious answer. I have done some research and I am still not finding the answers I am looking for. My script is below for reference.
I have created a simple PowerShell script that will run on an admin VM I have set up on my domain. I have a separate SQL VM running a backup job that consumes a lot of storage over time. I am trying to run this very simple cleanup script against it. My question is: do I need to modify this script so it can be stored on my admin VM but run against my SQL VM, or can I leave the path as is and just set it up in Task Scheduler? I have tried targeting the FQDN and the IP, but it doesn't seem to work either way.
$backups_file = 'E:\blahBlahBla\SQL\Backups' or
$backups_file = '<IP_ADDRESS>\E:\blahBlahBla\SQL\Backups' or
$backups_file = '<FQDN>E:\blahBlahBla\SQL\Backups'
$backup_file_exist = (Test-Path -Path $backups_file)
if ($backup_file_exist){
# Verifies the folder exists
Write-Output -InputObject "This folder exists"
# returns all the files in the folder.
Get-ChildItem -Path $backups_file
# Deletes all files in the folder that are older that 7 days.
Get-ChildItem -Path $backups_file -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-7))} | Remove-Item
}
else
{
Write-Output -InputObject "Unable to access this directory."
}
Thanks.
Well, all of your $backups_file variants look wrong to me.
If you want to access a directory on a remote system, the path has to be at least a file share or an administrative share, like \\computer\e$\folder\folder\
But why use file shares or anything like that when you can simply connect to a PowerShell session on the remote host? Here is an example:
$mySQLServer = "Server1.domain.name", "server2.domain.name"
$backupFolder = "E:\blahBlahBla\SQL\Backups"
foreach ($server in $mySQLServer)
{
    $session = New-PSSession -ComputerName $server # maybe -Credential if needed
    Invoke-Command -Session $session -ArgumentList $backupFolder -ScriptBlock {
        param(
            $directory
        )
        if (Test-Path -Path $directory)
        {
            # Verifies the folder exists
            Write-Output -InputObject "This folder exists"
            # Returns all the files in the folder.
            Get-ChildItem -Path $directory
            # Deletes all files in the folder that are older than 7 days.
            Get-ChildItem -Path $directory -Recurse | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } | Remove-Item
        }
    }
    Remove-PSSession -Session $session
}
Good Luck!
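If you do want to stick with the share-path approach mentioned at the top of this answer instead of remoting, a rough sketch of what it might look like (the server name and the e$ administrative share are assumptions based on the question):
# Hypothetical UNC form of the backup path via the administrative share (assumes E: on the SQL VM)
$backups_file = '\\sqlvm.domain.name\e$\blahBlahBla\SQL\Backups'
if (Test-Path -Path $backups_file) {
    Get-ChildItem -Path $backups_file -Recurse |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
        Remove-Item
}
else {
    Write-Output -InputObject "Unable to access this directory."
}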

How to remove OneDrive folder using PowerShell

I'm trying to remove a user's OneDrive folder using PowerShell, but I'm not seeing any success even though I've been searching around the internet, so I would really appreciate any help or suggestions.
Just for testing purposes, I'm trying to delete a folder in my own OneDrive called "Testing", and I want to delete everything in there, including subfolders and files.
Connect-SPOService -Url https://company-admin.sharepoint.com
$OneDriveURLs = Get-SPOSite -IncludePersonalSite $true -Limit All -Filter "Url -like '-my.sharepoint.com/personal/'"
foreach($OneDriveURL in $OneDriveURLs)
{
Connect-SPOService -Url https://company-admin.sharepoint.com
Connect-PnPOnline -Url $OneDriveURLs
Remove-PnPFolder -Name "Google Drive" -Folder "Testing"
}
Your cmdlet format is not correct; you should follow the documented structure (there is more about it in the Microsoft Docs).
You would need to change <username> to the name of the personal drive.
Change your format to:
$drive = 'https://company-admin.sharepoint.com/personal/<username>'
$folder = 'Testing'
Remove-PnPFolder -Name $drive -Folder $folder
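Putting that together for the question's "Testing" folder, a hedged sketch (the personal-site URL follows the -my.sharepoint.com pattern from the question's filter, and the 'Documents' library name and parameter usage are assumptions; check the PnP docs for your tenant):
# Connect directly to the user's personal site (hypothetical URL), authenticating however you normally do
Connect-PnPOnline -Url 'https://company-my.sharepoint.com/personal/<username>'
# -Folder is the parent folder inside the OneDrive document library, -Name is the folder to delete
Remove-PnPFolder -Name 'Testing' -Folder 'Documents'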
If that does not work, as an alternative you can try the OneDrive PowerShell module and use:
Remove-ODItem -AccessToken $Auth.access_token -ResourceId "https://sepagogmbh-my.sharepoint.com/" -path "/Upload"
You can read more about the module on their GitHub page.

Get-ChildItem on Multiple Computers, Performance Issues

I'm wanting to improve on my script to be able to accomplish the following:
Scan servers based on get-adcomputer on specific OUs.
Scan each server based on whatever drive letter it has.
Scan each server for log4j.
Export all results to a CSV that identifies the folder path, name of file, and the server that the file was found on.
I have been using the following code to start with:
$Servers = Get-ADComputer -Filter * -SearchBase "OU=..." | Select -ExpandProperty Name
foreach ($server in $Servers){
    Invoke-Command -ComputerName $Server -ScriptBlock {
        $Drives = (Get-PSDrive -PSProvider FileSystem).Root
        foreach ($drive in $Drives){
            Get-ChildItem -Path $drive -Force -Filter *log4j* -ErrorAction SilentlyContinue | `
            foreach{
                $Item = $_
                $Type = $_.Extension
                $Path = $_.FullName
                $Folder = $_.PSIsContainer
                $Age = $_.CreationTime
                $Path | Select-Object `
                    @{n="Name";e={$Item}}, `
                    @{n="Created";e={$Age}},`
                    @{n="FilePath";e={$Path}},`
                    @{n="Extension";e={if($Folder){"Folder"}else{$Type}}}
            } | Export-Csv C:\Results.csv -NoType
        }
    }
}
I am having the following issues and would like to address them to learn.
How would I be able to get the CSV to appear the way I want, but have it collect the information and store it on my machine instead of having it on each local server?
I have noticed extreme performance issues on the remote hosts when running this. WinRM takes 100% of the processor while it is running. I have tried -Include first, then -Filter, but to no avail. How can this be improved so that at worst, it's solely my workstation that's eating the performance hit?
What exactly do the ` marks do?
I agree with @SantiagoSquarzon - that's going to be a performance hit.
Consider writing a function that runs Get-ChildItem recursively with a -MaxDepth parameter, including a Start-Sleep command to pause occasionally. Also, you may want to note this link.
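A rough sketch of that idea (the function and parameter names are mine, not from the original post); it walks one directory level at a time and sleeps briefly between levels so the remote host is never pegged:
function Find-FileThrottled {
    param(
        [string]$Path,
        [string]$Filter = '*log4j*',
        [int]$MaxDepth = 5,
        [int]$Depth = 0
    )
    # Report matches at the current level
    Get-ChildItem -Path $Path -Filter $Filter -Force -ErrorAction SilentlyContinue
    if ($Depth -ge $MaxDepth) { return }
    # Pause briefly before descending, so the scan never saturates the CPU
    Start-Sleep -Milliseconds 200
    Get-ChildItem -Path $Path -Directory -Force -ErrorAction SilentlyContinue | ForEach-Object {
        Find-FileThrottled -Path $_.FullName -Filter $Filter -MaxDepth $MaxDepth -Depth ($Depth + 1)
    }
}
Usage would be something like: Find-FileThrottled -Path 'C:\' -MaxDepth 6 | Select-Object FullName, CreationTime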
You'd also want to Export-CSV to a shared network drive to collect all the machines' results.
The backticks indicate a continuation of the line, like \ in bash.
Finally, consider using a Scheduled Task, or starting the scan in a PowerShell sub-process with a lowered process priority; maybe that will help.
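For the lowered-priority idea, a minimal sketch (it assumes the scan itself lives in a separate script file, here hypothetically C:\Scripts\Find-Log4j.ps1):
# Start the scan in a child powershell.exe and drop its CPU priority
$proc = Start-Process powershell.exe -ArgumentList '-File C:\Scripts\Find-Log4j.ps1' -PassThru
$proc.PriorityClass = [System.Diagnostics.ProcessPriorityClass]::BelowNormal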

Why am i receiving RPC server is unavailable error when looping?

I have a PowerShell script to find specific servers and their corresponding service accounts. If I modify the script to use a single server and a single service account, the results are what I expect. If I loop through the servers and accounts, I receive the 'RPC server is unavailable' error. The script:
#################################################################
# Find Service Account(s) used to start Services on a Server(s) #
#################################################################
$accounts = (Get-Content C:\Users\location\Scripts\Service_Accounts.txt)
Remove-Item -path C:\Users\location\Scripts\ServiceAccountFnd.txt -force -erroraction silentlycontinue
Import-Module ActiveDirectory # Imports the Active Directory PowerShell module #
## Retrieves servers in the domain based on the search criteria ##
$servers=Get-ADComputer -Filter {Name -Like "namehere*"} -property *
## For Each Server, find the services running under the user specified in $account ##
ForEach ($server in $servers) {
    Write-Host $server
    ForEach ($account in $accounts) {
        Write-Host $account
        Get-WmiObject Win32_Service -ComputerName $server | Where-Object {$_.StartName -like "*$account*"} | Format-Table -HideTableHeaders -Property @{n='ServerName';e={$_.__SERVER}}, StartName, Name -AutoSize | Out-File -FilePath C:\Users\location\Scripts\ServiceAccountFnd.txt -Append -Width 150
    }
}
Your $server variable does not contain only the hostname; it is the whole AD computer object with all of its attributes.
Try changing the ComputerName value to $server.Name.
If that doesn't help: can you confirm that you used the very same computer in the loop as without the loop, as you described? I'd assume you are trying to access another computer that is not configured as expected.
Besides that, I'd recommend using Get-CimInstance rather than Get-WmiObject, as it uses WinRM by default instead of RPC. WinRM is more firewall-friendly, more secure, and faster.
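A minimal sketch of the inner call with both suggestions applied (same output file as the original; swapping __SERVER for the PSComputerName property that CIM results carry is my assumption):
# $server.Name passes just the hostname; Get-CimInstance talks WinRM instead of RPC
Get-CimInstance Win32_Service -ComputerName $server.Name |
    Where-Object { $_.StartName -like "*$account*" } |
    Format-Table -HideTableHeaders -Property @{n='ServerName';e={$_.PSComputerName}}, StartName, Name -AutoSize |
    Out-File -FilePath C:\Users\location\Scripts\ServiceAccountFnd.txt -Append -Width 150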

How to export shared folder with permissions and groups associated

I'm working on a Windows Server 2008 R2 machine and I'm trying to export the configuration of the shared folders with all the groups associated with them, the share permissions, and the file system permissions.
Is there a way to do that?
Maybe with PowerShell?
Edit: another problem is that I need to do this after a reboot, so I have to save the configuration to a file, for example, and then reimport it.
If you want to backup/restore all existing shares you could export/import the registry key HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Shares.
Backup:
reg export HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Shares shares.reg
Restore:
reg import shares.reg
net stop server && net start server
File/folder ACLs can be saved and restored like this:
Backup:
Get-WmiObject -Class Win32_Share -Filter 'Type = 0' | select -Expand Path | % {
    $path = $_
    Get-Acl $path | select @{n='Path';e={$path}}, Sddl
} | Export-Csv 'C:\path\to\acls.csv'
Restore:
Import-Csv 'C:\path\to\acls.csv' | % {
    $acl = Get-Acl $_.Path
    $acl.SetSecurityDescriptorSddlForm($_.Sddl)
    Set-Acl -Path $_.Path -AclObject $acl
}
Interesting question. I think the only way to do this is to manually get the ACLs on the original folder and then re-apply them to the copied folder. The cmdlets to use are Get-Acl -Path $yourFolder, Copy-Item, and Set-Acl.
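Roughly, that idea looks like this (a sketch; the source and destination paths are placeholders):
# Copy the folder, then re-apply the original ACL to the copy
$source = 'D:\Shares\Finance'        # hypothetical original folder
$destination = 'E:\Restore\Finance'  # hypothetical copy
Copy-Item -Path $source -Destination $destination -Recurse
$acl = Get-Acl -Path $source
Set-Acl -Path $destination -AclObject $acl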
I'm working on a module (see here) that should be able to do this for you. It's a script module, so you can actually open it up and look at/modify the code. If you use it, you could do something like this (the Export-Csv call is commented out, but you can put it in after confirming this is the output you're looking for):
Get-WmiObject Win32_Share -ComputerName ServerName |
Get-AccessControlEntry #| Export-Csv -Path CsvLocation.csv
You'll get errors for built-in system shares, e.g., C$, so you may want to add an -ErrorAction SilentlyContinue and/or an -ErrorVariable to the Get-AccessControlEntry call.
To bring the permissions back in, you'd just feed the Get-AccessControlEntry output into Add-AccessControlEntry:
Import-Csv -Path CsvLocation.csv | Add-AccessControlEntry -WhatIf
Add-AccessControlEntry prompts for confirmation by default. Use the -Force switch to suppress the prompts.
Changing this to work for the NTFS permissions is very easy, too. Just change the Get-WmiObject call into a Get-ChildItem call, and everything else should be the same.
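For example, the NTFS variant might look like this (a sketch; the D:\Shares path is a placeholder, and it assumes the module's Get-AccessControlEntry accepts file-system items on the pipeline, as described above):
# Same idea as above, but pulling ACEs for folders instead of shares
Get-ChildItem -Path 'D:\Shares' -Recurse -Directory |
    Get-AccessControlEntry -ErrorAction SilentlyContinue #| Export-Csv -Path CsvLocation.csv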