Looking to delete a folder from Explorer via the registry - PowerShell

I am looking to delete the highlighted value from the registry shown in the picture, where 'standard user' is the user ID the system is logged in with. I need a PowerShell script that I can deploy to every machine in my organization from the backend, so that this highlighted value gets deleted from every user's profile.

Assuming you are planning on doing this via GPO, I would advise two steps:
1- Create the script file and add it to the Files preference in your GPMC.
2- Create a one-time Scheduled Task and run the remote script.
This code should do what you want as long as you adapt the path to your needs. It gets a list of the values inside the key you point it at, matches them using Where-Object, and removes the matches with Remove-ItemProperty (registry values are properties of the key, so Remove-Item won't work on them).
$key = 'HKCU:\SOFTWARE\Microsoft\OneDrive\Accounts\Business1\Tenants\Intune Test'
(Get-Item -Path $key).Property | Where-Object { $_ -match 'Test Sync - Documents' } | ForEach-Object { Remove-ItemProperty -Path $key -Name $_ }
Deployment is up to you; please assume this code can be improved and/or adapted. This is just the core block you need to achieve what you asked.
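If you go the Scheduled Task route for step 2, here is a minimal sketch of registering a one-time task with PowerShell that runs the script at the next logon. The script path and task name are assumptions; adapt them to wherever your Files preference drops the file. Because the value lives under HKCU, the task must run in the user's context, not as SYSTEM:
# Assumed path where the GPO Files preference copied the script
$scriptPath = 'C:\Scripts\Remove-OneDriveValue.ps1'
# Launch PowerShell against the script file
$action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument "-NoProfile -ExecutionPolicy Bypass -File `"$scriptPath`""
# Fire at the next interactive logon, so HKCU resolves to the user's hive
$trigger = New-ScheduledTaskTrigger -AtLogOn
Register-ScheduledTask -TaskName 'Remove-OneDriveValue' -Action $action -Trigger $trigger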

Related

Running PowerShell without user interaction

start "odopen://sync/?siteId=$siteid17&webId=$webid17&listId=$listid17&userEmail=$upn&webUrl=$URL17&webtitle=$webtitle17&listtitle=$listtitle17"
How is it possible to run the following command inside PowerShell without a popup window appearing or any user interaction? I've tried adding -ArgumentList "/S", "/Background", and also -WindowStyle Hidden at the end. Appreciate some help :)
Your command as-is basically says "start the program that opens odopen:// (OneDrive) links" and can't really be given any silent-style instructions. The proper way to configure this kind of thing is through the OneDrive Group Policies, but we can cheat and set the registry keys ourselves.
The link above goes into detail about how to configure group policy, but also tells us that the specific group policy setting to "Configure team site libraries to sync automatically" sets this registry key:
[HKCU\Software\Policies\Microsoft\OneDrive\TenantAutoMount]"LibraryName"="LibraryID"
And that your LibraryID is in this format, which looks familiar:
tenantId=xxx&siteId=xxx&webId=xxx&listId=xxx&webUrl=httpsxxx&version=1
So to put it in a script, I would use something like this, adapted from Nicola Suter's blog post here:
$tenantAutoMountRegKey = "HKLM:\SOFTWARE\Policies\Microsoft\OneDrive\TenantAutoMount"
$autoMountTeamSitesList= #{
#Enter your SharePoint libraries to configure here as key/value pairs
MySharePoint="odopen://sync/?siteId=$siteid17&webId=$webid17&listId=$listid17&userEmail=$upn&webUrl=$URL17&webtitle=$webtitle17&listtitle=$listtitle17"
}
# Check if the key exists and create if missing:
if (-not (Test-Path $tenantAutoMountRegKey)){ New-Item -Path $tenantAutoMountRegKey -Force }
# Add the sites for automatic mounting
$autoMountTeamSitesList | Set-ItemProperty -Path $tenantAutoMountRegKey -Name $_.Key -Value $_.Value
This generally takes effect the next time a user signs in to OneDrive, though Microsoft warns it may take up to 8 hours to start syncing (this keeps hundreds of users from syncing the same library at the same time).
TL;DR: You cannot.
Using odopen will always show a sign-in window (as stated here: https://learn.microsoft.com/en-us/onedrive/deploy-on-windows#help-users-sign-in); all you can do is populate it with data, which is what you are already doing.
If you want to do it silently, there is documentation about it: https://learn.microsoft.com/en-us/onedrive/use-silent-account-configuration
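For reference, the silent option in that documentation boils down to a single machine-policy registry value (the SilentAccountConfig name comes straight from that page; the sketch assumes an Azure AD joined device, which the feature requires):
$oneDrivePolicyKey = 'HKLM:\SOFTWARE\Policies\Microsoft\OneDrive'
if (-not (Test-Path $oneDrivePolicyKey)) { New-Item -Path $oneDrivePolicyKey -Force | Out-Null }
# Tells the OneDrive client to sign the user in silently with their Windows (Azure AD) credentials
Set-ItemProperty -Path $oneDrivePolicyKey -Name 'SilentAccountConfig' -Value 1 -Type DWord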

PowerShell script tool to rename profile folders in network shares

I am new to PowerShell scripting and need some guidance on creating a script.
We have a terminal server environment where we regularly have to reset user profiles in order to resolve application-related issues for them. Currently we rename the user profile folder manually to .old, which forces the profile service to create a new profile the next time the user logs in. We are thinking of creating a PowerShell-based tool to do this operation for some priority users who are not from a technical domain.
Currently we have one user who has a different profile in a different file share depending on which region (Asia, America) the user is connecting to. I am looking to create a script which first loads the folders from all regions' file shares (for performance reasons), then shows a list box to select the region, Asia or America, and a username search box. The script should search for the user's profile folder in the selected region only.
I have found only a small bit; I need help constructing the rest.
# Date to append to the folder name
$Date = Get-Date -Format yyyyMMddhhmm
# List of file shares for FSLogix containers
$FileShares = @("\\abc\fileshare")
# Retrieve the list of subfolders in each file share
$Containers = foreach ($FileShare in $FileShares) { (Get-ChildItem -Path $FileShare -Directory -Force).FullName }
# Display containers to select for rename/removal
$UserContainer = $Containers | Out-GridView -PassThru
# Rename the container to _OLD with the date appended
Rename-Item -Path $UserContainer -NewName ("$UserContainer" + "_OLD_" + $Date)
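Building on that snippet, a minimal sketch of the region picker and username search might look like the following. The share paths in $RegionShares are hypothetical, and Out-GridView stands in for the list box; a proper WinForms UI could replace it later:
# Hypothetical region-to-share map; replace with your real profile share paths
$RegionShares = @{
    Asia    = '\\asiafs\profiles'
    America = '\\amerfs\profiles'
}
# Pre-load the folder list for every region once, for performance
$FoldersByRegion = @{}
foreach ($Region in $RegionShares.Keys) {
    $FoldersByRegion[$Region] = (Get-ChildItem -Path $RegionShares[$Region] -Directory -Force).FullName
}
# Pick the region, then search only that region's folders for the username
$Region = $RegionShares.Keys | Out-GridView -Title 'Select region' -OutputMode Single
$UserName = Read-Host 'Enter username to search for'
$Found = $FoldersByRegion[$Region] | Where-Object { $_ -match [regex]::Escape($UserName) }
$Target = $Found | Out-GridView -Title 'Select profile folder to rename' -PassThru
if ($Target) { Rename-Item -Path $Target -NewName ("$Target" + "_OLD_" + (Get-Date -Format yyyyMMddhhmm)) }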

Can you use a PowerShell script to create a PowerShell script?

So this may be an odd request, and maybe I'm going about this all wrong, but I also have a unique situation. I have servers that are sometimes cloned, and I need to run a script that I created on the cloned servers. Due to the nature of the clones, they cannot be connected to a network.
Currently I am manually putting the generic script on each server before cloning and then running the script on the cloned server.
What I would like to do is have a script that runs and gathers all the information, say installed programs as an example, and generate a custom version of my current script on the servers before they are cloned.
I have both the PowerShell script that gets the server information and the generic one that makes the changes to the clone, but I have not found a way to merge the two, or any documentation, so I don't know if I am hitting a limitation here.
Edit for more explanation and examples. I'm doing this from my phone at the moment, so I don't have an example I can post.
Currently I have a script that has a set number of applications to uninstall, registry keys to remove, services to stop, etc. In another application I have a list of all the software that we have for each server, and I can pull that data for each server. What I need to do is pull the data for each server and place a script on each server that will uninstall just the programs for that server.
Currently the script has to run through every potential piece of software, try to uninstall it, and then check the other application to see if there are any additional programs that need to be uninstalled.
Hope this extra info helps.
Thanks.
Stop thinking of it as code.
Use script 1 to export blocks of text into a new file. For example, you might have a configuration that says all Dell servers must have this line of code run:
Set-DELL -attribute1 unmanaged
where on HP, the script would have been
Set-HP -attribute1 unmanaged
on web servers, you want:
set-web -active yes
whereas if it's not a web server, you want nothing. So your parent script code would look like:
$Dell = "Set-DELL -attribute1 unmanaged"
$HP = "Set-HP -attribute1 unmanaged"
$web = "set-web -active yes"
if (Get-servermake -eq "Dell")
{
$dell | out-file Child.ps1 -append
}
if (Get-servermake -eq "HP")
{
$HP | out-file Child.ps1 -append
}
if (Get-webserver -eq $true)
{
$web | out-file Child.ps1 -append
}
The result is a customized script for the specific server: Child.ps1.
Now you can take this and run with it. You could add functionality to the child script, like "Is it an AD controller?", etc.
However, you might be better off having all of this in a single script and just blocking off the sections that don't apply in an if statement, for example.
I'm still not totally sure I understand what you're asking. If I've missed the mark, tell me how, and I'll tell you how to tweak this. (And hopefully it's obvious that the Get-whatever cmdlets are sample code. I don't expect that to be what you're using to determine a computer's make/model/etc.)
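Applied to the scenario in the edit, a minimal sketch of the parent script might pull the per-server software list and emit one uninstall line per entry into the child script. Get-ServerSoftwareList and Uninstall-Application are placeholders for whatever your inventory application and installers actually provide:
# Hypothetical: returns the software names your inventory app records for this server
$software = Get-ServerSoftwareList -ComputerName $env:COMPUTERNAME
# Start the child script with a header, then append one uninstall line per product
"# Generated $(Get-Date) for $env:COMPUTERNAME" | Out-File Child.ps1
foreach ($app in $software) {
    # Uninstall-Application is a placeholder; substitute your real uninstall command (msiexec, etc.)
    "Uninstall-Application -Name '$app'" | Out-File Child.ps1 -Append
}
This way the clone only ever tries to remove what was actually recorded for its source server, instead of walking the whole catalogue.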

How to get an environment variable in a PowerShell script when it is deployed by SCCM?

I've made a script to automatically change and/or create the default Outlook signature of all the employees in my company.
Technically, it gets the username environment variable where the script is deployed, accesses the staff database to get some information about this user, then creates the 3 different files for the signature by replacing values inside linked docx templates. Quite easy and logical.
After various tests, it works correctly when you launch the script directly on a computer, whether via PowerShell ISE, directly from CMD, or in Visual Studio. But when we tried to deploy it, as it eventually will be, using SCCM, it can't get any environment variables.
Do any of you have an idea about how to get environment variables in a script when it is deployed by SCCM?
Here is what I've already tried:
$Name = [Environment]::UserName
$EnvVarUserName = Get-Item Env:\USERNAME
Even stuff like this:
# Fallback: find who owns explorer.exe, i.e. the logged-on user
$proc = Get-WmiObject Win32_Process -Filter "Name = 'explorer.exe'"
$report = @()
ForEach ($p in $proc)
{
    $temp = "" | Select-Object User
    $temp.User = ($p.GetOwner()).User
    $report += $temp
}
Thanks in advance and have a nice day y'all !
[EDIT]:
I've found a way of doing this. It's not the best one, but it works: I get the name of the machine, check the DB where, when a laptop connects to our network, we store the user ID and the machine name, and then get the info from the staff DB.
I will still look into Matt's idea, which is pretty interesting and, in a way, more accurate.
Thank you all !
How are you calling the environment variable? $Env:COMPUTERNAME has worked for me in scripts pushed out via SCCM before.
Why don't you enumerate the "%SystemDrive%\Users" folder, exclude certain built-in accounts, and handle them all in one batch?
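A minimal sketch of that approach, assuming the exclusion list below covers your built-in accounts (extend it for your environment):
# Enumerate profile folders, skipping built-in accounts
$excluded = 'Public', 'Default', 'Default User', 'All Users', 'Administrator'
$userProfiles = Get-ChildItem -Path "$env:SystemDrive\Users" -Directory |
    Where-Object { $excluded -notcontains $_.Name }
foreach ($profile in $userProfiles) {
    # $profile.Name is the username; build that user's signature files here
    Write-Output "Would process signature for $($profile.Name)"
}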
To use the UserName environment variable the script would have to run as the logged-in user, which also implies that all of your users have at least read access to your staff database, which, at least in our environment, would be a big no-no.

"Cannot find path" error in PowerShell script on SQL Server Agent

I've been searching all over the internet and Stack Overflow to try to resolve this issue.
I am trying to automate a DB restore using SQL Server Agent. The SQL Server Agent job comprises four steps, three of which are T-SQL and one of which is a PowerShell script.
I have created a proxy with admin credentials so that the script can run as admin.
cd c:\;
$backuppath = "Microsoft.PowerShell.Core\FileSystem::\\sharedcomputer\backup";
$destpath = "c:\tmp\";
Get-ChildItem -Path $backuppath | Where-Object { -not $_.PSIsContainer } |
    Sort-Object -Property CreationTime |
    Select-Object -Last 1 | Copy-Item -Destination (Join-Path $destpath "byte.BAK");
It copies the .bak file from the source shared folder and places it into the tmp folder on the target.
Whenever I run this through regular PowerShell, it works fine.
Whenever I try to run it from SQL Server Agent, I get an error stating that it cannot find the path.
I even tried to use net use to pass credentials for the shared folder. I am thinking it has to do with the fact that the folder requires credentials.
I have also turned off password-protected sharing on the source server, but for some reason, when I use Windows Explorer to locate the shared folder, it still asks for credentials initially. Once they're saved and cached, I can then use PowerShell to cd into that folder. But none of this works when it's executed from SQL Server Agent.
I was able to finally figure this out with a little help from a Windows Server guy.
Going back to answering the question: when I created the proxy agent, I used the credentials that were associated with the current domain account, i.e. Domain\Administrator.
In order for the proxy to connect to the remote server, it needs to have credentials on that domain.
So what I did was create another account on both my target and source servers, using the same name and password, and gave it permissions to the folders I needed.
That account was used in the proxy, and the credential was set up as .\AccountName; because the .\ prefix resolves to the local machine on whichever server it runs, the proxy was able to jump back and forth between the two servers and successfully transfer the files.
Hope this helps
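An alternative to mirrored local accounts is to authenticate to the share inside the job step itself. A minimal sketch, assuming a dedicated account with read access to the share (the account name here is hypothetical, and the password should come from somewhere safer than the script body):
# Hypothetical service account with read access to \\sharedcomputer\backup
$user = 'DOMAIN\BackupReader'
$cred = New-Object System.Management.Automation.PSCredential($user, (ConvertTo-SecureString 'P@ssw0rd!' -AsPlainText -Force))
# Map the share for this session only, then copy the newest .bak as before
New-PSDrive -Name Backup -PSProvider FileSystem -Root '\\sharedcomputer\backup' -Credential $cred | Out-Null
Get-ChildItem -Path Backup:\ -File | Sort-Object CreationTime | Select-Object -Last 1 |
    Copy-Item -Destination (Join-Path 'c:\tmp\' 'byte.BAK')
Remove-PSDrive -Name Backup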