Edit/Add GPO on DC via PowerShell - powershell

I've been given the task of migrating all the printers installed on workstations via GPO to another server.
Currently all printers are installed from local, decentralized distribution points; we want to move to a centralized Distribution Point/Print Server.
On my DC, via the Group Policy Management Editor, I have a lot of printers in
Computer Configuration\Preferences\Control Panel Settings\Printers
All printers are mapped from \\DP00x\Printer and given a local name.
What I want to change in the GPO is the \\DP00x part, to \\CentralDP01\Printer.
Via PowerShell I've managed to create all the printer ports, install all the printers, and publish/list them in the directory.
Given that there are more than 100 of them, I want to automate the GPO editing so that I don't need to open each policy and each printer to modify the destination.
I've tried the cmdlet Get-GPRegistryValue, because I know (at least) that printers are installed under HKLM\SYSTEM\CurrentControlSet\Control\Print\Printers,
but I get this error every time:
Get-GPRegistryValue : The following Group Policy registry setting was not found: "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Print\Printers".
Parameter name: keyPath
At line:1 char:1
+ Get-GPRegistryValue -Guid 6b464ed9-66c8-47fa-8327-1fe9b074a0d7 -Key H...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (Microsoft.Group...tryValueCommand:GetGPRegistryValueCommand) [Get-GPRegistryValue], ArgumentException
+ FullyQualifiedErrorId : UnableToRetrievePolicyRegistryItem,Microsoft.GroupPolicy.Commands.GetGPRegistryValueCommand
I also tried Get-GPPrefRegistryValue:
Get-GPPrefRegistryValue -Context Computer -Guid 6b464ed9-66c8-47fa-8327-1fe9b074a0d7 -Key HKLM\SYSTEM\CurrentControlSet\Control\Print\Printers
But the error looks much the same:
Get-GPPrefRegistryValue : The Preference registry setting "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Print\Printers" was not found in the
"x-x-x-x-x-x" GPO in the x-x-x-x-x-x-x.com domain.
Parameter name: keyPath
At line:1 char:1
+ Get-GPPrefRegistryValue -Context Computer -Guid 6b464ed9-66c8-47fa-83 ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (Microsoft.Group...tryValueCommand:GetGPPrefRegistryValueCommand) [Get-GPPrefRegistryValue], ArgumentException
+ FullyQualifiedErrorId : UnableToRetrievePreferenceRegistryItem,Microsoft.GroupPolicy.Commands.GetGPPrefRegistryValueCommand
I found a workaround: back up the GPO, manually edit the XML with the new value, and import the GPO back.
I don't fancy the idea of manual editing because it can lead to mistakes, and with 100+ GPOs I could end up with a lot of them.
Can anyone help me?
Maybe I'm using the wrong cmdlets, but so far the documentation points to the GroupPolicy module.

Unfortunately the GroupPolicy cmdlets only cover registry-based settings, and printer preferences fall outside that. You can safely edit the live GPO XML files themselves, though (or use Backup-GPO/Restore-GPO).
If you're only replacing the server name, this should work fine. Try it on a test GPO, updating the path as needed:
$guid = (Get-GPO -Name 'Test GPO').Id
# Check the GPO version before changes:
Get-GPO -Guid $guid
$domain = 'domain.com'
# Computer Configuration preferences live under the Machine folder:
$path = "\\$domain\SYSVOL\$domain\Policies\{$guid}\Machine\Preferences\Printers\Printers.xml"
# Update the server name in the GPO xml:
(Get-Content $path -Raw) -replace 'DP00x','CentralDP01' | Set-Content $path
# Validate the GPO version/change date have updated - might take a while if the xml is on a different DC:
Get-GPO -Guid $guid
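Once the test GPO looks right, you could run the same replacement across all of them. A rough sketch, assuming the printer preferences all live under Computer Configuration (the Machine folder) and the old server names match DP00 plus a digit; verify both assumptions against your environment first:
# Loop over every GPO in the domain and rewrite the printer preference XML
# where it exists. 'DP00\d' is a regex guess at the old server names.
$domain = 'domain.com'
Get-GPO -All -Domain $domain | ForEach-Object {
    $xml = "\\$domain\SYSVOL\$domain\Policies\{$($_.Id)}\Machine\Preferences\Printers\Printers.xml"
    if (Test-Path $xml) {
        (Get-Content $xml -Raw) -replace 'DP00\d','CentralDP01' | Set-Content $xml
    }
}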

Related

PowerShell Add PS-Drive to robocopy to a device outside of the domain

I'm currently creating a PowerShell script that maps a network drive from a NAS which is outside of the domain, but I get the following error message:
New-PSDrive : The network path was not found
...
+ New-PSDrive -Name X -PSProvider "FileSystem" -Credential $credential ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (X:PSDriveInfo) [New-PSDrive], Win32Exception
+ FullyQualifiedErrorId : CouldNotMapNetworkDrive,Microsoft.PowerShell.Commands.NewPSDriveCommand
This is the code:
$nasipaddress = "192.168.0.110"
$nasusername = ".\nas-local"
$naspassword = cat "F:\Skript\password.txt" | ConvertTo-SecureString
$credential = New-Object System.Management.Automation.PSCredential ($nasusername, $naspassword)
robocopy F:\Admin$ \\$nasipaddress\robocopy /MIR /PURGE /e /log+:$filepath
Robocopy itself works just fine, but it needs authentication credentials, which is why it doesn't run after the PC has been inactive for a while; that's why I'm mapping the drive first. It has to be mapped temporarily, not persistently.
I think it might be because it's not a domain device, but I might be wrong here. I've googled a lot, really a lot, but couldn't find anything that worked, so this is my last resort. I'm not that experienced with PowerShell, but I'm on my way to mastering it.
Best regards, and thank you very much
stillrigeway
You should map to a share; are you exposing a share from the NAS?
Can you provide the full New-PSDrive command with your arguments?
Also, if this is in a script, set the -Scope parameter value to "Global" to ensure the drive persists outside the current scope.
source: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.management/new-psdrive?view=powershell-7.2
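For reference, a minimal sketch of the map-then-copy sequence, reusing the "robocopy" share name from your snippet (the log path is hypothetical; adjust both to your setup):
$nasipaddress = "192.168.0.110"
$naspassword = Get-Content 'F:\Skript\password.txt' | ConvertTo-SecureString
$credential = New-Object System.Management.Automation.PSCredential ('.\nas-local', $naspassword)
$filepath = 'F:\Skript\robocopy.log'   # hypothetical log path
# Mapping the drive authenticates the SMB session with the NAS credentials;
# -Scope Global keeps the mapping visible outside the current scope, and
# omitting -Persist keeps it temporary (it disappears with the session).
New-PSDrive -Name X -PSProvider FileSystem -Root "\\$nasipaddress\robocopy" -Credential $credential -Scope Global
# robocopy is an external program and cannot see PowerShell-only drives, but
# once the session is authenticated, the UNC path itself should work:
robocopy 'F:\Admin$' "\\$nasipaddress\robocopy" /MIR /E /LOG+:$filepath
Remove-PSDrive -Name X   # drop the temporary mapping when done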

TFS 2017 build executing PowerShell failing due to "term not recognized" error on line 1

I'm pretty new to TFS and still learning how to use it (so hopefully this isn't just a stupid oversight on my end). I'm working an internship this summer as a seasonal developer, and my end goal is to automate load testing for the company's website. I'm using TFS to achieve this. The build I currently have contains only two tasks: one to start the controller and the testing environment, and one to stop them. My problem is that the build keeps failing before it really even starts, due to a "term not recognized" error on line 1, apparently caused by the default working folder not being recognized.
Here are the relevant log files:
2019-05-30T20:00:02.0942883Z Executing the following powershell script. (workingFolder = D:\RM_agent\_work\11\s)
2019-05-30T20:00:02.0942883Z D:\RM_agent\_work\11\s
2019-05-30T20:00:02.4999117Z ##[error]. : The term 'D:\RM_agent\_work\11\s' is not recognized as the name of a
2019-05-30T20:00:02.4999117Z ##[error]cmdlet, function, script file, or operable program. Check the spelling of the
2019-05-30T20:00:02.4999117Z ##[error]name, or if a path was included, verify that the path is correct and try again.
2019-05-30T20:00:02.4999117Z ##[error]At line:1 char:3
2019-05-30T20:00:02.4999117Z ##[error]+ . 'D:\RM_agent\_work\11\s'
2019-05-30T20:00:02.4999117Z ##[error]+ ~~~~~~~~~~~~~~~~~~~~~~~~
2019-05-30T20:00:02.4999117Z ##[error] + CategoryInfo : ObjectNotFound: (D:\RM_agent\_work\11\s:String)
2019-05-30T20:00:02.4999117Z ##[error] [], CommandNotFoundException
2019-05-30T20:00:02.4999117Z ##[error] + FullyQualifiedErrorId : CommandNotFoundException
I know that the working folder defaults to $(Build.SourcesDirectory), so I'm assuming that D:\RM_agent\_work\11\s is what $(Build.SourcesDirectory) evaluates to. RM_agent is obviously an agent, so \_work\11\s should be the local path where it stores the source code. Why is it unrecognized, then?
I tried manually setting the working folder for the scripts through TFS to the folder where the build is stored, but the build still failed and the logs still showed workingFolder = D:\RM_agent\_work\11\s.
Additionally, the line the build is failing on, Executing the following powershell script. (workingFolder = D:\RM_agent\_work\11\s), appears nowhere in the script I'm trying to execute, which confuses me. Where is this script coming from?
(I can remove this if it doesn't fit the guidelines or is off topic, but if anyone could point me towards any resources about TFS and/or load testing, that would be massively helpful as well.)
EDIT: Here is the PowerShell script for the first task:
########################################
# start environment
########################################
# import modules
Import-Module '\\neenah-san1\TSbuild\Deployment\Tools\PowerShell\Azure\JJK.TS.Azure.psm1' -Force -Prefix 'TS'
# provide azure credentials
$credential = Get-Credential
# login to azure subscription
Login-AzureRmAccount -Credential $credential
# start the controller
Get-AzureRmVM -ResourceGroupName 'TS-LoadTest-TST' | Where-Object {$_.Name -match 'vstc'} | Start-TSAzureVM -Credential $credential
# wait for controller to fully start
Start-Sleep -Seconds 120
# start the agents
Get-AzureRmVM -ResourceGroupName 'TS-LoadTest-TST' | Where-Object {$_.Name -match 'vsta'} | Start-TSAzureVM -Credential $credential
# check status of all servers
Get-AzureRmVM -ResourceGroupName 'TS-LoadTest-TST' -Status | Sort-Object -Property Name | Select-Object -Property Name, PowerState | Format-Table -AutoSize
Solution structure: (screenshot not reproduced here)
EDIT 2: [RESOLVED] It's all fixed now, thank you! I went into the repository and mapped the folder my scripts were in directly to $(build.sourcesDirectory). Consequently I was able to change the file path to $(build.sourcesDirectory)\StartControllerAndAgents.ps1 and the build is now able to find the files to run.
You need to specify the path to the script as $(Build.SourcesDirectory)\Path\To\Script, not the TFVC path you've currently configured ($/Project/Path/To/Script).
The exact path depends on the workspace mapping of the build definition.
The same applies to the working directory.
There are a number of variables in Azure Pipelines (the current name for the Build hub in TFS/Azure DevOps) that resolve to standardized paths on the agent. Almost all tasks take a relative path off of those variables.
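If you want to confirm where the agent actually put your sources, a quick diagnostic sketch (not part of the original build) is to run an inline PowerShell task that prints the resolved sources directory; the build variables are also exposed to scripts as environment variables:
# Print what $(Build.SourcesDirectory) resolves to on this agent,
# and list every .ps1 file the workspace mapping actually downloaded:
Write-Host "Sources directory: $env:BUILD_SOURCESDIRECTORY"
Get-ChildItem -Path $env:BUILD_SOURCESDIRECTORY -Recurse -Filter *.ps1 |
    Select-Object -ExpandProperty FullName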

How to fix "Object reference not set to an instance of an object" error when running Get-AzDataLakeStoreChildItem cmdlet?

I'm getting an error while running an Azure cmdlet in PowerShell. How do I resolve this?
I'm trying to get details of the folders and files present in Azure Data Lake through PowerShell. I'm able to access the Data Lake through the portal and access all files.
Using the Azure cmdlets I've tested the connection with "Test-AzDataLakeStoreAccount -Name $Server" and it works fine too. However, when I execute the command below, it throws a null reference exception. How do I resolve that?
Get-AzDataLakeStoreChildItem -Account "****.azuredatalakestore.net" -Path "/"
Get-AzDataLakeStoreChildItem : Object reference not set to an instance of an object.
At line:1 char:1
+ Get-AzDataLakeStoreChildItem -Account "entadls8cc9b872.azuredatalakes ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Get-AzDataLakeStoreChildItem], NullReferenceException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.DataLakeStore.GetAzureDataLakeStoreChildItem
I can reproduce your issue in both Windows PowerShell and Azure Cloud Shell. It looks like a bug in the Az.DataLakeStore PowerShell module.
I tried Get-AzDataLakeStoreChildItem -AccountName "AccountName" -Path "/test/", which matches the sample in the doc, and got the same error. I also tried Get-AzDataLakeStoreItem -AccountName "AccountName" -Path "/test/123.txt" and Test-AzDataLakeStoreItem -AccountName "AccountName" -Path "/test/123.txt"; both failed with a similar error.
I found a GitHub issue related to this error: https://github.com/Azure/azure-powershell/issues/8352. I think the format of the commands I tried is correct. One comment in the issue says 'To use datalake az module you have to use it in netcore powershell (not windows powershell)', but as far as I know the Az module is cross-platform, so according to the docs that shouldn't be the reason. Another comment says 'we have fixed this issue. Apologize for the inconvenience. It will be released as part of next release.'
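Until the fixed release ships, you can at least watch the module version. A minimal sketch using PowerShellGet, assuming the module was installed from the PowerShell Gallery:
# Check which version of the affected module is installed now:
Get-InstalledModule -Name Az.DataLakeStore
# Pull the latest release once the fix is published:
Update-Module -Name Az.DataLakeStore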

Powershell Script for adding users to AD

Hi, I've just recently started using PowerShell on my server, but when I run my script I get the error:
New-ADUser : Unable to find a default server with Active Directory Web Services running.
At C:\Users\Administrator\Desktop\Powerwhell Script, H1 case.ps1:6 char:1
+ New-ADUser -name $_."fornavn"
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ResourceUnavailable: (:) [New-ADUser], ADServerDownException
+ FullyQualifiedErrorId : ActiveDirectoryServer:1355,Microsoft.ActiveDirectory.Management.Commands.NewADUser
I have attached the script and my .csv file. I hope one of you can help me figure it out.
(Don't worry about the information; it's for a school assignment.)
Script
.csv file
It looks like your script cannot find a domain controller for your domain. Use the -Server parameter and give it the fully qualified domain name or IP of a domain controller:
New-ADUser -Server "ServerName.Domain.com"
If this doesn't work, you might not have the Active Directory Management Gateway Service installed on your domain controller (Download Here). On Windows Server 2012 R2, make sure you have the following feature installed.
The headers warning you are seeing is because Import-Csv is unable to read the headings from your CSV file for some reason, and replaces those header names with H1, H2, ... Hx. For example:
fornavn efternavn H1 beskrivelse, ...
------- --------- ----- -----------
Keld Bruun KB Adm.Ledergruppe, ...
You can get around this by giving Import-Csv the names of your columns via the -Header parameter. Note that these do not have to be the same as the ones in the CSV file, as the new column headers override those in the CSV.
Import-Csv "C:\H1, Powershell.csv" -Header 'fornavn','efternavn','forkortelse','beskrivelse','email','brugernavn','kode','kontor','fuldnavn'

Out-File export to a network location - powershell

I am having an issue trying to output a file with PowerShell. I can export the file to the computer's local drives, but when I try to export it to a network location it will not let me.
I receive the following error:
Access to the path '\\fmadt-prod-web5\e$\ftproot\customer\temp\SiteLists\Classic\Hosted1.txt' is denied.
+ CategoryInfo : OpenError: (:) [Out-File], UnauthorizedAccessException
+ FullyQualifiedErrorId : FileOpenFailure,Microsoft.PowerShell.Commands.OutFileCommand
This is the code that I am using:
$list2 | Format-Table -a -Property "WebAppName", "Version", "State"| Out-File '\\fmadt-prod-web5\e$\ftproot\customer\temp\SiteLists\Classic\Hosted1.txt' -force
Is it possible to export to a network location? The user I am running as has admin access to that location as well.
If you can't write directly to the UNC path, you can map a PSDrive to that location and use that instead:
New-PSDrive -Name dest -Root \\fmadt-prod-web5\e$\ftproot\customer\temp\SiteLists\Classic -PSProvider FileSystem
Then:
| out-file dest:\Hosted1.txt
Use the -Credential parameter of New-PSDrive if you need to access the drive using alternate credentials.
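Putting it together, a sketch of the full sequence, assuming alternate credentials are needed:
# Prompt for credentials that have rights to the share:
$cred = Get-Credential
New-PSDrive -Name dest -Root '\\fmadt-prod-web5\e$\ftproot\customer\temp\SiteLists\Classic' -PSProvider FileSystem -Credential $cred
# Write through the mapped PSDrive instead of the raw UNC path:
$list2 | Format-Table -AutoSize -Property 'WebAppName','Version','State' | Out-File 'dest:\Hosted1.txt'
Remove-PSDrive -Name dest   # drop the mapping when done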
You most certainly can output to UNC paths; I use them regularly at work. This almost looks like you don't, or the account you are running the script as doesn't, have access to the directory.
E$ refers to the administrative share on the server; try actually sharing that directory via Windows file sharing, or run the script using an account that is in the Administrators group on the relevant server. Also, I always use double quotes for paths because then you can include variables; force of habit :)