GPO not being applied - PowerShell

I am using the powershell GroupPolicy module to create and link new GPOs. I have a large number of GPOs to create, and thus I wish to automate the process without having to interact with the Group Policy Editor.
I noticed while creating GPOs through the editor that each policy setting ends up stored in one or more XML or .INI files under the GPO's folder.
With that in mind, I started creating GPOs with the New-GPO command, passing the -Name and -Domain parameters. After the GPO had been created successfully, my script would generate an XML file containing all of the information that the policy would consume. Shown below is an extract of the XML file that I create to set up a mapped-drives policy.
When inspecting the policy in the editor, everything looks fine: the correct drives show up and all of the settings appear to be correct. However, the policy is never applied. If I then create an identical policy manually via the Group Policy Editor, all of the policies start working, including the ones that I created with PowerShell.
The problem therefore seems to be that the domain controller is never made aware of the changes; they only take effect once a manual change is made.
I have tried running gpupdate /force, which does not seem to update or propagate the changes.
# Create the empty Drives.xml preference file under the new GPO's folder in SYSVOL
New-Item \\$($MappedDrivesGPO.DomainName)\SYSVOL\$($MappedDrivesGPO.DomainName)\Policies\$("{"+$MappedDrivesGPO.Id+"}")\User\Preferences\Drives\Drives.xml -ItemType File -Force
# Write the generated drive-mapping XML into it
Set-Content \\$($MappedDrivesGPO.DomainName)\SYSVOL\$($MappedDrivesGPO.DomainName)\Policies\$("{"+$MappedDrivesGPO.Id+"}")\User\Preferences\Drives\Drives.xml $xml
<?xml version="1.0" encoding="utf-8"?>
<Drives clsid="{8FDDCC1A-0C3C-43cd-A6B4-71A6DF20DA8C}">
<Drive clsid="{935D1B74-9CB8-4e3c-9914-7DD559B7A417}" name="P:" status="P:" image="2" changed="2019-04-26 10:41:54" uid="{$guid1}" bypassErrors="1">
<Properties action="U" thisDrive="NOCHANGE" allDrives="NOCHANGE" userName="" path="\\fs1\Projects" label="Projects" persistent="0" useLetter="1" letter="P"/>
<Filters>
<FilterGroup bool="AND" not="0" name="$($domainName)\Drive P Access" sid="$($filterGroupSidDriveP)" userContext="1" primaryGroup="0" localGroup="0"/>
</Filters>
</Drive>
<Drive clsid="{935D1B74-9CB8-4e3c-9914-7DD559B7A417}" name="S:" status="S:" image="2" changed="2019-04-26 10:39:21" uid="{$guid2}" bypassErrors="1">
<Properties action="U" thisDrive="NOCHANGE" allDrives="NOCHANGE" userName="" path="\\as1\Software" label="Software" persistent="0" useLetter="1" letter="S"/>
<Filters>
<FilterGroup bool="AND" not="0" name="$($domainName)\Drive S Access" sid="$($filterGroupSidDriveS)" userContext="1" primaryGroup="0" localGroup="0"/>
</Filters>
</Drive>
</Drives>
I expected the policy to start working as intended after the XML file had been created.
The actual result is that the policy appears to be well formed, but it is never applied.

I managed to resolve this on my own; posting here in case someone else runs into the same issue. If you're creating GPOs programmatically rather than via the editor, you have to extend your script/program to add the CSE GUID (in this case for drive mapping) and the snap-in GUID to the GPO's gPCUserExtensionNames attribute.
[{00000000-0000-0000-0000-000000000000}{2EA1A81B-48E5-45E9-8BB7-A6E3AC170006}][{5794DAFD-BE60-433F-88A2-1A31939AC01F}{2EA1A81B-48E5-45E9-8BB7-A6E3AC170006}]
The 0000.. GUID is the core GPO engine, 2EA1.. is the Drive Maps preference tool (snap-in) GUID, and 5794.. is the Drive Maps preference CSE GUID.
If you create the policy manually with the desired changes, you can then open dsa.msc, enable Advanced Features, and view the policy object's properties. In the gPCUserExtensionNames attribute you will see the bracketed GUID pairs that you need to incorporate into your software.
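For reference, here is a minimal sketch of how the attribute could be stamped from the same provisioning script, assuming the RSAT ActiveDirectory module is available and using the GUID pairs shown above (variable names follow the question's script):
# Sketch: write the Drive Maps CSE/snap-in GUID pairs onto the GPO's AD object
Import-Module ActiveDirectory
$extensionNames = '[{00000000-0000-0000-0000-000000000000}{2EA1A81B-48E5-45E9-8BB7-A6E3AC170006}][{5794DAFD-BE60-433F-88A2-1A31939AC01F}{2EA1A81B-48E5-45E9-8BB7-A6E3AC170006}]'
$gpoDn = "CN={$($MappedDrivesGPO.Id)},CN=Policies,CN=System,$((Get-ADDomain $MappedDrivesGPO.DomainName).DistinguishedName)"
Set-ADObject -Identity $gpoDn -Replace @{ gPCUserExtensionNames = $extensionNames }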

Related

Changing network configuration for all VMs in VirtualBox

I have a problem with forcing NAT mode on the network cards of all VirtualBox machines. The only solution I have found is to replace part of the configuration file located at C:\Users\USER\VirtualBox VMs\VirtualMachineName\VirtualMachineName.vbox.
Unfortunately, I cannot simply replace these files wholesale, because each user has different settings (such as the name of the VM).
I have to replace the text inside the
<Adapter>...</Adapter> tag. Doing it on one computer for one user is not a problem; what I can't work out is how to do it with a script on multiple computers for all users. So far I've managed to come up with the following, and I don't know what to do next.
$InputFiles = Get-Item "C:\Users\*\VirtualBox VMs\*\*.vbox"
$OldString = '<Adapter ...various computer-dependent variables...>
...
...
</Adapter>'
$NewString = '<Adapter ...various computer-dependent variables...>
<NAT/>
</Adapter>'
$InputFiles | ForEach-Object {
    # -Raw is needed so the multi-line $OldString can match; without it Get-Content returns an array of lines
    (Get-Content -Path $_.FullName -Raw).Replace($OldString,$NewString) | Set-Content -Path $_.FullName
}
Another complication is that when changes are saved, everything is saved in one file, i.e. all the settings for all the machines on a given computer end up in a single file.
Unfortunately, VirtualBox does not store these values in the registry, which makes it very difficult. Maybe there is another, easier way?
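One way around the computer-dependent variables (a sketch only, not tested against every VirtualBox version, and the VMs must not be running while their files are edited) is to parse each .vbox file as XML and rewrite just the adapter's attachment mode, which leaves all the machine-specific attributes untouched:
# Force NAT on every adapter in every user's .vbox files by editing the XML nodes,
# so machine-specific attributes (MAC address, slot, type, ...) are preserved
$vboxFiles = Get-Item "C:\Users\*\VirtualBox VMs\*\*.vbox"
foreach ($file in $vboxFiles) {
    [xml]$doc = Get-Content -Path $file.FullName -Raw
    $ns = $doc.DocumentElement.NamespaceURI
    foreach ($adapter in $doc.GetElementsByTagName("Adapter")) {
        # drop whatever attachment mode is currently configured (<BridgedInterface/>, <NAT/>, ...)
        while ($adapter.HasChildNodes) { [void]$adapter.RemoveChild($adapter.FirstChild) }
        # attach the adapter to NAT
        [void]$adapter.AppendChild($doc.CreateElement("NAT", $ns))
    }
    $doc.Save($file.FullName)
}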
EDIT: Sample VBox config file
<?xml version="1.0"?>
<!--
** DO NOT EDIT THIS FILE.
** If you make changes to this file while any VirtualBox related application
** is running, your changes will be overwritten later, without taking effect.
** Use VBoxManage or the VirtualBox Manager GUI to make changes.
-->
<VirtualBox xmlns="http://www.virtualbox.org/" version="1.15-windows">
<Machine uuid="{1d96510c-97ce-4671-a1df-3f08c7d2b2c7}" name="TEST" OSType="Windows7_64" snapshotFolder="Snapshots" lastStateChange="2020-05-04T09:56:34Z">
<MediaRegistry>
<HardDisks>
<HardDisk uuid="{eab23143-54ec-4f7d-822d-a56bf90a58bb}" location="TEST.vdi" format="VDI" type="Normal"/>
</HardDisks>
</MediaRegistry>
<ExtraData>
<ExtraDataItem name="GUI/FirstRun" value="yes"/>
</ExtraData>
<Hardware>
<CPU>
<PAE enabled="false"/>
<LongMode enabled="true"/>
<HardwareVirtExLargePages enabled="true"/>
</CPU>
<Memory RAMSize="2048"/>
<HID Pointing="USBTablet"/>
<Paravirt provider="Default"/>
<Display VRAMSize="18"/>
<VideoCapture fps="25" options="ac_enabled=false"/>
<RemoteDisplay enabled="false"/>
<BIOS>
<IOAPIC enabled="true"/>
</BIOS>
<USB>
<Controllers>
<Controller name="OHCI" type="OHCI"/>
<Controller name="EHCI" type="EHCI"/>
</Controllers>
</USB>
<Network>
<Adapter slot="0" enabled="true" MACAddress="MACAddress" cable="true" type="82540EM">
<NAT/>
</Adapter>
</Network>
<AudioAdapter controller="HDA" driver="DirectSound" enabled="true" enabledIn="false"/>
</Hardware>
<StorageControllers>
<StorageController name="SATA" type="AHCI" PortCount="2" useHostIOCache="false" Bootable="true" IDE0MasterEmulationPort="0" IDE0SlaveEmulationPort="1" IDE1MasterEmulationPort="2" IDE1SlaveEmulationPort="3">
<AttachedDevice type="HardDisk" hotpluggable="false" port="0" device="0">
<Image uuid="{eab23143-54ec-4f7d-822d-a56bf90a58bb}"/>
</AttachedDevice>
<AttachedDevice passthrough="false" type="DVD" hotpluggable="false" port="1" device="0"/>
</StorageController>
</StorageControllers>
</Machine>
</VirtualBox>

Publish SSRS reports via Octopus

I'm building the setup to deploy my SSRS reports through Octopus Deploy. I found an Octopus library template and I'm working with it, but I've had some issues:
1. Error message (the path is correct, but it keeps producing the same warning):
WARNING: Unable to find datasource SalesDrivers in /Sales Drivers/Data Sources
2. The method doesn't exist:
Method invocation failed because [Microsoft.PowerShell.Commands.NewWebserviceProxy.AutogeneratedTypes.WebServiceProxy3er_ReportService2005_asmx_wsdl.ReportingService2005] doesn't contain a method named 'LoadReportDefinition'.
The PowerShell function from the template/library that is throwing the error can be seen below:
#region Update-ReportParameters()
Function Update-ReportParameters($ReportFile)
{
# declare local variables
$ReportParameters = @();
# necessary so that when attempting to use the report execution service, it doesn't puke on you when it can't find the data source
$ReportData = (Remove-SharedReferences -ReportFile $ReportFile)
# get just the report name
$ReportName = $ReportFile.SubString($ReportFile.LastIndexOf("\") + 1)
$ReportName = $ReportName.SubString(0, $ReportName.IndexOf("."))
# create warnings object
$ReportExecutionWarnings = $null
# load the report definition
Write-Host "*********************************************"
#Write-Host $ReportData
#(Remove-SharedReferences -ReportFile $ReportFile)
#Write-Host $ReportExecutionWarnings
$ExecutionInfo = $ReportExecutionProxy.LoadReportDefinition($ReportData, [ref] $ReportExecutionWarnings);
# loop through the report execution parameters
foreach($Parameter in $ExecutionInfo.Parameters)
{
# create new item parameter object
$ItemParameter = New-Object "$ReportServerProxyNamespace.ItemParameter";
# fill in the properties except valid values, that one needs special processing
Copy-ObjectProperties -SourceObject $Parameter -TargetObject $ItemParameter;
# fill in the valid values
$ItemParameter.ValidValues = Convert-ValidValues -SourceValidValues $Parameter.ValidValues;
# add to list
$ReportParameters += $ItemParameter;
}
# force the parameters to update
Write-Host "Updating report parameters for $ReportFolder/$ReportName"
if ($IsReportService2005) {
$ReportServerProxy.SetReportParameters("$ReportFolder/$ReportName", $ReportParameters);
}
elseif ($IsReportService2010) {
$ReportServerProxy.SetItemParameters("$ReportFolder/$ReportName", $ReportParameters);
}
else { Write-Warning 'Report Service Unknown in Update-ReportParameters method. Use ReportService2005 or ReportService2010.' }
}
Does anyone know how I could sort this out?
I have solved a similar problem but took a slightly different approach. Rather than using PowerShell and Octopus directly, I used the useful open-source tool RSBuild to deploy the reports. It is pretty easy to bundle the rsbuild.exe executable (it is tiny) and a deploy.config along with your reports inside the Octopus package. You can then use Octopus's substitution feature to rewrite the config file and a small PowerShell function to run the executable. This also has the advantage that you can deploy easily without Octopus, the config for data sources and reports is declarative XML rather than procedural PowerShell, and the smarts of your scripted deployment live alongside your reports rather than being buried in Octopus.
So my config looks a bit like:
<?xml version="1.0" encoding="utf-8" ?>
<Settings>
<Globals>
<Global Name="CollapsedHeight">0.5in</Global>
</Globals>
<ReportServers>
<ReportServer Name="RS1" Protocol="http" Host="${ReportServer}" Path="${ReportServerPath}" Timeout="30" />
</ReportServers>
<DataSources>
<DataSource Name="Source1" Publish="true" Overwrite="true" TargetFolder="Data Sources" ReportServer="RS1">
<ConnectionString>data source=${ReportServer};initial catalog=${DatabaseName}</ConnectionString>
<CredentialRetrieval>Store</CredentialRetrieval>
<WindowsCredentials>False</WindowsCredentials>
<UserName>${ReportUser}</UserName>
<Password>${ReportsPassword}</Password>
</DataSource>
</DataSources>
<Reports>
<ReportGroup Name="Details" DataSourceName="Source1" TargetFolder="Reports"
ReportServer="RS1" CacheTime="10080">
<Report Name="BusinessReportABC">
<FilePath>reports\BusinessReportABC.rdl</FilePath>
</Report>
<!--More reports here-->
</ReportGroup>
</Reports>
</Settings>
My deployed OctoPacked artefacts contain RSBuild.Core.dll, RSBuild.exe, deploy.config and the report files.
Then I simply call the executable using PowerShell:
PS> rsbuild deploy.config
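In an Octopus deploy.ps1 that call might look something like this (a sketch, assuming rsbuild.exe and deploy.config sit at the root of the extracted package):
# Run RSBuild against the substituted config and fail the deployment if it reports an error
$packageRoot = Split-Path -Parent $MyInvocation.MyCommand.Path
& "$packageRoot\rsbuild.exe" "$packageRoot\deploy.config"
if ($LASTEXITCODE -ne 0) { throw "rsbuild.exe exited with code $LASTEXITCODE" }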

OctopusDeploy - Every website in the deploy has a different AppPool and Website name; how to deal; no other differences

I'm trying to set up a deploy process that targets 16 websites, each hosting an instance of the same application.
Websites and AppPools are named as such:
appServer1:
app10.site.com
app11.site.com
app12.site.com
app13.site.com
appServer2:
app20.site.com
app21.site.com
app22.site.com
app23.site.com
etc.
etc.
...with each website having a correspondingly named AppPool.
I am desperately trying to determine how to use a single Deploy NuGet Package step to target all of these websites/app pools, using variables and a combination of PowerShell scripts if possible.
I'd like to have a single step in which I can substitute the website and app pool names via variables, as these are the only differences. I basically need the equivalent of looping the NuGet package step, passing it a list of website and app pool names. I cannot simply use variables because variable scoping only resolves down to the machine level.
In other words: create a list of all website and app pool names, then iterate over them, passing each value to a step for execution (a "ForEach" processing step, for lack of a better term).
I do have the ability to rename the app pools if need be for a more consistent pattern, but I cannot change the website names.
Any ideas would be greatly appreciated.
http://help.octopusdeploy.com/discussions/questions/3481-every-website-in-the-deploy-has-a-different-apppool-and-website-name-how-to-deal-no-other-differences
There's a lot to your question, but I'm going to take a stab at explaining our approach, in the hope of getting your creative juices flowing.
tl;dr
Simply put: use your own PowerShell scripts to install the web application. In there you can set the app pool name on a per-website basis.
For starters, we do a separate deployment step for each project. The scripts we use would allow you to do all deployments from a single deploy.ps1 (including unique app pool names), but we find that separate steps really help keep each deployment lean and easy to manage. Each project gets its own nupkg, which contains predeploy.ps1, deploy.ps1 and postdeploy.ps1, as well as a folder of build/deploy scripts that we've open sourced and a folder of environment config XML files.
A sample environment config is shown below; the file name is simply [envName].xml.
<!-- environments\Production.xml -->
<environmentSettings>
<webSites>
<app>
<physicalPathRoot>c:\inetpub</physicalPathRoot>
<physicalFolderPrefix>appname</physicalFolderPrefix>
<siteProtcol>https</siteProtcol>
<siteName>appname.tld</siteName>
<siteHost>appname.tld</siteHost>
<portNumber>443</portNumber>
<appPath>/</appPath>
<appPool>
<name>appname.tld</name>
<!-- valid identityTypes are: [LocalSystem, LocalService, NetworkService, SpecificUser, ApplicationPoolIdentity] -->
<identityType>NetworkService</identityType>
<!-- Set this value to the User the Service will run under in the format DOMAIN\username -->
<!-- If Running as 'NetworkService' then 'NT AUTHORITY\Network Service' is used -->
<userName>NT AUTHORITY\Network Service</userName>
<!-- Leave blank unless using SpecificUser -->
<password></password>
<maxWorkerProcesses>5</maxWorkerProcesses>
</appPool>
</app>
</webSites>
<serverDatabase>
<name>database_name</name>
<connectionString>REPLACED BY OCTOPUS</connectionString>
<providerName>System.Data.SqlClient</providerName>
</serverDatabase>
</environmentSettings>
You can see in the corresponding Get-EnvironmentSettings.ps1 how we load up the config and then update it with any Octopus variables. This is the trickiest part, because we use dot notation to update the paths (case sensitive).
Our Octopus variables really only contain information that is secret, as everything else lives in [environment].xml:
| Name                          | Value       | Scope      |
| ----------------------------- | ----------- | ---------- |
| webSites.app.appPool.password | supersecret | Production |
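A rough sketch of the idea (a simplified illustration, not the actual open-sourced module): load environments\[environment].xml, then let any Octopus variable whose name is a dot-notation path override the matching node.
# Simplified illustration: map dot-notation Octopus variable names onto XML nodes
function Get-EnvironmentSettings([string]$environment, [string]$rootNode)
{
    [xml]$config = Get-Content "$PSScriptRoot\environments\$environment.xml" -Raw
    foreach ($name in ($OctopusParameters.Keys | Where-Object { $_ -match '^[\w.]+$' })) {
        # e.g. webSites.app.appPool.password -> //environmentSettings/webSites/app/appPool/password (XPath, hence case sensitive)
        $node = $config.SelectSingleNode("$rootNode/" + ($name -replace '\.', '/'))
        if ($node) { $node.InnerText = $OctopusParameters[$name] }
    }
    return $config.SelectSingleNode($rootNode)
}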
So now a typical deployment script simply imports the modules, grabs the environment settings, updates the config, and installs the web app.
# Top of the script, get Octopus environment and version
param(
[string] $version = $OctopusPackageVersion,
[string] $environment = $OctopusEnvironmentName
)
# Make sure a failed deployment actually fails
$ErrorActionPreference = "Stop"
# Import the modules
$currentDir = Split-Path $script:MyInvocation.MyCommand.Path
$moduleDir = "$currentDir\modules"
Import-Module BuildDeployModules
# Grab the environment settings
$environmentSettings = Get-EnvironmentSettings $environment "//environmentSettings"
$databaseSettings = $environmentSettings.serverDatabase
$websiteSettings = $environmentSettings.webSites.app
# update the config
Update-XmlConfigValues $currentDir\website\Web.config "//appSettings/add[@key='databaseName']" $($databaseSettings.name) "value"
Update-XmlConfigValues $currentDir\website\Web.config "//connectionStrings/add[@name='databaseConnection']" $($databaseSettings.connectionString) "connectionString"
Update-XmlConfigValues $currentDir\website\Web.config "//connectionStrings/add[@name='databaseConnection']" $($databaseSettings.providerName) "providerName"
# Install the web application
Install-WebApplication $environment $websiteSettings $version "anonymousAuthentication"
In doing all of this, the web application is installed into IIS with a specific application pool and the appropriate config transforms applied, without relying on any unknowns.
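To give a flavour of what that last step involves, here is a hypothetical sketch using the WebAdministration module (not the actual implementation of Install-WebApplication from our modules), driven by the same $websiteSettings values from the environment config:
# Hypothetical sketch only: ensure the app pool and site exist with the configured names
Import-Module WebAdministration
$pool = $websiteSettings.appPool
if (-not (Test-Path "IIS:\AppPools\$($pool.name)")) { New-WebAppPool -Name $pool.name | Out-Null }
Set-ItemProperty "IIS:\AppPools\$($pool.name)" -Name processModel.identityType -Value $pool.identityType
if (-not (Test-Path "IIS:\Sites\$($websiteSettings.siteName)")) {
    New-Website -Name $websiteSettings.siteName -Port $websiteSettings.portNumber `
        -PhysicalPath "$($websiteSettings.physicalPathRoot)\$($websiteSettings.physicalFolderPrefix)" `
        -ApplicationPool $pool.name | Out-Null
}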
Our nupkg structure looks something like this
appname.1.2.3.4.nupkg
  environments
    dev.xml
    staging.xml
    qual.xml
    production.xml
  modules
    [all of our build modules]
  website
    [all of our website files]
This is super repeatable, easy to maintain, and the config is easy to edit. Hope it helps!

Use AppCmd to LIST CONFIG in APPHOST only

I have a requirement to use PowerShell to configure IIS 7.5 web applications whose code has not yet been deployed to the file system (possibly not at all; old/broken web.configs may exist). I would like to be able to do this all at the APPHOST level. (See the note at the bottom about using PowerShell rather than AppCmd.)
I can SET all the values properly. However, being somewhat diligent, I also like to validate that the values were set correctly by reading them back afterwards.
Here's the scenario:
I can set this value using AppCmd so that the setting is applied at the APPHOST level using the /Commit:APPHOST flag. However, I haven't found a way to READ the values exclusively at the APPHOST level.
Setting the value is successful:
C:\Windows\System32\inetsrv\appcmd.exe set config "webSiteName/webAppName" -section:system.webServer/security/authentication/anonymousAuthentication /enabled:"True" /commit:apphost
However, I can't find a way to read the values back using AppCmd (or PowerShell):
Running the following AppCmd returns an error due to the broken pre-existing web.config in the folder (the specific error is unimportant; the point is that it is reading the web app's web.config instead of ApplicationHost.config at the APPHOST level):
C:\Windows\System32\inetsrv\appcmd.exe list config "MACHINE/WEBROOT/APPHOST/webSiteName/webAppName" -section:system.webServer/security/authentication/anonymousAuthentication
ERROR ( message:Configuration error
Filename: \\?\c:\inetpub\wwwroot\webSiteName\webAppName\web.config
Line Number: 254
Description: The configuration section 'system.runtime.caching' cannot be read because it is missing a section declaration
. )
Note: I would prefer to do this all in PowerShell instead of using AppCmd, so if anyone has the syntax for modifying the APPHOST-level anonymousAuthentication section of a web application that lives under a website from within PowerShell (Get-WebConfiguration seems to only use the web app's web.config), that would be totally awesome and much appreciated!
Here's how to do this in PowerShell:
# Load Microsoft.Web.Administration and open applicationHost.config directly
[Reflection.Assembly]::Load("Microsoft.Web.Administration, Version=7.0.0.0, Culture=Neutral, PublicKeyToken=31bf3856ad364e35") > $null
$serverManager = New-Object Microsoft.Web.Administration.ServerManager
$config = $serverManager.GetApplicationHostConfiguration()
# Read the APPHOST-level section for the given site/application path (replace "simpleasp.net" with e.g. "webSiteName/webAppName")
$anonymousAuthenticationSection = $config.GetSection("system.webServer/security/authentication/anonymousAuthentication", "simpleasp.net")
Write-Host "Current value: " $anonymousAuthenticationSection["enabled"]
# Now set a new value and commit it back to applicationHost.config
$anonymousAuthenticationSection["enabled"] = $true
$serverManager.CommitChanges()
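If you would rather stay in the WebAdministration module, the same APPHOST-level read can (as far as I know) be expressed with an explicit PSPath and location, so the application's broken web.config is never opened:
# Read the anonymousAuthentication setting from applicationHost.config (APPHOST level) for a site/app location
Import-Module WebAdministration
Get-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' `
    -Location 'webSiteName/webAppName' `
    -Filter 'system.webServer/security/authentication/anonymousAuthentication' `
    -Name 'enabled'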

Breaking MsBuild package & deploy into separate MsBuild and MsDeploy commands

I'm having a few problems breaking out an MsBuild package+deploy command into two separate commands. (I need to do this to pass additional parameters to MsDeploy).
The command that works fine looks like this:
msbuild "src\Solution.sln"
/P:Configuration=Deploy-Staging
/P:DeployOnBuild=True
/P:DeployTarget=MSDeployPublish
/P:MsDeployServiceUrl=https://192.168.0.1:8172/MsDeploy.axd
/P:DeployIISAppPath=staging.website.com
/P:AllowUntrustedCertificate=True
/P:MSDeployPublishMethod=WmSvc
/P:CreatePackageOnPublish=True
/P:UserName=staging-deploy
/P:Password=xyz
The separated packaging command looks like this:
msbuild "src\Solution.sln"
/P:Configuration=Deploy-Staging
/P:DeployOnBuild=True
/P:DeployTarget=Package
/P:_PackageTempDir=C:\temp\web
which works fine. But then the MsDeploy portion:
msdeploy
-verb:sync
-allowUntrusted
-usechecksum
-source:manifest='src\WebProject\obj\Deploy-Staging\Package\WebProject.SourceManifest.xml'
-dest:auto,ComputerName='https://192.168.0.1:8172/MsDeploy.axd?site=staging.website.com',username='staging-deploy',password='xyz',authType='basic',includeAcls='false'
-enableRule:DoNotDeleteRule
fails, with the following error in WmSvc.log
wmsvc.exe Error: 0 : Attempted to perform an unauthorized operation.
setAcl/C:\temp\web (Read)
ProcessId=15784
ThreadId=31
DateTime=2011-03-30T14:57:02.4867689Z
Timestamp=3802908721815
wmsvc.exe Error: 0 : Not authorized.
Details: No rule was found that could authorize user 'staging-deploy',
provider 'setAcl', operation 'Read', path 'C:\temp\web'.
(and several more Read/Write operations)
Something is clearly going wrong with the paths it's trying to access (it works fine with the other method). I'm not sure it's even applying the iisApp targeting correctly, and at the moment I don't think the correct web.configs will be deployed either.
I've got this fixed now. I needed a different command from the one the automatically generated .cmd file was using, but comparing the two allowed me to fix it up (thanks @Vishal R. Joshi).
The differences I needed were:
basic authentication
allow untrusted certificates
?site=staging.website.com on the end of the MsDeploy.axd path, as in my original command
override the IIS Web App name that is set in the params file
enable the do not delete rule
The winning command is as follows:
msdeploy
-verb:sync
-allowUntrusted
-source:package='src\WebProject\obj\Deploy-Staging\Package\WebProject.zip'
-dest:auto,ComputerName='https://192.168.0.1:8172/MsDeploy.axd?site=staging.website.com',username='staging-deploy',password='xyz',authType='basic',includeAcls='false'
-setParamFile:"src\WebProject\obj\Deploy-Staging\Package\WebProject.SetParameters.xml"
-setParam:name='IIS Web Application Name',value='staging.website.com'
-enableRule:DoNotDeleteRule
-disableLink:AppPoolExtension -disableLink:ContentExtension
-disableLink:CertificateExtension
Hope this helps someone!
Add a delegation rule on the server using inetmgr to allow staging-deploy to carry out setAcl operations.
Inetmgr -> click on the server node -> Management Service Delegation (under Management) -> click Add Rule on the right -> choose the template labelled "Set Permissions for Applications" -> accept the defaults and click OK.
This should let you deploy any package or manifest with setAcl, as long as the user you are deploying as has permissions on the site you are deploying to.
You can specify the -setParam:name='',value='' flag when calling the MyProject.deploy.cmd file that is created when you generate a package from a web project. The .cmd is a friendly wrapper around msdeploy.exe, so you don't need to specify all the rest of the defaults.
Here are the details: http://evolutionarydeveloper.blogspot.co.uk/2013/05/specifying-environment-variables-at.html
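For completeness, a call to the generated wrapper might look roughly like this (the flag names are the ones the standard generated deploy.cmd accepts; the values are taken from the command above, so treat it as a sketch rather than a verified command):
WebProject.deploy.cmd /Y /M:https://192.168.0.1:8172/MsDeploy.axd?site=staging.website.com /U:staging-deploy /P:xyz /A:Basic -allowUntrusted "-setParam:name='IIS Web Application Name',value='staging.website.com'" -enableRule:DoNotDeleteRule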