PowerShell DSC resource does not exist - powershell

I have created a ps1 file with the DSC configuration and one for the LCM.
The MOF files were created successfully.
When I start the configuration, I get the following error:
The PowerShell DSC resource MSFT_TeamsMeetingConfiguration from module
<Microsoft365DSC,1.21.728.1> does not exist at the PowerShell module
path nor is it registered as a WMI DSC resource.
The Config file looks as follows:
Configuration TeamsConfig2
{
    param
    (
        [Parameter(Mandatory = $true, Position = 0)][ValidateNotNull()][System.Management.Automation.PSCredential]$AdminCred
    )

    Import-DSCResource -ModuleName "Microsoft365DSC"

    Node localhost
    {
        TeamsMeetingConfiguration DemoMeetingConfiguration2
        {
            ClientAppSharingPort = 50040;
            ClientAppSharingPortRange = 20;
            ClientAudioPort = 50000;
            ClientAudioPortRange = 20;
            ClientMediaPortRangeEnabled = $True;
            ClientVideoPort = 50020;
            ClientVideoPortRange = 20;
            CustomFooterText = "sdfs";
            DisableAnonymousJoin = $False;
            EnableQoS = $False;
            GlobalAdminAccount = $AdminCred;
            HelpURL = "https://github.com/Microsoft/Office365DSC/Help";
            Identity = "Global";
            LegalURL = "https://github.com/Microsoft/Office365DSC/Legal";
            LogoURL = "https://github.com/Microsoft/Office365DSC/Logo.png";
        }
    }
}
The Microsoft365DSC module is installed with version 1.21.728.1.
Get-DscResource also finds the resource, so I currently don't understand what the issue could be.
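For reference, a quick sanity check is to compare what Get-DscResource sees with the module paths the LCM actually searches; a common cause of this exact error is a module installed with -Scope CurrentUser, while the LCM runs as SYSTEM and only looks in machine-wide paths. A minimal diagnostic sketch (standard cmdlets, nothing environment-specific assumed):
# Confirm the module version and location that Get-DscResource resolves.
Get-Module -ListAvailable Microsoft365DSC | Select-Object Name, Version, ModuleBase
Get-DscResource -Module Microsoft365DSC -Name TeamsMeetingConfiguration
# The LCM runs as SYSTEM, so the module must live under a machine-wide path
# (e.g. C:\Program Files\WindowsPowerShell\Modules), not under a user profile.
$env:PSModulePath -split ';'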

Related

Run custom PowerShell script on provisioned Azure VM

I provisioned a VM with the following C# snippet:
var ssrsVm = new WindowsVirtualMachine("vmssrs001", new WindowsVirtualMachineArgs
{
    Name = "vmssrs001",
    ResourceGroupName = resourceGroup.Name,
    NetworkInterfaceIds = { nic.Id },
    Size = "Standard_B1ms",
    AdminUsername = ssrsLogin,
    AdminPassword = ssrsPassword,
    SourceImageReference = new WindowsVirtualMachineSourceImageReferenceArgs
    {
        Publisher = "microsoftpowerbi",
        Offer = "ssrs-2016",
        Sku = "dev-rs-only",
        Version = "latest"
    },
    OsDisk = new WindowsVirtualMachineOsDiskArgs
    {
        Name = "vmssrs001disk",
        Caching = "ReadWrite",
        DiskSizeGb = 200,
        StorageAccountType = "Standard_LRS",
    }
});
After the VM has been provisioned, I would like to run a custom PowerShell script on it to add a firewall rule, and I'm now wondering how to do this as part of the Pulumi app.
With Azure it looks like I could do this with RunPowerShellScript, but I couldn't find anything about it in the Pulumi docs; maybe there is a better way to handle my case?
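For context, outside of Pulumi the run-command route is exposed through the Az PowerShell module; a minimal sketch (the resource group name and script path below are assumptions):
# Runs a local PowerShell script on the VM via the RunPowerShellScript command.
Invoke-AzVMRunCommand -ResourceGroupName 'rg-ssrs' -VMName 'vmssrs001' `
    -CommandId 'RunPowerShellScript' -ScriptPath '.\Add-SsrsFirewallRule.ps1'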
UPDATE
Thanks to Ash's comment I was able to find VirtualMachineRunCommandByVirtualMachine, which seems like it should do what I'm looking for, but unfortunately the following code snippet returns an error:
var virtualMachineRunCommandByVirtualMachine = new VirtualMachineRunCommandByVirtualMachine("vmssrs001-script",
    new VirtualMachineRunCommandByVirtualMachineArgs
    {
        ResourceGroupName = resourceGroup.Name,
        VmName = ssrsVm.Name,
        RunAsUser = ssrsLogin,
        RunAsPassword = ssrsPassword,
        RunCommandName = "enable firewall rule for ssrs",
        Source = new VirtualMachineRunCommandScriptSourceArgs
        {
            Script =
                @"Firewall AllowHttpForSSRS
                {
                    Name = 'AllowHTTPForSSRS'
                    DisplayName = 'AllowHTTPForSSRS'
                    Group = 'PT Rule Group'
                    Ensure = 'Present'
                    Enabled = 'True'
                    Profile = 'Public'
                    Direction = 'Inbound'
                    LocalPort = ('80')
                    Protocol = 'TCP'
                    Description = 'Firewall Rule for SSRS HTTP'
                }"
        }
    });
Error:
The property 'runCommands' is not valid because the 'Microsoft.Compute/RunCommandPreview' feature is not enabled for this subscription.
Looks like other people are struggling with the same issue here.
You can use a Compute Extension to execute a script against a VM with Pulumi.
This article details some of the options if you just want to complete the procedure via PowerShell.
As an addition to Ash's answer, here is how I integrated it with Pulumi.
First, I create a blob container for my project scripts:
var deploymentContainer = new BlobContainer("deploymentscripts", new BlobContainerArgs
{
    ContainerName = "deploymentscripts",
    ResourceGroupName = resourceGroup.Name,
    AccountName = storageAccount.Name,
});
Next, I upload all of my PowerShell scripts to the newly created blob container with this snippet:
// Collects SAS URLs for the uploaded scripts, keyed by file name.
var deploymentFiles = new Dictionary<string, Output<string>>();

foreach (var file in Directory.EnumerateFiles(Path.Combine(Environment.CurrentDirectory, "Scripts")))
{
    var fileName = Path.GetFileName(file);
    var blob = new Blob(fileName, new BlobArgs
    {
        ResourceGroupName = resourceGroup.Name,
        AccountName = storageAccount.Name,
        ContainerName = deploymentContainer.Name,
        Source = new FileAsset(file),
    });
    deploymentFiles[fileName] = SignedBlobReadUrl(blob, deploymentContainer, storageAccount, resourceGroup);
}
SignedBlobReadUrl is grabbed from the Pulumi examples repo:
private static Output<string> SignedBlobReadUrl(Blob blob, BlobContainer container, StorageAccount account, ResourceGroup resourceGroup)
{
    return Output.Tuple<string, string, string, string>(
        blob.Name, container.Name, account.Name, resourceGroup.Name).Apply(t =>
    {
        (string blobName, string containerName, string accountName, string resourceGroupName) = t;
        var blobSAS = ListStorageAccountServiceSAS.InvokeAsync(new ListStorageAccountServiceSASArgs
        {
            AccountName = accountName,
            Protocols = HttpProtocol.Https,
            SharedAccessStartTime = "2021-01-01",
            SharedAccessExpiryTime = "2030-01-01",
            Resource = SignedResource.C,
            ResourceGroupName = resourceGroupName,
            Permissions = Permissions.R,
            CanonicalizedResource = "/blob/" + accountName + "/" + containerName,
            CacheControl = "max-age=5",
        });
        return Output.Format($"https://{accountName}.blob.core.windows.net/{containerName}/{blobName}?{blobSAS.Result.ServiceSasToken}");
    });
}
And lastly, I create an Extension to run my script:
var extension = new Extension("ssrsvmscript", new Pulumi.Azure.Compute.ExtensionArgs
{
    Name = "ssrsvmscript",
    VirtualMachineId = ssrsVm.Id,
    Publisher = "Microsoft.Compute",
    Type = "CustomScriptExtension",
    TypeHandlerVersion = "1.10",
    Settings = deploymentFiles["ssrsvm.ps1"].Apply(script => @" {
        ""commandToExecute"": ""powershell -ExecutionPolicy Unrestricted -File ssrsvm.ps1"",
        ""fileUris"": [" + "\"" + script + "\"" + "]}")
});
Hope that will save some time for someone else struggling with the same problem.

How to pass a PowerShell script using user_data (AWS and Terraform)?

I am trying to pass the PowerShell script as the file IIS.txt, which is present in the CWD.
I don't see the script running on the server. I am not sure if I am missing something. Any help would be appreciated.
resource "aws_instance" "db1" {
ami = "ami-1234567890"
instance_type = "t3.small"
subnet_id = "${aws_subnet.db.0.id}"
key_name = "ireland"
user_data = "${file("IIS.txt")}"
tags = {
Name = "sql node 1"
}
}
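One thing worth checking: on Windows AMIs, EC2Launch/EC2Config only executes user data that is wrapped in <powershell> (or <script>) tags, so an unwrapped script is silently ignored. A minimal sketch of what IIS.txt could look like (the IIS install is an assumed payload for illustration):
<powershell>
# Install IIS and keep a transcript for troubleshooting.
Start-Transcript -Path C:\userdata.log
Install-WindowsFeature -Name Web-Server -IncludeManagementTools
Stop-Transcript
</powershell>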
I've used a template_file data source and a local_file resource for this.
data "template_file" "user_data" {
template = "${file("iis.txt")}"
}
resource "local_file" "user_data" {
content = "${data.template_file.user_data.rendered}"
filename = "user_data-${sha1(data.template_file.user_data.rendered)}.ps"
}
Then update your user_data property to the content of the local_file resource.
resource "aws_instance" "db1"
{
ami = "ami-1234567890"
instance_type = "t3.small"
subnet_id = "${aws_subnet.db.0.id}"
key_name = "ireland"
user_data = "${local_file.user_data.content}"
tags =
{
Name = "sql node 1"
}
}
This also allows you to get a little fancier and do a template script: pull TF variables, etc. into the template and render it just in time before you deploy.
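For example, the template itself stays a PowerShell script; the ${...} placeholders below are hypothetical Terraform template variables that would be supplied through the data source's vars map:
# iis.txt used as a template; ${site_name} and ${http_port} are placeholders
# rendered by Terraform before the script reaches user_data.
<powershell>
Install-WindowsFeature -Name Web-Server -IncludeManagementTools
New-NetFirewallRule -DisplayName "Allow ${site_name}" -Direction Inbound -Protocol TCP -LocalPort ${http_port} -Action Allow
</powershell>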

Partial DSC configuration for SMB

Unfortunately I can't find any examples on the internet for my scenario.
I have a DSC pull server with an SMB share. I want to deploy partial configs like in https://learn.microsoft.com/de-de/powershell/dsc/pull-server/partialconfigs
But there are only examples for HTTP DSC pull servers, not SMB. Is this also possible with an SMB DSC pull server? If so, could I have an example?
I have found an example:
[DSCLocalConfigurationManager()]
configuration PartialConfig
{
    Node localhost
    {
        Settings
        {
            RefreshMode = 'Pull'
            ConfigurationID = 'a5f86baf-f17f-4778-8944-9cc99ec9f992'
            RebootNodeIfNeeded = $true
        }

        ConfigurationRepositoryShare SMBPull
        {
            SourcePath = '\\Server\Configurations'
            Name = 'SMBPull'
        }

        PartialConfiguration OSConfig
        {
            Description = 'Configuration for the Base OS'
            ConfigurationSource = '[ConfigurationRepositoryShare]SMBPull'
            RefreshMode = 'Pull'
        }

        PartialConfiguration SQLConfig
        {
            Description = 'Configuration for the SQL Server'
            DependsOn = '[PartialConfiguration]OSConfig'
            RefreshMode = 'Push'
        }
    }
}
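To actually use it, the flow is roughly as follows (a sketch; the share path and ConfigurationID are the ones from the example, and only OSConfig is pulled, since SQLConfig uses push):
# Compile and apply the meta-configuration to the node.
PartialConfig -OutputPath .\PartialConfig
Set-DscLocalConfigurationManager -Path .\PartialConfig -Verbose

# Stage the pulled partial configuration on the SMB share as <Name>.<ConfigurationID>.mof,
# together with its checksum file.
Copy-Item .\OSConfig.mof '\\Server\Configurations\OSConfig.a5f86baf-f17f-4778-8944-9cc99ec9f992.mof'
New-DscChecksum -Path '\\Server\Configurations' -Force

# Pull and apply on the node.
Update-DscConfiguration -Wait -Verbose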

PowerShell DSC client can't register with pull server

For the past few days, I have been trying to create a development/test environment where I can automate deployments with DSC.
I have been using WMF 5.1.
The pull server has been set up using the example Sample_xDscWebServiceRegistrationWithSecurityBestPractices
from xPSDesiredStateConfiguration 5.1.0.0:
configuration Sample_xDscWebServiceRegistrationWithSecurityBestPractices
{
    param
    (
        [string[]]$NodeName = 'CORE-O-DSCPull.CORE.local',

        [ValidateNotNullOrEmpty()]
        [string] $certificateThumbPrint,

        [Parameter(HelpMessage='This should be a string with enough entropy (randomness) to protect the registration of clients to the pull server. We will use new GUID by default.')]
        [ValidateNotNullOrEmpty()]
        [string] $RegistrationKey # A guid that clients use to initiate conversation with pull server
    )

    Import-DSCResource -ModuleName xPSDesiredStateConfiguration -ModuleVersion '5.1.0.0'

    Node $NodeName
    {
        WindowsFeature DSCServiceFeature
        {
            Ensure = "Present"
            Name = "DSC-Service"
        }

        xDscWebService PSDSCPullServer
        {
            Ensure = "Present"
            EndpointName = "PSDSCPullServer"
            Port = 8080
            PhysicalPath = "$env:SystemDrive\inetpub\wwwroot\PSDSCPullServer"
            CertificateThumbPrint = $certificateThumbPrint
            ModulePath = "$env:PROGRAMFILES\WindowsPowerShell\DscService\Modules"
            ConfigurationPath = "$env:PROGRAMFILES\WindowsPowerShell\DscService\Configuration"
            State = "Started"
            DependsOn = "[WindowsFeature]DSCServiceFeature"
            RegistrationKeyPath = "$env:PROGRAMFILES\WindowsPowerShell\DscService"
            AcceptSelfSignedCertificates = $true
            UseSecurityBestPractices = $true
        }

        File RegistrationKeyFile
        {
            Ensure = 'Present'
            Type = 'File'
            DestinationPath = "$env:ProgramFiles\WindowsPowerShell\DscService\RegistrationKeys.txt"
            Contents = $RegistrationKey
        }
    }
}
I apply the MOF file to my pull server without issues. I create a meta MOF using the same example:
[DSCLocalConfigurationManager()]
configuration Sample_MetaConfigurationToRegisterWithSecurePullServer
{
    param
    (
        [ValidateNotNullOrEmpty()]
        [string] $NodeName = 'CORE-O-DSCPull.CORE.local',

        [ValidateNotNullOrEmpty()]
        [string] $RegistrationKey, #same as the one used to setup pull server in previous configuration

        [ValidateNotNullOrEmpty()]
        [string] $ServerName = 'CORE-O-DSCPull.CORE.local' #node name of the pull server, same as $NodeName used in previous configuration
    )

    Node $NodeName
    {
        Settings
        {
            RefreshMode = 'Pull'
        }

        ConfigurationRepositoryWeb CORE-O_PullSrv
        {
            ServerURL = "https://$ServerName`:8080/PSDSCPullServer.svc" # notice it is https
            RegistrationKey = $RegistrationKey
            ConfigurationNames = @('Basic')
        }
    }
}
I apply the LCM settings to my pull-server without a problem.
I can create a simple basic.mof and use DSC to apply it. All this works fine.
Next, I create another meta.mof file for another node to let it register to my pull-server. I use the same configuration as above except for the nodename, which I change to the name of the other node. I use the command:
Set-DscLocalConfigurationManager -ComputerName <nodename> -path <pathtonewmetamof>
This command works correctly. That machine can then use DSC to apply the same basic.mof without problems.
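For context, getting that basic.mof onto the pull server so the nodes can pull it by name looks roughly like this (a sketch; the paths come from the pull server configuration above, and the node placeholder is kept as-is):
# Publish the configuration under the name registered in ConfigurationNames ('Basic')
# and generate the checksum the pull server requires.
Copy-Item .\Basic.mof "$env:ProgramFiles\WindowsPowerShell\DscService\Configuration\Basic.mof"
New-DscChecksum -Path "$env:ProgramFiles\WindowsPowerShell\DscService\Configuration" -Force

# Trigger a pull on the node.
Update-DscConfiguration -ComputerName <nodename> -Wait -Verbose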
Here comes the problem:
I restart my pull server and node, create a new basic.mof, and try to apply it to both my machines. This procedure works fine on the pull server itself, but my node can no longer apply the basic.mof, because it will no longer register with my pull server. I have replicated this many times, installing both machines from scratch and configuring them. Every time I restart my machines, registration stops working. See the error below:
Registration of the Dsc Agent with the server https://CORE-O-DSCPull.CORE.local:8080/PSDSCPullServer.svc failed. The underlying error is: Failed to register Dsc
Agent with AgentId 1FE837AA-C774-11E6-80B5-9830B2A0FAC0 with the server
https://core-o-dscpull.core.local:8080/PSDSCPullServer.svc/Nodes(AgentId='1FE837AA-C774-11E6-80B5-9830B2A0FAC0'). .
+ CategoryInfo : InvalidResult: (root/Microsoft/...gurationManager:String) [], CimException
+ FullyQualifiedErrorId : RegisterDscAgentCommandFailed,Microsoft.PowerShell.DesiredStateConfiguration.Commands.RegisterDscAgentCommand
+ PSComputerName : CORE-O-DC.CORE.local
So, my problem is that registration seems to work fine until I reboot the pull server. Does anyone have any idea what can cause this issue?
For those wondering if I managed to fix this: yes, I did.
It appears to be a bug in WMF 5.0: I was only using WMF 5.1 on the pull server, not on the node. So I had to update the node as well, and now it is working.
As explained in this blog entry, the low-level problem is that WMF 5.0 uses TLS 1.0 to communicate with the server, while WMF 5.1 no longer supports TLS 1.0.
In the aforementioned entry you will find two solutions: one that implies upgrading WMF on each and every one of the nodes, and another that allows less secure connections by modifying the registry on the server.
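For completeness, the server-side workaround is a registry change; one possible shape of it is sketched below (this is an assumption on my part, re-enabling TLS 1.0 in SCHANNEL on the pull server, and the linked blog entry has the authoritative keys):
# Re-enable TLS 1.0 on the server side so WMF 5.0 clients can still register.
$key = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server'
New-Item -Path $key -Force | Out-Null
New-ItemProperty -Path $key -Name 'Enabled' -Value 1 -PropertyType DWord -Force | Out-Null
New-ItemProperty -Path $key -Name 'DisabledByDefault' -Value 0 -PropertyType DWord -Force | Out-Null
# SCHANNEL changes generally require a reboot to take effect.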

Ability to set CertificateID for LCM with February PowerShell 5

I'm trying to update my DSC deployment to use partial configurations to break up the configuration. For that I now need to use a pull process instead of push.
When I try to apply the configuration for the LCM, which looks something like:
[DscLocalConfigurationManager()]
Configuration CreateGESService
{
    param(
        [Parameter(Mandatory=$true)]
        [ValidateNotNullorEmpty()]
        [PsCredential] $InstallCredential,

        [Parameter(Mandatory=$true)]
        [ValidateNotNullorEmpty()]
        [PsCredential] $RunCredential
    )

    Node $AllNodes.NodeName
    {
        $hostVersion = (get-host).Version

        # the possible values for DebugMode changed in the February build
        if (($hostVersion.Major -ge 5) -and ($hostVersion.Minor -ge 0) -and ($hostVersion.Build -ge 9842)){
            $debugMode = 'All'
        }
        else{
            $debugMode = $true
        }

        #setup the localConfigManager
        Settings
        {
            #CertificateID = $node.Thumbprint
            # slower performance - and only available WMF5
            # now we need to kill the dsc
            DebugMode = $debugMode
            ConfigurationMode = 'ApplyAndAutoCorrect'
            ConfigurationModeFrequencyMins = '15'
            AllowModuleOverwrite = $true
            RefreshMode = 'Push'
            ConfigurationID = $node.ConfigurationID
        }

        PartialConfiguration GetEventStoreConfiguration {
            Description = "Contains the stuff for GetEventStore Being Installed"
            ConfigurationSource = "[ConfigurationRepositoryShare]ConfigSource"
            RefreshMode = "Pull"
        }

        PartialConfiguration ExternalIntegrationConfiguration {
            Description = "Contains the stuff for External Integration"
            ConfigurationSource = "[ConfigurationRepositoryShare]ConfigSource"
            DependsOn = '[PartialConfiguration]GetEventStoreConfiguration'
            RefreshMode = "Pull"
        }

        PartialConfiguration ServeGroupSpike {
            Description = "Contains the stuff for External Integration"
            ConfigurationSource = "[ConfigurationRepositoryShare]ConfigSource"
            DependsOn = '[PartialConfiguration]ExternalIntegrationConfiguration'
            RefreshMode = "Pull"
        }

        ConfigurationRepositoryShare ConfigSource {
            SourcePath = "\\someServer\Shared\dscService\Configuration"
            Credential = $InstallCredential
        }

        ResourceRepositoryShare ResourceSource {
            SourcePath = "\\someServer\Shared\dscService\Resources"
            Credential = $InstallCredential
        }
    }
}
If I try to include the CertificateID I get an error like:
The property CertificateID of metaconfiguration is not compatible with the current version 2.0.0 of the configuration
document. This property only works with version greater than or equal to 1.0.0 . In case the version is greater, then
the property MinimumCompatibleVersion should be set to atleast 1.0.0 . Set these properties in the
OMI_ConfigurationDocument instance in the document and try again.
+ CategoryInfo : InvalidArgument: (root/Microsoft/...gurationManager:String) [], CimException
+ FullyQualifiedErrorId : MI RESULT 4
+ PSComputerName : SGSpike-Main
Naturally, when the configuration is applied, it can't decrypt the credentials passed, and I get an error in the Event Viewer like:
Job {B37D5239-EDBA-11E4-80C2-00155D9ACA1F} :
WarningMessage An error occured while applying the partial configuration [PartialConfiguration]ExternalIntegrationConfiguration. The error message is :
The Local Configuration Manager is not configured with a certificate. Resource '[File]GpgProgram' in configuration 'ExternalIntegrationConfiguration' cannot be processed..
Any ideas how to do this? I had this working with the certificateID when I was using a single configuration in a push model.
Even in the April 2015 drop the problem still seems to exist. Further diagnosis shows that you can either:
- not use partial configurations, or
- not use a certificate to encrypt credentials.
I opened an issue on Connect (with some more details) at https://connect.microsoft.com/PowerShell/Feedback/Details/1292678