PowerShell Import-GPO: Operation not valid

Afternoon everyone. I'm running into an issue I'm not sure how to handle. I'm working on a script for work to deploy a Domain Controller using PSRemoting. It all works fine until I get to the point where I'm importing some GPOs from backups.
*All the commands are run under Invoke-Command
I run the command Import-GPO -BackUpName $GPO -TargetName $GPO -Path $GPOPath -MigrationTable $MigTable -CreateIfNeeded
When I run this, I get an error on the host:
Operation is not valid due to the current state of the object.
+ CategoryInfo : NotSpecified: (:) [Import-GPO], InvalidOperationException
+ FullyQualifiedErrorId : System.InvalidOperationException,Microsoft.GroupPolicy.Commands.ImportGpoCommand
+ PSComputerName : v204-DC1
I can't seem to find anything that says what this means. When I check for the GPOs on the DC, they all show up and seem to be linked properly. I am curious what this error means, or whether I should just append -ErrorAction SilentlyContinue to the end of my code.

It ended up being an issue with my migration table. I rolled back to an earlier version and was good to go.
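For anyone hitting the same thing, here is a minimal sketch of how the import step can be run under Invoke-Command with the error surfaced rather than silenced. The GPO name, backup path, and migration table path below are hypothetical placeholders (the server name is from the error output above), and I'm using the documented -BackupGpoName parameter name:

Invoke-Command -ComputerName 'v204-DC1' -ScriptBlock {
    param($GpoName, $BackupPath, $MigTable)
    try {
        # -ErrorAction Stop turns the failure into a catchable exception
        Import-GPO -BackupGpoName $GpoName -TargetName $GpoName `
            -Path $BackupPath -MigrationTable $MigTable `
            -CreateIfNeeded -ErrorAction Stop
    }
    catch {
        Write-Error "Import of '$GpoName' failed: $_"
    }
} -ArgumentList 'Example GPO', 'C:\GPOBackups', 'C:\GPOBackups\gpos.migtable'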

Related

Posh-SSH is giving an error when in WorkFlow

I wrote a script to export rows from a SQL DB, encrypt them using PGP, then transfer them using Posh-SSH v2.3.0. It all works fine until I put it in a PowerShell workflow to run multiple exports at a time.
This particular line is the issue:
Set-SFTPFile -SessionId $sftp.SessionId -LocalFile "$encrypted\$($file.Name).pgp" -RemotePath ".\path"
When running that line, I am getting the below error. If I comment out that line, the error goes away and the script runs fine (minus actually transferring the file).
Microsoft.PowerShell.Utility\Write-Error : Object reference not set to an instance of an object.
At StartExport:20 char:20
+
+ CategoryInfo : NotSpecified: (:) [Write-Error], RemoteException
+ FullyQualifiedErrorId : System.Management.Automation.RemoteException,Microsoft.PowerShell.Commands.WriteErrorCommand
+ PSComputerName : [localhost]
To me, it looks like there is something in Posh-SSH that is not compatible with PowerShell workflows. Has anyone had any experience with this? I would try upgrading to Posh-SSH v3.0, but it is on a pretty locked-down server that runs a lot of automation, and the upgrade would require me to install a new version of .NET, which could break many things.
Edit: As a test, I installed Posh-SSH v2.3.0 on my laptop and got the same error. Then I updated Posh-SSH to version 3.0.0 (the latest) and I still get the same error.
Appreciate any help! I've been stuck on this one for quite a while.
"Module was not written to work in workflows. They are not supported at this time."
-https://github.com/darkoperator/Posh-SSH/issues/173
Not the answer I was hoping for, but now I understand. Still open to any workarounds, though.
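One workaround worth sketching (untested in your environment; the host name and remote path are placeholders): drop the workflow and fan the transfers out with background jobs instead, since Posh-SSH works fine in ordinary runspaces:

$files = Get-ChildItem -Path $encrypted -Filter '*.pgp'
$jobs = foreach ($file in $files) {
    Start-Job -ScriptBlock {
        param($LocalFile, $Credential)
        Import-Module Posh-SSH
        # Each job opens its own SFTP session and tears it down afterwards
        $session = New-SFTPSession -ComputerName 'sftp.example.com' -Credential $Credential
        Set-SFTPFile -SessionId $session.SessionId -LocalFile $LocalFile -RemotePath '.\path'
        Remove-SFTPSession -SessionId $session.SessionId | Out-Null
    } -ArgumentList $file.FullName, $cred
}
$jobs | Wait-Job | Receive-Job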

PowerShell DSC: The data source could not process the filter

Afternoon!
I have run into an issue with PowerShell DSC (the Start-DscConfiguration cmdlet specifically). I've spent 2 days trying to figure it out, and now I am here :)
I am getting the following error for a specific MOF file, prior MOFs run fine.
The data source could not process the filter. The filter might be missing or it might be invalid. Change the filter
and try the request again.
+ CategoryInfo : InvalidArgument: (root/Microsoft/...gurationManager:String) [], CimException
+ FullyQualifiedErrorId : HRESULT 0x8033801a
EDIT: Has anyone experienced this error before? I can share the MOF file in question, with some restricted info removed.
Thank you
The issue here wasn't with PowerShell DSC, but rather with PowerShell itself.
I had a cmdlet within a switch block. As an example:
Switch ($item) {
    { $_ -eq $true }  { <# do something #> }
    { $_ -eq $false } { <# do something else #> }
    # stray cmdlet call sitting directly inside the switch body:
    Get-Service -Name $item
}
It was the Get-Service cmdlet that could not be filtered.
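For comparison, here's how the corrected logic looks with the cmdlet pulled out of the switch body (the actions are hypothetical placeholders):

# Cmdlet calls belong outside the switch; each switch entry should be a
# condition followed by an action block.
$service = Get-Service -Name $item
Switch ($item) {
    { $_ -eq $true }  { Write-Output 'handle the true case' }
    { $_ -eq $false } { Write-Output 'handle the false case' }
}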

PowerShell Exception: Not enough quota is available to process this command

I am running a simple cmdlet with PowerShell 5.1 on Windows version 1803.
I am running:
Rename-Computer -ComputerName $pc -NewName $newName -DomainCredential $cred -Restart
This command is part of a script. Everything in the script works - and this line used to work, but for some reason it stopped.
The exception is:
Fail to rename computer '$pc' to '$newName' due to the following
exception: Not enough quota is available to process this command.
There are no memory or page-file size issues. There are no other applications even running. Page file size is in excess of 8GB.
WinRM is running, stack 3.0.
Why on earth is this command producing this error? How might I troubleshoot it?
UPDATE
This is what I am seeing...
Rename-Computer : Fail to rename computer 'W4000100' to 'W1401-TR100'
due to the following exception: Not enough quota is available to
process this command. At line:2 char:9
+ Rename-Computer -ComputerName $pc -NewName $newName -DomainCr ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (W4000100:String) [Rename-Computer], InvalidOperationException
+ FullyQualifiedErrorId : FailToRenameComputer,Microsoft.PowerShell.Commands.RenameComputerCommand
Again, this same message pops up if I try to rename the computer locally via the GUI, and it also happens if I remove the hyphen from the name, etc.
After researching, I believe the solution from the following article by Phil Coutard may solve your issue:
http://blog.coultard.com/2012/01/fix-windows-error-0x80070718-not-enough.html
This might be set on the user level. Try going to Control Panel, Sync Center, Offline Files, Manage Offline Files (left hand side), Disk Usage tab, Change Limits. It could be that your Disk Usage has a limit. Try that first and see if that fixes it.
UPDATE:
If that doesn't work, since the error is so generic, I would recommend using Microsoft's Automatic diagnostic/repair tool: https://support.microsoft.com/en-us/help/17590/automatically-diagnose-and-repair-windows-file-and-folder-problems
The answer, after troubleshooting with MS, is that the image used for these systems was domain-joined at the time of capture. This results in each machine having the same AccountDomainSID, which apparently can cause many issues - however, this is the only issue we have identified as a result.
MS has pretty much stopped researching with me; they say this is the cause and that the only fix is to remove each machine from the domain and rejoin it.
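If you want to check whether two machines share an AccountDomainSID, one approach (a sketch, assuming the built-in Administrator account with RID 500 still exists) is to read a local account's SID and strip the RID, since every local account SID is prefixed with the machine SID:

# Grab the built-in Administrator (RID 500) and drop the RID to expose
# the machine SID; run on two machines and compare the output.
$adminSid = (Get-CimInstance -ClassName Win32_UserAccount `
    -Filter "LocalAccount=TRUE AND SID LIKE '%-500'").SID
$machineSid = $adminSid -replace '-500$', ''
$machineSid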

ADAL in Azure Automation: Type not loading intermittently

I'm working on an Azure Automation script where I need to retrieve an access token to call the AAD Graph API. I wanted to use ADAL to do this so I zipped up Microsoft.IdentityModel.Clients.ActiveDirectory.dll and uploaded it as a module. When I run from the test blade, it sometimes works, and sometimes fails with this error:
New-Object : Cannot find type [Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredential]: verify that the
assembly containing this type is loaded.
At line:22 char:9
+ $cred = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredent ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidType: (:) [New-Object], PSArgumentException
+ FullyQualifiedErrorId : TypeNotFound,Microsoft.PowerShell.Commands.NewObjectCommand
When it fails I just run it again and it works. I published and scheduled this runbook as a daily job 3 days ago; so far it has failed every day with this same error.
Has anyone else seen this? Any suggestions on next steps?
I fixed this by adding the following line to my script:
Add-Type -Path "C:\Modules\User\Microsoft.IdentityModel.Clients.ActiveDirectory\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
Still not sure why it was working intermittently before. I guess that, depending on what else was going on, ADAL may or may not have already been loaded?
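In case it helps others, a hedged version of the fix that only loads the assembly when the type isn't already present ($appId and $appSecret are hypothetical placeholders for your AAD app's credentials):

# Load ADAL explicitly before first use; skip the load if the type is
# already resolvable in the current session.
$adalDll = 'C:\Modules\User\Microsoft.IdentityModel.Clients.ActiveDirectory\Microsoft.IdentityModel.Clients.ActiveDirectory.dll'
if (-not ('Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredential' -as [type])) {
    Add-Type -Path $adalDll
}
# ClientCredential takes the app (client) ID and secret as strings
$cred = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredential($appId, $appSecret)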

Remove-Item Vs [System.IO.File]::Delete()

I have the following code in an Azure Runbook:
$pathToDownloadedBlob = 'C:\depId-20150904032522\SevenZipSharp.dll'
if ((Test-Path $pathToDownloadedBlob) -eq $true)
{
    try
    {
        Remove-Item -Path $pathToDownloadedBlob
    }
    catch
    {
        Write-Error "Could not delete $pathToDownloadedBlob. - $($error[0])"
        exit
    }
}
When I use Remove-Item I get this error:
4/7/2015 2:14:14 PM, Error: Remove-Item : The converted JSON string is in bad format.
At DavidTest:45 char:45
+
+ CategoryInfo : InvalidOperation: (System.Unauthor... Boolean force):ErrorRecord) [Remove-Item],
InvalidOperationException
+ FullyQualifiedErrorId : JsonStringInBadFormat,Microsoft.PowerShell.Commands.RemoveItemCommand
When I use [System.IO.File]::Delete($using:path) instead, I get this error:
4/7/2015 2:22:48 PM, Error: Exception calling "Delete" with "1" argument(s): "Access to the path 'C:\Deployment\SevenZipSharp.dll' is denied."
At DavidTest:46 char:46
+
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : UnauthorizedAccessException
I know I don't have permission to delete the file.
However, why is it complaining about a JSON string when I use Remove-Item?
EDIT:
Note this only happens in Azure Automation. However, I wasn't truly able to replicate this in PowerShell ISE locally because I have permission to delete the files in question.
UPDATE: I just realised this is only happening for .dll files. If I try to delete a .7z file it works fine.
I would imagine that this is due to the serialization/deserialization of the object being passed between the PowerShell Workflow context and the InlineScript activity, which runs in a separate process by default.
Are you always passing in a [System.String], or are you sometimes passing in a [System.IO.FileInfo] object? If the latter, then you'll probably want to reference the FullName property, rather than passing in the object itself to Remove-Item.
I'm not 100% sure that this is what you're running into, but it's worth discussing.
By the way, as a best practice, always explicitly name your parameters, so other people understand what you're doing. Your call to Remove-Item doesn't include the -Path parameter, by name, because it's positionally at 0. Of course, this isn't a good thing to take for granted when you're asking for help. Better to be verbose.
Hope this helps at least a bit. By the way, is this problem unique to Azure Automation Runbooks, or does it also exist in locally executed PowerShell Workflows?
Edit: This code seems to work just fine for me locally.
workflow test {
    $Path = 'C:\dsc\srv01.xml';
    InlineScript { Remove-Item -Path $using:Path; };
}
test
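And to illustrate the FileInfo point above, a sketch (the workflow name is hypothetical; the folder path is from the question) that resolves Get-ChildItem output down to plain string paths before crossing the InlineScript boundary:

workflow Remove-Files {
    param([string[]]$Paths)    # plain strings serialize cleanly across the boundary
    foreach ($p in $Paths) {
        InlineScript { Remove-Item -Path $using:p; };
    }
}
# Pass the .FullName strings, not the FileInfo objects themselves:
$targets = (Get-ChildItem -Path 'C:\depId-20150904032522' -Filter '*.dll').FullName
Remove-Files -Paths $targets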