I want a PowerShell script running as the SYSTEM user to display a Windows Form in another user's session and interact with its controls.
I am trying to automate the installation/repair of Symantec Endpoint Protection with Solarwinds N-Able. This platform uses agent software installed on clients to monitor them and execute tasks.
The agent uses the NT AUTHORITY\SYSTEM account to execute tasks on the machine. The installation of SEP works fine so far, but a regular user on the machine has no control over the reboots between the uninstall/install phases. I want the currently active user to be able to control these reboot cycles, something like the Windows Update reboot prompt.
My idea is to display a Windows Form on the logged-on user's desktop with controls to execute or delay the reboot. My question is: how do I display a Windows Form defined in PowerShell in another user's session, and how do I get the control actions back into the script running as SYSTEM?
I've already tried the msg command to send a message to all the users on the system, but this is only one-way communication and isn't really meant to be used in situations like this, I guess.
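Something along these lines is what I tried (the message text and /TIME value are just placeholders); it pops up a box on the users' desktops but gives nothing back to the script:
msg * /TIME:60 "A reboot is required to finish the Symantec Endpoint Protection repair."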
I found the solution to my problem. I used the WTSSendMessage function which boxdog suggested in the comments, combined with a script that gets the session IDs of the logged-on users. I trimmed that script down to only return the "Active" user's session ID, which is then used to send the message to the user. I tested it in Solarwinds and so far it works flawlessly.
My coding skills are pretty basic, but this is the end result.
function Send-MessageBox
{
[CmdletBinding()]
[OutputType([string])]
Param
(
[Parameter(Mandatory=$true, Position=0)]
[string]$title,
[Parameter(Mandatory=$true, Position=1)]
[string]$message,
[Parameter(Mandatory=$true, Position=2)]
[int]$duration,
[Parameter(Mandatory=$true, Position=3)]
[int]$style
)
Begin
{
$typeDefinition = #"
using System;
using System.Runtime.InteropServices;
public class WTSMessage {
[DllImport("wtsapi32.dll", SetLastError = true)]
public static extern bool WTSSendMessage(
IntPtr hServer,
[MarshalAs(UnmanagedType.I4)] int SessionId,
String pTitle,
[MarshalAs(UnmanagedType.U4)] int TitleLength,
String pMessage,
[MarshalAs(UnmanagedType.U4)] int MessageLength,
[MarshalAs(UnmanagedType.U4)] int Style,
[MarshalAs(UnmanagedType.U4)] int Timeout,
[MarshalAs(UnmanagedType.U4)] out int pResponse,
bool bWait
);
static int response = 0;
public static int SendMessage(int SessionID, String Title, String Message, int Timeout, int MessageBoxType) {
WTSSendMessage(IntPtr.Zero, SessionID, Title, Title.Length, Message, Message.Length, MessageBoxType, Timeout, out response, true);
return response;
}
}
"#
}
Process
{
if (-not ([System.Management.Automation.PSTypeName]'WTSMessage').Type)
{
Add-Type -TypeDefinition $typeDefinition
}
# Parse the quser output into objects (columns are separated by two or more spaces)
$RawOutput = (quser) -replace '\s{2,}', ',' | ConvertFrom-Csv
$sessionID = $null
Foreach ($session in $RawOutput) {
# For sessions without a session name (e.g. disconnected sessions), quser omits the SESSIONNAME column,
# so every value shifts one column to the left: the ID shows up under SESSIONNAME and the STATE under ID.
if(($session.SESSIONNAME -notlike "console") -AND ($session.SESSIONNAME -notlike "rdp-tcp*")) {
if($session.ID -eq "Active"){
$sessionID = $session.SESSIONNAME
}
}else{
# Normal case: STATE holds the session state and ID holds the session ID
if($session.STATE -eq "Active"){
$sessionID = $session.ID
}
}
}
$response = [WTSMessage]::SendMessage($sessionID, $title, $message, $duration, $style )
}
End
{
Return $response
}
}
Send-MessageBox -title "Title" -message "Message" -duration 60 -style 0x00001034L
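For reference, a hedged sketch of how the returned button code could drive the reboot cycle (6 = Yes, 7 = No, 32000 = timed out, per the documented MessageBox return values; titles and messages are placeholders):
$answer = Send-MessageBox -title "Reboot required" -message "SEP installation needs a reboot. Reboot now?" -duration 60 -style 0x00001034
switch ($answer) {
    6       { Restart-Computer -Force }                       # user clicked Yes
    7       { Write-Output 'User postponed the reboot.' }     # user clicked No
    32000   { Write-Output 'No answer within 60 seconds.' }   # dialog timed out
    default { Write-Output "Unexpected response: $answer" }
}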
I am trying to use PowerShell (auditpol) to query the security setting values of the Audit Policy items. So far, with all the auditpol commands, I am only able to get the subcategory values instead.
auditpol /get /category:*
So far I could only get the list of the 9 categories, without the Success/Failure/No Auditing values, using:
auditpol /list /category
Could there be a command/flag that I might have left out for auditpol, or is there any other command for me to retrieve the policies and their relevant security setting values?
Policy and values that I would like to query.
As you've found, auditpol only manages the settings that are in effect when the "Advanced Audit Policy Configuration" feature is enabled.
To query the "classic" audit policy, you will need to use the LSA Policy Win32 API to:
Open the local security policy using LsaOpenPolicy()
Query the audit settings using LsaQueryPolicyInformation()
Translate the results to something readable.
The following example uses Add-Type to compile a C# type that in turn does all of the above:
$AuditPolicyReader = Add-Type -TypeDefinition @'
using System;
using System.Runtime.InteropServices;
using System.Text;
using System.Linq;
using System.Collections.Generic;
public class AuditPolicyReader
{
[Flags()]
public enum AuditPolicySetting
{
Unknown = -1,
None = 0x0,
Success = 0x1,
Failure = 0x2
}
[StructLayout(LayoutKind.Sequential)]
private struct LSA_UNICODE_STRING
{
public UInt16 Length;
public UInt16 MaximumLength;
public IntPtr Buffer;
}
[StructLayout(LayoutKind.Sequential)]
private struct LSA_OBJECT_ATTRIBUTES
{
public int Length;
public IntPtr RootDirectory;
public LSA_UNICODE_STRING ObjectName;
public UInt32 Attributes;
public IntPtr SecurityDescriptor;
public IntPtr SecurityQualityOfService;
}
public struct POLICY_AUDIT_EVENTS_INFO
{
public bool AuditingMode;
public IntPtr EventAuditingOptions;
public Int32 MaximumAuditEventCount;
}
[DllImport("advapi32.dll")]
static extern uint LsaQueryInformationPolicy(IntPtr PolicyHandle, uint InformationClass, out IntPtr Buffer);
[DllImport("advapi32.dll", SetLastError = true, PreserveSig = true)]
static extern uint LsaOpenPolicy(ref LSA_UNICODE_STRING SystemName, ref LSA_OBJECT_ATTRIBUTES ObjectAttributes, uint DesiredAccess, out IntPtr PolicyHandle);
[DllImport("advapi32.dll", SetLastError = true)]
static extern uint LsaClose(IntPtr ObjectHandle);
public static Dictionary<string, AuditPolicySetting> GetClassicAuditPolicy()
{
// Create dictionary to hold the audit policy settings (the key order here is important!!!)
var settings = new Dictionary<string, AuditPolicySetting>
{
{"System", AuditPolicySetting.Unknown},
{"Logon", AuditPolicySetting.Unknown},
{"Object Access", AuditPolicySetting.Unknown},
{"Privilige Use", AuditPolicySetting.Unknown},
{"Detailed Tracking", AuditPolicySetting.Unknown},
{"Policy Change", AuditPolicySetting.Unknown},
{"Account Management", AuditPolicySetting.Unknown},
{"Directory Service Access", AuditPolicySetting.Unknown},
{"Account Logon", AuditPolicySetting.Unknown},
};
// Open local machine security policy
IntPtr polHandle;
LSA_OBJECT_ATTRIBUTES aObjectAttributes = new LSA_OBJECT_ATTRIBUTES();
aObjectAttributes.Length = 0;
aObjectAttributes.RootDirectory = IntPtr.Zero;
aObjectAttributes.Attributes = 0;
aObjectAttributes.SecurityDescriptor = IntPtr.Zero;
aObjectAttributes.SecurityQualityOfService = IntPtr.Zero;
var systemName = new LSA_UNICODE_STRING();
uint desiredAccess = 2; // we only need the audit policy, no need to request anything else
var res = LsaOpenPolicy(ref systemName, ref aObjectAttributes, desiredAccess, out polHandle);
if (res != 0)
{
if(res == 0xC0000022)
{
// Access denied, needs to run as admin
throw new UnauthorizedAccessException("Failed to open LSA policy because of insufficient access rights");
}
throw new Exception(string.Format("Failed to open LSA policy with return code '0x{0:X8}'", res));
}
try
{
// now that we have a valid policy handle, we can query the settings of the audit policy
IntPtr outBuffer;
uint policyType = 2; // this will return information about the audit settings
res = LsaQueryInformationPolicy(polHandle, policyType, out outBuffer);
if (res != 0)
{
throw new Exception(string.Format("Failed to query LSA policy information with '0x{0:X8}'", res));
}
// copy the raw values returned by LsaQueryPolicyInformation() to a local array;
var auditEventsInfo = Marshal.PtrToStructure<POLICY_AUDIT_EVENTS_INFO>(outBuffer);
var values = new int[auditEventsInfo.MaximumAuditEventCount];
Marshal.Copy(auditEventsInfo.EventAuditingOptions, values, 0, auditEventsInfo.MaximumAuditEventCount);
// now we just need to translate the provided values into our settings dictionary
var categoryIndex = settings.Keys.ToArray();
for (int i = 0; i < values.Length; i++)
{
settings[categoryIndex[i]] = (AuditPolicySetting)values[i];
}
return settings;
}
finally
{
// remember to release policy handle
LsaClose(polHandle);
}
}
}
'@ -PassThru | Where-Object Name -eq AuditPolicyReader
Now we can call GetClassicAuditPolicy() (remember to run this from an elevated prompt):
PS ~> $AuditPolicyReader::GetClassicAuditPolicy()
Key Value
--- -----
System None
Logon Success, Failure
Object Access None
Privilege Use None
Detailed Tracking None
Policy Change Success
Account Management Success, Failure
Directory Service Access None
Account Logon None
auditpol only returns the Advanced audit policy configuration. These settings can be found in the UI under Security Settings > Advanced Audit Policy Configuration > System Audit Policies
The legacy audit policies your screenshot shows were mostly done away with after Windows Server 2003/Windows Vista. Note the warnings in the policy properties and on the MS compatibility page.
For advanced policies, you can use /r to get a CSV-formatted table:
auditpol /get /category:'Account Logon' /r | ConvertFrom-Csv |
Format-Table 'Policy Target',Subcategory,'Inclusion Setting'
Policy Target Subcategory Inclusion Setting
------------- ----------- -----------------
System Kerberos Service Ticket Operations No Auditing
System Other Account Logon Events No Auditing
System Kerberos Authentication Service No Auditing
System Credential Validation No Auditing
For legacy audit policies:
secedit.exe /export /areas SECURITYPOLICY /cfg filename.txt
[Event Audit]
AuditSystemEvents = 0
AuditLogonEvents = 0
AuditObjectAccess = 0
AuditPrivilegeUse = 0
AuditPolicyChange = 0
AuditAccountManage = 0
AuditProcessTracking = 0
AuditDSAccess = 0
AuditAccountLogon = 0
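The numeric values use the same flags as auditpol: 0 = No Auditing, 1 = Success, 2 = Failure, 3 = Success and Failure. A rough sketch (assuming the export was written to C:\Temp\secpol.inf; the path and parsing are my own, not part of the original answer) to turn the [Event Audit] section into objects:
secedit.exe /export /areas SECURITYPOLICY /cfg C:\Temp\secpol.inf | Out-Null
$flagNames = @{ 0 = 'No Auditing'; 1 = 'Success'; 2 = 'Failure'; 3 = 'Success, Failure' }
$inEventAudit = $false
Get-Content C:\Temp\secpol.inf | ForEach-Object {
    if ($_ -match '^\[(.+)\]') { $inEventAudit = ($Matches[1] -eq 'Event Audit') }
    elseif ($inEventAudit -and $_ -match '^(Audit\w+)\s*=\s*(\d+)') {
        [pscustomobject]@{ Policy = $Matches[1]; Setting = $flagNames[[int]$Matches[2]] }
    }
}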
This requires that the legacy audit policy hasn't been disabled (overridden by the advanced policy). Check in the registry:
Get-ItemProperty HKLM:\System\CurrentControlSet\Control\Lsa -Name SCENoApplyLegacyAuditPolicy
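If I recall correctly, a value of 1 means the "force audit policy subcategory settings" security option is enabled and the legacy category policy is ignored; a short sketch to interpret it:
$lsa = Get-ItemProperty HKLM:\System\CurrentControlSet\Control\Lsa -Name SCENoApplyLegacyAuditPolicy -ErrorAction SilentlyContinue
if ($lsa.SCENoApplyLegacyAuditPolicy -eq 1) {
    'Advanced (subcategory) audit policy is forced; legacy category settings are ignored.'
}
else {
    'Legacy category audit policy still applies.'
}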
Here is code that gives you a list of all categories and subcategories with their current audit status. I made it a bit longer than strictly needed in order to add the local names of each object. Also see some usage samples at the end of the code.
# getting the audit policy settings for each subcategory
# works for any OS language
cls
Remove-Variable * -ea 0
$ErrorActionPreference = 'stop'
#Requires -RunAsAdministrator
$dll = [string]::Join("`r`n", '[DllImport("advapi32.dll")]', 'public static extern bool')
$auditpol = Add-Type -Name 'AuditPol' -Namespace 'Win32' -PassThru -MemberDefinition "
$dll AuditEnumerateCategories(out IntPtr catList, out uint count);
$dll AuditLookupCategoryName(Guid catGuid, out string catName);
$dll AuditEnumerateSubCategories(Guid catGuid, bool all, out IntPtr subList, out uint count);
$dll AuditLookupSubCategoryName(Guid subGuid, out String subName);
$dll AuditQuerySystemPolicy(Guid subGuid, uint count, out IntPtr policy);
$dll AuditFree(IntPtr buffer);"
Add-Type -TypeDefinition "
using System;
public struct AUDIT_POLICY_INFORMATION {
public Guid AuditSubCategoryGuid;
public UInt32 AuditingInformation;
public Guid AuditCategoryGuid;
}"
function getPolicyInfo($sub) {
# get policy info for one subcategory:
$pol = new-object AUDIT_POLICY_INFORMATION
$size = $ms::SizeOf($pol)
$ptr = $ms::AllocHGlobal($size)
$null = $ms::StructureToPtr($pol, $ptr, $false)
$null = $auditpol::AuditQuerySystemPolicy($sub, 1, [ref]$ptr)
$pol = $ms::PtrToStructure($ptr, [type][AUDIT_POLICY_INFORMATION])
$null = $ms::FreeHGlobal($ptr)
[PsCustomObject]@{
category = $pol.AuditCategoryGuid
success = [bool]($pol.AuditingInformation -band 1)
failure = [bool]($pol.AuditingInformation -band 2)
}
}
# (optional) get GUID and local name of all categories:
$ms = [System.Runtime.InteropServices.Marshal]
$count = [uint32]0
$buffer = [IntPtr]::Zero
$size = $ms::SizeOf([type][guid])
$null = $auditpol::AuditEnumerateCategories([ref]$buffer,[ref]$count)
$ptr = [int64]$buffer
$name = [System.Text.StringBuilder]::new()
$catList = @{}
foreach($id in 1..$count) {
$guid = $ms::PtrToStructure([IntPtr]$ptr,[type][guid])
$null = $auditpol::AuditLookupCategoryName($guid,[ref]$name)
$catList[$guid] = $name
$ptr += $size
}
$null = $auditpol::AuditFree($buffer)
# get all subcategories (with optional name):
$guid = [guid]::Empty
$null = $auditpol::AuditEnumerateSubCategories($guid, $true, [ref]$buffer, [ref]$count)
$ptr = [int64]$buffer
$subList = @{}
foreach($id in 1..$count) {
$guid = $ms::PtrToStructure([IntPtr]$ptr,[type][guid])
$null = $auditpol::AuditLookupSubCategoryName($guid,[ref]$name)
$pol = getPolicyInfo $guid
$data = [psCustomObject]@{
category = $catList[$pol.category]
subcategory = $name
success = $pol.success
failure = $pol.failure
}
$subList[$guid.guid] = $data
$ptr += $size
}
$null = $auditpol::AuditFree($buffer)
# listing all subCategories and their audit settings:
$subList.Values | sort category, subcategory | ft -AutoSize
# getting the audit settings for a given subcategory GUID (without '{}'):
$process_creation_guid = '0CCE922B-69AE-11D9-BED3-505054503030'
$subList[$process_creation_guid]
I have a PowerShell script that creates a scheduled task to launch the script. The idea is that some tasks in the script require a reboot. At the end of the PowerShell script, a message box should prompt the user to let them know that all the tasks are completed. What am I doing wrong?
Add-Type -AssemblyName PresentationFramework
TaskName = "Run Agents Install Script"
$TaskDescription = "Run Agents Install Script at logon"
$Action = New-ScheduledTaskAction -Execute 'Powershell.exe' `
-Argument "-executionpolicy remotesigned -File $PSScriptRoot\AgentInstall.ps1"
$Trigger = New-ScheduledTaskTrigger -AtLogOn
Register-ScheduledTask -Action $Action -Trigger $Trigger -TaskName $TaskName -Description $TaskDescription -User "System"
$MsgBoxInput = [System.Windows.MessageBox]::Show('Installation completed successfully.','Agent Install','OK')
Switch ($MsgBoxInput) {
'OK'
{
$MsgBoxInput = [System.Windows.MessageBox]::Show('WARNING! Please install Imprivata agent manually if applicable.','Agent Install','OK')
}
}
One option is to use the Terminal Services API to send a message to the console. Unfortunately, it is a native API, so you need to use .NET interop to call it, but in this case it isn't too tricky:
$typeDefinition = #"
using System;
using System.Runtime.InteropServices;
public class WTSMessage {
[DllImport("wtsapi32.dll", SetLastError = true)]
public static extern bool WTSSendMessage(
IntPtr hServer,
[MarshalAs(UnmanagedType.I4)] int SessionId,
String pTitle,
[MarshalAs(UnmanagedType.U4)] int TitleLength,
String pMessage,
[MarshalAs(UnmanagedType.U4)] int MessageLength,
[MarshalAs(UnmanagedType.U4)] int Style,
[MarshalAs(UnmanagedType.U4)] int Timeout,
[MarshalAs(UnmanagedType.U4)] out int pResponse,
bool bWait
);
static int response = 0;
public static int SendMessage(int SessionID, String Title, String Message, int Timeout, int MessageBoxType) {
WTSSendMessage(IntPtr.Zero, SessionID, Title, Title.Length, Message, Message.Length, MessageBoxType, Timeout, out response, true);
return response;
}
}
"#
Add-Type -TypeDefinition $typeDefinition
[WTSMessage]::SendMessage(1, "Message Title", "Message body", 30, 36)
This is essentially a thin wrapper to the WTSSendMessage function.
You will need to get the SessionID via some tool like query. This script might help with that: Get-UserSession.
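A rough sketch of one way to get the active session ID (assumes English quser output; note that the columns shift for sessions without a session name, which the Send-MessageBox function earlier on this page works around):
$activeSession = (quser) -replace '\s{2,}', ',' | ConvertFrom-Csv | Where-Object STATE -eq 'Active' | Select-Object -First 1
$sessionID = [int]$activeSession.ID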
The TimeOut value here is 30, which means the pop-up will wait 30 seconds before returning with a value of '32000'. Set to '0' to wait forever.
The MessageBoxType is a combination of the uType values documented for the MessageBox function. The '36' in the example combines 'MB_YESNO' and 'MB_ICONQUESTION', so it will show a message with a question mark icon and Yes/No buttons. Note that the documentation gives the values in hexadecimal, so you'll need to convert them.
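For instance, the 36 used above can be built from the documented flags, and the return value compared against the standard button codes (6 = Yes, 7 = No); the title and message below are placeholders:
$style  = 0x00000004 -bor 0x00000020    # MB_YESNO (0x4) | MB_ICONQUESTION (0x20) = 36
$answer = [WTSMessage]::SendMessage(1, "Message Title", "Reboot now?", 30, $style)
if ($answer -eq 6) { 'User clicked Yes' } elseif ($answer -eq 7) { 'User clicked No' } else { "Timed out or unexpected: $answer" }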
I tested this as a scheduled task running as an admin and it was able to show a message on the desktop of a different logged-on user. Hope it helps.
Recently I encountered some problems connecting to an FTP server: a popup asks me to accept the certificate.
I don't know how to handle this from PowerShell when invoking $ftpRequest.GetResponse(). I found some solutions that override the certificate validation callback, like this one: [System.Net.ServicePointManager]::ServerCertificateValidationCallback
The solution is given in C# and I don't know how to port it to PowerShell yet.
My code is as below
function Create-FtpDirectory {
param(
[Parameter(Mandatory=$true)]
[string]
$sourceuri,
[Parameter(Mandatory=$true)]
[string]
$username,
[Parameter(Mandatory=$true)]
[string]
$password
)
if ($sourceUri -match '\\$|\\\w+$') { throw 'sourceuri should end with a file name' }
$ftprequest = [System.Net.FtpWebRequest]::Create($sourceuri);
Write-Information -MessageData "Create folder to store backup (Get-FolderName -Path $global:backupFolder)"
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
$ftprequest.UseBinary = $true
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$ftprequest.EnableSsl = $true
$response = $ftprequest.GetResponse();
Write-Host "Folder created successfully, status $response.StatusDescription"
$response.Close();
}
[UPDATED] While searching for Invoke-RestRequest, I found this solution from a Microsoft example.
Caution: this actually accepts ANY certificate.
# Next, allow the use of self-signed SSL certificates.
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $True }
More information (thanks to @Nimral): https://learn.microsoft.com/en-us/dotnet/api/system.net.servicepointmanager.servercertificatevalidationcallback?view=netcore-3.1
It's a bit hacky, but you can use raw C# in PowerShell via Add-Type. Here's an example class I've used to be able to toggle certificate validation in the current PowerShell session.
if (-not ([System.Management.Automation.PSTypeName]'CertValidation').Type)
{
Add-Type #"
using System.Net;
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;
public class CertValidation
{
static bool IgnoreValidation(object o, X509Certificate c, X509Chain ch, SslPolicyErrors e) {
return true;
}
public static void Ignore() {
ServicePointManager.ServerCertificateValidationCallback = IgnoreValidation;
}
public static void Restore() {
ServicePointManager.ServerCertificateValidationCallback = null;
}
}
"#
}
Then you can use it prior to calling your function like this.
[CertValidation]::Ignore()
And later, restore default cert validation like this.
[CertValidation]::Restore()
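Putting it together around the Create-FtpDirectory function from the question (the URI and credentials below are placeholders), a minimal sketch:
[CertValidation]::Ignore()
try {
    Create-FtpDirectory -sourceuri 'ftp://ftp.example.com/backups/newfolder' -username 'user' -password 'secret'
}
finally {
    [CertValidation]::Restore()   # restore default validation even if the request fails
}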
Keep in mind though that it's much safer to just fix your service's certificate so that validation actually succeeds. Ignoring certificate validation should be your last resort if you have no control over the environment.
I am trying to work with our Load Balancer via PowerShell 3.0 and a REST API. However, I am currently getting a failure no matter what I try if it is an https request, whether to our load balancer or to any other https site. I feel like I'm missing something obvious.
Here is the code that fails with https
try
{
#fails
#$location='https://www.bing.com'
#fails
#$location='https://www.google.com'
#fails
#$location='https://www.facebook.com'
#fails
#$location='https://www.ebay.com'
#works
#$location='http://www.bing.com'
#works
#$location='http://www.google.com'
#fails (looks like Facebook does a redirect to https://)
$location='http://www.facebook.com'
#works
#$location='http://www.ebay.com'
$response=''
$response = Invoke-WebRequest -URI $location
$response.StatusCode
$response.Headers
}
catch
{
Write-Host StatusCode $response.StatusCode
Write-Host $_.Exception
}
The error I get is:
System.Net.WebException: The underlying connection was closed: An unexpected error occurred on a send. ---> System.Management.Automation.PSInvalidOperationException:
There is no Runspace available to run scripts in this thread. You can provide one in the DefaultRunspace property of the System.Management.Automation.Runspaces.Runspa
ce type. The script block you attempted to invoke was: $true
at System.Net.TlsStream.EndWrite(IAsyncResult asyncResult)
at System.Net.ConnectStream.WriteHeadersCallback(IAsyncResult ar)
--- End of inner exception stack trace ---
at Microsoft.PowerShell.Commands.WebRequestPSCmdlet.GetResponse(WebRequest request)
at Microsoft.PowerShell.Commands.WebRequestPSCmdlet.ProcessRecord()
I was hoping this page and the suggestions towards the bottom (including the one from Aaron D.) would make a difference, but none of them did.
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
and
function Ignore-SSLCertificates
{
$Provider = New-Object Microsoft.CSharp.CSharpCodeProvider
$Compiler = $Provider.CreateCompiler()
$Params = New-Object System.CodeDom.Compiler.CompilerParameters
$Params.GenerateExecutable = $false
$Params.GenerateInMemory = $true
$Params.IncludeDebugInformation = $false
$Params.ReferencedAssemblies.Add("System.DLL") > $null
$TASource=@'
namespace Local.ToolkitExtensions.Net.CertificatePolicy
{
public class TrustAll : System.Net.ICertificatePolicy
{
public bool CheckValidationResult(System.Net.ServicePoint sp,System.Security.Cryptography.X509Certificates.X509Certificate cert, System.Net.WebRequest req, int problem)
{
return true;
}
}
}
'@
$TAResults=$Provider.CompileAssemblyFromSource($Params,$TASource)
$TAAssembly=$TAResults.CompiledAssembly
## We create an instance of TrustAll and attach it to the ServicePointManager
$TrustAll = $TAAssembly.CreateInstance("Local.ToolkitExtensions.Net.CertificatePolicy.TrustAll")
[System.Net.ServicePointManager]::CertificatePolicy = $TrustAll
}
and
add-type #"
using System.Net;
using System.Security.Cryptography.X509Certificates;
public class TrustAllCertsPolicy : ICertificatePolicy {
public bool CheckValidationResult(
ServicePoint srvPoint, X509Certificate certificate,
WebRequest request, int certificateProblem) {
return true;
}
}
"#
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
I have tried switching to Invoke-RestCommand but to no avail as I get the same response.
It feels like this has to be something environmental because I can't believe the above doesn't work for anyone else, but I've tried it on a workstation and on a server with the same results (doesn't rule out environment completely but I know they were set up differently).
Any thoughts?
This worked perfectly for me. The site defaults to TLS 1.0 and apparently PS doesn't work with that. I used this line:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
My PS scripts (so far all I've tested) have worked perfectly.
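For illustration, a minimal sketch against one of the URLs from the question:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
(Invoke-WebRequest -Uri 'https://www.bing.com').StatusCode   # should print 200 once TLS 1.2 is allowed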
The answer is do not do this to solve the SSL issue:
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
If you do this, your first https request will work (it seems), but subsequent ones will not. Additionally, at that point you need to close the PowerShell ISE, reopen it, and then try again (without that line).
This is alluded to in a sentence here http://social.technet.microsoft.com/Forums/windowsserver/en-US/79958c6e-4763-4bd7-8b23-2c8dc5457131/sample-code-required-for-invokerestmethod-using-https-and-basic-authorisation?forum=winserverpowershell - "And all subsequent runs produce this error:" - but it wasn't clear how to reset it.
I too was plagued by this for a really long time. It even affected Visual Studio, as VS loaded my $PROFILE into its domain when running NuGet restore.
Seeing your comment above made me realize that I had a custom callback script block because one of our vendors shipped a product with an invalid CN in its SSL cert.
Long story short, I replaced my script delegate with a compiled C# object (removing the script runspace from the equation).
(separate code block for C# highlighting)
using System.Net;
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;
public static class CustomCertificateValidationCallback {
public static void Install()
{
ServicePointManager.ServerCertificateValidationCallback += CustomCertificateValidationCallback.CheckValidationResult;
}
public static bool CheckValidationResult(
object sender,
X509Certificate certificate,
X509Chain chain,
SslPolicyErrors sslPolicyErrors)
{
// please don't do this. do some real validation with explicit exceptions.
return true;
}
}
In Powershell:
Add-Type "" # C# Code
[CustomCertificateValidationCallback]::Install()
Consolidating and condensing some of the above learnings, I have adopted the following approach:
Syntax colored and commented like the C# of yore:
// Piggyback in System.Net namespace to avoid using statement(s)
namespace System.Net
{
// Static class to make the ps call easy
// Uses a short name that is unlikely to clash with real stuff...YMMV
public static class Util
{
// Static method for a static class
public static void Init()
{
// [optionally] clear any cruft loaded into this static scope
ServicePointManager.ServerCertificateValidationCallback = null;
// Append a dangerously permissive validation callback
// using lambda syntax for brevity.
ServicePointManager.ServerCertificateValidationCallback +=
(sender, cert, chain, errs) => true;
// Tell SPM to try protocols that have a chance
// of working against modern servers.
// Word on the street is that these will be tried from "most secure"
// to least secure. Some people add em all!
ServicePointManager.SecurityProtocol =
SecurityProtocolType.Tls |
SecurityProtocolType.Tls11 |
SecurityProtocolType.Tls12;
}
}
}
And now the real powershell highlighted version (no comments, but the same code)
Add-Type -Language CSharp #"
namespace System.Net {
public static class Util {
public static void Init() {
ServicePointManager.ServerCertificateValidationCallback = null;
ServicePointManager.ServerCertificateValidationCallback += (sender, cert, chain, errs) => true;
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
}}}"#
[System.Net.Util]::Init()
Obviously you can remove irrelevant whitespace, but you should be able to drop that into your session, and then Invoke-WebRequest at will.
Note that the
# Do not use IMHO!
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
approach seems quite incorrect for PS 5.1 (where I have tested this). Not sure where it came from, but I wish I had avoided it and saved the heartache.
The PowerShell script below works for me to test a POST web request:
add-type #"
using System.Net;
using System.Security.Cryptography.X509Certificates;
public class TrustAllCertsPolicy : ICertificatePolicy {
public bool CheckValidationResult(
ServicePoint srvPoint, X509Certificate certificate,
WebRequest request, int certificateProblem) {
return true;
}
}
"#
$AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
[System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
[System.Net.ServicePointManager]::CertificatePolicy = New-Object TrustAllCertsPolicy
$uri = "XXXX"
$person = @{grant_type = 'user_password'
username = 'XXXX'
password = 'XXX'
}
$body = (ConvertTo-Json $person)
$hdrs = @{}
$hdrs.Add("XXXX","XXXX")
Invoke-RestMethod -Uri $uri -Method Post -Body $body -ContentType 'application/json' -Headers $hdrs
I have a PowerShell module which attempts to upload a blob to Azure Storage. Everything checks out until the last line, which actually uploads the blob.
I receive the following error:
Exception calling "UploadText" with "1" argument(s):
"The specified resource does not exist."
At line:1 char:1
+ $blob.UploadText("asdasdfsdf")
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : StorageClientException
I have also tried using the overload with 3 args, but the same issue exists there as well.
Here is the module:
Function Add-BlobText
{
[CmdletBinding()]
param(
[Parameter(Mandatory = $true,Position = 0)]
[string]
$StorageAccount,
[Parameter(Mandatory = $true,Position = 1)]
[string]
$Container,
[Parameter(Mandatory = $true,Position = 2)]
[string]
$BlobName,
[Parameter(Mandatory = $true, ValueFromPipeline = $true)]
[string]
$BlobText
) #end param
Add-Type -Path "C:\Assemblies\Microsoft.WindowsAzure.StorageClient.dll"
Set-AzureSubscription -SubscriptionName "MySubName"
$secondaryKey = (Get-AzureStorageKey -StorageAccountName $storageAccount).Secondary
$creds = New-Object Microsoft.WindowsAzure.StorageCredentialsAccountAndKey($StorageAccount,$secondaryKey)
$cloudStorageAccount = New-Object Microsoft.WindowsAzure.CloudStorageAccount($creds, $true)
[Microsoft.WindowsAzure.StorageClient.CloudBlobClient]$cloudBlobClient = New-Object Microsoft.WindowsAzure.StorageClient.CloudBlobClient($cloudStorageAccount.BlobEndpoint)
[Microsoft.WindowsAzure.StorageClient.CloudBlobContainer]$blobContainer = $cloudBlobClient.GetContainerReference($Container)
[Microsoft.WindowsAzure.StorageClient.CloudBlob]$blob = $blobContainer.GetBlobReference($BlobName)
$blob.UploadText($BlobText)
} #end Function Add-BlobText
Update:
I have been able to get this working as a binary module (below). If anyone can figure out why UploadText() works within a binary module but throws an exception in a script module, please let me know.
[Cmdlet(VerbsCommon.Add, "BlobText")]
public class AddBlobText : PSCmdlet
{
[Parameter(Mandatory = true, Position = 0)]
public string StorageAccount { get; set; }
[Parameter(Mandatory = true, Position = 1)]
public string Container { get; set; }
[Parameter(Mandatory = true, Position = 2)]
public string BlobName { get; set; }
[Parameter(Mandatory = true, ValueFromPipeline = true)]
public string BlobText { get; set; }
protected override void ProcessRecord()
{
PowerShell ps = PowerShell.Create();
ps.AddScript("Set-AzureSubscription -SubscriptionName 'MySubName'");
string keyScript = "( Get-AzureStorageKey -StorageAccountName " + StorageAccount + " ).Secondary";
ps.AddScript(keyScript);
Collection<PSObject> result = ps.Invoke();
string secondaryKey = result[0].ToString();
StorageCredentialsAccountAndKey credentials = new StorageCredentialsAccountAndKey(StorageAccount, secondaryKey);
CloudStorageAccount storageAccount = new CloudStorageAccount(credentials, true);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(Container);
var blob = container.GetBlobReference(BlobName);
blob.UploadText(BlobText);
}
}
This is probably because your container does not exist. You should call CreateIfNotExist after initializing the container to make sure it exists:
[Microsoft.WindowsAzure.StorageClient.CloudBlobContainer]$blobContainer = $cloudBlobClient.GetContainerReference($Container)
$blobContainer.CreateIfNotExist() <-- Here
[Microsoft.WindowsAzure.StorageClient.CloudBlob]$blob = $blobContainer.GetBlobReference($BlobName)
$blob.UploadText($BlobText)
This error is very ambiguous and misleading, but there are instances where Azure Storage can get "confused". Looking at Sandrino's example and specifically this line,
[Microsoft.WindowsAzure.StorageClient.CloudBlob]$blob = $blobContainer.GetBlobReference($BlobName)
Sandrino's suggestion may not be your issue, but the exception you encountered will also happen when passing a URL (or other confusing key strings) to Azure Storage containers.
Unfortunately I am not a PowerShell guy, but here is a reproducing example, then the fix, in C#.
public void Save(string objId, T obj)
{
CloudBlob blob = this.container.GetBlobReference(objId); // Problematic if a URL
blob.Properties.ContentType = "application/json";
var serialized = string.Empty;
serialized = serializer.Serialize(obj);
if (this.jsonpSupport)
{
serialized = this.container.Name + "Callback(" + serialized + ")";
}
blob.UploadText(serialized);
}
Assume that this.container is a valid blob storage instance pointing to http://127.0.0.1:10000/devstoreaccount1/sggames or whatever you have for a valid container.
And objId is a key that contains a Url like https://www.google.com/accounts/o8/id?id=AItOawk4Dw9sLxSc-zmdWQHdZNcyzkTcvKUkhiE ...and yes this can happen, in my case this is an actual identity claim from Google using Azure ACS.
After the GetBlobReference call, the blob instance is corrupt and now points at a messed-up URI -> https://www.google.com/accounts/o8/id?id=AItOawk4Dw9sLxSc-zmdWQHdZNcyzkTcvKUkhiE
Unfortunately, simply calling $blobContainer.CreateIfNotExist() is not a solution and wouldn't work here. Keys that contain a URI structure will simply be re-interpreted as the blob storage location.
The workaround (other than daredev's update) would be something like this:
if (Uri.IsWellFormedUriString(claim, UriKind.Absolute) && HttpUtility.ParseQueryString(claim).Count > 0)
{
claim = HttpUtility.ParseQueryString(claim)[0];
}
Add this code within my method above to clean up any URIs, but you could use any appropriate method, like Base64-encoding URLs, if you need to maintain the full key.
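In PowerShell terms (a hedged sketch, not daredev's actual code; the key value is just an example), the Base64 idea might look like this before calling GetBlobReference:
# $blobContainer comes from the Add-BlobText function in the question.
$claim    = 'https://www.example.com/accounts/id?id=12345'
$safeName = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes($claim))
$blob     = $blobContainer.GetBlobReference($safeName)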
Before-and-after screenshots (not reproduced here) show the results as described. The bad: the URI key munges the actual storage blob location and produces the same exception daredev had. The good: with the scrubbed key (just the value from the URL's query string), the Azure Storage URI looks correct. Eureka!
Hope this helps.
This is the PowerShell script I use to upload a file to Azure Blob: Uploading to Azure Storage
$SubscriptionName = ""
$SubscriptionId = ""
$DestContainer = ""
$StorageAccountName = ""
Import-AzurePublishSettingsFile -PublishSettingsFile "<Location of the publishsettings-file>"
Set-AzureSubscription -SubscriptionId $SubscriptionId -CurrentStorageAccountName $StorageAccountName
Select-AzureSubscription -SubscriptionName $SubscriptionName
Set-AzureStorageBlobContent -File "<File you want to upload>" -Container $DestContainer