How to establish and enter a remote session using a runspace in PowerShell

I am trying to establish a session with a remote host B from my machine A, from within C# code. I am using the Runspace API for that. The code snippet is provided below:
Runspace runspace = RunspaceFactory.CreateRunspace();
runspace.Open();
//constructing the vmname parameter here
vmname = useralias + DateTime.Now.ToString();
Pipeline pipeline = runspace.CreatePipeline();
string scripttext = "$secpasswd = ConvertTo-SecureString '222_bbbb' -AsPlainText -Force";
string scripttext1 = "$mycreds = New-object -typename System.Management.Automation.PSCredential('TS-TEST-09\\Administrator',$secpasswd)";
string scripttext2 = "$s = New-PSSession -ComputerName TS-TEST-09 -Credential $mycreds";
//not accepting session string here, only computername acceptable
string scripttext3 = "Enter-PSSession -Session $s";
//Command cmd = new Command(#"C:\mypath\helper.ps1", true);
//cmd.Parameters.Add("local_useralias", useralias);
//cmd.Parameters.Add("local_vmname", vmname);
//cmd.Parameters.Add("local_datastore", datastoredropdown.Text.ToString());
//cmd.Parameters.Add("local_architecture", architecturedropdown.Text.ToString());
pipeline.Commands.AddScript(scripttext);
pipeline.Commands.AddScript(scripttext1);
pipeline.Commands.AddScript(scripttext2);
pipeline.Commands.AddScript(scripttext3);
//pipeline.Commands.Add(cmd);
Collection<PSObject> results = pipeline.Invoke();
runspace.Close();
This code is expected to enter a session with machine TS-TEST-09 and invoke the script helper.ps1 that exists on that machine (that part is currently commented out, as I am not able to enter the session with the remote host).
Now the problem is that I can't enter the session $s using the -Session parameter (scripttext3 above); however, I can enter one using the -ComputerName parameter.
The error that I get when using the -Session parameter in scripttext3 is:
at System.Management.Automation.Internal.Host.InternalHost.GetIHostSupportsInteractiveSession()
at System.Management.Automation.Internal.Host.InternalHost.PushRunspace(Runspace runspace)
at Microsoft.PowerShell.Commands.EnterPSSessionCommand.ProcessRecord()
at System.Management.Automation.Cmdlet.DoProcessRecord()
at System.Management.Automation.CommandProcessor.ProcessRecord()
--- End of inner exception stack trace ---
Does it mean I have to write a custom PSHost and add support for the Enter-PSSession cmdlet with this parameter?
Is there any alternative to make this command work?
Any help will be much appreciated.
Thanks,
Manish

The easiest way to open a remote session goes something like this:
string shell = "http://schemas.microsoft.com/powershell/Microsoft.PowerShell";
var target = new Uri("http://myserver/wsman");
var secured = new SecureString();
foreach (char letter in "mypassword")
{
secured.AppendChar(letter);
}
secured.MakeReadOnly();
var credential = new PSCredential("username", secured);
var connectionInfo = new WSManConnectionInfo(target, shell, credential);
Runspace remote = RunspaceFactory.CreateRunspace(connectionInfo);
remote.Open();
using (var ps = PowerShell.Create())
{
ps.Runspace = remote;
ps.Commands.AddScript("'This is running on {0}.' -f (hostname)"); // AddScript, since this is script text rather than a command name
Collection<PSObject> output = ps.Invoke();
}
You could also create remote pipelines from the remote runspace instance, but the new PowerShell object is a much more manageable way to do this (since PowerShell v2).
In PowerShell v3, you can just new up a WSManConnectionInfo and set the ComputerName property, as the other properties adopt the same defaults as above. Unfortunately these properties are read-only in v2, so you have to pass in the minimum as above. Other variants of the constructor will let you use Kerberos/Negotiate/CredSSP etc. for authentication.
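For example, a minimal sketch of that v3 shortcut, exercised here from PowerShell against the same .NET types (the server name and the credential prompt are placeholders):
$connectionInfo = New-Object System.Management.Automation.Runspaces.WSManConnectionInfo
$connectionInfo.ComputerName = 'myserver'   # settable in v3+; the remaining properties keep their defaults
$connectionInfo.Credential = Get-Credential
$remote = [System.Management.Automation.Runspaces.RunspaceFactory]::CreateRunspace($connectionInfo)
$remote.Open()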
-Oisin

I wanted to leave a comment on the solution, as it helped me a lot as well,
but one thing I was missing was the port for WSMan, which is 5985.
So this
var target = new Uri("http://myserver/wsman");
should be
var target = new Uri("http://myserver:5985/wsman");
in my case.

Have you tried Invoke-Command? Enter-PSSession is intended for interactive use (see help Enter-PSSession -Online).
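A minimal sketch of that approach using the host, credentials and script path from the question (the helper.ps1 parameter names are taken from the commented-out code, so treat them as assumptions):
$secpasswd = ConvertTo-SecureString '222_bbbb' -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential('TS-TEST-09\Administrator', $secpasswd)
$s = New-PSSession -ComputerName TS-TEST-09 -Credential $mycreds
# Run the script that already exists on the remote machine, passing the values built in your code.
Invoke-Command -Session $s -ScriptBlock {
    param($useralias, $vmname)
    & 'C:\mypath\helper.ps1' -local_useralias $useralias -local_vmname $vmname
} -ArgumentList $useralias, $vmname
Remove-PSSession $s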

Related

EWS and AutodiscoverUrl error using Azure AD Certificate with PowerShell

I've tried with and without Secret ID, and now with a self-signed Certificate and I keep getting the same error:
Exception calling "AutodiscoverUrl" with "2" argument(s): "The
expected XML node type was XmlDeclaration, but the actual type is
Element."
My PowerShell script:
$TenantId = "blahblah"
$AppClientId="blahblah"
$EDIcertThumbPrint = "blahblah"
$EDIcert = get-childitem Cert:\CurrentUser\My\$EDIcertThumbPrint
$MsalParams = @{
ClientId = $AppClientId
TenantId = $TenantId
ClientCertificate = $EDIcert
Scopes = "https://outlook.office.com/.default"
}
$MsalResponse = Get-MsalToken @MsalParams
$EWSAccessToken = $MsalResponse.AccessToken
Import-Module 'C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll'
#Provide the mailbox id (email address) to connect via AutoDiscover
$MailboxName = "email@myemaildomain.com.au"
$ews = [Microsoft.Exchange.WebServices.Data.ExchangeService]::new()
$ews.Credentials = [Microsoft.Exchange.WebServices.Data.OAuthCredentials]$EWSAccessToken
$ews.Url = "https://outlook.office365.com/EWS/Exchange.asmx"
$ews.AutodiscoverUrl($MailboxName,{$true})
I've searched that error message everywhere, and I am not getting anywhere. The error doesn't make sense, because I am not referring to XML in any way - unless it's embedded inside EWS?
The only time this works is when I don't use either a Secret ID or a Certificate, but the token only lasts 1 hour! I need to make this automatic so I can get into my mailbox and extract files from emails.
Thanks
UPDATE
So I've removed the AutodiscoverUrl() call and I am now getting another error:
Exception calling "FindItems" with "2" argument(s): "The request
failed. The remote server returned an error: (403) Forbidden."
Trace log:
The token contains not enough scope to make this call.";error_category="invalid_grant"
But why, when I have an OAuth token?!
My code in trying to open the "Inbox":
$results = $ews.FindItems(
"Inbox",
( New-Object Microsoft.Exchange.WebServices.Data.ItemView -ArgumentList 100 )
)
$MailItems = $results.Items | where hasattachments
Autodiscover v1 doesn't support the client credentials flow, so you need to remove the line
$ews.AutodiscoverUrl($MailboxName,{$true})
It's redundant anyway because you're already setting the EWS endpoint, e.g.
$ews.Url = "https://outlook.office365.com/EWS/Exchange.asmx"
The only time that endpoint would change is if you had a mailbox on-premises in a hybrid environment, and there are other ways you can go about detecting that, such as Autodiscover v2.
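If you do need to discover the EWS URL rather than hard-coding it, Autodiscover v2 is an unauthenticated REST call; a minimal sketch from PowerShell (the JSON endpoint shown is the commonly documented form, so treat it as an assumption):
$MailboxName = "email@myemaildomain.com.au"
$adV2Uri = "https://outlook.office365.com/autodiscover/autodiscover.json?Email=$MailboxName&Protocol=EWS"
# The response carries the EWS endpoint in its Url property; no token is needed for this call.
$ews.Url = (Invoke-RestMethod -Uri $adV2Uri).Url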

Cannot connect to Azure SQL with an Azure AD user in Azure

I have a PowerShell script used to query an Azure SQL database with an Azure AD user. I can run the script on my local machine, but when I host it in an Azure Function, I always get the error: keyword not supported: 'authentication'
My script
$Username = ""
$Password = ""
$Port = 1433
$cxnString = "Server=tcp:$serverName,$Port;Database=$databaseName;UID=$UserName;PWD=$Password;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;Authentication=Active Directory Password"
$query = ''
Invoke-Sqlcmd -ConnectionString $cxnString -Query $query
According to my test, we currently cannot implement this with PowerShell in an Azure Function. We can implement it with a .NET Core Function instead; we need to reference the Microsoft.Data.SqlClient package.
For example
SqlConnectionStringBuilder builder = new SqlConnectionStringBuilder();
string userName = "<>";
string password = "<>";
string server = "<>.database.windows.net";
string database = "test";
builder.ConnectionString = "Server=tcp:" + server + ",1433;Initial Catalog=" + database + ";Authentication=Active Directory Password;User ID=" + userName + ";Password=" + password + ";MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;";
using (SqlConnection connection = new SqlConnection(builder.ConnectionString))
{
connection.Open();
string sql = "SELECT * FROM StarWars";
using (SqlCommand command = new SqlCommand(sql, connection))
{
using (SqlDataReader reader = command.ExecuteReader())
{
while (reader.Read())
{
log.LogInformation(reader.GetString(2));
}
}
}
}

Accept certificate permanently during FtpWebRequest via PowerShell

Recently I encountered some problems connecting to an FTP server: a popup appears asking me to accept the certificate.
I don't know how to overcome this via PowerShell when invoking $ftpRequest.GetResponse(). I found a solution that overrides the certificate validation callback, [System.Net.ServicePointManager]::ServerCertificateValidationCallback.
The solution is given in C# and I don't know how to port it to PowerShell yet.
My code is as below
function Create-FtpDirectory {
param(
[Parameter(Mandatory=$true)]
[string]
$sourceuri,
[Parameter(Mandatory=$true)]
[string]
$username,
[Parameter(Mandatory=$true)]
[string]
$password
)
if ($sourceUri -match '\\$|\\\w+$') { throw 'sourceuri should end with a file name' }
$ftprequest = [System.Net.FtpWebRequest]::Create($sourceuri);
Write-Information -MessageData "Create folder to store backup (Get-FolderName -Path $global:backupFolder)"
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
$ftprequest.UseBinary = $true
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$ftprequest.EnableSsl = $true
$response = $ftprequest.GetResponse();
Write-Host "Folder created successfully, status $response.StatusDescription"
$response.Close();
}
[UPDATED] While searching for Invoke-RestRequest, I found this solution from a Microsoft example.
Caution: this actually accepts ANY certificate:
# Next, allow the use of self-signed SSL certificates.
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $True }
More information (thanks to @Nimral): https://learn.microsoft.com/en-us/dotnet/api/system.net.servicepointmanager.servercertificatevalidationcallback?view=netcore-3.1
It's a bit hacky, but you can use raw C# in PowerShell via Add-Type. Here's an example class I've used to be able to toggle certificate validation in the current PowerShell session.
if (-not ([System.Management.Automation.PSTypeName]'CertValidation').Type)
{
Add-Type #"
using System.Net;
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;
public class CertValidation
{
static bool IgnoreValidation(object o, X509Certificate c, X509Chain ch, SslPolicyErrors e) {
return true;
}
public static void Ignore() {
ServicePointManager.ServerCertificateValidationCallback = IgnoreValidation;
}
public static void Restore() {
ServicePointManager.ServerCertificateValidationCallback = null;
}
}
"#
}
Then you can use it prior to calling your function like this.
[CertValidation]::Ignore()
And later, restore default cert validation like this.
[CertValidation]::Restore()
Keep in mind though that it's much safer to just fix your service's certificate so that validation actually succeeds. Ignoring certificate validation should be your last resort if you have no control over the environment.

Powershell DSC client can't register with pull server

For the past few days, I have been trying to create a development/test environment where I can automate deployments with DSC.
I have been using WMF5.1.
The pullserver has been set up using the example: Sample_xDscWebServiceRegistrationWithSecurityBestPractices
From xPSDesiredStateConfiguration 5.1.0.0.
configuration Sample_xDscWebServiceRegistrationWithSecurityBestPractices
{
param
(
[string[]]$NodeName = 'CORE-O-DSCPull.CORE.local',
[ValidateNotNullOrEmpty()]
[string] $certificateThumbPrint,
[Parameter(HelpMessage='This should be a string with enough entropy (randomness) to protect the registration of clients to the pull server. We will use new GUID by default.')]
[ValidateNotNullOrEmpty()]
[string] $RegistrationKey # A guid that clients use to initiate conversation with pull server
)
Import-DSCResource -ModuleName xPSDesiredStateConfiguration -ModuleVersion '5.1.0.0'
Node $NodeName
{
WindowsFeature DSCServiceFeature
{
Ensure = "Present"
Name = "DSC-Service"
}
xDscWebService PSDSCPullServer
{
Ensure = "Present"
EndpointName = "PSDSCPullServer"
Port = 8080
PhysicalPath = "$env:SystemDrive\inetpub\wwwroot\PSDSCPullServer"
CertificateThumbPrint = $certificateThumbPrint
ModulePath = "$env:PROGRAMFILES\WindowsPowerShell\DscService\Modules"
ConfigurationPath = "$env:PROGRAMFILES\WindowsPowerShell\DscService\Configuration"
State = "Started"
DependsOn = "[WindowsFeature]DSCServiceFeature"
RegistrationKeyPath = "$env:PROGRAMFILES\WindowsPowerShell\DscService"
AcceptSelfSignedCertificates = $true
UseSecurityBestPractices = $true
}
File RegistrationKeyFile
{
Ensure = 'Present'
Type = 'File'
DestinationPath = "$env:ProgramFiles\WindowsPowerShell\DscService\RegistrationKeys.txt"
Contents = $RegistrationKey
}
}
}
I apply the MOF file to my pull server without issues. I create a meta MOF using the same example:
[DSCLocalConfigurationManager()]
configuration Sample_MetaConfigurationToRegisterWithSecurePullServer
{
param
(
[ValidateNotNullOrEmpty()]
[string] $NodeName = 'CORE-O-DSCPull.CORE.local',
[ValidateNotNullOrEmpty()]
[string] $RegistrationKey, #same as the one used to setup pull server in previous configuration
[ValidateNotNullOrEmpty()]
[string] $ServerName = 'CORE-O-DSCPull.CORE.local' #node name of the pull server, same as $NodeName used in previous configuration
)
Node $NodeName
{
Settings
{
RefreshMode = 'Pull'
}
ConfigurationRepositoryWeb CORE-O_PullSrv
{
ServerURL = "https://$ServerName`:8080/PSDSCPullServer.svc" # notice it is https
RegistrationKey = $RegistrationKey
ConfigurationNames = @('Basic')
}
}
}
I apply the LCM settings to my pull-server without a problem.
I can create a simple basic.mof and use DSC to apply it. All this works fine.
Next, I create another meta.mof file for another node to let it register to my pull-server. I use the same configuration as above except for the nodename, which I change to the name of the other node. I use the command:
Set-DscLocalConfigurationManager -ComputerName <nodename> -path <pathtonewmetamof>
This command works correctly. That machine can then use DSC to apply the same basic.mof without problems.
Here comes the problem:
I restart my pull server and node, create a new basic.mof and try to apply this to both my machines. This procedure works fine on the pull server itself, but my node can no longer apply the basic.mof, because it will no longer register with my pull-server. I have replicated this many times, where I would install both machines from scratch and configure them. Every time I restart my machines, registration stops working. See the error below:
Registration of the Dsc Agent with the server https://CORE-O-DSCPull.CORE.local:8080/PSDSCPullServer.svc failed. The underlying error is: Failed to register Dsc
Agent with AgentId 1FE837AA-C774-11E6-80B5-9830B2A0FAC0 with the server
https://core-o-dscpull.core.local:8080/PSDSCPullServer.svc/Nodes(AgentId='1FE837AA-C774-11E6-80B5-9830B2A0FAC0'). .
+ CategoryInfo : InvalidResult: (root/Microsoft/...gurationManager:String) [], CimException
+ FullyQualifiedErrorId : RegisterDscAgentCommandFailed,Microsoft.PowerShell.DesiredStateConfiguration.Commands.RegisterDscAgentCommand
+ PSComputerName : CORE-O-DC.CORE.local
So, my problem is that registration seems to work fine until I reboot the pull server. Does anyone have any idea what can cause this issue?
For those wondering if I managed to fix this: yes, I did.
It appears to be a bug in WMF 5.0. I was only using WMF 5.1 on the pull server, not on the node, so I had to update the node as well, and now it is working.
As explained in this blog entry, the low-level problem is that WMF 5.0 uses TLS 1.0 to communicate with the server, while WMF 5.1 no longer supports TLS 1.0.
In the aforementioned entry you will find two solutions: one that involves upgrading WMF on each and every node, and another that allows less secure connections by modifying the registry on the server.
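As a hedged sketch of the second option (re-enabling TLS 1.0 on the pull server via the SCHANNEL registry keys; the exact keys the blog entry recommends may differ, so treat this as an assumption and prefer upgrading WMF where you can):
$key = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server'
New-Item -Path $key -Force | Out-Null
New-ItemProperty -Path $key -Name 'Enabled' -Value 1 -PropertyType DWord -Force | Out-Null
New-ItemProperty -Path $key -Name 'DisabledByDefault' -Value 0 -PropertyType DWord -Force | Out-Null
# SCHANNEL changes only take effect after the pull server is rebooted.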

Get Tfs Shelveset file contents at the command prompt?

I'm interested in getting the contents of a shelveset at the command prompt. Now, you would think that a cmdlet such as Get-TfsShelveset, available in the TFS Power Tools, would do this. You might also think that "tf.exe shelvesets" would do this.
However, unless I've missed something, I'm appalled to report that neither of these is the case. Instead, each command requires you to give it a shelveset name, and then simply regurgitates a single line item for that shelveset, along with some metadata about the shelveset such as creationdate, displayname, etc. But as far as I can tell, no way to tell what's actually in the shelf.
This is especially heinous for Get-TfsShelveset, which has the ability to include an array of file descriptors along with the Shelveset object it returns. I even tried to get clever, thinking that I could harvest the file names from using -WhatIf with Restore-TfsShelveset, but sadly Restore-TfsShelveset doesn't implement -WhatIf.
Please, someone tell me I'm wrong about this!
tf status /shelveset:name
will list out the content of the named shelveset (you can also supply an owner: see tf help status).
With the TFS Power Tools PowerShell snap-in:
Get-TfsPendingChange -Shelveset name
for the same information.
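For example, to look at someone else's shelveset, append the owner to the shelveset name (both values here are placeholders):
tf status /shelveset:MyShelveset;DOMAIN\jsmith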
It is possible to construct a small command-line application that uses the TFS SDK, which returns the list of files contained in a given shelveset.
The sample below assumes knowledge of the shelveset name and its owner:
using System;
using System.IO;
using System.Collections.ObjectModel;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.Framework.Common;
using Microsoft.TeamFoundation.Framework.Client;
using Microsoft.TeamFoundation.VersionControl.Client;
namespace ShelvesetDetails
{
class Program
{
static void Main(string[] args)
{
Uri tfsUri = (args.Length < 1) ? new Uri("TFS_URI") : new Uri(args[0]);
TfsConfigurationServer configurationServer = TfsConfigurationServerFactory.GetConfigurationServer(tfsUri);
ReadOnlyCollection<CatalogNode> collectionNodes = configurationServer.CatalogNode.QueryChildren(
new[] { CatalogResourceTypes.ProjectCollection },
false, CatalogQueryOptions.None);
CatalogNode collectionNode = collectionNodes[0];
Guid collectionId = new Guid(collectionNode.Resource.Properties["InstanceId"]);
TfsTeamProjectCollection teamProjectCollection = configurationServer.GetTeamProjectCollection(collectionId);
var vcServer = teamProjectCollection.GetService<VersionControlServer>();
Shelveset[] shelves = vcServer.QueryShelvesets(
"SHELVESET_NAME", "SHELVESET_OWNER");
Shelveset shelveset = shelves[0];
PendingSet[] sets = vcServer.QueryShelvedChanges(shelveset);
foreach (PendingSet set in sets)
{
PendingChange[] changes = set.PendingChanges;
foreach (PendingChange change in changes)
{
Console.WriteLine(change.FileName);
}
}
}
}
}
Invoking this console app and catching its output during execution of the PowerShell script should be possible.
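A minimal sketch of that call from PowerShell (the executable path and server URI are placeholders; each line the console app writes becomes one string in the result):
$shelvedFiles = & 'C:\tools\ShelvesetDetails\ShelvesetDetails.exe' 'http://tfsserver:8080/tfs'
$shelvedFiles | ForEach-Object { Write-Host $_ }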
Try:
tfpt review /shelveset:shelvesetName;userName
You may also need to add the server option, so something like:
tfpt review "/shelveset:Code Review;jim" /server:company-source
I think this is what you are looking for.
This is what I ended up with, based on pentelif's code and the technique in the article at http://akutz.wordpress.com/2010/11/03/get-msi/ linked in my comment.
function Get-TfsShelvesetItems
{
[CmdletBinding()]
param
(
[string] $ShelvesetName = $(throw "-ShelvesetName must be specified."),
[string] $ShelvesetOwner = "$env:USERDOMAIN\$env:USERNAME",
[string] $ServerUri = $(throw "-ServerUri must be specified."),
[string] $Collection = $(throw "-Collection must be specified.")
)
$getShelvesetItemsClassDefinition = @'
public IEnumerable<PendingChange> GetShelvesetItems(string shelvesetName, string shelvesetOwner, string tfsUriString, string tfsCollectionName)
{
Uri tfsUri = new Uri(tfsUriString);
TfsConfigurationServer configurationServer = TfsConfigurationServerFactory.GetConfigurationServer(tfsUri);
ReadOnlyCollection<CatalogNode> collectionNodes = configurationServer.CatalogNode.QueryChildren( new[] { CatalogResourceTypes.ProjectCollection }, false, CatalogQueryOptions.None);
CatalogNode collectionNode = collectionNodes.Where(node => node.Resource.DisplayName == tfsCollectionName).SingleOrDefault();
Guid collectionId = new Guid(collectionNode.Resource.Properties["InstanceId"]);
TfsTeamProjectCollection teamProjectCollection = configurationServer.GetTeamProjectCollection(collectionId);
var vcServer = teamProjectCollection.GetService<VersionControlServer>();
var changes = new List<PendingChange>();
foreach (Shelveset shelveset in vcServer.QueryShelvesets(shelvesetName, shelvesetOwner))
{
foreach (PendingSet set in vcServer.QueryShelvedChanges(shelveset))
{
foreach ( PendingChange change in set.PendingChanges )
{
changes.Add(change);
}
}
}
return changes.Count == 0 ? null : changes;
}
'@;
$getShelvesetItemsType = Add-Type `
-MemberDefinition $getShelvesetItemsClassDefinition `
-Name "ShelvesetItemsAPI" `
-Namespace "PowerShellTfs" `
-Language CSharpVersion3 `
-UsingNamespace System.IO, `
System.Linq, `
System.Collections.ObjectModel, `
System.Collections.Generic, `
Microsoft.TeamFoundation.Client, `
Microsoft.TeamFoundation.Framework.Client, `
Microsoft.TeamFoundation.Framework.Common, `
Microsoft.TeamFoundation.VersionControl.Client `
-ReferencedAssemblies "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ReferenceAssemblies\v2.0\Microsoft.TeamFoundation.Client.dll", `
"C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ReferenceAssemblies\v2.0\Microsoft.TeamFoundation.Common.dll", `
"C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\ReferenceAssemblies\v2.0\Microsoft.TeamFoundation.VersionControl.Client.dll" `
-PassThru;
# Initialize an instance of the class.
$getShelvesetItems = New-Object -TypeName "PowerShellTfs.ShelvesetItemsAPI";
# Emit the pending changes to the pipeline.
$getShelvesetItems.GetShelvesetItems($ShelvesetName, $ShelvesetOwner, $ServerUri, $Collection);
}
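Usage looks something like this (server, collection and shelveset names are placeholders); each item returned is a PendingChange:
$items = Get-TfsShelvesetItems -ShelvesetName 'MyShelveset' `
    -ServerUri 'http://tfsserver:8080/tfs' -Collection 'DefaultCollection'
$items | Select-Object FileName, ServerItem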
Spent a few days trying to do this as well; this always popped up on Google, so here is what I found to help future generations:
To get the contents of the shelveset (at least with Team Explorer Everywhere),
use the command: tf difference /shelveset:<Shelveset name>
That will print out the contents of the shelveset and give filenames in the form:
<Changetype>: <server file path>;C<base change number>
Shelved Change: <server file path again>;<shelveset name>
So if your file is $/contents/file.txt
in the shelveset shelve1 (with base revision 1), you will see:
edit: $/contents/file.txt;C1
Shelved Change: $/contents/file.txt;shelve1
After that, using the tf print command
(or view if not using TEE) on $/contents/file.txt;shelve1 should get you the contents:
tf print $/contents/file.txt;shelve1
shows you what is in file.txt in shelveset shelve1.
If you want to get shelveset changes from the server by using TFS commands:
Using PowerShell:
Get-TfsPendingChange -Server http://example.com/org -Shelveset shelvesetName
Using VS commands:
c:\projects>tf shelvesets BuddyTest_23
For more info about this, please see:
https://learn.microsoft.com/en-us/azure/devops/repos/tfvc/shelvesets-command?view=azure-devops