Update IIS 6 WebSite Credentials via PowerShell

I am trying to write a script that will loop through all of my local IIS websites and update their physical path credentials whenever I'm forced to update my domain password.
The following works... the first time you run it...
function Set-Site-Credentials(
    $SiteElement,
    $Credentials
){
    $SiteElement.virtualDirectoryDefaults.userName = "$($Credentials.Domain)\$($Credentials.UserName)"
    $SiteElement.virtualDirectoryDefaults.password = $Credentials.Password
    $SiteElement | Set-Item -Force
}
After running this, I noticed that the following properties also get set
$SiteElement.userName #Same as was set earlier on .virtualDirectoryDefaults
$SiteElement.password #Same as was set earlier on .virtualDirectoryDefaults
Subsequently, anytime I try to update the credentials using the code above, these two properties remain unchanged, and the changes don't take effect in IIS.
So the result is:
$SiteElement.userName #Unchanged
$SiteElement.password #Unchanged
$SiteElement.virtualDirectoryDefaults.userName #New value
$SiteElement.virtualDirectoryDefaults.password #New value
And the IIS site still shows the old username in the UI and the credentials fail.
So naturally I tried setting those extra 2 properties in my update function:
function Set-Site-Credentials(
    $SiteElement,
    $Credentials
){
    $SiteElement.userName = "$($Credentials.Domain)\$($Credentials.UserName)"
    $SiteElement.password = $Credentials.Password
    $SiteElement.virtualDirectoryDefaults.userName = "$($Credentials.Domain)\$($Credentials.UserName)"
    $SiteElement.virtualDirectoryDefaults.password = $Credentials.Password
    $SiteElement | Set-Item -Force
}
The code throws no errors or warnings, but the end result is the same, those 2 extra properties remain unchanged.
I am using the following code to get "$SiteElement"
$sites = Get-ChildItem IIS:\Sites
$sites | Foreach-Object { Set-Site-Credentials -SiteElement $_ -Credentials $newCredentials }
Also, at the end of the script I restart IIS using this command:
Restart-Service W3SVC

Ugh, finally found a command that works. All in all I've tried 4 different variations of the same thing from different examples around the interwebz, all of which only work the first time. But this command updates properly on subsequent changes:
function Set-Site-Credentials(
    $SiteElement,
    $Credentials
){
    Set-WebConfiguration -Filter "$($SiteElement.ItemXPath)/application[@path='/']/virtualDirectory[@path='/']" -Value @{userName="$($Credentials.Domain)\$($Credentials.UserName)"; password="$($Credentials.Password)"}
}
The full script:
param (
    [switch]$All,
    [switch]$AllPools,
    [switch]$AllSites,
    [string]$AppPool,
    [string]$Site
)
Import-Module WebAdministration
function Set-AppPool-Credentials(
    $AppPoolElement,
    $Credentials
){
    Set-ItemProperty $AppPoolElement.PSPath -name processModel -value @{userName="$($Credentials.Domain)\$($Credentials.UserName)";password="$($Credentials.Password)";identitytype=3}
}
function Set-Site-Credentials(
    $SiteElement,
    $Credentials
){
    Set-WebConfiguration -Filter "$($SiteElement.ItemXPath)/application[@path='/']/virtualDirectory[@path='/']" -Value @{userName="$($Credentials.Domain)\$($Credentials.UserName)"; password="$($Credentials.Password)"}
}
$newCredentials = (Get-Credential).GetNetworkCredential()
$appPools = Get-ChildItem IIS:\AppPools
$sites = Get-ChildItem IIS:\Sites
if($All -or $AllPools){
    $appPools | Foreach-Object { Set-AppPool-Credentials -AppPoolElement $_ -Credentials $newCredentials }
}
elseif($AppPool){
    $poolElement = ($appPools | Where-Object { $_.name -eq $AppPool })
    Set-AppPool-Credentials -AppPoolElement $poolElement -Credentials $newCredentials
}
if($All -or $AllSites){
    $sites | Foreach-Object { Set-Site-Credentials -SiteElement $_ -Credentials $newCredentials }
}
elseif($Site){
    $siteElement = ($sites | Where-Object { $_.name -eq $Site })
    Set-Site-Credentials -SiteElement $siteElement -Credentials $newCredentials
}
Restart-Service W3SVC
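For reference, a possible way to invoke the script (assuming it is saved as Update-IisCredentials.ps1; the file name is arbitrary, the switches come from the param block above):
.\Update-IisCredentials.ps1 -All                        # prompt once, update every app pool and site
.\Update-IisCredentials.ps1 -AllSites                   # sites only
.\Update-IisCredentials.ps1 -AppPool 'DefaultAppPool'   # a single app pool
.\Update-IisCredentials.ps1 -Site 'Default Web Site'    # a single site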

Related

Download file from SFTP server using PowerShell (have a working script for FTP server)

I wrote a script that downloads files from an FTP server, but due to a change in servers it now needs to be translated for SFTP.
I checked many posts, but most of them use .NET code.
I need to do it using PowerShell.
Below is the working script that downloads from FTP; please help me translate it for an SFTP server.
The SFTP server has a hostname, username and password.
Script:
function Dld-FtpFiles {
    param
    (
        $FtpHost,
        $FtpUser,
        $FtpPassword,
        $FtpFolder,
        $DownloadDirectory
    )
    $Client = Connect-FTP -Server $FtpHost -Verbose -Username $FtpUser -Password $FtpPassword
    $List = Get-FTPList -Client $Client -Path $FtpFolder
    Get-ChildItem -Path $DownloadDirectory | Remove-Item
    $downloadedFiles = @()
    foreach ($originalFtpFile in $List) {
        $downloadedFile = Receive-FTPFile -Client $Client -RemoteFile $originalFtpFile -LocalPath $DownloadDirectory -LocalExists Overwrite -VerifyOptions Retry, None
        Write-Host "Downloaded file $($originalFtpFile.Name) in folder $($downloadedFile.LocalPath)"
        $downloadedFiles += ($downloadedFile)
    }
    Disconnect-FTP -Client $Client
    $downloadedFiles = $downloadedFiles | Where-Object {$_}
    return $downloadedFiles
}
Set-ExecutionPolicy RemoteSigned -Scope Process -Force
Import-Module Transferetto
$ftpDetails = @(
    (New-Object PSObject -Property @{
        FtpHost = 'eu.data.com'
        FtpUser = 'FT_34436'
        FtpPassword = 'V64445h3'
        FtpFolder = '/File'
        DownloadDirectory = "\\Srcfile\data"
    })
)
foreach ($ftpDetail in $ftpDetails) {
    $downloadedFiles = Dld-FtpFiles -FtpHost $ftpDetail.FtpHost -FtpUser $ftpDetail.FtpUser -FtpPassword $ftpDetail.FtpPassword -FtpFolder $ftpDetail.FtpFolder -DownloadDirectory $ftpDetail.DownloadDirectory
}
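For the SFTP side, here is a rough sketch of how the same download loop might look with the Posh-SSH module instead of Transferetto (Posh-SSH, its cmdlet names and the reuse of the host, folder and credentials above are my assumptions, not part of the original script):
# Sketch only: assumes Install-Module Posh-SSH has been run
Import-Module Posh-SSH
$secPass = ConvertTo-SecureString 'V64445h3' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ('FT_34436', $secPass)
# Open the SFTP session (same host as in $ftpDetails)
$session = New-SFTPSession -ComputerName 'eu.data.com' -Credential $cred -AcceptKey
# List the remote folder and download each file into the share
$remoteFiles = Get-SFTPChildItem -SessionId $session.SessionId -Path '/File'
foreach ($file in $remoteFiles) {
    Get-SFTPItem -SessionId $session.SessionId -Path $file.FullName -Destination '\\Srcfile\data' -Force
    Write-Host "Downloaded file $($file.Name)"
}
Remove-SFTPSession -SessionId $session.SessionId | Out-Null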

PowerShell return variable from within an Invoke-Command

I'm developing a PowerShell script (and I'm kind of new to it) to go to a couple of servers and extract the RDP logons, so we can check if a certain policy is being followed.
So I've searched a bit and now I have the script output exactly as I want it. But now I want to send the result over email.
But I have a problem: the variable which is output to the console (with the info I need) is inside an Invoke-Command, so I cannot use it afterwards, outside the Invoke-Command, to send the email.
This is my code:
$ServersToCheck = Get-Content "C:\Temp\Servers-RDP2.txt"
foreach ($server in $ServersToCheck) {
    Invoke-Command -ComputerName $server -ErrorAction SilentlyContinue {
        $username = "user"
        $FilterPath = "<QueryList><Query Id='0'><Select Path='Microsoft-Windows-TerminalServices-RemoteConnectionManager/Operational'>*[System[(EventID=1149) and TimeCreated[timediff(@SystemTime) <= 604800000]]] and *[UserData[EventXML[@xmlns='Event_NS'][Param1='{0}']]]</Select></Query></QueryList>" -f $username
        $RDPAuths = Get-WinEvent -ErrorAction SilentlyContinue -LogName 'Microsoft-Windows-TerminalServices-RemoteConnectionManager/Operational' -FilterXPath $FilterPath
        [xml[]]$xml = $RDPAuths | Foreach { $_.ToXml() }
        $EventData = Foreach ($event in $xml.Event) {
            New-Object PSObject -Property @{
                TimeCreated = (Get-Date ($event.System.TimeCreated.SystemTime) -Format 'dd-MM-yyyy hh:mm:ss')
                User = $event.UserData.EventXML.Param1
                Domain = $event.UserData.EventXML.Param2
                Client = $event.UserData.EventXML.Param3
                Server = hostname
            }
        }
        $EventData | FT
    }
}
So, I need to use $EventData outside the Invoke-Command so I can combine the results from all the servers and then send them by email.
How can I use that variable outside the Invoke-Command?
Thanks
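One common pattern (a sketch, not a definitive answer): Invoke-Command returns whatever the remote script block writes to the output stream, so emit the $EventData objects instead of piping them to Format-Table, collect the results locally, and mail them afterwards. The mail addresses and SMTP server below are placeholders:
$ServersToCheck = Get-Content "C:\Temp\Servers-RDP2.txt"
$AllEvents = foreach ($server in $ServersToCheck) {
    Invoke-Command -ComputerName $server -ErrorAction SilentlyContinue {
        # ... build $EventData exactly as in the question ...
        $EventData    # emit the objects; do not pipe to Format-Table here
    }
}
# $AllEvents now holds the combined results from every server
$Body = $AllEvents | Sort-Object TimeCreated | Format-Table -AutoSize | Out-String
Send-MailMessage -To 'ops@example.com' -From 'rdp-report@example.com' `
    -Subject 'RDP logon report' -Body $Body -SmtpServer 'smtp.example.com'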

PowerShell Invoke-Command Returns Blank Data?

Been trying to solve this for a bit and can't seem to figure it out.
I have the following script:
$Servers = Get-Content -Path "C:\Utilities_PowerShell\ServerList.txt"
$IISServiceName1 = 'W3SVC'
$IISServiceName2 = 'IISAdmin'
$IISServiceName3 = 'WAS'
$IISarrService = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3
$IISarrServiceCheck = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 -ErrorAction SilentlyContinue -ErrorVariable NoService
function IISServiceStatus # Checks for status of IIS services
{
param (
$IISServiceName1,
$IISServiceName2,
$IISServiceName3,
$IISarrService,
$IISarrServiceCheck
)
if (Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3)
{
Write-Host "Status of IIS service(s) on $env:ComputerName :"
Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 | Select Name,DisplayName,Status | Format-Table -AutoSize
}
else
{
Write-Host " No IIS service(s) were found..." -foreground "red"
}
}
$Sessions = New-PSSession -ComputerName $Servers
$EndJobs = $Sessions | ForEach-Object {
Invoke-Command -Session $_ -ScriptBlock ${function:IISServiceStatus} -AsJob -ArgumentList $IISServiceName1, $IISServiceName2, $IISServiceName3, $IISarrService, $IISarrServiceCheck | Wait-Job | Receive-Job
Write-Host " "
}
Whenever I run it, all I get is the output of:
Status of IIS service(s) on *PC* :
If I run the function outside of a loop/invoke-command, the results are absolutely perfect. What is wrong with my remote loop?
I've tried putting the variables inside the function, I've tried running invoke-command without the argument list, etc.
Update: 3/17/16
Turns out that if I run my actual script as is, the result of $EndJobs is weird: it outputs ALL services in one table and then the three IIS services in another table. This would explain why, when I ran my Invoke-Command (stopIIS) scriptblock, I had to reboot the whole server: it took all of the services down.
These functions run PERFECTLY when not run via remote/invoke-command.
What the heck...invoke-command is seriously screwing with my stuff!
Anyone have any ideas/tips on how I can run my local script (which works 100%) on a set of servers from a text file without weird issues like this? Is invoke-command the only way?
Do you have the same problem if you wrap it all into the script block like this?
$Servers = Get-Content 'C:\Utilities_PowerShell\ServerList.txt'
$Sessions = New-PSSession -ComputerName $Servers
$EndJobs = $Sessions | ForEach-Object {
Invoke-Command -Session $_ -ScriptBlock {
$IISServiceName1 = 'W3SVC'
$IISServiceName2 = 'IISAdmin'
$IISServiceName3 = 'WAS'
$IISarrService = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3
$IISarrServiceCheck = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 -ErrorAction SilentlyContinue -ErrorVariable NoService
function IISServiceStatus { # Checks for status of IIS services
param (
$IISServiceName1,
$IISServiceName2,
$IISServiceName3,
$IISarrService,
$IISarrServiceCheck
)
if (Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3) {
Write-Host "Status of IIS service(s) on $env:ComputerName :"
Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 | Select Name,DisplayName,Status | Format-Table -AutoSize
} else {
Write-Host ' No IIS service(s) were found...' -ForegroundColor Red
}
}
IISServiceStatus $IISServiceName1 $IISServiceName2 $IISServiceName3 $IISarrService $IISarrServiceCheck
} -AsJob | Wait-Job | Receive-Job
Write-Host ' '
}
$EndJobs
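A possible follow-up to the blank-output question: Write-Host text and pre-formatted Format-Table objects don't come back from Receive-Job as usable data, so returning plain objects and formatting on the calling side may behave better. A sketch using the same service names as above:
$EndJobs = $Sessions | ForEach-Object {
    Invoke-Command -Session $_ -ScriptBlock {
        # Return objects; let the caller decide how to format them
        Get-Service -Name 'W3SVC','IISAdmin','WAS' -ErrorAction SilentlyContinue |
            Select-Object @{n='ComputerName';e={$env:COMPUTERNAME}}, Name, DisplayName, Status
    } -AsJob | Wait-Job | Receive-Job
}
$EndJobs | Format-Table -AutoSize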
I'm having a similar issue. I'm using CredSSP to test 2nd-hop auth for an automation for shutting down a production environment cleanly. My script has 3 sections: session setup, the invoke, session teardown. If I run each piece separately, I get output. If I run the whole script, I get blank lines matching the amount of output I get when I run them separately... there's nothing fancy in my invoke (backtick line continuation - I prefer Python's formatting paradigm to PowerShell/C#'s):
Invoke-Command `
-Session $workingSession `
-ScriptBlock {
get-service *spool* -ComputerName server01
}

How to test writing to a file share path using credential?

I have an array of Credential objects and I would like to test that these credentials have permissions to write a file to a file share.
I was going to do something like
$myPath = "\\path\to\my\share\test.txt"
foreach ($cred in $credentialList)
{
    "Testing" | Out-File -FilePath $myPath -Credential $cred
}
but then I discovered that Out-File doesn't take Credential as a parameter. What's the best way to solve this?
You can use New-PSDrive:
$myPath = "\\path\to\my\share"
foreach ($cred in $credentialList)
{
    New-PSDrive Test -PSProvider FileSystem -Root $myPath -Credential $Cred
    "Testing" | Out-File -FilePath Test:\test.txt
    Remove-PSDrive Test
}
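If it helps, a variation on the same New-PSDrive idea (my own elaboration, not tested against your share) that records which credentials could write and always removes the drive, even when the mount or the write fails:
$results = foreach ($cred in $credentialList) {
    try {
        New-PSDrive -Name Test -PSProvider FileSystem -Root $myPath -Credential $cred -ErrorAction Stop | Out-Null
        "Testing" | Out-File -FilePath 'Test:\test.txt' -ErrorAction Stop
        [pscustomobject]@{ User = $cred.UserName; CanWrite = $true }
    } catch {
        [pscustomobject]@{ User = $cred.UserName; CanWrite = $false }
    } finally {
        Remove-PSDrive -Name Test -ErrorAction SilentlyContinue
    }
}
$results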
Here is a situation where an old exe (net.exe) seems to do better than PowerShell...
I guess you could try to map a network drive with the credential provided, then test writing a file to that drive:
$cred = Get-Credential
$pass = $cred.GetNetworkCredential().Password
net use q: \\servername\share $pass /user:$($cred.UserName)
Use this script taken from Microsoft's TechNet Script Center: http://gallery.technet.microsoft.com/scriptcenter/Lists-all-the-shared-5ebb395a
It is a lot easier to alter it to fit your needs than to start completely from scratch.
Open up ListSharedFolderPermissions.ps1 and find the three $Properties vars. Add a line at the top of each one so you can tell which user you're looking at, so it should now look like this:
$Properties = @{'Username' = $Credential.UserName
                'ComputerName' = $ComputerName
                . . . . . }
Next, add your new Username property to the Select-Object line (3 times):
$Objs|Select-Object Username,ComputerName,ConnectionStatus,SharedFolderName,SecurityPrincipal, `
FileSystemRights,AccessControlType
Once you've added those small pieces in the six appropriate places, your script is ready to use:
cd c:\Path\where\you\put\ps1\file
$permissions = @()
$myPath = "computername"
foreach ($cred in $credentialList)
{
    $permissions += .\ListAllSharedFolderPermission.ps1 -ComputerName $myPath -Credential $cred
    $permissions += " "
}
$permissions | Export-Csv -Path "C:\Permission.csv" -NoTypeInformation
Try using the Invoke-Command cmdlet. It will take a credential object and allow you to run an arbitrary script block under that account. You can use that to test out writing the file:
Invoke-Command -ScriptBlock { "Testing" | Out-File $myPath } -Credential $cred
I think the Invoke-Command approach should work. But if nothing works, you can try the PowerShell impersonation module. It successfully impersonates a user for most PowerShell commands without the -Credential switch.
A few ideas:
Create your own PowerShell Provider
Impersonate a user and then write to the share (not sure if possible in powershell)
Use net use d:... as @Kayasax has suggested
Use WScript.Network
I'm very interested in the PowerShell provider myself, but I decided to make something real quick so I went with using the WScript.Network library. I used a hash table to track whether a user would be "authenticated" or not.
$credentials = @() # List of System.Net.NetworkCredential objects
$authLog = @{}
$mappedDrive = 'z:'
$tmpFile = $mappedDrive, '\', [guid]::NewGuid(), '.tmp' -join ''
$path = [io.path]::GetPathRoot('\\server\share\path')
$net = new-object -comObject WScript.Network
foreach ($c in $credentials) {
if ($authLog.ContainsKey($c.UserName)) {
# Skipping because we've already tested this user.
continue
}
try {
if (Test-Path $mappedDrive) {
$net.RemoveNetworkDrive($mappedDrive, 1) # 1 to force
}
# Attempt to map drive and write to it
$net.MapNetworkDrive($mappedDrive, $path, $false, $c.UserName, $c.Password)
out-file $tmpFile -inputObject 'test' -force
# Cleanup
Remove-Item $tmpFile -force
$net.RemoveNetworkDrive($mappedDrive, 1)
# Authenticated.
# We shouldn't have reached this if we failed to mount or write
$authLog.Add($c.UserName, 'Authorized')
}
catch [Exception] {
# Unauthenticated
$authLog.Add($c.UserName, 'Unauthorized')
}
}
$authLog
# Output
Name Value
---- -----
desktop01\user01 Authorized
desktop01\user02 Unauthorized

Send files over PSSession

I just burned a couple of hours searching for a solution to send files over an active PSSession. And the result is nada, niente. I'm trying to invoke a command on a remote computer over an active session, which should copy something from network storage. So, basically this is it:
icm -Session $s {
Copy-Item $networkLocation $PCLocation }
Because of the "second hop" problem, I can't do that directly, and because I'm running Win Server 2003 I can't enable CredSSP. I could first copy the files to my computer and then send/push them to the remote machine, but how? I tried PModem, but as I saw it can only pull data and not push.
Any help is appreciated.
This is now possible in PowerShell / WMF 5.0
Copy-Item has -FromSession and -ToSession parameters. You can use one of these and pass in a session variable.
eg.
$cs = New-PSSession -ComputerName 169.254.44.14 -Credential (Get-Credential) -Name SQL
Copy-Item Northwind.* -Destination "C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQL2008R2\MSSQL\DATA\" -ToSession $cs
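Conversely, -FromSession copies a file from the remote machine to the local one (the paths here are only illustrative):
Copy-Item -Path 'C:\inetpub\logs\LogFiles\W3SVC1\u_ex160101.log' -Destination 'C:\Temp\' -FromSession $cs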
See more examples here, or check out the official documentation.
If it was a small file, you could send the contents of the file and the filename as parameters.
$f="the filename"
$c=Get-Content $f
invoke-command -session $s -script {param($filename,$contents) `
set-content -path $filename -value $contents} -argumentlist $f,$c
If the file is too long to fit in whatever the limits for the session are, you could read the file in as chunks and use a similar technique to append them together in the target location.
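For example, a rough sketch of that chunking idea (the paths, the 64KB block size and the append logic are mine; treat it as a starting point, not a tested implementation):
$remotePath = 'C:\temp\copy.bin'    # hypothetical destination on the remote machine
$stream = [IO.File]::OpenRead('C:\temp\source.bin')
$buffer = New-Object byte[] (64KB)
try {
    while (($read = $stream.Read($buffer, 0, $buffer.Length)) -gt 0) {
        # Base64-encode each block and decode/append it on the remote side
        $chunk = [Convert]::ToBase64String($buffer, 0, $read)
        Invoke-Command -Session $s -ScriptBlock {
            param($path, $b64)
            $bytes = [Convert]::FromBase64String($b64)
            $fs = [IO.File]::Open($path, 'Append')
            try { $fs.Write($bytes, 0, $bytes.Length) } finally { $fs.Close() }
        } -ArgumentList $remotePath, $chunk
    }
} finally { $stream.Close() }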
PowerShell 5+ has built-in support for doing this, described in David's answer.
I faced the same problem a while ago and put together a proof-of-concept for sending files over a PS Remoting session. You'll find the script here:
https://gist.github.com/791112
#requires -version 2.0
[CmdletBinding()]
param (
[Parameter(Mandatory=$true)]
[string]
$ComputerName,
[Parameter(Mandatory=$true)]
[string]
$Path,
[Parameter(Mandatory=$true)]
[string]
$Destination,
[int]
$TransferChunkSize = 0x10000
)
function Initialize-TempScript ($Path) {
"<# DATA" | Set-Content -Path $Path
}
function Complete-Chunk () {
@"
DATA #>
`$TransferPath = `$Env:TEMP | Join-Path -ChildPath '$TransferId'
`$InData = `$false
`$WriteStream = [IO.File]::OpenWrite(`$TransferPath)
try {
`$WriteStream.Seek(0, 'End') | Out-Null
`$MyInvocation.MyCommand.Definition -split "``n" | ForEach-Object {
if (`$InData) {
`$InData = -not `$_.StartsWith('DATA #>')
if (`$InData) {
`$WriteBuffer = [Convert]::FromBase64String(`$_)
`$WriteStream.Write(`$WriteBuffer, 0, `$WriteBuffer.Length)
}
} else {
`$InData = `$_.StartsWith('<# DATA')
}
}
} finally {
`$WriteStream.Close()
}
"@
}
function Complete-FinalChunk ($Destination) {
@"
`$TransferPath | Move-Item -Destination '$Destination' -Force
"@
}
$ErrorActionPreference = 'Stop'
Set-StrictMode -Version Latest
$EncodingChunkSize = 57 * 100
if ($EncodingChunkSize % 57 -ne 0) {
throw "EncodingChunkSize must be a multiple of 57"
}
$TransferId = [Guid]::NewGuid().ToString()
$Path = ($Path | Resolve-Path).ProviderPath
$ReadBuffer = New-Object -TypeName byte[] -ArgumentList $EncodingChunkSize
$TempPath = ([IO.Path]::GetTempFileName() | % { $_ | Move-Item -Destination "$_.ps1" -PassThru}).FullName
$Session = New-PSSession -ComputerName $ComputerName
$ReadStream = [IO.File]::OpenRead($Path)
$ChunkCount = 0
Initialize-TempScript -Path $TempPath
try {
do {
$ReadCount = $ReadStream.Read($ReadBuffer, 0, $EncodingChunkSize)
if ($ReadCount -gt 0) {
[Convert]::ToBase64String($ReadBuffer, 0, $ReadCount, 'InsertLineBreaks') |
Add-Content -Path $TempPath
}
$ChunkCount += $ReadCount
if ($ChunkCount -ge $TransferChunkSize -or $ReadCount -eq 0) {
# send
Write-Verbose "Sending chunk $TransferIndex"
Complete-Chunk | Add-Content -Path $TempPath
if ($ReadCount -eq 0) {
Complete-FinalChunk -Destination $Destination | Add-Content -Path $TempPath
Write-Verbose "Sending final chunk"
}
Invoke-Command -Session $Session -FilePath $TempPath
# reset
$ChunkCount = 0
Initialize-TempScript -Path $TempPath
}
} while ($ReadCount -gt 0)
} finally {
if ($ReadStream) { $ReadStream.Close() }
$Session | Remove-PSSession
$TempPath | Remove-Item
}
Some minor changes would allow it to accept a session as a parameter instead of it starting a new one. I found the memory consumption on the Remoting service on the destination computer could grow quite large when transferring large files. I suspect PS Remoting wasn't really designed to be used this way.
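For reference, a possible invocation of that gist script (assuming it is saved as Send-FileOverPSRemoting.ps1; the name and paths are arbitrary):
.\Send-FileOverPSRemoting.ps1 -ComputerName server01 -Path 'C:\Temp\big.zip' -Destination 'C:\Temp\big.zip' -Verbose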
NET USE allows you to map a local drive letter to the remote system, which then allows you to use the drive letter in your PSSession, or even without a PSSession. This is helpful if you don't have PowerShell v5.0, and can still be handy even if you do.
You may use the remote machine name or its IP address as part of the remote UNC path and you can specify the username and password credentials on the same line:
NET USE Z: \\192.168.1.50\ShareName /USER:192.168.1.50\UserName UserPassword
Another example:
NET USE Z: \\RemoteSystem\ShareName /USER:RemoteSystem\UserName UserPassword
OR
NET USE Z: \\RemoteSystem\ShareName /USER:Domain\UserName UserPassword
If you don't supply the user credentials on the same line, you will be prompted for them:
>NET USE Z: \\192.168.1.50\ShareName
Enter the user name for '192.168.1.50': 192.168.1.50\UserName
Enter the password for 192.168.1.50: *****
The command completed successfully.
You may remove the drive letter when you're finished with the following:
NET USE Z: /delete
You can get the full syntax with NET USE /?
>net use /?
The syntax of this command is:
NET USE
[devicename | *] [\\computername\sharename[\volume] [password | *]]
[/USER:[domainname\]username]
[/USER:[dotted domain name\]username]
[/USER:[username@dotted domain name]
[/SMARTCARD]
[/SAVECRED]
[[/DELETE] | [/PERSISTENT:{YES | NO}]]
NET USE {devicename | *} [password | *] /HOME
NET USE [/PERSISTENT:{YES | NO}]
NET is a standard external .exe command in the system folder and works in Powershell just fine.
$data = Get-Content 'C:\file.exe' -Raw
Invoke-Command -ComputerName 'server' -ScriptBlock { $using:data | Set-Content -Path 'D:\filecopy.exe' }
Don't actually know what the maximum file size limitation is.
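One caveat worth adding (my note, not the original poster's): Get-Content -Raw reads the file as text, which can mangle a binary such as an .exe, so for binary files reading raw bytes is safer with the same $using: approach:
$bytes = [IO.File]::ReadAllBytes('C:\file.exe')
Invoke-Command -ComputerName 'server' -ScriptBlock {
    # Write the exact bytes on the remote side
    [IO.File]::WriteAllBytes('D:\filecopy.exe', $using:bytes)
}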