I'll get right to it:
My Join-Path idea isn't going to work here, but this is what I am trying to do within PowerShell:
$HOSTNAME = $env:COMPUTERNAME
powershell -Command cd B:\backup\; ./cmd.exe /c "aws s3 sync . s3://backups123/$HOSTNAME/ --dryrun"
I am attempting to take a backup of the folder I'm in within PowerShell and send it to an S3 bucket. The issue is that I have to add the computer name into the path, but the variable isn't being passed through.
Anyone have a workaround?
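For reference, one hedged workaround (assuming the AWS CLI itself is on the PATH of your PowerShell session) is to drop the nested powershell/cmd.exe call entirely, so the variable expands in the same session that defines it:
# Sketch only: run the sync directly; $env:COMPUTERNAME expands inside the double-quoted string.
Set-Location B:\backup\
aws s3 sync . "s3://backups123/$env:COMPUTERNAME/" --dryrun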
If you use the AWS Tools for PowerShell, you can utilise the Write-S3Object cmdlet:
$results = Get-ChildItem .\path\to\files -Recurse -Include *
foreach ($path in $results)
{
    Write-Host $path
    $filename = [System.IO.Path]::GetFileName($path)
    Write-S3Object -BucketName my-bucket -File $path -Key subfolder/$env:COMPUTERNAME/$filename -CannedACLName Private -AccessKey accessKey -SecretKey secretKey
}
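Depending on your module version, Write-S3Object may also be able to upload a whole folder in one call via its folder parameters; a hedged sketch (parameter availability varies across AWS Tools releases):
# Sketch only: push everything under B:\backup to a per-computer prefix in the bucket.
Write-S3Object -BucketName my-bucket -Folder B:\backup -KeyPrefix "subfolder/$env:COMPUTERNAME" -Recurse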
Credit to https://volkanpaksoy.com/archive/2014/12/04/uploading-files-to-s3-using-windows-powershell/
Very new to PowerShell and AD, so apologies if this post has an obvious answer. I have done some research and I am still not finding the answers I am looking for. My script is below for reference.
I have created a simple PowerShell script that will run on an admin VM I have set up on my domain. I have a separate SQL VM running a backup script whose backups consume a lot of storage over time. I am trying to run this very simple cleanup script against it. My question is: do I need to modify this script in order to store it on my admin VM but have it run on my SQL VM, or can I leave the path as-is and just set it up in Task Scheduler? I have tried targeting the FQDN and the IP, but it doesn't seem to be working either way.
$backups_file = 'E:\blahBlahBla\SQL\Backups'
# or
$backups_file = '<IP_ADDRESS>\E:\blahBlahBla\SQL\Backups'
# or
$backups_file = '<FQDN>E:\blahBlahBla\SQL\Backups'
$backup_file_exist = (Test-Path -Path $backups_file)
if ($backup_file_exist){
# Verifies the folder exists
Write-Output -InputObject "This folder exists"
# returns all the files in the folder.
Get-ChildItem -Path $backups_file
# Deletes all files in the folder that are older than 7 days.
Get-ChildItem -Path $backups_file -Recurse | Where-Object {($_.LastWriteTime -lt (Get-Date).AddDays(-7))} | Remove-Item
}
else
{
Write-Output -InputObject "Unable to access this directory."
}
Thanks.
Well, all of your $backups_file attempts look wrong to me.
If you want to access a directory on a remote system, it has to be at least a file share or an administrative share like \\computer\e$\folder\folder\
But why use file shares or something like that when you can simply connect to a PowerShell session on the remote host? Here is an example:
$mySQLServer = "Server1.domain.name", "server2.domain.name"
$backupFolder = "E:\blahBlahBla\SQL\Backups"
foreach ($server in $mySQLServer)
{
    $session = New-PSSession -ComputerName $server # add -Credential if needed
    Invoke-Command -Session $session -ArgumentList $backupFolder -ScriptBlock {
        param(
            $directory
        )
        if (Test-Path -Path $directory)
        {
            # Verifies the folder exists
            Write-Output -InputObject "This folder exists"
            # Returns all the files in the folder.
            Get-ChildItem -Path $directory
            # Deletes all files in the folder that are older than 7 days.
            Get-ChildItem -Path $directory -Recurse | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } | Remove-Item
        }
    }
    Remove-PSSession -Session $session
}
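And to the original storage question: if you would rather keep the .ps1 stored on the admin VM, Invoke-Command can also push a locally stored script file to the remote host for execution. A minimal sketch (the server name and script path are placeholders, not from the original post):
# Runs a script that lives on the admin VM against the SQL VM.
Invoke-Command -ComputerName "sqlvm.domain.name" -FilePath "C:\Scripts\Clean-SqlBackups.ps1"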
Good Luck!
I am trying to run a script to manage some VHD disks, but the disk mount is failing because elevated permissions are required. The user the script runs under is a local admin, but I think UAC is blocking it. The error which comes back is: "DiskState=Failed to mount disk - 'Access to a CIM resource was not available to the client'".
Ideally I need the script to run under an elevated command prompt automatically. Any ideas how I can achieve that programmatically?
The script I am running is this:
$location = "C:\temp"
$name = "downloadfile"
$Author = "FSLogix"
$FilePath = "Filepath here"
$LogFilePath = "Logfilepath here"
# Force to create a zip file
$ZipFile = "$location\$Name.zip"
New-Item $ZipFile -ItemType File -Force
$RepositoryZipUrl = "https://github.com/FSLogix/Invoke-FslShrinkDisk/archive/master.zip"
# download the zip
Write-Host 'Starting downloading the GitHub Repository'
Invoke-RestMethod -Uri $RepositoryZipUrl -OutFile $ZipFile
Write-Host 'Download finished'
#Extract Zip File
Write-Host 'Starting unzipping the GitHub Repository locally'
Expand-Archive -Path $ZipFile -DestinationPath $location -Force
Write-Host 'Unzip finished'
# remove the zip file
Remove-Item -Path $ZipFile -Force
# Run the FSLogix Optimisation
C:\temp\Invoke-FslShrinkDisk-master\Invoke-FslShrinkDisk.ps1 -Path $FilePath -Recurse -PassThru -LogFilePath $LogFilePath\logfile.csv
You can elevate the PowerShell script by starting PowerShell as a separate process and making it "run as admin", like below:
start-process PowerShell -verb runas
OR
Powershell -Command "Start-Process PowerShell -Verb RunAs"
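If you want the new elevated process to run a specific script rather than just open a console, you can hand it the script via -ArgumentList. A minimal sketch (the script path is a hypothetical placeholder):
# Relaunch PowerShell elevated and have it run the VHD maintenance script.
Start-Process powershell -Verb RunAs -ArgumentList '-ExecutionPolicy Bypass -File "C:\temp\MountAndShrink.ps1"'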
Apart from that, you can also make the elevation conditional. There is a neat conditional snippet shared by PGK which can help as well:
if (-NOT ([Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator"))
{
    $arguments = "& '" + $myinvocation.mycommand.definition + "'"
    Start-Process powershell -Verb runAs -ArgumentList $arguments
    Break
}
I'm using the following script to run test.reg on multiple remote systems:
$computers = Get-Content computers.txt
Invoke-Command -ComputerName $computers -ScriptBlock {
regedit /i /s "\\SERVER\C$\RegistryFiles\test.reg"
}
The script doesn't error, but the registry entry doesn't import on any of the systems.
I know test.reg file is a valid registry file because I copied it over, ran it manually, and the registry key imports. I also made sure PowerShell Remoting is enabled on the remote computers.
Any ideas why the registry key isn't importing?
I found the best way to avoid issues with server authentication, and to cut down on complexity, is simply to pass the .reg file's contents as a parameter to the remote script block.
$regFile = #"
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\Tcpip\Parameters]
"MaxUserPort"=dword:00005000
"TcpTimedWaitDelay"=dword:0000001e
"#
Invoke-Command -ComputerName computerName -ScriptBlock {param($regFile) $regFile | out-file $env:temp\a.reg;
reg.exe import $env:temp\a.reg } -ArgumentList $regFile
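To apply that to the original computers.txt list, the same script block can be fanned out in a single call; a sketch, assuming the $regFile here-string above is already defined:
$computers = Get-Content computers.txt
Invoke-Command -ComputerName $computers -ScriptBlock {
    param($regFile)
    # Write the .reg content to a temp file on each remote machine, then import it.
    $regFile | Out-File "$env:temp\a.reg"
    reg.exe import "$env:temp\a.reg"
} -ArgumentList $regFile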
I posted on some PowerShell forums and finally got this working.
I had to 1) move the $newfile variable inside the loop and 2) escape the $ in the path stored in the $newfile variable (with a backtick).
For reference, the final script looks like this, if anyone wants to use it:
$servers = Get-Content servers.txt
$HostedRegFile = "C:\Scripts\RegistryFiles\test.reg"
foreach ($server in $servers)
{
$newfile = "\\$server\c`$\Downloads\RegistryFiles\test.reg"
New-Item -ErrorAction SilentlyContinue -ItemType directory -Path \\$server\C$\Downloads\RegistryFiles
Copy-Item $HostedRegFile -Destination $newfile
Invoke-Command -ComputerName $server -ScriptBlock {
Start-Process -filepath "C:\windows\regedit.exe" -argumentlist "/s C:\Downloads\RegistryFiles\test.reg"
}
}
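As a side note, on PowerShell 5.0 or later a hedged alternative is to copy the file over the remoting session itself (Copy-Item -ToSession), which avoids relying on the administrative share. Roughly, per server inside the same loop:
# Sketch only: copy over the PSSession, import, then clean up.
$session = New-PSSession -ComputerName $server
Invoke-Command -Session $session -ScriptBlock { New-Item -ItemType Directory -Path 'C:\Downloads\RegistryFiles' -Force | Out-Null }
Copy-Item -Path $HostedRegFile -Destination 'C:\Downloads\RegistryFiles\test.reg' -ToSession $session
Invoke-Command -Session $session -ScriptBlock { reg.exe import 'C:\Downloads\RegistryFiles\test.reg' }
Remove-PSSession $session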
I am attempting to figure out why a script that works in AWS Tools 1.x (I think 1.1.16?) is now not working after upgrading to the latest AWS Tools (2.0.3).
The Script
Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
$creds = New-AWSCredentials -AccessKey [REDACTED] -SecretKey [REDACTED]
Set-AWSCredentials -Credentials $creds
$a = Get-Content C:\users\killeens\desktop\temp\AmazonKeysToDownload.txt
$startingpath = "G:\TheFiles\"
$a | ForEach-Object {
$keyname = $_
$fullpath = $startingpath + $keyname
write-host "fullpath: "$fullpath
Get-S3Bucket -BucketName OURBUCKETNAME | Get-S3Object -Key $_ | Copy-S3Object -Key $keyname -LocalFile $fullpath
}
The Problem
In 1.1.16, this works fine.
Now, under deadline in 2.0.3, I get the following error:
Copy-S3Object : The specified bucket does not exist
These details might be important
For what it's worth, our bucket name is all capital letters. ("COMPANYCLIENT")
This literally worked on my machine an hour or so ago. I then wanted to do something in parallel, so I downloaded PowerShell v4 and the latest AWS Tools. This problem kept happening. I have since reverted to PowerShell 3, but the issue remains.
I have not been able to find an old version of the Amazon 1.x tools to test against.
Troubleshooting so far
If I only execute Get-S3Bucket OURBUCKETNAME, it works.
If I execute the script, leaving off the piped Copy-S3Object command, it works, outputting all of the objects listed in my file.
I checked, and it doesn't appear that there is a BucketName parameter on the Copy-S3Object command according to IntelliSense. If I try to specify one, I get an error.
It appears there is also a cmdlet called Read-S3Object that ends up with the same result. I had to use that.
I didn't see anything about Copy-S3Object being deprecated or having its functionality changed, so that's unfortunate.
Assuming you have:
PowerShell v3
AWS Tools for PowerShell v2.x
Appropriate Amazon credentials
Then the following script should work:
Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
### SET ONLY THE VARIABLES BELOW ###
$accessKey = "" # Amazon access key.
$secretKey = "" # Amazon secret key.
$fileContainingAmazonKeysSeparatedByNewLine = "" # Full path to a file, e.g. "C:\users\killeens\desktop\myfile.txt"
$existingFolderToPlaceDownloadedFilesIn = "" # Path to a folder, including a trailing slash, such as "C:\MyDownloadedFiles\" NOTE: This folder must already exist.
$amazonBucketName = "" # the name of the Amazon bucket you'll be retrieving the keys for.
### SET ONLY THE VARIABLES ABOVE ###
$creds = New-AWSCredentials -AccessKey $accessKey -SecretKey $secretKey
Set-AWSCredentials -Credentials $creds
$amazonKeysToDownload = Get-Content $fileContainingAmazonKeysSeparatedByNewLine
$uniqueAmazonKeys = $amazonKeysToDownload | Sort-Object | Get-Unique
$startingpath = $existingFolderToPlaceDownloadedFilesIn
$uniqueAmazonKeys | ForEach-Object {
$keyname = $_
$fullpath = $startingpath + $keyname
Read-S3Object -BucketName $amazonBucketName -Key $keyname -File $fullpath
}
Obviously there are better ways to build this (as a function that accepts parameters, in a PowerShell v4 workflow with parallel loops and a throttle count, with better credential handling, etc.), but this gets it done in its most basic form.
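For example, a sketch of the same logic wrapped as a function (the function name and parameters are illustrative, not part of the original script):
function Get-S3KeysFromList {
    param(
        [Parameter(Mandatory)][string]$BucketName,
        [Parameter(Mandatory)][string]$KeyListFile,       # text file with one key per line
        [Parameter(Mandatory)][string]$DestinationFolder  # existing folder, trailing slash included
    )
    Get-Content $KeyListFile | Sort-Object | Get-Unique | ForEach-Object {
        # Download each unique key to the destination folder.
        Read-S3Object -BucketName $BucketName -Key $_ -File ($DestinationFolder + $_)
    }
}
# Example call (values are placeholders):
# Get-S3KeysFromList -BucketName 'COMPANYCLIENT' -KeyListFile 'C:\keys.txt' -DestinationFolder 'G:\TheFiles\'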
I need to call a remote VB script from PowerShell, and the VB script needs to run on the remote machine.
I have been using ([wmiclass]"\\$computer\root\cimv2:Win32_Process").Create("C:\test.vbs")
This works; however, I can't get a return value from the script, just a return value from the Win32_Process Create call.
I would convert the whole thing to PowerShell, but I can't, as I'm connecting to a legacy domain where I can't install additional tools, so I have to call the remote VBScript.
This is an old question, but I would like to share my solution. It's the same as the one posted by Ansgar, but it's been tested and is working fine:
$VNC = '\\share\software\AppName\_Install_Silent.vbs'
$Computer = 'RemoteHost'
$TMP = "\\$Computer\c$\TEMP"
if (!(Test-Path $TMP)) {
New-Item -Path $TMP -ItemType Directory
}
Copy-Item -LiteralPath (Split-Path $VNC -Parent) -Destination $TMP -Container -Recurse -Force -Verbose
$LocalPath = Join-Path 'C:\TEMP' (Join-Path (Split-Path $VNC -Parent | Split-Path -Leaf) (Split-Path $VNC -Leaf))
Invoke-Command -ScriptBlock {cscript.exe $Using:LocalPath} -Computer $Computer
# A restart of the remote host might be needed
The difference is that you have to copy the files to the remote machine first, to avoid the double-hop issue, and then you can run the install with the $Using: variable.
Hope this helps someone.
I'd probably try either remote invocation:
Invoke-Command -ScriptBlock { cscript.exe "C:\test.vbs" } -Computer $computer
or PsExec:
PsExec \\$computer cscript.exe "C:\test.vbs"
Can't test either of them right now, though.
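If the goal is the VBScript's own return value rather than the Win32_Process return code, a hedged sketch: the exit code set by WScript.Quit can be surfaced from the remote session via $LASTEXITCODE.
# Sketch only: returns the VBScript's exit code to the caller.
$exitCode = Invoke-Command -ComputerName $computer -ScriptBlock {
    cscript.exe //Nologo "C:\test.vbs" | Out-Null
    $LASTEXITCODE
}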