Issue with downloading a different file each day from the same website - PowerShell

So I managed to create the script below, which downloads the .exe file from the McAfee website and executes it. However, the name of the .exe file on the website changes on a daily basis. For example, today the file is V3_4756dat.exe; the next day the version changes to V3_4757dat.exe, and it keeps incrementing by 1. Any idea what changes I can make to the code below to handle this?
# URL and Destination
$url = "https://download.nai.com/products/datfiles/V3DAT/V3_4756dat.exe"
$dest = "C:\Users\user\Desktop\V3_4756dat.exe"
# Download file
Start-BitsTransfer -Source $url -Destination $dest
#file execution
Start-Process -FilePath "V3_4756dat.exe" -WorkingDirectory "C:\Users\user\Desktop\"

Fortunately, there is access to a directory listing at the location below:
https://download.nai.com/products/datfiles/v3dat/
We can use Invoke-WebRequest to download the directory listing and parse the output to find the name of the latest .exe file. After that, you can download the latest file and run the command.
Hope this helps.
With a few minor corrections, I can propose the following:
# Download the current directory output
$outfile="c:\temp\currentfile.txt"
Invoke-WebRequest "https://download.nai.com/products/datfiles/v3dat/" -OutFile $outfile
$ExeFilename = (Get-Content $outfile -Tail 3 | Where-Object { $_ -match '\.exe' }).Split("=")[3].Split(">")[0].Replace('"', '')
Remove-Item $outfile
# URL and Destination
$url = "https://download.nai.com/products/datfiles/v3dat/$ExeFilename"
$dest = "C:\temp\$ExeFilename"
echo $url
# Download file
Start-BitsTransfer -Source $url -Destination $dest
#file execution
Start-Process -FilePath "$ExeFilename" -WorkingDirectory "c:\temp\"
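As an alternative to tail-parsing the raw HTML, Invoke-WebRequest also exposes the page's links through its Links property; a minimal sketch, assuming the directory listing renders the DAT packages as plain href links (the URL and the V3_####dat.exe naming pattern are taken from the question):
# Sketch: pick the newest V3_*dat.exe from the directory listing's links.
# Lexical sorting matches numeric order while the counter keeps the same number of digits.
$listing = Invoke-WebRequest "https://download.nai.com/products/datfiles/v3dat/" -UseBasicParsing
$ExeFilename = $listing.Links.href |
    Where-Object { $_ -match 'V3_\d+dat\.exe$' } |
    ForEach-Object { Split-Path $_ -Leaf } |
    Sort-Object |
    Select-Object -Last 1
$url = "https://download.nai.com/products/datfiles/v3dat/$ExeFilename"
This avoids the temporary text file and does not depend on the relevant line being within the last three lines of the page.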

Thanks GMaster9 for the help with this. I also wanted to run this script for different users, hence the generic folder. The final script turned out like this.
#Create a new temp folder
New-Item -Path 'C:\Temp' -ItemType Directory -Force
# Download the current directory output
$outfile="C:\Temp\CurrentDatfiles.txt"
Invoke-WebRequest "https://download.nai.com/products/datfiles/v3dat/" -OutFile $outfile
$ExeFilename = (Get-Content $outfile -Tail 3 | Where-Object { $_ -match '\.exe' }).Split("=")[3].Split(">")[0].Replace('"', '')
Remove-Item $outfile
# URL and Destination
$url = "https://download.nai.com/products/datfiles/v3dat/$ExeFilename"
$dest = "C:\Temp\$ExeFilename"
echo $url
# Download file
Start-BitsTransfer -Source $url -Destination $dest
#file execution
Start-Process -FilePath "$ExeFilename" -WorkingDirectory "C:\Temp"

Related

PowerShell Copy File and Create Folder

I am trying to write a PowerShell script to copy a file from a local server to a network location. Here is what I have, but I am unable to get it to work. Not sure if I have to do an if statement of some sort?
I also need it to create a subfolder named with the current date each time it runs as a scheduled task, and copy the file into it.
$Destination = "\\NetShare\Account Information\" + $((Get-Date).ToString("MM-dd-yyyy")) + "\"
$From = "C:\data\verified2.csv"
Copy-Item -Path $From -Destination "$Destination"
The path must already exist for Copy-Item to work.
But in your $Destination example, it looks like you are trying to create a folder on a share?
I'm assuming that Account Information is the share? In that case...
$Destination = '\\Server\Account Information\' + (Get-Date).ToString('MM-dd-yyyy')
$From = 'C:\data\verified2.csv'
New-Item -Path $Destination -Type Directory -ErrorAction SilentlyContinue # don't care if it already exists
Copy-Item -Path $From -Destination $Destination
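If you prefer not to suppress errors, New-Item with -Force also succeeds when the folder already exists and simply returns it; a small variation on the same idea (paths as above):
$Destination = Join-Path '\\Server\Account Information' (Get-Date).ToString('MM-dd-yyyy')
$From = 'C:\data\verified2.csv'
# -Force returns the existing folder instead of throwing when it is already there
New-Item -Path $Destination -ItemType Directory -Force | Out-Null
Copy-Item -Path $From -Destination $Destination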

GPG - Can't Find File Specified Error Even Though File is There

I have a PowerShell script where I assign two file patterns to a variable and use that variable with the -Include parameter of the Get-ChildItem cmdlet. I assign the result of Get-ChildItem to a variable and then write its contents to the logs. The files are clearly in the list, but when I use GPG to encrypt them, it says the file cannot be found.
File filter variable:
$FileFilter = 'status_*.csv','financial_transactions_*.csv'
Getting the files and writing the result out:
$allfiles = Get-ChildItem -Path $Source -Include $FileFilter -Recurse
Write-Host $allfiles
Result in the logs:
E:\temp\reports\2022\07\01\financial_transactions_20220701_000.CSV
E:\temp\reports\2022\07\01\status_20220701_000.CSV
E:\temp\reports\2022\07\02\status_20220702_000.CSV
Full code to encrypt the file:
Get-ChildItem -Path $Source -Include $FileFilter -Recurse | ForEach-Object -Verbose {
$FileFullPath = $_.FullName
Write-Host "Encrypting" $FileFullPath
$NewEncryptFile = $_.FullName + '.PGP'
Write-Host "New file name is "$NewEncryptFile"."
Write-Host "In test. Using GnuPG."
Write-Host $NewEncryptFile
Write-Host $FileFullPath
Start-Process -filepath $GPGExePath -RedirectStandardError "E:\Scripts\error.txt" -Wait -ArgumentList "--batch --yes --output $NewEncryptFile --encrypt --recipient $GPGUser $FileFullPath"
}
I write out the path to the CSV file and the path where the encrypted file needs to go, which is the same location. I have physically gone to the location of the CSV files and they are there. Yet I get the error "Error: This command cannot be run due to the error: The system cannot find the file specified."
When I assign one file to the $FileFilter variable, the GPG code works.
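For what it's worth, Start-Process also accepts -ArgumentList as an array, which avoids hand-building the argument string and makes the quoting of each path explicit. A sketch of the encryption call written that way, reusing the variables from the script above (a hardening suggestion, not necessarily the cause of the error):
# Sketch: pass the GPG arguments as an array; the output and input paths are quoted
# explicitly in case a folder or file name ever contains spaces.
$gpgArgs = @(
    '--batch', '--yes',
    '--output', "`"$NewEncryptFile`"",
    '--encrypt',
    '--recipient', $GPGUser,
    "`"$FileFullPath`""
)
Start-Process -FilePath $GPGExePath -Wait -RedirectStandardError "E:\Scripts\error.txt" -ArgumentList $gpgArgs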

Elevating PowerShell script permissions

I am trying to run a script to manage some VHD disks, but the disk mount is failing because elevated permissions are required. The user the script runs under is a local admin, but I think UAC is blocking it. The error which comes back is: "DiskState=Failed to mount disk - Access to a CIM resource was not available to the client".
Ideally I need the script to run under an elevated prompt automatically. Any ideas how I can achieve that programmatically?
The script I am running is this:
$location = "C:\temp"
$name = "downloadfile"
$Author = "FSLogix"
$FilePath = "Filepath here"
$LogFilePath = "Logfilepath here"
# Force to create a zip file
$ZipFile = "$location\$Name.zip"
New-Item $ZipFile -ItemType File -Force
$RepositoryZipUrl = "https://github.com/FSLogix/Invoke-FslShrinkDisk/archive/master.zip"
# download the zip
Write-Host 'Starting downloading the GitHub Repository'
Invoke-RestMethod -Uri $RepositoryZipUrl -OutFile $ZipFile
Write-Host 'Download finished'
#Extract Zip File
Write-Host 'Starting unzipping the GitHub Repository locally'
Expand-Archive -Path $ZipFile -DestinationPath $location -Force
Write-Host 'Unzip finished'
# remove the zip file
Remove-Item -Path $ZipFile -Force
# Run the FSLogix Optimisation
C:\temp\Invoke-FslShrinkDisk-master\Invoke-FslShrinkDisk.ps1 -Path $FilePath -Recurse -PassThru -LogFilePath $LogFilePath\logfile.csv
You can elevate the PowerShell script by launching PowerShell as a separate process and making it run as admin, like below:
start-process PowerShell -verb runas
OR
Powershell -Command "Start-Process PowerShell -Verb RunAs"
Apart from that, you can also make it conditional. There is a handy conditional snippet shared by PGK which can help as well:
if (-NOT ([Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator"))
{
    $arguments = "& '" + $myinvocation.mycommand.definition + "'"
    Start-Process powershell -Verb runAs -ArgumentList $arguments
    Break
}
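On PowerShell 3.0 or later, the same self-elevation pattern can use the automatic $PSCommandPath variable and relaunch the script with -File, which avoids the string concatenation; a minimal sketch to place at the top of the VHD script:
# Relaunch this script elevated if it is not already running as administrator.
$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = [Security.Principal.WindowsPrincipal]$identity
if (-not $principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    Start-Process powershell.exe -Verb RunAs -ArgumentList '-ExecutionPolicy', 'Bypass', '-File', "`"$PSCommandPath`""
    exit   # stop the non-elevated copy; the elevated instance takes over
}
Note that the UAC prompt still appears; elevation cannot be made completely silent unless UAC is reconfigured or the script runs from a scheduled task set to run with highest privileges.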

Download file from SharePoint through Powershell

I am trying to download a file from a SharePoint site through PowerShell. The objective of this script is to take the SharePoint file and apply it to Outlook for employee signatures. We have already created the employee signatures we want; we just need to deploy those .htm files to the employee devices. I am running this script through Office 365 Endpoint Manager (no longer Intune) to deploy to my end users. The script does create the .htm files locally on a machine, but when opening an .htm file it takes me to the site URL rather than showing the actual email signature - almost as if the command is only copying a link and not actually downloading the file. Any thoughts on how to extend this script to download the respective files from the SharePoint site where the email signatures are stored? Or is there another, easier option that I am not thinking of that could work better? I'm still a little new to some SharePoint functions and to getting PowerShell to communicate with SharePoint, so please bear with me. Thank you for your help.
Used this code to help build the script: https://github.com/marcusburst/scripts/blob/master/Powershell/Exchange/AutomaticOutlookSignature.ps1
Made adjustments to help fit my needs.
# Checks if outlook profile exists - we only want it to run on people who have a profile so they don't get an annoying profile popup #
$OutlookProfileExists = Test-Path "C:\Users\$env:Username\AppData\Local\Microsoft\Outlook"
if ($OutlookProfileExists -eq $true) {
Write-Host "User Outlook profile exists.. continuing.." -ForegroundColor Yellow
# Signature Variables #
$ExternalSignatureName = 'External-Signature.htm'
$ExternalSigSource = 'https://SharePointSiteURL'
$RepliesForwardsSignatureName = 'Replies-Forwards-Signature.htm'
$RepliesForwardsSigSource = 'https://SharePointSiteURL'
# Environment variables #
$AppData = $env:appdata
$SigPath = '\Microsoft\Signatures'
$LocalSignaturePath = $AppData + $SigPath
# Create the local Signature folder if it does not exist #
If (!(Test-Path -Path $LocalSignaturePath)) {
    New-Item -Path $LocalSignaturePath -Type Directory
}
# Copy signature templates from domain to local Signature-folder #
Write-Host "Copying Signatures" -ForegroundColor Green
Invoke-WebRequest -Uri $ExternalSigSource -OutFile "$LocalSignaturePath\$ExternalSignatureName"
Invoke-WebRequest -Uri $RepliesForwardsSigSource -OutFile "$LocalSignaturePath\$RepliesForwardsSignatureName"
# Set as Default Signature #
If (Test-Path HKCU:'\Software\Microsoft\Office\16.0') {
Write-host "Setting signature for Office 2019"-ForegroundColor Green
Write-host "Setting signature for Office 2019 as available" -ForegroundColor Green
If ((Get-ItemProperty -Name 'First-Run' -Path HKCU:'\Software\Microsoft\Office\16.0\Outlook\Setup' -ErrorAction SilentlyContinue))
{
Remove-ItemProperty -Path HKCU:'\Software\Microsoft\Office\16.0\Outlook\Setup' -Name 'First-Run' -Force
}
If (!(Get-ItemProperty -Name 'External-Signature' -Path HKCU:'\Software\Microsoft\Office\16.0\Common\MailSettings' -ErrorAction SilentlyContinue))
{
New-ItemProperty -Path HKCU:'\Software\Microsoft\Office\16.0\Common\MailSettings' -Name 'External-Signature' -Value $ExternalSignatureName -PropertyType 'String' -Force
}
If (!(Get-ItemProperty -Name 'Replies-Forwards-Signature' -Path HKCU:'\Software\Microsoft\Office\16.0\Common\MailSettings' -ErrorAction SilentlyContinue))
{
New-ItemProperty -Path HKCU:'\Software\Microsoft\Office\16.0\Common\MailSettings' -Name 'Replies-Forwards-Signature' -Value $RepliesForwardsSignatureName -PropertyType 'String' -Force
}
}
# Removes files from recent items in file explorer #
Write-host "Cleaning up recent files list in File Explorer.." -ForegroundColor Yellow
Get-ChildItem -Path "$env:APPDATA\Microsoft\Windows\Recent" -File | Sort-Object LastWriteTime -Descending | Remove-Item
Get-ChildItem -Path "$env:APPDATA\Microsoft\Windows\Recent\AutomaticDestinations" -File | Sort-Object LastWriteTime -Descending | Remove-Item
}
I think I ran into this issue a while back. I think your script is fine; you just have to make sure there's a valid token in your blob storage.
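To expand on that: an unauthenticated Invoke-WebRequest against a SharePoint document URL typically receives the sign-in/redirect page rather than the file, which would explain the .htm files opening to the site URL. One option, assuming the PnP.PowerShell module can be installed on the devices and an interactive or app-based sign-in is acceptable for your deployment, is to download the files over an authenticated connection; the site URL and server-relative paths below are placeholders:
# Sketch: authenticated download of the signature templates with PnP.PowerShell.
# The site URL and document paths are placeholders for your own tenant.
Install-Module PnP.PowerShell -Scope CurrentUser -Force
Connect-PnPOnline -Url 'https://tenant.sharepoint.com/sites/Signatures' -Interactive
Get-PnPFile -Url '/sites/Signatures/Shared Documents/External-Signature.htm' `
    -Path $LocalSignaturePath -FileName $ExternalSignatureName -AsFile -Force
Get-PnPFile -Url '/sites/Signatures/Shared Documents/Replies-Forwards-Signature.htm' `
    -Path $LocalSignaturePath -FileName $RepliesForwardsSignatureName -AsFile -Force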

How to run a PowerShell script that is hosted online

I have a Powershell script that is stored here:
https://gitlab.example.example.co.uk/example/example/raw/master/shrink-diskpart.ps1
I would like to run this on many servers through a scheduled task, pulled from this GitLab, so I can make a single change to the script and have the most up-to-date version run on all the servers.
Can anyone advise if this is possible and, if it is, how it can be done?
Thanks
You can use Invoke-WebRequest with -OutFile to download a file, and then just execute it. You might store the file on the web server as a .txt so that you don't have to add a MIME type.
$scriptUrl = "http://localhost/test.txt"
$destination = "c:\temp\test.txt"
$scriptName = "c:\temp\test.ps1"
Invoke-WebRequest $scriptUrl -OutFile $destination
# if the file was downloaded, delete the old script file and rename the new file
if (Test-Path $destination) {
    Remove-Item $scriptName -ErrorAction SilentlyContinue  # ignore if there is no previous copy
    Rename-Item $destination $scriptName
}
& $scriptName
Props to http://www.powershellatoms.com/basic/download-file-website-powershell/
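If you would rather not write the script to disk at all, you can also execute the downloaded text directly in memory; a minimal sketch, assuming the GitLab raw URL is reachable without authentication (a private project would additionally need a token or credentials):
# Sketch: download the raw script text and run it in the current session.
$scriptUrl = "https://gitlab.example.example.co.uk/example/example/raw/master/shrink-diskpart.ps1"
Invoke-Expression (Invoke-RestMethod -Uri $scriptUrl)
The trade-off is that whatever that URL serves runs immediately on every server, so this only makes sense when you fully control the repository.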