wget command to download the most recently modified file from a URL that has many release versions

I need to download the latest charts from a URL that shows a last-modified date for each file. For example, a listing at www.python.com/download/releases:
name last-modified
python1 2-Jun-2020
python2 6-Jul-2021
python3 5-Dec-2022
python4 8-Jan-2022
python5 10-Sep-2022
From this listing I want to download the most recently modified Python file. What do I need to add to the wget command?
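wget itself has no "pick the newest file" option, so one approach is to fetch the listing, sort the entries by date, and hand the winning name back to wget. A minimal sketch, assuming the listing can be retrieved as plain text in the two-column form shown above (the URL and file names are the illustrative ones from the question, and GNU `date` is assumed for parsing `2-Jun-2020`-style dates):

```shell
# Listing copied from the question; in practice fetch it with:
#   listing=$(wget -qO- https://www.python.com/download/releases)
listing='python1 2-Jun-2020
python2 6-Jul-2021
python3 5-Dec-2022
python4 8-Jan-2022
python5 10-Sep-2022'

# Turn each date into sortable YYYYMMDD (GNU date), sort numerically,
# and keep the name on the newest line.
latest=$(echo "$listing" | while read -r name mtime; do
  printf '%s %s\n' "$(date -d "$mtime" +%Y%m%d)" "$name"
done | sort -n | tail -n 1 | cut -d' ' -f2)

echo "$latest"
# wget "https://www.python.com/download/releases/$latest"
```

For the sample listing this prints `python3`, since 5-Dec-2022 is the newest date. If the listing is actually an HTML index page rather than plain text, you would first strip the markup (for example with `grep -oE`) before the loop.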

Related

Wget creates some of directories only at the end of the mirroring

I'm currently mirroring www.typingstudy.com
wget --mirror --page-requisites --convert-link --no-clobber --no-parent --domains typingstudy.com https://www.typingstudy.com/
And wget creates the directories that contain the site's HTML files only at the end of the scraping; accordingly, when it tries to download those HTML files before the directories they belong in have been created, wget reports an error.
Sometimes it downloads only one file to a directory (in this example, "part") and then refuses to see that directory while trying to download the other ~10 files from it, saying that the directory does not exist.
Can someone help me understand what's wrong with my command? Or is it a bug in wget? (Probably not.)
Thanks in advance.
When I start the download again, everything is neat: wget downloads the other ~10 HTML files into the directories (such as "part") created in the previous session. So the problem is that I need to start the download at least twice, at least for this site, and I totally do not understand why this is happening.
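One possible explanation, offered as a guess since the thread records no confirmed answer: `--mirror` already implies `-N` (timestamping), and wget does not combine timestamping or link conversion cleanly with `--no-clobber` (it warns that `--no-clobber` and `--convert-links` cannot be used together), which may leave the first run in the half-finished state described above. A sketch of the same command with `--no-clobber` dropped, assembled as a string so the option set is easy to inspect:

```shell
# --mirror implies recursion, timestamping (-N) and infinite depth;
# --no-clobber conflicts with those, so it is removed here.
cmd='wget --mirror --page-requisites --convert-links --no-parent --domains typingstudy.com https://www.typingstudy.com/'
echo "$cmd"
# Run it with:  eval "$cmd"
```

If the site still needs two passes after this change, that would point at the server rather than the option set.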

Trouble installing the EC2-steps-plugin

I have installed the EC2-steps plugin and restarted Rundeck, but I can't see the steps. Is there anything I'm missing here, installation- or interface-wise?
It works with these steps:
1) Get the plugin with: git clone https://github.com/rundeck-plugins/aws-ec2-steps
2) Compress it into a .zip file with: zip -r aws-ec2-steps.zip aws-ec2-steps
3) Now move the zip file to /var/lib/rundeck/libext (DEB/RPM-based installation) or $RUNDECK_BASE/libext (WAR-based installation).
4) Check that the new steps are available.
Tip: Check the zip file permissions, make sure that the Rundeck user (or the user that launches Rundeck) can read the zip file.
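The steps above, collected into one sketch. The network and filesystem commands are shown commented out; the live part only resolves the libext path, where `RUNDECK_BASE` applies to WAR-based installs and the `/var/lib/rundeck` fallback to DEB/RPM installs, per step 3:

```shell
# Pick the plugin directory: $RUNDECK_BASE/libext for WAR installs,
# /var/lib/rundeck/libext for DEB/RPM installs.
LIBEXT="${RUNDECK_BASE:-/var/lib/rundeck}/libext"
echo "$LIBEXT"

# git clone https://github.com/rundeck-plugins/aws-ec2-steps
# zip -r aws-ec2-steps.zip aws-ec2-steps
# sudo mv aws-ec2-steps.zip "$LIBEXT/"
# sudo chmod a+r "$LIBEXT/aws-ec2-steps.zip"   # per the tip: readable by the Rundeck user
```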

Wget to download software with a changing URL

I'm setting up a PowerShell script to auto-download applications whose URLs vary.
I had a batch-file script that downloaded certain applications so that my USB toolkit was always up to date. However, I want to switch to PowerShell because I found it has a wget command (an alias for Invoke-WebRequest) to download directly from a URL. What I was hoping for was to have the URL always fetch the latest version.
wget https://download.ccleaner.com/ccsetup556.exe -O ccleaner.exe
Note that the 556 in the URL is the part that varies; I would like to always select the highest version.
They have two download pages: a direct link, and one that starts the download after 2-5 seconds. However, when I point wget at that page, it just downloads the HTML of the page itself.
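Downloading the HTML page is actually a usable starting point: you can scrape it for the highest-numbered installer name, then build the direct link from that. A sketch of the idea, assuming the page's links still contain names of the form `ccsetupNNN.exe`; the inline HTML below is a stand-in for the real page, which you would fetch with `wget -qO-` from whichever download page URL you use:

```shell
# Stand-in for:  html=$(wget -qO- <download page URL>)
html='<a href="https://download.ccleaner.com/ccsetup555.exe">...</a>
<a href="https://download.ccleaner.com/ccsetup556.exe">...</a>'

# Extract every ccsetupNNN.exe, version-sort, keep the highest.
latest=$(printf '%s\n' "$html" | grep -oE 'ccsetup[0-9]+\.exe' | sort -V | tail -n 1)

echo "$latest"
# wget "https://download.ccleaner.com/$latest" -O ccleaner.exe
```

With the sample HTML this selects `ccsetup556.exe`; `sort -V` (version sort) keeps `ccsetup1000.exe` above `ccsetup999.exe` where a plain lexical sort would not.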

Download a web file only if it is newer than the one that already exists

I'm struggling to find out how to download a file that's accessible via a URL, but only when the copy at the URL is newer than the file I already have.
Using PowerShell, this command downloads the file, but it downloads it again regardless of whether it's already there:
Invoke-WebRequest http://www.domain.co.uk/downloads/File.zip -OutFile C:\Temp\File.zip
I know the command line for copying local files with "XCOPY /D" to check the date/time stamp, but I am wondering whether something similar can be done when downloading a file from the internet?
Many thanks in advance.
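wget has a flag for exactly this: `-N` (`--timestamping`) compares the server's `Last-Modified` header with the local file's timestamp and skips the download when the local copy is up to date (the server must send `Last-Modified` for this to work). Below is a minimal local sketch of the comparison `-N` performs, using two files to stand in for the remote and local copies so nothing needs the network; GNU `touch` is assumed for the `-d` dates:

```shell
# The real command would simply be:
#   wget -N http://www.domain.co.uk/downloads/File.zip -P C:/Temp
# Two local files stand in for the remote and local copies here.
touch -d '2020-01-01' local_File.zip     # the copy you already have
touch -d '2021-01-01' remote_File.zip    # the copy on the server

if [ remote_File.zip -nt local_File.zip ]; then
  action=download    # remote is newer: wget -N would fetch it
else
  action=skip        # local copy is up to date: wget -N would do nothing
fi
echo "$action"
rm -f local_File.zip remote_File.zip
```

In pure PowerShell the equivalent is to send an `If-Modified-Since` header with the local file's timestamp and treat a 304 response as "skip", but if wget is available, `-N` already does the whole dance.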

wget link containing question mark

I'm trying to download a .exe using the command line.
download link: https://go.microsoft.com/fwlink/?LinkId=691980&clcid=0x409
Doing wget <link> results in a file named index.html#LinkId=691980&clcid=0x409.
How do you deal with links that contain parameters at the end? LinkId is necessary to download the correct .exe, so I can't just get rid of it or ignore it.
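The problem is shell quoting, not wget: left unquoted, `?` is treated as a glob character and `&` tells the shell to run the command so far in the background, so wget never receives the full link. Quote the URL and use `-O` to choose the output file name (the name `installer.exe` below is just an example):

```shell
# Single quotes keep ? and & as literal characters in the URL.
url='https://go.microsoft.com/fwlink/?LinkId=691980&clcid=0x409'
echo "$url"
# wget -O installer.exe "$url"
```

Double quotes work too, as long as the URL contains no characters that are special inside them (such as `$`).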