Invoke-WebRequest can't download properly - PowerShell

I had been using Invoke-WebRequest in my script for some time, and today I ran into a problem with it for the first time: when I run the script, it starts downloading a ZIP file and finishes without warnings, but when I try to open the file, WinRAR gives me an error and I can see that the file size is smaller than it should be. I tried several times and got the same result. Then I tried downloading the file with Google Chrome just to make sure it wasn't corrupted, and it downloaded (with the correct file size) and opened without any issues.
A couple of hours later I ran the script again and this time it downloaded the file correctly. So now I am not sure I can trust Invoke-WebRequest, since it seems it can sometimes produce an incomplete download.
Maybe someone can point me to how I can avoid this situation? Or is it better to use another method?
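One common workaround is to verify the downloaded size against the server's Content-Length header and retry on a mismatch. A minimal sketch, assuming the server answers HEAD requests and reports Content-Length (the URL and paths are placeholders):
$url  = 'http://www.example.com/File.zip'   # placeholder URL
$dest = 'C:\Temp\File.zip'
# Ask the server for the expected size up front
$head     = Invoke-WebRequest -Uri $url -Method Head
$expected = [int64]$head.Headers['Content-Length']
for ($attempt = 1; $attempt -le 3; $attempt++) {
    Invoke-WebRequest -Uri $url -OutFile $dest
    if ((Get-Item $dest).Length -eq $expected) { break }   # download is complete
    Write-Warning "Attempt ${attempt}: file is truncated, retrying..."
}
On Windows, Start-BitsTransfer is also worth a look for large downloads, since BITS handles retries and resumption itself.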

Related

crawl-300d-2M-subword.zip corrupted or cannot be downloaded

I am trying to use the fastText model crawl-300d-2M-subword.zip from the official page on my Windows machine, but the download fails in the last few KB.
I managed to download the zip file onto my Ubuntu server using wget, but the zipped file is corrupted whenever I try to unzip it. Example of what I am getting:
unzip crawl-300d-2M-subword.zip
Archive:  crawl-300d-2M-subword.zip
  inflating: crawl-300d-2M-subword.vec
  inflating: crawl-300d-2M-subword.bin   bad CRC ff925bde (should be e9be08f7)
It is always the file crawl-300d-2M-subword.bin, which is the one I am interested in, that has problems in the unzipping.
I have tried both approaches many times with no success. It seems to me that no one has had this issue before.
I've just downloaded and unzipped that file with no errors, so the problem is likely unique to your system's configuration, tools, or its network path to the download servers.
One common problem that's sometimes not prominently reported by a tool like wget is a download that keeps ending early, resulting in a truncated local file.
Is the zip file you received exactly 681,808,098 bytes long? (That's what I get.)
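On the Windows machine, a quick PowerShell check of the exact byte count:
(Get-Item .\crawl-300d-2M-subword.zip).Length   # should print 681808098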
What if you try another download tool instead, like curl?
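For example, curl can resume a partial download (-C -) and retry transient failures (--retry); curl.exe ships with recent Windows 10/11 and can be run from PowerShell. A sketch, with the URL assumed from the official fastText downloads page:
curl.exe -L -C - --retry 5 -o crawl-300d-2M-subword.zip `
    https://dl.fbaipublicfiles.com/fasttext/vectors-english/crawl-300d-2M-subword.zip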
Sometimes if repeated downloads keep failing in the same way, it's due to subtle misconfiguration bugs/corruption unique to the network path from your machine to the peer (download origin) machine.
Can you do a successful download of the zip file (of full size per above) to anywhere else?
Then, transfer from that secondary location to where you really want it? (Such a relay between different endpoints might not trigger the same problems.)
If you're having problems on both a Windows machine and an Ubuntu server, are they both on the same local network, perhaps subject to the same network issues – either bugs, or policies that cut a long download short?

VSCode Cannot Execute Julia Files

I've scoured other forums that talk about this problem and have tried all of the recommendations I've found, but I cannot seem to get VSCode to recognize my Julia.exe path and execute commands in a .jl file. Every time I run even a simple .jl file, I get /bin/sh: julia: command not found (pictured below).
I have ensured that the executable path is set properly in the .json file, and I have tried moving the executable to other locations (I'm using an M1 Mac), but it seems VSCode still cannot find/recognize the Julia executable:
I have even tried pointing to the binary folder /bin, and to /bin/julia.exe (with the extension); although VSCode no longer complains that it cannot confirm the executable path, the code still produces the error above.
I'm at my wit's end here. I'm sure it's a simple answer that someone could spot in a second, or would know how to troubleshoot, but I've never had this issue to this degree when installing other languages like Kotlin. Any help is greatly appreciated.
The problem was fixed by the macOS recommendation found here:
julialang.org/downloads/platform/#optional_add_julia_to_path
Copying the binary elsewhere (outside of /Applications) and changing the executable path in VSCode fixed the issue.

Download a web file only if it's newer than the one that exists

I'm struggling to find out how to download a file that's accessible via a URL, but avoid re-downloading it unless the file at the URL is newer than the one that already exists.
Using PowerShell, this command downloads the file, but it will download it again regardless of whether it's already there:
Invoke-WebRequest http://www.domain.co.uk/downloads/File.zip -OutFile C:\Temp\File.zip
I know the command line for copying local files with "XCOPY /D" checks the date/time stamp, but I am wondering if something similar can be done when downloading a file from the internet?
Many thanks in advance.
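One approach is to ask the server for the file's Last-Modified timestamp with a HEAD request and download only when the remote copy is newer than the local one. A minimal sketch, assuming the server sends a Last-Modified header (the URL is the one from the question):
$url  = 'http://www.domain.co.uk/downloads/File.zip'
$dest = 'C:\Temp\File.zip'
# HEAD request: fetch only the headers, not the body
$head   = Invoke-WebRequest -Uri $url -Method Head
$remote = [datetime]$head.Headers['Last-Modified']
# Download only if there is no local copy yet, or the remote copy is newer
if (-not (Test-Path $dest) -or $remote -gt (Get-Item $dest).LastWriteTime) {
    Invoke-WebRequest -Uri $url -OutFile $dest
}
Not every server sends Last-Modified; if it's missing, you could fall back to comparing Content-Length, or simply download unconditionally.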

PowerShell ISE loading old version of script

I have a script that I've added to my PowerShell ISE profile (the only profile I have) through dot-sourcing. Whenever I open ISE, the version of the script that's loaded is 3.7.2. However, the current version of the script (which the dot-sourcing path points to) is 5.3. If I copy the dot-source line from my profile, paste it into ISE, and run it, the script then correctly shows as 5.3. I've even removed the line from my profile, and the command still shows up when ISE is loaded.
It seems like the script is being cached somewhere. I've checked my WindowsPowerShell\Modules folder, but I only have modules for ImportExcel and WASP. I never made the script into a module in the first place, and I don't see it listed anywhere in Get-Module. Currently the line referencing my script is removed from my profile, and checking $profile.Contains("Create-Cert") returns False, which to me means it's loading the correct file. Another thing I tried was to dot-source my $profile in ISE, which did seem to run successfully, but it still didn't have the current version, whether or not the dot-sourcing inside $profile was there.
Is there somewhere else that Powershell could be storing this old version of this script? I've searched my computer for references to it, but I can't even find an old version that matches 3.7.2.
Edit: Another troubleshooting step that I've just attempted was to rename my profile and then open ISE. When I did this, the command no longer showed in my command list, and Get-Help Create-Cert came back with an error since it couldn't find it. I then changed the name of my profile back to Microsoft.PowerShellISE_profile.ps1, closed and opened ISE again, and the command loaded with version 3.7.2 again. It's almost like the command is embedded into the profile itself, which I don't even think should be possible.
One additional thing I want to note is that this script lives on a server, not locally on the computer. I don't think that should matter, since the server is accessible the entire time, but perhaps something is being cached because of that.
Edit 2:
On the recommendation of Tom Collins, I created a new profile and added just the line concerning my script to it, and this time it worked: when ISE loaded, it correctly loaded version 5.3. I then swapped the names of my old and new profiles, and suddenly it loaded the correct version again. I've tested closing and opening ISE a few times, and now it loads 5.3 each time. I'm still at a complete loss as to what actually fixed it, and if anyone is willing to offer a deeper explanation I'd be glad to know more.
Adding my triage in as an answer.
The next step I'd take is to rename the original profile, load ISE to confirm the script isn't loading, and then manually rebuild a new ISE profile file with just the script (and minimal pre-reqs). Save that as the new profile file and re-run. If that works, then there's something in your original profile that is loading the old script.
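One thing worth ruling out: ISE loads up to four profile files (AllUsersAllHosts, AllUsersCurrentHost, CurrentUserAllHosts, CurrentUserCurrentHost), so an old dot-source line can hide in a profile you don't normally edit. A sketch that searches every profile that exists for the Create-Cert name from the question:
# List every profile path ISE can load and search the ones that exist
$paths = $profile.AllUsersAllHosts, $profile.AllUsersCurrentHost,
         $profile.CurrentUserAllHosts, $profile.CurrentUserCurrentHost
$paths |
    Where-Object { Test-Path $_ } |
    ForEach-Object { Select-String -Path $_ -Pattern 'Create-Cert' }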

Is there a way to make NetBeans file uploads not fail on a single error?

Is there a way to make NetBeans's FTP uploads less error-sensitive (reconnect and retry on error)?
It is simply unable to handle large numbers of files. If I upload 1000 files and there is some sort of failure at the 400th file, it will fail for all of the remaining files, and it will not record which files failed. So I have to upload ALL the files again, even the ones that uploaded successfully on the previous attempt, and it will fail again and again.
This post describes changing a NetBeans FTP setting to Passive. Doing so reduced many of my transfer errors.