wget -N (or, more verbosely, wget --timestamping) has the nice effect that files which have already been downloaded are not downloaded again.
That way you can save time and resources. I'm looking for the equivalent in PowerShell's Invoke-WebRequest.
Is there a way to respect the file's and the server's time stamp in Invoke-WebRequest?
Based on what I can find in the documentation, no, Invoke-WebRequest doesn't appear to have an option like that.
The best I can suggest is to do the check yourself in a script, using conditionals and saving the new file under a different file name. Since you're using Invoke-WebRequest to download a file, I can only assume you're also using the -OutFile option:
# Creation time of the copy that is already on disk
$File1Creation = (Get-ChildItem <PathToFile1> -Force).CreationTime

# Download the remote file to a second path
Invoke-WebRequest -Uri https://website.com -OutFile <PathToFile2>

# Creation time of the freshly downloaded copy
$File2Creation = (Get-ChildItem <PathToFile2> -Force).CreationTime

if ($File1Creation -eq $File2Creation)
{
    # do something here
} else {
    # do something else here
}
The biggest problem is that, because Invoke-WebRequest has no such option, there's no way to check the timestamp before actually downloading the file unless it is embedded somewhere on the originating web page.
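That said, if the server does send a Last-Modified header for the file, you can approximate wget -N yourself with a HEAD request before deciding whether to download. A rough sketch, where the URL and local path are placeholders and the header is assumed to be present:

# Hypothetical names; replace with your own URL and path
$uri       = 'https://website.com/file.zip'
$localPath = 'C:\temp\file.zip'

# Fetch only the response headers, no body
$head = Invoke-WebRequest -Uri $uri -Method Head

# Last-Modified is an RFC 1123 date in GMT, e.g. "Tue, 15 Nov 1994 12:45:26 GMT"
$lastModified = "$($head.Headers['Last-Modified'])"
$remoteStamp  = [DateTime]::ParseExact($lastModified, 'r', [Globalization.CultureInfo]::InvariantCulture)

# Download only if there is no local copy yet, or the server copy is newer
if (-not (Test-Path $localPath) -or $remoteStamp -gt (Get-Item $localPath).LastWriteTimeUtc) {
    Invoke-WebRequest -Uri $uri -OutFile $localPath
}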
I want to download a file to a PC from OneDrive/Google Drive.
After some digging into this subject I found that Invoke-WebRequest was the best command to use for it.
# Download the file
$zipFile = "https://xxxxxxmy.sharepoint.com/:u:/g/personal/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxfRW5c"
Invoke-WebRequest -Uri $zipFile -OutFile "c:\temp\xxxxxx.exe"
only to find out that the code ran but downloaded a .exe file of only 156 kB.
The file I want to download is 22 MB. I get no errors in PowerShell, but maybe you have an idea of what is going on?
Zip files work, but then I need to extract the zip file in the code and I don't know the working code for that (Expand-Archive didn't work).
So there is no login context for the session spawned by your script. If you open OneDrive in your browser, once authentication is established and a session exists, the browser is given access to the file.
If you open your 156 kB file in Notepad, you should find it's just a web page saying the URL is not available.
I believe this will help the situation, but it's more complex:
https://techcommunity.microsoft.com/t5/onedrive-for-business/how-to-download-root-level-files-from-onedrive-using-powershell/m-p/758689
Thank you for your reply, and sorry for my late response.
It turns out that the link I was using didn't give direct access to the file.
When you add download=1 to the OneDrive/Google Docs link, it will skip the "virus scan":
&download=1 needs to be appended to the URL.
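For what it's worth, a minimal sketch of that approach (the share link and the local paths below are placeholders, not the real ones from the question; use ? instead of & if the link has no query string yet):

# Placeholder share link; append download=1 so the file itself is returned
$shareLink = 'https://xxxxxxmy.sharepoint.com/:u:/g/personal/xxxxx'
Invoke-WebRequest -Uri "$shareLink&download=1" -OutFile 'C:\temp\download.zip'

# If the download is a zip archive, it can then be unpacked with Expand-Archive
Expand-Archive -Path 'C:\temp\download.zip' -DestinationPath 'C:\temp\unpacked' -Force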
I am using the command
dssc pop -get -unify path_to_file
to locally modify a file, and when I try to revert the changes with
dssc cancel -force path_to_file
I get the error "Error: path_to_file - Object does not exist".
The same issue exists without the -force flag.
Here's something that might help:
If you take a look at any file under dssc control (ls -l) that you haven't yet checked out, you can easily discover that the file is actually a link into the vault.
So, when you populate a file using dssc pop -get -uni, the tool goes to the vault and fetches a local copy for you.
That sentence contains the answer to your question: all you need to do is use dssc pop -get -uni one more time... Well, the tool will probably disagree, recognizing that you've tampered with the file, and prompt you to use the -force switch if you really want to revert (repopulate) your file.
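In other words, repopulating with the same flags as above, plus -force, should throw away the local edits:

# Re-fetch the file from the vault, overwriting the locally modified copy
dssc pop -get -uni -force path_to_file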
Hope this does the trick.
I have this simple script called webcam.php to grab snapshots from some webcams:
<?php
// Timestamp, also appended to the URLs as a cache-buster
$d = date('YmdHis');

$url = 'http://xxx:40801/snap.jpeg?'.$d;
$img = 'camera_east.jpg';
echo file_put_contents($img, file_get_contents($url)); // echoes the number of bytes written

$url = 'http://xxx:40802/snap.jpeg?'.$d;
$img = 'camera_west.jpg';
echo file_put_contents($img, file_get_contents($url)); // echoes the number of bytes written

echo $d;
?>
and if I call http://xxx/webcam.php from the browser, everything is OK:
I find the two pictures in the folder, and the script echoes the length of the files and the timestamp.
I tried to have this script executed by the Windows scheduler, but although the task returns 0x0, the pictures are not updated.
(I also tried unlinking the images, and using curl instead, but nothing changes.)
Then I tried to run the PHP script from the command line (also from PowerShell),
something like:
C:\Program Files\PHP\v7.2\php.exe -f C:\\webcam.php
but again, although it seems to work, since it returns the length of the two files and the timestamp, the pictures are not updated, and if I add an unlink call, the files are not deleted.
Clearly the folder has all the necessary permissions...
I don't have much experience with PHP... :-(
What can be wrong?
Thanks!
Obviously, when run from cmd/PowerShell/the scheduler the script requires full paths,
while when run from the browser it can accept relative paths.
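One way to work around that from the scheduler side, without touching the PHP script, is to make sure the task starts in the script's folder, so the relative file names resolve there. A rough PowerShell sketch of what the scheduled action could run (the folder name is only an assumption; use the real location of webcam.php):

# Assumed location of webcam.php; adjust to the real folder
Set-Location 'C:\inetpub\wwwroot'

# Run the script with the working directory set, so camera_east.jpg and camera_west.jpg
# are written next to webcam.php instead of into the scheduler's default directory
& 'C:\Program Files\PHP\v7.2\php.exe' -f .\webcam.php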
I am trying to automate a build process by first getting the code from Bitbucket as follows.
$output = "E:\FileName.xyz"
$url = 'https://bitbucket.org/WhatEver/WhatEverBranchName/get/master.zip'
$wc = New-Object -TypeName System.Net.WebClient
$wc.Headers.Add('Authorization','token oinksdilikncl--MyAccessToken--ioiwcnoisif89kfg9')
$wc.DownloadFile($url, $output)
When I execute this, the file I receive at FileName.xyz is an HTML file that redirects me to the Bitbucket login page; essentially it's asking for creds, even though I supplied an access token.
Where am I wrong? Are there other ways to do this, say with Invoke-WebRequest? Or could someone kindly direct me to a code sample, please?
I have absolutely zero experience in PowerShell, but I tried to do a similar task in Node, and here are my findings.
First you create an "OAuth" consumer in the access management section of your Bitbucket account settings. This gives you a "Key" and a "Secret".
Now, using this Key and Secret, you ask Bitbucket for a token. In my case I made an HTTP request to https://bitbucket.org/site/oauth2/access_token. In your case you should use an equivalent of curl (Invoke-RestMethod maybe?). The curl command is like this:
$ curl -X POST -u "yourKeyHere:yourSecretHere" https://bitbucket.org/site/oauth2/access_token -d grant_type=client_credentials
My HTTP request was like this (using superagent in Node), with Content-Type set to application/x-www-form-urlencoded:
request.post("https://yourKeyHere:yourSecretHere#bitbucket.org/site/oauth2/access_token").send('grant_type=client_credentials');
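In PowerShell terms, a rough equivalent of that curl call might look like the sketch below; the key and secret are placeholders for your own OAuth consumer values, and access_token is the field a standard OAuth2 client_credentials response carries:

# OAuth consumer key and secret from your Bitbucket account settings (placeholders)
$key    = 'yourKeyHere'
$secret = 'yourSecretHere'

# Basic auth header built from key:secret, as in the curl -u example above
$basic = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("${key}:${secret}"))

# Ask the token endpoint for a token; the hashtable body is sent form-urlencoded
$resp  = Invoke-RestMethod -Method Post -Uri 'https://bitbucket.org/site/oauth2/access_token' `
            -Headers @{ Authorization = "Basic $basic" } -Body @{ grant_type = 'client_credentials' }
$token = $resp.access_token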
Now that you have the token, your program or your command can clone a private repo with it. But the URL to your repo should be like this (keep the braces around the token):
https://x-token-auth:{tokenHere}#bitbucket.org/youRepoOwnerHere/RepoNameHere.git
Now you have the whole codebase on your machine. But since you want a single file rather than the whole repo, I refer you to Retrieve a single file from a repository, but remember to use the above repo URL with the token instead of a normal repo URL.
Actually, at least now (2 years after the original post), things are easier than that, as it's enough to do Basic auth. So, as long as the script is private and you thus have no problem having the creds written in it, the following should do the trick:
Invoke-WebRequest -Uri '<url>' -Headers @{ Authorization = 'Basic <auth_str_b64>' } -OutFile <dest_path>
where:
- url is something like https://bitbucket.org/<account>/<repo_name>/get/<branch_or_tag_or_whatever>.zip; get it from the Downloads page of the desired repository
- auth_str_b64 is the usual <username>:<password> pair, base64-encoded
You can use the following to create the encoded string:
$encodedCreds = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes('<username>:<password>'))
In order to avoid keeping the creds lying around in the script, you could pass them as arguments or prompt for them at runtime.
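For instance, a small sketch of the prompt-at-runtime variant (the URL and destination remain placeholders, as above):

# Prompt for the Bitbucket username/password at runtime
$cred = Get-Credential
$pair = '{0}:{1}' -f $cred.UserName, $cred.GetNetworkCredential().Password

# Base64-encode the pair and pass it as a Basic auth header
$b64 = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
Invoke-WebRequest -Uri '<url>' -Headers @{ Authorization = "Basic $b64" } -OutFile '<dest_path>'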
I've solved this problem like this:
# Instantiate the WebClient
$wc = New-Object -TypeName System.Net.WebClient
# Add the base64-encoded credentials
$wc.Headers.Add('Authorization', ('Basic {0}' -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f '{USERNAME}','{TOKEN}')))))
# Download the file
$wc.DownloadFile( 'https://{BITBUCKET_URL}/projects/{PROJECT}/repos/{REPOSITORY}/raw/{FILE.EXT}?at=refs%2Fheads%2F{BRANCH}', 'C:\file.txt' )
I am assuming you are using a Personal Access Token. Oh, and it's much, much faster than Invoke-WebRequest or Invoke-RestMethod.
I often make the same mistake over and over again: in PowerShell, I run
wget http://example.com
instead of
wget http://example.com -OutFile somename
and when the command (wget, aka Invoke-WebRequest) is done executing, the downloaded file is stored... apparently, nowhere.
Q: Is there a way to store the downloaded content after the fact?
No. If you don't specify -OutFile, the content is only returned to the pipeline to be used in the next statement.
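So there is nothing to recover after the fact, but for next time: capturing the response in a variable keeps the content around so it can still be written to disk afterwards. A small sketch (the output file name is just an example):

# Keep the response object instead of letting it fall through to the console
$response = Invoke-WebRequest http://example.com

# The body is still available on the object and can be saved later
$response.Content | Set-Content -Path .\somename.html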