Download file from Google Drive with PowerShell

In trying to download a file with PowerShell, I have the following:
$client = new-object System.Net.WebClient
$client.DownloadFile($AGRIDATAMISCURL,$TESTAGRIDATAMISCZIP)
Where $AGRIDATAMISCURL is a URL that looks like "https://drive.google.com/file/d/<...>" and $TESTAGRIDATAMISCZIP looks like "C:\test\A.zip"
This script doesn't return an error but the file it downloads is basically an HTML file with a prompt to sign in to Google. Is there another way to download a file that is "shared with me"?

Share the file first
Files in Google Drive must be made available for sharing before they can be downloaded. There's no security context when running from PowerShell, so the file download fails. (To check this, rename the downloaded file with a `.html` extension and view it in a text editor.)
Note: the following solution assumes that the links are to non-security-critical files, or that the links will only be given to those who can be trusted with access (links are https, so are encrypted in transmission). The alternative is to programmatically authenticate with Google - something not addressed in this answer.
To share the file, in Google Drive:
1. Right-click the file and choose Get shareable link
2. Turn link sharing on
3. Click Sharing settings
4. Ensure that Anyone with the link can view (note that in corporate environments, the link must be shared with those outside the organization in order to bypass having to log in)
Then Download Programmatically
Then, the file can be downloaded with code such as the following (in this case with Windows PowerShell):
# Download the file
$zipFile = "https://drive.google.com/uc?export=download&id=1cwwPzYjIzzzzzzzzzzzzzzzzzzzzzzzz"
Invoke-WebRequest -Uri $zipFile -OutFile "$($env:TEMP)\myFile.doc"
Replace 1cwwPzYjIzzzzzzzzzzzzzzzzzzzzzzzz with the ID code from the shareable link obtained in the sharing steps above.
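As a convenience, the ID can be pulled out of the shareable link itself. A minimal sketch, assuming the link has the usual https://drive.google.com/file/d/<id>/view shape (the regex is illustrative, not an official API):

```powershell
# Hypothetical shareable link - replace with your own
$shareLink = "https://drive.google.com/file/d/1cwwPzYjIzzzzzzzzzzzzzzzzzzzzzzzz/view?usp=sharing"

# Extract the ID from the /file/d/<id>/ segment and build the direct-download URL
if ($shareLink -match "/file/d/([^/]+)") {
    $fileId = $Matches[1]
    $downloadUrl = "https://drive.google.com/uc?export=download&id=$fileId"
    Invoke-WebRequest -Uri $downloadUrl -OutFile "$($env:TEMP)\myFile.doc"
}
```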

Related

Powershell download file onedrive/googledrive

I want to download a file to a PC from OneDrive/Google Drive.
After some digging into this subject I found Invoke-WebRequest was the best command to use here.
# Download the file
$zipFile = "https://xxxxxxmy.sharepoint.com/:u:/g/personal/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxfRW5c"
Invoke-WebRequest -Uri $zipFile -OutFile "c:\temp\xxxxxx.exe"
only to find out the code was working but only downloaded an .exe file of 156 kB.
The file I wanted to download is 22 MB. I get no errors in PowerShell, but maybe you have an idea what is going on?
Zip files work, but then I need to extract the zip file in the code and I don't know the working code for that (Expand-Archive didn't work).
So there is no login context for the session spawned by your script. If you open OneDrive in your browser, once authentication is established and a session exists, the browser is given access to the file.
If you open your 156 kB file in Notepad, you should find it's just a web page saying the URL is not available.
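A quick way to check what was actually downloaded, using the paths from the question:

```powershell
# If the "download" is really an HTML login/error page, the first lines give it away
Get-Content "c:\temp\xxxxxx.exe" -TotalCount 5

# Compare the size on disk (in bytes) with the expected 22 MB
(Get-Item "c:\temp\xxxxxx.exe").Length
```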
I believe this will help the situation, but it's more complex:
https://techcommunity.microsoft.com/t5/onedrive-for-business/how-to-download-root-level-files-from-onedrive-using-powershell/m-p/758689
Thank you for your reply, and sorry for my late reply.
It turns out that the link I was using didn't have direct access to the file.
When you add :download=1 in the OneDrive/Google Docs URL, it will skip the "virus scan" page.
&download=1 needs to be added.
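Putting this thread together, a rough sketch (the URL is a placeholder shaped like the one above, with download=1 appended; Expand-Archive requires PowerShell 5.0 or later):

```powershell
# Placeholder share URL - download=1 skips the interstitial page
$zipUrl  = "https://xxxxxxmy.sharepoint.com/:u:/g/personal/xxxxxxxx?download=1"
$zipPath = "C:\temp\myFile.zip"

Invoke-WebRequest -Uri $zipUrl -OutFile $zipPath

# Expand-Archive unpacks the zip; -Force overwrites files from a previous run
Expand-Archive -Path $zipPath -DestinationPath "C:\temp\extracted" -Force
```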

File not found error when using cyberduck CLI for OneDrive

I want to upload encrypted backups to OneDrive using Cyberduck to avoid local copies. Having a local file called file.txt that I want to upload into the folder Backups in the OneDrive root, I used this command:
duck --username <myUser> --password <myPassword> --upload onedrive://Backups .\file.txt
Transfer incomplete…
File not found. /. Please contact your web hosting service provider for assistance.
It's not even possible to get the directory content using the duck --username <myUser> --password <myPassword> --list onedrive://Backups command. This also causes a File not found error.
What am I doing wrong?
I exactly followed the documentation and have no clue why this is not working. Cyberduck was installed using Chocolatey; the current version is Cyberduck 6.6.2 (28219).
Just testing this out, it looks like OneDrive sets a unique identifier for the root folder. You can find that either by inspecting the value of the cid parameter in the URL of your OneDrive site, or by using the following command:
duck --list OneDrive:///
Note, having three slashes is important. It would appear the first two are part of the protocol name and the third specifies that you want the root. The result should be a unique id of some sort, like 36d25d24238f8242, which you can then use to upload your files:
duck --upload onedrive://36d25d24238f8242/Backups .\file.txt
I didn't see any of that in the docs... just tinkering with it. So I might recommend opening a bug with Cyberduck to get their docs updated if this works for you.
What happens if you use the full path to the file? It looks like it is just complaining about not finding the file to upload, so it could be that you are in a different directory or something and it needs the full path to the source file.

What are my PowerShell options for determining a path that changes with every build (TeamCity)?

I'm on a project that uses TeamCity for builds.
I have a VM, and have written a PowerShell script that backs up a few files, opens a ZIP artifact that I manually download from TeamCity, and then copies it to my VM.
I'd like to enhance my script by having it retrieve the ZIP artifact (which always has the same name).
The problem is that the download path contains the build number which is always changing. Aside from requesting the download path for the ZIP artifact, I don't really care what it is.
An example artifact path might be:
http://{server}/repository/download/{project}/{build_number}:id/{project}.zip
There is a "Last Successful Build" page in TeamCity that I might be able to obtain the build number from.
What do you think the best way to approach this issue is?
I'm new to TeamCity, but it could also be that the answer is "TeamCity does this - you don't need a PowerShell script." So direction in that regard would be helpful.
At the moment, my PowerShell script does the trick and only takes about 30 seconds to run (which is much faster than my peers that do all of the file copying manually). I'd be happy with just automating the ZIP download so I can "fire and forget" my script and end up with an updated VM.
This seems like the smallest knowledge gap to fill, and retrieving a changing path at run-time with PowerShell seems like a pretty decent skill to have.
I might just use C# within PS to collect this info, but I was hoping for a more PS way to do it.
Thanks in advance for your thoughts and advice!
Update: It turns out some other teams had been using Octopus Deploy (https://octopus.com/) for this sort of thing so I'm using that for now - though it actually seems more cumbersome than the PS solution overall since it involves logging into the Octopus server and going through a few steps to kick off a new build manually at this point.
I'm also waiting for the TC administrator to provide a Webhook or something to notify Octopus when a new build is available. Once I have that, the Octopus admin says we should be able to get the deployments to happen automagically.
On the bright side, I do have the build process integrated with Microsoft Teams via a webhook plugin that was available for Octopus. Also, the Developer of Octopus is looking at making a Microsoft Teams connector to simplify this. It's nice to get a notification that the new build is available right in my team chat.
You can try to get your artefact from this URL:
http://<ServerUrl>/repository/downloadAll/<BuildId>/.lastSuccessful
Where BuildId is the unique identifier of the build configuration.
My implementation for this question, in PowerShell:
#
# GetArtefact.ps1
#
Param(
    [Parameter(Mandatory=$false)][string]$TeamcityServer="",
    [Parameter(Mandatory=$false)][string]$BuildConfigurationId="",
    [Parameter(Mandatory=$false)][string]$LocalPathToSave=""
)
Begin
{
    $username = "guest"
    $password = "guest"

    function Execute-HTTPGetCommand() {
        param(
            [string] $target = $null,
            [string] $outFile = $null
        )
        $request = [System.Net.WebRequest]::Create($target)
        $request.PreAuthenticate = $true
        $request.Method = "GET"
        $request.Accept = "*/*"
        $request.Credentials = New-Object System.Net.NetworkCredential($username, $password)
        $response = $request.GetResponse()
        # Copy the response stream to disk as raw bytes; reading it as text
        # (StreamReader + Out-File) would corrupt a binary zip artifact
        $stream = $response.GetResponseStream()
        $fileStream = [System.IO.File]::Create($outFile)
        $stream.CopyTo($fileStream)
        $fileStream.Close()
        $stream.Close()
        $response.Close()
    }

    Execute-HTTPGetCommand "http://$TeamcityServer/repository/downloadAll/$BuildConfigurationId/.lastSuccessful" $LocalPathToSave
}
And call this with the appropriate parameters.
EDIT: Note that the credential used here is the guest account. You should check whether the guest account has the permissions to do this, or specify an appropriate account.
Try constructing the URL to download the build artifact using the TeamCity REST API.
You can get a permanent link using a wide range of criteria like last successful build or last tagged with a specific tag, etc.
e.g. to get last successful you can use something like:
http://{server}/app/rest/builds/buildType:(id:{build.conf.id}),status:SUCCESS/artifacts/content/{file.name}
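For instance, a sketch of fetching that URL from PowerShell (the server, build configuration id, and artifact name are placeholders; guestAuth works only if the guest account may access artifacts, otherwise pass -Credential):

```powershell
# Placeholders - substitute your own server, build configuration id, and artifact name
$server      = "teamcity.example.com"
$buildConfId = "MyProject_Build"
$artifact    = "MyProject.zip"

$url = "http://$server/guestAuth/app/rest/builds/buildType:(id:$buildConfId),status:SUCCESS/artifacts/content/$artifact"
Invoke-WebRequest -Uri $url -OutFile "C:\temp\$artifact"
```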
TeamCity has the capability to publish its artifacts to a built-in NuGet feed. You can then use NuGet to install the created package, not caring about where the artifacts are. Once you do that, you can install with nuget.exe by pointing your source to the NuGet feed URL. Read about how to configure the feed at https://confluence.jetbrains.com/display/TCD10/NuGet.
1. Read the file content of the path in the TEAMCITY_BUILD_PROPERTIES_FILE environment variable.
2. Locate the teamcity.configuration.properties.file row in that file; iirc the value is backslash-encoded.
3. Read THAT file, and locate the teamcity.serverUrl value; decode it.
4. Construct the URL like this:
{serverurl}/httpAuth/repository/download/{buildtypeid}/.lastSuccessful/file.txt
Here's an example (C#):
https://github.com/WideOrbit/buildtools/blob/master/RunTests.csx#L272
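The steps above might be sketched in PowerShell like this (the property names come from the steps; the backslash-decoding is approximate, and MyBuildTypeId and file.txt are placeholders):

```powershell
# 1. Read the agent-supplied build properties file
$props = Get-Content $env:TEAMCITY_BUILD_PROPERTIES_FILE

# 2. Locate the (backslash-encoded) path of the configuration properties file
$line = $props | Where-Object { $_ -match "^teamcity\.configuration\.properties\.file=" }
$configPath = ($line -split "=", 2)[1] -replace "\\:", ":" -replace "\\\\", "\"

# 3. Read THAT file and decode teamcity.serverUrl the same way
$line = Get-Content $configPath | Where-Object { $_ -match "^teamcity\.serverUrl=" }
$serverUrl = ($line -split "=", 2)[1] -replace "\\:", ":" -replace "\\\\", "\"

# 4. Construct the download URL (build type id and file name are placeholders)
$url = "$serverUrl/httpAuth/repository/download/MyBuildTypeId/.lastSuccessful/file.txt"
```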

Accessing Teamcity Artifact with Dynamic name

I want to download a TeamCity artifact via powershell. It needs to be the last successful build of a specific branch.
I've noticed two common url paths to access the artifacts. One seems to be
/repository/download/BUILD_TYPE_EXT_ID/.lastSuccessful/ARTIFACT_PATH
The problem is that the file name at the end depends on the release version. Within TeamCity there is syntax to specify all files, *.msi. Is there any way to specify an artifact starting with FileName-{version.number}.msi when accessing this URL?
EDIT:
The other url I noticed is for the REST API.
http://teamcity/guestAuth/app/rest/builds/branch:[BRANCH],buildType:[BUILD TYPE],status:SUCCESS,state:finished/artifacts/[BUILD PATH]
The problem is that I can't download artifacts from there. To download the artifacts I have to use the current build id. The URL above yields the following URL: /guestAuth/app/rest/builds/id:[Build ID]/artifacts/content/[Artifact Path] to download the artifact.
I can use the first REST url to eventually get the second through the returned xml, but would prefer a more straightforward approach.
Unfortunately, as TeamCity artifacts are not browsable, the usual workarounds like wget recursive download or wildcards are not applicable:
Using wildcards in wget or curl query
How do I use Wget to download all Images into a single Folder
One workaround you could try is formatting the link in the job, saving the link to a text file and storing that as an artifact as well, with a static name. Then you just need to download that text file to get the link.
I found you can format the artifact URL in TeamCity job by doing:
%teamcity.serverUrl%/repository/download/%system.teamcity.buildType.id%/%teamcity.build.id%:id/<path_to_artifact>
In a command line step. You can write this to a file by doing:
echo %teamcity.serverUrl%/repository/download/%system.teamcity.buildType.id%/%teamcity.build.id%:id/myMsi-1.2.3.4.msi > msiLink.txt
Now you have an artifact with a constant name, that points to the installer (or other artifact) with the changing name.
If you use the artifact msiLink.txt you don't need to use the REST interface (it's still two calls, both through the same interface).
You can easily download the latest version from batch/cmd by using:
wget <url_server>/repository/download/BUILD_TYPE_EXT_ID/.lastSuccessful/msiLink.txt --user #### --password ####
set /P msi_url=<msiLink.txt
wget %msi_url% --user #### --password ####
Hope it helps.
Update:
Sorry I just realized the question asked for PowerShell:
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.Networkcredential("yourUser", "yourPassword")
$WebClient.DownloadFile( "<url_server>/repository/download/BUILD_TYPE_EXT_ID/.lastSuccessful/msiLink.txt", "msiLink.txt" )
$msi_link = [IO.File]::ReadAllText(".\msiLink.txt")
$WebClient.DownloadFile( $msi_link, "yourPath.msi" )

How can I download a report using wget

I am trying to use wget to automate the download of a file which is generated by a report server (PDF format). However, the problem I am having is that the file name is never known (generated randomly by the server), and the URL accepts parameters that change, e.g. Date=, Name=, ID=, etc.
For example, if I were to pass http://url.com/date=&name=&id= in Internet Explorer, I will get a download dialog prompting me to download a file named xyz123.pdf.
Is it possible to use wget to pass these parameters to the report server and automatically download the generated PDF file?
Just put the full URL in quotes (otherwise the shell treats & as a command separator and the parameters are lost) - it should go and fetch the file:
wget "http://url.com/date=foo&name=baa&id=baz"
Thanks,
//P