Gerrit Code Review - How to download a file with wget using the HTTP password?

I'd like to download a single file from a Gerrit server that has no read access for anonymous users.
Therefore I set the HTTP password for the user and tried something like:
wget --user=user --password=passwd "http://example.com:8443/gitweb/?p=...;a=blob_plain;f=...;hb=refs/heads/master"
HTTP request sent, awaiting response... 401 Unauthorized
Unknown authentication scheme.
Is this possible at all using the HTTP password generated in the user settings of Gerrit?
Thank you!

Yes, it is possible to use the HTTP password generated in the user settings. Have a look at the Authentication documentation here.
However, Gerrit Code Review doesn't allow you to download a single file (unless you have a particular plugin installed to do so). Are you sure downloading a single file is what you want?
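For reference, a hedged sketch of how the generated HTTP password is normally used: Gerrit serves authenticated git and REST traffic under the /a/ URL prefix, so something like the following works for fetching a whole repository (host, port, and project name are placeholders):
# The /a/ prefix routes the request through Gerrit's HTTP authentication;
# user and httppass come from the HTTP Password page in the user settings.
git clone "http://user:httppass@example.com:8443/a/project.git"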

The following works for me:
wget "http://<your-gerrit>:8080/gitweb?p=<your-repository>.git;a=blob_plain;f=path/to/file" -O file

Related

Download GitHub artifact from URL using wget

I am trying to follow these docs to download an artifact from GitHub using GitHub's API:
https://docs.github.com/en/rest/actions/artifacts#download-an-artifact
I ran the curl command given in the docs, and it gave me the following URL from which to download the artifact (I have replaced the specifics with ...):
https://pipelines.actions.githubusercontent.com/serviceHosts/..../_apis/pipelines/1/runs/16/signedartifactscontent?artifactName=my-artifact&urlExpires=....&urlSigningMethod=HMACV2&urlSignature=....
I am able to download the artifact by putting the URL into my browser (it automatically downloads when the URL is visited). However, when I tried to use wget to download it from the console, I got this error:
wget https://pipelines.actions.githubusercontent.com/... # the command I ran
HTTP request sent, awaiting response... 400 Bad Request # the error I got
How can I download a zip file from the console? Should I use something other than wget?
I'd like to clarify that viewing this link in the browser is possible even when not logged in to GitHub (or when in private browsing). Also, I can download the zip file at the link as many times as I like before the link expires after 1 minute. My repo is private, which is necessary for my work. I need to use an access token for the curl command as described in the docs; however, the link that is returned does not require any authentication when accessed via a browser.
The API docs seem a bit ambiguous here. It is possible that the redirect can only be accessed a single time, in which case you should generate the redirect URL and immediately fetch it with wget. You can then unzip the file using the unzip command.
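A hedged sketch of that flow, using the documented artifact-download endpoint (owner, repo, artifact ID, and token are placeholders):
# Ask the API for the signed download URL (returned as a 302 redirect),
# then fetch it and unzip the result.
url=$(curl -s -o /dev/null -w '%{redirect_url}' \
  -H "Authorization: token $GITHUB_TOKEN" \
  "https://api.github.com/repos/<owner>/<repo>/actions/artifacts/<artifact_id>/zip")
wget -O artifact.zip "$url"
unzip artifact.zip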
If that is not the case I believe this statement in the api docs is key:
Anyone with read access to the repository can use this endpoint. If the repository is private you must use an access token with the repo scope. GitHub Apps must have the actions:read permission to use this endpoint.
My guess is that your repository is private and you are logged in to GitHub in the browser, which authenticates you and is why you are able to download from the redirect link. I would suggest trying from incognito mode to test this.
Making the repository public would let you bypass this issue. Alternatively, you can pass the authentication token as a header to wget, like so, in order to authenticate with the server and pull the file.
# Store the whole --header=... option as a single string, then pass it quoted.
header='--header=Authorization: token <TOKEN>'
wget "$header" https://pipelines.actions.githubusercontent.com/... -O output_file
The problem was that I didn't put quotes around my URL; without quotes the shell treats each & in the query string as a background operator and truncates the URL. I needed to do this:
wget "https://pipelines.actions.githubusercontent.com/serviceHosts/..../_apis/pipelines/1/runs/16/signedartifactscontent?artifactName=my-artifact&urlExpires=....&urlSigningMethod=HMACV2&urlSignature=...."

How to prevent raw GitHub tokens from expiring

I want to fetch a file from a GitHub repository and read it from my application using an HTTP request URL.
I tried using the HTTP URL below.
Example:
https://raw.github.xxx.com/sample-repo/config-details/master/conf/application-dev.conf?token=ABAD5JUZ7O84U7CIKEU4MGY5UWJU6
It worked well, but after a week the token expired, so now I get "404: Not Found" when I try to fetch the file.
Is there any way to prevent the token from expiring or any better solution to solve this problem?
Update:
Actually I am trying to implement a remote config server in play framework.
I am using play-rconf-http (a library which helps fetch a config file hosted on an HTTP server).
I set the remote file URL in my app's config file as below:
remote-configuration.http.url = "https://raw.github.xxx.com/sample-repo/config-details/master/conf/application-dev.conf?token=ABAD5JUZ7O84U7CIKEU4MGY5UWJU6"
So while my server starts it will fetch the file from the remote server and load the configuration.
It was working as I expected, but after a few days the tokens expire, so I need to solve this.
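One hedged workaround sketch, not from the original thread: fetch the file through the REST contents API with a personal access token instead of the expiring ?token= link. The Accept header for raw content is standard; the Enterprise API base path (/api/v3) and all repo names are assumptions derived from the URL above:
# Fetch the raw file via the contents API; $PAT is a personal access token.
curl -sf \
  -H "Authorization: token $PAT" \
  -H "Accept: application/vnd.github.v3.raw" \
  "https://github.xxx.com/api/v3/repos/sample-repo/config-details/contents/conf/application-dev.conf?ref=master"
Unlike the per-file raw token, a personal access token does not change when the file is updated, though it must be kept secret and may have its own expiry policy.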

How can I validate links within a private GitHub repository

Background
We are writing some documentation for our support team.
We want to include links to files that are stored in private GitHub repositories.
We do not want the documentation to become stale if somebody refactors the code in the private GitHub repositories, so instead I am setting up a CI job that parses the documentation (with jsoup if you are interested) and finds all the links.
Once we have all the links we start checking them.
NOTE: we have written a custom link checker, because one of the critical sets of links we have is for our monitoring solution, and sadly (also understandably) the SaaS we are using returns 404s for any unauthenticated requests on the URLs of the alerts.
The SaaS itself uses a 2FA to access the Web UI, so what we have ended up doing is parsing the URLs and then constructing an equivalent call to the SaaS API to validate the link.
For the monitoring system we use, this is easy: all the URLs are the same format.
Question
Can we validate a random GitHub URL as valid (ideally using only curl - I can translate to my chosen HTTP client from there, and curl gives a more generic answer) using a Personal Access Token? And if so, how?
The URLs could be:
simple direct to repo URLs: https://github.com/<org>/<repo>
direct to branch URLs: https://github.com/<org>/<repo>/tree/<branch>
file URLs: https://github.com/<org>/<repo>/blob/<path/to/file>
diff URLs: https://github.com/<org>/<repo>/compare/[<branch>...]<branch>
other URLs that are based on the presence of the repo and do not vary in child path, e.g. https://github.com/<org>/<repo>/pulls, https://github.com/<org>/<repo>/settings/collaboration, etc
plus who knows what other URLs people will add within the docs...
Things I have tried that didn't work
HTTP Basic authentication with the Personal Access Token as the password, e.g.
curl -I -u stephenc:2....token.redacted....b https://github.com/stephenc/<repo-name>
HTTP/1.1 404 Not Found
HTTP Bearer authentication, e.g.
curl -I -H "Authorization: bearer 2....token.redacted....b" https://github.com/stephenc/<repo-name>
HTTP/1.1 404 Not Found
It looks like it works for some URLs (no idea which ones).
I can access https://raw.githubusercontent.com/agentgonzo/repo/path/to/file with curl -u agentgonzo:$TOKEN, using the API token as the username, but the same doesn't work on https://github.com URLs. Not sure if this will help you or not.
I got an answer from GitHub Support: No
Since a personal access token won't work for GitHub web UI URLs, no, there isn't a way to verify all possible GitHub private repo URLs without making API calls in some cases.
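For the cases where an API call does work, a hedged sketch of checking a plain repo URL (the repos endpoint is the standard REST API; the URL parsing is illustrative):
# Map https://github.com/<org>/<repo> to the REST API and check the status code.
url="https://github.com/<org>/<repo>"
path="${url#https://github.com/}"   # strip the web prefix, leaving <org>/<repo>
status=$(curl -s -o /dev/null -w '%{http_code}' \
  -H "Authorization: token $TOKEN" \
  "https://api.github.com/repos/${path}")
[ "$status" = "200" ] && echo "link OK" || echo "broken link ($status)"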

Access raw file on GitHub Enterprise without user having to create token

I have a repo with a shell script and want to put a single command to run it in the readme file, like:
bash <(curl -L <path_to_raw_script_file>)
Raw file URLs for GitHub Enterprise look like this: https://raw.github.ibm.com/<user>/<repo>/<branch>/<path_to_file>?token=<token>, where <token> is unique to the file and generated when accessing it via the Raw button in the repository or with the ?raw=true suffix in the URL.
The problem is, tokens get invalidated after a few days or when the file is updated, and I wouldn't like to update the mentioned command each time the token becomes invalid. Is there a way to deal with this?
I know there is a way for the user to create a personal token and use it to log in to GitHub from the machine he's running the script from, but I wanted to keep it as simple as possible.
I was thinking of something like auto-generating that raw file URL (since a user reading the readme file on GitHub surely does have access to the script located in the same repo), but I am not sure if that's possible.
No input, one-liner.
You can get this link by clicking the Raw button in the GHE UI; just remove the token query param at the end.
curl -sfSO https://${USER}:${TOKEN}@${GHE_DOMAIN}/raw/${REPO_OWNER}/${REPO_NAME}/${REF}/${FILE}
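For instance, a hedged usage sketch with placeholder values (assuming the GHE instance exposes raw content at /raw/ on the main domain, as the one-liner above does):
# All values are placeholders; set them before the URL is expanded.
USER="jdoe"
TOKEN="ghp_exampletoken"
GHE_DOMAIN="github.example.com"
REPO_OWNER="team"
REPO_NAME="tools"
REF="main"
FILE="scripts/setup.sh"
curl -sfSO "https://${USER}:${TOKEN}@${GHE_DOMAIN}/raw/${REPO_OWNER}/${REPO_NAME}/${REF}/${FILE}"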
I believe you'll always need the tokens; however, if you'd like to automate the process, you can dynamically request tokens associated with a GitHub OAuth app and not associated with any user profile.
https://developer.github.com/enterprise/2.13/apps/building-oauth-apps/authorizing-oauth-apps/
I know there is a way for the user to create a personal token and use it to log in to GitHub from the machine he's running the script from, but I wanted to keep it as simple as possible.
Actually, when using GCM (Git Credential Manager), the PAT will be provided when accessing the raw.xxx URL.
But only with GCM v2.0.692, which supports those URLs. See PR 599.
Fix GitHub Enterprise API URL for raw source code links
This is a simple fix of #598 for GitHub Enterprise instances that use a raw. hostname prefix for raw source code links.
I've verified this fix locally by swapping out the GitHub.dll that is used by Visual Studio.
It now checks for 'raw.' in the hostname and removes it to get the correct GHE API URL.

Facebook Product Feed Schedule using ShareFile FTP Link - curl error 56

I'm attempting to set up a scheduled fetch for a product feed in Facebook Business Manager. From what I can see, I can provide an FTP link to Facebook along with credentials and it should pick up the file.
The FTP provider I'm using is ShareFile, with the appropriate account given access to the folder the file is in. I confirmed I can use these credentials and connect to the FTP using FileZilla, so that portion should be solid. The setup in FileZilla is as follows:
Host: host.sharefileftp.com
Protocol: FTP
Encryption: Require implicit FTP over TLS (port 990)
User: domain\facebookfeed@host.com
Password: ######
However, when I put the same credentials into Facebook Business Manager for the feed upload, I get the following error:
Fetching product feed from FTP server failed due to unknown reason (Curl error code: 56). Please help us by reporting the problem. You may be able to try again.
All I can find on this error is from the cURL error page:
CURLE_RECV_ERROR (56)
Failure with receiving network data.
I saw a mention somewhere about needing to whitelist Facebook's IPs within ShareFile, but I can't for the life of me find that page.
FTP Details are set up in Facebook as follows:
URL: ftp://host.sharefileftp.com:990/Path/To/File.csv
Username: domain\facebookfeed@host.com
Password: ######
I've been unsuccessful getting anywhere else with the Facebook documentation and I can't find a way to contact their support directly. If anyone has experience with this any help is appreciated!
EDIT: Trying this just with a direct cURL and getting the same results:
$ curl -u 'domain\facebookfeed@host.com' "ftp://host.sharefileftp.com:990/Path/To/File.csv"
Enter host password for user 'domain\facebookfeed@host.com':
curl: (56) response reading failed
So at this point it's probably just an issue with ShareFile itself, nothing to do with Facebook.
As @jared pointed out in the comments, using ftps did allow the cURL to work:
$ curl -u 'domain\facebookfeed@host.com' "ftps://host.sharefileftp.com:990/Path/To/File.csv"
However, Facebook doesn't support ftps; it only supports ftp or sftp, and ShareFile doesn't support sftp.
The plain ftp I originally tried didn't work because the ShareFile account used didn't have plain ftp enabled. Once that was turned on, Facebook was able to connect to ShareFile and grab the file successfully.
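For reference, a hedged way to reproduce that check from the console (the standard plain-FTP port 21 is an assumption; credentials and path are the same placeholders as above):
# Plain FTP on the default port 21 instead of the implicit-TLS port 990.
curl -u 'domain\facebookfeed@host.com' "ftp://host.sharefileftp.com:21/Path/To/File.csv" -O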