Google storage upload files in javascript - google-cloud-storage

I'm looking for the absolute minimal requirements to upload a file to a storage bucket. And the minimal required code.
The minimal oAuth/API-key information I have is from console.cloud.google.com/apis/credentials/oauthclient/
clientId: Client ID from Credentials page.
apiKey: Client secret from Credentials page?
scopes: 'https://www.googleapis.com/auth/devstorage.read_write' (Why is it called devstorage? As if it's intended only for devs? Or is it short for device?)
I'm not sure if the apiKey is right, but using this from an example doesn't throw an error:
// This is triggered after the `client.js` loads
function handleClientLoad() {
  gapi.client.setApiKey(apiKey);
  window.setTimeout(checkAuth, 1);
}
function checkAuth() {
  gapi.auth.authorize({client_id: clientId, scope: scopes, immediate: true}, handleAuthResult);
}
So, that's step one, but correct me if I'm wrong.
Now to handle Auth Result:
function handleAuthResult(authResult) {
  if (authResult && !authResult.error) {
    makeApiCall();
  }
}
Here I'm kind of stuck; there's an example I'm working off of: https://developers.google.com/api-client-library/javascript/start/start-js
There it stops working when it gets to gapi.client.plus.people, where people is not defined. This is because I have an anonymous user and sort of a public API key. Later I will implement per-user ACLs, but for now I just need it to work/upload.
Now, I've got something similar working on Amazon Cloud, with a simple jQuery-based widget that can upload files, where I only needed to input their API-key and the bucket name basically (which was in PHP unfortunately).
I would be happy with just a simple <form>, but the examples I encounter have a lot more information/fields in them than the three minimal values above (clientId, apiKey, scopes) plus the URL of the bucket (strange encrypted ACL strings, for example).
I understand that bucket-name.storage.googleapis.com is where the files end up, and that works when I manually upload images.
I'm now looking for the absolute minimal piece of code, preferably JavaScript using the Google client, so something like this (which I found in another example):
gapi.client.request({
  'path': '/upload/storage/' + API_VERSION + '/b/' + BUCKET + '/o',
  'method': 'POST',
  'params': {'uploadType': 'media'},
  'headers': {
    'Content-Type': 'multipart/mixed; boundary="' + boundary + '"'
  }
});
Do I need API_VERSION? How do I find out which path to use? I know my bucket name, but where did the upload/storage/ come from? and /b/? (although https://cloud.google.com/storage/docs/json_api/v1/how-tos/upload shows that I need it)
The example used a header with a boundary, but I cannot find anything about that, and I probably don't need it either. The problem is that I can't find the minimally required headers anywhere.
I came across 'x-goog-acl': 'public-read', does this need to be in the headers?
So to sum up, the questions I'd like answered:
What headers would I minimally need to upload to a bucket with allUsers access set to Owner? (I know, not the best idea; this will change in the future.)
Any example code showing how to get a file from an HTML form and either use AJAX to send the request or use the gapi client, which I think is meant to do that.
I hope my problem is clear; I basically don't know where to start or find the right code examples, although this might have to do with the fact that the Google JavaScript client API is in beta.

What headers would I minimally need to upload to a bucket with allUsers access set to Owner? (I know, not the best idea; this will change in the future.)
You are correct! This is not a good idea. If anonymous users own your bucket, that means they can delete any object in it, overwrite existing objects, and otherwise cause expensive mischief. Nonetheless, the answer to your question:
<form action="https://storage.googleapis.com/YOUR_BUCKET_NAME"
      method="post" enctype="multipart/form-data">
  <input name="key" type="text" value="objectName.txt" /><br/>
  <input name="file" type="file" /><br/>
  <input type="submit" value="Upload!" />
</form>
This is the absolute minimum required to upload an object. Note that it doesn't redirect you anywhere or otherwise indicate success. For that, you'd want to use the success_action_redirect parameter, but you asked for the absolute minimum.
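If you later do want the JavaScript route you were attempting, here is a rough, untested sketch along the lines of your gapi.client.request snippet. It assumes the v1 JSON API, that gapi.auth.authorize() has already succeeded for the devstorage.read_write scope, that BUCKET holds your bucket name, and it uses a made-up file input id ('file-input'):
// Sketch only: upload the file chosen in <input type="file" id="file-input">
// using a JSON API media upload. 'v1' is what goes in API_VERSION.
function uploadSelectedFile() {
  var file = document.getElementById('file-input').files[0];
  var reader = new FileReader();
  reader.onload = function (e) {
    gapi.client.request({
      'path': '/upload/storage/v1/b/' + BUCKET + '/o',
      'method': 'POST',
      'params': {'uploadType': 'media', 'name': file.name},
      'headers': {'Content-Type': file.type || 'application/octet-stream'},
      'body': e.target.result
    }).execute(function (response) {
      console.log('Upload response:', response);
    });
  };
  // Fine for text files; binary payloads generally need base64 or a
  // multipart/related request rather than a plain media upload like this.
  reader.readAsText(file);
}
With uploadType=media the object name goes in the name query parameter, so no multipart boundary header is needed in this simple case.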

Related

Generate unpredictable/unforgeable URL from predictable ID

I have a simple API that returns something for a given ID, and it must be used without any kind of authentication. The URL should be permanent, and yet I want to avoid it being botted as much as possible.
The Url is something like this:
https://url/{SomeId}/doSomething
The problem is that this is very predictable, and a bot could easily try all the IDs and get everything associated with them.
I'm looking for a way to make the URL unpredictable, for example:
https://url/{SomeId}/doSomething?Key=SomeVeryLongCryptographicKey
This way, unless you run a supercomputer, you shouldn't be able to deduce all the URLs.
I know that there are probably a lot of ways to do that, like using a DB, which I want to avoid.
I guess I'm looking for a kind of JWT associated with the ID, without expiration, but maybe there is a better solution.
Edit: to give a better example, I'm looking to do a bit like Zoom did for permanent invitation links. They had predictable room IDs and added a password making them unpredictable, like so:
https://us05web.zoom.us/j/132465789?pwd=SUxIU0pLankyhTRtWmlGbFlNZ21Ndz08
What would be the best/light/"secure" way to achieve that?
Also, I'm using .NET; if there is a library doing that, it would be great.
I think your idea of using a JWT makes the most sense. Better to use something standard from a cryptographic point of view, and the JSON format allows for encoding whatever you need to provide to the receiving endpoint (user names, entity names, entity IDs, other things).
There are standard Microsoft libraries for building and validating JWTs, but I prefer the library Jwt.Net (https://www.nuget.org/packages/JWT). It lets you do something like this quite easily:
var token = JwtBuilder.Create()
    .WithAlgorithm(new RS256Algorithm(publicKey, privateKey))
    .AddClaim("uri", String.Format("https://example.com/api/{0}/{1}", entityName, entityId))
    .Encode();
Just add whatever claims you like, and the JWT will then contain what you want to transfer (I've used an example of the URI that you want to give to the entity) and a signature with your private key. You could even just give a URL like https://example.com/from_token/eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiJ9.eyJ1cmkiOiJodHRwczovL2V4YW1wbGUuY29tL2FwaS90ZXN0RW50aXR5LzEyMzQifQ.K2P4wSF6g1Kt-IHMzlklWTV09_MIkoiNHQztSIYOohmOWe7aBfFSQLIKSbdTECj9XPjNNG_AjH9fdjFglkPuYfr2G9rtl2eh5vTjwdM-Uc0X6RkBu0Z2j7KyMKjkaI3zfkIwhtL1mH873xEBtNOGOW18fuBpgnm8zhMAj1oD3PlDW8-fYBrfLb6VK97DGh_DyxapbksgUeHst7cAGg3Nz97InDPtYcWDi6lDuVQsj3t4iaJBRL8IM785Q8xjlHHhzdfcX3xU4IhflyNHHXxP56_8ahNNMOZKWdwgbTSIxEEB98b3naY3XknC-ea7Nc1y4_5fszrYdy3LaQWV43jpaA
and have the handler decode the entity name and ID you want to retrieve directly from the URI while verifying the signature. Decoding with the library is just as easy:
var json = JwtBuilder.Create()
    .WithAlgorithm(new RS256Algorithm(_key))
    .MustVerifySignature()
    .Decode(token);

PHP CSRF sitewide solution - Is it good enough

I have a PHP-based document management system that I wrote a while back to store my family's paperwork. Although the full code is not publicly available, it would still be a good idea to protect it against CSRF attacks.
Would this be a suitable solution for implementing sitewide CSRF protection:
All forms are protected against being directly accessed, so they all have to be accessed via index.php, this is done with the following code at the top of each form PHP file:
if (basename(__FILE__) == basename($_SERVER['PHP_SELF'])) {
    header("Location: ../../");
}
In the index.php file, which every form must go through to be accessed, I have the following code to create a session token if it doesn't exist and then check any POST data containing a token (this is done before any possible calls to a form):
// Generate session token for CSRF
if (empty($_SESSION['token'])) {
    $_SESSION['token'] = bin2hex(random_bytes(32));
}
if (isset($_POST['token'])) {
    if (!hash_equals($_SESSION['token'], $_POST['token'])) {
        echo 'There is a problem with your session token.';
        exit;
    }
}
Form validation contains the following code to check that a session token has been set, to ensure the session checking in index.php has run:
if (isset($_POST['token']) && $_POST['token'] == $_SESSION['token']) {
    // Process form data
}
Finally, each form would include a hidden element like so:
<input type="hidden" name="token" value="<?php echo $_SESSION['token']; ?>" />
As far as I can see this should be suitable, but I am not an expert on CSRF attacks in PHP.
My questions are:
1) Is the first block of code a suitable method of blocking access to the file, other than being included in another PHP file on the same server?
2) Is this method of implementing a token suitable to stop CSRF attacks?

How to get the current user using jsonwebtoken in Sails.js?

I've been working with Sails for a couple of weeks now. I came from Rails, and I don't have any experience working with Node.js.
Now I'm trying to make a robust token authentication using jsonwebtoken.
https://github.com/auth0/node-jsonwebtoken
I followed this guide http://thesabbir.com/how-to-use-json-web-token-authentication-with-sails-js/ and everything worked fine.
I'm able to make a sign up, sign in and then use the token correctly for different actions.
Now, there are some actions where I'd like to use the logged-in user,
something like Devise's current_user helper.
For example, when creating a comment, this comment should belong to the current user.
Using Sabbir Ahmed's guide, on line 33 of the isAuthorized.js policy the token gets decrypted, so I can get the current user's id from there.
So, my question is, what should be the best way to get the current user and be able to use it later in some controller?
For example I tried something like:
// isAuthorized.js line 34, after getting the decrypted token
User.findOne({id: token.id}).exec(function findOneCB(err, found) {
  currentUser = found;
});
But this way, because it is an async action, I can't use this currentUser in a controller.
I want to store the current user in order to be able to use it later in some controller without repeating the same code in each controller, something like a helper or maybe a service.
The trick is where you place the next(). Since you are making an async call, control should only be transferred to the next policy/controller once the database action is completed.
You should modify the policy to:
User.findOne({id: token.id}).exec(function findOneCB(err, found) {
  if (err) return next(err);
  req.currentUser = found;
  next();
});
And you should be able to access the user details in controllers that use the isAuthorized policy via req.currentUser.
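For instance, a hypothetical comment controller (the controller and model names are made up for illustration, not part of the question) could then attach the new record to that user:
// api/controllers/CommentController.js (sketch only): assumes the isAuthorized
// policy above has already set req.currentUser for this route.
module.exports = {
  create: function (req, res) {
    Comment.create({
      body: req.param('body'),
      owner: req.currentUser.id
    }).exec(function (err, comment) {
      if (err) return res.serverError(err);
      return res.json(comment);
    });
  }
};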
If by
For example, when creating a comment, this comment should belong to the current user.
what you mean is certain attributes like username and country, then rather than querying the database after verification, what you can choose to do is send these additional attributes to jwToken.issue in api/controllers/UsersController.js, e.g.:
jwToken.issue({
  id: user.id,
  username: user.name,
  country: user.country
})
How that helps: you can keep api/policies/isAuthorized.js as is, and in all the controllers you use in the future you can access the payload values as
token.username or token.country
instead of having to query the database again, thereby saving valuable response time.
Beware, however, of the data you choose to send in the token (you could also send {user: user} if you want to): since the secret key or hashing is not required to decode the payload, as you can see at jwt.io, you might want to exercise restraint.

How to extend res.json in sailsjs

I need to extend res.json so that the response goes out as text with a CSRF token, e.g.
&&&CSRF&&&{foo:bar}
Sails seems to use a different CSRF methodology, but I need to do it this way to match the preexisting client-side codebase.
Ideally I need to be able to create a new function:
return res.jsonWithCsrf({
  foo: bar
});
Internally this would call res.json but would wrap the CSRF token around the response.
I gather that I need to write a hook but am unsure how to do it.
You can create custom responses by placing your file in the api/responses directory.
You can see the files that are already there, modify them if you want, or create your own.
If you were to create jsonWithCsrf.js in that folder, then you can access it in the manner you describe above.
res.jsonWithCsrf()
http://sailsjs.org/#!/documentation/concepts/Custom-Responses
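As a rough sketch (untested, assuming the &&&CSRF&&& prefix format from the question and Sails binding this.req/this.res inside custom responses), api/responses/jsonWithCsrf.js might look something like this; exactly how the token value itself is embedded depends on what the existing client-side code expects:
// api/responses/jsonWithCsrf.js (sketch only). Sails binds the current
// request/response objects to `this` inside custom response functions.
module.exports = function jsonWithCsrf(data) {
  var res = this.res;

  // Serialize the payload and prepend the marker expected by the existing
  // client-side code, then send it as plain text instead of JSON.
  var body = '&&&CSRF&&&' + JSON.stringify(data);
  res.set('Content-Type', 'text/plain');
  return res.send(body);
};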

Authentication That Doesn't Require Javascript?

I have a Web API app, initialized thusly:
app.UseCookieAuthentication();
app.UseExternalSignInCookie(DefaultAuthenticationTypes.ExternalCookie);
app.UseOAuthBearerTokens(OAuthOptions);
app.UseGoogleAuthentication();
For calls to most controllers, it works great. However, it also requires a bit of JavaScript before client-side service calls are made:
function getSecurityHeaders() {
  var accessToken = sessionStorage["accessToken"] || localStorage["accessToken"];
  if (accessToken) {
    return { "Authorization": "Bearer " + accessToken };
  }
  return {};
}
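These headers then get attached to each client-side service call, for example (an illustrative jQuery call, not the exact code we use):
// Illustrative usage only: attach the bearer token header to an AJAX call.
$.ajax({
  url: "/api/values",
  type: "GET",
  headers: getSecurityHeaders()
}).done(function (data) {
  console.log(data);
});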
The problem is that we have a certain type of controller (one that accesses files) where no javascript can be run during the call. For example, the call might be to:
http://mysite/mycontroller/file/filename.jpg
...where the value is assigned as the src attribute of an img tag. The call works, but Thread.CurrentPrincipal.Identity is unauthenticated with a null name, so there's currently not a way to enforce security.
I'm new to Web API, so it may be a dumb question, but what's the way around this? What switches do I need to flip so that JavaScript isn't required to add security headers? I was considering trying to find a way to force an authorization header in an IAuthorizationFilter or something, but I'm not even sure that would work.
So I figured out the solution to my problem.
First, I needed to configure the app to use an authentication type of external cookies thusly:
//the line below is the one I needed to change
app.UseCookieAuthentication(new CookieAuthenticationOptions { AuthenticationType = DefaultAuthenticationTypes.ExternalCookie });
app.UseExternalSignInCookie(DefaultAuthenticationTypes.ExternalCookie);
app.UseOAuthBearerTokens(OAuthOptions);
app.UseGoogleAuthentication();
Second, it turned out there was a line of code in my WebApiConfig file that was disabling reading the external cookie:
//this line needed to be removed
//config.SuppressDefaultHostAuthentication();
After that, I could see the external cookie from Google, which passed along an email address I could identify the user with.