My company's website is managed and hosted by a third party.
We'd like to provide a portal on the website that allows our clients to upload files directly to a Google Cloud Storage bucket without the file going through the website (these uploads can span thousands of files and several GB).
I've found a good guide for how to do it on AWS (https://softwareontheroad.com/aws-s3-secure-direct-upload/) but can't even determine if the equivalent functionality exists for Google, let alone how to do it.
Has anyone done this before?
Please consider providing some more technical details about what you want to achieve: things like programming language, server platform, the cloud provider where the website is hosted…
Generally speaking, I can tell you that Google Cloud Storage has a similar mechanism for uploading files: signed URLs.
For example, if you are coding in Python, you can generate a signed URL for uploading a file to a bucket like this:
import datetime
from google.cloud import storage
def generate_upload_signed_url_v4(bucket_name, blob_name):
    """Generates a v4 signed URL for uploading a blob using HTTP PUT."""
    # bucket_name = 'your-bucket-name'
    # blob_name = 'your-object-name'

    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(blob_name)

    url = blob.generate_signed_url(
        version="v4",
        # This URL is valid for 15 minutes.
        expiration=datetime.timedelta(minutes=15),
        # Allow PUT requests using this URL.
        method="PUT",
        content_type="application/octet-stream",
    )

    print("Generated PUT signed URL:")
    print(url)
    print("You can use this URL with any user agent, for example:")
    print(
        "curl -X PUT -H 'Content-Type: application/octet-stream' "
        "--upload-file my-file '{}'".format(url)
    )
    return url
Then you can implement this as needed by your organization.
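For completeness, here is a minimal sketch of the client side of that upload, assuming the requests package and a local file named my-file (both are placeholders, not part of the original sample). Note that the Content-Type header must match the content_type the URL was signed with:

import requests

signed_url = generate_upload_signed_url_v4("your-bucket-name", "your-object-name")

with open("my-file", "rb") as f:
    # Must match the content_type used when signing the URL.
    response = requests.put(
        signed_url,
        data=f,
        headers={"Content-Type": "application/octet-stream"},
    )
response.raise_for_status()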
Related
In Google, images are hosted on a CDN-type URL, and I tried to download an image from that CDN, but it throws an error in C#. I used the C# code attached below.
using (var webClient = new WebClient())
{
    byte[] imageBytes = webClient.DownloadData(imageUrl);
    System.IO.File.WriteAllBytes(@"E:\Temp\img2.jpeg", imageBytes);
}
URL: https://lh6.googleusercontent.com/vpsleVfq12ZnALrwbIUqCTa0Fpqa5C8IUViGkESOSqvHshQpKCyOq4wsRfTcadG2WYgcW3m0yq_6M2l_IrSM3qr35spIML9iyIHEULwRu4mWw4CUjCwpVfiWnd5MXPImMw=w1280
Thanks in advance.
In GCP Cloud CDN, you can use signed URLs (with authentication) or signed cookies to authorize users and provide them with a time-limited token for accessing your protected content. Cloud CDN does not block requests that lack a signature query parameter or a Cloud-CDN-Cookie HTTP cookie; it rejects requests with invalid (or otherwise malformed) request parameters.

Because of this, I suggest reviewing your browser client's security settings and how authentication to your CDN URL is managed; some clients store cookies by default if the security policy allows it. Also review how security ingress is configured for your CDN URL, because when you use CDN URLs with signed cookies, responses to signed and unsigned requests are cached separately, so a successful response to a valid signed request is never used to serve an unsigned request.
On the other hand, if you want to use a signed CDN URL to provide secure access to a file for a limited amount of time, there are some steps that you need to follow first:
Ensure that Cloud CDN is enabled.
If necessary, update to the latest version of the Google Cloud CLI:
`gcloud components update`
Creating keys for your signed URLs
To create keys, follow these steps.
1. In the Google Cloud Console, go to the Cloud CDN page.
2. Click Add origin.
3. Select an HTTP(S) load balancer as the origin.
4. Select backend services or backend buckets. For each one:
   - Click Configure, and then click Add signing key.
   - Under Name, give the new signing key a name.
   - Under Key creation method, select Automatically generate or Let me enter.
   - If you're entering your own key, type the key into the text field.
   - Click Done.
   - Under Cache entry maximum age, provide a value, and select a Unit of time from the drop-down list. You can choose among second, minute, hour, and day. The maximum amount of time is three (3) days.
5. Click Save.
6. Click Add.
Configuring Cloud Storage permissions.
Before you run the following command, add at least one key to a backend bucket in your project; otherwise, the command fails with an error because the Cloud CDN cache fill service account is not created until you add one or more keys for the project. Replace PROJECT_NUM with your project number and BUCKET with your storage bucket.
gsutil iam ch \
  serviceAccount:service-PROJECT_NUM@cloud-cdn-fill.iam.gserviceaccount.com:objectViewer \
  gs://BUCKET
To list the keys on a backend service or backend bucket, run one of the following commands:
gcloud compute backend-services describe BACKEND_NAME
gcloud compute backend-buckets describe BACKEND_NAME
Sign URLs and distribute them.
You can sign URLs by using the gcloud compute sign-url command or by using code that you write yourself. If you need many signed URLs, custom code provides better performance.
This command reads and decodes the base64url-encoded key value from the key file, and then outputs a signed URL that you can use for GET or HEAD requests for the given URL.
gcloud compute sign-url \
"https://example.com/media/video.mp4" \
--key-name my-test-key \
--expires-in 30m \
--key-file sign-url-key-file
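If you go the custom-code route mentioned above, here is a minimal Python sketch of the documented signing scheme: append Expires and KeyName to the URL, compute an HMAC-SHA1 over the result with the base64url-decoded key, and append the base64url-encoded digest as Signature. The URL, key name, and key value below are placeholders:

import base64
import hashlib
import hmac
import time

def sign_cdn_url(url, key_name, base64_key, expires_in_seconds=1800):
    # Expiration is a Unix timestamp in seconds.
    expires = int(time.time()) + expires_in_seconds
    separator = "&" if "?" in url else "?"
    url_to_sign = "{}{}Expires={}&KeyName={}".format(url, separator, expires, key_name)
    # The signing key is stored base64url-encoded; decode it before use.
    decoded_key = base64.urlsafe_b64decode(base64_key)
    signature = base64.urlsafe_b64encode(
        hmac.new(decoded_key, url_to_sign.encode("utf-8"), hashlib.sha1).digest()
    ).decode("utf-8")
    return "{}&Signature={}".format(url_to_sign, signature)

# Example call (key name and key value are placeholders):
# print(sign_cdn_url("https://example.com/media/video.mp4", "my-test-key",
#                    "nZtRohdNF9m3cKM24IcK4w=="))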
You can find more information about signed URLs and signed cookies in the Cloud CDN documentation.
You can't download an image that way, since you need to provide an OAuth token, and you need to have the profile scope enabled:
var GoogleAuth; // Google Auth object.
function initClient() {
  gapi.client.init({
    'apiKey': 'YOUR_API_KEY',
    'clientId': 'YOUR_CLIENT_ID',
    'scope': 'https://www.googleapis.com/auth/userinfo.profile',
    'discoveryDocs': ['https://discovery.googleapis.com/discovery/v1/apis']
  }).then(function () {
    GoogleAuth = gapi.auth2.getAuthInstance();
    // Listen for sign-in state changes.
    GoogleAuth.isSignedIn.listen(updateSigninStatus);
  });
}
Can a signed URL generated in one Google Cloud project upload an object into a bucket in another project?
I have used the Google documentation (C# code sample) to upload data to a Google Storage bucket, but I do not want the data to be uploaded to a bucket in a different project.
For example, if the service account used to generate the signed URL belongs to the project "gcp-project-signed-data", then the generated signed URL should only be able to upload data to "gcp-project-signed-data" and not to any other project; if the destination project is different, it should give some access-related error message.
How can we achieve this?
Two things:
Bucket names are in a global namespace: if your signed URL is for my-bucket, there is no other bucket with that name in any other project.
https://cloud.google.com/storage/docs/naming-buckets#considerations
The signed URL reflects the permissions of the original (signing) service account. If the service account does not have permissions to write in other-bucket then any signed URLs created with that account cannot write to other-bucket either.
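As a quick illustration (the bucket and object names are placeholders): a v4 signed URL embeds the exact bucket and object path in the signed request, so editing the URL to point at a different bucket invalidates the signature:

import datetime
from google.cloud import storage

client = storage.Client()
url = client.bucket("my-bucket").blob("my-object").generate_signed_url(
    version="v4",
    expiration=datetime.timedelta(minutes=15),
    method="PUT",
)
# The result is tied to this exact path, e.g.
#   https://storage.googleapis.com/my-bucket/my-object?X-Goog-Algorithm=...
# so it cannot be redirected at a bucket in another project (or any other bucket).
print(url)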
I am trying to create a signed URL for a private object stored in cloud storage.
The storage client is being created using a service account that has the Storage Admin role:
import datetime
from google.cloud import storage

storage_client = storage.Client.from_service_account_json('service.json')
bucket = storage_client.bucket(bucket_name)
blob = bucket.blob(blob_name)
url = blob.generate_signed_url(
    version="v4",
    # This URL is valid for 15 minutes
    expiration=datetime.timedelta(minutes=15),
    # Allow GET requests using this URL.
    method="GET"
)
This generates a URL that when accessed via a browser gives this error:
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.</Details>
</Error>
What am I missing here? The service account has no problem interacting with the bucket or blob normally - I can download the object, etc. It's just the signed URL that doesn't work. I can make the object public and then download it, but that defeats the purpose of being able to generate a signed URL.
All of the other answers I've found seem to focus on issues using application default credentials or are very old examples from the v2 API.
Clearly there's something about how I'm using the service account - do I need to explicitly give it permissions on that particular object? Is the Storage Admin role not enough in this context?
Going crazy with this. Please help!
I want to upload 120 files, each around 1.2GB so about 150GB in total, from an HTTPS website onto my Google Cloud Storage.
I really, really don't want to have to download them all locally, and then upload them individually.
Is there any way around this? Surely I can just give Google Cloud Storage a URL to pull from? I don't control the HTTPS server.
It seems to be possible to upload from S3 to Google Cloud Storage, but S3 seems to suffer from the same problem.
If your website allows public access you can use the GCS Transfer Service to do it: https://cloud.google.com/storage/transfer/
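If you go this route programmatically, here is a minimal sketch using the google-cloud-storage-transfer Python package; the project ID, URL-list location, and destination bucket below are placeholders. The Transfer Service reads a publicly accessible TSV "URL list" whose first line is TsvHttpData-1.0, followed by one source URL per line:

from google.cloud import storage_transfer

def create_http_transfer(project_id, list_url, sink_bucket):
    client = storage_transfer.StorageTransferServiceClient()
    transfer_job = storage_transfer.TransferJob(
        project_id=project_id,
        transfer_spec=storage_transfer.TransferSpec(
            # list_url points to the TsvHttpData-1.0 file of source URLs.
            http_data_source=storage_transfer.HttpData(list_url=list_url),
            gcs_data_sink=storage_transfer.GcsData(bucket_name=sink_bucket),
        ),
        status=storage_transfer.TransferJob.Status.ENABLED,
    )
    job = client.create_transfer_job({"transfer_job": transfer_job})
    # Start a run immediately instead of waiting for a schedule.
    client.run_transfer_job({"job_name": job.name, "project_id": project_id})
    return job

# create_http_transfer("my-project", "https://example.com/urls.tsv", "my-bucket")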
I have a Google Compute VM (LAMP) webserver set up to copy files to a Google Storage Bucket, which then need to be accessed (read and write) by a program on a Google Compute VM (Windows 2008). I can't seem to find any documentation about how a Google Compute Engine Windows VM can access storage buckets.
Is there a way this is possible? Thanks.
I'm doing the same thing, though not with a Windows VM, but I think the principle is the same.
First you need to allow project access for your VM from the Google Cloud Console (https://console.developers.google.com/project).
Once you've done this, you need to call the metadata server from your program to get an access token. You need to make an HTTP call to the metadata server; here is an example from the docs (https://cloud.google.com/compute/docs/authentication) using curl. Bear in mind that when programming this you also need to provide the header "Metadata-Flavor: Google":
$ curl "http://metadata/computeMetadata/v1/instance/service-accounts/default/token" \
  -H "Metadata-Flavor: Google"
{
  "access_token": "ya29.AHES6ZRN3-HlhAPya30GnW_bHSb_QtAS08i85nHq39HE3C2LTrCARA",
  "expires_in": 3599,
  "token_type": "Bearer"
}
You obviously need to code this HTTP call and the parsing of the JSON data in whichever programming language you are using, and extract the "access_token" field. Based on the "expires_in" field, you might also need to implement a mechanism to fetch a new token once the current one expires. You can then use the Google-supplied Cloud Storage client library for your language (https://cloud.google.com/storage/docs/json_api/v1/libraries), using the access token above to authenticate your calls to Cloud Storage. I use Java, and the Cloud Storage class in the API library has this method that can be used:
.setOauthToken("blah")
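For illustration, here is a minimal Python sketch of the same flow; the bucket name my-bucket is a placeholder, and the requests package is assumed:

import requests

METADATA_TOKEN_URL = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/token"
)

def get_access_token():
    # The Metadata-Flavor header is mandatory, as noted above.
    resp = requests.get(METADATA_TOKEN_URL, headers={"Metadata-Flavor": "Google"})
    resp.raise_for_status()
    return resp.json()["access_token"]

# Use the token against the Cloud Storage JSON API, e.g. to list objects:
token = get_access_token()
listing = requests.get(
    "https://storage.googleapis.com/storage/v1/b/my-bucket/o",
    headers={"Authorization": "Bearer " + token},
)
print(listing.json())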
You can mount the drive with CloudBerry. I would like to find a better way to do it, though, using only Google Cloud. Please let me know if you find anything better.