I found a public bucket on the web that contains some files which everybody can view or download. I want to create the same thing, where I can upload similar files.
Here is what the permissions of this public bucket look like:
Unfortunately I cannot create the same thing; I've searched the whole web but haven't found any step-by-step explanation.
Could anybody help me with this?
According to the official documentation, Making Data Public:
Open the Cloud Storage browser in the Google Cloud Platform Console.
In the list of buckets, click the name of the bucket that contains the object you want to make public, and navigate to the object if it's in a subdirectory.
Click the drop-down menu associated with the object that you want to make public.
The drop-down menu appears as three vertical dots to the far right of the object's row.
Select Edit permissions from the drop-down menu.
In the overlay that appears, click the + Add item button.
Add a permission for allUsers.
Select User for the Entity. Enter allUsers for the Name. Select Reader for the Access.
Click Save.
From the gsutil command line (preinstalled in Cloud Shell), you can set this up with these commands:
Make the bucket's contents publicly listable:
gsutil acl ch -g allUsers:R gs://bucketname
Make all objects in the bucket readable by anonymous users:
gsutil acl ch -g allUsers:R gs://bucketname/**
Make all new objects in the bucket publicly readable by default:
gsutil defacl ch -g allUsers:R gs://bucketname
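Once these ACLs are in place, each object is served anonymously at a predictable URL. As a quick sanity check (bucketname and object.txt are placeholders here):

```shell
# Quick check of the public URL format; bucketname and object.txt are placeholders.
BUCKET="bucketname"
OBJECT="object.txt"
PUBLIC_URL="https://storage.googleapis.com/${BUCKET}/${OBJECT}"
echo "${PUBLIC_URL}"
# curl -O "${PUBLIC_URL}"   # an anonymous download should now succeed
```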
Finally, standard warning about public buckets: anyone is allowed to download resources from this bucket, and the bill will go to the bucket owner, who is you. This could end up being expensive for you if, for instance, you upload a popular video and 10 million people download it. If you want to make an object publicly accessible but want the downloaders to pay for it, you may prefer to use a feature like Requester Pays: https://cloud.google.com/storage/docs/requester-pays
I need to upload a file to OneDrive, via the command line. This will be done through a batch file which is distributed to end users.
From searching on Stack Overflow, I find questions like this one which say that you need to register an app and create an app password, using Azure. I don't have the necessary permissions to do this in the organization where I work, nor can I do anything that requires an admin account. So I can't install any software - I have to use what comes with Windows 10. I can't use VBA either, as that's blocked.
I've managed to download files from OneDrive without anything like that, using the process described here:
Open the URL in a browser.
Open Developer options using Ctrl+Shift+I.
Go to Network tab.
Now click on Download. Saving the file isn't required; we only need the network activity while the browser requests the file from the server.
A new entry will appear which would look like “download.aspx?…”.
Right click on that and Copy → Copy as cURL.
Paste the copied content directly in the terminal and append '--output file.extension' to save the content to file.extension, since the terminal isn't capable of showing binary data.
Example:
curl https://xyz.sharepoint.com/personal/someting/_layouts/15/download.aspx?UniqueId=cefb6082%2D696e%2D4f23%2D8c7a%2
…. some long text ….
cCtHR3NuTy82bWFtN1JBRXNlV2ZmekZOdWp3cFRsNTdJdjE2c2syZmxQamhGWnMwdkFBeXZlNWx2UkxDTkJic2hycGNGazVSTnJGUnY1Y1d0WjF5SDJMWHBqTjRmcUNUUWJxVnZYb1JjRG1WbEtjK0VIVWx2clBDQWNyZldid1R3PT08L1NQPg==;
cucg=1’ --compressed --output file.extension
I tried to do something similar after clicking 'upload' on the browser, but didn't find anything useful when trying to filter the requests.
I found these two questions but there is no keyboard shortcut to upload, AFAICT. Also the end user will be uploading a file to a folder I've shared with them from my OneDrive. Opening Chrome or Edge as a minimised window is fine, but I can't just shove a window in their face which automatically clicks on things - they won't like that.
It's just occurred to me that I might be able to use an Office application to Save As the file to the necessary OneDrive folder, where the keyboard shortcuts are pretty stable, but I have no idea how to achieve that via the command line.
I think the best and most secure way to accomplish this is the REST API for OneDrive.
(Small Files <4MB)
https://learn.microsoft.com/en-us/onedrive/developer/rest-api/api/driveitem_put_content?view=odsp-graph-online
(Large files)
https://learn.microsoft.com/en-us/onedrive/developer/rest-api/api/driveitem_createuploadsession?view=odsp-graph-online
You still need an Azure AD app registration (which your admin should be able to configure for you) to provide API access to services in Azure. Coding against the API is going to be far easier and less complicated, not to mention more versatile.
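As a rough sketch, the small-file (under 4 MB) upload from the first link comes down to a single PUT request. TOKEN (acquired via the app registration) and report.xlsx are placeholders:

```shell
# Sketch: upload a small file to the root of the signed-in user's OneDrive.
# TOKEN and report.xlsx are placeholders.
FILE="report.xlsx"
UPLOAD_URL="https://graph.microsoft.com/v1.0/me/drive/root:/${FILE}:/content"
echo "${UPLOAD_URL}"
# curl -X PUT -H "Authorization: Bearer ${TOKEN}" --data-binary "@${FILE}" "${UPLOAD_URL}"
```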
I have a Google Form that uploads some pictures that I want to be visible to anyone with a web browser. Unfortunately they all arrive as private. Is there any way to have them default to public instead of having to go through and mark them individually?
You simply need to share the folder.
Click View Folder on the bottom-right corner of the file upload question
Change the permission of the folder to Anyone with the link *
*The folder in 1. is the subfolder of the question you selected.
In case there are multiple file upload questions and you want to share all of them, change the permission of the parent folder instead.
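If you would rather script the sharing step than click through the UI, the Drive v3 permissions endpoint can make a folder readable by anyone with the link. FOLDER_ID (from the folder's URL) and TOKEN are placeholders:

```shell
# Sketch: grant "anyone with the link" read access via the Drive v3 API.
# FOLDER_ID and TOKEN are placeholders.
FOLDER_ID="your-folder-id"
PERM_URL="https://www.googleapis.com/drive/v3/files/${FOLDER_ID}/permissions"
echo "${PERM_URL}"
# curl -X POST "${PERM_URL}" \
#   -H "Authorization: Bearer ${TOKEN}" -H "Content-Type: application/json" \
#   -d '{"role": "reader", "type": "anyone"}'
```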
I have a method that gets a signed url for a blob in a google bucket and then returns that to users. Ideally, I could change the name of the file shown as well. Is this possible?
An example is:
https://storage.googleapis.com/<bucket>/<path to file.mp4>?Expires=1580050133&GoogleAccessId=<access-id>&Signature=<signature>
The part that I'd like to set myself is <path to file.mp4>.
The only way I can think of is having something in the middle that will be responsible for the name "swap".
For example, Google App Engine with an HTTP trigger, or a Cloud Function with a storage trigger, that whenever you need it will fetch the object, rename it, and either serve it to the user directly or store it with the new name in another bucket.
Keep in mind that things you want to store temporarily in GAE or Cloud Functions need to be stored in "/tmp" directory.
Then for renaming, if you are using GAE, you can shell out to a command (note that os.system takes the command as a single string, not a list):
import os
os.system(YOUR_SHELL_COMMAND)  # e.g. a "gsutil cp ..." command string
However, the easiest but most costly approach is to set up a Cloud Function with a storage trigger that, whenever an object is uploaded, stores a copy of it under the desired new name in a different bucket that you will use for the users.
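The copy-with-a-new-name step, whether run from the function or by hand, boils down to one command. Bucket and object names below are placeholders:

```shell
# Copy an uploaded object into the public-facing bucket under a friendlier
# name; all names are placeholders. Run where gsutil is available (e.g. Cloud Shell).
SRC="gs://uploads-bucket/original-name.mp4"
DST="gs://public-bucket/friendly-name.mp4"
CMD="gsutil cp ${SRC} ${DST}"
echo "${CMD}"
# ${CMD}   # execute it for real once the buckets exist
```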
I am using CloverETL Designer for ETL operations and I want to load some CSV files from GCS into my Clover graph. I used a FlatFileReader and tried to get the file using a remote file URL, but it is not working. Can someone please detail the entire process here?
The path for file in GCS is
https://storage.cloud.google.com/PATH/Write_to_a_file.csv
And I need to get this csv file into the FlatFileReader in CloverETL Designer
You should use the Google Cloud Storage API to GET the file; Clover's HTTPConnector component will allow you to pass in the appropriate parameters to make a GET request (you will presumably have to do an OAuth2 authentication first to get a token), and send the output to a local destination specified in "Output File URL." Then you can use a FlatFileReader to read from that local file.
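For reference, the GET request the HTTPConnector needs to issue looks like this (PATH stands in for the bucket name from the question's URL, and TOKEN for the OAuth2 token; both are placeholders):

```shell
# Sketch: media download of a GCS object via the JSON API.
# BUCKET and TOKEN are placeholders; URL-encode any slashes in the object name as %2F.
BUCKET="PATH"
OBJECT="Write_to_a_file.csv"
GET_URL="https://storage.googleapis.com/storage/v1/b/${BUCKET}/o/${OBJECT}?alt=media"
echo "${GET_URL}"
# curl -H "Authorization: Bearer ${TOKEN}" -o "${OBJECT}" "${GET_URL}"
```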
GCS has several different ways to download files from your buckets. You can use the console and the Cloud Storage browser. Steps: open the storage browser, navigate to the object you want to download, right click, and save to your chosen local folder. If you use Chrome the save appears as “Save Link As…”.
To use the GS Utility, use this command:
`gsutil cp gs://[BucketName]/[ObjectName] [ObjectDestination]`.
Or you can use client libraries or the REST APIs to download files. With these last options you could work with a number of files or create a job to download them. Once they are in a location known to Clover ETL the process is straightforward.
Within Clover designer, under the navigation pane you can right click a folder and choose import. Pick the one in which you placed your GCS file. Once the file is imported then you can use data from it like any other datafile in Clover. Since this is a .csv file, remember to edit your metadata (right click the component, choose extract metadata then edit inside the Metadata Editor -- for data types, labels and such.) Assign metadata to the edges of your components so they know what is coming in/going out of that step. Depending on your file, this process may be repeated many times.
Even with an ETL tool, getting the data and data types correct can be tricky. If you have questions about how to configure data types or your edges in an ETL project, a wiki may help. The web has additional resources that may help you get the end analysis you're looking for.
Is it possible to enable Directory listing in Google Cloud Storage?
I was thinking of having a "domain bucket" and using it to list all the contents, similar to Nginx's autoindex on or Apache's Options +Indexes.
If I make the bucket public, all contents will be listed as XML, but not as a directory listing.
No, it is not currently possible.
You could perhaps implement such a thing by creating "index" pages in each directory that used JavaScript to query the bucket and render a list of objects, but there's no built-in support for this.
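Such index pages would query the JSON API's object-list endpoint, which works without authentication once the bucket is publicly listable (bucketname is a placeholder):

```shell
# Object-list endpoint an index page could query; bucketname is a placeholder.
BUCKET="bucketname"
LIST_URL="https://storage.googleapis.com/storage/v1/b/${BUCKET}/o"
echo "${LIST_URL}"
# curl "${LIST_URL}"   # returns JSON; objects appear under the "items" key
```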
You might want to take a look at s3-bucket-listing. It's for Amazon S3 but works with GCS as well.