Who is charged when a user downloads data from an external bucket? - google-cloud-storage

Who is charged when a user downloads data from an external bucket? The owner of the bucket, the account that's downloading the data, or both?
Edit: By external bucket I mean a bucket that doesn't belong to the account used to download the data. For instance, company A wants to share data from its own bucket X, and company B wants to download the data from X.

I'm not sure what you mean by "external bucket", but in any case the bucket owner is charged for bandwidth resulting from object downloads (listed as "Egress" on the GCS pricing page).
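For illustration, here is a minimal sketch of the scenario with the Python google-cloud-storage client; the project, bucket, and object names are made up. Company B authenticates as itself but downloads from company A's bucket, and the egress for the download is billed to the project that owns the bucket:

```python
# Minimal sketch: company B downloads an object from company A's bucket.
# Project, bucket, and object names below are hypothetical.
from google.cloud import storage

# The client authenticates as company B, but the bucket belongs to company A.
client = storage.Client(project="company-b-project")

bucket = client.bucket("company-a-bucket-x")   # bucket owned by company A
blob = bucket.blob("shared/dataset.csv")

# The network egress generated by this download is billed to the project
# that owns "company-a-bucket-x", i.e. company A.
blob.download_to_filename("dataset.csv")
```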

Related

GitHub Metadata

I deleted my GitHub account for personal reasons; however, beforehand I downloaded the metadata for the account. At the moment I cannot find a way to use that metadata to create a new account. Is this even possible? And if so, how?

download audit log for email activities

Is there a way to download the G Suite email audit log (not manually from the Google Admin console) for a big company?
If I'm not wrong, the API (https://developers.google.com/admin-sdk/email-audit) is limited to 1,000 monitors per day?
Maybe it is possible to have a mirror of the logs in BigQuery, or in any other place readable with no limits?
Thanks, Paolo.

Serve files with Google Cloud Storage, Google Compute Engine, and an external website

I have a question regarding an upload, edit, and serve setup.
My Shopify website lets users upload images to a Google Cloud Storage bucket with JavaScript. When a file is uploaded to the bucket, it's sent to a Compute Engine instance which edits the file, and the edited file is then uploaded to another bucket. All of this works.
But now I want to serve the file back to the user on my Shopify website, and I can't figure out a way to do this. Is it even possible with my current setup? My problem is how to identify the user session that uploaded the file, so that I can serve the file back to that person.
I hope someone has knowledge about this and is willing to help. Thanks!
Every person that logs into a Shopify store gets a customer ID. You can use this for your uploads. Ensure images get manipulated with that ID in mind. Now, use an App Proxy that sends the same customer ID to your App. Your App can then use this ID to find the image previously uploaded, and you can return it to the shop. A very common pattern of Shopify use.
As for getting the customer ID, one way is to dump it using Liquid, since you have {{ customer.id }}, or you can sniff it out of the cookies Shopify stores for a user session. It exists, but you'll have to dig for it; I forget its exact name.
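A rough sketch of what the app side of that proxy could look like, assuming a Flask app and the google-cloud-storage client. The endpoint path, bucket name, and object naming scheme are invented for illustration, and the query parameter name carrying the logged-in customer's ID is an assumption about what the App Proxy forwards:

```python
# Hypothetical App Proxy endpoint: find the processed image for the
# logged-in customer and redirect them to a short-lived signed URL.
from datetime import timedelta

from flask import Flask, abort, redirect, request
from google.cloud import storage

app = Flask(__name__)
gcs = storage.Client()

@app.route("/proxy/image")
def serve_processed_image():
    # Assumed: Shopify's App Proxy forwards the logged-in customer's ID
    # as a query parameter on the proxied request.
    customer_id = request.args.get("logged_in_customer_id")
    if not customer_id:
        abort(403)

    # Assumed naming scheme: the Compute Engine worker stores the edited
    # file under a key derived from the customer ID.
    blob = gcs.bucket("edited-images-bucket").blob(f"{customer_id}/result.png")
    if not blob.exists():
        abort(404)

    # Return a signed URL valid for a few minutes instead of making the
    # bucket public.
    url = blob.generate_signed_url(version="v4", expiration=timedelta(minutes=10))
    return redirect(url)
```

The key design point is the one from the answer above: the customer ID is the join key between the upload, the edited object in the second bucket, and the proxied request that serves it back.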

Tableau Server Extracts issue

A developer created a dashboard and successfully published it to Tableau Server with the extract set to refresh automatically every day. The developer had data and DB access at that time.
Now that the same developer's DB access has been removed, is it possible to get fresh data into the reports every day? Will the extract have new data every day?
Not unless you update the database credentials published with the data source to be something that the database accepts.
You typically want some sort of service account for this purpose, so that the published dashboards still work when the original publisher leaves.
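If you go the service-account route, one way to republish the data source with embedded credentials is the tableauserverclient Python library; the server URL, site, project ID, file name, and account names below are placeholders:

```python
# Sketch: republish a data source with service-account credentials embedded,
# so the scheduled extract refresh no longer depends on the developer's access.
import tableauserverclient as TSC

tableau_auth = TSC.TableauAuth("admin_user", "admin_password", site_id="my_site")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(tableau_auth):
    # Service account that the database actually accepts (placeholder names).
    creds = TSC.ConnectionCredentials("svc_tableau", "svc_password", embed=True)

    datasource = TSC.DatasourceItem(project_id="placeholder-project-id")
    server.datasources.publish(
        datasource,
        "sales_extract.tdsx",                  # local data source file
        mode=TSC.Server.PublishMode.Overwrite,
        connection_credentials=creds,
    )
```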

Does Watson Analytics have a limit on the number of records it can import from dashDB?

I have an account in IBM Watson Analytics (https://watson.analytics.ibmcloud.com).
I also have an account in IBM Public Bluemix. I provisioned a dashDB instance and inserted 10,000 records into a dashDB table. In Bluemix dashDB, I verified (via SQLs etc.) that the 10,000 records do exist in the dashDB table.
I configured a connection in Watson Analytics to connect to my Bluemix dashDB instance. But when I try to view the data, Watson Analytics shows / allows for upload (when I click 'Shape before upload') only 1,000 records from the dashDB table.
Is this a limit enforced on Watson Analytics accounts?
In Watson Analytics, on my account settings/info page, the subscription says "BACC-IBM WA Professional" and the total space says I have used only 97 MB out of an available 100 GB. It also says that the maximum upload size is 10 GB and that the maximum number of columns allowed in a data asset is 500. The data I have uploaded into dashDB was 1 GB in total for all 10,000 records.
So I do not understand why Watson Analytics is not able to show / allow for upload more than 1,000 records from dashDB. Please help.
Your account type will allow you to upload up to 10 million rows of data in a single data set. The "shape before upload" view is just a preview of the data. It does not allow you to page through the entire data set. But it will upload your entire table. You can confirm this by doing the following:
Complete the upload of your table from dashDB.
Click on the new data tile. This opens a new discovery set.
At the top where it says "Ask a question about your data", just type "count rows".
Select the first visualization it suggests.
This will show the number of rows in a simple text visualization.
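If you also want a number on the database side to compare against the "count rows" result, a quick check with the ibm_db Python driver works; the connection string, schema, and table name here are placeholders:

```python
# Sketch: confirm the row count directly against dashDB (placeholders throughout).
import ibm_db

conn = ibm_db.connect(
    "DATABASE=BLUDB;HOSTNAME=dashdb-host.bluemix.net;PORT=50000;"
    "PROTOCOL=TCPIP;UID=dash_user;PWD=dash_password;",
    "", ""
)

stmt = ibm_db.exec_immediate(conn, "SELECT COUNT(*) FROM MYSCHEMA.MYTABLE")
row = ibm_db.fetch_tuple(stmt)
print("rows in dashDB table:", row[0])   # should match the count Watson Analytics reports

ibm_db.close(conn)
```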