Pricing of Google Cloud Storage [closed]

Closed 1 year ago. This question does not appear to be about programming within the scope defined in the help center and is not accepting answers.
I want to understand the pricing of Google Cloud Storage (GCS). I have looked at the examples on the GCS homepage, but I have trouble understanding egress, ingress, geo-location differences, etc.
Can someone help me with a breakdown of the price of the following workflow in Google Cloud Storage? I live in Mexico.
Upload 1TB from my computer to a bucket
Store it for 1 month in us-east1 in standard storage
Download all 1TB to my computer

I did the calculation and it comes to $40.96, based on this documentation, where I used the Standard provisioned space example and did the following operation: $0.040 × 1024 = $40.96. If you want to know how much you would pay for other components, you can check it here.

Upload is free. Storage is prorated by the second (even though it's presented here per month; on the billing page it shows up at per-second granularity). Download is free unless the data is in the Nearline, Coldline, or Archive class, except for the egress bandwidth charged on the downloaded size.
You need to subtract the 5 GB/month of free storage and the 1 GB of free egress (for 1 TB you can forget about this; it's not really significant).
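For reference, here is a minimal Python sketch of that arithmetic. The per-GiB rates are assumptions for illustration, not authoritative figures; check the current pricing page (https://cloud.google.com/storage/pricing) before relying on them:

```python
# Back-of-the-envelope sketch of the workflow above. The per-GiB rates
# below are ASSUMPTIONS based on published list prices at one point in
# time; always verify against the current pricing page.

SIZE_GIB = 1024                  # 1 TiB expressed in GiB

STORAGE_PER_GIB_MONTH = 0.020    # assumed: Standard storage, us-east1
EGRESS_PER_GIB = 0.12            # assumed: internet egress to the Americas
FREE_STORAGE_GIB = 5             # always-free storage allowance
FREE_EGRESS_GIB = 1              # always-free egress allowance

upload_cost = 0.0                # ingress (upload) is free
storage_cost = max(SIZE_GIB - FREE_STORAGE_GIB, 0) * STORAGE_PER_GIB_MONTH
download_cost = max(SIZE_GIB - FREE_EGRESS_GIB, 0) * EGRESS_PER_GIB

print(f"upload:   ${upload_cost:7.2f}")
print(f"storage:  ${storage_cost:7.2f}")   # about  $20.38 for one month
print(f"download: ${download_cost:7.2f}")  # about $122.76
print(f"total:    ${upload_cost + storage_cost + download_cost:7.2f}")
```

Under these assumed rates, egress dominates the bill: downloading the terabyte costs several times more than storing it for a month.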

Related

How can I see the creation timestamp of data being accessed in Google Cloud Storage by looking at the audit logs? [closed]

Closed 8 days ago. This question is not about programming or software development and is not accepting answers; if you believe it would be on-topic on another Stack Exchange site, you can leave a comment explaining where it may be answered.
I am analysing some GCS audit logs, and I want to categorize the data in the buckets being accessed based on how old the data is. Let's say there is a bucket mybucket with file1 uploaded in 2020, file2 in 2021, and file3 in 2022. While analysing the audit logs, I want to be able to group the access patterns by the year in which the data was created. My question is: do we have an option of getting the "created on" metadata for the data being accessed in the audit logs? Or if there is a better way of achieving this, please share. Thanks!
Since audit logs do not show creation timestamps of the resources accessed, you can try using a Cloud Asset Inventory export (see the Cloud Asset Inventory docs) to create a BigQuery table of bucket name and creation time, and join that onto the IAM policy audit log table (see the reference docs).
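As a rough illustration of that join, here is a hypothetical sketch using the BigQuery Python client. Every project, dataset, table, and column name below is a placeholder; adjust them to match your own audit-log and asset-inventory export schemas:

```python
# Hypothetical sketch: group audit-log accesses by the year the accessed
# bucket was created, by joining an audit-log export onto a Cloud Asset
# Inventory export. All table and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT
  EXTRACT(YEAR FROM
    TIMESTAMP(JSON_VALUE(asset.resource.data, '$.timeCreated'))) AS created_year,
  COUNT(*) AS access_count
FROM `my-project.logs.cloudaudit_googleapis_com_data_access` AS log
JOIN `my-project.assets.storage_buckets` AS asset
  ON log.resource.labels.bucket_name = asset.name
GROUP BY created_year
ORDER BY created_year
"""

for row in client.query(query).result():
    print(row.created_year, row.access_count)
```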

DashDB Entry Plan sunset and databases lost [closed]

Closed 3 years ago. This question does not appear to be about programming within the scope defined in the help center and is not accepting answers.
I recently noticed that my dashDB Entry plan under a dedicated IBM Cloud environment has been sunset. I read an article that said as much, but I had not been informed beforehand, so I lost my two databases (production and testing).
Does anyone know what I should do in this case? I have a lot of sensitive data inside them, and I don't have any problem with changing the plan, but I don't know how to do it because I cannot get into the console (it doesn't work anymore). Is there any way to recover my databases? Thanks.
Please open a support ticket here: https://watson.service-now.com/wcp
The support team can temporarily re-enable your access so that you can download a copy of your data.

What resources are allocated to a medium sized business's cloud implementation? [closed]

Closed 3 years ago. This question does not appear to be about programming within the scope defined in the help center and is not accepting answers.
Jump to the accepted answer to see why the question doesn't actually have a distinct answer. And (LFMM) remember not to google too specifically when you don't know what you're looking for.
When a business purchases 200 licenses and is using the enterprise Salesforce CRM, what resources are allocated to that instance, and what operations within Salesforce are handled outside those dedicated resources?
Edit:
This can be helpful to know when interacting with an instance in numerous circumstances, not least of which is their APIs.
This is actually a pretty difficult question to answer. After a little googling, the only thing I am sure of is that record limits are the only thing tied to the number of user licenses an organization buys (page 10).
All orgs are still bound by the specified limits.
The key here is that CPU time is used as a limit. Does an organization with 1,000 licenses get a better CPU allocation than an org with 20 users? I can't find documentation to support that, but you would have to imagine some resources scale. Then again, if you look at memory usage, that limit is the same across the board regardless of the number of users.
You might want to search more on salesforce multi-tenancy. Here are some links to browse
https://developer.salesforce.com/page/Multi_Tenant_Architecture
https://www.youtube.com/watch?v=jrKA3cJmoms
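If you want to probe which quotas actually scale with an org's license count, one option is the REST API's limits resource. A minimal Python sketch follows; the instance URL, API version, and access token are placeholders, and note that this returns org-scoped quotas such as daily API requests, not the per-transaction governor limits (CPU time, heap), which are fixed documented values:

```python
# Minimal sketch: list org-scoped quotas via the Salesforce REST API's
# /limits resource. INSTANCE_URL and ACCESS_TOKEN are placeholders; obtain
# a real token via your OAuth flow. Per-transaction governor limits
# (CPU time, heap) are documented constants and not returned here.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"  # placeholder
ACCESS_TOKEN = "<access token from your OAuth flow>"     # placeholder

resp = requests.get(
    f"{INSTANCE_URL}/services/data/v58.0/limits/",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Each entry looks like {"Max": ..., "Remaining": ...}.
for name, quota in sorted(resp.json().items()):
    print(f"{name}: {quota.get('Remaining')}/{quota.get('Max')}")
```

Comparing the Max values across orgs of different sizes is one concrete way to see which resources actually grow with license count.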

Cloud Computing and MATLAB [closed]

Closed 8 years ago. This question is opinion-based and is not accepting answers; to improve it, edit the post so it can be answered with facts and citations.
I have written some MATLAB code that classifies sounds, based on an Artificial intelligence approach.
Now I would like to use the same program on the cloud.
Do I need to convert the code to some other programming language, or is it possible to use the same MATLAB code on the cloud?
If you would like to move your application to the cloud in order to speed it up by running it on a cloud computing resource, it's possible to parallelize your application using Parallel Computing Toolbox, and then to execute that on instances of MATLAB Distributed Computing Server that are running in the cloud, such as on Amazon EC2. MathWorks have resources on their website, including a white paper, on how to do this. Note that, unusually for MathWorks products, if you do this it's possible to pay for the instances of MATLAB Distributed Computing Server by the hour, rather than having to buy an expensive permanent license (speak to your account manager to find out about that payment option).
If you'd like to run your code on the cloud just for convenience, or to offload it from your computer, rather than to speed it up, then if you have a MATLAB license you can use MATLAB Mobile (for iPhone or Android) to run your code on MathWorks' own cloud resources, for free (including storage of up to 500MB of your data).
Of course, you may find that for various reasons you eventually think it's better to recode it all in a different language - but there are several options you can try out pretty quickly before committing to that lengthy task.
Yes. If Parallel Computing Toolbox is not enough, you can look at MATLAB Distributed Computing Server.

App Store guideline 2.6: "Apps that read or write data outside its designated container area will be rejected" [closed]

Closed 6 years ago. This question does not appear to be about programming within the scope defined in the help center and is not accepting answers.
I was wondering if the following guideline:
2.6 "Apps that read or write data outside its designated container area will be rejected"
means that you cannot read data retrieved from an NSURLConnection. In my app I download the page source from my website and parse it into usable data. That won't disobey the guideline, will it?
No, that's not what it means at all. You're fine. Guideline 2.6 is intended to refer to applications that try to access system files (i.e., on the phone) outside the app sandbox.
I have published a couple of apps that do exactly what you describe: they get data from a web server, parse it into some data, and cache it locally. So I can confirm from my experience that this is fully supported behavior as per the App Store guidelines.