Google Drive Hits - google-cloud-storage

I want to share some pdf files publicly.
While there are many sophisticated ways of doing this, I currently have a bit.ly shortlink pointing to Google Drive.
Traffic is highly variable, but the maximum number of hits in a single day has been 500.
File sizes are around 1-5 MB, and a visitor typically downloads a handful of them, so a single hit should average around 10-20 MB of download.
To what extent will this method work?
5,000 hits per day?
50,000?
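For a rough sense of scale, here is a back-of-the-envelope sketch (TypeScript; the 15 MB midpoint per hit is my assumption, taken from the figures above):

const MB_PER_HIT = 15; // assumed midpoint of the 10-20 MB range above
for (const hits of [500, 5_000, 50_000]) {
  console.log(`${hits} hits/day ≈ ${(hits * MB_PER_HIT) / 1_000} GB/day`);
}
// 500 → ~7.5 GB/day, 5,000 → ~75 GB/day, 50,000 → ~750 GB/day. At the upper
// end, Google Drive's informal per-file download quota ("download quota
// exceeded" errors) is likely to kick in.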

Related

Used space displayed for my Google Drive is 4 times the total size of the stored files

As the displayed notification says, I have used almost all of my available Google Drive space (96%), while the total size of the stored files is only 3.5 GB. An additional 1 GB was deleted and is sitting in the bin. What is the reason, and how can I fix it? I also have a lot of files shared with me from other accounts, but according to the Google Drive documentation they should not be taken into account. Additionally, I have 0.8 GB in Gmail and no files in Google Photos.
Go to this link and check which files are consuming the most storage (a small API sketch follows these steps)
Delete the files in your bin, as they still count towards your quota
After you delete the files, it usually takes some time for the reported usage on your Drive to update; it's a propagation matter
Make sure you don't have a lot of photos eating into your account's quota
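For the first step, here is a minimal sketch (TypeScript, Node 18+ for fetch; assumes you already have an OAuth token with a Drive scope) that lists the files consuming the most quota via the Drive v3 REST API:

async function largestFiles(token: string, topN = 20): Promise<void> {
  const url = "https://www.googleapis.com/drive/v3/files" +
    `?orderBy=${encodeURIComponent("quotaBytesUsed desc")}` +
    `&pageSize=${topN}&fields=${encodeURIComponent("files(name,quotaBytesUsed)")}`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
  const { files } = await res.json();
  for (const f of files ?? []) {
    // quotaBytesUsed is what a file actually charges against your quota.
    console.log(`${(Number(f.quotaBytesUsed) / 1e6).toFixed(1)} MB  ${f.name}`);
  }
}

The orderBy=quotaBytesUsed ordering and the quotaBytesUsed field are part of the documented Drive v3 API; the token handling and output formatting are illustrative.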

chrome.fileSystem.retainEntry increase 500 limit

According to the chrome.fileSystem documentation, there is a limit of 500 file entries that can be retained and restored.
The app I am developing is a document management system that links to local files. Over time I expect the user will have over 500 links.
Any ideas on how to increase the 500 limit, or an alternative strategy for keeping a long list of local file links?
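For reference, this is the retain/restore API in question, in a minimal sketch (TypeScript; chrome is only available inside a packaged Chrome App, and the lastFileId key is a made-up example). It does not raise the 500-entry cap; it just shows what counts against it:

declare const chrome: any; // Chrome App APIs, only present in a packaged app

// Retain a user-chosen file and persist its id so the link survives restarts.
chrome.fileSystem.chooseEntry({ type: "openFile" }, (entry: any) => {
  const id: string = chrome.fileSystem.retainEntry(entry); // counts toward the 500-entry cap
  chrome.storage.local.set({ lastFileId: id });
});

// Later: restore the entry from the persisted id.
chrome.storage.local.get("lastFileId", (items: any) => {
  chrome.fileSystem.restoreEntry(items.lastFileId, (entry: any) => {
    if (entry) console.log("restored:", entry.name);
  });
});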

GCS: Is there a request limit for accessing objects?

I just heard that there is a limitation on Google Cloud Storage, so that you can only access it with a request once per second. I searched the internet but didn't find an appropriate answer to this.
Is this right, or can I access it more than once per second? I just want to know for a web application I'm currently writing that can upload and download images to and from the Storage. If there were such a limitation, it would cause some delay when multiple requests per second are sent by different users.
You may be referring to the limitation that you can update or overwrite the same object up to once per second. There's no limit to the number of times you can update across different objects, or to the number of reads you can do to any object.
https://cloud.google.com/storage/docs/concepts-techniques#object-updates
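The usual advice for living with the once-per-second-per-object write limit is retry with exponential backoff. A hedged sketch (TypeScript, Node 18+, against the GCS JSON API; the bucket name, token handling, and retry count are illustrative):

async function uploadWithBackoff(bucket: string, name: string, body: string, token: string) {
  const url = "https://storage.googleapis.com/upload/storage/v1/b/" + bucket +
    "/o?uploadType=media&name=" + encodeURIComponent(name);
  for (let attempt = 0; attempt < 5; attempt++) {
    const res = await fetch(url, {
      method: "POST",
      headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/octet-stream" },
      body,
    });
    if (res.ok) return res.json();
    // Writing the same object more than ~once per second typically yields a 429.
    if (res.status !== 429 && res.status < 500) throw new Error(`upload failed: ${res.status}`);
    await new Promise((r) => setTimeout(r, 1000 * 2 ** attempt)); // back off: 1s, 2s, 4s, ...
  }
  throw new Error("upload failed after retries");
}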

What is the upload limit on SoundCloud?

I sometimes get the error { error_message: 'Sorry, you\'ve exceeded your upload limit.' } when I post sound files to SoundCloud using their HTTP API.
I couldn't find any explanation of this 'upload limit' in their documentation.
Does anyone know if it's a daily limit, a size limit, or a combination of both?
Thanks
Sparko is mostly right. The only difference is that you can tell how much remaining time you have by requesting the current user details (GET /me); the response will include a key called upload_seconds_remaining.
Free users get 2 hours, Pro gets 4 hours, and Pro Unlimited is unlimited. Regardless of the plan, individual tracks also cannot be longer than ~6.5 hours (I forget the exact number).
Individual files cannot exceed 500 MB (see SoundCloud's "Uploading Audio Files" help page).
However, I'd imagine this relates to your overall limit for uploading audio to SoundCloud based on the plan attached to the account you're posting to, i.e. exceeding the 2 hours provided by the free plan.
The API doesn't appear to provide a property for the remaining time available to the user, although you could infer it from the user's plan field and by looping through all of their tracks and summing each track's duration (although that's probably not advised).
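A minimal sketch of the GET /me check mentioned above (TypeScript, Node 18+; assumes you already have an OAuth access token):

async function remainingUploadSeconds(token: string): Promise<number> {
  const res = await fetch("https://api.soundcloud.com/me", {
    headers: { Authorization: `OAuth ${token}` },
  });
  if (!res.ok) throw new Error(`GET /me failed: ${res.status}`);
  const me = await res.json();
  return me.upload_seconds_remaining; // per the answer above, e.g. 7200 for a fresh free account
}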

Facebook Graph Latency

The following code fragment
for ($i = 0; $i < 60; $i++) {
    $u[$i] = $_REQUEST["u" . $i];
    // One blocking HTTP request per user, i.e. 60 sequential fetches.
    $pic[$i] = imagecreatefromjpeg("http://graph.facebook.com/" . $u[$i] . "/picture");
}
is taking more than 90 seconds to execute on my new server. It was taking less than 15 seconds on my shared hosting server; however, on the dedicated server it is taking more than 90 seconds.
The data center of my new server is Asia Pacific.
Please advise on how I can reduce this time of fetching images from the Graph.
Thanks and regards
Why not just request all the pictures' URLs in a single call?
https://graph.facebook.com/?fields=picture&ids=[CSV LIST OF IDS]&access_token=ACCESS_TOKEN
You'll then have a list of all the images and can fetch them however you wish.
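A hedged sketch of that single-call approach (TypeScript, Node 18+; the access token is assumed, and the exact shape of picture in the response varies by Graph API version):

async function fetchPictures(ids: string[], accessToken: string) {
  const url = "https://graph.facebook.com/" +
    `?fields=picture&ids=${ids.join(",")}&access_token=${accessToken}`;
  const data = await (await fetch(url)).json();
  // One HTTP round trip; the result is keyed by id, with each user's picture
  // given as a URL string or a { data: { url } } object, depending on version.
  return data;
}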
is taking more than 90 seconds to execute on my new server.
Well, for 60 HTTP requests that’s not too bad, I’d say.
It was taking less than 15 seconds on my shared hosting server; however, on the dedicated server it is taking more than 90 seconds.
Maybe the connection of your old server was just faster …?
The data center of my new server is Asia Pacific.
Do you know, by any chance, which data center it was before?
Please advise on how I can reduce this time of fetching images from the Graph.
Do you have to request all these images in one go?
Maybe your app's workflow (which we don't know anything about yet) would allow for other approaches, like fetching user images at an earlier time (e.g. when a user starts using your app) and caching them locally, so that you don't have to do 60+ HTTP requests in one go.
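A sketch of that caching idea (TypeScript, Node 18+; the ./cache directory and file naming are illustrative assumptions):

import { access, mkdir, writeFile } from "node:fs/promises";

// Download a user's picture at most once; afterwards, serve the local copy.
async function cachedPicturePath(userId: string): Promise<string> {
  const path = `./cache/${userId}.jpg`;
  try {
    await access(path); // already cached, so no HTTP request needed
  } catch {
    await mkdir("./cache", { recursive: true });
    const res = await fetch(`https://graph.facebook.com/${userId}/picture`);
    await writeFile(path, Buffer.from(await res.arrayBuffer()));
  }
  return path;
}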