Getting mixed results. Some content is compressed, others not - azure-cdn

Using the Standard Verizon CDN with Blob Storage as the origin. Accept-Encoding is always set, but only some of the JS content returned is compressed with gzip. I've gone through the CDN compression troubleshooting doc (https://learn.microsoft.com/en-us/azure/cdn/cdn-troubleshoot-compression) but it doesn't help. What's the next step in troubleshooting?

https://learn.microsoft.com/en-us/azure/architecture/best-practices/cdn
In Azure, the default mechanism is to compress content automatically when CPU utilization is below 50%.
This may or may not be your problem. There's a setting in the Azure portal to enable compression on the fly, which I think would be the solution.
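If it helps to narrow things down, below is a rough probe (TypeScript on Node, using only the built-in https module; the endpoint host and file paths are placeholders, not from the question). It requests each file with an Accept-Encoding header and logs the Content-Encoding, Content-Type, and Content-Length that come back; since the CDN only compresses certain MIME types and sizes, comparing a compressed response against an uncompressed one often shows why a particular file was skipped.

import { request } from "node:https";

// Placeholder endpoint and paths; substitute your own.
const host = "myendpoint.azureedge.net";
const paths = ["/scripts/app.js", "/scripts/vendor.js"];

for (const path of paths) {
  const req = request(
    { host, path, method: "GET", headers: { "Accept-Encoding": "gzip, deflate" } },
    (res) => {
      // The Content-Type and Content-Length of the uncompressed responses often
      // point at a MIME type or size the CDN is configured to skip.
      console.log(path, {
        status: res.statusCode,
        contentEncoding: res.headers["content-encoding"] ?? "(none)",
        contentType: res.headers["content-type"],
        contentLength: res.headers["content-length"],
      });
      res.resume(); // discard the body; only the headers matter here
    }
  );
  req.end();
}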

Related

Google storage operations extremely slow when using Customer Managed Encryption Key

We're planning on switching from Google-managed keys to our own keys (we work with sensitive medical data), but we're struggling with the performance degradation when we turn on CMEK. We move many big files around storage in our application (5-200GB files), both with the Java Storage API and gsutil. The former stops working even on 2GB files (it times out, and when the timeouts are raised it silently fails to copy the files), and the latter just takes about 100x longer.
Any insights into this behaviour?
When using CMEK, you are actually adding an additional layer of encryption on top of Google-managed encryption keys, not replacing them. As for gsutil, if your moving process involves the objects' hashes, gsutil will perform an additional operation per object, which might explain why moving the big files takes so much longer than usual.
As a workaround, you may instead use resumable uploads. This type of upload works best with large files since it supports uploading files in multiple chunks, which allows you to resume an operation even if the flow of data is interrupted.
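For what it's worth, here is a minimal sketch of a resumable, chunked upload. It uses the Node.js client purely for illustration (the question uses the Java client and gsutil, which expose equivalent options), and the bucket and file names are made up.

import { Storage } from "@google-cloud/storage";

const storage = new Storage();

async function uploadLargeFile(): Promise<void> {
  // If the bucket has a default CMEK configured, new objects are encrypted with it automatically.
  await storage.bucket("my-cmek-bucket").upload("/data/big-file.bin", {
    destination: "big-file.bin",
    resumable: true,              // resumable upload: can recover if the connection drops
    chunkSize: 16 * 1024 * 1024,  // upload in 16 MiB chunks (must be a multiple of 256 KiB)
  });
  console.log("upload finished");
}

uploadLargeFile().catch(console.error);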

Save image as base64 in MongoDB

I'm looking for the best way to upload an image from a mobile phone to my server. I'm currently using HTML5 to open the camera and take the picture, then I convert the file into a base64 string, send it to the server, and save it in MongoDB.
I'm expecting around 1,000 to 1,500 upload requests per day, so I have the following questions:
Is this a good way to do it?
Should I compress the base64? If yes, how?
Should I use a specific server to handle this task?
My backend is node express and the front end is ReactJS.
Thanks
It all depends on your situation. Reading and writing images from a CDN via, e.g., streams is usually faster than reading and writing binary representations of images (e.g. base64) from a database. However, your read speed from a CDN will obviously be affected by which service you use. Today, companies like Amazon offer storage at a very low price, so unless you are building a hobby app for something like a student project, you can usually afford it. A binary representation of an image actually ends up a little bigger than the image itself.

You don't compress the base64; you compress the image before converting it.

However, if you can't afford a storage account and you know your users won't upload that many images, it is usually enough to store binary representations of the images in a database. MongoDB Atlas, for example, offers 512 MB for free on their database clusters.

Splitting tasks such as database requests and CDN services away from your main application is usually a good choice if possible. This way you divide the CPU, memory, etc. across your hardware, and it leads to faster reads and writes for the user.
There are a lot of different modules for doing this in Node. JIMP is a pretty nice one with loads of built-in functions, like resizing images and converting them to binary, either as a Buffer or base64.
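A rough example of that idea, assuming the classic JIMP API (the file path, target width, and quality below are just placeholders): resize and re-encode the photo first, then take the base64 of the result, so whatever you store or send stays small.

import Jimp from "jimp";

async function prepareImage(inputPath: string): Promise<string> {
  const image = await Jimp.read(inputPath);
  image
    .resize(1024, Jimp.AUTO) // cap the width at 1024px, keep the aspect ratio
    .quality(70);            // re-encode the JPEG at a lower quality
  return image.getBase64Async(Jimp.MIME_JPEG);
}

// Example usage: the resulting string can be sent to the Express backend or,
// better, the decoded Buffer can be pushed to an object store / CDN instead.
prepareImage("./photo.jpg").then((b64) => console.log(b64.length));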

Filepicker.io - Picasa size limitation

We're noticing some issues with Filepicker.io and the Picasa integration. It seems that large images (over ~2MB) aren't being processed - the POST to www.filepicker.io/api/store/ returns "The specified bucket is not valid." for those files. Smaller files process just fine.
Not sure where the issue lies. The response would suggest an issue with our S3 bucket, perhaps, but we've been able to process large Computer uploads without issue. Could it be a limitation in the Picasa Web API? Any information would be helpful. Thanks.

S3 Upload with Amazon IOS Sdk

I know it's a known issue, but has anyone found a way to "fix" the connection failure on iPhone over 3G for relatively large files?
My application depends heavily on S3 for uploads and keeps failing on files larger than 200KB.
Depends on what's causing the failure.
An easy, albeit imperfect solution is to increase the timeout on your AmazonS3Client:
s3 = [[AmazonS3Client alloc] initWithAccessKey:S3_ACCESS_KEY_ID withSecretKey:S3_SECRET_KEY];
// Request timeout in seconds; raise it so large uploads over 3G have more time to complete.
s3.timeout = 240;
I figured this out some time ago but forgot to update the reply. What was actually happening was that I was using an HTTP connection, and it seems that when you upload media files, some mobile operators run online "converters" (not sure what else to call them) that take, for instance, your JPEG and "optimize" it for mobile devices (this also applies to other media types). Since that modifies the file, it no longer matches the hash S3 expects from the request headers. The way I worked around the problem was to use an HTTPS connection, which prevents those intermediary servers from modifying my upload.

how to decrease the application size

My problem is that my application size is very large.
Is there any way to reduce the size of the application?
If I build the application without the content and upload the content to my server, how do I sync the application with the content on my server?
What I want is this: once the user downloads the application and starts using it, we stream the content and save it to their documents folder.
Once the content has been streamed, it should never need to be streamed again.
Is that possible?
Thanks,
How much you can reduce the size of your application depends on the TYPE of content in it. I highly doubt that the application code is the cause, and since you did not mention what the contents are, I am assuming they are some kind of resource.
If your resources are images, try using image compression programs, converting them to smaller images, or otherwise optimizing them.
If your resources are documents, text files, or other files with a high compression ratio when zipped, you can try zipping them and accessing them inside the compressed file (this will mean additional coding, and probably slower performance).
These are just examples.
It is not advisable to stream large content because it uses network bandwidth which, depending on the user's data plan, can cause a big spike in their phone bill.
Yes, it is possible to download your content and save it to the application's documents folder when the user runs your application for the first time, though it may hurt the user's first impression since the download will take time.