I have a series of uncompressed (binary/octet-stream) files on Google Cloud Storage. I'm trying to download them compressed with gzip. According to this page:
https://developers.google.com/storage/docs/json_api/v1/how-tos/performance
I can add
Accept-Encoding: gzip
User-Agent: my program (gzip)
and download the files compressed. This does not work for me; the files always come back uncompressed. Am I missing something? Has anyone else experienced the same issue?
You can add that header to indicate that you're willing to receive gzipped content, but the HTTP spec makes no guarantee that you will. In the case of Google Cloud Storage, unless the object was uploaded with gzip content-encoding, the response will not have gzipped content (i.e. GCS does not dynamically compress objects).
(The linked docs page could probably be clearer about this; I'll suggest clarifying it.)
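If you control the uploads, one way to get compressed transfers is to gzip the files yourself and record the gzip content-encoding as object metadata at upload time. A minimal sketch using the gsutil CLI (bucket and file names are placeholders):
gzip -9 myfile.bin
# upload the compressed bytes, recording that they are gzip-encoded
gsutil -h "Content-Encoding:gzip" cp myfile.bin.gz gs://my-bucket/myfile.bin
With that metadata in place, requests that include Accept-Encoding: gzip should receive the compressed bytes, while other clients get the content transparently decompressed by GCS.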
I want to encode locally and upload, to avoid spending money on encoding.
Is this allowed? I did not find any documentation on it.
When I simply uploaded a file to the storage, the Media Services account said it could not play it without the .ism file. I had to encode the file I had uploaded (re-encode, really, since it was already an MP4), which is exactly what I want to avoid.
Yes, this is absolutely allowed and encouraged for customers, especially ones that have custom encoding requirements that we may not support.
You can upload an .ism file that you create along with your encoded files. It's a simple SMIL 2.0 format XML file that points to the source files used.
It's a bit hard to find when searching the docs, but there is a section outlining the workflow here: https://learn.microsoft.com/en-us/azure/media-services/latest/encode-dynamic-packaging-concept#on-demand-streaming-workflow
There is also a .NET sample showing how to do it here:
https://github.com/Azure-Samples/media-services-v3-dotnet/tree/main/Streaming/StreamExistingMp4
You can see the code at line 111 that shows how to generate the .ism file:
// Generate the Server manifest for streaming .ism file.
// This file is a simple SMIL 2.0 file format schema that includes references to the uploaded MP4 files in the XML.
var manifestsList = await AssetUtils.CreateServerManifestsAsync(client, config.ResourceGroup, config.AccountName, inputAsset, locator);
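For a sense of what gets generated, here is a rough hand-written sketch of an .ism server manifest (the file name and bitrate values are made up; check the output of the sample above for the exact schema):
<?xml version="1.0" encoding="utf-8"?>
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
  <head>
    <meta name="formats" content="mp4" />
  </head>
  <body>
    <switch>
      <!-- one entry per encoded file; systemBitrate is in bits per second -->
      <video src="video_1080p.mp4" systemBitrate="4500000" />
      <audio src="video_1080p.mp4" systemBitrate="128000" />
    </switch>
  </body>
</smil>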
We use GitHub's issue tracking for a lot of project management, both code-related and not. For simple files like images, we can share them via GitHub's CDN by dragging and dropping into an issue or comment. However, this has limitations:
GitHub imposes a file-type restriction: they will only allow GIF, JPEG, JPG, PNG, DOCX, GZ, LOG, PDF, PPTX, TXT, XLSX, or ZIP.
Files larger than 25 MB or images larger than 10 MB are not supported.
While URLs are anonymized with Camo (https://docs.github.com/en/free-pro-team@latest/github/authenticating-to-github/about-anonymized-image-urls), files are not actually securely stored or password protected. This is really problematic when the files shared have a lot of sensitive data in them.
Is there a plugin or simple solution that would let us securely attach large or non-standard file types while maintaining the nice UI of GitHub issues? We'd be OK using a third-party storage system (like Drive/Dropbox/SharePoint/AWS), but forcing users to upload a file somewhere and then copy/paste a link into the issue isn't ideal.
There's no way for you to embed other file types in an issue without using a standard Markdown link that gets rendered through Camo. That's because GitHub has a strict Content-Security-Policy that prevents files from other domains from even loading. That's intentional, since it prevents people from trying to embed tracking content or content that changes based on the user (e.g., ads).
Even if you could use some way to embed the files in the page, your browser probably wouldn't render them due to the Content-Security-Policy.
I want to compress a file (an image (PNG/JPEG) or a PDF) before uploading it to the server in SAPUI5. I want to upload large files, but compress each file before the upload. Can you please suggest a solution?
I think you will not be able to do it with instant upload. Instead, you should obtain the file from the file uploader's change event (check it out; it has a files parameter).
You can then use that file object together with the zip.js library to create an in-memory zip and save it into e.g. a Blob. Afterwards you simply send the blob in a POST request (e.g. look at How can javascript upload a blob?), as in the sketch below. Maybe you should also provide some file-upload-specific headers (like Slug).
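A rough sketch of that flow with zip.js (this assumes the v2-style @zip.js/zip.js API, and the /upload endpoint is a placeholder; details vary between zip.js versions):
import { BlobReader, BlobWriter, ZipWriter } from "@zip.js/zip.js";

// "file" is the File object taken from the file uploader's change event
async function compressAndUpload(file) {
  // build the zip entirely in memory, backed by a Blob
  const writer = new ZipWriter(new BlobWriter("application/zip"));
  await writer.add(file.name, new BlobReader(file));
  const zipBlob = await writer.close();

  // send the blob to the server; "/upload" is a placeholder endpoint
  await fetch("/upload", {
    method: "POST",
    headers: { "Slug": file.name + ".zip" },
    body: zipBlob
  });
}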
I support a web application that displays reports from a database. Occasionally, a report will contain an attachment (typically an image or document, which is stored in the database as well).
We serve the attachment via a dynamic .htm resource which streams the attachment from the database and populates the Content-Type based on what type of attachment it is (we support PDFs, RTFs, and various image formats).
For RTFs we've come across a problem. It seems a lot of Windows users don't have an association for the 'application/rtf' content-type by default (they do have an association for the *.rtf file extension). As a result, clicking on the link to the attachment doesn't do anything in Internet Explorer 6.
Returning 'application/msword' as the content-type seems to make the RTF viewable when clicking on the link, but only for people who have MS Office installed (some of the users won't have this installed, and will use alternate RTF readers, like OpenOffice).
This application is accessed publicly, so we don't have control of the user's machine settings.
Has anybody here solved this before? And how? Thanks!
Use the application/octet-stream content-type to force a download. Once it's downloaded, it should be viewable in whatever is registered to handle .rtf files.
In addition to the Content-Type header, you also need to add the following:
Content-Disposition: attachment; filename=my-document.rtf
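Since the question doesn't say what the server-side platform is, here's what that looks like in PHP, purely as an illustration (the file path is a placeholder):
<?php
// force a download dialog instead of relying on a content-type association
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="my-document.rtf"');
readfile('/path/to/attachment.rtf'); // stream the attachment bytes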
WordPad (which is on pretty much every Windows machine) can view RTF files. Is there an 'application/wordpad' content-type?
Alternatively, given the rarity of RTF files, your best solution might be to use a server-side component to open the RTF file, convert it to some other format (like PDF or plain HTML), and serve that to the requesting client. I don't know what language/platform you're using on the server side, so I can't say what to use for this.
You might know that HTML-related file formats are compressed server-side using gzip compression (by mod_gzip on Apache servers) and decompressed by compatible browsers ("content encoding").
Does this only work for HTML/XML files? Let's say my PHP/Perl file generates some simple comma-delimited data and sends that to the browser; will it be encoded by default?
What about platforms like Silverlight or Flash? When they download such data, will it be compressed/decompressed by the browser/runtime automatically? Is there any way to test this?
Does this only work for HTML/XML files?
No: it is quite often used for CSS and JS files, for instance. As those are among the biggest things that websites are made of (except images), thanks to JS frameworks and full-JS applications, it represents a huge gain!
Actually, any text-based format can be compressed quite well (images, conversely, generally cannot, as they are already compressed); sometimes JSON data returned from Ajax requests is compressed too -- it's text data, after all ;-)
Let's say my PHP/Perl file generates some simple comma-delimited data and sends that to the browser; will it be encoded by default?
It's a matter of configuration: if you configured your server to compress that kind of content, it'll probably be compressed :-)
(If the browser says it accepts gzip-encoded data)
Here's a sample configuration for Apache 2 (using mod_deflate) that I use on my blog:
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css text/javascript application/javascript application/x-javascript application/xml
</IfModule>
Here, I want HTML/XML/CSS/JS to be compressed.
And here is the same thing, plus or minus a few configuration options, that I used once under Apache 1 (mod_gzip):
<IfModule mod_gzip.c>
mod_gzip_on Yes
mod_gzip_can_negotiate Yes
mod_gzip_minimum_file_size 256
mod_gzip_maximum_file_size 500000
mod_gzip_dechunk Yes
mod_gzip_item_include file \.css$
mod_gzip_item_include file \.html$
mod_gzip_item_include file \.txt$
mod_gzip_item_include file \.js$
mod_gzip_item_include mime text/html
mod_gzip_item_exclude mime ^image/
</IfModule>
Note that I don't want files that are too small (the gain wouldn't be significant) or too big (compressing them would eat too much CPU) to be compressed; and I want CSS/HTML/TXT/JS files to be compressed, but not images.
If you want your comma-separated data to be compressed the same way, you'll have to add either its content-type or its extension to your web server's configuration to activate gzip compression for it.
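For instance, assuming your script sends the data with a Content-Type of text/csv, adding that MIME type to the mod_deflate configuration above should be enough:
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/csv
</IfModule>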
Is there any way to test this?
For any content returned directly to the browser, the Firefox extensions Firebug and LiveHTTPHeaders are must-haves.
For content that doesn't go through the browser's standard communication path, it might be harder; in the end, you may have to use something like Wireshark to "sniff" what is really going through the pipes... Good luck with that!
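For a quick command-line check, curl can also request compressed content and dump the response headers (the URL is just a placeholder):
# GET the resource, discard the body, and print the response headers
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip" http://example.com/data.csv
If compression is active for that resource, you should see a "Content-Encoding: gzip" line among the headers.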
What about platforms like Silverlight or Flash? When they download such data, will it be compressed/decompressed by the browser/runtime automatically?
To answer your question about Silverlight and Flash: if they send an Accept-Encoding header indicating they support compressed content, Apache will use mod_deflate or mod_gzip. If they don't support compression, they won't send the header. It will "just work." – Nate
I think Apache's mod_deflate is more common than mod_gzip, because it's built into Apache 2 and does the same thing. Look at the mod_deflate documentation and you'll see that it's easy to specify which file types to compress, based on their MIME types. Generally it's worth compressing HTML, CSS, XML, and JavaScript. Images are already compressed, so they don't benefit from further compression.
The browser sends an "Accept-Encoding" header listing the types of compression it knows how to understand. The server looks at this, along with the User-Agent, and decides how to encode the result. Some browsers lie about what they can understand, so this is more complex than just searching for "deflate" in the header.
Technically, any HTTP 2xx response with content can be content-encoded using any of the valid content codings (gzip, deflate, compress, etc.), but in practice it's wasteful to apply compression to common image types because it actually makes them larger.
You can definitely compress the response from dynamic PHP pages. The simplest method is to add:
<?php ob_start("ob_gzhandler"); ?>
to the start of every PHP page. It's better to set it up through the PHP configuration, of course.
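For the configuration route, the zlib output handler can be enabled globally in php.ini (or per directory, where your setup allows overrides):
; php.ini -- gzip all PHP output when the client advertises support for it
zlib.output_compression = On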
There are many test pages, easily found with Google:
http://www.whatsmyip.org/http_compression/
http://www.gidnetwork.com/tools/gzip-test.php
http://nontroppo.org/tools/gziptest/
http://www.nibbleguru.com/tools/gzip-test.php