S3 access: 'public' not working for files from internet URLs - filepicker.io

I am trying to use Ink Filepicker's library (filepicker.io) to allow users to upload images to my S3 bucket. This works great for files from the file system, but files picked from internet URLs somehow don't have their permissions set correctly, so the returned path is not publicly readable.
store_options = {
    path: 'product/' + product_url + '/',
    access: 'public'
};
filepicker.pickAndStore({}, store_options, function(InkBlobs) {
    console.log(JSON.stringify(InkBlobs));
});
There aren't a lot of knobs to turn here -- anyone have some advice?
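One thing that might help narrow it down is storing a URL directly with the same options and checking whether the resulting blob is readable. A minimal sketch, assuming the v1 filepicker.io JS API (where storeUrl accepts the same store options as pickAndStore); the image URL below is just a placeholder:
// Hedged sketch: storeUrl is assumed to honor the same 'access' option.
// The image URL below is a placeholder.
filepicker.storeUrl(
    'http://example.com/some-image.png',
    { path: 'product/' + product_url + '/', access: 'public' },
    function(InkBlob) {
        // If this blob is also not publicly readable, the problem is with
        // URL-sourced files in general rather than with pickAndStore.
        console.log(JSON.stringify(InkBlob));
    }
);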

Related

Can't load an Adobe Captivate HTML5 presentation in a Google Cloud Storage bucket

I am trying to store an Adobe Captivate HTML5 file in a bucket and then load the file and play the presentation. I've watched other videos on this and many people have successfully used Google Cloud Storage for this purpose, but they generally use it on a public bucket. My bucket is private and only accessible to a couple of users if it matters.
The file I'm trying to load will only go as far as showing a loading icon. Originally it only showed a blank page, and I was advised to add a tag to the page pointing at the "Authenticated URL" of the file; now it shows a loading icon but still won't load correctly.
Has anyone had this issue with html5 files before? Or know if there is anything in particular you need to do differently to view a file like this on a private bucket with a couple users vs allUsers?
An Adobe Captivate HTML5 export is still just HTML. When you visit the object's URL in the GCS bucket, the browser effectively performs a GetObject request, downloads the file, and renders the HTML, and the presentation's supporting assets (scripts, CSS, media) have to be fetchable the same way. Try setting the objects to public for allUsers, clearing the browser cache, and then loading the public URL again.
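If you prefer to do this from code rather than the Cloud Console or gsutil, a minimal sketch with the official Node.js client (@google-cloud/storage) could look like the following; the bucket and object names are placeholders, and a Captivate export contains many asset files that would each need the same treatment:
// Hedged sketch using the @google-cloud/storage Node.js client.
// Bucket and object names are placeholders.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function makePresentationPublic() {
    // Grants allUsers read access to this single object.
    await storage
        .bucket('my-captivate-bucket')
        .file('presentation/index.html')
        .makePublic();
    console.log('Object is now publicly readable.');
}

makePresentationPublic().catch(console.error);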

Enable directory listing in Google Cloud Storage

Is it possible to enable Directory listing in Google Cloud Storage?
I was thinking of having a "domain bucket" and using it to list all the contents, similar to Nginx's autoindex on or Apache's Options +Indexes.
If I make the bucket public, all contents will be listed as XML, but not as a directory listing.
No, it is not currently possible.
You could perhaps implement such a thing by creating "index" pages in each directory that used JavaScript to query the bucket and render a list of objects, but there's no built-in support for this.
You might want to take a look at s3-bucket-listing. It's for Amazon S3 but works with GCS as well.
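As a rough illustration of the index-page idea above, the sketch below fetches the bucket's public XML listing (the same XML response mentioned in the question) and renders the object keys as links. The bucket name is a placeholder, and it assumes the index page is served from the bucket itself (same origin) or that CORS is configured:
// Hedged sketch: build a simple "directory listing" from a public bucket's
// XML listing. The bucket name is a placeholder.
const BUCKET_URL = 'https://storage.googleapis.com/my-public-bucket';

async function renderListing(targetElementId) {
    const response = await fetch(BUCKET_URL);
    const xml = new DOMParser().parseFromString(await response.text(), 'application/xml');
    const list = document.getElementById(targetElementId);
    // Each <Contents><Key> entry is one object in the bucket.
    for (const key of xml.querySelectorAll('Contents > Key')) {
        const item = document.createElement('li');
        const link = document.createElement('a');
        link.href = BUCKET_URL + '/' + key.textContent;
        link.textContent = key.textContent;
        item.appendChild(link);
        list.appendChild(item);
    }
}

renderListing('listing');  // expects e.g. <ul id="listing"></ul> on the index page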

Serving files using Zinc and Pharo

After working through the Zinc tutorial, I want to serve an entire repo filled with .js, .html and .css files. I created a firstpathsegment so that I upload all files to the same place, but then how do I create a place to serve and store the files, and how do I upload them?
If you read on in the tutorial, you'll find how to do that in the part 'Static file server'
"Serve the contents of /var/www under the /static-files prefix on port 1701."
(ZnServer startDefaultOn: 1701)
    delegate: (ZnStaticFileServerDelegate new
        directory: (FileDirectory on: '/var/www');
        prefixFromString: 'static-files';
        yourself).
If you compare that to the ZnMonticelloServerDelegate implementation, you'll see how to write files. If you want to be able to create subdirectories, take a look at FileSystem-Core-Public. For a public-facing site, you'll also need to do something about authentication and access control. We mostly put Apache or nginx in front (and let that take care of the static files).

How do I upload files to S3 instead of my hard drive using PyroCMS 2.1.5?

I can't seem to figure out how to post files (images, etc.) to Amazon S3.
I'd like for this to be default behavior for ALL media upload areas.
Jerel Unruh put together a video all about this:
http://www.youtube.com/watch?v=Te61OzHK400
Really you just enter your Amazon credentials into the settings area, then hook a folder up with the bucket. All images will be sent to S3 and cached locally, and work identically to local files.

Is there any way to have a text file on the server be readable only by the browser?

I have a few pages on my web server that extract data from text files that each contain a JSON string. The pages fetch these files with $.get.
Is there any way to allow only the server/webpages to access these files? I would prefer to not have people going to the file path and saving the JSON data to their computers.
If I set permissions to deny access to the default IUSR, then people visiting the site won't be able to load them.
Any tricks around this?
I put such files in a directory tree outside the one the web server can see. For example, HTML pages accessible by the browser go into /var/www/public_html/filename.php, but files that should not be seen go into /var/privateFiles/anotherfile.txt. The web server root is /var/www, so the web server cannot serve anotherfile.txt, but filename.php can include it using its full path name.
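The same layout carries over to other stacks. A minimal sketch (in Node.js rather than PHP, purely as an illustration, with placeholder paths and port) of a page that reads the private file by absolute path and embeds the data into the HTML it returns, so the raw file is never exposed at a URL:
// Hedged sketch: the JSON file lives outside the web root and is read
// server-side by absolute path; only the rendered page is sent to the browser.
// Paths and port are placeholders.
const http = require('http');
const fs = require('fs');

const PRIVATE_JSON = '/var/privateFiles/anotherfile.txt';  // not web-accessible

http.createServer(function (req, res) {
    const data = fs.readFileSync(PRIVATE_JSON, 'utf8');
    res.writeHead(200, { 'Content-Type': 'text/html' });
    // The data is embedded into the page instead of being fetched by the client.
    res.end('<script>var pageData = ' + data + ';</script><p>Page content here</p>');
}).listen(8080);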