After reading Steve Sanderson's post on SWFUpload:
http://blog.stevensanderson.com/2008/11/24/jquery-ajax-uploader-plugin-with-progress-bar/
I have implemented SWFUpload on a site I am working on. Some users are getting a variety of issues where the progress bar gets stuck, or they get error message 2038 with error code -220 (System IO error). This is not related to certificates, as in the test below both addresses can be accessed with http or https.
I haven't been able to reproduce most of these errors; however, when trying to upload large images over 2 MB, it works fine on the test site but not on the live one.
UPDATE: I had posted examples here, now removed as the links don't work.
Both sites are hosted on AppHarbor, with exactly the same code.
The limit for image uploads should be 10 MB, and I have successfully uploaded larger images than the one posted here.
What could be the cause of this?
Can I ask what language the rest of the site is written in?
My first thought is that if it's an IO error, it could be that the server is running out of space.
Run:
df -h
on the servers and see what we get. Remember that all file uploads are written to /tmp before being moved to where you want them, so if that fills up, uploads stop.
This turned out to be a configuration setting at the load balancer level. We have a dedicated load balancer with AppHarbor so we can offer full SSL support. It had not been set up to allow requests of 10 MB; they have changed it now.
Just don't forget the parameters in php.ini that set:
session cookies to On
and session.use_only_cookies to Off
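In php.ini terms, that would presumably be:

session.use_cookies = 1
session.use_only_cookies = 0

(use_only_cookies has to be off so PHP will also accept the session ID posted by the Flash uploader, since Flash requests don't reliably carry cookies.)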
and in the JS plugin, sessions are handled this way:
post_params: {
<?php echo "'" . ini_get('session.name') . "': '" . session_id() . "'"; ?>
}
Furthermore, don't forget to check the list of image extensions handled by your JS plugin.
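In SWFUpload, those restrictions live in the settings object; a minimal sketch (values are examples, other required settings omitted):

new SWFUpload({
    file_types: "*.jpg;*.jpeg;*.png;*.gif", // extensions the picker will accept
    file_types_description: "Images",
    file_size_limit: "10 MB"                // client-side size cap
});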
Related
I need to download an archived Google Group.
The following link is one of the messages of that group, for example:
https://groups.google.com/forum/#!topic/sci.aeronautics/ViFtpXfVm7M
The problem is, what I see in the browser does not appear in the downloaded web page.
With my very limited knowledge, it seems to me the reason is that this content is dynamically created by JavaScript. Or else, are these downloaded files, with the so-called 'mbox' extension, encrypted?
What I've tried so far
First tries
Simple download
wget https://groups.google.com/d/topic/sci.aeronautics/ViFtpXfVm7M
With mirror
wget --mirror https://groups.google.com/d/topic/sci.aeronautics/ViFtpXfVm7M
Assuming it's encrypted
With cookies.
wget --load-cookies=cookies.txt https://groups.google.com/d/topic/sci.aeronautics/ViFtpXfVm7M
Set up my Gmail in Thunderbird and tried opening the file there; it did not open correctly.
Assuming the content was JavaScript-generated
Downloaded using phantomJS
https://askubuntu.com/questions/411540/how-to-get-wget-to-download-exact-same-web-page-html-as-browser
Downloaded using phantomJS with a different script
https://gist.github.com/giocomai/247d54e097b5083e2451
Used scripts available from GitHub
https://github.com/henryk/gggd
https://github.com/icy/google-group-crawler
But none of them have worked so far.
Can anyone please shed some light on how to download this page, with its messages, as a readable HTML or text file?
Cheers
AyyoSalli
You could use https://groups.google.com/forum/feed/sci.aeronautics/msgs/atom.xml?num=100 to get some of the posts - but it only gets roughly half the posts in this case.
And it has all the messages from all topics together.
View it in Firefox or Classic Opera to see it directly in a more human-readable form.
But since you say you already got a file in standard mbox format, what exactly is wrong with it? Did you attempt to import it into a locally installed email or news client (like Thunderbird)?
I'm trying to create a web page using Perfect (perfect.org), where users will browse and upload files. Can anyone tell me how I can get the progress of a file upload?
perfect.org-fileUploads
Refer to the link above and follow the usual approach from HTML/JS/PHP, HTML/JS/JSP, or other stacks.
In other words:
you can receive the completion percentage from the server side and display it to the client, or show a loader while the file uploads (see the sketch below).
Thank you
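On the client side, the standard way to measure upload progress is the XMLHttpRequest upload events; a minimal sketch (element IDs and the endpoint are hypothetical):

var xhr = new XMLHttpRequest();
var form = new FormData();
form.append('file', document.getElementById('fileInput').files[0]); // hypothetical input

// fires repeatedly while the request body is being sent
xhr.upload.addEventListener('progress', function (e) {
    if (e.lengthComputable) {
        var percent = Math.round((e.loaded / e.total) * 100);
        document.getElementById('progress').textContent = percent + '%'; // hypothetical element
    }
});

xhr.open('POST', '/upload'); // hypothetical endpoint
xhr.send(form);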
Before an official solution for this feature request is released by PerfectlySoft Inc., you could try splitting the file into small pieces, uploading them one by one, and merging them back on the server. Since there is no industry standard to apply, other web servers either provide their own solutions or simply stay away from it.
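A rough sketch of that splitting idea using the standard Blob.slice API (the endpoint and field names are made up, and retry/error handling is omitted):

function uploadInChunks(file, chunkSize) {
    var offset = 0;

    function sendNext() {
        if (offset >= file.size) return; // all pieces sent
        var form = new FormData();
        form.append('chunk', file.slice(offset, offset + chunkSize));
        form.append('offset', offset);     // lets the server reassemble in order
        form.append('filename', file.name);

        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/upload-chunk'); // hypothetical endpoint
        xhr.onload = function () {
            offset += chunkSize;
            sendNext();                    // upload pieces sequentially
        };
        xhr.send(form);
    }

    sendNext();
}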
I am trying to use Blueimp jQuery File Upload and it works great with small files. If I try to upload anything greater than 50 MB, it fails and I get the error 'Empty file upload result'.
I have seen lots of responses to questions from people getting the same error, but they seem to get it even though the file uploads correctly, or the suggested code corrections don't apply to the code that now ships with the plugin.
The FAQ suggests that there is a server-side restriction on file size, but I have asked my host to increase it to 1 GB and they have confirmed they have done this. I do not have permission to overwrite the php.ini as suggested in the FAQ; I just get a server error.
Has anyone else had this problem and if so how was it resolved?
I am using PHP.
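For what it's worth, the plugin documents a chunked-upload option that keeps each request below whatever the server will accept; a sketch (the chunk size is an example):

$('#fileupload').fileupload({
    url: '/server/php/',   // wherever the plugin's PHP handler lives
    maxChunkSize: 10000000 // ~10 MB per request, under most server caps
});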
All of a sudden, one of our sites has developed an issue with TinyMCE, specifically, it seems, in relation to the advimage plugin.
When trying to browse the image folder via TinyMCE I get an alert with one of these errors:
In Chrome I get:
2can't process ajax,TypeError: Cannot read property 'responseText' of null
In Firefox I get:
2can't process ajax,Invalid XML structure
Nothing has changed on this site for a good few months.
We have upgraded to PHP 5.4 very recently, but I don't see why that would be related.
This could be an issue with overly large images in the plugin's upload directory.
On each request, the advimage plugin scans the upload directory (set in your config) and generates thumbnails of any images it finds in there, then sends a list of images off to the client in the form of JSON or XML. If an image is too large to process, (low server memory or something), then the process quits and doesn't return any JSON/XML, hence the seemingly unrelated error message.
Prune any images over 1 MB from the uploads directory. You may need to flush the cached thumbs as well. To stop your users/admins from uploading huge images, set an upload limit in the plugin's config.
I have a WebApp that I've been trying to make work offline. The WebApp is too big, even minified, to simply use the application cache (things download, but I eventually get a window.applicationCache error). I'm trying to use XMLHttpRequest to get the larger scripts and the main HTML, keep them in localStorage, and keep just a small loader script in the application cache. The problem I'm seeing is that the XMLHttpRequest returns a network error when the loader script is being served locally. While the cache is downloading, no error is returned and it works fine. When I turn off the application cache, the loader works fine, but of course then I need the network to get the loader.
I tried setRequestHeader("Cache-Control", "no-cache") but that didn't help.
Anybody have a clue?
What does the NETWORK: section in your manifest look like?
I found that if I wasn't allowing wildcard network traffic, it wouldn't load with XMLHttpRequest. So changing it to:
NETWORK:
*
did the trick for us.
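Put together, a minimal manifest along those lines might look like this (file names are placeholders):

CACHE MANIFEST
# v1 - change this comment to force clients to re-download

CACHE:
booter.html

NETWORK:
*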
I think I found a solution. It would probably work for others.
I split the loader into two separate HTML files: one that uses XMLHttpRequest to get all the required files and put them in localStorage (the loader), and another that simply reads the files from localStorage and writes them into the document (the booter) with appropriate wrappers (e.g. <script> tags). The booter has a manifest file to keep it in the application cache. The loader does not. The user first invokes the booter. If the booter finds files already in localStorage, it does its thing. Otherwise, it uses location.replace() to invoke the loader. The loader loads the files from the server using XMLHttpRequest, puts them in localStorage, and then re-invokes the booter using location.replace(). This seems not to cause a network error.
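A stripped-down sketch of that loader/booter split (file names and storage keys are placeholders):

// loader.html (no manifest): fetch each file and stash it
['app.js', 'main.html'].forEach(function (name) { // placeholder file list
    var xhr = new XMLHttpRequest();
    xhr.open('GET', name, false); // synchronous, for brevity only
    xhr.send();
    localStorage.setItem(name, xhr.responseText);
});
location.replace('booter.html'); // hand control back to the booter

// booter.html (kept in the application cache via its manifest):
if (localStorage.getItem('app.js')) {
    var script = document.createElement('script'); // the <script> wrapper step
    script.text = localStorage.getItem('app.js');
    document.head.appendChild(script);
} else {
    location.replace('loader.html'); // nothing stored yet, go load
}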
In order to run offline, the user must invoke the booter in the iPhone Safari browser (which invokes the loader, which re-invokes the booter), which boots the WebApp. In Safari, the user must then add the WebApp (the booter link) to their Home Screen (using the "+" button at the bottom). When offline, the user can get to the app from the Home Screen icon. It takes a few seconds to re-render, but it's fully functional after that. It's the same delay when online. Invoking the link from the iPhone Safari browser will not work offline, though it will work online.
The booter monitors the application cache's "updateready" event so that, when online, if the iPhone detects a change in the booter's manifest file and downloads a new booter, it will swap in the new cache (window.applicationCache.swapCache()) and invoke the loader using location.replace() again. I also added an alert() to let the user know something funky is going on. So changing the manifest file (I mean making some bytes different, not just tweaking the modify time) will cause clients to get new files when online.
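That updateready handling could look roughly like this (the loader URL is a placeholder):

window.applicationCache.addEventListener('updateready', function () {
    if (window.applicationCache.status === window.applicationCache.UPDATEREADY) {
        window.applicationCache.swapCache(); // use the freshly downloaded cache
        alert('Updating application...');    // let the user know something is happening
        location.replace('loader.html');     // re-fetch files into localStorage
    }
});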
Interestingly, I noticed that localStorage set up in Safari is not available to the same page served from invoking the Home Screen icon, even though the cookies transfer! So the first time the booter is invoked from the icon it will reload the files even though they were previously loaded in Safari. Also, I had to explicitly prevent the loader from being cached as it was not reloading from the server when the rest of the files were updated.
You are correct. Ultimately it was the NETWORK section in the manifest.
I thought the site the application was loaded from was included automatically and you didn't need to mess with it, but that's not true. You need to put the site in the NETWORK section.