I am trying to use Blueimp jQuery File Upload and it works great with small files. If I try to upload anything larger than 50MB, it fails and I get the error 'Empty file upload result'.
I have seen lots of responses to questions from people getting the same error, but they seem to get it even though the file uploads correctly, or the suggested code corrections don't seem to apply to the code that now ships with the plugin.
The FAQ suggests there is a server-side restriction on file size, but I have asked my host to increase it to 1GB and they have confirmed they have done this. I do not have permission to overwrite php.ini as suggested in the FAQ; I just get a server error.
Has anyone else had this problem and if so how was it resolved?
I am using PHP.
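For what it's worth, 'Empty file upload result' typically appears when post_max_size or upload_max_filesize is still below the file size, so it is worth confirming what limits are actually in effect. A minimal sketch; ini_get reads the live configuration:

<?php
// Print the effective upload-related limits as PHP actually sees them.
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . "\n";
echo 'post_max_size:       ' . ini_get('post_max_size') . "\n";
echo 'memory_limit:        ' . ini_get('memory_limit') . "\n";
echo 'max_execution_time:  ' . ini_get('max_execution_time') . "\n";

If the host only raised upload_max_filesize, a 50MB+ POST can still be rejected by post_max_size.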
I've tried uploading a simple PHP script that prints a message to the page in SuiteCRM, and surprisingly the file got uploaded without any validation.
Here's my proof:
When I check the upload folder, I find the file in .php format at this location: http://localhost/upload/7ED577F6-C8F1-3C9E-E518-587D98FF8A7B_evil.php
Is this a bug, or an issue that hasn't been patched? What can I do to resolve this?
There is currently no validation when a user uploads a file.
SuiteCRM simply checks file permissions, the maximum file size, and not much more.
If you want to see for yourself, here is the code.
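As a stopgap, the missing check amounts to an extension whitelist before the file is stored. A hypothetical sketch, not SuiteCRM's actual code; the uploadfile field name is an assumption:

<?php
// Hypothetical mitigation sketch, not SuiteCRM's actual upload code.
// Reject any upload whose extension is not on an explicit whitelist.
$allowed = array('jpg', 'jpeg', 'png', 'gif', 'pdf', 'txt');

$ext = strtolower(pathinfo($_FILES['uploadfile']['name'], PATHINFO_EXTENSION));
if (!in_array($ext, $allowed, true)) {
    http_response_code(400);
    exit('File type not allowed');
}

Storing uploads outside the web root, or renaming them to something non-executable, also prevents a stored .php file from ever being run.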
I need to download an archived Google Group.
The following link is one of the messages of that group, for example:
https://groups.google.com/forum/#!topic/sci.aeronautics/ViFtpXfVm7M
The problem is, what I see in the browser does not appear in the downloaded webpage.
With my very limited knowledge, it seems to me the reason is that this content is dynamically created by JavaScript. Or else, the downloaded files have the so-called 'mbox' extension, which is perhaps encrypted?
What I've tried so far
First tries
Simple download
wget https://groups.google.com/d/topic/sci.aeronautics/ViFtpXfVm7M
With mirror
wget --mirror https://groups.google.com/d/topic/sci.aeronautics/ViFtpXfVm7M
Assuming it's encrypted
With cookies
wget --load-cookies=cookies.txt https://groups.google.com/d/topic/sci.aeronautics/ViFtpXfVm7M
Set up my Gmail account in Thunderbird and tried opening the file; it did not open correctly
Assuming the content was JavaScript-generated
Downloaded using phantomJS
https://askubuntu.com/questions/411540/how-to-get-wget-to-download-exact-same-web-page-html-as-browser
Downloaded using phantomJS with a different script
https://gist.github.com/giocomai/247d54e097b5083e2451
Used scripts available from GitHub
https://github.com/henryk/gggd
https://github.com/icy/google-group-crawler
But none of these have worked so far.
Can anyone please shed some light on how to download this page and its messages as a readable HTML or txt file?
Cheers
AyyoSalli
You could use https://groups.google.com/forum/feed/sci.aeronautics/msgs/atom.xml?num=100 to get some of the posts - but it only gets roughly half the posts in this case.
And it has all the messages from all topics together.
View it in Firefox or Classic Opera to see it directly in a more human-readable form.
But since you say you already got a file in standard mbox format, what exactly is wrong with it? Did you attempt to import it into a locally installed email or news client, like Thunderbird?
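If you'd rather turn that feed into plain text than read it in a browser, here is a rough sketch, assuming the feed stays publicly accessible and uses the standard Atom fields:

<?php
// Fetch the group's Atom feed and dump each entry as plain text.
$url  = 'https://groups.google.com/forum/feed/sci.aeronautics/msgs/atom.xml?num=100';
$ns   = 'http://www.w3.org/2005/Atom';
$feed = simplexml_load_string(file_get_contents($url));

foreach ($feed->children($ns)->entry as $entry) {
    $e = $entry->children($ns);
    // The message body may sit in <summary> or <content>, HTML-encoded.
    $body = isset($e->summary) ? $e->summary : $e->content;
    echo $e->title, "\n", strip_tags((string) $body), "\n\n";
}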
All of a sudden, one of our sites has developed an issue with TinyMCE, specifically, it seems, in relation to the advimage plugin.
When trying to browse the image folder via TinyMCE, I get an alert with one of these errors:
In Chrome I get:
2can't process ajax,TypeError: Cannot read property 'responseText' of null
In Firefox I get:
2can't process ajax,Invalid XML structure
Nothing has changed on this site for a good few months.
We have upgraded to PHP 5.4 very recently, but I don't see why that would be related.
This could be an issue with overly large images in the plugin's upload directory.
On each request, the advimage plugin scans the upload directory (set in your config) and generates thumbnails of any images it finds there, then sends a list of images off to the client as JSON or XML. If an image is too large to process (low server memory or something), the process quits and doesn't return any JSON/XML, hence the seemingly unrelated error message.
Prune any images over 1MB from the uploads directory. You may need to flush the cached thumbs as well. To stop your users/admins from uploading huge images, set an upload limit in the plugin's config.
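A rough housekeeping sketch along those lines; the directory path is an assumption, so substitute the upload path from your own config:

<?php
// List (or delete) images over 1 MB in the advimage upload directory.
$dir = '/var/www/site/uploads/images'; // hypothetical path, use your configured one

foreach (new DirectoryIterator($dir) as $file) {
    if ($file->isFile() && $file->getSize() > 1024 * 1024) {
        printf("%s (%.1f MB)\n", $file->getPathname(), $file->getSize() / 1048576);
        // unlink($file->getPathname()); // uncomment to actually delete
    }
}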
Recently I've integrated Google Drive with my iOS application. Everything works fine except for .ppt files. Normally, if a file is a regular Drive file, I use downloadURL to download it. If the file belongs to Google Docs, I use one of the exportLinks (exactly as Alain described it here).
However, all .ppt files (with "mimeType": "application/vnd.google-apps.presentation") which come from Google Docs are corrupted after being downloaded (I use an export link with exportFormat=pptx). The same file downloaded via a web browser works fine.
I use the ASIHTTPRequest lib for downloading files (could that also be the reason for the corrupted .ppt?).
Any ideas why only ppt files cause problems?
I can already tell you that the lib you're using isn't the cause: I'm not using it, but I have the same problem. It seems the response code received isn't 200 (if ($httpRequest->getResponseHttpCode() == 200)), as it shows me the specific error message I've set to be returned in that case. Also, when I try to download a presentation as PDF or txt, it shows the same error.
It's not really an answer, but I'm also trying to understand why only presentations are causing problems.
EDIT: the code received is 302, if that helps.
EDIT 2: After experimenting, I noticed that the first parameter is the file id and the second is the export format:
https://docs.google.com/feeds/download/presentations/Export?docId=filedid&exportFormat=pptx
But in the 302 response, I get this Location:
https://docs.google.com/feeds/download/presentations/Export?exportFormat=pptx&id=fileid
Not only are the two parameters not in the same order, but the name is id and not docId.
When I take this URL, set it as the export link, and then try to copy the file, it works: I get a 200 response and the contents of the file.
I hope it helps.
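In other words, the workaround is to follow the 302 to its Location instead of treating the first response as the file. A hedged sketch in the same PHP terms as the snippet above; $exportLink and $accessToken are assumed to come from your own Drive client:

<?php
// Follow the 302 redirect on the export link and save the real bytes.
$ch = curl_init($exportLink);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // chase the 302 to the actual file
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Bearer ' . $accessToken));

$data = curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($code == 200) {
    file_put_contents('presentation.pptx', $data);
}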
After reading Steve Sanderson's post on SWFUpload:
http://blog.stevensanderson.com/2008/11/24/jquery-ajax-uploader-plugin-with-progress-bar/
I have implemented SWFUpload on a site I am working on. Some users are getting a variety of issues where the progress bar gets stuck, or they get error message 2038 with error code -220 (System IO error). This is not related to certificates, as in the test below both addresses can be accessed with http or https.
I haven't been able to reproduce most of these errors. However, when trying to upload large images over 2MB, it works fine on the test site but not on the live one.
UPDATE: I had posted examples here, now removed as the links don't work.
Both sites are hosted on AppHarbor with exactly the same code.
The limit for image uploads should be 10MB, and I have successfully uploaded images larger than the one posted here.
What could be the cause of this?
Can I ask what language the rest of the site is written in?
My first thought is that if it's an IO error, the server could be running out of space.
Run:
df -h
on the servers and see what you get. Remember that all file uploads are written to /tmp before being moved where you want them, so if that fills up, uploads stop.
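If shell access is limited, roughly the same check can be done from PHP; a small sketch (sys_get_temp_dir is the fallback when upload_tmp_dir is unset):

<?php
// Approximate 'df -h' for the directory where PHP stages uploads.
$tmp  = ini_get('upload_tmp_dir') ?: sys_get_temp_dir();
$free = disk_free_space($tmp) / (1024 * 1024);
printf("Upload tmp dir: %s, free space: %.1f MB\n", $tmp, $free);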
This turned out to be a configuration setting at the load balancer level. We have a dedicated load balancer with AppHarbor so we can offer full SSL support. It had not been set up to allow requests of 10MB; they have changed it now.
Just don't forget the parameters in php.ini that set:
session cookies (session.use_cookies) to On
and session.use_only_cookies to Off
And in the JS plugin, the session is passed this way:
post_params: {
    <?php echo "'" . ini_get('session.name') . "': '" . session_id() . "'"; ?>
}
Furthermore, don't forget to check the list of image extensions handled by your JS plugin.