All of a sudden, one of our sites has developed an issue with TinyMCE, specifically, it seems, in relation to the advimage plugin.
When trying to browse the image folder via TinyMCE I get an alert with one of these errors:
In Chrome I get:
2can't process ajax,TypeError: Cannot read property 'responseText' of null
In Firefox I get:
2can't process ajax,Invalid XML structure
Nothing has changed on this site for a good few months.
We have upgraded to PHP 5.4 very recently, but I don't see why that would be related.
This could be an issue with overly large images in the plugin's upload directory.
On each request, the advimage plugin scans the upload directory (set in your config) and generates thumbnails of any images it finds in there, then sends the list of images to the client as JSON or XML. If an image is too large to process (because of low server memory or something similar), the process quits and doesn't return any JSON/XML, hence the seemingly unrelated error message.
Prune any images over 1 MB from the uploads directory. You may need to flush the cached thumbs as well. To stop your users/admins from uploading huge images, set an upload limit in the plugin's config.
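For example, a quick cleanup pass could look something like the sketch below (the directory path and extension list are assumptions; point them at your own TinyMCE/advimage configuration):

<?php
// Minimal cleanup sketch: delete images larger than 1 MB from the advimage
// uploads directory. $uploadDir and the extension list are assumptions.
$uploadDir = '/var/www/example/uploads/images';
$maxBytes  = 1024 * 1024; // 1 MB

foreach (glob($uploadDir . '/*.{jpg,jpeg,png,gif}', GLOB_BRACE) as $file) {
    if (is_file($file) && filesize($file) > $maxBytes) {
        unlink($file); // prune the oversized image
    }
}
// Remember to clear the plugin's cached thumbnails separately if it keeps
// them in another folder.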
I have a WOPI host running in a Blazor server application with all of the .wopitest tests passing for the desired functionality (others skipped).
When I upload a Word document, I am able to view the document with no issues. I am also able to edit the document; however, when I try to edit the document a second time, I get an error.
The error doesn't appear to be handled and seems to originate in the Office Online JavaScript file.
[Screenshot: error on attempting second edit]
Following the error, I am still able to open the document for viewing. It is the same behaviour if I use the 'Editing' button in the Office Online page or directly navigate to the editing page using an edit action url.
Supplementary information:
Using ngrok to debug locally
.NET 6
Using SQLite database for holding file information (including path to file)
Using local folders for storing file contents (e.g. 'data' folder containing all files)
Similar issues with .xlsx files being corrupted upon editing and requiring a 'repair' when opened with Excel. This repair removes cells containing text and indicates that it removes the theme.
Viewing a Word document gives the following console errors: [screenshot: view document error]
The first edit of a Word document gives the following console errors: [screenshot: edit document error]
I was expecting to be able to repeatedly edit the document.
I tried opening the file in the desktop version of Word and got the following error: [screenshot: desktop Word recovery prompt]
Following a recover, the document appears to work as expected in Word (desktop) but still won't open for editing through WOPI.
It turns out it was the way the POST HTTP request body was being saved.
I'm still not certain exactly what was going wrong, but somewhere along the way, writing the stream into a buffer and then saving that buffer to a file corrupted the file.
I suspect the file stream was either truncating or adding a few bytes.
The interesting part being that Office Online was still able to view the file.
This indicates there is some tolerance for malformed files still being served.
I am trying to use Blueimp jQuery File Upload and it works great with small files. If I try to upload anything greater than 50 MB, it fails and I get the error 'Empty file upload result'.
I have seen lots of responses to questions from people getting the same error, but they seem to get it despite the file uploading correctly, or the suggested code corrections don't seem to apply to the code that is now supplied with the plugin.
The FAQ suggests that there is a server-side restriction on file size, but I have asked my host to increase it to 1 GB and they have confirmed they have done this. I do not have permission to overwrite the php.ini as suggested in the FAQ; I just get a server error.
Has anyone else had this problem and if so how was it resolved?
I am using PHP.
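In case it's relevant, this is the kind of quick check (just ini_get calls) that should show which limits are actually in effect on the server, in case the host's change didn't land in the right php.ini:

<?php
// Diagnostic sketch: print the upload-related limits this vhost is
// actually running with.
echo 'upload_max_filesize = ', ini_get('upload_max_filesize'), PHP_EOL;
echo 'post_max_size       = ', ini_get('post_max_size'), PHP_EOL;
echo 'memory_limit        = ', ini_get('memory_limit'), PHP_EOL;
echo 'max_execution_time  = ', ini_get('max_execution_time'), PHP_EOL;
echo 'max_input_time      = ', ini_get('max_input_time'), PHP_EOL;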
After reading Steve Sanderson's post on SWFUpload:
http://blog.stevensanderson.com/2008/11/24/jquery-ajax-uploader-plugin-with-progress-bar/
I have implemented SWFUpload on a site I am working on. Some users are getting a variety of issues where the progress bar gets stuck, or they get error message 2038 with error code -220 (System IO error). This is not related to certificates, as in the test below both addresses can be accessed with http or https.
I haven't been able to reproduce most of these errors. However, when trying to upload large images (over 2 MB), it works fine on the test site, but not on the live one.
UPDATE: I had posted examples here, now removed as the links don't work.
Both sites are hosted on AppHarbor with exactly the same code.
The limit for image uploads should be 10 MB, and I have successfully uploaded images larger than the one posted here.
What could be the cause of this?
Can I ask what language the rest of the site is written in?
My first thought is that if it's an IO error, it could be running out of space.
Run:
df -h
on the servers and see what we get. Remember that all file uploads are written to /tmp before being moved where you want them, so if that fills up, uploads stop.
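If you'd rather check from PHP itself, something along these lines (ini_get plus disk_free_space) should show where uploads are being staged and how much room is left there:

<?php
// Sketch: find where PHP stages uploads and how much free space that
// filesystem has. upload_tmp_dir is often empty, in which case PHP falls
// back to the system temp directory.
$tmpDir = ini_get('upload_tmp_dir') ?: sys_get_temp_dir();
echo "Upload temp dir: $tmpDir\n";
echo 'Free space: ', round(disk_free_space($tmpDir) / (1024 * 1024)), " MB\n";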
This turned out to be a configuration setting at the load balancer level. We have a dedicated load balancer with AppHarbor so we can offer full SSL support. It had not been set up to allow requests of 10 MB; they have changed it now.
Just don't forget the php.ini parameters that set:
session.use_cookies to On
and session.use_only_cookies to Off,
and in the JS plugin the session is passed this way:
post_params: {
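// pass the PHP session name and ID with every upload request, since Flash requests don't carry the browser's cookies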
<?php echo "'".ini_get('session.name')."':'".session_id()."',"; ?>
}
Furthermore, don't forget to check the list of image extensions handled by your JS plugin.
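On the PHP side, the matching pattern (not from the plugin itself, just the usual workaround sketch) is to restore the session from that POST field before calling session_start():

<?php
// Sketch: Flash uploads don't send the browser's cookies, so restore the
// session from the ID the plugin posts (see post_params above).
// This only works when session.use_only_cookies is Off.
if (isset($_POST[session_name()])) {
    session_id($_POST[session_name()]);
}
session_start();

// ... then handle the upload from $_FILES as usual ...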
Can someone explain to me, in simple terms, the concept of caching in GWT? I have read about this in many places, but maybe due to my limited knowledge, I'm not able to understand it.
For example, what nocache.js and cache.js are,
or other things, such as making the client cache files forever, or how to have files cached by the client and re-downloaded only if they change on the server.
Generally, there are 3 types of files -
Cache Forever
Cache for some time
Never Cache
Some files can never be cached, and will always fall in the "never cache" bucket. But the biggest performance wins come from systematically converting files in the second bucket into files that can be cached forever. GWT makes it easy to do this in various ways.
The <md5>.cache.js files are safe to cache forever. If they ever change, GWT will rename the file, and so the browser will be forced to download it again.
The .nocache.js file should never be cached. This file is modified even if you change a single line of code and recompile. The nocache.js file contains links to the <md5>.cache.js files, and therefore it is important that the browser always has the latest version of this file.
The third bucket contains images, css and any other static resources that are part of your application. CSS files are always changing, so you cannot tell the browser 'cache forever'. But if you use ClientBundle / CssResource, GWT will manage the file for you. Every time you change the CSS, GWT will rename the file, and therefore the browser will be forced to download it again. This lets you set strong cache headers to get the best performance.
In summary -
For anything that matches *.cache.*, set a far-in-the-future Expires header, effectively telling the browser to cache it forever.
For anything that matches *.nocache.*, set cache headers that force the browser to re-validate the resource with the server.
For everything else, set a short Expires header, depending on how often you change those resources.
Try to use ClientBundle / CssResource; this automatically renames your resources so that they fall into the cache-forever bucket. (A sketch illustrating these header rules follows this list.)
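To make those header rules concrete: GWT output is usually served by Apache, nginx or a servlet filter rather than PHP, but purely as an illustration of the header values described above, a front-controller sketch could look like this (the file routing and paths are hypothetical):

<?php
// Illustration only: apply the caching policy above, assuming requests for
// GWT artifacts are routed through this script. Paths are hypothetical.
$file = isset($_GET['file']) ? basename($_GET['file']) : '';

if (strpos($file, '.cache.') !== false) {
    // Content-hashed permutation: safe to cache "forever".
    header('Cache-Control: public, max-age=31536000');
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 31536000) . ' GMT');
} elseif (strpos($file, '.nocache.') !== false) {
    // Bootstrap selector script: always re-validate with the server.
    header('Cache-Control: no-cache, must-revalidate');
} else {
    // Other static resources: short-lived cache.
    header('Cache-Control: public, max-age=3600');
}

readfile(__DIR__ . '/war/' . $file); // hypothetical GWT output directory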
This blog post has a good overview of the GWT bootstrapping process (and many other parts of the GWT system, incidentally), which has a lot to do with what gets cached and why.
Basically, the generated nocache.js file is a relatively small bit of JS whose sole purpose is to decide which generated permutation should be downloaded.
Each individual permutation consists of the implementation of your app specific to the browser, language, etc., of the user. This is a lot more code than the simple bootstrapping code, and thus needs to be cached for your app to respond quickly. These are the cache.html files that get generated by the GWT compiler.
When you recompile and deploy your app, your users will download the nocache.js file as normal, but this will tell their browsers to download a new cache.html file with the app's new features. This will now be cached as well for the next time they load your app.
I have a WebApp that I've been trying to make work offline. The WebApp is too big, even minified, to simply use the application cache (things download but I eventually get a window.applicationCache error). I'm trying to use XMLHttpRequest to get the larger scripts and main HTML, keep them in localStorage, and just keep a small loader script in the application cache. The problem I'm seeing is that the XMLHttpRequest returns a network error when the loader script is being served locally. When the cache is downloading, no error is returned and it works fine. When I turn off the application cache, the loader works fine, but of course then I need the network to get the loader.
I tried setRequestHeader("Cache-Control", "no-cache") but that didn't help.
Anybody have a clue?
What does the NETWORK: section in your manifest look like?
I found that if I wasn't allowing wildcard network traffic, it wouldn't load with XMLHttpRequest. So changing it to:
NETWORK:
*
did the trick for us.
I think I found a solution. It would probably work for others.
I split the loader into two separate HTML files: one that uses XMLHttpRequest to get all the required files and put them in localStorage (the loader), and another that simply reads the files from localStorage and writes them into the document (the booter) with appropriate wrappers (e.g. <script> tags). The booter has a manifest file to keep it in the application cache. The loader does not. The user first invokes the booter. If the booter finds files already in localStorage, it does its thing. Otherwise, it uses location.replace() to invoke the loader. The loader loads the files from the server using XMLHttpRequest and puts them in localStorage, and then re-invokes the booter using location.replace(). This seems not to cause a network error.
In order to run offline, the user must invoke the booter in the iPhone Safari browser (which invokes the loader, which re-invokes the booter), which boots the WebApp. In Safari, the user must then add the WebApp (the booter link) to their Home Screen (using the "+" button at the bottom). When offline, the user can get to the app from the Home Screen icon. It takes a few seconds to re-render, but it's fully functional after that. It's the same delay when online. Invoking the link from the iPhone Safari browser will not work offline, though it will work online.
The booter monitors the application cache's "updateready" event, so that when online, if the iPhone detects a change in the booter's manifest file and downloads a new booter, it will swap in the new cache (window.applicationCache.swapCache()) and invoke the loader using location.replace() again. I also add an alert() to let the user know something funky is going on. So changing the manifest file (I mean making some bytes different, not just tweaking the modify time) will cause clients to get new files when online.
Interestingly, I noticed that localStorage set up in Safari is not available to the same page served from invoking the Home Screen icon, even though the cookies transfer! So the first time the booter is invoked from the icon it will reload the files even though they were previously loaded in Safari. Also, I had to explicitly prevent the loader from being cached as it was not reloading from the server when the rest of the files were updated.
You are correct. Ultimately it was the NETWORK: section in the manifest.
I thought the site the application was loaded from was included automatically and you didn't need to mess with it, but that's not true. You need to put the site in the NETWORK: section.