I have installed SilverStripe on several servers successfully in the past (but I'm not a SilverStripe expert). This time my SS install fails to work and I'm at a loss how to fix it.
The Problem
SilverStripe 2.4.6 installed correctly on the server (AFAIK).
The front-end works as expected. (Shows the default theme; pages all load correctly.)
I am able to log into the CMS admin section successfully. The CMS loads, but when changing site pages in the CMS using the browser pane on the left, the CMS shows the circular loading symbol and the new page load never completes.
Using Firebug's console in Firefox: when attempting to change pages in the CMS (by clicking in the page browser pane), the CMS tries to load two pages. The second page request 404s.
The first GET request is from the initial page load.
The following POST+GET requests fire when clicking on the page tree to change pages.
Attempting to Find the Solution
I've tried deleting and re-installing SilverStripe twice (2.4.7 and 2.4.6). Both times the problem recurs.
A strange thing is that this server is already running two other SilverStripe sites (both of which I installed without a hitch). All three websites are accessed via different domains. I tried accessing this install via another domain, thinking there might be something wrong with how this third domain is configured, but that didn't help either.
What should I try now? I'm stumped.
Thanks in advance.
Responses to Comments
Check your root .htaccess file. Make sure RewriteBase is set to /
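For reference, the stock SilverStripe 2.4 rewrite block looks roughly like this (from memory, so treat it as a guide rather than gospel; the key line is RewriteBase /):
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_URI} ^(.*)$
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule .* sapphire/main.php?url=%1&%{QUERY_STRING} [L]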
Checked. Full .htaccess on PasteBin
Indeed the JavaScript URL is strange. Check if there is anything unusual about what's being returned from the previous POST request. Is the site running in dev, test or live mode?
I can't see anything unusual in the POST request.
Clue found: the site is running in DEV mode. Switching to LIVE mode makes the problem disappear. Also, the second GET request only shows up in DEV mode.
Example POST request with response.
Example GET request with response.
This is a workaround more than a fix, but if you'd rather be coding than bug hunting it might be worth a go! (Remember to log out of SS before applying it.)
In your mysite/_config.php file change
Director::set_environment_type("dev");
to
if(!isset($_GET['isDev'])) {
    Director::set_environment_type("dev");
} else {
    Director::set_environment_type("live");
}
Then you can develop the website in dev mode as normal, and to use the admin in live mode and avoid the bug you just go to: http://{your_domain}/admin?isDev=0
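One caveat: the CMS's follow-up AJAX requests won't carry the query string, so the switch may only apply to the initial page load. If it doesn't stick, an untested variant along the same lines is to remember the choice in a cookie:
// Untested sketch: persist the ?isDev choice in a cookie so follow-up
// CMS requests (which carry no query string) keep the same mode.
if(isset($_GET['isDev'])) {
    setcookie('isDev', $_GET['isDev'], 0, '/');
    $mode = $_GET['isDev'];
} elseif(isset($_COOKIE['isDev'])) {
    $mode = $_COOKIE['isDev'];
} else {
    $mode = '1'; // default to dev, matching the original behaviour
}
Director::set_environment_type($mode === '0' ? 'live' : 'dev');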
N.B. I might find a proper answer once pastebin.com isn't overloaded and I can see your responses!
Related
I have a problem whose solution is certainly very simple, but it isn't coming to mind at the moment :/
I have a multi-domain TYPO3 (6.1) installation, and on one of the websites I need to temporarily show only one subpage. I'll be working on and updating the rest of the pages, so I can't delete them. It's important that anyone entering a URL directly or arriving from Google search results doesn't get those pages, but is redirected to the temporary one instead.
I've tried mount points, but something isn't working...
Please help.
You can exchange the domain records.
Make a new page on its own (independent of the configuration of the domain it should replace), so it is a root page. Give it a domain record and disable the domain record of the page tree it should replace.
Be aware that you need to change the rootpage_id configuration in realurl.
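For example, something like this in typo3conf/realurl_conf.php (the uid 42 and the domain are placeholders for your new temporary root page and its domain record):
// Point realurl at the uid of the new temporary root page.
$GLOBALS['TYPO3_CONF_VARS']['EXTCONF']['realurl']['www.example.com']['pagePath']['rootpage_id'] = 42;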
You may also need a special 404-handling configuration for this domain, as most requests will now be a 404 (or better, a 503).
And hurry up and update your system: TYPO3 6.1 has been out of service for a long time.
I have already checked out this question, and it sounds like he's describing the exact same problem as me, except for a few things:
I'm not running on https
80% of the time I try to debug, I get the message "Error parsing input URL, no data was scraped."
The scraper works perfectly on a different domain on the same server, with the same theme and almost identical content. Every time I try that domain it scrapes perfectly, including the image.
During the 20% of the time that it actually scrapes my page, I am having the same issue as in the link above: it reads my thumbnail, yet shows a blank image. The link leads to a working image, but it doesn't want to show anything.
The weird part is that it worked completely fine about 10 months ago, when I updated this blog on a daily basis. The only difference is that I've switched servers recently. While that would explain a possibility, the other domain switched as well and doesn't have this problem.
I am at a loss as to why my links either show no image at all on Facebook or just give me:
Domain Link
Domain
(no image, no description)
Very frustrating situation. Does anyone have any suggestions?
Update:
I have 6 domains...
When I moved servers recently, I found the new server wasn't prepared to compress the pages, so my blog posts looked crazy. This forced me to turn compression 'off' in WP Super Cache on my main blog. I also did it on my second-highest-traffic blog, figuring I'd get to the other 4 later.
Well, now those first two blogs appear to work fine in the Facebook debugger, but the remaining 4 have trouble. The tricky part is that I completely removed WP Super Cache from one site and still had trouble fetching the data.
So while it seems like it logically should have been a WP Super Cache issue, continuing to get errors despite removing it makes me doubt that now. I'm still so baffled.
Update:
OK, I loaded Chrome and IE, and both were able to pull the data with ease. The Google snippet tool also worked great. I am going to try posting a link to my Facebook fan page via Chrome and see if it works correctly.
I did clear my FF cache and it didn't change anything, but I am still confused why one domain works OK while the other does not. Either way, if posting via Chrome works, I'll stick with that for now.
Any other suggestions?
Cache shouldn't cause any problem: if a browser can see your page, so can the Facebook debugger.
See if a 500 error is occurring. Try a different browser, clear the browser cache, etc. Try the Google rich snippet tool and see if a custom search engine scrapes it fine.
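One more way to check, if you can run PHP somewhere: fetch the page with the scraper's user agent and see exactly what comes back (the URL below is a placeholder for one of your failing posts):
<?php
// Fetch the page the way Facebook's scraper would and dump the
// HTTP status plus any og: meta tags found in the response.
$ch = curl_init('http://example.com/your-post/'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'facebookexternalhit/1.1');
$html = curl_exec($ch);
echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
preg_match_all('/<meta[^>]*property=["\']og:[^>]*>/i', $html, $m);
print_r($m[0]);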
PS: It would be nicer if you posted the URL.
We have an application that is built exclusively in dev mode using the embedded Jetty server that comes with GWT. We also host on Jetty.
There are a number of pages we use for development only to do things like simulate SSO requests, view emails that were sent through the system, and check what files are uploaded.
When we try to link from these pages into a GWT page, the problem is that &gwt.codesvr=192.168.0.101:9997 is not included in the URL and we get the error message "GWT module 'YourApp' may need to be (re)compiled". Obviously I can paste in "&gwt.codesvr=192.168.0.101:9997" manually, but that is very annoying. Does anybody know of a way to detect that you are in the embedded Jetty dev-mode server and auto-generate links with the correct "&gwt.codesvr=192.168.0.101:9997" added on?
Try this solution: https://stackoverflow.com/a/9122167/970308
I've updated this bookmarklet. It isn't perfect, but it makes things quick while developing.
I suggest you create a Filter which simply redirects you to the address with &gwt.codesvr=192.168.0.101:9997 appended as soon as you navigate to one of the "development pages". If the codesvr parameter is specific to each developer, each developer will have to set it in a cookie, and the filter will simply take the value from the cookie.
We use the Facebook Like button at the bottom of each page. We used to have the iFrame version but have now changed to the FB version. With both solutions, we get an SSL error on our page because an image file is not loaded over a secure connection.
When looking into the resources loaded, we see that two files are loaded securely:
https://www.facebook.com/plugins/like.php?app_id=110658975693059&href=http%3A%2F%2Fwww.stackoverflow.com&send=false&layout=button_count&width=280&show_faces=false&action=like&colorscheme=dark&font&height=21
https://s-static.ak.facebook.com/rsrc.php/v1/yK/r/PpEvPTmpg44.js
and the image sprite is loaded over plain HTTP:
http://static.ak.fbcdn.net/rsrc.php/v1/z7/r/ql9vukDCc4R.png
I guess it is a temporary bug on Facebook's side, because the SSL-loaded CSS file references a non-SSL image file. I created a bug report some time ago, with no response yet. Does anyone have the same problem, or even a solution for how to deal with it?
Thanks
There really is no fix for this that you can do on your end. Facebook must fix this, and they are extremely slow at fixing bugs. I noticed this on my site as well. Facebook's HTTPS CSS file references non-HTTPS images, which causes this. I think this is a newer issue, though, because it used to work fine on my site.
We ended up grabbing the button resources and storing them locally. This improves page load time and solves any possible HTTPS issues.
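Roughly, the "storing them locally" step can be as simple as a one-off script like this (the destination path is an assumption); then point your markup or CSS at the local copy served over HTTPS:
<?php
// One-off mirror of the Like-button sprite so it can be served
// from our own HTTPS origin instead of Facebook's HTTP CDN.
$src  = 'http://static.ak.fbcdn.net/rsrc.php/v1/z7/r/ql9vukDCc4R.png';
$dest = __DIR__ . '/assets/fb-sprite.png'; // assumed local path
file_put_contents($dest, file_get_contents($src));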
I've seen several posts about DropDownLists getting cleared, or events not getting fired, but they don't seem to match this situation.
I've got (well, I've reduced the problem to) a very simple ASP.NET website: a master page with a content page. The content page has a single DropDownList with AutoPostBack set to True. The code-behind updates a Label with the list's selected value. I'm not using UpdatePanel or AJAX (though I tried using them and got exactly the same results). It's an intranet site using Windows authentication.
It works fine in IE and Chrome, but every time I try it on my iPad it just sits and spins. The postback appears to happen, but either nothing comes back (or is accepted) from the server, or the client just doesn't know how to finish things up, or I don't know what.
Sorry if this seems vague but I've spent two hours on Google and haven't come up with anything other than the fact that a simple page like this should work fine on an iPad, so I'm a little punchy.
Anybody got any pointers or ideas?
EDIT: Running this page through the remote web access portal my company uses, it works fine. So this may be an authentication problem between the iPad and IIS.
Not sure I have an answer, but do you have the issue if you remove the DropDownList? If you need to build the list based on data, maybe you could use an asp:Repeater and build an HTML select list.