I am using BlogEngine.NET as my blog engine, and it served me well until I realized I have to run it identically on two servers backed by one database.
When the editor is on server 1, anything they post is written to the database and updated in that server's in-memory post list (the static List Posts). Meanwhile, for any user on server 2, nothing done on server 1 is visible, because the in-memory post list there is never updated.
One life-questioning workaround is to make the editor switch servers via their hosts file and, with the help of a shiny button, let them reload the posts with:
BlogEngine.Core.Post.Reload();
HttpRuntime.Close();
but I really feel there should be a better solution. I am now wondering whether integrating Memcached somewhere here would be easy; if not, I guess I have to switch away from BlogEngine.NET.
Any suggestion would be more than welcome,
Thanks,
Bilsay
I have the same problem. I am running two servers behind a load balancer. I first log in to the first server and publish the post, then log in to the second server and press a "reset cache" button I wrote, which re-reads the data from the database. But this solution is still troublesome for user comments.
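Here is, roughly, what automating that button could look like, so nobody has to log in to each server by hand: after saving a post, ping every peer in the farm. Python is used purely for illustration, and the /reset-cache endpoint and the peer list are assumptions, not part of BlogEngine.NET:

from urllib.request import urlopen
from urllib.error import URLError

# Hypothetical peer list; in practice this would come from configuration.
PEERS = ["http://server1.internal", "http://server2.internal"]

def invalidate_peers():
    # Ask every peer to reload its in-memory post list from the database.
    for peer in PEERS:
        try:
            urlopen(peer + "/reset-cache", timeout=5)
        except URLError as err:
            # A dead peer should not break publishing; log and move on.
            print("could not reach %s: %s" % (peer, err))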
Thanks
There's an extension to help solve this problem:
http://allben.net/post/2009/05/10/Web-Farm-Extension-10
Scratching my head with the following problem.
I have developed a SAPUI5 application using smart templates/OData annotations. On the first screen I create entries for one entity, navigate to another screen, and create entries for another entity. Pretty simple.
The problem is that I am encountering the error "Virus scan profile not active" in the backend (embedded gateway). When I switch off the virus scan profile in /n/IWFND/VIRUS_SCAN in the backend, all is good; no issues at all.
But as soon as I switch back to a default profile, the problem returns. Ninety percent of the time I get it on the second screen, the other ten percent on the first entity.
The weird part is that all the other UI5 applications in the same system work. If it were a system problem, it would have to affect all of them, yet only my application is stuck with this error.
Clearly something is different between my application and the rest, but I am unable to figure out what. I have compared the projects (manifest/project.json, etc.) and my $batch request payload versus the others, and I see no anomaly; apart from the _changeset, everything is identical and conforming.
So what is the one thing that makes a POST request from my application trigger the virus scan profile check and fail?
Stranger still, the request's status code is 202, yet it doesn't reach the backend at all.
Any help is highly appreciated,
thanks,
Sathish R
Finally, the issue turned out to be that one of the properties in my entity was of type Edm.Guid, one of the types for which a virus scan is performed before the request is passed through. Since we don't have any scan profiles configured, I ended up with the error. Changing the property's type to Edm.String fixed the issue. :-)
thanks,
Sathish R
I am completely stuck on where to start with getting a log-in area for a Clojure site I am building (for fun).
I've looked at several resources, which I'll post below, mercilessly copy/pasted code, and the closest I can get is one of two situations:
The login page accepts the submitted credentials but reports that the login failed, even though, as far as I can tell, they match.
Or I get this error: No method in multimethod '->sql' for dispatch value: null
I'm not sure how to interpret that error: is it saying that I need a multimethod, or that I need to check for null? The null part makes no sense to me at all. I'm not really asking, but if anyone wants to offer an explanation, that would be great.
I tested the output by comparing the results of select queries against the raw, non-hashed data. I've gone through five variations on this theme, using everything from page-to-page calls to creating new defpartials, multimethods, defns, etc.
Sources I have used (unfortunately, as a first-time poster, I can't link all of them):
This one uses Clojure -> Korma -> PostgreSQL, but the code doesn't seem to work for multiple users?
http://www.vijaykiran.com/2012/01/17/web-application-development-with-clojure-part-2/
This one shows how to use Noir and PostgreSQL (Yes, I am using Noir):
https://yogthos.net:11794/blog/23-Noir+tutorial+-+part+2
The 4Clojure site, but that one uses CongoMongo.
The Heroku Twitter clone, but no mention of how to create logins for one person, much less several.
I also bought Programming Clojure from O'Reilly Press, but once again, nothing about how to create a log-in area.
FIRST EDIT: I was asked to create a GitHub repository of a stand-alone site. It includes a working "Account Creation" area, found in the welcome.clj file, and only the form for the login area in login.clj.
I was attempting to reproduce some of the errors I was getting last night, and to get this working before I uploaded the files. I don't have any reasonable starting points yet, so there is no beginning implementation as of yet. I'm seriously embarrassed by the solutions I've been coming up with, so I don't want to post them. I get conceptually what I should do, but for some reason I can't seem to translate it into code. This is my first GitHub account; my background is Python, Scheme à la SICP, and a Python + PostgreSQL marketing program I built.
SECOND EDIT: Ack! I can't seem to get the thing to work at all... Yeah, I've spent well over 20 minutes (hours) on this one, so I just have to admit that I don't yet have the requisite knowledge to accomplish this, no matter how many sources I consult. I committed the updated files and all the odd things I tried, from all the variations on the login box down to running raw SQL. The closest I can get is a state where I see no errors, but also no evidence at all that anyone is logged in. Thanks so much for the help and suggestions. I'll most certainly return to this later.
https://github.com/dt1/noirKormaLogin
There are a couple of issues that I see. First, in datapass.clj, you're creating an entity with no content. I'm not sure how Korma handles that. It's trying to thread results as inputs to other functions, so I could see how nil gets introduced there.
Second, you'll need something to handle the login POST: (defpage ...) only handles GET requests by default, so you'll need a separate defpage for the POST. Something along these lines:
(defpage [:post "/login"] {:keys [user-name pwd]}
  ;; Look the user up by name; on nil, if-let falls through to the else branch.
  (if-let [user (db/find-user user-name)]
    ;; Compare the submitted password against the stored hash.
    (if (noir.util.crypt/compare pwd (:password user))
      (do
        ;; Mark the session as logged in, then redirect.
        (noir.session/put! :some-key some-value)
        (noir.response/redirect "/success"))
      (noir.response/redirect "/failed-to-login"))
    (noir.response/redirect "/failed-to-login")))
session/put! is how you put data into the session. The default is an in-memory store; to get persistent sessions you'll need to add the appropriate Ring middleware (look at Ring's session stores).
Also, as luck would have it, someone just posted an authentication app for Noir; you may want to take a look: https://github.com/xavi/noir-auth-app
It's a long story, but I am trying to save an internal website from the pointy-haired bosses who no longer see any value in it and will be flicking the switch at some point in the future. I feel the information it contains is important and future generations will want to use it. No, it's not some adult site, but since it's a big corp, I can't say any more.
The problem is that the site is a mess of ASP and Flash that only works under IE7, is buggy under IE8, and is 32-bit only to boot. All the URLs are session-style gibberish. The Flash objects themselves pull extra information with GET requests to ASP endpoints. It's really poorly designed for scraping. :)
So my idea is to run a tcpdump while I navigate the entire site, then somehow dump the result of every GET into a SQL database. Then, with a little messing with the hosts file, I'd redirect every request to a CGI script that looks up the matching GET request in the database and returns the stored data. The entire site would then live in a SQL database as URL/data key pairs. A flat file might also work.
In theory, I think this is the only way to go about it. The only problem I see is if there is some client-side ActiveX/Flash code that generates session URLs that differ each time.
Anyway, I know Perl, and the idea seems simple with the right modules, so I think I can do most of the work in that, but I am open to any other ideas before I get started. Maybe this already exists?
Thanks for any input.
For capturing, I wouldn't use tcpdump, but either the crawler itself or a web proxy that can be tweaked to save everything, e.g. Fiddler, Squid, or mod_proxy.
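To illustrate the replay half of the plan, here is a minimal sketch that serves captured URL/data pairs out of SQLite. It's Python rather than Perl purely for brevity, and the database name and table schema are assumptions:

import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

# Assumed schema: CREATE TABLE pages (url TEXT PRIMARY KEY, body BLOB)
DB = sqlite3.connect("capture.db", check_same_thread=False)

class ReplayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # self.path includes the query string, so session-style URLs
        # must match exactly what was captured.
        row = DB.execute("SELECT body FROM pages WHERE url = ?",
                         (self.path,)).fetchone()
        if row is None:
            self.send_error(404, "not captured")
            return
        self.send_response(200)
        self.end_headers()
        self.wfile.write(row[0])

# With the hosts file redirected, the browser expects port 80;
# 8080 here just avoids needing admin rights while testing.
HTTPServer(("", 8080), ReplayHandler).serve_forever()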
I have an application built on Zend Framework that I am trying to optimize.
I did some Xdebug profiling, and although I can't say I understand every nitty-gritty detail of the results, some things were quite obvious.
For instance, the file Bootstrap.php seems to be the one gulping most of the time, taking 4,553 ms, which accounts for 92.49% of the total time.
Digging further, I could see that Zend_Application_Bootstrap_Bootstrap->run takes the bulk of that time, and checking again, I found that Zend_Controller_Front->dispatch might actually be the call inside Bootstrap.php that takes the time to execute.
The question is: given these indications, how best can I go about optimizing the application? If caching is the answer, how do I go about applying caching in this situation?
Thanks
From the look of the callgrind files, on the login page the app is spending most of its time in curl_exec, which is to be expected if you're doing a remote login. But it is doing 10 separate curl_exec calls, which seems excessive. I'm not familiar with the LinkedIn login flow, but is it possible your app is running the remote login code multiple times?
On the standard page request the app is spending most of its time connecting to MySQL, and it seems to be doing this twice. Are you using a remote DB server, and do you need two separate DB connections?
Assuming you are using a remote DB server on the same network as your web server, there seems to be some networking issue. I'd check the latency to that server if you can, and try connecting to the IP address instead of the hostname to see if that makes any difference (if that is much faster, it would suggest an issue with the DNS setup on your web server).
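If it's useful, the DNS-versus-connect cost can be measured without touching the app at all. A quick sketch; the host and port below are placeholders:

import socket
import time

HOST, PORT = "db.example.internal", 3306  # placeholder MySQL host

t0 = time.perf_counter()
ip = socket.gethostbyname(HOST)      # DNS lookup only
t1 = time.perf_counter()
conn = socket.create_connection((ip, PORT), timeout=5)  # TCP connect only
t2 = time.perf_counter()
conn.close()

print("DNS lookup: %.1f ms, TCP connect: %.1f ms"
      % ((t1 - t0) * 1000, (t2 - t1) * 1000))

If the lookup time dwarfs the connect time, that points at the DNS setup rather than the network or MySQL itself.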
We have a very strange problem at the moment whereby our Silverlight 4 application will not display the contents of a ComboBox on (some) client machines.
We have an observable collection in our view model, with a simple binding expression on the ItemsSource of the ComboBox. In the page's Loaded event we call a domain service to retrieve the items, then in its Completed event we store the returned items in the collection. Nothing fancy.
Firstly, when we deployed the solution in Release mode, the ComboBoxes had no values. I opened Fiddler and saw the request and response from the domain service, and it was in fact successfully returning the correct data to the client. However, the ComboBoxes had no data in them.
I rebuilt in the Debug|Any CPU configuration, hit the same URL, and it worked straight away; the ComboBoxes were filled with data. Problem solved? Not quite.
We then gave the URL to someone else to test (on the same network/subnet), and the ComboBoxes were empty again. It works on five machines and doesn't work on the rest. I checked Fiddler, and the response comes back with all the data; Silverlight just doesn't populate the ComboBoxes.
We have tried a number of things: IE with no add-ons, Chrome incognito, cross-domain and client access policies... nothing seems to make a difference. We've tried running the browsers as administrator; we've even tried a very old machine running IE6 (i.e. no UAC or anything strange there), same issue. No anti-virus is installed on any of the machines. We're at a complete loss. We've tried machines on the company domain and off the domain with no difference, and different operating systems (XP/Win7).
Does anyone have any ideas or solutions for this problem? It seems like something installed on the affected machines is causing it.
Cheers,
Matt
This is probably a timing issue. ComboBoxes wreak all sorts of havoc. Check out Kyle's blog for tips and some tools to make things easier.
http://blogs.msdn.com/b/kylemc/archive/2010/11/02/silverlight-combobox-frequently-asked-questions.aspx
The general problem is that the Silverlight ComboBox was not designed with async data patterns in mind.