Hosting plugins and themes on two different servers

So I run a full-fledged social network on WordPress, and it undoubtedly requires a lot of plugins, which take up a lot of disk space on my shared hosting account. I have two websites running on WordPress, each on a separate GoDaddy account. Let's call the first one website1 and the second website2.
Website1 gets tons of traffic, which takes up a lot of memory, and that, coupled with the plugins running, slows down and sometimes even crashes the site.
Website2 is a completely different story. I was building it for a client, but he soon lost interest and now it just sits there, with its memory allocation not even half used. In other words, website1 is overloaded and website2 is having a relaxed time.
Now what I wanted to do was to upload half (or, if possible, all) of the plugins to website2 but have them show up on website1.
Is this possible? I would appreciate some clear and helpful answers :D
And if it can be done with plugins, can it be done with themes?

This link could be helpful for plugins, and maybe there is a similar approach for themes.
https://wordpress.stackexchange.com/questions/74958/change-the-path-where-wordpress-plugins-are-uploaded
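The usual mechanism for relocating the plugin directory (which is what that question covers) is a pair of standard wp-config.php constants. A minimal sketch with placeholder paths; the caveat is that plugins are PHP code executed by website1's own server, so WP_PLUGIN_DIR must be a locally readable filesystem path, and pointing it at a second hosting account only works if that account's directory is mounted locally, which shared hosts rarely allow:

```php
<?php
// In wp-config.php, above the "That's all, stop editing!" line.
// WP_PLUGIN_DIR / WP_PLUGIN_URL are standard WordPress constants;
// the path and URL below are hypothetical placeholders.
define('WP_PLUGIN_DIR', '/home/website2/public_html/wp-content/plugins');
define('WP_PLUGIN_URL', 'http://website2.example.com/wp-content/plugins');
```

For themes there is no equivalent pair of constants, though WordPress does provide register_theme_directory() to add an extra theme directory, with the same local-filesystem caveat.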

How to implement continuous migration for large website?

I am working on a website of 3,000+ pages that is updated on a daily basis. It's already built on an open source CMS. However, we cannot simply continue to apply hot fixes on a regular basis; we need to replace the entire system, and I anticipate having to do so again every 1-2 years. We don't have the staff to build a replacement system while the current one is being worked on, as that results in duplicated effort. We also cannot have a "code freeze" while we work on the new site.
So, this amounts to changing the tire while driving. Or fixing the wings while flying. Or all sorts of analogies.
This brings me to a concept called "continuous migration." I read this article here: https://www.acquia.com/blog/dont-wait-migrate-drupal-continuous-migration
The writer's suggestion is to use a CDN like Fastly. The idea is that a CDN allows you to switch between a legacy system and a new system on a per-URL basis. In theory, this sounds like it would work. The article claims you can do this with Varnish, but that Fastly makes the job easier. I don't work much with Varnish, so I can't really verify those claims.
I also don't know if this is a good idea or if there are better alternatives. I looked at Fastly's pricing scheme, and I simply cannot translate it into a specific price point; these cryptic cloud-service pricing plans don't make sense to me. I also don't know what kind of bandwidth the website uses, since another agency manages the website's servers.
Can someone help me understand whether using a hosted CDN would be better than running something like Varnish myself? Are there free or cheaper solutions? Can someone tell me what this amounts to, approximately, on a monthly or annual basis? Are there other, better ways to roll out a new website on a phased basis for a large site?
Thanks!
I think I do not have exact answers to your questions, but maybe my answer helps a little bit.
I don't think the CDN itself is what gives you the advantage; the advantage is that you have more than one system.
Changes to the code
In professional environments I'm used to having three different CMS installations. The first is the development system, usually on my PC. That system is used to develop extensions, fix bugs and so on, supported by unit tests. The code is committed to a revision control system (like SVN, CVS or Git). A continuous integration system checks the commits to the RCS. When a feature is implemented (or some bugs are fixed), a named tag is created. Then this tagged version is installed on a test system where developers, customers and users can test the implementation. After a successful test, exactly this tagged version is installed on the production system.
At first sight this looks time consuming, but it isn't, because most of the steps can be automated. The biggest advantage is that the customer can test the change on a test system, so it is very unlikely that an error occurs only on your production system. (A precondition is that your systems are built on a similar/equal environment.)
Changes to the content
If your code changes the way your content is processed, it is an advantage when your CMS has strong workflow support. Then you can easily add a step to your workflow which decides whether the content is old and has to be migrated for the current document. This way you have a continuous migration of the content.
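A minimal sketch of what such a workflow step could look like, independent of any particular CMS (the schema_version field and the apply_migration() helper are hypothetical):

```php
<?php
// Hypothetical workflow step: migrate a document lazily, the first
// time it is processed after a schema change.
define('CURRENT_SCHEMA', 3); // hypothetical current content version

function migrate_if_needed(array $document) {
    $version = isset($document['schema_version'])
        ? $document['schema_version'] : 1;
    // Apply each migration in order until the document is current.
    while ($version < CURRENT_SCHEMA) {
        $version++;
        $document = apply_migration($document, $version); // hypothetical helper
    }
    $document['schema_version'] = CURRENT_SCHEMA;
    return $document;
}
```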
HTH
Varnish is a cache rather than a CDN. It intercepts page requests and delivers a cached version if one exists.
A CDN will serve up content (images, JS, other resources, etc.) from an off-server location, typically in the cloud.
Pricing for cloud-based solutions is often very cryptic because the underlying technology is quite complicated.
I would be careful with continuous migration. I've done both methods in the past (continuous and full migrations) and I have to say, continuous is a pain. It means double the admin time for everything, and it assumes your requirements stay the same at all points in time.
Unfortunately, I would say you're better off with a proper rebuild on a 1-2 year basis than with a continuous migration, but obviously you know your situation best.
I would suggest you also consider a hybrid approach: build yourself an export tool that keeps all of your content in a transferable format like CSV/XML/JSON, so you can just import it into a new system when ready. This means you can incorporate new build requests when you need them in a new system (what's the point of a new system if it does exactly the same as the old one?) and you get to keep all your content. Plus, you don't need to build and maintain two CMSes all the time.
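A minimal sketch of that kind of export tool, assuming the content sits in a database table named pages with id/title/body/updated_at columns (all placeholder names; adapt to your CMS's schema):

```php
<?php
// Dump CMS content into a portable JSON file.
// Connection details and table/column names are hypothetical.
$pdo = new PDO('mysql:host=localhost;dbname=cms', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$rows = $pdo->query('SELECT id, title, body, updated_at FROM pages')
            ->fetchAll(PDO::FETCH_ASSOC);

file_put_contents(
    'content-export.json',
    json_encode($rows, JSON_PRETTY_PRINT | JSON_UNESCAPED_UNICODE)
);
echo 'Exported ', count($rows), " pages\n";
```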

How to A/B test an entire website design

We're building a new website design, and instead of cutting over to it 100%, we'd like to ease into it so we can test as we go. The goal would be for users visiting http://oursite.com to get either the "old" website or the new one, and we'd control the percentage who get the new site: 10%, 50%, etc.
I'm familiar with A/B tests for pages, but not for an entire website domain. We're on a LAMP stack, so maybe this can be done with Apache VHosts? We have two cloud servers running behind a cloud load balancer in production. The new site is entirely contained in an SVN branch, and the current production site runs out of the SVN trunk.
Any recommendations on how I can pull this off?
Thank you!
You absolutely can do this, and it's a great way to quickly identify things that make a big difference in improving conversion rates. It's dependent on a couple of things:
Your site has a common header. You'll be A/B testing CSS files, so this will only work if there's a single CSS call for the entire site (or section of the site).
You're only testing differences in site design. In this scenario all content, forms, calls to action, etc. would be identical between the versions. It is technically possible to run separate tests on other page elements concurrently, but I don't recommend this, as interpreting the results gets confusing.
The A/B testing platform that you choose supports showing the same version to a visitor throughout their visit. It would be pretty frustrating for a visitor to see the site's theme change every time they hit another page. Most A/B testing platforms that I've used have an option for this.
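If you roll this yourself on your LAMP stack rather than using a platform, the usual way to get that stickiness is a bucket cookie checked by a front controller. A minimal sketch, assuming PHP fronts both designs; the cookie name, split percentage, and document roots are placeholders:

```php
<?php
// Assign each visitor to a sticky bucket, then keep serving that bucket.
$percentNew = 10; // share of visitors who get the new design

if (isset($_COOKIE['site_version'])) {
    $version = $_COOKIE['site_version'];
} else {
    $version = (mt_rand(1, 100) <= $percentNew) ? 'new' : 'old';
    // Remember the choice for 30 days so the theme never flips mid-visit.
    setcookie('site_version', $version, time() + 30 * 86400, '/');
}

// Hypothetical document roots for the two designs (e.g. trunk vs. branch).
$root = ($version === 'new') ? '/var/www/site-new' : '/var/www/site-old';
require $root . '/index.php';
```

Doing the split in PHP like this also plays nicely with your load balancer, since the decision travels with the visitor's cookie rather than living on one server.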
Since you're testing substantial differences between versions, I also recommend that you calculate sample sizes before you begin. This will keep you from showing the losing version to too many people, and it will also give you confidence in the results (it's a mistake to let tests run until they reach statistical significance). There are a couple of online calculators that you can use (VisualWebsiteOptimizer, Evan Miller).

Multiple domains one sign on (without logging in to each one)

I have been asked to oversee the development of a handful of sites. The people running the show want it so that if you sign in to one of the sites, you are automatically signed in to the rest of them.
One of my buddies, who is a great programmer, says there is no safe way to do this. Is he right?
I had an idea that the main site (parent site) could host the daughter sites as subdomains, with each site having its own unique domain name.
What do you think?
Yes, it can be done. However, it won't be a trivial solution; it will be a very expensive project that requires an extensive set of skills. Companies typically try to achieve this by building internal solutions themselves, but tend to fail as complexity increases.
What you are trying to accomplish can also be done as a service. You may want to take a look at the following webpage:
http://www.covisint.com/web/guest/about-identity-services
Hope that helps!
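For what it's worth, if all the sites really can sit under one parent domain, as the question suggests, the minimal version of this is a session cookie scoped to that parent domain. A sketch assuming PHP sessions and a placeholder domain example.com; this covers subdomains only, not separate top-level domains, and the sites still need to share session storage:

```php
<?php
// Scope the session cookie to the parent domain so every subdomain
// (site1.example.com, site2.example.com, ...) sees the same session.
// "example.com" is a placeholder; all sites must share this suffix.
session_set_cookie_params(0, '/', '.example.com');
session_start();

if (!isset($_SESSION['user_id'])) {
    // Not signed in anywhere yet; send them to the shared login form.
    header('Location: https://login.example.com/');
    exit;
}
```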

I want to separate binary files (media) from my code repositories. Is it worth it? If so, how can I manage them?

Our repositories are getting huge because of the tons of media we have (hundreds of 1 MB JPEGs, hundreds of PDFs, etc.).
Because of this, our developers who check out these repositories have to wait an abnormally long time for certain repos.
Has anyone else had this dilemma before? Am I going about it the right way by separating code from media? Here are some issues/worries I had:
If I migrate the media to a media server, I'm afraid it might be a pain for developers to use. Instead of making updates to one server, a developer will now have to update two servers when doing both programming logic and media updates.
If I migrate these into a media server, I'll still have to revision control the media, no? So the developer would have to commit code updates and commit media updates.
How would the developer test locally? I could make my site use absolute URLs, e.g. src="http://media.domain.com/site/blah/image.gif", but this wouldn't work locally. I assume I'd have to change my site templating to decide whether it's a local/development or production environment and, based on that, change the BASE_URL (something like the sketch after this list).
Is it worth all the trouble to do this? We deal with about 100-150 sites, not a dozen or so major sites, so we have around 100-150 repositories. We won't have the time or resources to change existing sites; we can only implement this on brand new sites.
I would still have to keep the scripts that generate media (PDF generators) and the generated media in the code repository, right? It would be a huge pain to update all those PDF generators to POST files to external media servers, and an extra pain to take caching into account.
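On the BASE_URL point above, a minimal sketch of the environment switch (the APP_ENV variable and the hostnames are placeholders):

```php
<?php
// Pick the media base URL from the environment, so templates can
// prefix every media path with MEDIA_BASE_URL instead of hardcoding
// a hostname. The hostnames here are hypothetical.
$isProduction = (getenv('APP_ENV') === 'production');

define(
    'MEDIA_BASE_URL',
    $isProduction ? 'http://media.domain.com' : 'http://localhost/media'
);
```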
I'd appreciate any insight into the questions I have regarding managing media and code.
First, yes, separating media and generated content (like the generated PDFs) from source control is a good idea.
That is because of:
disk space and checkout time (as you describe in your question)
the lack of VCS features that actually apply to this kind of file (no diff, no merge, only labels and branches)
That said, any transition of this kind is costly to put in place.
You need to separate the release management process (generating the right files in the right places) from the development process (pulling the right material from one or two repositories to develop/update your projects).
Binaries generally fall into two categories:
non-generated binaries:
They are best kept in an artifact repository (Nexus, for instance), under a label that matches the label used for the text sources in the VCS.
generated binaries (like your PDFs):
Ideally, they shouldn't be kept in any repository at all, but only generated during the release management phase in order to be deployed.

Allow users to upload files to server via email?

I am administering a small, private website with 100% trusted users (about 60 people; I know them all personally).
I am having many problems with the PHP-based upload system I currently have in place, mainly with users encountering timeout errors and other issues due to the way the upload is handled (not to mention the complete dead zone in the UI created by making the user stare blankly at the page until the upload finishes).
Anyway, I have been tossing around alternative forms of file uploading I could offer. FTP accounts were nixed due to the level of tech savvy required. Flash/Java uploaders were nixed because I don't really want proprietary third-party applets running on my site.
The other idea I came up with, which I think would be perfect, is to offer the ability to EMAIL the files to the server. Emailing attachments is a simple enough task, and better yet, it provides the user with some tangible feedback during the upload process.
My question is: how could I go about implementing such a system?
The server is running Gentoo Linux with Apache, and I have full root access. Mail daemons can be installed as needed.
If you have a better way to upload files, perhaps you could suggest that instead?
Stick with PHP. It's certainly not perfect, but the problems you're describing can probably be handled. max_execution_time and upload_max_filesize are configurable values. I would at least try tweaking those numbers (no PHP code changes required) before trying to implement an email-based solution.
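For reference, a quick way to see what you're currently working with (these are standard php.ini directives; upload_max_filesize and post_max_size can only be raised in php.ini or per-directory server config, not from script code, and post_max_size must be at least as large as upload_max_filesize):

```php
<?php
// Print the limits that usually cause large-upload timeouts/failures.
echo 'upload_max_filesize: ', ini_get('upload_max_filesize'), "\n";
echo 'post_max_size:       ', ini_get('post_max_size'), "\n";
echo 'max_execution_time:  ', ini_get('max_execution_time'), "\n";
echo 'max_input_time:      ', ini_get('max_input_time'), "\n";
```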
There are several file-upload libraries with progress bars using pure JavaScript. Keep it in PHP.
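If you do decide to try the email route anyway, one common pattern is a cron job that polls a dedicated mailbox over IMAP and saves any attachments to disk. A minimal sketch using PHP's imap extension (credentials, mailbox, and upload path are placeholders; real code would also want size limits and filename collision handling):

```php
<?php
// Cron-driven sketch: poll an IMAP mailbox, save attachments to disk.
// Credentials and paths are hypothetical placeholders.
$mbox = imap_open('{localhost:143/notls}INBOX', 'uploads', 'secret');
if ($mbox === false) {
    die('Cannot connect: ' . imap_last_error());
}

// Fetching a body below sets the \Seen flag, so searching UNSEEN
// keeps us from processing the same message twice.
$messages = imap_search($mbox, 'UNSEEN');

foreach ($messages ?: array() as $msgno) {
    $structure = imap_fetchstructure($mbox, $msgno);
    if (empty($structure->parts)) {
        continue; // no MIME parts, so no attachments
    }
    foreach ($structure->parts as $i => $part) {
        // Attachments normally carry a filename disposition parameter.
        $filename = null;
        if (!empty($part->dparameters)) {
            foreach ($part->dparameters as $param) {
                if (strtolower($param->attribute) === 'filename') {
                    $filename = $param->value;
                }
            }
        }
        if ($filename === null) {
            continue; // skip plain text/body parts
        }
        // Part sections are 1-based strings in imap_fetchbody().
        $body = imap_fetchbody($mbox, $msgno, (string) ($i + 1));
        if ($part->encoding == ENCBASE64) {
            $body = base64_decode($body);
        } elseif ($part->encoding == ENCQUOTEDPRINTABLE) {
            $body = quoted_printable_decode($body);
        }
        // basename() strips any sneaky path components from the name.
        file_put_contents('/var/www/uploads/' . basename($filename), $body);
    }
}
imap_close($mbox);
```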