How to make a development site noindex, nofollow, but not production - robots.txt

I have a Joomla 3.3 development site and a production site. I do all development on the development site, and every time the development site gets pushed to production, I need to make sure I change noindex, nofollow to index, follow.
Is there any way I can keep the development site noindex, nofollow without having to change production?

There is an easier method than using noindex, nofollow. Use canonical links on your pages, pointing at the production URL. That way, even if search engines find your development pages, they won't index them.
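As a minimal sketch of that idea (production.com is a placeholder for your real production domain), the template could always emit a canonical tag that points at the production copy of the current page:

<?php
// Always declare the production copy as canonical, no matter which host
// served the page. "production.com" is an illustrative placeholder.
$path = $_SERVER['REQUEST_URI'];
echo '<link rel="canonical" href="https://production.com' . htmlspecialchars($path, ENT_QUOTES) . '">';
?>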

Use conditional PHP code inside the template's head PHP file.
<?php
// If the host is the development domain, emit a noindex meta tag.
$host = $_SERVER['HTTP_HOST'];
if ($host == "development.com") {
    echo "<meta name=\"robots\" content=\"noindex, nofollow\">";
}
?>

When you move the files to the production server, you don't have to copy configuration.php, which stores the robots setting in public $robots = '';.
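For reference, an illustrative excerpt of what that line looks like in the development site's configuration.php (the value is whatever you set in Joomla's Global Configuration):

class JConfig {
    // ... other settings ...
    public $robots = 'noindex, nofollow'; // development copy only; not deployed
    // ... other settings ...
}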
I highly recommend keeping your development installation password protected via .htaccess, to be sure it can never be accessed by search engines.
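A minimal .htaccess sketch for that (the .htpasswd path is a placeholder, and the file should live outside the web root):

AuthType Basic
AuthName "Development site"
AuthUserFile /path/to/.htpasswd
Require valid-user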
Hope this helps

Related

using TYPO3 core hooks only in one site of a multi-site installation

I defined a hook in ext_localconf.php:
$GLOBALS['TYPO3_CONF_VARS']['SC_OPTIONS']['tslib/class.tslib_content.php']['typoLink_PostProc']['titleTagsInHiddenText'] = SNM\StmwiAccessibility\ExtendTypolink::class . '->convertTitleInHiddenText';
This hook will be executed for every link on the page, on all pages of all sites. This could be a performance killer...
So, is there a way to restrict the use of the hook to the current site? Is it possible to get the current site in ext_localconf.php?
e.g.:
$currentSite = ????;
if ($currentSite == 'rootPidOfMySite') {
    $GLOBALS['TYPO3_CONF_VARS']['SC_OPTIONS'][/* ... register the hook ... */];
}
This question arises in other contexts too: I often want to restrict the performance-heavy configuration of extensions to one single site. I can do it with the static setup file, but not with the stuff in ext_localconf.php.
Thanks!
Which TYPO3 version do you use? There is an API to access the site configuration. I'd say you need to register a PSR-15 middleware; then you can access the site configuration. More details can be found in the documentation.
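A hedged sketch of that idea, assuming TYPO3 v9+, where the resolved site is available as the 'site' attribute of the PSR-7 request. The class name and the site identifier 'my-site' are illustrative, and the middleware still has to be registered for the frontend stack in the extension's Configuration/RequestMiddlewares.php:

<?php
declare(strict_types=1);

namespace SNM\StmwiAccessibility\Middleware;

use Psr\Http\Message\ResponseInterface;
use Psr\Http\Message\ServerRequestInterface;
use Psr\Http\Server\MiddlewareInterface;
use Psr\Http\Server\RequestHandlerInterface;
use TYPO3\CMS\Core\Site\Entity\Site;

// Registers the typoLink hook only when the request belongs to one site.
final class SiteSpecificHookRegistration implements MiddlewareInterface
{
    public function process(ServerRequestInterface $request, RequestHandlerInterface $handler): ResponseInterface
    {
        $site = $request->getAttribute('site'); // set by TYPO3's site resolver middleware
        if ($site instanceof Site && $site->getIdentifier() === 'my-site') {
            $GLOBALS['TYPO3_CONF_VARS']['SC_OPTIONS']['tslib/class.tslib_content.php']['typoLink_PostProc']['titleTagsInHiddenText']
                = \SNM\StmwiAccessibility\ExtendTypolink::class . '->convertTitleInHiddenText';
        }
        return $handler->handle($request);
    }
}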

merge large existing web app into Sails.js site

I'm trying to merge a large existing web app into Sails.js, so I moved the folders into assets and built a custom route, 'GET /': '/assets/client/launch.html', but I get a 404 when I point my browser to http://localhost:1337/. The / is correctly redirected to http://localhost:1337/assets/client/launch.html, which produces the 404.
Now the file exists in the folder assets/client (and in .tmp), so I am thinking the Sails router is getting in the way.
I would rather leave the client (70K lines of JS that generate all the UI dynamically) and the Sails server that provides authentication separate and enable CORS, but my customer wants the client packaged with the server. This type of operation is simple in frameworks like ASP.NET MVC, but I am wondering if Sails is up to the task.
Well, if everything you tried did not work out, there might be another solution.
First of all, since you are talking about a Sails app, I am assuming the other bundle must be Sails as well.
So here is what you do:
Change the port for the other app that you want to attach to this one.
Then, whenever you want to go to a page on the other app, simply redirect the client to the other port, i.e.
in the HTML (or EJS template) put an anchor tag pointing at the other port:
<a href="localhost:PORT/route_to_file">
</a>
I got it working by placing my app into assets, launching from assets/client/index.html, as there would be too many dependencies to change. As I said above, I could not just add a route, as Sails must be getting in the way. However, as in chapter 3.2.2 of Sails in Action, I generated a static asset with npm install sails-generate-static --save, then redirected to assets/client/index.html. As an aside, that book is great and I would highly recommend it.

Consuming a REST API with Drupal

I am currently maintaining a Drupal 7 site. Although I have already gained some experience with Drupal, my knowledge is still rather low-level and doesn't go beyond installing and configuring modules.
My site is pretty much a metadata repository. People fill in forms about datasets.
My problem is probably rather basic (or maybe not - it's hard to say).
I need to integrate an external file repository. The data repository provides a REST API that should allow users to upload files to that repository from my Drupal site.
After uploading a file, the data repository provides a permanent identifier that I have to save in addition to the other fields.
Now I'm looking for the best way to build a very simple UI that allows users to upload their files to the repository without leaving my Drupal site (e.g. while they are in the process of filling in the fields). I also need the freedom to build the request URLs myself, as the API has many options that need to be declared; for instance, I need to include a token in my URLs so the data repository can identify who is uploading.
I already did some research and found modules like:
https://www.drupal.org/project/wsclient
https://www.drupal.org/project/chr
https://www.drupal.org/project/rest_client
or these topics:
http://drupal.stackexchange.com/questions/42103/how-do-i-consume-rest-as-a-client
https://www.drupal.org/node/1114312
However, I still wasn't able to work out the best strategy for implementing this.
What I need are tips on the best way to do that.
Currently I don't really know what I'm looking for.
I do realise that there are Drupal forums where I could ask a question like this, but I have had far better experiences with Stack Overflow.
I'd appreciate any help.
Thanks
Sorry for answering my own question, but based on the information I got from here:
https://drupal.stackexchange.com/questions/42103/how-do-i-consume-rest-as-a-client
I was able to solve the problem.
I wrote a custom drupal module that communicates with the API.
Drupal 7 comes with its own function for https requests:
https://api.drupal.org/api/drupal/includes%21common.inc/function/drupal_http_request/7.x
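For completeness, a minimal sketch of how that function is called (the URL, endpoint, and token below are illustrative); it handles plain GET/POST requests, but it cannot stream multipart file uploads the way cURL can:

// Illustrative POST with drupal_http_request(); URL and token are placeholders.
$response = drupal_http_request('https://example.org/api/records', array(
  'method' => 'POST',
  'headers' => array('Content-Type' => 'application/x-www-form-urlencoded'),
  'data' => drupal_http_build_query(array('access_token' => 'TOKEN')),
));
if ($response->code == 200) {
  drupal_set_message(t('Request succeeded.'));
}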
However, it wasn't enough for my needs, as I also needed to transfer file data, so I ended up using PHP's cURL extension (https://curl.haxx.se/), which was built for exactly that purpose.
Basically, you install cURL on the server your Drupal installation runs on. Somewhere in your custom module, preferably in a config menu, you can check whether the installation was successful using something like this:
function _is_curl_installed() {
  if (in_array('curl', get_loaded_extensions())) {
    return TRUE;
  }
  else {
    return FALSE;
  }
}

if (_is_curl_installed()) {
  drupal_set_message(t('Curl is installed.'));
}
else {
  drupal_set_message(t('Curl is NOT installed.'), 'error');
}
Drupal should be able to recognise curl automatically after the installation.
And here is one example of a REST call in Drupal using cURL (based on this answer: how to upload file using curl with php). Keep in mind that the old '@' upload syntax is deprecated and Drupal will complain about it; in my example I used new CURLFile instead.
// $deposit_url, $file_realpath, $base_file_name and $temp_token are assumed
// to have been set earlier in the module.
$request = curl_init($deposit_url);
curl_setopt($request, CURLOPT_POST, true);
$cfile = new CURLFile($file_realpath, 'application/octet-stream', $base_file_name);
// The content of the array depends on the API you are communicating with.
curl_setopt($request, CURLOPT_POSTFIELDS, array(
  'file' => $cfile,
  'access_token' => $temp_token,
  'verify' => false,
));
curl_setopt($request, CURLOPT_RETURNTRANSFER, true);
$output = curl_exec($request);
// Output the response:
// drupal_set_message(t('The curl output [1] is ' . $output));
curl_close($request);
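Before the curl_close() call above, you may also want to verify that the transfer actually succeeded; a small hedged sketch:

// Check for transport errors and the HTTP status before closing the handle.
if (curl_errno($request)) {
  drupal_set_message(t('cURL error: @err', array('@err' => curl_error($request))), 'error');
}
$status = curl_getinfo($request, CURLINFO_HTTP_CODE);
if ($status < 200 || $status >= 300) {
  drupal_set_message(t('The repository returned HTTP status @code.', array('@code' => $status)), 'error');
}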
I hope this is helpful to someone.
EDIT: I released a contributed module that solves the described problem:
https://www.drupal.org/project/b2share
Yes, you can implement nice REST API services with Drupal. If you are using Drupal 7, try the "Services" module; that would be a great starting point for your project. Using this module you can create your own endpoints and integrate other applications with those endpoints.

Pow works locally, but serves another site on xip.io

I'm using Pow.cx for local development servers - Rails, PHP and static. It works fine locally, but when I try to use the new xip.io functionality to browse from another device, I get a different localhost site every time.
This particular incorrectly-served site is not set up in Pow, but I have an older virtual host set up for it.
Put another way:
stm.dev serves the correct site on my desktop.
stm.192.168.1.XXX.xip.io on my iPhone serves up a different site that is not configured in Pow.
I haven't been able to find any mention of a similar problem online, has anyone else come across this? This particular site is static html, if it matters.
So far I have been unable to get Pow to automatically pick up the xip.io addresses. However, I did finally get it working well enough that I can continue building the site.
I followed the instructions from this link, http://blogs.adobe.com/shadow/2012/06/19/shadow-xip-io-virtual-hosts-workflow-simplified/, to set up a vhost alias for the site. I believe that cuts Pow out of the loop, but at least it now works for testing on the other devices I need.
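For reference, the workaround amounts to an Apache vhost along these lines (a sketch; the stm name and the DocumentRoot path are from my setup, and the wildcard alias is what matches the xip.io addresses):

<VirtualHost *:80>
    ServerName stm.dev
    # Wildcard alias so stm.<any-LAN-IP>.xip.io resolves to this vhost
    ServerAlias stm.*.xip.io
    DocumentRoot "/path/to/stm"
</VirtualHost>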
I would love to have Pow working as described, so if there are any suggestions on that end I'd love to hear it.

Converting a Brownfield PHP Webapp to Zend Framework

We're thinking of converting our PHP web app from using no framework (which is killing us) to using Zend Framework. Because of the size of the application, I don't think starting from scratch is a viable option for management, so I wanted to research how to slowly convert the current site structure to one using Zend Framework, but there isn't a lot of information on this process.
So far my plan is to dump the current code base into the public/ directory of the Zend application, fix the numerous problems that I'm sure will crop up, and then start rewriting modules one at a time.
Has anyone had experience doing this in the past and how did it work out for you?
I've done a few of these now. What worked best for me was putting ZF 'around' the old app, so all requests go through ZF. I then have a 'Legacy' controller plugin which checks whether the request can be satisfied by ZF and, if not, sends it to the old app:
class Yourapp_Plugin_Legacy extends Zend_Controller_Plugin_Abstract
{
    public function preDispatch(Zend_Controller_Request_Abstract $request)
    {
        $dispatcher = Zend_Controller_Front::getInstance()->getDispatcher();
        if (!$dispatcher->isDispatchable($request)) {
            // send to the old code...
        }
    }
}
Exactly how you then send the request to your old app depends a bit on how it is implemented. In one project, I examined the request, determined which file from the old code the request would have gone to, and then required it. It sounds like this might be appropriate for you. In another project, my solution was to route all these requests to a LegacyController in the ZF project, which ran the old code to get the resulting HTML and then rendered it inside the Zend_Layout from the new project.
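A hedged sketch of the first approach, filling in the body of the plugin above (the legacy/ directory and the path-to-file mapping are assumptions about your layout):

class Yourapp_Plugin_Legacy extends Zend_Controller_Plugin_Abstract
{
    public function preDispatch(Zend_Controller_Request_Abstract $request)
    {
        $dispatcher = Zend_Controller_Front::getInstance()->getDispatcher();
        if (!$dispatcher->isDispatchable($request)) {
            // Map the request path onto a script in the old code base.
            // APPLICATION_PATH and the legacy/ directory are assumptions.
            $script = basename($request->getPathInfo()); // e.g. "products.php"
            $legacyFile = APPLICATION_PATH . '/../legacy/' . $script;
            if (is_file($legacyFile)) {
                require $legacyFile; // hand the request to the old app
                exit;                // stop ZF from dispatching further
            }
        }
    }
}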
The advantages of this approach are that you can gradually introduce ZF modules as you rewrite parts of the old app, until you reach the point where 100% of requests can be served by ZF. Also, since the ZF project has initialized before your old code is run, your old code can use the ZF autoloader, so you can start replacing classes in the old code with models written in a more ZF-style, and have them used by both parts of the app.