I am using Zend Framework 1 on a Linux Apache web server and have to put the application in a sub-folder named "go". I put a .htaccess file with these lines in it (inside the go folder):
# Rewrite rules for Zend Framework
RewriteEngine on
RewriteRule ^(.*)$ /public/$1 [QSA,L]
and what I get after browsing to domain.com/go is:
Not Found
The requested URL /index.php was not found on this server.
Additionally, a 404 Not Found error was encountered while trying to
use an ErrorDocument to handle the request.
What am I doing wrong?
After changing "/public" to "public" I still get the "Not Found" error.
I now use a subdomain (go.domain.com) for my app, but I want to know why my first method didn't work. Doesn't .htaccess work in a subdirectory?
You need to use RewriteBase in order for this to work:
RewriteBase /go
and still put the .htaccess in the public folder, not anywhere else.
http://httpd.apache.org/docs/current/mod/mod_rewrite.html#rewritebase
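For reference, a minimal sketch of what that public/.htaccess could look like; the RewriteCond/RewriteRule lines are the stock ZF1 ones, and the /go base path is an assumption taken from the question:
RewriteEngine On
RewriteBase /go
# serve existing files, symlinks and directories (css, js, images) directly
RewriteCond %{REQUEST_FILENAME} -s [OR]
RewriteCond %{REQUEST_FILENAME} -l [OR]
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^.*$ - [NC,L]
# everything else goes to the ZF1 front controller
RewriteRule ^.*$ index.php [NC,L]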
The following solution worked for me:
First, place a .htaccess file in the /go directory with the following content:
RewriteEngine on
RewriteRule ^(.*)$ public/$1 [NC,L]
This will rewrite every request made under the /go URL into the public directory.
Next, make sure you don't have any RewriteBase in your /go/public/.htaccess file unless needed.
Finally, create a controller plugin and put the following code in its routeStartup() method:
Zend_Controller_Front::getInstance()
    ->getRequest()
    ->setBaseUrl('/go'); // the sub-folder the app is served from
This last step makes it easy to add resources (css, js, images) to your HTML using just the $this->baseUrl() view helper; you won't have to prepend /go to each file path.
My client is changing his domain, and I'm curious about the best way to make all the existing links on other sites work, so that:
www.olddomain.com/page1.html
www.olddomain.com/other-section/page2.html
redirect automatically to:
www.newdomain.com/page1.html
www.newdomain.com/other-section/page2.html
Is there a way to set it up so that all the old links map to the exact same structure, just with the new domain?
Thanks!
To the best of my knowledge you would have to keep the old domain and redirect from there with 301 redirects in an .htaccess file or something similar.
An example .htaccess file might look like this:
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} !newdomain.com$ [NC]
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [L,R=301]
Or:
RewriteEngine On
RewriteBase /
Redirect 301 /dir/ http://www.newdomain.com/dir/
Redirect 301 /dir/file.html http://www.newdomain.com/dir/file.html
I suggest you read up on 301 Redirects and .htaccess files. This .htaccess generator might also be of use to you.
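If the directory structure is identical on both domains, a shorter variant is also worth knowing (a sketch relying on mod_alias prefix matching, which appends whatever follows the matched prefix to the target URL):
Redirect 301 / http://www.newdomain.com/
With that single line in the old domain's .htaccess or virtual host, www.olddomain.com/other-section/page2.html is redirected to the same path on www.newdomain.com.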
I want to be able to run my Mojolicious::Lite app on shared hosting either from the root (www.domain.com/) or from a subfolder (www.domain.com/misc/mymojoapp/).
The app's .pl file always goes into the domain's cgi-bin folder (www.domain.com/cgi-bin/myapp.pl), and I want to use mod_rewrite rules in .htaccess to point to the app. Images/css/js files would be under www.domain.com/misc/mymojoapp/support.
But I cannot figure out how to reliably get the misc/mymojoapp/ part of the path so I can pass it into templates. Is there a way?
# set apache handler to treat your specified script name(s) as a CGI program
Options +ExecCGI
<Files ~ "(mymojoapp)$">
SetHandler cgi-script
</Files>
# rewrite any requests into the app
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)$ misc/mymojoapp/$1 [L]
and in your app:
# set env variable to use root for pretty URLs
$ENV{SCRIPT_NAME} = '/';
Change the above setting to get pretty URLs.
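To get the misc/mymojoapp/ prefix into the app itself, one possible approach (a sketch; MOJO_APP_BASE is just an illustrative name, not anything Mojolicious defines) is to export it from the .htaccess, since mod_env passes SetEnv variables through to CGI programs:
# tell the app where it is mounted
SetEnv MOJO_APP_BASE /misc/mymojoapp
The script can then read $ENV{MOJO_APP_BASE} and hand it to the templates, for example as a stash value.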
I'm using AMPPS. I can't access my Perl file, but I can access every other file in the folder.
I'm getting a 403 Forbidden.
Here's the error from the Apache log:
[Fri Jan 24 01:18:10.069985 2014] [cgi:error] [pid 38441] [client ::1:51823] Options
ExecCGI is off in this directory: /Applications/AMPPS/www/test/test1.pl, referer:
http://localhost/test/test1.html
I'm calling the Perl file via an AJAX POST request.
Things I have tried:
Adding +ExecCGI to every <Directory> block I can find
Adding AddHandler cgi-script .cgi .pl to the aforementioned blocks
Setting 777 permissions recursively on the containing folder, along with the cgi-bin directory set by ScriptAlias in httpd.conf.
Can anyone give me some insight into either:
What I'm doing wrong
How to configure the httpd.conf standard config file to run CGI scripts from the /www folder using AMPPS
Thanks in advance
Everything looks fine. Just make sure that your httpd.conf is included by the main Apache configuration file, and recheck the directory path where you are setting the +ExecCGI and AddHandler options.
I am not sure about AMPPS, but I expect the Apache settings will be the same.
Looking at the log, it looks like your directory path should be /Applications/AMPPS/www/.
And make sure you restart the server after each change you make.
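For the second part of the question, a minimal sketch of the kind of httpd.conf block that enables CGI for AMPPS's /www folder (the path is taken from the log above; the Require line assumes Apache 2.4, which the log format suggests):
<Directory "/Applications/AMPPS/www">
    # allow CGI execution and register .pl/.cgi as CGI scripts
    Options +ExecCGI
    AddHandler cgi-script .cgi .pl
    AllowOverride All
    Require all granted
</Directory>
The script itself must also be executable and mod_cgi (or mod_cgid) must be loaded; restart Apache after the change.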
Disclaimer: I know this question sounds lame, but I am no n00b. I have tried everything I know of and everything I could find help on. I have already searched the forum and tried all the fixes given, but none of them helped me, hence this question.
The threads I have visited:
https://askubuntu.com/questions/147348/bugzilla-testserver-pl-failing
http://www.thesitewizard.com/archive/addcgitoapache.shtml
Bugzilla error after installation: "TEST-FAILED Web Server is not executing CGI files"
Now, with that out of the way, my exact problem:
I have installed Bugzilla on a Bitnami LAMP stack. The stack already has two other applications up and running successfully. After my Bugzilla installation, when I try to visit the page I can see my whole Perl script in the browser.
Running its own server check reveals the following:
TEST-OK Webserver is running under group id in $webservergroup.
TEST-OK Got padlock picture.
TEST-FAILED Webserver is fetching rather than executing CGI files.
What I have done in my setup
The bugzilla.conf file (which gets pulled into httpd.conf) has the following settings enabled:
AddHandler cgi-script .cgi .pl
Options +MultiViews +ExecCGI
DirectoryIndex index.cgi
AllowOverride All
The "AddHandler cgi-script .cgi .pl" is already enabled in my httpd.conf file.
I have not enabled separately +ExecCGI for all directories in httpd.conf but even that does not solve the problem
What am I doing wrong here?
You should have a directory block in your bugzilla.conf that looks something like this:
<Directory "/usr/local/apache2/htdocs/bugzilla">
AddHandler cgi-script .cgi
Options +ExecCGI +FollowSymLinks
DirectoryIndex index.cgi index.html
AllowOverride Limit FileInfo Indexes Options
AddType application/vnd.mozilla.xul+xml .xul
AddType application/rdf+xml .rdf
</Directory>
I believe you don't want .pl files to be handled as CGI. Having All for AllowOverride is a security issue. The FollowSymLinks option is there because my bugzilla directory in htdocs is a symlink to somewhere else on the system.
Did you run the checksetup.pl? It should have adjusted all the permissions for you, but check to see that the group that your web server runs as has read and execute permissions.
I had a similar problem with another package: the server was sending the CGI files for download instead of executing them. The answer to my problem was that on Ubuntu Server 14.04, the CGI module in Apache was disabled.
To fix:
sudo a2enmod cgi
sudo service apache2 restart
To check if the module is loaded, on Ubuntu:
apache2ctl -M | grep cgi
cgi_module (shared)
yellavon's answer solved my problem:
cgi script is not executing
Here's a copy of his answer:
Make sure you are loading the CGI module in the httpd.conf file:
LoadModule cgi_module modules/mod_cgi.so
or
LoadModule cgid_module modules/mod_cgid.so
depending on which Apache MPM you are running (mod_cgi for prefork, mod_cgid for threaded MPMs).
You can also read about additional solutions for Dynamic Content with CGI.
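As a sketch (mirroring the pattern in a stock Apache 2.4 httpd.conf; module file paths may differ on your distribution), the load can be made MPM-aware so the right variant is picked automatically:
<IfModule !mpm_prefork_module>
    # threaded MPMs (event/worker) use the daemon-based variant
    LoadModule cgid_module modules/mod_cgid.so
</IfModule>
<IfModule mpm_prefork_module>
    LoadModule cgi_module modules/mod_cgi.so
</IfModule>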
How do I disable mod_deflate for PHP using the .htaccess file:
for files in a specific directory?
for all files that have a given extension, for example .php?
I have tried both:
# for URL paths that begin with "/foo/bar/"
SetEnvIf Request_URI ^/foo/bar/ no-gzip=1
# for files that end with ".php"
<FilesMatch \.php$>
SetEnv no-gzip 1
</FilesMatch>
Neither works, and I can't figure out why. At this point I want to disable it completely for all files in the directory.
Check this out, and reply if it works for you:
SetEnvIfNoCase Request_URI ".php$" no-gzip dont-vary
Let's simplify the issue and slowly make it more complicated as you get each piece working.
You stated in a comment: "When I try to access a specific php file on the server FF tells me that there is a compression problem."
First, see if the issue still happens when you disable mod_deflate entirely: comment out the deflate lines in your Apache configuration file(s).
If that indeed fixes it, re-enable mod_deflate, then add this to the virtual host that serves your site (don't mess around with .htaccess yet; that just adds another level of complexity):
SetEnvIfNoCase Request_URI "\.(php)$" no-gzip dont-vary
Try PHP files and see if they are still compressed.
If this works, then add the other condition for the directory:
SetEnvIfNoCase Request_URI "^/foo/bar/" no-gzip dont-vary
Finally, put this all in your .htaccess file and ta-dah!
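Putting the pieces together, the finished .htaccess could look something like this sketch (wrapped in an IfModule guard so it is ignored where mod_setenvif is unavailable; /foo/bar/ stands in for your directory):
<IfModule mod_setenvif.c>
    # no compression for PHP responses
    SetEnvIfNoCase Request_URI "\.php$" no-gzip dont-vary
    # no compression for anything under /foo/bar/
    SetEnvIfNoCase Request_URI "^/foo/bar/" no-gzip dont-vary
</IfModule>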