How to disable mod_deflate for PHP using the .htaccess file,
either for files in a specific directory,
or for all files with a given extension, for example .php?
I have tried both:
# for URL paths that begin with "/foo/bar/"
SetEnvIf Request_URI ^/foo/bar/ no-gzip=1
# for files that end with ".php"
<FilesMatch \.php$>
SetEnv no-gzip 1
</FilesMatch>
Neither works, and I can't figure out why. At this point I want to disable compression completely for all files in the directory.
Check this out and reply if it works for you:
SetEnvIfNoCase Request_URI ".php$" no-gzip dont-vary
Let's simplify the issue and slowly make it more complicated as you get each piece working.
You stated in a comment: "When I try to access a specific PHP file on the server, FF tells me that there is a compression problem."
First, see whether the issue goes away when you disable mod_deflate entirely. Comment out the deflate lines in your Apache configuration file(s).
If that indeed fixes it, re-enable mod_deflate, then add this to the virtual host that is serving your site (don't mess around with .htaccess yet; that just adds another level of complexity):
SetEnvIfNoCase Request_URI "\.(php)$" no-gzip dont-vary
Try PHP files and see if they are still compressed.
If this works, then add the other condition:
SetEnvIfNoCase Request_URI "^/foo/bar/.*" !no-gzip !dont-vary
Finally, put this all in your .htaccess file, and ta-dah!
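For reference, here is a rough sketch of what the finished .htaccess could look like once each step above checks out. It assumes mod_setenvif and mod_deflate are loaded and that AllowOverride permits these directives; /foo/bar/ is just the example path from the question:
# keep mod_deflate away from PHP responses
SetEnvIfNoCase Request_URI "\.(php)$" no-gzip dont-vary
# but clear the flags again for anything under /foo/bar/
SetEnvIfNoCase Request_URI "^/foo/bar/.*" !no-gzip !dont-vary
# or, to turn compression off for every request this directory serves,
# a single unconditional rule also works:
# SetEnvIf Request_URI ".*" no-gzip dont-vary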
I want to be able to run my Mojolicious Lite app on shared hosting, either from the root (www.domain.com/) or from a subfolder (www.domain.com/misc/mymojoapp/).
The app's .pl file always goes into the domain's cgi-bin folder (www.domain.com/cgi-bin/myapp.pl), and I want to use mod_rewrite rules in .htaccess to point to the app. Images/CSS/JS files would live under www.domain.com/misc/mymojoapp/support.
But I cannot figure out how to reliably get the misc/mymojoapp/ part of the path so I can pass it into templates. Is there a way?
# set apache handler to treat your specified script name(s) as a CGI program
Options +ExecCGI
<Files ~ "(mymojoapp)$">
SetHandler cgi-script
</Files>
# rewrite any requests into the app
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)$ misc/mymojoapp/$1 [L]
And in your app:
# set env variable to use root for pretty URLs
$ENV{SCRIPT_NAME} = '/';
Adjust the setting above to control the base used for your pretty URLs.
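If you would rather not hard-code the prefix in the app, one possible variation (just a sketch; the variable name MYAPP_BASE is made up here) is to hand the sub-folder path to the script from the same .htaccess, since mod_env variables are passed through to CGI scripts and show up in Perl's %ENV:
# pass the deployment prefix to the app; read it as $ENV{MYAPP_BASE}
# inside the script and use it when building links in templates
SetEnv MYAPP_BASE /misc/mymojoapp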
I'm using AMPPS and I can't access my Perl file, but I can access every other file in the folder.
I'm getting a 403 forbidden.
Here's the error from the Apache log:
[Fri Jan 24 01:18:10.069985 2014] [cgi:error] [pid 38441] [client ::1:51823] Options
ExecCGI is off in this directory: /Applications/AMPPS/www/test/test1.pl, referer:
http://localhost/test/test1.html
I'm calling the perl file via an AJAX POST call
Things I have tried:
Adding +ExecCGI to every <Directory> block I can find
Adding AddHandler cgi-script .cgi .pl to the aforementioned blocks
Setting 777 permissions recursively on the containing folder, along with the cgi-bin directory set by ScriptAlias in httpd.conf.
Is there anyone that can give me some insight as to either:
What I'm doing wrong
How to configure the httpd.conf standard config file to run CGI scripts from the /www folder using AMPPS
Thanks in advance
Everything looks fine. Just make sure that your httpd.conf is included from the apache2.conf file, and recheck the directory path where you are setting your +ExecCGI and AddHandler options.
I am not sure about AMPPS, but I assume the Apache server settings will be the same.
Looking at the log, it looks like your directory path should be /Applications/AMPPS/www/.
And make sure you restart the server after each change you make.
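As a concrete sketch (using the path from the error log above; adjust it to your own layout and confirm no other block overrides it), the httpd.conf block you are looking for would be something like:
# allow CGI execution for scripts under the AMPPS document root
<Directory "/Applications/AMPPS/www">
Options +ExecCGI
AddHandler cgi-script .cgi .pl
</Directory>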
I have been trying to run a simple Perl CGI script on Windows 7. It is a simple HTML form with an OK button; clicking the OK button should display some text. But when I click the OK button on the HTML page, instead of executing the Perl file and displaying its output, the browser starts downloading the script. I have added a handler in httpd.conf:
AddHandler cgi-script .pl
But this doesn't help. I added the ExecCGI option in the httpd.conf but that didn't help either.
<Directory "C:/Program Files/Apache Software Foundation/Apache2.2/cgi-bin">
AllowOverride None
Options ExecCGI -MultiViews +SymLinksIfOwnerMatch
Order allow,deny
Allow from all
</Directory>
Here is the perl script being used:
#!C:\Perl\bin\perl
use CGI;
print "Content-type: text/plain","\n\n";
print "<html>","\n";
print "<head>\n\t<title>Server Information Ready</title>\n</head>","\n";
print "<body>","\n";
print "<h1>Server Information</h1>","\n";
print "</body></html>","\n";
And here is the html file:
<html>
<head><title>Server Information</title></head>
<body bgcolor="#EEAA55">
<h3>Please click OK to get the server information</h3>
<hr><pre>
<form action="http://localhost/cgi-bin/ctest/pranav1a.pl" method="post">
<input type="submit" value="OK">
</form>
</pre>
</body>
</html>
I have tried this on Chrome, IE and Mozilla. Mozilla and Chrome start downloading the Perl file, but IE just displays some weird content when I click the OK button. How can I make the browser display the output of the script's execution rather than starting a download of the script?
PS: I have tried using the shebang line '#!c:/Perl/bin/perl', which doesn't seem to work either. I am able to see the Perl script's output when it is executed from the cmd prompt.
Found the solution: in my case I was using 'localhost' in form action instead of 'localhost:8080'.
<form action="http://localhost:8080/cgi-bin/pranav1a.pl" method="post">
How about the +ExecCGI option? Try the + in front of it.
In addition, there is usually a problem with understanding suexec2 (not sure whether suexec2 applies to a Windows platform):
Read the whole page over there. Such problems are not fixable when you do not understand the gross limitations enforced by suexec. Common errors are:
wrong permissions on the suexec2 executable.
the CGI script is in the wrong location.
Have you worked through apache.org/docs/2.2/howto/cgi.html#troubleshoot yet?
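Concretely, the Options line from the <Directory> block in the question would become the line below (mixing plain and +/- prefixed options in one Options directive is invalid and behaves unpredictably, which is why the leading + matters):
# every option is now relative, so the directive parses and ExecCGI is enabled
Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch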
Thanks Scavokovich.
In my case the following line was commented out in the httpd.conf file.
#LoadModule cgi_module /opt/freeware/lib64/httpd/modules/mod_cgi.so
Uncommenting it and restarting Apache allowed the CGI script to run instead of opening as a text file.
Disclaimer: I know this question sounds lame, but I am no n00b and I have done everything I know of and could find help on for this. I have already searched the forum and tried all the fixes given, but none of them helped me, hence this question.
The threads I have visited
https://askubuntu.com/questions/147348/bugzilla-testserver-pl-failing
http://www.thesitewizard.com/archive/addcgitoapache.shtml
Bugzilla error after installation: "TEST-FAILED Web Server is not executing CGI files"
Now, with that out of the way:
My exact problem
I have installed Bugzilla on a Bitnami LAMP stack. The LAMP stack already has two other applications up and running successfully. After the Bugzilla installation, when I try to visit the page, I see my whole Perl script in the browser.
Running its own server check reveals the following:
TEST-OK Webserver is running under group id in $webservergroup.
TEST-OK Got padlock picture.
TEST-FAILED Webserver is fetching rather than executing CGI files.
What I have done in my setup
The bugzilla.conf file (which gets pulled into httpd.conf) has the following settings enabled:
AddHandler cgi-script .cgi .pl
Options +MultiViews +ExecCGI
DirectoryIndex index.cgi
AllowOverride All
The "AddHandler cgi-script .cgi .pl" is already enabled in my httpd.conf file.
I have also tried enabling +ExecCGI separately for all directories in httpd.conf, but even that does not solve the problem.
What am I doing wrong here?
You should have a directory block in your bugzilla.conf that looks something like this:
<Directory "/usr/local/apache2/htdocs/bugzilla">
AddHandler cgi-script .cgi
Options +ExecCGI +FollowSymLinks
DirectoryIndex index.cgi index.html
AllowOverride Limit FileInfo Indexes Options
AddType application/vnd.mozilla.xul+xml .xul
AddType application/rdf+xml .rdf
</Directory>
I believe you don't want the .pl to be 'handled'. And having All for AllowOverride is a security issue. The FollowSymLinks one is because my bugzilla directory in htdocs is a symlink to somewhere else on the system.
Did you run the checksetup.pl? It should have adjusted all the permissions for you, but check to see that the group that your web server runs as has read and execute permissions.
I had a similar problem with another package: the CGI files were being downloaded instead of executed on the server. The answer to my problem was that on Ubuntu Server 14.04, the CGI module in Apache was disabled.
To fix:
sudo a2enmod cgi
sudo service apache2 restart
To check if the module is loaded, on Ubuntu:
apache2ctl -M | grep cgi
cgi_module (shared)
yellavon's answer solved my problem:
cgi script is not executing
Here's a copy of his answer:
Make sure you are loading the CGI module in the httpd.conf file:
LoadModule cgi_module modules/mod_cgi.so or LoadModule cgi_module modules/mod_cgid.so depending on which version of Apache you are running.
You can also read about additional solutions for Dynamic Content with CGI.
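For what it's worth, which of those two lines applies depends on the MPM rather than the Apache version, so stock configs often guard them with <IfModule>. A sketch (module file paths vary by distribution; note that mod_cgid registers itself as cgid_module):
# prefork MPM uses mod_cgi; threaded MPMs (worker/event) use mod_cgid
<IfModule mpm_prefork_module>
LoadModule cgi_module modules/mod_cgi.so
</IfModule>
<IfModule !mpm_prefork_module>
LoadModule cgid_module modules/mod_cgid.so
</IfModule>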
So basically I have a problem where the user will send a request to test.php?getnextfile=1, and I need to process the request, figure out what the next file in line to be downloaded is, and deliver it to the user. The part I'm stuck on is how to get the correct filename to the user (the server knows the correct filename, the user doesn't).
Currently I've tried using wget on test.php?getnextfile=1, and it doesn't actually save with the correct filename. I also tried a header redirect to the correct file, and that doesn't work either.
Any ideas?
Thanks a lot!
Jason
Since July 2010, this has been impossible in the default wget configuration. In the process of fixing a security bug, they switched the "trust-server-names" option off by default. For more information, see this answer:
https://serverfault.com/questions/183959/how-to-tell-wget-to-use-the-name-of-the-target-file-behind-the-http-redirect
In your php script you need to set the 'Content-Disposition' header:
<?php
header('Content-type: application/zip');
header('Content-Disposition: attachment; filename="myfile.zip"');
readfile('myfile.zip');
?>
Use curl instead of wget to test it.
curl --remote-name --remote-header-name http://127.0.0.1:8080/download