I get this error when trying to access localhost via a browser.
AH01630: client denied by server configuration
I have already set my site folder permissions wide open using:
sudo chmod 777 -R *
Here is my configuration file:
<VirtualHost *:80>
ServerAdmin webmaster@localhost
DocumentRoot /home/user-name/www/myproject
<Directory />
Options FollowSymLinks
AllowOverride all
Allow from all
</Directory>
<Location />
Allow from all
Order Deny,Allow
</Location>
<Directory /home/user-name/www/myproject/>
Options Indexes FollowSymLinks MultiViews
AllowOverride all
Order allow,deny
Allow from all
</Directory>
ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
<Directory "/usr/lib/cgi-bin">
AllowOverride all
Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
Order allow,deny
Allow from all
</Directory>
ErrorLog ${APACHE_LOG_DIR}/error.log
# Possible values include: debug, info, notice, warn, error, crit,
# alert, emerg.
LogLevel warn
CustomLog ${APACHE_LOG_DIR}/access.log combined
Alias /doc/ "/usr/share/doc/"
<Directory "/usr/share/doc/">
Options Indexes MultiViews FollowSymLinks
AllowOverride all
Order deny,allow
Deny from all
Allow from 127.0.0.0/255.0.0.0 ::1/128
</Directory>
</VirtualHost>
If you are using Apache 2.4
You have to check the allow and deny rules.
Check out http://httpd.apache.org/docs/2.4/upgrading.html#access
In 2.2, access control based on client hostname, IP address, and other
characteristics of client requests was done using the directives
Order, Allow, Deny, and Satisfy.
In 2.4, such access control is done in the same way as other
authorization checks, using the new module mod_authz_host.
The new directive is Require:
2.2 configuration:
Order allow,deny
Allow from all
2.4 configuration:
Require all granted
Also don't forget to restart the Apache server after these changes (e.g. # service httpd restart)
For all directories write Require all granted instead of Allow from all
Update
If the above doesn't work, then also remove the following line:
Order allow,deny
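For example, the <Directory> block for the project in the question could be rewritten for Apache 2.4 roughly like this (paths taken from the question; treat this as a sketch, not a drop-in config):
<Directory /home/user-name/www/myproject/>
# same options as before
Options Indexes FollowSymLinks MultiViews
AllowOverride all
# 2.4 replacement for "Order allow,deny" + "Allow from all"
Require all granted
</Directory>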
Double check that the DocumentRoot path is correct. That can cause this error.
I made the same changes that ravisorg suggested on OS X 10.10 Yosemite, which upgrades Apache to version 2.4. Below are the changes that were added to httpd.conf.
<Directory />
AllowOverride none
Require all denied
</Directory>
<Directory /Volumes/Data/Data/USER/Sites/>
AllowOverride none
Require all granted
</Directory>
The problem is probably in the VirtualHost: it is most likely missing
Require all granted
Confirm your config is correct.
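A minimal correct sample might look like this (a sketch with hypothetical paths and server name; adjust to your own DocumentRoot):
<VirtualHost *:80>
ServerName example.local
DocumentRoot /var/www/example
<Directory /var/www/example>
AllowOverride All
# the important part on Apache 2.4
Require all granted
</Directory>
</VirtualHost>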
This drove me absolutely nuts for a day and a half, but I found a solution to try if all other solutions have failed.
This is for macOS.
Go to Activity Monitor (Spotlight search for: activity).
In Activity Monitor, search for httpd, which is the Apache process.
Select the one that belongs to root and click X on the top left to close it.
At that point I immediately stopped getting 403 errors and everything started working as expected. The weird thing is I didn't even have to restart Apache; it just worked. I guess it restarted itself when I went to my localhost. I honestly don't know, but I suspect the problem was Apache not actually restarting when using apachectl restart, stop, or start. Hope this helps someone.
For me, none of the proposed solutions worked. This can help: if you use CGI, FastCGI, or FPM as a proxy, you have to add a Location block in your vhost to avoid this problem. This allows 404s to be passed through the proxy.
<Location />
require all granted
</Location>
In my case, I'm using macOS Mojave (Apache/2.4.34). There was an issue in the virtual host settings in the /etc/apache2/extra/httpd-vhosts.conf file. After adding the required <Directory> tag containing the directive below, my problem was gone:
Require all granted
Hopefully the full virtual host setup below will save you:
<VirtualHost *:80>
DocumentRoot "/Users/vagabond/Sites/MainProjectFolderName/public/"
ServerName project.loc
<Directory /Users/vagabond/Sites/MainProjectFolderName/public/>
Require all granted
</Directory>
ErrorLog "/Users/vagabond/Sites/logs/MainProjectFolderName.loc-error_log"
CustomLog "/Users/vagabond/Sites/logs/MainProjectFolderName.loc-access_log" common
</VirtualHost>
All you have to do is replace MainProjectFolderName with your exact project folder name.
I resolved this myself after spending a couple of hours.
I installed Apache/2.4.7 (Ubuntu) through a cookbook in a Vagrant VM.
The /etc/apache2/apache2.conf file does not have a <VirtualHost *:80> element by default.
I made two changes to get it working:
added <VirtualHost *:80>
added
Options Indexes FollowSymLinks
AllowOverride all
Allow from all
Then I just rebooted the VM.
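Put together, the additions look roughly like the sketch below (the DocumentRoot is hypothetical; note that on a plain 2.4 setup, Require all granted is the modern equivalent of Allow from all):
<VirtualHost *:80>
DocumentRoot /var/www/html
<Directory /var/www/html>
Options Indexes FollowSymLinks
AllowOverride all
# "Allow from all" only works with mod_access_compat loaded; on plain 2.4 use:
Require all granted
</Directory>
</VirtualHost>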
Has anyone thought about the fact that WampServer by default does not include the httpd-vhosts.conf file?
My approach is to remove the comment marker so that the lines below are active
# Virtual hosts
Include conf/extra/httpd-vhosts.conf
in the httpd.conf file.
That is all.
If you tail the error log and reload the page, you should see some more information as to the exact problem.
Grab the environment variables so ${APACHE_LOG_DIR} will actually work...
source /etc/apache2/envvars
Then tail and watch...
tail -f ${APACHE_LOG_DIR}/error.log
This was driving me crazy. Finally figured out what the problem was:
I was using direct paths for the error log and they were wrong.
Why does Apache give such a vague (and wrong) error message? It could instead give a correct and useful one, like: Path for ErrorLog directive "/wrong/path/and/filename.log" is invalid.
Anyway, to fix it, make sure your log directives look something like this:
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
If you are using Apache 2.4 in WampServer on Windows:
You need to open the httpd-vhosts.conf file in Notepad.
C:\wamp64\bin\apache\apache2.4.37\conf\extra\httpd-vhosts.conf
(The Apache version number in the path may differ in your installation.)
<VirtualHost *:80>
ServerName localhost
DocumentRoot c:/wamp64/www
<Directory "c:/wamp64/www/">
Options Indexes FollowSymLinks MultiViews
AllowOverride All
Require local
</Directory>
</VirtualHost>
In the above code, replace
Require local
with
Require all granted
Save it, then restart the Apache service and try again.
For me, I had actually updated the Allow and Deny rules based on the 2.4 standard.
Require all granted
However, I was still receiving the same AH01630 error. I found another thread that suggested reinstalling apache2. Somehow this worked! If anyone cares to explain why, that would be helpful.
Credit to: AH01630: client denied by server configuration but require all granted is set (Apache 2.4, CentOs)
...
<IfVersion < 2.4>
Order allow,deny
Allow from all
</IfVersion>
<IfVersion >= 2.4>
Require all granted
</IfVersion>
...
If you have an HTTPS host, don't forget to make the Require all granted changes in the SSL config too.
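For example, the SSL virtual host needs the same directive (a sketch; paths and certificate settings are placeholders for whatever your existing SSL config uses):
<VirtualHost *:443>
DocumentRoot /var/www/example
<Directory /var/www/example>
Require all granted
</Directory>
# SSLEngine, SSLCertificateFile, etc. stay as in your existing SSL config
</VirtualHost>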
Also, sometimes it's useful to check permissions as the apache user:
# ps -eFH | grep http # get the username used by httpd
...
apache 18837 2692 0 119996 9328 9 10:33 ? 00:00:00 /usr/sbin/httpd -DFOREGROUND
# su -s /bin/bash apache # switch to that user
bash-4.2$ whoami
apache
bash-4.2$ cd /home
bash-4.2$ ls
bash-4.2$ cd mysite.com
bash-4.2$ ls
bash-4.2$ cat file-which-does-not-work.txt
When using Ubuntu, check whether the CGI module is enabled. If not:
sudo a2enmod cgi
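With the module enabled, a cgi-bin block also needs 2.4-style access. A sketch based on the paths from the question:
ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
<Directory "/usr/lib/cgi-bin">
AllowOverride None
Options +ExecCGI
# 2.4 replacement for the Order/Allow pair
Require all granted
</Directory>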
Ensure that any user-specific configs are included!
If none of the other answers on this page work for you, here's what I ran into after hours of floundering around.
I used user-specific configurations, with Sites specified as my UserDir in /private/etc/apache2/extra/httpd-userdir.conf. However, I was forbidden access to the endpoint http://localhost/~jwork/.
I could see in /var/log/apache2/error_log that access to /Users/jwork/Sites/ was being blocked. However, I was permitted to access the DocumentRoot via http://localhost/. This suggested that I didn't have rights to view files for the ~jwork user. But as far as I could tell from ps aux | egrep '(apache|httpd)' and lsof -i :80, Apache was running for the jwork user, so something was clearly not right with my user configuration.
Given a user named jwork, here was my config file:
/private/etc/apache2/users/jwork.conf
<Directory "/Users/jwork/Sites/">
Require all granted
</Directory>
This config is perfectly valid. However, I found that my user config wasn't being included:
/private/etc/apache2/extra/httpd-userdir.conf
## Note how it's commented out by default.
## Just remove the comment to enable your user conf.
#Include /private/etc/apache2/users/*.conf
Note that this is the default path to the userdir conf file, but as you'll see below, it's configurable in httpd.conf. Ensure that the following lines are enabled:
/private/etc/apache2/httpd.conf
Include /private/etc/apache2/extra/httpd-userdir.conf
# ...
LoadModule userdir_module libexec/apache2/mod_userdir.so
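(For reference, the UserDir directive referred to above usually looks like this in httpd-userdir.conf on macOS, though the exact file contents may vary:)
UserDir Sites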
For those who are stuck at this error like me and for whom nothing above helped: check whether the problem folder reported in error.log actually exists on your server. Mine was generated automatically by Django in the wrong place (I had messed up the static root, then ran manage.py collectstatic). I have no idea why the error can't be named more accurately.
I actually solved this one by adding the directory access to the :80 entry.
<Directory "c:/whatever-directory-you-use/">
AllowOverride All
Require all granted
</Directory>
Before everyone gets all 'security' on me, under my specific circumstances this is not a security issue.
If you are using a remote resource, I'd recommend instead making sure your cURL request goes via HTTPS/TLS; then this directory entry goes on the :443 virtual host.
I ran into this problem when setting up virtual hosts on a new server, and having the sites hosted outside of the standard /var/www/html path. There are two main things that you need to consider (and for example purposes, let's say you are setting up a site in /opt/my-site)
On a default 2.4 installation, there is an explicit denial of access to the root filesystem, and then it only gives access to /var/www/html. So you'll need to give the web server access to read from the new directory with something like this:
<Directory /opt/my-site>
Require all granted
</Directory>
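(For reference, the stock 2.4 configuration denies the filesystem root with a block like the one below, which is why the explicit grant above is needed; the exact wording varies by distribution.)
<Directory />
AllowOverride none
Require all denied
</Directory>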
Once you've done that, then make sure the DocumentRoot for your virtual host is somewhere under that path you've granted access to above.
<VirtualHost *:80>
ServerName neatwebsite.com
DocumentRoot /opt/my-site/html
</VirtualHost>
When I ran into this problem, I had all the VirtualHosts set up fine but forgot to allow reads from the directory structure (I was missing the Directory directive granting access). You'll need to consider both of these directives and how they work together.
For Wamp 3 (Apache 2.4), besides putting the server online as described in the other answers, you might need to edit the virtual hosts file conf/extra/httpd-vhosts.conf and replace
Require local
with
Require all granted
This is applicable if in httpd.conf you have
Include conf/extra/httpd-vhosts.conf
Besides the missing Order and Allow directives mentioned in other answers, be aware that a non-matching regular expression in a DirectoryMatch directive may also cause this error.
If the requested path is /home/user-foo1bar/www/myproject/, the following matcher won't match:
<DirectoryMatch "/home/user-[a-z]+/www/myproject/">
...
</DirectoryMatch>
Thus, even an otherwise valid access configuration might cause this error.
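In the example above, user-foo1bar contains a digit, so a pattern that also allows digits would match (a sketch; widen the character class to whatever your user names can contain):
<DirectoryMatch "/home/user-[a-z0-9]+/www/myproject/">
# access directives as before
</DirectoryMatch>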
Because this thread is the first thing that pops up when searching for the error mentioned, I would like to add another possible cause: you may have mod_evasive active, and the client seeing this error has simply crossed the limits configured in your mod_evasive.conf.
This is especially a cause worth investigating if you are suddenly getting this error for a client that had no problems before and nothing else has changed.
(If mod_evasive is the cause, the error will go away by itself once the client temporarily stops trying to access the site; however, it may be a sign that the limits you have configured are too tight.)
In case you are like me and tried all the answers above and none helped, remember to check whether your apache2.conf has a line like IncludeOptional conf-enabled/*.conf, and then check each of the .conf files in that folder.
Mine had a security.conf file that included this section (which needed to be removed/modified to not block the .php files I needed to run):
<FilesMatch "^(wp-config\.php|php\.ini|php5\.ini|install\.php|php\.info|readme\.md|README\.md|readme\.html|bb-config\.php|\.htaccess|\.htpasswd|readme\.txt|timthumb\.php|error_log|error\.log|PHP_errors\.log|\.sv$
Require all denied
</FilesMatch>
I don't know where that file came from unless at some point I went down a rabbit hole of trying to secure my apache configuration, found this conf, added it, and then totally forgot about it.
One obscure, yet possible, cause of this (having just dealt with it) is an internal mod_rewrite rule in the main config file (not .htaccess) that rewrites to a path which also exists at the root of the server file system. Say you have a /media directory in your site, and you rewrite something like this:
RewriteRule /some_image.png /media/some_other_location.png
If you have a /media directory at the root of your server, the rewrite will be attempted against that directory (resulting in the access-denied error) rather than the one in your site directory, because mod_rewrite checks the file system root for the existence of the first directory in the path before it checks your site directory.
The problem may be that the Require directive is not inside a <Directory> section.
https://httpd.apache.org/docs/2.4/mod/mod_authz_host.html#requiredirectives
The directive can be referenced within a <Directory>, <Files>, or <Location> section, as well as in .htaccess files, to control access to particular parts of the server. Access can be controlled based on the client hostname or IP address.
I got another one that may be useful to someone.
I was receiving the same error message after upgrading from PHP 5.6 to 7.0. We had changed the PHP upload settings earlier and forgot to carry the change over after the upgrade.
Even though I wasn't uploading images at the time, SilverStripe (our CMS) was refusing to save and throwing that error. I increased the image upload size and it worked straight away.
In case this helps anyone Googling around like I was, I had this error message trying to access a SVG file on my server, e.g. https://example.com/images/file.svg. Other file types seemed fine, just SVG were failing.
I hunted around /etc/httpd conf files and checked every require all denied type of configuration, and just could not find what config was having this effect.
I turned LogLevel to debug in the VirtualHost config and could see the mod_authz_core logging specifying there was a 'Require all denied' in effect:
[Mon Jun 10 13:09:54.321022 2019] [authz_core:debug] [pid 23459:tid 140576341206784] mod_authz_core.c(817): [client 127.0.0.1:54626] AH01626: authorization result of Require all denied: denied
[Mon Jun 10 13:09:54.321038 2019] [authz_core:debug] [pid 23459:tid 140576341206784] mod_authz_core.c(817): [client 127.0.0.1:54626] AH01626: authorization result of <RequireAny>: denied
[Mon Jun 10 13:09:54.321082 2019] [authz_core:error] [pid 23459:tid 140576341206784] [client 127.0.0.1:54626] AH01630: client denied by server configuration: /home/blah/htdocs/images/file.svg
Through blind testing I moved the file up to the web root, and found I could then access it at https://example.com/file.svg, so it only failed in the images folder. This led me to an .htaccess file in the images folder that I had no idea was there.
Turns out Zen Cart 1.5 comes with an images/.htaccess file that has:
# deny *everything*
<FilesMatch ".*">
<IfModule mod_authz_core.c>
Require all denied
</IfModule>
<IfModule !mod_authz_core.c>
Order Allow,Deny
Deny from all
</IfModule>
</FilesMatch>
# but now allow just *certain* necessary files:
<FilesMatch "(?i).*\.(jpe?g|gif|webp|png|swf)$" >
<IfModule mod_authz_core.c>
Require all granted
</IfModule>
<IfModule !mod_authz_core.c>
Order Allow,Deny
Allow from all
</IfModule>
</FilesMatch>
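One way to fix this (a sketch, not an official Zen Cart change) is to add svg to the allow-list in that images/.htaccess:
# allow SVG alongside the other image types
<FilesMatch "(?i).*\.(jpe?g|gif|webp|png|swf|svg)$" >
<IfModule mod_authz_core.c>
Require all granted
</IfModule>
</FilesMatch>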
This was very annoying, and I hope it reminds others to check for .htaccess files at every level of the file system leading to the file you're having trouble accessing, in case this kind of tomfoolery is going on.
This "bug" is actually the new normal behavior of Apache 2.4. In my case, I had a very specific rule to deny access to any folder or file with name starting with ".", so I had to set an exception for a particular public folder that requires such odd name.
For the record my particular Rewrite rule is:
RewriteRule "(?!\.trusted)(^|/)\." - [F]
This rule forbids ([F]) everything starting with "." except .trusted, thanks to the magic of the regex "(?!...)" negative lookahead.
Related
I have an Apache2 web server with a VirtualHost that has a Directory directive looking like this:
<Directory /path/to/folder>
Order allow,deny
Allow from all
Require all granted
Options Indexes
</Directory>
The path/to/folder contains some static HTML files. In these HTML files I've got link tags which refer to documents on the web server.
When clicking or hovering on a link, the URL gets fully encoded (e.g., / will be encoded to %5C). Those links do not work. Why don't they work?
I already tried the AllowEncodedSlashes NoDecode option, but this did not work.
I'm trying to run a Mezzio application on my server. I did the following steps:
- Create a Mezzio project
composer create-project mezzio/mezzio-skeleton symphonie
I chose a modular application, the FastRoute router, the service manager, the Plates renderer, and Whoops.
I created my virtual host like this:
<VirtualHost *:80>
Alias /symphonie "/data/symphonie/public"
<Directory "/data/symphonie">
Options Indexes MultiViews FollowSymlinks
AllowOverride All
Require all granted
</Directory>
</VirtualHost>
But when I enter this URL in Google Chrome, https://app.inra.fr/symphonie/, I get a 404 error.
I have no messages in the Apache logs. On the other hand, all the links on the page redirect me to https://app.inra.fr/ and not to https://app.inra.fr/symphonie/.
Here is the configuration of my server:
Centos 8
Apache 2.4 with rewrite module enabled
PHP 7.3
Do you have any leads to solve my problem?
Thanks in advance,
Shishi
A Mezzio application does not have built-in support for a base path.
You would need to handle the following aspects:
web server rewrites for the subfolder (it looks like you did that part);
middleware piped early to remove the base path from the request object before passing it further;
a base path URL helper.
Mezzio provides a URL helper in the mezzio/mezzio-helpers package. The middleware from the previous step could also be used to inject the base path into the URL helper. If some of your middlewares use different ways to handle URLs, those will need to be set up as well.
The Mezzio documentation has a page covering this use case:
https://docs.mezzio.dev/mezzio/v3/cookbook/using-a-base-path/
It is not too detailed and pull requests to improve it are welcome ;)
I'm running ProFTPD on my Debian server, and all users are managed through ftpdb in MySQL.
Currently I am trying to hide specific folders from a specific user.
I've tried to add the following to my proftpd conf file:
<Directory />
HideFiles (\-specific)$
</Directory>
and it hides all folders that contain "-specific" in their name, but they are hidden from all users.
Any help will be well appreciated.
Thank you!
The best way to limit the actions of a user is to use the Limit directive:
<Directory /myfolder>
<Limit DIRS>
DenyUser bob
</Limit>
</Directory>
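Applied to this question, a minimal sketch (with a hypothetical path and username) could be:
# hide directory listings of this folder from user "someuser" only
<Directory /home/ftp/folder-specific>
<Limit DIRS>
DenyUser someuser
</Limit>
</Directory>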
See more information on the Limit and Directory directives in the proftpd howto.
I have no experience in setting up servers.
So... how do you set up a web server that hosts documents as a browsable directory listing, like the example?
For this, the basic apache installation should work. If the directory does not contain an index.html (or any of the other valid indices), then it will display the directory contents as per your example.
NB: Ensure that your apache config permits directory listing, by including Options Indexes andThenSomeOtherStuffMostLikely in the directory-definition in your httpd.conf. For example, here's the default on my webserver, which permits it:
<Directory "/usr/www/default/http">
Options Indexes FollowSymLinks
AllowOverride None
Order allow,deny
Allow from all
</Directory>
Now, if apache has both read and execute access to the directory (which it should have), then it'll display as an index.
I'm trying to test Facebook Connect on my localhost. In the app settings on the Facebook developer page I tried changing "Website with Facebook login" to http://localsitename:8888 but I am given an error:
Error
Site URL must be a URL with a valid domain.
I have read that others have managed to do this successfully. Has anyone found a workaround for this?
Looks like Facebook accepts only localhost as a domain name without any dots in it.
Make your local site’s name something with a dot in it, then it should work fine – localsite.name, mytestsite.local or even foo.bar …
CBroe's answer worked well!
If you are working with XAMPP, you can edit \xampp\apache\conf\extra\httpd-vhosts.conf and add another virtual host. Also edit the \windows\system32\drivers\etc\hosts file and add a reference to the new host.
hosts: added
127.0.0.1 technotronic.fb
httpd-vhosts.conf:
added
<VirtualHost *>
DocumentRoot "E:/xampp/htdocs/technotronic"
ServerName technotronic.fb
<Directory "E:/xampp/htdocs/technotronic">
Order allow,deny
Allow from all
</Directory>
</VirtualHost>
This fixed the problem for me.