Design Advice: Sending signals to daemons through HTTP - perl

I'm using Apache on Ubuntu. I have a Perl script which basically reads the file names in a directory, then rewrites a text file, then sends a signal to a daemon. How can this be done as securely as possible through a web page?
Actually I can run the simplified CGI in the code below, but not if I remove the comments. I'm looking for advice on any of the following:
Using HTTP Requests?
How about Apache file permissions on the directory shown in code?
Is htaccess enough to enable user/pass access to the cgi?
Should I use a database instead of writing to a file and run a cron querying the db with permission granted to write and send the signal?
Granting as few permissions as possible to the web server.
Should I set up a VPN?
#!/usr/bin/perl -wT
use strict;
use CGI;
#my @fileList = </home/user/*>; # read a directory listing
my $query = CGI->new();
print $query->header( "text/html" ),
$query->p( "FirstFileNameInArray" ),
#$query->p( $fileList[0] ), #output the first file in directory
$query->end_html;

Presumably, the error you're getting from the commented lines is a "permission denied" when trying to read the /home/user directory. The way to fix this is (surprise, surprise) to give the apache user[1] permission to read that directory. There are three primary approaches to doing this:
In most environments, there's really no good reason to hide all filenames within a user's home directory, so you could make the directory world-readable with chmod a+r /home/user. Unless you have a specific reason to prevent the general public from knowing the names of the files in the user's home directory, I'd tend to recommend this approach.
If you want to be a bit more restrictive about it, you could change /home/user to be owned by a group which the apache user belongs to (or add the apache user to the group that currently owns /home/user) and then set /home/user to be group-readable. This will make it accessible to all members of that group, but not the general public. (Example commands are below, after the list.)
If you need to have standard filesystem permissions applied to web access, you can look at configuring suexec so that individual requests can take on the permissions of users other than the apache user. This is normally the user who owns the code which is being run to handle the request (e.g., in this case, the user who owns your directory-listing script), but, if you're using htaccess-based authentication, it may be possible to configure suexec to decide which user's permissions to take on based on what user you log in as. (I avoid suexec myself, so I'm not 100% certain whether this can be done and have no idea how to go about it if it can.)
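For example, assuming the Apache user belongs to a group named www-data (the group name varies by distribution, so check your system), the second approach might look like this - g+rx rather than g+r, since the execute bit is what lets the directory be traversed:
chgrp www-data /home/user
chmod g+rx /home/user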
[1] ...by which I mean the user that apache is running as; depending on your system config, this user may be named "apache", "httpd", "nobody", "www-data", or something else entirely.

Related

Is it possible to load a user command from an in-memory namespace?

From what I can tell, user commands can only be loaded from namespace scripts located in the directories specified by the SALT cmddir setting.
But I have an interest in loading a user command directly from an in-memory namespace, without ever having a namespace script reside on a locally accessible disk.
An example use case might be loading a namespace that defines one or more user commands from a remote repository via ]get, and then "installing" the user commands into the workspace directly from memory.
Is this possible?
Bad news: No, you cannot currently do that.
Good news: I'm working on a rewrite of the user command system which makes this trivial to do.
Source: I'm in charge of the user command system at Dyalog.

Default UNIX permissions of MongoDB files on the hard drive

I noticed that the files in the data/ directory, hosting the databases and collections, have the r (read) permission for others.
So basically, anyone can read the data! Isn't it strange, or is it something I'm missing?
I found no way to change this behavior in the MongoDB configuration (Ubuntu 18.04). When you search for mongodb file permissions, you only find threads about user permissions inside the database.
Thank you!
I'm going to assume you're using WiredTiger, the default storage engine for Mongo. Either way, the same concept applies.
You'll see the .wt files (the ones you're talking about), although readable by permission, are not very readable to the eye. Try looking for yourself with less <example>.wt.
They're stored in a specific format, with compression and some encryption. Realistically, they shouldn't be retrievable from outside your server - and your users on the server should be trusted, or given limited access to the locations of these files.
In short, if you apply the proper policies and keep your actual database and server secure, then this is normal and expected. I hope this makes sense.
When you launch mongod you need to specify a path to the data directory, and this directory must already exist.
You can set the permissions on this directory to deny world-read access by running:
chmod o-rwx /path/to/data/dir
Normally this would be done prior to the first start of mongod.
Once this is done, none of the files in the data directory will be world-readable regardless of their individual permissions.
MongoDB does not need to have a provision to do this because it never creates the data directory.
A different way of accomplishing a similar end result is to use umask, but changing the permissions on the data directory is generally more reliable.

three ways to let PHP and a regular user edit the same files

I am a web developer, and for some upcoming projects I would like to use a file-based CMS. This means that many of the files I create at the start must be editable by the PHP user later, but also remain editable for my user (and also the other way around). My PC runs Debian 9, which I love but am not super knowledgeable about, and I have also just set up a local network server with Debian 9 for backups and possibly file sharing. (I'm using Webmin to configure this, which reflects my level of command line skills).
On my online shared hosting server, the PHP user and the FTP user seem to be the same, and 644/755 permissions work fine, this is also recommended by the CMS I'm using. I would like to mimic this on my computer so I don't have to fiddle with permissions all the time. But how do I do this? Currently, my regular user (anna) does not have access to www-data's files and vice versa. Putting them in the same group still means changing file permissions. Making anna the PHP user is a Bad Idea (as far as I understand it) because anna has sudo permissions.
So far I have researched three possible solutions that I don't really know very much about, and I would like to know which is the best route to take.
Develop locally on my computer and use apache-mpm-itk or suPHP to let PHP edit the files (I got that idea from this question on ServerFault).
Develop locally on my computer and rsync the files to my server with grunt-rsync, and somehow get rsync to set the ownership to www-data (another ServerFault thread helping here).
Mount the project's server directory, which is owned by www-data, on my computer with SSHFS and then either edit the files on the server directly or copy them over from my local directory with grunt-copy.
What do you think: from a security and ease of use perspective, which is the best way? Or do you know an even better one?
Thank you for taking the time to read and think about this!
Anna~
I figured it out! I finally ended up reading about running PHP as CGI instead of as an Apache module, and that this would solve my permissions problem. Plus, as far as I understand it, there are no extra security precautions to take when I'm the only one working with it on my local computer.
In case someone comes across this who might find it helpful, here's what I did (basically following these instructions):
I installed php7.0-fpm
Edited /etc/apache2/sites-enabled/000-default.conf and put the following just before </VirtualHost>:
DirectoryIndex index.php
<LocationMatch "^(.*\.php)$">
ProxyPass fcgi://127.0.0.1:9000/var/www/html
</LocationMatch>
I activated the Apache module proxy_fcgi (via Webmin, which apparently does an automatic Apache restart)
In /etc/php/7.0/fpm/pool.d/www.conf I commented out a listen line and put another below like this:
; listen = /run/php/php7.0-fpm.sock
listen = 127.0.0.1:9000
I then restarted PHP-FPM with this command: /etc/init.d/php7.0-fpm restart (a little different from the instructions, I'm on Debian 9). After that, phpinfo() gave me the Server API "FPM/FastCGI".
And finally, I changed the user and group from www-data to anna in three places, twice in /etc/php/7.0/fpm/pool.d/www.conf and then once more in /usr/lib/tmpfiles.d/php7.0-fpm.conf (this last bit may be Ubuntu/Debian specific, my thanks go to Keith for a comment on StackExchange).
And that was it! :-)

need perl script to connect to database, but don't want the password in plain text [duplicate]

When a PHP application makes a database connection it of course generally needs to pass a login and password. If I'm using a single, minimum-permission login for my application, then the PHP needs to know that login and password somewhere. What is the best way to secure that password? It seems like just writing it in the PHP code isn't a good idea.
Several people misread this as a question about how to store passwords in a database. That is wrong. It is about how to store the password that lets you get to the database.
The usual solution is to move the password out of source-code into a configuration file. Then leave administration and securing that configuration file up to your system administrators. That way developers do not need to know anything about the production passwords, and there is no record of the password in your source-control.
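As a minimal sketch of that idea (the path /etc/myapp/db.ini and the key names are made up for illustration; any format kept outside the web root and outside source control works):
<?php
// db.ini is owned and secured by the sysadmin, e.g.:
//   host = "localhost"
//   user = "app_user"
//   password = "secret"
//   dbname = "app"
$config = parse_ini_file('/etc/myapp/db.ini');

$db = mysqli_connect($config['host'], $config['user'],
                     $config['password'], $config['dbname']);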
If you're hosting on someone else's server and don't have access outside your webroot, you can always put your password and/or database connection in a file and then lock the file using a .htaccess:
<files mypasswdfile>
order allow,deny
deny from all
</files>
The most secure way is to not have the information specified in your PHP code at all.
If you're using Apache that means setting the connection details in your httpd.conf or virtual host file. If you do that you can call mysqli_connect() with no parameters, which means PHP will never output your information.
This is how you specify these values in those files:
php_value mysqli.default_user myusername
php_value mysqli.default_pw mypassword
php_value mysqli.default_host server
Then you open your MySQL connection like this:
<?php
$db = mysqli_connect();
Or like this:
<?php
$db = mysqli_connect(ini_get("mysqli.default_user"),
ini_get("mysqli.default_pw"),
ini_get("mysqli.default_host"));
Store them in a file outside web root.
For extremely secure systems we encrypt the database password in a configuration file (which itself is secured by the system administrator). On application/server startup the application then prompts the system administrator for the decryption key. The database password is then read from the config file, decrypted, and stored in memory for future use. Still not 100% secure since it is stored in memory decrypted, but you have to call it 'secure enough' at some point!
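A rough sketch of that startup flow, assuming AES-256-CBC and a config file holding base64(iv + ciphertext) - the path, cipher, and key handling are illustrative assumptions, not the poster's actual implementation:
<?php
// Prompt the system administrator for the decryption key at startup (CLI context).
echo 'Decryption key: ';
$key = trim(fgets(STDIN));

// The config file holds base64(iv . ciphertext); the path is hypothetical.
$blob = base64_decode(file_get_contents('/etc/myapp/db_password.enc'));
$ivLen = openssl_cipher_iv_length('aes-256-cbc');
$iv = substr($blob, 0, $ivLen);
$ciphertext = substr($blob, $ivLen);

// Keep the decrypted password in memory only; never write it back to disk.
$dbPassword = openssl_decrypt($ciphertext, 'aes-256-cbc', $key, OPENSSL_RAW_DATA, $iv);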
This solution is general, in that it is useful for both open and closed source applications.
Create an OS user for your application. See http://en.wikipedia.org/wiki/Principle_of_least_privilege
Create a (non-session) OS environment variable for that user, with the password
Run the application as that user
Advantages:
You won't check your passwords into source control by accident, because you can't
You won't accidentally screw up file permissions. Well, you might, but it won't affect this.
Can only be read by root or that user. Root can read all your files and encryption keys anyways.
If you use encryption, how are you storing the key securely?
Works x-platform
Be sure to not pass the envvar to untrusted child processes
This method is suggested by Heroku, who are very successful.
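As a sketch of how the application side might read it (the variable name DB_PASSWORD and the connection details are assumptions, not part of the answer):
<?php
// Read the password from the environment set up for the application's OS user.
$password = getenv('DB_PASSWORD');
if ($password === false) {
    die('DB_PASSWORD is not set in the environment');
}
$db = new PDO('mysql:host=localhost;dbname=app', 'app_user', $password);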
If it is possible, create the database connection in the same file where the credentials are stored, and inline the credentials in the connect statement:
mysqli_connect("localhost", "me", "mypass");
Otherwise it is best to unset the credentials after the connect statement, because credentials that are not in memory can't be read from memory ;)
include("/outside-webroot/db_settings.php");
mysqli_connect("localhost", $db_user, $db_pass);
unset($db_user, $db_pass);
If you are using PostgreSQL, then it looks in ~/.pgpass for passwords automatically. See the manual for more information.
Previously we stored DB user/pass in a configuration file, but have since hit paranoid mode -- adopting a policy of Defence in Depth.
If your application is compromised, the user will have read access to your configuration file and so there is potential for a cracker to read this information. Configuration files can also get caught up in version control, or copied around servers.
We have switched to storing user/pass in environment variables set in the Apache VirtualHost. This configuration is only readable by root -- hopefully your Apache user is not running as root.
The con with this is that now the password is in a Global PHP variable.
To mitigate this risk we have the following precautions:
The password is encrypted. We extend the PDO class to include logic for decrypting the password (a rough sketch follows this list). If someone reads the code where we establish a connection, it won't be obvious that the connection is being established with an encrypted password rather than the password itself.
The encrypted password is moved from the global variables into a private variable. The application does this immediately to reduce the window during which the value is available in the global space.
phpinfo() is disabled. phpinfo() is an easy target for getting an overview of everything, including environment variables.
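A minimal sketch of that pattern - the environment variable name, class name, and decryption stub are all illustrative, and the actual cipher logic is site-specific:
<?php
class SecurePDO extends PDO
{
    public function __construct(string $dsn, string $user)
    {
        // Pull the encrypted password out of the global space immediately.
        $encrypted = $_SERVER['DB_PASS_ENCRYPTED'] ?? '';
        unset($_SERVER['DB_PASS_ENCRYPTED'], $_ENV['DB_PASS_ENCRYPTED']);

        // Decrypt only at connect time; the plaintext never sits in a global.
        parent::__construct($dsn, $user, $this->decryptPassword($encrypted));
    }

    private function decryptPassword(string $encrypted): string
    {
        // Placeholder for the site's real decryption routine.
        return $encrypted;
    }
}

$db = new SecurePDO('mysql:host=localhost;dbname=app', 'app_user');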
Your choices are kind of limited as, as you say, you need the password to access the database. One general approach is to store the username and password in a separate configuration file rather than the main script, and to store that file outside the main web tree. That way, if there is a web configuration problem that leaves your PHP files simply displayed as text rather than being executed, you haven't exposed the password.
Other than that you are on the right lines with minimal access for the account being used. Add to that
Don't use the combination of username/password for anything else
Configure the database server to only accept connections from the web host for that user (localhost is even better if the DB is on the same machine). That way, even if the credentials are exposed, they are no use to anyone unless they have other access to the machine.
Obfuscate the password (even ROT13 will do). It won't put up much defense if someone does get access to the file, but at least it will prevent casual viewing of it.
Peter
We have solved it in this way:
Use memcache on the server, with an open connection from a separate password server.
Save the password (or even the whole password.php file, encrypted) to memcache, along with the decryption key.
The web site reads the memcache key holding the password file passphrase and decrypts all the passwords in memory.
The password server sends a new encrypted password file every 5 minutes.
If you use an encrypted password.php in your project, you add an audit that checks whether this file was touched externally - or viewed. When this happens, you can automatically wipe the memory and close the server to access.
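A very rough sketch of the retrieval side, using the Memcached extension - the key name, server address, file path, cipher, and blob layout (base64 of iv . ciphertext) are all assumptions; the separate password server that refreshes the encrypted file is not shown:
<?php
// Fetch the passphrase for the encrypted password file from memcache.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);
$passphrase = $cache->get('password_file_passphrase');

// Decrypt the local encrypted password file in memory only.
$blob = base64_decode(file_get_contents('/etc/myapp/password.php.enc'));
$ivLen = openssl_cipher_iv_length('aes-256-cbc');
$passwords = openssl_decrypt(substr($blob, $ivLen), 'aes-256-cbc',
                             $passphrase, OPENSSL_RAW_DATA, substr($blob, 0, $ivLen));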
Put the database password in a file, make it read-only to the user serving the files.
Unless you have some means of only allowing the php server process to access the database, this is pretty much all you can do.
If you're talking about the database password, as opposed to the password coming from a browser, the standard practice seems to be to put the database password in a PHP config file on the server.
You just need to be sure that the php file containing the password has appropriate permissions on it. I.e. it should be readable only by the web server and by your user account.
An additional trick is to use a separate PHP configuration file that looks like this:
<?php exit() ?>
[...]
Plain text data including password
This does not prevent you from setting access rules properly. But in case your web site is hacked, a "require" or an "include" will just exit the script at the first line, so it's even harder to get the data.
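A sketch of how the application itself would then read that file - the path is hypothetical, and this assumes the plain-text part uses an INI-style layout:
<?php
$raw = file_get_contents('/var/www/private/db.conf.php');
// Skip the first line (the PHP exit guard) and parse the rest as INI data.
$body = substr($raw, strpos($raw, "\n") + 1);
$config = parse_ini_string($body);

$db = mysqli_connect($config['host'], $config['user'],
                     $config['password'], $config['dbname']);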
Nevertheless, do not ever leave configuration files in a directory that can be accessed through the web. You should have a "Web" folder containing your controller code, CSS, pictures and JS. That's all. Anything else goes in offline folders.
Just putting it into a config file somewhere is the way it's usually done. Just make sure you:
disallow database access from any servers outside your network,
take care not to accidentally show the password to users (in an error message, or through PHP files accidentally being served as HTML, etcetera.)
Best way is to not store the password at all!
For instance, if you're on a Windows system, and connecting to SQL Server, you can use Integrated Authentication to connect to the database without a password, using the current process's identity.
If you do need to connect with a password, first encrypt it, using strong encryption (e.g. using AES-256, and then protect the encryption key, or using asymmetric encryption and have the OS protect the cert), and then store it in a configuration file (outside of the web directory) with strong ACLs.
Actually, the best practice is to store your database credentials in environment variables because:
These credentials are environment-dependent, meaning you won't have the same credentials in dev and prod. Storing them in the same file for every environment is a mistake.
Credentials are not related to business logic, which means the login and password have no place in your code.
You can set environment variables without creating any business code class file, which means you will never make the mistake of adding the credentials file to a commit in Git.
Environment variables are superglobal: you can use them everywhere in your code without including any file.
How to use them?
Using the $_ENV array:
Setting: $_ENV['MYVAR'] = $myvar;
Getting: echo $_ENV['MYVAR'];
Using the PHP functions:
Setting with the putenv function - putenv("MYVAR=$myvar");
Getting with the getenv function - getenv('MYVAR');
In vhost files and .htaccess, but it's not recommended since it's in another file and it doesn't really solve the problem.
You can easily drop a file such as envvars.php with all the environment variables inside, execute it (php envvars.php), and then delete it. It's a bit old school, but it still works, and you have no file with your credentials on the server and no credentials in your code. Since it's a bit laborious, frameworks do it better.
Example with Symfony (OK, it's not only PHP)
Modern frameworks such as Symfony recommend using environment variables, storing them in an uncommitted .env file or setting them directly on the command line, which means you can do either:
With the CLI: symfony var:set FOO=bar --env-level
With .env or .env.local: FOO="bar"

Can't create folder using mkdir in perl, permission denied?

I'm trying to create a directory with a Perl script after calling it via Ajax through a web interface. I'm using IIS 7.5 to run my web server.
The problem arises when I use either mkdir($path) or system("mkdir", $path), with the errors being "Permission denied" or "Access is denied", respectively. I believe I've set up the permissions correctly to give read/write/execute permissions as well as special permissions to create files and folders to the script.
Please let me know also if this should be posted elsewhere, thanks!
I would think you could find the user running the script with "whoami".
Also, I'm not clear on what context the script is running in, nor where the directory it is trying to create is located. It might be necessary to add the "-p" option so that any necessary parent directories are created.
No comment on the safety of this ... I assume that those checks are being made elsewhere.
Please keep in mind that the folder in which you are planning to create this new folder must be owned by Apache (or whichever other web server software you might be using).
sudo chown apache *rootfolderfornewfolder*
I hope this helps
