I have a website that runs as Perl CGI scripts. When a user logs in, it creates a new session using Perl's CGI::Session.
The problem comes from accessing two duplicated websites located under different user directories. For example, www.abc.edu/~AAA/project/ and www.abc.edu/~BBB/project/
These are exactly the same website on the same machine, so they share the same /tmp directory.
When I log in to AAA's website (~AAA/project/*), it creates a session cookie on my
computer whose domain is abc.edu. Then it creates session
information in the /tmp directory in a file owned by 'AAA', because the script runs as 'AAA'.
Then if I access BBB's website (~BBB/project/*), it tries to use the session info
stored on my computer, because the domain name is the same. However,
since the session file in /tmp is owned by 'AAA', BBB's script cannot read or write the session information.
[edit] This is like A/B testing websites, and I agree that they should not share the session information.
I am thinking that, to resolve this, the session information stored in /tmp would need to be readable and writable by anyone.
[edit] I realized the security issues that #simbabque pointed out, and I also found that the -path parameter of session cookies can be used to differentiate those two groups of users. So now my question is: if I do want to use a common authentication system between those two websites, how can I share the session information without causing security issues? What is the typical way to handle this kind of A/B testing with a shared authentication system? Thanks for your help.
I was planning to write a long answer with an example application, but after rereading your comments and the question I think the answer is rather simple:
If you intend to use one login mechanism and the site's users are aware of this, then there is no security concern. It's done all the time: a lot of systems today are made up of more than one program forming a single application, and those programs need to share sessions.
If the ownership of the files in the temp directory is a problem because the applications run as different system users, then simply don't use files as the session storage. Use a database or a key/value store, for example.
Or you could put both users into the same group and make the session files group-readable and group-writable. There are a lot of possible solutions here.
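If you go the shared-group route, here is a minimal sketch of the permission bits involved, demonstrated on a scratch directory rather than /tmp (the shared group itself, say 'projgrp' containing both AAA and BBB, is hypothetical and would need root to create):

```shell
# Sketch: session files usable by everyone in a shared group.
# In reality you would also chgrp the directory to the shared group;
# here we only demonstrate the permission bits on a scratch directory.
SESSDIR=$(mktemp -d)

# setgid bit (the leading 2) so files created inside inherit the directory's group
chmod 2770 "$SESSDIR"

# each session file must be group-readable and group-writable
touch "$SESSDIR/cgisess_demo"
chmod 660 "$SESSDIR/cgisess_demo"

stat -c '%a %n' "$SESSDIR" "$SESSDIR/cgisess_demo"
rm -rf "$SESSDIR"
```

With CGI::Session's file driver, both sites could then point at such a directory via the Directory option instead of relying on the /tmp default.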
I am trying to make an online chatting application.
How do I make a variable persist across users logged in from machines from different places in PHP?
I tried using the session_write_close() method, but it didn't work out.
Please guide me with a step-by-step procedure. Many thanks.
Sessions are per-browser. The session ID is most commonly stored in a browser cookie, or it can be passed around directly with each request. Either way, you cannot use the same session on different computers, or even in different browsers on the same computer.
Instead of persisting the session, you can persist variables the same way you store user configurations, for example by saving settings in a database. The variables you want to store should be tied to the user account, not to the session.
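For example, a minimal table sketch (all names are illustrative) that ties a value to the user account rather than to any session:

```sql
-- One row per user per setting; survives logouts and works from any machine
CREATE TABLE user_settings (
    user_id  INT          NOT NULL,
    name     VARCHAR(64)  NOT NULL,
    value    TEXT,
    PRIMARY KEY (user_id, name)
);
```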
We are using Perforce at my company and rely on it heavily. I need some suggestions for the following scenario:
Our Depot structure is something like this:
//depot
    /product1
        /component1
        /component2
        .
        .
        /componentN
            /*.java
            /*.xml
    /product2
        /component1
        /component2
        .
        .
        /componentN
            /*.java
            /*.xml
Every product has multiple components, and every component consists of Java, XML, or other program files. Every component has a manager/owner associated with it.
Right now, we have blocked write permissions for every user; only when a change is approved by the manager/owner after code review do we open write permission for that user on the relevant file/folder so they can check in. This process becomes a little untidy because the manager/developer has to wait for the Perforce admin to grant permissions (update the Perforce protections table). Also, we give them a window of only 24 hrs to check in (due to agile, which I don't understand much :)), after which we are supposed to block write access for that user again.
What I am looking for is a mechanism by which Perforce admins can delegate this responsibility to the respective managers/owners without giving them super-user or admin access, and which automatically disables the write permission after 24 hrs.
Any suggestions?
Thanks in advance.
There's nothing to do this out of the box, per se.
The closest thing I can think of is to permission the mainline version of these components via a group that has an owner. The owner of the group is allowed to add and remove members, thus delegating the permissioning to the "gatekeeper" rather than to the admins themselves.
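For example, a group spec along these lines (group and user names are made up) lets the listed owner maintain membership with `p4 group` and no admin rights:

```
Group:  component1-writers
Owners:
        manager1
Users:
        dev1
        dev2
```

combined with a single protections table entry such as `write group component1-writers * //depot/product1/component1/...`, so the admins never have to touch the protections table again for that component.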
Let me know if you require further clarification about this.
One common solution is to build a simple tool which reads and writes the protections table, the group memberships, etc., to implement the policies that you desire.
The protections and groups data are not complex in format, and you can easily write a little bit of text-processing code that writes and re-writes these specs according to your needs.
Then install your tool on the server machine in a secure fashion, granting the tool the rights to update the protections table, and have your component administrators use the tool to manage the permissions.
For example, I've seen this done by writing a small web application, in Java or Perl for example, installing that on a web server on a secure machine, and letting the component admins operate that tool through a web interface.
All your tool has to provide is (a) a simple login/logout mechanism for your component admins (the web server may already do this for you), (b) a command that takes a user name and a folder name and grants permission, and (c) a command (or a timer) that removes that permission subsequently.
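The entries such a tool would append to, and later strip from, the protections table might look like this (the user name and path are illustrative; newer Perforce servers allow ## comments in the table, which the tool can use to find its own entries again):

```
## granted by manager1 via the permissions tool; remove after 24 hrs
write user dev1 * //depot/product1/component1/...
```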
Currently the largest security hole in Postgres is the .conf files that the database relies on: someone with access to the system (not necessarily to the database) can modify the files and gain entry. Because of this I am seeking out resources on how to encrypt those .conf files and then decrypt them during each session of the database. Performance is not really an issue at this point. Does anyone have any resources on this, or has anyone developed any prototypes that utilize this functionality?
Edit
There seems to be some confusion here about what I am asking. The scenario is best illustrated on a Windows box with the following groups:
1) Administrators: System Administrators
2) Database Administrators: Postgres Administrators
3) Auditors: Security Auditors
The Auditors group typically needs access to log files and configuration files to ensure system security. However, the issue comes when a member of the Auditors group needs to view the Postgres configuration and log files. If this member decides that they want to access the database, even though they do not have a database account, breaking in is a very short task. How does one go about preventing this? Answers such as "get better auditors" are quite poor, as you can never fully predict what people will do.
You are fine. There is no need to encrypt, as long as the permissions on the *.conf files are correct.
Your postgresql.conf and pg_hba.conf should both be marked as readable only by the postgres user/group. If you don't have actual user accounts with those permissions, then only root can see them.
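As a sketch of the target permissions, demonstrated on scratch copies since the real paths vary by installation (and the `chown postgres:postgres` step needs root):

```shell
# Demonstrate the target modes on dummy copies of the config files.
CONFDIR=$(mktemp -d)
touch "$CONFDIR/postgresql.conf" "$CONFDIR/pg_hba.conf"

# owner read/write, group read, no access for anyone else;
# on a real system these would be owned by postgres:postgres
chmod 640 "$CONFDIR/postgresql.conf" "$CONFDIR/pg_hba.conf"

stat -c '%a %n' "$CONFDIR"/*.conf
rm -rf "$CONFDIR"
```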
So, are you trying to prevent root from making changes? Because a normal user can't change those files, and if you don't trust root, you've already lost.
I think you might be stuck - here's what you said:
The Auditors group typically needs access to log files and configuration files
and then:
How does one go about preventing [Auditors from accessing the database using the values in the configuration files]?
If you really want to let Auditors get at your config files but are nervous about them accessing your database, your best bet would be to move your config files off of your server to somewhere else - and then make sure Auditors don't actually have access to your production systems. They could still look at the log files all they wanted, but they wouldn't be able to access the database server to try to get at the database itself.
I've read that things can go wrong with your web server which may lead to display of PHP scripts as plain text files in a web browser; consequently I've moved most of my PHP scripts to a directory outside the web root. Now I've been wondering whether the same could happen to the CGI scripts in my cgi-bin.
My main concern is one script which contains a user name and password for my MySQL database. If this is a possible security hole (at least as far as the database content is concerned), is there a way of putting sensitive data in a different location and getting it from there (like saving it in a file in a different directory and reading it from that file, for example)? My scripts are written in Perl btw.
I've read that things can go wrong with your web server which may lead to display of PHP scripts as plain text files in a web browser; consequently I've moved most of my PHP scripts to a directory outside the web root. Now I've been wondering whether the same could happen to the CGI scripts in my cgi-bin.
Yes. If something goes wrong that causes the programs to be served instead of executed, then any of their content will be exposed. It is exactly the same issue as with PHP, except that, given the way cgi-bin directories are usually configured (aliased to a directory outside the web root), it is slightly harder for the problem to occur.
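The usual aliasing looks something like this in Apache (2.4 syntax; the paths are examples), which keeps the scripts outside the document root so they can never be served as plain static files:

```
# httpd.conf: requests for /cgi-bin/ are mapped to a directory outside
# the web root and are always treated as CGI programs
ScriptAlias /cgi-bin/ "/usr/lib/cgi-bin/"
<Directory "/usr/lib/cgi-bin">
    Require all granted
</Directory>
```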
My main concern is one script which contains a user name and password for my MySQL database. If this is a possible security hole (at least as far as the database content is concerned), is there a way of putting sensitive data in a different location and getting it from there (like saving it in a file in a different directory and reading it from that file, for example)?
Yes. Exactly that, just make sure the directory is outside the webroot.
For additional security, make sure the database only accepts the credentials for connections from the minimum set of hosts that need to access it. e.g. if the database is on the same server as the web server, then only let the credentials work for localhost. Causing the database to only listen on the localhost network interface would also be a good idea in that case.
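For MySQL, for example, that could look like this (user, password, and database names are placeholders):

```sql
-- Credentials that are only valid when connecting from the same machine
CREATE USER 'webapp'@'localhost' IDENTIFIED BY 'secret';
GRANT SELECT, INSERT, UPDATE, DELETE ON mydb.* TO 'webapp'@'localhost';
```

Setting `bind-address = 127.0.0.1` in the `[mysqld]` section of my.cnf additionally stops the server from listening on external interfaces at all.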
My scripts are written in Perl btw.
I'd look at using one of the Config::* modules for this.
One concern worth mentioning is specific to shared hosting.
If you're on a host shared with other users, it may be impossible to hide the password from them.
This depends on configuration details for the OS and the webserver.
For instance, it is common to have an Apache configuration on Linux in which the only way for a user hosting a website to make files readable or writable by the webserver user is to make them readable/writable by all users.
You may trust all of these users not to abuse this themselves, but if one of these websites has a vulnerability that allows intruders to view the full file system, the intruder can then exploit that on all other websites.
There are countermeasures against this, but they complicate things for the users, so many hosters don't implement them.
It's definitely not a good idea to hardcode a password in a script if you can avoid it. Fortunately both Postgres and MySQL support loading DB credentials from a file. For Postgres you use ~/.pgpass and for MySQL I believe it's ~/.my.cnf. In either case you would adjust the permissions so that only the user running the script has permission to read the file. The advantage of this approach is that you don't have to write the code to read the file - the DB client library does it automatically.
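A ~/.my.cnf along these lines (placeholder credentials) is picked up automatically by the MySQL client library; from Perl, DBD::mysql can also be pointed at it explicitly with the `mysql_read_default_file` option in the DSN:

```
# ~/.my.cnf -- chmod 600 so only the owning user can read it
[client]
user     = webapp
password = secret
```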
It is definitely a security concern. You should store the password encrypted in a separate file and make sure that only your app has access to it.
If you put the file in a directory configured as cgi-bin, there is no way for it to be shown unless there is an error in the Apache configuration. If you use Perl programs outside cgi-bin directories but inside the site root, it may happen.
Also, you may configure the DB to accept connections only on a local socket, so knowing the DB password would be useless to a remote attacker.
You've already gotten better answers than I can provide, but as a note:
It's very bad form to store passwords as plaintext, period.
In the same way, it's very bad form to overwrite or delete files without asking permission. If you do it, it will bite you or your client in the butt eventually.
Is there a way to persist a string from an online ClickOnce application? I saw isolated file storage mentioned in answers to other questions, but none of them specify whether it also works for online apps (I really don't think so).
I think that something like a cookie would work. Is there something like that available?
The application must run only online (it is triggered with some parameters), but for each user it needs to save a file with specific information asked of them. Once the app has run for the very first time, it must not ask the user for that info again.
Thanks.
You can store the information in LocalApplicationData. Just create a directory with either your application name or your company name, stick the string in a file, and read it from there. This article shows you how to persist this data and not have it impacted by ClickOnce updates. It will work even though your application is online-only. (Online-only ClickOnce apps are still installed; it just means the app always runs from the URL and requires the user to be connected in order to install it.)