PHP mail() in dev environment: open the mail in a browser/editor instead of sending it?

On my machine I need to test the mails sent by my application. I'd rather avoid sending real mails.
Is there a way to have the email content shown on screen one way or another, maybe by opening it in gedit or any other text editor?
Perhaps by replacing the command line used to launch "sendmail"?
I am asking for Linux machines (Ubuntu more specifically).

Include a means of determining your environment in your project, or at least some kind of global variable that holds that information.
Then build an abstract mail interface that sends real mails when running on a production server but logs them to local files when running on a dev machine/environment. As a logging package, I would recommend Monolog.
This would allow you to design the rest of your application (or at least the mail sending components) in a way that doesn't have to care about the environment.
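A rough sketch of such a wrapper (the class name and environment detection are illustrative; Monolog is assumed to be installed via Composer):
<?php
require __DIR__ . '/vendor/autoload.php';

use Monolog\Logger;
use Monolog\Handler\StreamHandler;

class AppMailer
{
    private $env;
    private $log;

    public function __construct($env)
    {
        $this->env = $env;
        $this->log = new Logger('mail');
        $this->log->pushHandler(new StreamHandler(__DIR__ . '/mail.log'));
    }

    public function send($to, $subject, $body)
    {
        if ($this->env === 'production') {
            // Real delivery on the production server.
            return mail($to, $subject, $body);
        }
        // Dev machine: record what would have been sent instead.
        $this->log->info('Mail suppressed in dev', array(
            'to' => $to, 'subject' => $subject, 'body' => $body,
        ));
        return true;
    }
}

$mailer = new AppMailer(getenv('APP_ENV') ?: 'dev');
$mailer->send('someone@example.com', 'Hello', 'Just testing');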

After searching, here is the solution I came to:
Create a script that fakes the sendmail binary (and make it executable with chmod +x):
/usr/local/bin/sendmail-fake:
#!/bin/bash
# Append a timestamp, the arguments sendmail was invoked with,
# and the full message piped in on stdin to the log file.
{
date
echo "$@"
cat
} >> /var/log/sendmail-fake.log
configure PHP:
php.ini:
sendmail_path = /usr/local/bin/sendmail-fake
In this setup, emails are logged into a file. The script could be modified to open the content in a browser instead.
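A quick way to verify the setup from PHP (addresses are placeholders; make sure the log file exists and is writable by the web server user):
<?php
// With sendmail_path pointing at the fake script, nothing leaves the
// machine: the message just lands in /var/log/sendmail-fake.log.
$ok = mail('someone@example.com', 'Test subject', 'Test body');
var_dump($ok); // true once the fake sendmail exits successfully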
More details on the blog post.

Related

three ways to let PHP and a regular user edit the same files

I am a web developer, and for some upcoming projects I would like to use a file-based CMS. This means that many of the files I create at the start must be editable by the PHP user later, but also remain editable for my user (and also the other way around). My PC runs Debian 9, which I love but am not super knowledgeable about, and I have also just set up a local network server with Debian 9 for backups and possibly file sharing. (I'm using Webmin to configure this, which reflects my level of command line skills).
On my online shared hosting server, the PHP user and the FTP user seem to be the same, and 644/755 permissions work fine; this is also recommended by the CMS I'm using. I would like to mimic this on my computer so I don't have to fiddle with permissions all the time. But how do I do this? Currently, my regular user (anna) does not have access to www-data's files and vice versa. Putting them in the same group still means changing file permissions. Making anna the PHP user is a Bad Idea (as far as I understand it) because anna has sudo permissions.
So far I have researched three possible solutions that I don't really know very much about, and I would like to know which is the best route to take.
Develop locally on my computer and use apache-mpm-itk or suPHP to let PHP edit the files (I got that idea from this question on ServerFault).
Develop locally on my computer and rsync the files to my server with grunt-rsync, and somehow get rsync to set the ownership to www-data (another ServerFault thread helping here; see the sketch after this list).
Mount the project's server directory, which is owned by www-data, on my computer with SSHFS and then either edit the files on the server directly or copy them over from my local directory with grunt-copy.
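For option 2, the ownership part could look roughly like this (a sketch assuming rsync 3.1+ on both ends and hypothetical paths; changing ownership on the receiving side needs root, so this also assumes anna may run rsync via passwordless sudo on the server):
rsync -av --chown=www-data:www-data --rsync-path="sudo rsync" ./myproject/ anna@server:/var/www/html/myproject/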
What do you think: from a security and ease of use perspective, which is the best way? Or do you know an even better one?
Thank you for taking the time to read and think about this!
Anna~
I figured it out! I finally ended up reading about running PHP as CGI instead of as an Apache module, and that this would solve my permissions problem. Plus, as far as I understand it, there are no extra security precautions to take when I'm the only one working with it on my local computer.
In case someone comes across this who might find it helpful, here's what I did (basically following these instructions):
I installed php7.0-fpm
Edited /etc/apache2/sites-enabled/000-default.conf and put the following just before </VirtualHost>:
DirectoryIndex index.php
<LocationMatch "^(.*\.php)$">
ProxyPass fcgi://127.0.0.1:9000/var/www/html
</LocationMatch>
I activated the Apache module proxy_fcgi (via Webmin, which apparently does an automatic Apache restart)
In /etc/php/7.0/fpm/pool.d/www.conf I commented out a listen line and put another below like this:
; listen = /run/php/php7.0-fpm.sock
listen = 127.0.0.1:9000
I then restarted PHP-FPM with this command: /etc/init.d/php7.0-fpm restart (a little different from the instructions, I'm on Debian 9). After that, phpinfo() gave me the Server API "FPM/FastCGI".
And finally, I changed the user and group from www-data to anna in three places, twice in /etc/php/7.0/fpm/pool.d/www.conf and then once more in /usr/lib/tmpfiles.d/php7.0-fpm.conf (this last bit may be Ubuntu/Debian specific, my thanks go to Keith for a comment on StackExchange).
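For reference, the two changed lines in www.conf end up looking like this (the tmpfiles.d change is analogous):
; /etc/php/7.0/fpm/pool.d/www.conf
user = anna
group = anna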
And that was it! :-)

How to Run a shell script with no linux server...?

I have a .sh file which, when run, downloads database info, HTML pages, etc. to my IP address. So it is basically using LAMP to make a website. However, the server I use is very unreliable. Is there a way to, say, run the .sh script in cPanel or WHM online? They are domain companies, so I thought I could upload my .sh file there, so that my site is already set up.
Thanks,
Ali

KSH script won't email when nohup

I have a unique issue: I am in a Unix environment and have a ksh script that SSHes to multiple sites, executes some code, returns a response, and then emails that response to an email address.
The script works perfectly when I run it, but since it must run for several hours I wish to nohup the script.
Here is where the problem is: when I nohup the script, the email is not sent. I have scoured the boards looking for a reason or solution to no avail. If someone could point me in the right direction I would greatly appreciate it.
Here is my mail portion of the script:
mail -s "subject" email#address.com < /usr/etc/bin/mydir/infofile.out &&
rm -f infofile.out
exit;
EDIT: my environment is AIX 6.1.7.1
Finally figured out the answer, and even though I was being dumb, I feel I have a responsibility to answer anyway, just in case someone else runs across this issue.
Turns out when I nohup my script it DOES send the email correctly. It's just that by nohupping and logging out, the email is forced to be sent from the Unix mail utility's default email address, and in my environment that address sends out hundreds of useless alerts, most of which I have filtered in Outlook to go to a trash folder. The email I was sending ended up in that trash folder.
Thanks to those who responded, especially shellter; your recommendation to use shell debugging is what let me know that it was sending from that default mail account.
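For anyone hitting something similar: the shell debugging shellter suggested is just a matter of wrapping the mail call in trace mode, which prints exactly how mail is invoked under nohup (paths as in the script above):
set -x   # print each command as it runs
mail -s "subject" email@address.com < /usr/etc/bin/mydir/infofile.out
set +x   # turn tracing back off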

How to send email within an NSIS installer?

Background:
For our software product (web application) clients will need to request a license from us before installing it.
We would need to check if they are a paying client (a manual process at the moment).
I need one of the initial steps of my installer to let the user request a license via a custom page in the NSIS installer.
They would put in some mandatory fields and then this would get fired off in an email.
Ideally, NSIS would let them know that the email had successfully been sent.
They would then exit the wizard, but when we verify who they are and send them a serial (a few days later), they can run the installer again, and there would be an option to enter the serial, allowing them to progress to the next screen.
Ideally, the custom page will look something like this.
Question:
It is the emailing bit that I am currently stumped on.
I have not seen any plugins to do this.
How can I send an email from a custom page in NSIS?
- I imagine there might be a couple of approaches?
Probably the best way to send emails is to use an external application.
You can develop a simple application in C/C++/C# if you have some programming skills, but there are many existing apps for this purpose; e.g. try this one, called bmail:
http://www.beyondlogic.org/solutions/cmdlinemail/cmdlinemail.htm
C:\>bmail -s smtp.server -t cpeacock@max -f root@neptune -h -a "Subject e.g. Fatal Error" -b "Body of message e.g. Fatal Error occurred in cgi script, secure.cgi"
Simply use the nsExec plugin to call this .exe with your desired parameters.
Alternatively, create a .bat file with appropriate parameters, unpack it to the $PLUGINSDIR directory together with bmail.exe, and launch the .bat using the ExecWait command from NSIS.
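A minimal sketch of the nsExec route (the SMTP server and addresses are placeholders, and $0 is assumed to hold the address collected on the custom page):
Section "RequestLicense"
  ; unpack the bundled bmail.exe into the temporary plugins directory
  InitPluginsDir
  File "/oname=$PLUGINSDIR\bmail.exe" "bmail.exe"
  ; run bmail synchronously and capture its exit code
  nsExec::ExecToLog '"$PLUGINSDIR\bmail.exe" -s smtp.example.com -t licenses@example.com -f $0 -a "License request"'
  Pop $1
  StrCmp $1 "0" +2
    MessageBox MB_ICONEXCLAMATION "Could not send the license request."
SectionEnd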
For creating the custom page, try this tool for NSIS: http://www.graphical-installer.com - it allows you to create a skinned installer with the custom page you need.

PHP Slow to process soap request via browser but fine on the command line

I am trying to connect to an external SOAP service using PHP and have written a small php test script that just connects to the service and performs a simple request to check everything is working.
This all works correctly, but when I run it via a browser request it is very slow, taking somewhere in the region of 40s to establish the initial connection. When I do the same request using the exact same script on the command line, it goes through straight away.
Does anyone have any ideas as to why this might be?
Cheers
PHP caches the WSDL in /tmp. If you run from the command line first, the cache file will be owned by whatever user you're running the script as, and Apache won't be able to read the cache. The WSDL will have to be downloaded and parsed every time, which will be slow.
Check the permissions of /tmp/wsdl*.
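If you want to rule the cache in or out from code, the standard SoapClient option does it (the WSDL URL here is a placeholder):
<?php
// Bypass the on-disk WSDL cache entirely for this client.
$client = new SoapClient('https://example.com/service?wsdl', array(
    'cache_wsdl' => WSDL_CACHE_NONE,
));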
Maybe the external SOAP service is trying to check your IP, and your server has ICMP allowed while your local network does not.
Anyway, this question might be answered more clearly by the administrator of the external SOAP service :)
Is there a difference between the php.inis that are being used?
On a standard Ubuntu server installation:
diff /etc/php5/apache2/php.ini /etc/php5/cli/php.ini
//edit:
Another difference might be in the include paths. I had this trouble myself on a local test server: it didn't actually use the SOAP class that was supposed to be included (it didn't include anything, because the search paths weren't valid), but used the built-in SoapClient class instead.
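A quick way to check which class you are actually getting (an illustrative snippet using the standard Reflection API):
<?php
$rc = new ReflectionClass('SoapClient');
// isInternal() returns true for PHP's built-in extension class;
// getFileName() returns a path only for an included userland class.
var_dump($rc->isInternal());
var_dump($rc->getFileName());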