I would like to write a PowerShell script that adds additional generic IMAP/SMTP mail account(s), i.e. manually adding credentials, server names, ports, security methods, etc.
Such a script should be "client-agnostic", i.e. add the above information to MS Outlook, Windows Mail, or Thunderbird depending on which desktop client is locally installed.
So far I haven't found any module/cmdlet for this, only code specific to Exchange or to MAPI-based accounts such as Gmail.
My company is no longer supporting our Linux mail server (all mail will be handled by Gmail).
Over the years I've run many mail clients on the Linux server: Elm, Alpine, SquirrelMail, Roundcube. My most recent client has been Roundcube.
Ideally I'd like Thunderbird to import the most current folders from Roundcube; these appear to be inside Maildir/ (with deeper directories like .saved-mailed, etc.). But I also have Mail/ (which Alpine appears to reference).
But upon adding this account to Thunderbird, I am presented with a mix of folders: not all from Maildir/ and not all from Mail/; in fact, no 'new' Roundcube folders are presented.
Where does Thunderbird search on a linux mail server to 'subscribe' folders? And how can I access this location to force the subscription of the folders I actually want?
I gave up trying to determine where Thunderbird (Tb) searches for mail.
Instead I copied all Roundcube email in Maildir/ to my local machine and then used the code here
https://gist.github.com/lftbrts/249f034a439d3eb2e008f73506cacc2d
to convert that email to mbox format.
Then I copied all that converted email to Tb's 'Local Folders' directory; Tb was able to load all the converted folders and then I 'dragged' them (using Tb) to the synced Gmail account.
So the above-named code saved the day!
Let's say I would like to check some user mailbox properties from within PowerShell. I can run the script in the Exchange Management Shell, but the problem is that I have no guarantee that the end user will be running the script directly on Exchange or on a machine with any Exchange tools. So, I can tell the end user to just run the script in plain PowerShell (not EMS) and build the PSSession import into the script.
However, here comes my main problem: I cannot hard-code the server name into the script (it will be used in many different environments), and I would like to avoid asking the end user to provide the Exchange Server name for the PSSession.
Is there any way to obtain the Exchange Server name automatically with just vanilla PowerShell (no EMS, etc.)? The script will be run by users with domain admin privileges, and most likely there will be no Outlook on the machines (so no MAPI profile configuration), if that is of any help.
I'm not sure how portable this is (it works on my E2K7 setup, but your mileage may vary)...
You can look in AD to get a list of Exchange servers by doing something like the following:
$exchangeServers = [ADSI]"LDAP://contoso.com/CN=Exchange Servers,OU=Microsoft Exchange Security Groups,DC=contoso,DC=com"
$exchangeServers.Member
In my environment, this lists all of the Exchange server computer accounts, plus a few other groups, but it's a starting point.
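To go from that member list to usable server names, you can bind to each member and keep only the computer accounts. A minimal sketch, reusing the contoso.com paths above (adjust them for your domain):

$group = [ADSI]"LDAP://contoso.com/CN=Exchange Servers,OU=Microsoft Exchange Security Groups,DC=contoso,DC=com"
foreach ($dn in $group.Member) {
    # Bind to each member; computer accounts carry the 'computer' object class,
    # while the extra entries (nested groups) do not.
    $entry = [ADSI]"LDAP://$dn"
    if ($entry.objectClass -contains 'computer') {
        $entry.dNSHostName
    }
}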
Currently the application in question can be downloaded from our website after logging in with an email address. Then during download we inject the user credentials into the executable, so after installation the user's email address is automatically available in the app.
Our aim is to allow installing this app via active directory in a way that the email address of the user (to whom the app is assigned) is injected.
Is this possible somehow, e.g. using MSP or MST files with the MSI?
For Active Directory deployment you need an MSI package. However, this does not solve your problem completely.
An MSI package can be configured to receive the email address as a command-line parameter when installing. The problem comes from the deployment process: when you deploy through Active Directory you need to set a command line that is valid for all users, as the package will be installed on all of the selected/specified computers. This means that you have no option to specify a unique email address for each user.
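For illustration, such a property-driven install would look something like this (the package name and the USEREMAIL property are placeholders, not something your MSI necessarily defines):

msiexec /i MyApp.msi USEREMAIL="peter@example.com" /qn

This works for a manual or scripted install, but, as noted above, Active Directory deployment gives you only one such command line for everyone.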
A workaround would be to have a custom action included in the MSI package that reads the email address from the user's computer and uses it in your installation package. This would mean that your users would need to have the email address stored in a known location (registry entry or file), which you can read with your custom action (C# or C++ code, DLL generated as output).
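To sketch the idea (the registry path and value name here are hypothetical, and the custom action itself would be C#/C++; this is just the lookup it would perform, shown in PowerShell):

# Hypothetical per-user location where the email address was stored beforehand
$email = (Get-ItemProperty -Path 'HKCU:\Software\Contoso\MyApp').UserEmail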
On my machine I need to test the mails sent by my application. I'd rather avoid sending real mails.
Is there a way to have the email content shown on the screen one way or another, maybe by opening it in gedit or any text editor?
Maybe by replacing the command line used to launch "sendmail"?
I am asking for Linux machines (Ubuntu more specifically).
Include a means of determining your environment in your project, or at least some kind of global variable that holds that information.
Then build an abstract mail interface that sends real mails when running on a production server, but logs them to local files when running on a dev machine/environment. As a logging package, I would recommend Monolog.
This would allow you to design the rest of your application (or at least the mail sending components) in a way that doesn't have to care about the environment.
After searching, here is the solution I came to:
create a script that will fake the sendmail command
/usr/local/bin/sendmail-fake:
#!/bin/bash
# Fake sendmail: log the timestamp, the arguments sendmail was invoked
# with (recipients, flags), and the full message read from stdin.
# (Make the script executable and the log file writable by the PHP user.)
{
    date
    echo "$@"
    cat
} >> /var/log/sendmail-fake.log
configure PHP:
php.ini:
sendmail_path = /usr/local/bin/sendmail-fake
In this setup, emails are logged to a file. The script could be modified to open the content in a browser.
More details in the blog post.
I am a DBA. I am trying to write a bunch of scripts that I could execute from one central server. Ideally I would send all the scripts from the central server to, say, 50+ servers across multiple Windows domains (for database management purposes).
The problem I am running into is security. It seems like PowerShell Remoting is the way to go, but when I send a script to another server, I get a 'not digitally signed' error.
I could 'self-sign', but that cert is only trusted on the local machine. So that option is out.
Maybe a Certificate Authority is the way to go. Or adding trusted hosts. I just have no clue on this one, so if you know any blog posts or how to do this, it would be a big help.
Well, it's a security risk, but there's always the possibility of setting the execution policy to RemoteSigned, keeping a local repository on each server and calling those as needed via PS-Remoting. I don't like that idea one bit though.
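For reference, that setting is a one-liner on each server (run from an elevated prompt):

# Locally created scripts run unsigned; anything downloaded or remote must be signed.
Set-ExecutionPolicy RemoteSigned -Scope LocalMachine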
If you are doing remote execution, you will need to sign your scripts. A detailed step-by-step can be found here. It even covers deploying the cert via GPO so that it's domain-trusted.
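As a rough sketch of the signing step itself (assuming a code-signing certificate is already installed in your personal store; the script name is a placeholder):

# Pick up an installed code-signing cert and sign the script with it.
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
Set-AuthenticodeSignature -FilePath .\Manage-Databases.ps1 -Certificate $cert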
I would use PowerShell remoting. This would allow you to run remote commands instead of remote scripts. If you catch the bottom of this SimpleTalk article, after "Persistent Sessions", it shows the option of executing a set of commands against each server instead of the script. This should prevent having to deal with the signing issue and provide a little more control.
The only thing to deal with on remote sessions is your credentials. I have not tried this across multiple domains, only on a few stand-alone servers.
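For illustration, running ad-hoc commands that way might look like the following (the server names and the script block are placeholders; since no script file is shipped to the targets, script signing never comes into play):

$servers = 'sql01', 'sql02', 'sql03'   # placeholder names
$cred = Get-Credential                 # prompt once, reuse for every server
Invoke-Command -ComputerName $servers -Credential $cred -ScriptBlock {
    # Any database-management commands go here
    Get-Service -Name 'MSSQLSERVER' | Select-Object Name, Status
}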