Is there any way to send mails with reports attached from EMR?
I am using Amazon Web Services. I don't want to write a script inside EC2 to fetch data from EMR, add it to cron, and then send the mails daily. With any luck, is there already a job scheduler from Amazon to automate this?
Problem:
Implement daily job to generate .csv/.xls files on top of Hive
Send the report in email
Thanks in advance!
If you use AWS Data Pipeline (with EMR as a node inside it), it has OnSuccess and OnFailure alarm support. In the alarm you can configure it to send email to you.
http://aws.amazon.com/datapipeline/faqs/ (look for "How do I add alarms to an activity?").
You can't customize the email content, though. Maybe you can keep the CSVs in a predesignated location with a time/date-based naming convention. That way, when you get the success mail, you know where to look for the record.
All this you can do without writing any extra code (just configurations).
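The date-based naming convention can be sketched as a tiny shell step in the pipeline job that writes the CSV (the bucket and path are hypothetical):

```shell
# Build a date-stamped path so the success email alone tells you
# where today's report landed.
REPORT_DATE=$(date -u +%Y-%m-%d)
REPORT_PATH="s3://my-reports-bucket/daily/${REPORT_DATE}/report.csv"
echo "$REPORT_PATH"
```

When you receive the OnSuccess mail for a given day, the report for that day is always at the corresponding dated prefix.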
One alternative is to setup Oozie in your EMR cluster and create a workflow that sends email through Amazon SES.
You can read up more on Oozie on their open source page:
https://oozie.apache.org/
You might find this helpful too:
https://github.com/lila/emr-oozie-sample
and finally Amazon SES:
http://aws.amazon.com/ses/
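If you go the Oozie route, the final email step can shell out to the AWS CLI for SES. A hedged sketch — the addresses and region are placeholders, both identities must be verified in SES while the account is in sandbox mode, and attaching the CSV itself would require `aws ses send-raw-email` with a MIME message instead:

```shell
# Send a plain-text notification through Amazon SES (placeholder values).
aws ses send-email \
  --region us-east-1 \
  --from "reports@example.com" \
  --destination "ToAddresses=team@example.com" \
  --message "Subject={Data=Daily report,Charset=UTF-8},Body={Text={Data=The daily report is ready.,Charset=UTF-8}}"
```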
Related
I use Metabase to generate dashboards and reports. I need to generate files using the scheduler and, instead of sending them by e-mail, make them available over SFTP. Do you have any suggestion on how to automate this process?
I use PostgreSQL as a database source.
I can also try other open source tools if needed.
I didn't find much information on how to do it yet.
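As far as I know, Metabase's built-in subscriptions only deliver to email/Slack, so one workaround is a cron-driven script that picks up an exported file and pushes it to the SFTP drop. A sketch, assuming key-based auth is already configured; the host, user, and paths are all hypothetical:

```shell
# Nightly push of an exported CSV to an SFTP drop. Assumes the export
# file already exists (written by Metabase or by psql against the
# PostgreSQL source) and that key-based auth is set up.
EXPORT="/var/exports/dashboard-$(date +%Y-%m-%d).csv"
sftp -b - reports@sftp.example.com <<EOF
put $EXPORT /incoming/
EOF
```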
I have a requirement where I need to transfer thousands of files (around 200 GB) via IICS. How can I perform this using IICS?
The files are stored on a server that does not run a Secure Agent, which makes it complicated.
Sure, it's possible; the module to be used for exactly this purpose is Mass Ingestion. This is not a scenario for CDI.
You don't need a Secure Agent on either the source or target server. What you need instead are connections on the Secure Agent to the source and target servers; then you define the job, and that's it.
You can use SFTP; share the keys, which is a one-time activity that anyone from the Unix team can do. Once that's done, you can create the connection in the IICS admin console. After that, use Mass Ingestion through IICS to pull the files.
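For the one-time key exchange, a minimal sketch (the user and host names are hypothetical):

```shell
# Generate a key pair without a passphrase for the SFTP connection.
ssh-keygen -q -t ed25519 -N "" -f ./iics_sftp_key

# Then install the public key on the file server (one-time, typically
# done by the Unix team), e.g.:
#   ssh-copy-id -i ./iics_sftp_key.pub fileuser@files.example.com
```

The private key is then referenced in the SFTP connection you create in the IICS admin console.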
HTH.
Regards,
Tech_Seeker
I am facing the problem below; I'd appreciate it if anyone can help.
I have a Jenkins job that triggers a Java jar; the jar reads an email address from an Excel file, and that same address needs to be passed to Jenkins' username field for sending email.
Thanks
Did you take a look at parameterized jobs (if you want to trigger it manually)?
If you want to read from Excel and pass the value to another job, please take a look at this.
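If the jar hands the extracted address to a parameterized job, the trigger can go through Jenkins' remote build API. A sketch, where the job name `send-report-mail`, the parameter `RECIPIENT`, and the credentials are all hypothetical:

```shell
# Trigger a parameterized Jenkins job with the address read from Excel.
EMAIL="someone@example.com"   # value the jar extracted from the spreadsheet
curl -X POST --user "jenkins-user:api-token" \
  --data-urlencode "RECIPIENT=$EMAIL" \
  "https://jenkins.example.com/job/send-report-mail/buildWithParameters"
```

Depending on your Jenkins security settings, you may also need a CSRF crumb or an authentication token on the job.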
I'm trying to understand your problem:
1. Jenkins triggers emails to the DevOps team with the deployment task results
2. The application triggers emails that it was deployed successfully (or any other scenario you want to achieve; please indicate, and I will try to enhance this post)
If the above is the case, you should try to use Jenkins to complete the full process — build, test, deploy, and verify — then consolidate the results and send them out via email.
It's a clear cut: Jenkins is for deployment, and the app is for business logic.
There are different ways to verify your application is deployed successfully, depending on how you identify whether the app deployment succeeded.
Jenkins can detect those signals, e.g. send a ping or curl to the application and verify the response
Now only Jenkins needs to know the list of email addresses for the deployment results; you can use parameterized jobs as @Avneesh Srivastava mentioned.
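The ping/curl verification can be wrapped in a small retry helper that Jenkins runs as a shell step; the health URL in the usage comment is a placeholder:

```shell
# Retry a probe command until it succeeds or attempts run out.
probe_until_ok() {
  attempts="$1"; shift
  i=0
  while [ "$i" -lt "$attempts" ]; do
    if "$@"; then return 0; fi
    i=$((i + 1))
    sleep 1
  done
  return 1
}

# Usage from a Jenkins shell step, e.g.:
#   probe_until_ok 30 curl -fsS -o /dev/null https://app.example.com/health
```

The step's exit code then tells Jenkins whether the deployment verified, and the consolidated result goes out in the email.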
The following are areas where scheduled tasks in MarkLogic can be used:
1. Loading content. For example, periodically checking for new content from an external data source, such as a web site, web service, etc.
2. Synchronizing content. For example, when MarkLogic is used as a metadata repository, you might want to periodically check for changed data.
3. Delivering batches of content. For example, initiating an RSS feed, hourly or daily.
4. Delivering aggregated alerts, either hourly or daily.
5. Delivering reports, either daily, weekly, or monthly.
6. Polling for the completion of an asynchronous process, such as the creation of a PDF file.
My requirement is to schedule a task for bulk loading data from the local file system into a MarkLogic database, using any data loading option available in MarkLogic, such as:
1. MLCP
2. XQuery
3. REST API
4. Java API
5. WebDAV
So is there any option to execute this programmatically? I prefer MLCP since I need to perform a bulk load of data from the local file system.
Similar to your question at Execute MLCP Content Load Command as a schedule task in Marklogic , I would start with a tool like Apache Camel. There are other options - Mule, Spring Integration, and plenty of commercial/graphical ETL tools - but I've found Camel to be very easy to get started with, and you can utilize the mlcp Camel component at https://github.com/rjrudin/ml-camel-mlcp .
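If you end up scheduling mlcp directly with cron instead of an integration framework, a wrapper script keeps the crontab line simple. A sketch — the host, port, credentials, and paths are all placeholders:

```shell
#!/bin/sh
# /opt/scripts/mlcp-load.sh -- bulk load documents from the local
# file system into MarkLogic (placeholder connection details).
/opt/mlcp/bin/mlcp.sh import -mode local \
  -host localhost -port 8000 -username admin -password admin \
  -input_file_path /data/incoming -input_file_type documents \
  -output_uri_prefix /loaded/

# Scheduled from cron, e.g. nightly at 02:00:
#   0 2 * * * /opt/scripts/mlcp-load.sh >> /var/log/mlcp-load.log 2>&1
```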
How can I set a PHP script to run on a schedule? I don't have full control over the server as I am using a hosting company, though I do have a Plesk administration panel for the hosting.
Thanks
I believe PLESK has a crontab area underneath each domain.
Alternatively, if you have shell access, here's a good tutorial on editing your crontab from the command-line.
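For reference, a typical crontab entry for a scheduled PHP script looks like this (the paths are hypothetical and vary by host):

```shell
# Run report.php daily at 06:00, appending output to a log file.
0 6 * * * /usr/bin/php /var/www/vhosts/example.com/httpdocs/report.php >> /tmp/report.log 2>&1
```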
Use crontab. There's a video tutorial here:
http://www.webhostingresourcekit.com/flash/plesk-8-linux/plesk8linux_crontab.html
What you're looking for is called a cron job: an automated task that can execute an HTTP request on your server.
Since you're on shared hosting, you may not be able to set up a cron job manually. However, many web hosts offer online tools for creating cron jobs through their control panel (cPanel, Plesk, etc.).
If that isn't an option, there are some paid and some free cron services you might be able to find if you poke around long enough.