Send function output as an email in IBM Cloud Functions - ibm-cloud

I have the following code, which gives me a CSV file:
def main(args):
    data = get_data()
    csv_output = get_csv_file_data(data)
    return {'body': csv_output.getvalue(),
            'headers': {'Content-Type': 'text/csv',
                        'Content-Disposition': 'attachment;filename=myfilename.csv'}}
Is there a way in IBM Cloud that I can send this CSV file as an email from within a function?

IBM Cloud does not provide an email service as part of its platform services (see the IBM Cloud catalog), but you could use SendGrid, either directly or through the classic infrastructure services on IBM Cloud.
This code sample, OpenWhisk Contact, uses SendGrid for sending out email with IBM Cloud Functions.
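As an illustration, here is a minimal sketch of a Python action that emails the CSV via SendGrid. It assumes the sendgrid package (v6+) is bundled with the action, a SENDGRID_API_KEY parameter is bound to it, and get_data/get_csv_file_data are the helpers from the question; the sender and recipient addresses are placeholders.

import base64
from sendgrid import SendGridAPIClient
from sendgrid.helpers.mail import (Mail, Attachment, FileContent,
                                   FileName, FileType, Disposition)

def main(args):
    data = get_data()
    csv_output = get_csv_file_data(data)

    # Build the message; addresses and subject are placeholders
    message = Mail(from_email='sender@example.com',
                   to_emails='recipient@example.com',
                   subject='CSV export',
                   plain_text_content='The requested CSV file is attached.')

    # Attach the CSV content, base64-encoded as SendGrid expects
    encoded = base64.b64encode(csv_output.getvalue().encode('utf-8')).decode()
    message.attachment = Attachment(FileContent(encoded),
                                    FileName('myfilename.csv'),
                                    FileType('text/csv'),
                                    Disposition('attachment'))

    # SENDGRID_API_KEY is assumed to be bound to the action as a parameter
    response = SendGridAPIClient(args['SENDGRID_API_KEY']).send(message)
    return {'statusCode': response.status_code}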

Related

Sending XML data to a REST API using Data Factory & self-hosted integration runtime

I am trying to POST data to a REST API using Azure Data Factory.
The REST API only accepts and returns XML.
As the ADF REST API activity only handles JSON, I will have to do this via an Azure Function.
The issue with this is that the Azure Function can't use our self-hosted integration runtime. The self-hosted integration runtime is on a VM with a static IP that has been whitelisted by the API owner, and so it is required.
I'm asking for help listing my options to perform the following in Azure:
Read data from an on-premises SQL Server, output as XML.
Pass this data row by row to the API.
Thank you for any help.

Do Amazon services support on-premise hosting?

We intend to develop an enterprise bot using Amazon Lex that will fetch responses from a SQL Server and display the results along with a visual presentation. Does Lex support on-premise deployment?
Will there be any challenges in using Lex vs Google Dialogflow (formerly known as api.ai)?
Please suggest.
The bot agent you develop will reside on AWS; you can access it in the AWS Lex console, but you cannot have it on-premise.
You can, however, use webhooks, which you can host on-premise.
You can use Amazon Lex to understand the user query and match the intent; once the intent is matched, you can perform the operations using if-else conditions and get data from your SQL Server.
This way none of your data will be on AWS.
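For illustration, below is a rough sketch of such fulfillment logic in Python, written as a Lex V1 Lambda-style handler; the same branching could sit behind an on-premise webhook. The OrderStatus intent, the OrderId slot, the pyodbc connection string, and the orders table are all hypothetical.

import pyodbc

# Hypothetical connection string to the on-premise SQL Server
CONN_STR = ('DRIVER={ODBC Driver 17 for SQL Server};'
            'SERVER=myserver;DATABASE=mydb;UID=user;PWD=secret')

def close(message):
    # Minimal Lex V1 "Close" response that ends the conversation turn
    return {'dialogAction': {'type': 'Close',
                             'fulfillmentState': 'Fulfilled',
                             'message': {'contentType': 'PlainText',
                                         'content': message}}}

def lambda_handler(event, context):
    intent = event['currentIntent']['name']
    slots = event['currentIntent']['slots']

    # Branch on the matched intent and answer from your own SQL Server
    if intent == 'OrderStatus':
        with pyodbc.connect(CONN_STR) as conn:
            row = conn.cursor().execute(
                'SELECT status FROM orders WHERE order_id = ?',
                slots['OrderId']).fetchone()
        return close('Order status: {}'.format(row[0] if row else 'not found'))

    return close('Sorry, I cannot handle that request yet.')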

How can I access IBM Cloud Compose RabbitMQ logs?

Is there a way to get IBM Cloud Compose for RabbitMQ logs using web interface or cli?
There is Syslog-NG for RabbitMQ, and there are other cloud logging services, namely Papertrail and Loggly, that provide a web interface. In addition, there are two IBM Cloud Compose API calls for logs:
Get list of available logfiles
GET /2016-07/deployments/:id/logfiles
Get details of a logfile including download link
GET /2016-07/deployments/:id/logfiles/:logfile_id
To make use of the API, you will need a handful of digital assets: a token for your account to access the IBM Cloud API and a foundation endpoint for your queries. Check this link for details on how to get the token and endpoint, and for example cURL calls.
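As an illustration, a minimal Python sketch of those two calls is shown below; the token, foundation endpoint, deployment ID, and logfile ID are placeholders obtained as described above.

import requests

# Placeholders: obtain the real token and foundation endpoint as described above
TOKEN = '<your-api-token>'
ENDPOINT = 'https://<foundation-endpoint>'
DEPLOYMENT_ID = '<deployment-id>'
HEADERS = {'Authorization': 'Bearer {}'.format(TOKEN)}

# 1. Get the list of available logfiles
logfiles = requests.get(
    '{}/2016-07/deployments/{}/logfiles'.format(ENDPOINT, DEPLOYMENT_ID),
    headers=HEADERS).json()
print(logfiles)

# 2. Get the details of one logfile, including its download link
logfile_id = '<logfile-id>'  # taken from the listing above
details = requests.get(
    '{}/2016-07/deployments/{}/logfiles/{}'.format(ENDPOINT, DEPLOYMENT_ID, logfile_id),
    headers=HEADERS).json()
print(details)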

Push data from dashDB to Watson Analytics using DataWorks?

I have an account in IBM Public Bluemix. I provisioned a dashDB instance and have inserted data into a dashDB table. I log in to Bluemix using my IBM ID (not my IBM Intranet ID).
I also have a Watson Analytics account. Please note that my Watson Analytics account/access is part of a larger team. I do not access it using the short Watson Analytics URL https://watson.analytics.ibmcloud.com. I access my account using https://watson.analytics.ibmcloud.com/home/data?loginAccountId=3ZPDZ2KL8DE0&loginTenantId=1VRPUK1QI0A5. When I go to this URL, it redirects me to the IBM Intranet authentication page and I log in using my IBM Intranet ID (not my IBM ID).
I need to push data from the Bluemix dashDB table to the Watson Analytics account that I have.
When I create a connection in Bluemix DataWorks, it does not allow me to specify the Watson Analytics URL; it allows me to enter only the user name and password. I created a connection for Watson Analytics using my IBM ID and then created an activity to move data from dashDB to Watson Analytics. When I run the activity, it fails. Please help.
You can follow the steps in this video: https://www.youtube.com/watch?v=0WAq3qVpENo
Instead of using the Bluemix DataWorks service, use the integrated DataWorks within Watson Analytics by creating the data connections with your dashDB credentials.
You are using an IBM internal userid that is not currently supported by DataWorks accessed via Bluemix. Short term, you will have to use the integrated DataWorks support. Unless you are developing a Bluemix application that needs to automatically push data, this will meet your needs.
Unless you are performing DataWorks cleansing operations on your data in dashDB, you can push your data into Watson Analytics directly via the API.
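If you go the API route, the general pattern is an authenticated HTTP upload of a dashDB extract; a rough Python sketch is below. The token handling and the upload endpoint are placeholders, so consult the Watson Analytics API documentation for the actual authentication flow and path.

import requests

# Placeholders: the real token and upload endpoint come from the
# Watson Analytics API documentation; these values are hypothetical.
TOKEN = '<watson-analytics-access-token>'
UPLOAD_URL = 'https://watson.analytics.ibmcloud.com/<data-upload-endpoint>'

# CSV extract previously pulled from the dashDB table
with open('dashdb_extract.csv', 'rb') as f:
    resp = requests.post(UPLOAD_URL,
                         headers={'Authorization': 'Bearer {}'.format(TOKEN)},
                         files={'file': ('dashdb_extract.csv', f, 'text/csv')})
print(resp.status_code)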

How to connect Google Data Studio to Google Cloud SQL

I have a Google Cloud SQL database that I can connect to with my SQL client. However, I have not been able to connect Google Data Studio to the Google Cloud SQL database with the Cloud SQL data source. I have the IP address and credentials from Google Cloud SQL.
My guess is that Google Data Studio cannot connect to Google Cloud SQL because I need to add an IP address to allow the traffic into Google Cloud SQL, but Google Data Studio does not have, or does not publicize, an IP range.
Has anyone had success connecting to Google Cloud SQL using the Cloud SQL data source in Google Data Studio?
The following ranges need to be whitelisted for Data Studio to work:
https://developers.google.com/apps-script/guides/jdbc
Since August 2017, Google Data Studio connects natively to Google Cloud SQL databases:
The current version of the connector no longer requires whitelisting, and it provides encryption; however, to use these features, you must create a new data source using the Google Cloud SQL connector.
See "Google Cloud SQL connector" in the help for a detailed description of how to create the connector.