Can we send a file as an alert to Slack via Concourse?

I am trying to send a CSV file as an alert to Slack via Concourse but am not able to do so. Any thoughts on this?
Is there a way we can send a file and get it as an alert in Slack?

Whatever file you want to send to Slack, upload it to S3 (other storage resources work too) via Concourse. To upload a file to S3, you can configure an S3 resource in Concourse like this:
- name: s3_upload_file
  type: s3
  source:
    access_key_id: <access-key-id>
    bucket: <bucket-name>
    secret_access_key: <secret-access-key>
    regexp: <file name, or a regex matching it>
After uploading the file to S3, write its S3 URL into a file in your Concourse task script, like this:
echo -e "File can be downloaded from <s3-url>" >> slack.txt
And then send this message to Slack like this:
- put: slack-alert
  params:
    always_notify: true
    channel: ((slack-channel-name))
    text_file: slack.txt   # the file written by the script above
    text: |
      $TEXT_FILE_CONTENT
So, with the URL, anyone can download the file from S3 or from wherever else you uploaded it.

Related

JMeter - How to load HTTP body text from a file directory

I have HTTP body data saved in a couple of text files in one directory. I need to load the HTTP body data from the files in that directory into a SOAP request via JMeter.
The easiest way to get the file names into a JMeter Variable is to use the Directory Listing Config plugin.
Once done, you can reference each file's content using the __FileToString() function like:
${__FileToString(${body},,)}
This way each virtual user will send the content of the next file on each iteration.

How to upload/download a file from GCS to/from an FTP server with Airflow FTPHook

I am currently trying to use the FTPHook in Airflow in order to upload and download files to/from a remote FTP server, but I'm not sure if I can use a gs:// path as part of the source/destination path.
I don't want to use a local folder within the Airflow pod since the file size might get big, so I would rather use the GCS path directly or a GCS file stream.
conn = FTPHook(ftp_conn_id='ftp_default')
conn.store_file('in', 'gs://bucket_name/file_name.txt')
Link to the FTPHook code: here
Thanks for any help!
I found a simple streaming solution to upload/download files between GCS and an SFTP server using pysftp, which I'd like to share with you.
First, I found this solution, which was working great, but its only issue was that it didn't support uploading a file from GCS to FTP, so I kept looking for something else.
Then I looked into a different approach and found this Google document, which basically allows you to stream to/from a blob file, which was exactly what I was looking for.
from airflow.hooks.base_hook import BaseHook
import pysftp

# 'bucket' is a google.cloud.storage Bucket object created elsewhere in the operator
params = BaseHook.get_connection(self.ftp_conn_id)
cnopts = pysftp.CnOpts()
cnopts.hostkeys = None
sftp = pysftp.Connection(host=params.host, username=params.login,
                         password=params.password, port=params.port,
                         cnopts=cnopts)

# This will download a file from the SFTP server and stream it into a GCS blob
with sftp.open(self.ftp_folder + '/' + file_to_load, 'r+') as remote_file:
    blob = bucket.blob(self.gcs_prefix + file_to_load)
    blob.upload_from_file(remote_file)

# This will stream a GCS blob to a file on the SFTP server
with sftp.open(self.ftp_folder + '/' + file_name, 'w+') as remote_file:
    blob = bucket.blob(fileObject['name'])
    blob.download_to_file(remote_file)
GCS does not implement FTP support, so this won't work.
It looks like the FTP hook only knows how to deal with a local file path or buffer, not one of the GCS APIs.
You might be able to find (or write) some code that reads from FTP and writes to GCS.
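Since the hook accepts a file-like buffer, one way to do exactly that is to stream through an in-memory buffer. Below is a minimal sketch, assuming the contrib-era FTPHook and the google-cloud-storage client; the connection id, bucket name, and paths are placeholders, and the whole file is held in memory, so it only suits files that fit in RAM.

import io

from airflow.contrib.hooks.ftp_hook import FTPHook
from google.cloud import storage

ftp = FTPHook(ftp_conn_id='ftp_default')          # placeholder connection id
bucket = storage.Client().bucket('bucket_name')   # placeholder bucket name

# FTP -> GCS: pull the remote file into a buffer, then upload it as a blob
buf = io.BytesIO()
ftp.retrieve_file('in/file_name.txt', buf)
buf.seek(0)
bucket.blob('file_name.txt').upload_from_file(buf)

# GCS -> FTP: download the blob into a buffer, then push it to the FTP server
buf = io.BytesIO()
bucket.blob('file_name.txt').download_to_file(buf)
buf.seek(0)
ftp.store_file('in/file_name.txt', buf)

On newer Airflow versions the import path may be airflow.providers.ftp.hooks.ftp instead of airflow.contrib.hooks.ftp_hook.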

When using the Spring Cloud Data Flow sftp source starter app, the file_name header is not found

The Spring Cloud Data Flow sftp source starter app states that the file name should be in the headers (mode=contents). However, when I connect this source to a log sink, I see a few headers (like Content-Type) but not the file_name header. I want to use this header to upload the file to S3 with the same name.
spring server: Spring Cloud Data Flow Local Server (v1.2.3.RELEASE)
my apps are all imported from here
stream definition:
stream create --definition "sftp --remote-dir=/incoming --username=myuser --password=mypwd --host=myftp.company.io --mode=contents --filename-pattern=preloaded_file_2017_ --allow-unknown-keys=true | log" --name test_sftp_log
Configuring the log application with --expression=#root --level=debug doesn't make any difference. Also, when I write my own sink that tries to access the file_name header, I get an error message that no such header exists.
Log snippets from the source and sink are in this gist.
Please follow the link below. You need to code your own Source and populate such a header manually downstream, after the FileReadingMessageSource, and only then send the message with the content and the appropriate header to the target destination.
https://github.com/spring-cloud-stream-app-starters/file/issues/9

Spring Boot GitHub config server not able to read from a folder in the repo

I was getting my hands dirty with Spring Boot, using a Git-based config repository.
This is my repo.
In it I have some config files inside the tolls-config folder, but this setup does not retrieve the config file.
However, when the repo used in the tutorial is used, it works.
API call :
localhost:8888/s1rates/default
My questions:
Do I need to modify the API call?
Should my URI be pointing to a specific folder in the repo? (I tried putting the folder URL and it gives a 500.)
Does every config file have to be in its own repo (is there no way around it)?
I looked at your repo and it seems that you misspelled the property: it should be searchPaths instead of search-paths as you have in your config.
spring:
  cloud:
    config:
      server:
        git:
          uri: https://github.com/spring-cloud-samples/config-repo
          searchPaths: foo,bar*

How do I programmatically set the email recipients for a Jenkins job

I am trying to update the email recipients for loads of Jenkins jobs with a new set of email lists, but I am unable to find the right API to do so. This could be updated in the config file directly, but I wanted to use the Jenkins APIs if any are available.
Edit: I am referring to the field below:
Post-build Actions: E-mail Notification > Recipients
Well, currently I do not have the code, but I have the thoughts.
1st step:
You can use whatever Jenkins API (REST, the Python wrapper, etc.) to dump all your job names into a text file, say job_list.txt.
Below is an example; you can find the usage at this link.
import jenkins
j = jenkins.Jenkins('http://your_url_here', 'username', 'password')
j.get_jobs()  # returns a list of dicts, one per job, including each job's name
2nd step:
As you can see, each job has a config file at a path like $JENKINS_HOME/jobs/job_name/config.xml. This can also be accessed from your browser.
So your question can be simplified into either:
"How to update the recipients value of config.xml from each job folder under $JENKINS_HOME/jobs directory".
Or:
"How to update the recipients value of config.xml from each job url like http://your_jenkins_url/job/each_job_name/config.xml".
So this can be done with any scripting language (Python, Ruby, shell, VB) or any HTTP library such as urllib2 or requests; see the sketch below.
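For example, here is a minimal sketch that takes the second route via the python-jenkins wrapper from the 1st step. The Jenkins URL, credentials, and recipient addresses are placeholders, and it assumes the jobs use the standard E-mail Notification (Mailer) publisher, which keeps the addresses in a <recipients> element of config.xml.

import xml.etree.ElementTree as ET

import jenkins

j = jenkins.Jenkins('http://your_url_here', 'username', 'password')
new_recipients = 'team-a@example.com team-b@example.com'   # placeholder addresses

for job in j.get_jobs():
    name = job['name']
    root = ET.fromstring(j.get_job_config(name))   # fetch the job's config.xml
    changed = False
    for recipients in root.iter('recipients'):     # <recipients> inside hudson.tasks.Mailer
        recipients.text = new_recipients
        changed = True
    if changed:
        j.reconfig_job(name, ET.tostring(root, encoding='unicode'))   # push the updated XML back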
3rd step:
After all the config.xml files are updated, don't forget to restart your Jenkins for the changes to take effect.
Good luck, buddy!
Edited (2015-05-27)
There is an existing Groovy script written by @edalquist which can update the email address programmatically: https://github.com/jenkinsci/jenkins-scripts/blob/master/scriptler/updateEmailAddress.groovy