I have a situation where I want cron to send email only on failures for a job that was started as a cron job.
Here is the snippet from my Ansible YAML file.
- name: crontab entry to keep sync
  cron:
    name: support repo sync
    minute: "00"
    hour: "03"
    weekday: "0-6"
    job: "/usr/local/bin/cronic reposync -r {{ supp_repo }} -l -p {{ repo_path }} \
          --downloadcomps --download-metadata"
    disabled: "no"
I'm using cronic so that only the failure cases produce output, and I want to send an email in those cases. I think cronvar or with_items may be useful? What is the way to solve this?
#
# Cronvar entries
#
- cronvar:
    name: MAILTO
    value: abc@gandmasti.net
The above snippet solved this.
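For completeness, here is a minimal sketch of how the two tasks can live in the same task file; the variable names and the email address are just the placeholders from the snippets above. Cron mails whatever the job prints, and cronic prints nothing on success, so mail only goes out when reposync fails:
- cronvar:
    name: MAILTO
    value: abc@gandmasti.net   # placeholder address

- name: crontab entry to keep sync
  cron:
    name: support repo sync
    minute: "00"
    hour: "03"
    weekday: "0-6"
    job: >-
      /usr/local/bin/cronic reposync -r {{ supp_repo }} -l -p {{ repo_path }}
      --downloadcomps --download-metadata
    disabled: "no"
The folded scalar (>-) is simply an alternative to the backslash continuation used above; either form produces a single crontab line.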
Related
While working with GitHub Actions, I need to understand what I would have to do in order to capture the log lines when there is an error in any step and, in a following job, send those to an external system.
Let's set up the following scenario:
jobs:
  job1:
    name: job 1
    steps:
      - name: step1
        ...
      - name: step2
        ...
      - name: step3
        ...
      ...
  job2:
    name: job 2
    needs: [job1]
    if: always()
    steps:
      - name: Send error lines to third party system
        ...
During the execution of job1, one of the steps fails and writes some error logs to stdout and stderr.
What would be the way to capture those error log lines in job2 so I can act on them, such as sending them to an external system?
The way I've addressed a similar need is to use tee:
steps:
  - name: Run Tool
    run: |
      some_tool | tee output.log
    shell: bash
That results in you getting the same logs you'd always see in the GitHub Actions console, while also persisting them to disk. If you want stderr too, do
some_tool 2>&1 | tee output.log
Then, in a later step in the same job, you can do whatever you like with those logs, using the if: ${{ failure() }} syntax:
steps:
  - name: Persist logs
    if: ${{ failure() }}
    run: |
      cat output.log | do_something_with_logs
    shell: bash
If you need to persist the logs across a job boundary, you could use artifacts.
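As a rough sketch of that artifact hand-off (the job names, artifact name, and the actions/upload-artifact / actions/download-artifact version pins are assumptions, not something from the question), job1 uploads the captured log when it fails and job2 downloads it:
jobs:
  job1:
    runs-on: ubuntu-latest
    steps:
      - name: Run Tool
        run: some_tool 2>&1 | tee output.log
        shell: bash
      - name: Upload captured log
        if: ${{ failure() }}              # only keep the log when something went wrong
        uses: actions/upload-artifact@v3
        with:
          name: job1-logs
          path: output.log
  job2:
    runs-on: ubuntu-latest
    needs: [job1]
    if: ${{ always() }}
    steps:
      - name: Download job1 logs
        uses: actions/download-artifact@v3
        with:
          name: job1-logs
      - name: Send error lines to third party system
        run: |
          # placeholder: filter the lines you care about and forward them
          grep -i "error" output.log || true
Note that if job1 can also succeed, downloading an artifact that was never uploaded will fail, so you may want to upload the log unconditionally or guard the download step.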
I want to calculate the time a workflow run takes to complete, up to a certain step.
To achieve this, I am using the "get a workflow run" API in conjunction with the datediff utility from dateutils, like this:
- name: Get workflow Run ID
  uses: octokit/request-action@v2.x
  id: workflowRunId
  with:
    route: GET /repos/$myorg/$myrepo/actions/runs/${{ github.run_id }}
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Get time diff between now and workflow run creation time
  id: workflowDuration
  run: |
    rpm -qa | grep dateutils || rpm -i https://download.opensuse.org/repositories/utilities/RHEL_7/x86_64/dateutils-0.4.4-44.1.x86_64.rpm
    NOW=`date -u +"%Y-%m-%dT%H:%M:%SZ"`
    DIFF=`ddiff -i "%Y-%m-%dT%H:%M:%SZ" "${{ fromJson(steps.workflowRunId.outputs.data).created_at }}" $NOW -f "%H°%M'%S\" to complete."`
However, if the run fails and then gets restarted, the above code outputs the diff from the time the run was first created (its created_at timestamp), not from the time it was re-triggered.
Does anyone know how to do this with the API, or even with some other method?
Thanks!
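Not a definitive answer, but one possible sketch: the same workflow-run response also carries a run_started_at field, which should reflect the latest run attempt rather than the original creation time. If that assumption holds for your re-runs, you can diff against it instead of created_at (this reuses the octokit step and the dateutils install from above):
- name: Get time diff between now and the latest run attempt start
  run: |
    # run_started_at comes from the same octokit response as created_at above;
    # the assumption here is that it is refreshed when the run is re-triggered.
    STARTED_AT="${{ fromJson(steps.workflowRunId.outputs.data).run_started_at }}"
    NOW=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
    ddiff -i "%Y-%m-%dT%H:%M:%SZ" "$STARTED_AT" "$NOW" -f "%H hours %M minutes %S seconds"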
I had some secrets in my code, and upon learning about GitHub Actions I decided to save them in the repository's Secrets menu for later use in my pipeline.
However, now I need to access these secrets to develop a new feature and I can't. Every time I try to see a value, it asks me to update the secret; there is no option to just "see" it.
I don't want to update anything, I just want to see their values.
How can I see the unencrypted values of my secrets in the project?
In order to see your GitHub Secrets, follow these steps:
1. Create a workflow that echoes all the secrets to a file.
2. As the last step of the workflow, start a tmate session.
3. Enter the GitHub Actions runner via SSH (the SSH address will be displayed in the action log) and view your secrets file.
Here is a complete working GitHub Action to do that:
name: Show Me the S3cr3tz
on: [push]
jobs:
  debug:
    name: Debug
    runs-on: ubuntu-latest
    steps:
      - name: Check out code
        uses: actions/checkout@v2
      - name: Set up secret file
        env:
          DEBUG_PASSWORD: ${{ secrets.DEBUG_PASSWORD }}
          DEBUG_SECRET_KEY: ${{ secrets.DEBUG_SECRET_KEY }}
        run: |
          echo $DEBUG_PASSWORD >> secrets.txt
          echo $DEBUG_SECRET_KEY >> secrets.txt
      - name: Run tmate
        uses: mxschmitt/action-tmate@v2
The reason for using tmate to get SSH access, instead of just running cat secrets.txt, is that GitHub Actions will automatically obfuscate any word it holds as a secret in the console output.
That said, I agree with the commenters: you should normally avoid this. Secrets are designed so that you keep them in your own secret-keeping facility and, in addition, make them readable to GitHub Actions. GitHub Secrets are not designed to be a read/write secret vault; they give only read access to the actions and write access to the admin.
The simplest approach would be:
name: Show Me the S3cr3tz
on: [push]
jobs:
  debug:
    name: Debug
    runs-on: ubuntu-latest
    steps:
      - name: Check out code
        uses: actions/checkout@v2
      - name: Set up secret file
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          ...
          ...
        run: |
          echo ${{ secrets.AWS_ACCESS_KEY_ID }} | sed 's/./& /g'
          ...
          ...
Run this action in GitHub and check its console: it displays the secret with a space between each character.
You can reveal a secret by looping over its characters in a Python shell step, like this:
- name: Set env as secret
  env:
    MY_VAL: ${{ secrets.SUPER_SECRET }}
  run: |
    import os
    for q in (os.getenv("MY_VAL")):
        print(q)
  shell: python
This will print each character to stdout like this:
s
e
c
r
e
t
I've set up an action that runs daily to check whether this solution still works; you can see the status here.
No solution mentioned here worked for me. Instead of using tmate or trying to print the secret to the console, you can just send an HTTP request containing your secret.
Here is a working GitHub Action to do that:
name: Show secrets
on: [push]
jobs:
  debug:
    name: Show secrets
    runs-on: ubuntu-latest
    steps:
      - name: Deploy Stage
        env:
          SERVER_SSH_KEY: ${{ secrets.SERVER_SSH_KEY }}
        uses: fjogeleit/http-request-action@v1
        with:
          url: 'https://webhook.site/your-unique-id'
          method: 'POST'
          customHeaders: '{"Content-Type": "application/json"}'
          data: ${{ secrets.SERVER_SSH_KEY }}
The example above uses webhook.site, which is very easy to set up.
But do not forget the important disclaimer from DannyB's answer:
That said, I agree with the commenters: you should normally avoid this. Secrets are designed so that you keep them in your own secret-keeping facility and, in addition, make them readable to GitHub Actions. GitHub Secrets are not designed to be a read/write secret vault; they give only read access to the actions and write access to the admin.
My use case was to recover a lost SSH key for one of my remote dev servers.
This is another way to print out your secrets. Be careful: never, ever do this in a production environment.
- name: Step 1 - Echo out a GitHub Actions Secret to the logs
  run: |
    echo "The GitHub Action Secret will be masked: "
    echo ${{ secrets.SECRET_TOKEN }}
    echo "Trick to echo GitHub Actions Secret: "
    echo ${{ secrets.SECRET_TOKEN }} | sed 's/./& /g'
run: echo -n "${{ secrets.MY_SECRET }}" >> foo && cut -c1-1 foo && cut -c 2- foo
Downside: it splits the output into two parts and prints *** at the end, e.g. for the secret value my super secret:
m
y super secret***
Tested in Q1 2023. Full example:
jobs:
  example-job:
    environment: dev
    steps:
      - name: Uncover secret
        run: echo -n "${{ secrets.MY_SECRET }}" >> foo && cut -c1-3 foo && cut -c4- foo
Tips:
- Carefully check the environment name and the secret name in your repo settings.
- If you are using reusable workflows, you need to inherit secrets (see the sketch after this list): https://github.blog/changelog/2022-05-03-github-actions-simplify-using-secrets-with-reusable-workflows/
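A minimal sketch of that secrets: inherit point, with made-up workflow file names and a hypothetical MY_SECRET; the caller passes all of its secrets through so the reusable workflow can read them:
# caller workflow, e.g. .github/workflows/main.yml
jobs:
  call-reusable:
    uses: ./.github/workflows/reusable.yml
    secrets: inherit          # without this, the called workflow sees no secrets

# called workflow, e.g. .github/workflows/reusable.yml
on:
  workflow_call:
jobs:
  use-secret:
    runs-on: ubuntu-latest
    steps:
      - run: test -n "${{ secrets.MY_SECRET }}" && echo "MY_SECRET is available"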
For (mainly) pedagogical reasons, I'm trying to run this workflow in GitHub actions:
name: "We 🎔 Perl"
on:
issues:
types: [opened, edited, milestoned]
jobs:
seasonal_greetings:
runs-on: windows-latest
steps:
- name: Maybe greet
id: maybe-greet
env:
HEY: "Hey you!"
GREETING: "Merry Xmas to you too!"
BODY: ${{ github.event.issue.body }}
run: |
$output=(perl -e 'print ($ENV{BODY} =~ /Merry/)?$ENV{GREETING}:$ENV{HEY};')
Write-Output "::set-output name=GREET::$output"
produce_comment:
name: Respond to issue
runs-on: ubuntu-latest
steps:
- name: Dump job context
env:
JOB_CONTEXT: ${{ jobs.maybe-greet.steps.id }}
run: echo "$JOB_CONTEXT"
I need two different jobs, since they use different contexts (operating systems), but I need to get the output of a step in the first job into the second job. I have tried several combinations of the jobs context as documented here, but there does not seem to be any way to do that. Apparently, jobs is just the name of a YAML key without a real context, and the job context contains only the success or failure status. Any ideas?
Check the "GitHub Actions: New workflow features" from April 2020, which could help in your case (to reference step outputs from previous jobs)
Job outputs
You can specify a set of outputs that you want to pass to subsequent jobs and then access those values from your needs context.
See documentation:
jobs.<job_id>.outputs
A map of outputs for a job.
Job outputs are available to all downstream jobs that depend on this job.
For more information on defining job dependencies, see jobs.<job_id>.needs.
Job outputs are strings, and job outputs containing expressions are evaluated on the runner at the end of each job. Outputs containing secrets are redacted on the runner and not sent to GitHub Actions.
To use job outputs in a dependent job, you can use the needs context.
For more information, see "Context and expression syntax for GitHub Actions."
Example
jobs:
  job1:
    runs-on: ubuntu-latest
    # Map a step output to a job output
    outputs:
      output1: ${{ steps.step1.outputs.test }}
      output2: ${{ steps.step2.outputs.test }}
    steps:
      - id: step1
        run: echo "test=hello" >> $GITHUB_OUTPUT
      - id: step2
        run: echo "test=world" >> $GITHUB_OUTPUT
  job2:
    runs-on: ubuntu-latest
    needs: job1
    steps:
      - run: echo ${{ needs.job1.outputs.output1 }} ${{ needs.job1.outputs.output2 }}
Note the use of $GITHUB_OUTPUT instead of the older ::set-output, which is now (Oct. 2022) deprecated:
To avoid untrusted logged data to use set-state and set-output workflow commands without the intention of the workflow author we have introduced a new set of environment files to manage state and output.
Jesse Adelman adds in the comments:
This seems to not work well for anything beyond a static string.
How, for example, would I take the multiline text output of a step (say, I'm running pytest or similar) and use that output in another job?
- Either write the multi-line text to a file (jschmitter's comment),
- or base64-encode the output and then decode it in the next job (Nate Karasch's comment); a sketch of this is shown after this list.
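A sketch of the base64 route, using hypothetical job and step names and a stand-in for the real test output; base64 -w0 keeps the encoded value on a single line so it can travel as a regular job output:
jobs:
  test:
    runs-on: ubuntu-latest
    outputs:
      log_b64: ${{ steps.capture.outputs.log_b64 }}
    steps:
      - name: Produce some multiline text
        run: printf 'line one\nline two\n' > output.log   # stand-in for your real test output
      - name: Encode it as a job output
        id: capture
        run: echo "log_b64=$(base64 -w0 output.log)" >> $GITHUB_OUTPUT
  report:
    runs-on: ubuntu-latest
    needs: test
    steps:
      - name: Decode it back to multiline text
        run: |
          echo "${{ needs.test.outputs.log_b64 }}" | base64 -d > output.log
          cat output.log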
Update: It's now possible to set job outputs that can be used to transfer string values to downstream jobs. See this answer.
What follows is the original answer. These techniques might still be useful for some use cases.
1. Write the data to a file and use actions/upload-artifact and actions/download-artifact. A bit awkward, but it works.
2. Create a repository dispatch event and send the data to a second workflow. I prefer this method personally, but the downside is that it needs a repo-scoped PAT.
Here is an example of how the second way could work. It uses the repository-dispatch action.
name: "We 🎔 Perl"
on:
issues:
types: [opened, edited, milestoned]
jobs:
seasonal_greetings:
runs-on: windows-latest
steps:
- name: Maybe greet
id: maybe-greet
env:
HEY: "Hey you!"
GREETING: "Merry Xmas to you too!"
BODY: ${{ github.event.issue.body }}
run: |
$output=(perl -e 'print ($ENV{BODY} =~ /Merry/)?$ENV{GREETING}:$ENV{HEY};')
Write-Output "::set-output name=GREET::$output"
- name: Repository Dispatch
uses: peter-evans/repository-dispatch#v1
with:
token: ${{ secrets.REPO_ACCESS_TOKEN }}
event-type: my-event
client-payload: '{"greet": "${{ steps.maybe-greet.outputs.GREET }}"}'
This triggers a repository dispatch workflow in the same repository.
name: Repository Dispatch
on:
  repository_dispatch:
    types: [my-event]
jobs:
  myEvent:
    runs-on: ubuntu-latest
    steps:
      - run: echo ${{ github.event.client_payload.greet }}
In my case I wanted to pass an entire build/artifact, not just a string:
name: Build something on Ubuntu then use it on MacOS
on:
  workflow_dispatch:
    # Allows for manual build trigger
jobs:
  buildUbuntuProject:
    name: Builds the project on Ubuntu (Put your stuff here)
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: some/compile-action@v99
      - uses: actions/upload-artifact@v2
        # Upload the artifact so the MacOS runner can do something with it
        with:
          name: CompiledProject
          path: pathToCompiledProject
  doSomethingOnMacOS:
    name: Runs the program on MacOS or something
    runs-on: macos-latest
    needs: buildUbuntuProject # Needed so the job waits for the Ubuntu job to finish
    steps:
      - uses: actions/download-artifact@master
        with:
          name: CompiledProject
          path: somewhereToPutItOnMacOSRunner
      - run: ls somewhereToPutItOnMacOSRunner # See the artifact on the MacOS runner
It is possible to capture the entire output (and return code) of a command within a run step, which I've written up here to hopefully save someone else the headache. Fair warning, it requires a lot of shell trickery and a multiline run to ensure everything happens within a single shell instance.
In my case, I needed to invoke a script and capture the entirety of its stdout for use in a later step, as well as preserve its outcome for error checking:
# capture stdout from script
SCRIPT_OUTPUT=$(./do-something.sh)
# capture exit code as well
SCRIPT_RC=$?
# FYI, this would get stdout AND stderr
SCRIPT_ALL_OUTPUT=$(./do-something.sh 2>&1)
Since GitHub's job outputs only seem to be able to capture a single line of text, I also had to escape any newlines in the output:
echo "::set-output name=stdout::${SCRIPT_OUTPUT//$'\n'/\\n}"
Additionally, I needed to ultimately return the script's exit code to correctly indicate whether it failed. The whole shebang ends up looking like this:
- name: A run step with stdout as a captured output
  id: myscript
  run: |
    # run in subshell, capturing stdout to var
    SCRIPT_OUTPUT=$(./do-something.sh)
    # capture exit code too
    SCRIPT_RC=$?
    # print a single line output for github
    echo "::set-output name=stdout::${SCRIPT_OUTPUT//$'\n'/\\n}"
    # exit with the script status
    exit $SCRIPT_RC
  continue-on-error: true
- name: Add above outcome and output as an issue comment
  uses: actions/github-script@v5
  env:
    STEP_OUTPUT: ${{ steps.myscript.outputs.stdout }}
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}
    script: |
      // indicates whether the script succeeded or not
      let comment = `Script finished with \`${{ steps.myscript.outcome }}\`\n`;
      // adds stdout, unescaping newlines again to make it readable
      comment += `<details><summary>Show Output</summary>
      \`\`\`
      ${process.env.STEP_OUTPUT.replace(/\\n/g, '\n')}
      \`\`\`
      </details>`;
      // add the whole damn thing as an issue comment
      github.rest.issues.createComment({
        issue_number: context.issue.number,
        owner: context.repo.owner,
        repo: context.repo.repo,
        body: comment
      })
Edit: there is also an action to accomplish this with much less bootstrapping, which I only just found.
October 2022 update: GitHub is deprecating set-output and recommends using the GITHUB_OUTPUT environment file instead for defining outputs that are then referenced in other steps and jobs.
An example from the docs:
- name: Set color
  id: random-color-generator
  run: echo "SELECTED_COLOR=green" >> $GITHUB_OUTPUT
- name: Get color
  run: echo "The selected color is ${{ steps.random-color-generator.outputs.SELECTED_COLOR }}"
I have this GitHub Actions workflow which runs tests, and now I am integrating a Slack notification into it. I want to get the output of the Run tests step and send it as a message in the Slack step.
- name: Run tests
  run: |
    mix compile --warnings-as-errors
    mix format --check-formatted
    mix ecto.create
    mix ecto.migrate
    mix test
  env:
    MIX_ENV: test
    PGHOST: localhost
    PGUSER: postgres
- name: Slack Notification
  uses: rtCamp/action-slack-notify@master
  env:
    SLACK_MESSAGE: Run tests output
    SLACK_TITLE: CI Test Suite
    SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
You need to do three things:
1. Add an id to the step you want the output from.
2. Create the outputs using the GITHUB_OUTPUT environment variable.
3. Use the id and the output name in another step to get the outputs, then join them into one message for Slack.
- name: Run tests
  id: run_tests
  run: |
    echo "mix-compile--warnings-as-errors=$(mix compile --warnings-as-errors)\n" >> $GITHUB_OUTPUT
    echo "mix-format--check-formatted=$(mix format --check-formatted)\n" >> $GITHUB_OUTPUT
    echo "mix-ecto_create=$(mix ecto.create)\n" >> $GITHUB_OUTPUT
    echo "mix-ecto_migrate=$(mix ecto.migrate)\n" >> $GITHUB_OUTPUT
    echo "mix-test=$(mix test)\n" >> $GITHUB_OUTPUT
  env:
    MIX_ENV: test
    PGHOST: localhost
    PGUSER: postgres
- name: Slack Notification
  uses: rtCamp/action-slack-notify@v2
  env:
    SLACK_MESSAGE: ${{ join(steps.run_tests.outputs.*, '\n') }}
    SLACK_TITLE: CI Test Suite
    SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
See the metadata syntax documentation for the description of output names.
The problem with the currently accepted answer is that the step result will always be success, since the test execution result is masked by the echo command.
This modification to the last command should preserve the original exit status:
mix test 2>&1 | tee test.log
result_code=${PIPESTATUS[0]}
echo "::set-output name=mix-test::$(cat test.log)"
exit $result_code
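For the same idea with the newer GITHUB_OUTPUT file instead of the deprecated set-output (a sketch only, using the delimiter syntax to carry the multiline log):
- name: Run tests
  id: run_tests
  run: |
    mix test 2>&1 | tee test.log
    result_code=${PIPESTATUS[0]}
    {
      echo "mix-test<<EOF"    # delimiter syntax allows a multiline output value
      cat test.log
      echo "EOF"
    } >> "$GITHUB_OUTPUT"
    exit $result_code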
I made an action with the same interface as run that stores stdout and stderr in output variables to maybe simplify some cases like this:
- name: Run tests
  uses: mathiasvr/command-output@v1
  id: tests
  with:
    run: |
      mix compile --warnings-as-errors
      mix format --check-formatted
      mix ecto.create
      mix ecto.migrate
      mix test
- name: Slack Notification
  uses: rtCamp/action-slack-notify@master
  env:
    SLACK_MESSAGE: ${{ steps.tests.outputs.stdout }}
    SLACK_TITLE: CI Test Suite
    SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
I just wanted to add that @smac89's solution was helpful but didn't quite work for me. I'm using a different Slack action (pullreminders/slack-action) to build more specific content. I found that I was getting single quotes where each newline was, and the leading spaces on each line were being truncated. After reading https://github.com/actions/toolkit/issues/403 and playing around, I found that in my case the newlines needed to actually be escaped in the output (a literal \n), so I replaced \n characters with \\n. Then I replaced regular space characters with a Unicode 'En Space' character.
Here's what worked:
Bash Run Step:
Tools/get-changed-fields.sh src/objects origin/${{ env.DIFF_BRANCH }} > changed-fields.out
output="$(cat changed-fields.out)"
output="${output//$'\n'/\\n}"
output="${output// / }" # replace regular space with 'En Space'
echo "::set-output name=changed-fields-output::$output"
Slack Notification Step:
- name: Changed Fields Slack Notification
  if: ${{ success() && steps.summarize-changed-fields.outputs.changed-fields-output != '' && steps.changed-fields-cache.outputs.cache-hit != 'true' }}
  env:
    SLACK_BOT_TOKEN: ${{ secrets.SLACK_BOT_TOKEN }}
  uses: pullreminders/slack-action@master
  with:
    args: '{\"channel\":\"${{ env.SUCCESS_SLACK_CHANNEL }}\",\"attachments\":[{\"color\":\"#36a64f\",\"title\":\"Changed Fields Report:\",\"author_name\":\"${{ github.workflow }} #${{ github.run_number }}: ${{ env.BRANCH }} -> ${{ env.TARGET_ORG }} (by: ${{ github.actor }})\",\"author_link\":\"${{ github.server_url }}/${{ github.repository }}/runs/${{ github.run_id }}\",\"text\":\"```\n${{ steps.summarize-changed-fields.outputs.changed-fields-output }}\n```\"}]}'