Setting with parameters for a local action script - GitHub

Is it possible to pass with: parameters to local scripts, rather than reading process.env.message?
Getting error:
The workflow is not valid. .github/workflows/message.yml (Line: n, Col: n): Unexpected value 'run' .github/workflows/message.yml (Line: n, Col: n): Required property is missing: uses
e.g.
.github/scripts/action.js
#!/usr/bin/env node
import * as core from '@actions/core'

async function run() {
  if (core.getInput('message') !== 'must-message') {
    core.setFailed('message is invalid')
  }
}

run()
.github/workflows/message.yml
name: "message"
jobs:
validate:
runs-on: ubuntu-latest
steps:
- name: check message
run: .github/scripts/action.js
with:
message: 'must-message'
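For what it's worth, the error itself points at the cause: with: is only valid on steps that have uses:, so a plain run: step cannot take inputs. A minimal sketch of one workaround, packaging the script as a local action (the .github/actions/check-message path, action name, and node version are illustrative, not from the post):
# .github/actions/check-message/action.yml (hypothetical local action)
name: 'check message'
inputs:
  message:
    description: 'message to validate'
    required: true
runs:
  using: 'node16'
  main: 'index.js'  # the script from above, moved next to this action.yml

# .github/workflows/message.yml
steps:
  - uses: actions/checkout@v2  # checkout first, so the local action exists on disk
  - name: check message
    uses: ./.github/actions/check-message
    with:
      message: 'must-message'
With that layout, core.getInput('message') works as written, because uses: steps accept with: inputs.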

Related

Boolean input not passed correctly to reusable workflow

I have an invoking and a called (reusable) workflow.
Here is the input of the reusable one:
stg_image_build:
  description: whether to build the staging image
  required: true
  type: boolean
  default: false
Here is the relevant part of the calling workflow:
push_to_stg:
  description: "Build and push image to staging registry"
  type: boolean
  default: false
  required: true
...
build-and-push-image:
  uses: path/to/reusable/workflow.yaml@master
  with:
    stg_image_build: ${{ inputs.push_to_stg }}
This fails as follows:
The template is not valid. .github/workflows/calling.yaml (Line: 72, Col: 24): Unexpected type of value '', expected type: Boolean.
The error refers to this line:
stg_image_build: ${{ inputs.push_to_stg }}
Why?
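The post leaves the "Why?" open, but one common cause (an assumption here, not something the post confirms) is that the calling workflow also runs on events where the inputs context is empty, so inputs.push_to_stg evaluates to an empty string and fails the Boolean type check. A sketch of the coercion workaround often used in that situation:
build-and-push-image:
  uses: path/to/reusable/workflow.yaml@master
  with:
    # a == comparison always yields a real boolean; an empty value becomes false
    stg_image_build: ${{ github.event.inputs.push_to_stg == 'true' }}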

GitHub Actions: if conditional in execution of step fails

I want to run a GitHub Actions step only if an env var matches a value:
- name: send slack failure
  uses: rtCamp/action-slack-notify@v2
  if: env.STATUS_MESSAGE != "Success"
However, the above syntax is flagged as an error:
The workflow is not valid. .github/workflows/file.yaml (Line: 107, Col: 13): Unexpected symbol: '"Success"'. Located at position 23 within
How can I compare the value of the env var to a given value?
The correct syntax is:
if: ${{ env.STATUS_MESSAGE != 'Success' }}
String literals must use single quotes; the ${{ and }} are optional.
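Applied to the failing step above (same action and env var as in the question; only the quoting changes):
- name: send slack failure
  uses: rtCamp/action-slack-notify@v2
  if: env.STATUS_MESSAGE != 'Success'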

Azure DevOps YAML Python script task (PythonScript@0): how to pass arrays, as the field supports only strings

I am trying to pass three parameters to a Python script, one of which is an array; this works when I run the script locally, using sys.argv to read them. However, the arguments field only supports strings as far as I can see. How can I get around this? Any ideas? Thanks.
The array is ${{ parameters.packageVersion }}
Code:
- task: PythonScript@0
  displayName: 'Modify ansible inventory files (wms_common.yml) with deployed versions'
  inputs:
    scriptSource: filePath
    scriptPath: deployment/s/azure-devops/scripts/script.py
    arguments: |
      ../../inventories/${{ variables.inventory }}/group_vars/all/wms_common.yml
      ../../inventories/central/${{ variables.inventory }}/group_vars/all/wms_common.yml
      ${{ parameters.packageVersion }}
Error:
/azure-devops/wms.full.pipeline.yml (Line: 95, Col: 18): Unable to convert from Array to String. Value: Array
Edit: Reframed question
I think the YAML below will help you convert the array to string objects and use them.
variables:
  myVariable1: 'value1'
  myVariable2: 'value2'
  system.debug: true

parameters:
  - name: InstanceArgs
    type: object
    default: [1,2]

steps:
  - task: PythonScript@0
    inputs:
      scriptSource: 'inline'
      script: |
        import argparse
        parse = argparse.ArgumentParser(description="Test")
        parse.add_argument("test1")
        parse.add_argument("test2")
        args = parse.parse_args()
        print(args.test1)
        print(args.test2)
        print('this is a test.')
      # arguments: $(myVariable1) $(myVariable2)
      arguments: ${{ join(' ', parameters.InstanceArgs) }}
You need to use the join expression in YAML to get the elements:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#join
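Note that the separator comes first: join(' ', collection). With the default InstanceArgs of [1,2] above, the expression expands to the single string 1 2, so the inline script's two positional arguments receive 1 and 2. Mapped back onto the original task, the asker's arguments block would presumably become something like this sketch (reusing the question's own parameter name):
arguments: |
  ../../inventories/${{ variables.inventory }}/group_vars/all/wms_common.yml
  ../../inventories/central/${{ variables.inventory }}/group_vars/all/wms_common.yml
  ${{ join(' ', parameters.packageVersion) }}
This flattens the array's elements into space-separated strings that sys.argv can pick up individually.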

GitHub Actions to trigger builds on new Pull Requests

I have the following workflow to trigger CMake builds on my GitHub project:
name: C/C++ CI

on:
  push:
    branches: [ master, develop ]
  pull_request:
    types: [ opened, edited, reopened, review_requested ]
    branches: [ master, develop ]

jobs:
  build:
    runs-on: ubuntu-18.04
    steps:
      - name: Install deps
        run: sudo apt-get update; sudo apt-get install python3-distutils libfastjson-dev libcurl4-gnutls-dev libssl-dev -y
      - uses: actions/checkout@v2
      - name: Run CMake
        run: mkdir build; cd build; cmake .. -DCMAKE_INSTALL_PREFIX=/home/runner/work/access/build/ext_install;
      - name: Run make
        run: cd build; make -j8
I expected it to trigger builds on new Pull Requests and to use the build status as a condition for approving the merge.
However, I'm finding it a bit challenging to achieve this. I'm sort of a newbie when it comes to GitHub Actions.
I'm able to accomplish your scenario with a combination of GitHub Actions and GitHub protected branch settings.
You've got your GitHub Actions set up correctly to run on a Pull Request with a destination branch of master or develop.
Now you have to configure your repo to prevent merging a PR if the CI fails:
On your GitHub repo, go to Settings => Branches => Add a rule => set the branch name pattern to master => enable 'Require status checks to pass before merging' => under 'Status checks found in the last week for this repository', pick the CI build you want to enforce.
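If you'd rather script those settings than click through the UI, the same rule can be applied via GitHub's branch protection REST API. A rough sketch with curl (the token, OWNER/REPO slug, and the 'build' check name are placeholders; verify the payload against the current API docs):
curl -X PUT \
  -H "Authorization: token $GITHUB_TOKEN" \
  -H "Accept: application/vnd.github+json" \
  https://api.github.com/repos/OWNER/REPO/branches/master/protection \
  -d '{"required_status_checks": {"strict": true, "contexts": ["build"]},
       "enforce_admins": false,
       "required_pull_request_reviews": null,
       "restrictions": null}'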
As of this writing there is no way to do that using only GitHub Actions, but you can do it by writing an action, in JavaScript or any of the other languages supported by GitHub Actions.
import * as core from '@actions/core'
import * as github from '@actions/github'
// CheckConclusion and getRequiredEnvironmentVariable come from the rest of the action's source
import {getRequiredEnvironmentVariable} from "./utils";

type GitHubStatus = { context: string, description?: string, state: "error" | "failure" | "pending" | "success", target_url?: string }

function commitStatusFromConclusion(conclusion: CheckConclusion): GitHubStatus {
  let status: GitHubStatus = {
    context: "branch-guard",
    description: "Checks are running...",
    state: "pending",
  };
  if (conclusion.allCompleted) {
    if (conclusion.failedCheck) {
      status.state = "failure";
      status.description = `${conclusion.failedCheck.appName} ${conclusion.failedCheck.conclusion}`;
      status.target_url = conclusion.failedCheck.url
    } else {
      status.state = "success";
      status.description = "All checks are passing!";
    }
  }
  return status;
}

export async function setStatus(repositoryOwner: string, repositoryName: string, sha: string, status: GitHubStatus): Promise<number> {
  let api = new github.GitHub(getRequiredEnvironmentVariable('GITHUB_TOKEN'));
  let params = {
    owner: repositoryOwner,
    repo: repositoryName,
    sha: sha,
  };
  let response = await api.repos.createStatus({...params, ...status});
  return response.status
}
and after you create the action you only have to call the step inside your workflow:
on:
  pull_request: # to update newly open PRs or when a PR is synced
  check_suite: # to update all PRs upon a Check Suite completion
    types: ['completed']

name: Branch Guard
jobs:
  branch-guard:
    name: Branch Guard
    if: github.event.check_suite.head_branch == 'master' || github.event.pull_request.base.ref == 'master'
    runs-on: ubuntu-latest
    steps:
      - uses: YOUR-REP/YOUR-ACTION@v0.1
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
If you want more documentation about creating JavaScript actions, see GitHub's docs.
I got this example from:
Block PR merges when Checks for target branches are failing
Hope that helps.

Inconsistent behavior in the functioning of Dataflow templates?

When I create a Dataflow template, the characteristics of runtime parameters are not persisted in the template file.
At runtime, if I try to pass a value for such a parameter, I get a 400 error.
I'm using Scio 0.3.2, Scala 2.11.11 with Apache Beam (0.6).
My parameters are the following:
trait XmlImportJobParameters extends PipelineOptions {
  def getInput: ValueProvider[String]
  def setInput(value: ValueProvider[String]): Unit
}
They are registered with this code:
val options = PipelineOptionsFactory.fromArgs(cmdlineArgs: _*).withValidation().as[XmlImportJobParameters](classOf[XmlImportJobParameters])
PipelineOptionsFactory.register(classOf[XmlImportJobParameters])
implicit val (sc, args) = ContextAndArgs(cmdlineArgs)
To create the template, I call sbt with these parameters:
run-main jobs.XmlImportJob --runner=DataflowRunner --project=MyProject --templateLocation=gs://myBucket/XmlImportTemplate --tempLocation=gs://myBucket/staging --instance=myInstance
If I pass --input explicitly, it becomes a StaticValue instead of a RuntimeValue, and this time I can see it in the template file.
The template is called from a Google Cloud Function watching a storage bucket (inspired by https://shinesolutions.com/2017/03/23/triggering-dataflow-pipelines-with-cloud-functions/):
...
dataflow.projects.templates.create({
  projectId: projectId,
  resource: {
    parameters: {
      input: `gs://${file.bucket}/${file.name}`
    },
    jobName: jobs[job].name,
    gcsPath: 'gs://MyBucket/MyTemplate'
  }
}
...
The 400 error :
problem running dataflow template, error was: { Error: (109c1c52dc52fec7): The workflow could not be created. Causes: (109c1c52dc52fb8e): Found unexpected parameters: ['input' (perhaps you meant 'runner')]
    at Request._callback (/user_code/node_modules/googleapis/node_modules/google-auth-library/lib/transporters.js:85:15)
    at Request.self.callback (/user_code/node_modules/googleapis/node_modules/request/request.js:188:22)
    at emitTwo (events.js:106:13)
    at Request.emit (events.js:191:7)
    at Request.<anonymous> (/user_code/node_modules/googleapis/node_modules/request/request.js:1171:10)
    at emitOne (events.js:96:13)
    at Request.emit (events.js:188:7)
    at IncomingMessage.<anonymous> (/user_code/node_modules/googleapis/node_modules/request/request.js:1091:12)
    at IncomingMessage.g (events.js:291:16)
    at emitNone (events.js:91:20)
  code: 400, errors: [ { message: '(109c1c52dc52fec7): The workflow could not be created. Causes: (109c1c52dc52fb8e): Found unexpected parameters: [\'input\' (perhaps you meant \'runner\')]', domain: 'global', reason: 'badRequest' } ] }
Same error when I try this:
gcloud beta dataflow jobs run xmlJobImport --gcs-location gs://MyBucket/MyTemplate --parameters input=gs://MyBucket/file.csv
=>
(gcloud.beta.dataflow.jobs.run) INVALID_ARGUMENT: (260a4f3f738a8ad9): The workflow could not be created. Causes: (260a4f3f738a8f96): Found unexpected parameters: ['input' (perhaps you meant 'runner'), 'projectid' (perhaps you meant 'project'), 'table' (perhaps you meant 'zone')]
The current settings are:
Current Settings:
appName: XmlImportJob$
autoscalingAlgorithm: THROUGHPUT_BASED
input: RuntimeValueProvider{propertyName=input, default=null, value=null}
instance: StaticValueProvider{value=staging}
jobName: xml-import-job
maxNumWorkers: 1
network: staging
numWorkers: 1
optionsId: 0
project: myProjectId
projectid: StaticValueProvider{value=myProjectId}
provenance: StaticValueProvider{value=ac3}
record: StaticValueProvider{value=BIEN}
root: StaticValueProvider{value=LISTEPA}
runner: class org.apache.beam.runners.dataflow.DataflowRunner
stableUniqueNames: WARNING
streaming: false
subnetwork: regions/europe-west1/subnetworks/net-staging
table: StaticValueProvider{value=annonce}
tempLocation: gs://-flux/staging/xmlImportJob/
templateLocation: gs://-flux-templates/XmlImportTemplate
workerMachineType: n1-standard-1
zone: europe-west1-c
Environment
Copying the answer from the issue:
Scio does not currently expose ValueProvider based APIs - we now have an issue open for this #696
A working example would be something like:
object WordCount {
  def main(cmdlineArgs: Array[String]): Unit = {
    val (sc, args) = ContextAndArgs(cmdlineArgs)
    sc.customInput("input", TextIO.read().from(sc.optionsAs[XmlImportJobParameters].getInput))
      .map(_.toUpperCase)
      .saveAsTextFile(args("output"))
    sc.close()
  }
}
For the job above, to create the template:
run-main com.example.WordCount --runner=DataflowRunner --project=<project> --templateLocation=gs://<template-bucket> --tempLocation=gs://<temp-location> --output=gs://<example-of-static-arg-output>
To submit job:
gcloud beta dataflow jobs run rav-test --gcs-location=gs://<template-bucket> --parameters=input=gs://<runtime-value>