I have the environment.yml shown below. I would like to read the value of the name key (core-force) and set it as a global variable in my azure-pipelines.yml file. How can I do it?
name: core-force
channels:
- conda-forge
dependencies:
- click
- Sphinx
- sphinx_rtd_theme
- numpy
- pylint
- azure-cosmos
- python=3
- flask
- pytest
- shapely
In my azure-pipelines.yml file I would like to have something like:
variables:
  tag: <the value of name from environment.yml, i.e. 'core-force'>
Please check this example:
File: vars.yml
variables:
  favoriteVeggie: 'brussels sprouts'
File: azure-pipelines.yml
variables:
- template: vars.yml # Template reference
steps:
- script: echo My favorite vegetable is ${{ variables.favoriteVeggie }}.
Please note that variables are simple strings, and if you want to use a list you may need to do some workaround in PowerShell at the place where you want to use a value from that list.
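As a small illustration of that workaround (just a sketch; the variable name myList and its values are made up), you can keep the list as a delimited string and split it in a PowerShell step:

variables:
  myList: 'click,numpy,flask'   # list kept as a plain string

steps:
- pwsh: |
    # Split the string back into a list where it is needed
    $items = "$(myList)".Split(',')
    foreach ($item in $items) { Write-Host "item: $item" }
  displayName: Use the list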
If you don't want to use the template functionality shown above, you need to:
create a separate job/stage
define a step there to read the environment.yml file and set the variable using the REST API or Azure CLI (see the sketch after this list)
create another job/stage and move your current build definition there
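As a rough sketch of that middle step (the step names are made up, and environment.yml is assumed to sit in the repository root), here is a lighter-weight variant that uses the task.setvariable logging command instead of the REST API:

steps:
- bash: |
    # Read the value of "name" from environment.yml
    NAME=$(sed -n 's/^name:[[:space:]]*//p' environment.yml)
    echo "Extracted name: $NAME"
    # Expose it as a pipeline variable called "tag" for later steps
    echo "##vso[task.setvariable variable=tag]$NAME"
  displayName: Read name from environment.yml
- script: echo The tag is $(tag)
  displayName: Use the value

A variable set this way is visible to subsequent steps in the same job; crossing job or stage boundaries needs output variables or the REST API / Azure CLI approach mentioned above.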
I found this topic on the Developer Community where you can read:
Yaml variables have always been string: string mappings. The doc appears to be currently correct, though we may have had a bug when last you visited.
We are preparing to release a feature in the near future to allow you to pass more complex structures. Stay tuned!
But I don't have more info about this.
Global variables should be stored in a separate template file. Ideally this file would live in a separate repo that other repos can refer to.
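For example, a minimal sketch of referencing a variables template that lives in another repository (the project and repository names are placeholders):

resources:
  repositories:
  - repository: templates              # alias used below
    type: git
    name: MyProject/pipeline-templates # placeholder project/repo

variables:
- template: vars.yml@templates         # vars.yml from the shared repo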
We are currently using cloud formation to create a glue job (via codebuild and codepipeline). The one thing we are stuck on is how to automate the code that goes into the glue job.
Our current relevant piece of the cloudformation template looks like this:
MyJob:
  Type: AWS::Glue::Job
  Properties:
    Command:
      Name: glueetl
      ScriptLocation: "s3://aws-glue-scripts//your-script-file.py"
    DefaultArguments:
      "--job-bookmark-option": "job-bookmark-enable"
    ExecutionProperty:
      MaxConcurrentRuns: 2
    MaxRetries: 0
    Name: cf-job1
    Role: !Ref MyJobRole
The problem is the "ScriptLocation". It looks like it is required to be an S3 location. How can we automate the upload of this? The code is in a .py file in our Git repository and I assume it is uploaded to the artifact repository as part of the codebuild process, but how do we access it?
Would like to hear how others are doing this. Thanks!
EDIT: I was able to find a similar Stack Overflow post: AWS Glue automatic job creation, but the answers there really don't give a solution or understand the question posed.
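As a rough sketch of one common pattern (not taken from the answer that follows; the bucket name my-glue-scripts, the script name glue_job.py, and the parameter name GlueScriptLocation are placeholders): have CodeBuild copy the script to S3 and pass its location to the CloudFormation stack as a parameter.

File: buildspec.yml

version: 0.2
phases:
  build:
    commands:
      # upload the Glue script from the repo to a known S3 location
      - aws s3 cp glue_job.py s3://my-glue-scripts/glue_job.py

File: template (excerpt)

Parameters:
  GlueScriptLocation:
    Type: String
    Default: "s3://my-glue-scripts/glue_job.py"
Resources:
  MyJob:
    Type: AWS::Glue::Job
    Properties:
      Command:
        Name: glueetl
        ScriptLocation: !Ref GlueScriptLocation

The CodePipeline CloudFormation deploy action can then fill in GlueScriptLocation via its parameter overrides, or you can rely on the default value.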
I've written a tool to handle the upload of stack dependencies, including CloudFormation nested templates and non-inline Lambda functions.
Currently AWS Glue is not handled since I haven't tried it in any project yet, but it should be easy to extend the tool to support Glue.
The dependencies are defined in a separate config file, and a piece of code within the tool is responsible for handling that config. Here's the sample config:
Nested CloudFormation templates:
# DEPENDS=( <ParameterName>=<NestedTemplate> )
#
# Required: Yes if there is a nested template, otherwise No
# Default: None
# Syntax:
# <ParameterName>: The name of template parameter that is referred at the
# value of nested template property `TemplateURL`.
# <NestedTemplate>: A local path or a S3 URL starting with `s3://` or
# `https://` pointing to the nested template.
# Local nested templates are going to be uploaded
# to an S3 bucket automatically during the deployment.
# Description:
# Double quote the pairs which contain whitespaces or special characters.
# Use `#` to comment out.
# ---
# Example:
# DEPENDS=(
# NestedTemplateFooURL=/path/to/nested/foo/stack.json
# NestedTemplateBarURL=/path/to/nested/bar/stack.json
# )
Lambda functions:
# LAMBDA=( <S3BucketParameterName>:<S3KeyParameterName>=<LambdaFunction> )
#
# Required: Yes if there is a non-inline Lambda Function, otherwise No
# Default: None
# Syntax:
# <S3BucketParameterName>: The name of template parameter that is referred
# at the value of Lambda property `Code.S3Bucket`.
# <S3KeyParameterName>: The name of template parameter that is referred
# at the value of Lambda property `Code.S3Key`.
# <LambdaFunction>: A local path or a S3 URL starting with `s3://` pointing
# to the Lambda Function.
# Local Lambda Functions are going to be zipped and
# uploaded to an S3 bucket automatically during the deployment.
# Description:
# Double quote the pairs which contain whitespaces or special characters.
# Use `#` to comment out.
# ---
# Example:
# LAMBDA=(
# S3BucketForLambdaFoo:S3KeyForLambdaFoo=/path/to/LambdaFoo.py
# S3BucketForLambdaBar:S3KeyForLambdaBar=s3://mybucket/LambdaBar.py
# )
The tool is written in bash and comes in two parts:
xsh: It works as a bash library framework.
xsh-lib/aws: It's a library of xsh.
The code you may need to expand is located in xsh-lib/aws/functions/cfn/deploy.sh.
The example deploy command looks like:
$ xsh aws/cfn/deploy -C /path/to/your/template-and-config-dir -t stack.json -c sample.conf
I'm considering abstracting the dependencies, such as CloudFormation templates, Lambda functions and Glue, into a single interface for both configs and handlers.
This will make it easier to add new dependency handlers to the deployer.
Update:
Scary scary, restarting the computer helped.
I will keep this post up in case someone else encounters this.
Setup:
Powershell version: 6.2.1
MacOs: Catalina
I am using classes, so I have to use "using module" to be able to reference them.
I am also using "Import-Module" to load modules.
But now I get some strange behaviour: the class constructor is called twice and the same file is opened twice in VSCode.
Loading:
Import-Module module_one.psm1
Import-Module module_two.psm1
In module_one.psm1:
Class MyClass {
    MyClass(){} # << This is called twice
}
In module_two.psm1:
using module module_one.psm1
[MyClass]::new()
Is there no way to use Classes with "Import-Module"?
I'd like to combine two (or more) analysis_options.yaml files for my project but wasn't able to find a way to do that.
This works:
include: package:pedantic/analysis_options.yaml
...
This works too:
include: package:flutter/analysis_options_user.yaml # note different "base" lint rules
...
But I'd need something like this:
include:
- package:pedantic/analysis_options.yaml
- package:flutter/analysis_options_user.yaml
...
...which results in the following error:
warning: The include file - package:pedantic/analysis_options.yaml
- package:flutter/analysis_options_user.yaml
in /home/.../analysis_options.yaml cannot be found. (include_file_not_found at [...] analysis_options.yaml:1)
Has anyone encountered/resolved same problem?
You can only include one analysis options file.
Plus, package:pedantic/analysis_options.yaml and package:flutter/analysis_options_user.yaml enforce different rules.
Example:
Pedantic does not use empty_statements
analysis_options_user.yaml uses it
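A common workaround (just a sketch, not part of the original answer) is to include one of the files and manually list the extra rules you want from the other in your own analysis_options.yaml:

include: package:pedantic/analysis_options.yaml

linter:
  rules:
    # rules copied by hand from the other options file
    - empty_statements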
I have a project written in CoffeeScript that uses AngularJS. My vendor dependencies are installed using Bower and my file structure is like this:
- assets
  - js
    - app
      - model
        - *.coffee
      - factory
        - *.coffee
      ...
      - app.coffee
      - config.coffee
      - routes.coffee
    - vendor
      - angular
      - lodash
      ...
    - dist
What I'm trying to do is the following:
Work out how I can use RequireJS's r.js to optimise my app files so that I essentially get a nicely ordered concatenated file (vendor dependencies, then my config and routes, then my app files).
Integrate this into my Grunt file.
I've tried using the r.js optimiser, but maybe I'm being too silly, as all it seems to do is copy my app files (minus the vendor dependencies) into the dist folder; it does, however, manage to optimise the CoffeeScript-generated JS files.
Has anyone got any experience with this?
I figured it out: r.js works by reading your mainConfigFile and any modules you name within your configuration. The important note here is that r.js only looks at the first require/define within your named modules and goes off to seek those dependencies. So, for example, I had one named module called app:
require ['config'], (cfg) ->
  require ['angular'], (A) ->
    A.module cfg.ngApp, []
    require ['routes'], () ->
      require [
        'factory/a-factory',
        'service/a-service',
        'controller/a-controller'
      ], () ->
        A.bootstrap document, [cfg.ngApp]
The problem here was that r.js never got past the first require statement and thus the concatenation wasn't working. When I changed this to, say (my app.coffee):
require ['config'], (cfg) ->
  require ['angular'], (A) ->
    A.module cfg.ngApp, []
    require ['bootstrap'], (bootstrap) ->
      bootstrap()
And my bootstrap.coffee:
define [
  'config',
  'angular',
  'routes',
  'factory/a-factory',
  'service/a-service',
  'controller/a-controller'
], (cfg, A, routes) ->
  class Bootstrap
    constructor: () ->
      routes()
      A.bootstrap document, [cfg.ngApp]
This meant that I only needed to define angular and bootstrap in my r.js configuration as includes and then r.js would do the rest, like so:
baseUrl: 'assets/js/app',
mainConfigFile: 'assets/js/app/config.js',
name: 'app',
include: [
  '../vendor/requirejs/require',
  'bootstrap'
],
out: 'assets/js/dist/app.js'
And now it all works fine! ~~It's a shame that I have to tell r.js to include requirejs though, maybe I've done something silly there?~~
Blimey, I'm such a dingus!
So in my HTML I was loading my concatenated script as:
<script src="assets/js/dist/app.js"></script>
When really it should be loaded like this:
<script src="assets/js/vendor/requirejs/require.js" data-main="assets/js/dist/app"></script>
D'oh!
From the r.js doc:
https://github.com/jrburke/r.js/blob/master/build/example.build.js#L322
Nested dependencies can be bundled in RequireJS >= 1.0.3 by setting findNestedDependencies to true:
//Finds require() dependencies inside a require() or define call. By default
//this value is false, because those resources should be considered dynamic/runtime
//calls. However, for some optimization scenarios, it is desirable to
//include them in the build.
//Introduced in 1.0.3. Previous versions incorrectly found the nested calls
//by default.
findNestedDependencies: false,