How to create a daily build folder using buildbot?

I want to save a copy of the nightly build, and I figured putting each build into its own daily folder would be ideal. However, I cannot use the time from the buildbot master.cfg, because that is set once, when the master is configured:
copy_files = [".\\release\\MyProgram.exe",
              ".\\install\\ChangeLog.js",
              ".\\translations.txt"]
server_dest_path_by_date = server_dest_path + "\\{0}".format(time.strftime("%Y-%m-%d"))
my_return.addStep(steps.MakeDirectory(dir=server_dest_path_by_date))
for file in copy_files:
    my_return.addStep(ShellCommand(command=["copy", file, server_dest_path_by_date, "/y"]))
How would I get the current run date for use in the destination?

You need to set the date as a Property at runtime in your build config. Do something like this (Python 2):
my_return.addStep(SetPropertyFromCommand(
    property='dateRightNow',
    command=['python', '-c', "import datetime; print datetime.datetime.now().strftime('%y-%m-%d')"]
))
For Python 3.6:
my_return.addStep(SetPropertyFromCommand(
    property='dateRightNow',
    command=['python', '-c', 'import datetime; print(datetime.datetime.now().strftime("%y-%m-%d"))']
))
and then use the property like this:
my_return.addStep(steps.MakeDirectory(
    dir=Interpolate('%(prop:dateRightNow)s')))
for file in copy_files:
    my_return.addStep(ShellCommand(command=["copy", file, Interpolate('%(prop:dateRightNow)s'), "/y"]))
Make sure you import Interpolate and SetPropertyFromCommand:
from buildbot.process.properties import Interpolate
from buildbot.steps.shell import SetPropertyFromCommand
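To land the dated folder under the original server path, you can embed the property inside the interpolated string. A minimal sketch, reusing server_dest_path and copy_files from the question:

from buildbot.plugins import steps
from buildbot.process.properties import Interpolate
from buildbot.steps.shell import ShellCommand

# Assumption: server_dest_path and copy_files are defined as in the question
dated_dest = Interpolate(server_dest_path + '\\%(prop:dateRightNow)s')

my_return.addStep(steps.MakeDirectory(dir=dated_dest))
for file in copy_files:
    my_return.addStep(ShellCommand(command=["copy", file, dated_dest, "/y"]))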

A better way is to use a custom renderer with util.Interpolate(...):
import datetime

from buildbot.plugins import steps, util

@util.renderer
def cur_date(props):
    return datetime.date.today().isoformat()
and later use it as a custom keyword in a build factory step:
cppcheck_dst = '/home/upload/%(kw:cur_date)s/'
bF.addStep(steps.MakeDirectory(dir=util.Interpolate(cppcheck_dst, cur_date=cur_date)))
bF.addStep(steps.CopyDirectory(src='build/build.scan/static/',
                               dest=util.Interpolate(cppcheck_dst, cur_date=cur_date)))


Convert CloudFormation template (YAML) to Troposphere code

I have a large CloudFormation template written in YAML, and I want to start using Troposphere instead. Is there an easy way to convert the CF template to Troposphere code?
I have noticed this script: https://github.com/cloudtools/troposphere/blob/master/troposphere/template_generator.py
This creates a Troposphere Python object, but I am not sure if it's possible to output Troposphere code from it.
You can do it by converting the CF YAML to JSON and running https://github.com/cloudtools/troposphere/blob/master/scripts/cfn2py, passing the JSON file in as an argument.
Adding to the good tip from @OllieB:
Install dependencies using pip or poetry:
https://python-poetry.org/docs/#installation
https://github.com/cloudtools/troposphere
https://github.com/awslabs/aws-cfn-template-flip
pip install 'troposphere[policy]'
pip install cfn-flip
poetry add -D 'troposphere[policy]'
poetry add -D cfn-flip
The command line conversion is something like:
cfn-flip -c -j template-vpc.yaml template-vpc.json
cfn2py template-vpc.json > template_vpc.py
WARNING: it appears that the cfn2py script might not be fully unit tested, because it can generate code that does not pass troposphere validations. I recommend adding a simple round-trip test to the end of the generated Python script, e.g.:
# Appended to the script produced by cfn2py, where `t` is the generated Template
import json

if __name__ == "__main__":
    template_py = json.loads(t.to_json())
    with open("template-vpc.json", "r") as json_fd:
        template_cfn = json.load(json_fd)
    assert template_py == template_cfn
See also https://github.com/cloudtools/troposphere/issues/1879 for an example of auto-generation of pydantic models from CFN json schemas.
from troposphere.template_generator import TemplateGenerator
import yaml

with open("aws/cloudformation/template.yaml") as f:
    source_content = yaml.load(f, Loader=yaml.BaseLoader)
template = TemplateGenerator(source_content)
This snippet will give you a template object from the Troposphere library. You can then make modifications using the Troposphere API.
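For example, a minimal sketch of inspecting the parsed template and serializing it back out (to_yaml requires the cfn-flip package):

# List the resources the generator picked up
for name, resource in template.resources.items():
    print(name, resource.resource_type)

# Emit the template back out as CloudFormation JSON or YAML
print(template.to_json())
print(template.to_yaml())  # needs cfn-flip installed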

How can I use a system environment variable inside a pyramid ini file?

I exported a variable called DBURL='postgresql://string' and I want to use it in my configuration ini file, e.g.:
[app:kotti]
sqlalchemy.url = %(DBURL)s
That's not working.
Put this in your __init__.py:
import os

def expandvars_dict(settings):
    """Expands all environment variables in a settings dictionary."""
    return dict((key, os.path.expandvars(value))
                for key, value in settings.items())
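To wire this in, call it on the settings dictionary inside your app factory; a minimal sketch following the standard Pyramid pattern:

from pyramid.config import Configurator

def main(global_config, **settings):
    # Expand ${VAR} references before handing settings to the Configurator
    config = Configurator(settings=expandvars_dict(settings))
    return config.make_wsgi_app()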
Then, to reference an environment variable you have exported in your shell, the proper syntax inside the .ini is:
sqlalchemy.url = ${DBURL}
Once you have that environment variable set within your .ini, then you can use the configparser syntax:
sqlalchemy.connection = %(sqlalchemy.url)s%(user:pass and other stuff)s
Idea stolen from https://stackoverflow.com/a/16446566/2214933
PasteDeploy (the ini format Pyramid is using here) does not support reading directly from environment variables. A couple of common options are:
1) Set that option yourself in your main:
import os

def main(global_config, **settings):
    settings['sqlalchemy.url'] = os.environ['DBURL']
    config = Configurator(settings=settings)
    ...
2) Define your ini file as a jinja2 template and have a command to render it out to ini format, and just run that as part of your deploy process.
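A minimal sketch of that render step, assuming a development.ini.jinja2 template and the jinja2 package (the file names are illustrative):

# render_ini.py -- hypothetical deploy-time helper
import os

from jinja2 import Template

with open("development.ini.jinja2") as f:
    template = Template(f.read())

with open("development.ini", "w") as f:
    f.write(template.render(env=os.environ))

The template itself would then contain a line like sqlalchemy.url = {{ env['DBURL'] }}.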

pytest implementing a logfile per test method

I would like to create a separate log file for each test method, and I would like to do this in the conftest.py file and pass the logfile instance to the test method. This way, whenever I log something in a test method, it would go to a separate log file and be very easy to analyse.
I tried the following.
Inside the conftest.py file I added this:
logs_dir = pkg_resources.resource_filename("test_results", "logs")

def pytest_runtest_setup(item):
    test_method_name = item.name
    testpath = item.parent.name.strip('.py')
    path = '%s/%s' % (logs_dir, testpath)
    if not os.path.exists(path):
        os.makedirs(path)
    log = logger.make_logger(test_method_name, path)  # make_logger creates the logfile and returns the Python logging object
The problem here is that pytest_runtest_setup does not have the ability to return anything to the test method. At least, I am not aware of it.
So, I thought of creating a fixture method inside the conftest.py file with scope="function" and calling this fixture from the test methods. But the fixture method does not know about the pytest Item object. In the case of the pytest_runtest_setup method, it receives the item parameter, and using that we are able to find out the test method name and test method path.
Please help!
I found this solution by researching further upon webh's answer. I tried to use pytest-logger, but its file structure is very rigid and it was not really useful for me. I found this code working without any plugin. It is based on set_log_path, which is an experimental feature.
Pytest 6.1.1 and Python 3.8.4
# conftest.py
# Required modules
import pytest
from pathlib import Path

# Configure logging
@pytest.hookimpl(hookwrapper=True, tryfirst=True)
def pytest_runtest_setup(item):
    config = item.config
    logging_plugin = config.pluginmanager.get_plugin("logging-plugin")
    filename = Path('pytest-logs', item._request.node.name + ".log")
    logging_plugin.set_log_path(str(filename))
    yield
Notice that the use of Path can be substituted by os.path.join. Moreover, different tests can be set up in different folders, and a record of all tests run historically can be kept by using a timestamp in the filename. One could use the following filename, for example:
# conftest.py
# Required modules
import pytest
import datetime
from pathlib import Path

# Configure logging
@pytest.hookimpl(hookwrapper=True, tryfirst=True)
def pytest_runtest_setup(item):
    ...
    filename = Path(
        'pytest-logs',
        item._request.node.name,
        f"{datetime.datetime.now().strftime('%Y%m%dT%H%M%S')}.log"
    )
    ...
Additionally, if one would like to modify the log format, one can change it in the pytest configuration file, as described in the documentation.
# pytest.ini
[pytest]
log_file_level = INFO
log_file_format = %(name)s [%(levelname)s]: %(message)s
My first stackoverflow answer!
I found the answer I was looking for.
I was able to achieve it using a function-scoped fixture like this:
@pytest.fixture(scope="function")
def log(request):
    # .strip(".py") strips characters, not the suffix, so drop the extension explicitly
    test_path = request.node.parent.name.replace(".py", "")
    test_name = request.node.name
    node_id = request.node.nodeid
    log_file_path = '%s/%s' % (logs_dir, test_path)
    if not os.path.exists(log_file_path):
        os.makedirs(log_file_path)
    logger_obj = logger.make_logger(test_name, log_file_path, node_id)
    yield logger_obj
    # Iterate over a copy, since removeHandler mutates the handler list
    for handler in list(logger_obj.handlers):
        handler.close()
        logger_obj.removeHandler(handler)
In newer pytest versions this can be achieved with set_log_path:
import os
import pytest

@pytest.fixture(autouse=True)
def manage_logs(request):
    """Set log file name same as test name"""
    request.config.pluginmanager.get_plugin("logging-plugin")\
        .set_log_path(os.path.join('log', request.node.name + '.log'))
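With that autouse fixture in conftest.py (and log_file_level configured as in the pytest.ini example above), anything a test logs through the standard logging module lands in log/<test name>.log. A hypothetical test to illustrate:

import logging

logger = logging.getLogger(__name__)

def test_example():
    # With the fixture above, this record is written to log/test_example.log
    logger.info("running test_example")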

Terraform - Pass in Variable to "Source" Parameter

I'm using Terraform in a modular fashion in order to build out my infrastructure. I do this by having a configuration file that calls in the different modules. I want to pass in an infrastructure variable which picks up which tagged version of the GitHub repository the application should be building out. Most importantly, I'm trying to figure out how to make a string concatenation happen in the "source" parameter of the configuration file.
module "athenaelb" {
source = "${concat("git::https://github.com/ORG/REPONAME.git?ref=",var.infra_version)}"
aws_access_key = "${var.aws_access_key}"
aws_secret_key = "${var.aws_secret_key}"
aws_region = "${var.aws_region}"
availability_zones = "${var.availability_zones}"
subnet_id = "${var.subnet_id}"
security_group = "${var.athenaelb_security_group}"
branch_name = "${var.branch_name}"
env = "${var.env}"
sns_topic = "${var.sns_topic}"
s3_bucket = "${var.elb_s3_bucket}"
athena_elb_sns_topic = "${var.athena_elb_sns_topic}"
infra_version = "${var.infra_version}"
}
I want it to compile and for the source to look like this (for example): git::https://github.com/ORG/REPONAME.git?ref=v1
Anyone have any thoughts on how to make this work?
Thanks,
Keren
This is not possible currently in Terraform itself.
The only way to achieve something like this is to use a separate script to interact with the git repository that Terraform clones into a subdirectory of the .terraform/modules directory and switch it to a different tag depending on which version you need. This is non-ideal since Terraform organizes these into directories based on a hash of the module path, but if you can identify the module in question it is safe to run git checkout within these repositories as long as you do not run terraform get again afterwards.
For more details and discussion on this issue, see issue #1439 in Terraform's issue tracker, where this feature was requested.
You could use envsubst or Python Jinja, with wrapper scripts in your pipeline deploy script that build the actual Terraform files from .envsubst and .jinja files before your terraform plan/apply:
https://github.com/uvoo/process-templates/tree/main/scripts
I wish Terraform would support this, but my guess is they never will, so just add some simple functions/files to your deploy scripts, which is usually the best way to deploy.
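A minimal sketch of the Jinja variant, assuming a main.tf.jinja file and an INFRA_VERSION environment variable (both names are illustrative):

# render_tf.py -- hypothetical deploy-time helper
import os

from jinja2 import Template

with open("main.tf.jinja") as f:
    rendered = Template(f.read()).render(infra_version=os.environ["INFRA_VERSION"])

with open("main.tf", "w") as f:
    f.write(rendered)

The template would then contain source = "git::https://github.com/ORG/REPONAME.git?ref={{ infra_version }}", and the script runs before terraform init/plan.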

IPython: adding a new folder to searchable template locations for `nbconvert`

The IPython documentation implies there is a way to modify the config file to include an additional path for templates.
Please advise. I have a template file with extension *.tpl which I want to use, and I do not want to have to move it around to the local directory where I do my work.
Any tips? I've searched everywhere and can't find this. It seems to only search the local directory where I am running ipython nbconvert test.ipynb --to slides --template output_toggle_html.
Thanks.
In the ipython_nbconvert_config.py file, you can enter the line c.TemplateExporter.template_path = ['.'], which reproduces the default behavior, but you can add to this list. For example, the code below adds $IPYTHONDIR/nbextensions/templates, and nbconvert will search for *.tpl files in those locations, in the order in which they appear in the list.
from os import environ

IPYTHONDIR = environ["IPYTHONDIR"]
template_rel_path = '/nbextensions/templates'
template_path = IPYTHONDIR + template_rel_path
c.TemplateExporter.template_path = [
    '.',
    template_path
]