When we run the Gradle OpenAPI Generator task, part of the generated output is an infrastructure folder with all of the shared/common files required for the client to build.
When I start specifying the apis and models that I want generated, this infrastructure folder is no longer created, so nothing builds!
What am I doing wrong?
task buildMyClient(type: org.openapitools.generator.gradle.plugin.tasks.GenerateTask) {
    generatorName = "kotlin"
    inputSpec = "$rootDir/openapi/specs/myservice.yaml".toString()
    outputDir = "$buildDir/generated".toString()
    apiPackage = "com.mycompany.myservice"
    packageName = "com.mycompany.myservice"
    modelPackage = "com.mycompany.myservice.model"
    globalProperties = [
        apis  : "MyController",
        models: "MyApiRequest,MyApiResponse,EmbeddedApiResponse"
    ]
    configOptions = [
        "enumPropertyNaming": "UPPERCASE"
    ]
}
Here are the docs on the gradle plugin:
https://github.com/OpenAPITools/openapi-generator/blob/master/modules/openapi-generator-gradle-plugin/README.adoc
Here are the docs on global properties:
https://openapi-generator.tech/docs/globals/
It seems that I might have to specify all supporting files explicitly? How do I do that? Isn't there some way to use the default?
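Based on the globals documentation linked above, one thing worth trying (I have not verified it against this exact plugin version) is adding a supportingFiles entry to globalProperties; the docs describe an empty value as meaning "generate all supporting files":

task buildMyClient(type: org.openapitools.generator.gradle.plugin.tasks.GenerateTask) {
    generatorName = "kotlin"
    inputSpec = "$rootDir/openapi/specs/myservice.yaml".toString()
    outputDir = "$buildDir/generated".toString()
    apiPackage = "com.mycompany.myservice"
    packageName = "com.mycompany.myservice"
    modelPackage = "com.mycompany.myservice.model"
    globalProperties = [
        apis           : "MyController",
        models         : "MyApiRequest,MyApiResponse,EmbeddedApiResponse",
        // empty value = generate all supporting files (per the globals docs);
        // a comma-separated list of file names can restrict it further
        supportingFiles: ""
    ]
    configOptions = [
        "enumPropertyNaming": "UPPERCASE"
    ]
}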
I want to run feature files in a desired order or sequence, for example:
tags: "#ProtractorScenario or #CucumberScenario"
But the Cucumber scenario is getting executed first. Can someone guide me on this?
Note: Cucumber executes scenarios based on the alphabetical order of the feature files in the folder.
Also, with 50+ feature files, what would be the best way to define the sequencing of Cucumber feature files?
To have reliable tests, your tests should be independent and not rely on the order they are run in. A test shouldn't depend on the system already being in a certain state, as this leads to flaky tests. Each of your tests should set up the expected state (and tear it down!), so they can be run independently.
Below is how Protractor executes Cucumber feature files:

1. Protractor finds all feature files specified in specs and saves their absolute file paths into an array; let's call it feature_list.
2. Protractor starts a session (starts a browser instance).
3. Protractor generates a Cucumber CLI like the one below and executes it to hand running control over to Cucumber:

   ./node_modules/.bin/cucumber --require xxx --format xxx feature1,feature2,....featureN

   where feature1,feature2,....featureN is calculated by feature_list.join(',').

From the above, we can see that the only opportunity to change the order is to give an already-ordered feature_list to Protractor's specs, for example as shown below.
Note: every member of feature_list should be the absolute/relative path of a single feature file; folders and wildcards are not recommended in the path.
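For illustration, an explicitly ordered specs list is what ultimately controls execution order (the file names below are invented for the example):

// Minimal Protractor config with an explicit, ordered feature list.
exports.config = {
    framework: 'custom',
    frameworkPath: require.resolve('protractor-cucumber-framework'),
    specs: [
        './features/01-login.feature',    // executed first
        './features/02-checkout.feature'  // executed second
    ],
    cucumberOpts: {
        require: ['./steps/**/*.js']
    }
};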
You can get a solution from my GitHub: spec.filter.js, which implements:
1. filtering feature files by cucumberOpts.tags
2. ordering the result of step 1 by priority
Guide to using spec.filter.js:
// protractor conf file
const specFilter = require('./spec.filter.js');

var config = {
    seleniumAddress: 'xxxxx',
    capabilities: 'xxxx',
    framework: 'custom',
    frameworkPath: require.resolve('protractor-cucumber-framework'),
    ignoreUncaughtExceptions: true,
    specs: [
        './aa/**/*.feature',
        './bb/**/*.feature'
    ],
    cucumberOpts: {
        require: [
            'xxx'
        ],
        priorities: {
            // a feature tagged #SPGC-21542 or #SPGC-21944 or #SPGC-21946
            // gets priority 1
            '1': ['#SPGC-21542 or #SPGC-21944', '#SPGC-21946'],
            // a feature tagged #SPGC-22055 gets priority 2;
            // the highest-priority features are put at the front of the
            // `specs` list and are executed first.
            '2': ['#SPGC-22055']
        },
        tags: ""
    }
    ....
};
exports.config = specFilter(config);
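The linked spec.filter.js is the authoritative implementation; the following is only a rough, untested sketch of the same idea. It assumes the glob package is available and only understands simple "or"-joined tag expressions:

// spec.filter.sketch.js -- NOT the author's spec.filter.js, just an illustration.
const glob = require('glob');
const fs = require('fs');
const path = require('path');

module.exports = function specFilter(config) {
    // 1. Expand the spec globs into a flat list of individual feature files.
    const files = config.specs
        .map(pattern => glob.sync(pattern))
        .reduce((all, matches) => all.concat(matches), [])
        .map(file => path.resolve(file));

    // 2. Keep only features containing one of the wanted tags (naive check).
    const wantedTags = (config.cucumberOpts.tags || '')
        .split(/\s+or\s+/)
        .filter(Boolean);
    const hasTag = (file, tag) => fs.readFileSync(file, 'utf8').includes(tag);
    const filtered = wantedTags.length
        ? files.filter(file => wantedTags.some(tag => hasTag(file, tag)))
        : files;

    // 3. Sort by priority: priority 1 before priority 2, unmatched files last.
    const priorities = config.cucumberOpts.priorities || {};
    const levels = Object.keys(priorities).map(Number).sort((a, b) => a - b);
    const priorityOf = file => {
        for (const level of levels) {
            if (priorities[level].some(expr =>
                expr.split(/\s+or\s+/).some(tag => hasTag(file, tag)))) {
                return level;
            }
        }
        return Number.MAX_SAFE_INTEGER;
    };
    filtered.sort((a, b) => priorityOf(a) - priorityOf(b));

    // Hand the ordered, explicit file list back to Protractor.
    config.specs = filtered;
    return config;
};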
I'm using the Gradle Tooling API to run functional tests for my own build script.
I would like to access tasks' properties, e.g. the destinationDir of a JavaCompile task, and I don't know how to accomplish this.
Simple example:
Snippet from my build script (I defined a SourceSet 'openjpa'):
compileOpenjpaJava {
    destinationDir = file(getOpenjpaClassesDir())
}

private String getOpenjpaClassesDir() {
    return "build/classes_openjpa"
}
In my functional test I read about a way to access the tasks, but I cannot access the destinationDir property.
GradleProject project = connection.getModel(GradleProject.class);
project.tasks.each { myTask ->
    if ("compileOpenjpaJava" == myTask.name) {
        return myTask.destinationDir.absolutePath // brings a runtime error like: unknown property 'destinationDir'
    }
}
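As a possible fallback (not the Tooling API model approach asked about): run the task with Gradle TestKit in the functional test and assert on the expected output directory instead. A rough sketch, assuming the gradle-test-kit dependency is on the test classpath and using a placeholder project path:

// Alternative for a functional test: run the task and check the directory the
// build script above configures ("build/classes_openjpa").
import org.gradle.testkit.runner.GradleRunner
import org.gradle.testkit.runner.TaskOutcome

def projectDir = new File('path/to/project-under-test')   // placeholder path

def result = GradleRunner.create()
        .withProjectDir(projectDir)
        .withArguments('compileOpenjpaJava')
        .build()

assert result.task(':compileOpenjpaJava').outcome in [TaskOutcome.SUCCESS, TaskOutcome.UP_TO_DATE]
assert new File(projectDir, 'build/classes_openjpa').isDirectory()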
A similar question w/o answers is here: Gradle tooling api get task outputs
Is it possible at all to access tasks' properties?
Thanks
Jan
I'm using Terraform in a modular fashion to build out my infrastructure: a configuration file calls in the different modules. I want to pass an infrastructure variable that picks up which tagged version of the GitHub repository the application should be built from. Most importantly, I'm trying to figure out how to concatenate a string in the "source" argument of the configuration file.
module "athenaelb" {
source = "${concat("git::https://github.com/ORG/REPONAME.git?ref=",var.infra_version)}"
aws_access_key = "${var.aws_access_key}"
aws_secret_key = "${var.aws_secret_key}"
aws_region = "${var.aws_region}"
availability_zones = "${var.availability_zones}"
subnet_id = "${var.subnet_id}"
security_group = "${var.athenaelb_security_group}"
branch_name = "${var.branch_name}"
env = "${var.env}"
sns_topic = "${var.sns_topic}"
s3_bucket = "${var.elb_s3_bucket}"
athena_elb_sns_topic = "${var.athena_elb_sns_topic}"
infra_version = "${var.infra_version}"
}
I want it to compile and for the source to look like this (for example): git::https://github.com/ORG/REPONAME.git?ref=v1
Anyone have any thoughts on how to make this work?
Thanks,
Keren
This is not possible currently in Terraform itself.
The only way to achieve something like this is to use a separate script that interacts with the git repository Terraform clones into a subdirectory of the .terraform/modules directory and switches it to a different tag, depending on which version you need. This is non-ideal, since Terraform organizes these into directories based on a hash of the module path, but if you can identify the module in question it is safe to run git checkout within these repositories, as long as you do not run terraform get again afterwards.
For more details and discussion on this issue, see issue #1439 in Terraform's issue tracker, where this feature was requested.
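A rough sketch of that manual workaround; <module-hash> is a placeholder for the hash-named directory Terraform creates for the module:

terraform get                          # let Terraform clone the module sources
cd .terraform/modules/<module-hash>    # identify the right directory by hand
git checkout v1                        # or the tag your pipeline passes in
cd -
# Do not run `terraform get` again afterwards, or the checkout may be replaced.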
You could use envsubst or Python Jinja, and call these wrapper scripts from your pipeline's deploy script to build the actual Terraform files from .envsubst and .jinja templates before terraform plan/apply:
https://github.com/uvoo/process-templates/tree/main/scripts
I wish Terraform would support this, but my guess is they never will, so just add some simple functions/files to your deploy scripts, which is usually the best way to deploy.
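As a concrete illustration of the envsubst variant (the template file name and the INFRA_VERSION variable are invented for this example):

# main.tf.tpl -- template file (not valid Terraform until rendered):
#
#   module "athenaelb" {
#     source = "git::https://github.com/ORG/REPONAME.git?ref=${INFRA_VERSION}"
#     ...
#   }

# In the pipeline's deploy script, render the template before plan/apply:
export INFRA_VERSION="v1"
envsubst '${INFRA_VERSION}' < main.tf.tpl > main.tf
terraform init && terraform plan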
I'm trying to first uninstall a package, then install the latest version of that same package. Simple, you would think, but when I include the following code in my DSC configuration:
### remove old product setup
Package removeOldProduct {
    Ensure    = 'Absent'
    Name      = 'My Product Name'
    Path      = ""
    ProductId = ""
}

### now install the latest product setup
Package productSetup {
    Ensure    = 'Present'
    Name      = 'My Product Name'
    Path      = "$productShare\Repository\product.msi"
    ProductId = ""
    Arguments = "ACCEPT_EULA=1 /q"
    DependsOn = '[Package]MsSql'
}
While creating the .mof file, I receive the following error:
Test-ConflictingResources : A conflict was detected between resources '[Package]productSetup' and '[Package]removeOldProduct' in node 'myNodeServer'. Resources have identical key properties but there are differences in the following non-key properties: 'Path;Ensure;Arguments'.
I don't want to use a Script resource to process my uninstall. What am I doing wrong here?
Your configuration is supposed to be idempotent, generally, so this doesn't make a lot of sense. You would be uninstalling and reinstalling the package every time the configuration is applied (every 30 minutes or whatever it's set to).
An MSI installer should support upgrading automatically, which means you would just ensure the installation of the (newer) MSI.
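For illustration, that would collapse the two resources into a single idempotent one. A sketch, with a placeholder GUID standing in for the newer MSI's real product code:

### single, idempotent package resource; DSC checks whether this exact product
### (by ProductId) is installed and only runs the MSI when it is not
Package productSetup {
    Ensure    = 'Present'
    Name      = 'My Product Name'
    Path      = "$productShare\Repository\product.msi"
    ProductId = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee'   # placeholder GUID of the new MSI
    Arguments = 'ACCEPT_EULA=1 /q'
    DependsOn = '[Package]MsSql'
}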
I have created a plugin for Gradle to deploy to my company's OpenVMS directory structures, which adds deployDev, deployTest, and, when credentials are provided, deployProd tasks to an application build. I have extracted our configuration files for all environments to a separate project so that we can deploy configuration separately from the application.
These custom tasks depend on the application plugin's distZip task and I haven't found a good way to get the appropriate configuration files into the zip based on the called task (e.g. deployDev includes dev config).
I have tried having the deploy* tasks copy configuration files during the configuration phase with:
task distZip(type: Zip, overwrite: true) {
    archiveName = 'config.zip'
    from "build/props"
}

deployDev.configure {
    copyResources('dev')
}

deployTest.configure {
    copyResources('test')
}

project.tasks.findByName('deployProd')?.configure {
    copyResources('prod')
}

def copyResources(String environment) {
    copy {
        from "src/$environment/resources"
        into 'build/props'
    }
}
This doesn't work because the configuration phase runs for all tasks, so even when we execute deployDev, the test configuration has already been copied over the dev resources.
I also have tried:
file('build/props').mkdirs()

task distZip(type: Zip, overwrite: true) {
    archiveName = 'config.zip'
    from "build/props"
    outputs.upToDateWhen { false }
}

gradle.taskGraph.whenReady { graph ->
    if (graph.hasTask('deployProd')) {
        copyResources('prod')
    } else if (graph.hasTask('deployTest')) {
        copyResources('test')
    } else if (graph.hasTask('deployDev')) {
        copyResources('dev')
    }
}

def copyResources(String environment) {
    copy {
        from "src/$environment/resources"
        into 'build/props'
    }
}
However, with this approach the distZip task is always UP-TO-DATE. Is there a correct way to do this?
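One possible way around both problems (an untested sketch, assuming the deploy* tasks can be pointed at a per-environment archive instead of a shared config.zip) is to give each environment its own Zip task, so nothing environment-specific has to happen during the configuration phase and Gradle's up-to-date checks track the right inputs:

// One config zip per environment; each deploy* task depends only on its own
// zip, so only the requested environment's resources are ever packaged.
['dev', 'test', 'prod'].each { env ->
    def configZip = tasks.create("configZip${env.capitalize()}", Zip) {
        archiveName = "config-${env}.zip"
        destinationDir = file("$buildDir/distributions")
        from "src/$env/resources"
    }
    // deployProd only exists when credentials are provided, hence findByName
    tasks.findByName("deploy${env.capitalize()}")?.dependsOn(configZip)
}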