I have a Scala server application that uses Gradle with the scala and application plugins for building and running. I start the application with gradle run.
Here is the relevant part of my build.gradle (a typical one, really):
...
apply plugin: 'scala'
apply plugin: 'application'
...
mainClassName = "mypackage.Main"
How can I automatically re-compile and restart the application whenever I modify the source code?
Preferably using the CLI and Gradle, without any IDE involvement. I've heard sbt has a similar feature, but I'm not intending to use sbt.
I think you can use the --continuous flag:
In a terminal:
gradle build --continuous
In another terminal:
gradle run
I tried gradle run --continuous, but it didn't work: the run task never ends, and --continuous doesn't start a new build/task unless the previous one has finished.
So, the closest solution I could reach is:
1. Add an endpoint that shuts down the application.
2. Set things up so that you can call the endpoint with one click.
3. Run gradle run --continuous.
4. When you make edits to the source code and want to restart the application, call the endpoint.
Yes, that's not exactly "automatically re-compiling and restarting", but it's the closest I could get. And of course, don't keep needless endpoints in your code: either make the endpoint automatically unavailable in production, or don't apply this solution in the first place.
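As a concrete illustration, here is a minimal sketch of such a shutdown endpoint using the JDK's built-in com.sun.net.httpserver (the /shutdown path and port 8080 are assumptions for illustration, not from the question):

```scala
import com.sun.net.httpserver.{HttpExchange, HttpServer}
import java.net.InetSocketAddress

object Main {
  def main(args: Array[String]): Unit = {
    val server = HttpServer.create(new InetSocketAddress(8080), 0)

    // Hypothetical dev-only shutdown endpoint. Exiting the process ends the
    // `run` task, so `gradle run --continuous` can rebuild and restart the app.
    server.createContext("/shutdown", (exchange: HttpExchange) => {
      exchange.sendResponseHeaders(200, -1) // 200 with no response body
      exchange.close()
      System.exit(0)
    })

    server.start()
  }
}
```

Guard a hook like this behind a dev-mode flag (a system property or environment variable) so it can never be exposed in production.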
I have flow integrated into a webpack / babel build using flow-babel-webpack-plugin and it works great. Webpack dev server compiles / serves assets in less than a second and if there are flow type errors it prints them out nicely. I'm very happy with that.
The problem begins when I turn on my IDE. In both VSCode and Atom, if I enable any kind of Flow support, my webpack/babel build immediately begins to choke. It takes anywhere between 4 and 70 seconds to compile any change. Often it fails, prints multiple "flow is still initializing" notices, and indicates it has tried to start the server over and over.
I suspect that both webpack and the IDE are trying to spin up separate flow servers at the same time and this is causing a conflict. Or they are using the same flow server and this is, for some reason, also a problem. I just can't figure out what to do about it. I have tried pointing at separate binaries with webpack using the global flow and the IDE using the one from node_modules. No dice.
It seems like this must be an extremely common use case: Flow + a webpack watcher + any IDE whatsoever.
I'd like my webpack build to compile Flow code and my IDE to show me syntax errors etc. So far that's been impossible.
It looks like that plugin uses its own copy of Flow, from the flow-bin package (see its index.js and package.json).
If this version is out of sync with what your IDE is starting up, then they will fight -- starting up one version of Flow will kill any Flow server with a different version that is already running in that directory.
If you put flow-bin in your devDependencies (alongside this webpack plugin) and lock it to a specific version, and also set your IDE to use the Flow binary from flow-bin, then it looks like npm will just install the version you specify, and both the plugin and the IDE will be able to use the same Flow version.
Without knowing more specifics about your setup, it's hard to recommend a more concrete solution. You'll have to either make it so both your IDE and this webpack plugin are running the same version of Flow, or stop using either the IDE or the webpack plugin.
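For example, pinning flow-bin in package.json might look like this (the version numbers below are placeholders; use whatever version your IDE extension and the plugin both support):

```json
{
  "devDependencies": {
    "flow-bin": "0.57.3",
    "flow-babel-webpack-plugin": "^1.1.0"
  }
}
```

Then point the IDE at the same binary; the VSCode Flow extension, for instance, has settings such as flow.useNPMPackagedFlow or flow.pathToFlow (depending on the extension version) to make it use node_modules/.bin/flow instead of a global install.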
I am finishing a Continuous Integration system with Jenkins and Gradle for a REST service. It will build the App and dependent sub-libraries, build a Docker, start main docker and secondary ones (database, ...) all in Gradle.
As it is a REST service, I have a separate project that runs the REST tests entirely from outside my project, as if it were a REST client, and it works OK...
Once my project is built and everything is running, I need to execute the build of that other project (which exists just for the tests) as a subproject, and whether the tests pass or not, I want the main script to continue, since the Dockers need to be stopped and deleted. What is the best approach for this?
Thanks
You just need to create a task of type GradleBuild.
Example:
task buildAnotherProjectTask(type: GradleBuild) {
    buildFile = '../pathToBuildFileInTheOtherProject/build.gradle'
    tasks = ['build', 'test'] // you can run any tasks this way
}
and to run it you can use the following command:
gradle buildAnotherProjectTask
This worked for me when I tried it.
Hope my answer helps :)
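To address the "continue even if the tests fail" part of the question: a common Gradle pattern is a finalizer task, which runs regardless of the outcome of the task it finalizes. A sketch under assumed names (the paths, task names, and cleanup script below are made up for illustration):

```groovy
// Runs the external test-only project's build.
task integrationTests(type: GradleBuild) {
    buildFile = '../rest-tests/build.gradle' // hypothetical path
    tasks = ['test']
}

// Hypothetical cleanup step; a finalizer runs even if integrationTests fails.
task stopDockers(type: Exec) {
    commandLine 'sh', 'stop-and-remove-dockers.sh'
}

integrationTests.finalizedBy stopDockers
```

With finalizedBy, Gradle guarantees stopDockers executes after integrationTests whether the tests pass or fail, which is exactly the "always clean up the Dockers" behaviour asked for.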
Yes, I do need to apply both the 'maven' and 'artifactory' plugins. However, there are many, many existing Jenkins jobs which explicitly call 'uploadArchives' (supplied by the maven plugin). I would like to override that in my own plugin to force it to call 'artifactoryPublish' when 'uploadArchives' is used.
I want to do this programmatically in my plugin, rather than changing hundreds of build.gradle files, and I have tried every combination of calls I can think of with no luck so far.
The last caveat is that this has to work back to Gradle 2.4 at a minimum, as the gradle wrapper is being used and several different versions of gradle are invoked over a set of builds.
Anyone have an idea on how to accomplish this?
Here's the latest thing I tried:
project.getTasks().replace('uploadArchives', MyTask)
Although the constructor is called (verified by a println message), the task itself is never called (verified by adding an explicit 'throw' statement and never seeing the exception).
Just to clarify, I want to override uploadArchives for 3 main reasons:
1. Many existing jobs (as well as programmers' fingers) have uploadArchives programmed in.
2. We WANT the extra info artifactoryPublish creates and uploads, especially if done via the Jenkins plugin, so you get the Jenkins job number, etc.
3. We do NOT want anyone using plain uploadArchives and publishing WITHOUT the extra build info.
I finally found the answer myself... it's not exactly straightforward, but it works as we wished, except for the extra output line about uploadArchives during a build.
project.plugins.apply('com.jfrog.artifactory')
project.getTasks().create([name: 'uploadArchives', type: MyOverrideTask, overwrite: true, dependsOn: ['artifactoryPublish'] as String[], group: 'Upload'])
Yes, I could have just disabled uploadArchives after making it depend on artifactoryPublish, but then instead of just printing 'uploadArchives' during the build, it prints 'uploadArchives [Skipped]', and we didn't want there to be any confusion.
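MyOverrideTask isn't shown above. Since artifactoryPublish (wired in via dependsOn) does the real work before the replacement task runs, a minimal stand-in could be as simple as the following sketch (this is an assumption for illustration, not the author's actual class):

```groovy
// Hypothetical replacement for uploadArchives; the actual publishing
// has already been done by the artifactoryPublish dependency.
class MyOverrideTask extends DefaultTask {
    MyOverrideTask() {
        description = 'Replaces uploadArchives; publishing is handled by artifactoryPublish'
        group = 'Upload'
    }

    @TaskAction
    void run() {
        logger.lifecycle('uploadArchives delegated to artifactoryPublish')
    }
}
```

Because the task is registered with overwrite: true, any build script or Jenkins job that invokes uploadArchives transparently runs artifactoryPublish first.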
When I try to run a few commands with triggered execution in a Play project, say
~ ;run ;stage
only run is triggered on source code changes. Is there a way to trigger both commands?
Play 2.3.6 and sbt 0.13.7 are in use.
Try ~run.
I don't think you should use stage during development. stage is used to create a folder (inside target/universal) that you can use to package and run the app when you want to deploy it to test or production.
I know it's possible to re-compile or re-run tests on file changes. I want to know if it's possible to do something similar for the run command. ~run does not work (which makes sense, since run never finishes).
Is there a way to create a task that watches for file changes, quits the running server, and relaunches it?
If not, what other tool would you suggest to get the same behaviour?
You will have to bring in an external plugin like sbt-revolver:
https://github.com/spray/sbt-revolver
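A minimal setup sketch (the plugin version below may be outdated; check the project's README for the current one):

```scala
// project/plugins.sbt
addSbtPlugin("io.spray" % "sbt-revolver" % "0.9.1")
```

Then, from the sbt shell, ~reStart forks the application in a background JVM and, on every source change, recompiles, stops the forked process, and starts it again; reStop shuts it down. Because the app runs in a forked JVM rather than blocking the sbt session, triggered execution works where ~run does not.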