Sbt always does full rebuild because of modified binary dependency rt.jar

sbt always does a full rebuild because of a modified binary dependency, rt.jar:
[debug]
[debug] Initial source changes:
[debug] removed: Set()
[debug] added: Set()
[debug] modified: Set()
[debug] Removed products: Set()
[debug] Modified external sources: Set()
[debug] Modified binary dependencies: Set(C:\Program Files\Java\jdk1.6.0_37\jre\lib\rt.jar)
[debug] Initial directly invalidated sources: Set()
Obviously, rt.jar wasn't changed; its created/accessed/modified dates are the same and quite old.
It's in c:\Program Files\Java\jdk1.6.0_37\jre\lib\
JAVA_HOME is set to C:\Progra~1\Java\jdk1.6.0_37
JAVA_HOME/bin is in PATH.
Any ideas why sbt thinks rt.jar was changed?

I had JAVA_HOME set to C:\Progra~1\Java\jdk1.6.0_37, and sbt resolved the dependency as C:\Program Files\Java\jdk1.6.0_37\jre\lib\rt.jar. sbt compares the two paths with java.io.File.equals(), which is not correct in this case.
Adding some println instrumentation to externalBinaryModified in sbt/compile/inc/Incremental.scala:
def externalBinaryModified(entry: String => Option[File], analysis: File => Option[Analysis], previous: Stamps, current: ReadStamps)(implicit equivS: Equiv[Stamp]): File => Boolean =
  dependsOn =>
    analysis(dependsOn).isEmpty &&
      orTrue(
        for {
          name <- previous.className(dependsOn)
          _ = println("Name: " + name)
          e <- entry(name)
          _ = println("entry: " + e)
        } yield {
          val resolved = Locate.resolve(e, name)
          println("resolved: " + resolved)
          println("dependsOn: " + dependsOn)
          println("resolved != dependsOn: " + (resolved != dependsOn))
          (resolved != dependsOn) || !equivS.equiv(previous.binary(dependsOn), current.binary(resolved))
        }
      )
gives the following output:
Name: java.lang.Object
entry: c:\Progra~1\Java\jdk1.6.0_37\jre\lib\rt.jar
resolved: c:\Progra~1\Java\jdk1.6.0_37\jre\lib\rt.jar
dependsOn: c:\Program Files\Java\jdk1.6.0_37\jre\lib\rt.jar
resolved != dependsOn: true
Thus, sbt always thinks rt.jar has changed.
A workaround is to set JAVA_HOME to "c:\Program Files\Java\jdk1.6.0_37".
A proper fix would be to compare canonical paths instead:
resolved.getCanonicalPath != dependsOn.getCanonicalPath
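For illustration, the mismatch is easy to reproduce with plain java.io.File (a minimal sketch, Windows-specific, assuming the JDK really lives at the path above):
import java.io.File
// The same file referred to via its 8.3 short name and via its long name:
val shortName = new File("C:\\Progra~1\\Java\\jdk1.6.0_37\\jre\\lib\\rt.jar")
val longName = new File("C:\\Program Files\\Java\\jdk1.6.0_37\\jre\\lib\\rt.jar")
println(shortName == longName) // false: File.equals compares the raw path strings
println(shortName.getCanonicalPath == longName.getCanonicalPath) // true: canonicalization expands 8.3 short names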


SonarQube does not calculate code coverage

I am using PostgreSQL as the DB in my Spring Boot application.
My SonarQube is unable to calculate code coverage. Can someone please guide me with this?
build.gradle
plugins {
    id 'org.springframework.boot' version "${springBootVersion}"
    id 'io.spring.dependency-management' version '1.0.15.RELEASE'
    id 'java'
    id 'eclipse'
    id 'jacoco'
    id 'org.sonarqube' version "3.3"
    id 'com.google.cloud.tools.jib' version "${jibVersion}"
}
group = 'com.vsi.postgrestoattentive'
if (!project.hasProperty('buildName')) {
    throw new GradleException("Usage for CLI:"
        + System.getProperty("line.separator")
        + "gradlew <taskName> -Dorg.gradle.java.home=<java-home-dir> -PbuildName=<major>.<minor>.<buildNumber> -PgcpProject=<gcloudProject>"
        + System.getProperty("line.separator")
        + "<org.gradle.java.home> - OPTIONAL if available in PATH"
        + System.getProperty("line.separator")
        + "<buildName> - MANDATORY, example 0.1.23"
        + System.getProperty("line.separator")
        + "<gcpProject> - OPTIONAL, project name in GCP");
}
project.ext {
    buildName = project.property('buildName');
}
version = "${project.ext.buildName}"
sourceCompatibility = '1.8'
apply from: 'gradle/sonar.gradle'
apply from: 'gradle/tests.gradle'
apply from: 'gradle/image-build-gcp.gradle'
repositories {
    mavenCentral()
}
dependencies {
    implementation("org.springframework.boot:spring-boot-starter-web:${springBootVersion}")
    implementation("org.springframework.boot:spring-boot-starter-actuator:${springBootVersion}")
    implementation 'org.springframework.boot:spring-boot-starter-web:2.7.0'
    developmentOnly 'org.springframework.boot:spring-boot-devtools'
    testImplementation 'org.springframework.integration:spring-integration-test'
    testImplementation 'org.springframework.batch:spring-batch-test:4.3.0'
    implementation("org.springframework.boot:spring-boot-starter-data-jpa:${springBootVersion}")
    implementation 'org.postgresql:postgresql:42.2.16'
    implementation 'org.springframework.batch:spring-batch-core:4.1.1.RELEASE'
    implementation 'com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.14.1'
    implementation group: 'io.micrometer', name: 'micrometer-registry-datadog', version: '1.7.0'
    implementation 'com.google.cloud:libraries-bom:26.3.0'
    implementation 'com.google.cloud:google-cloud-storage:2.16.0'
    testImplementation('org.mockito:mockito-core:3.7.7')
    // Below 4 dependencies should be commented out for local builds
    implementation 'org.springframework.cloud:spring-cloud-starter-kubernetes-client-all:2.0.4'
    implementation 'io.kubernetes:client-java:12.0.0'
    implementation("org.springframework.cloud:spring-cloud-gcp-starter-metrics:${gcpSpringCloudVersion}")
    implementation 'org.springframework.cloud:spring-cloud-gcp-logging:1.2.8.RELEASE'
    testImplementation('org.mockito:mockito-core:3.7.7')
    testImplementation 'org.springframework.boot:spring-boot-test'
    testImplementation 'org.springframework:spring-test'
    testImplementation 'org.assertj:assertj-core:3.21.0'
    testImplementation("org.springframework.boot:spring-boot-starter-test:${springBootVersion}") {
        exclude group: "org.junit.vintage", module: "junit-vintage-engine"
    }
}
bootJar {
    archiveFileName = "${project.name}.${archiveExtension.get()}"
}
springBoot {
    buildInfo()
}
test {
    finalizedBy jacocoTestReport
}
jacoco {
    toolVersion = "0.8.8"
}
jacocoTestReport {
    dependsOn test
}
// Code to make the build check the code coverage ratio
project.tasks["bootJar"].dependsOn "jacocoTestReport", "jacocoTestCoverageVerification"
tests.gradle
test {
    finalizedBy jacocoTestReport
    useJUnitPlatform()
    testLogging {
        exceptionFormat = 'full'
    }
    afterSuite { desc, result ->
        if (!desc.parent) {
            println "Results: (${result.testCount} tests, ${result.successfulTestCount} successes, ${result.failedTestCount} failures, ${result.skippedTestCount} skipped)"
            boolean skipTests = Boolean.parseBoolean(project.findProperty('SKIP_TESTS') ?: "false")
            if (result.testCount == 0 && !skipTests) {
                throw new IllegalStateException("No tests were found. Failing the build")
            }
        }
    }
    jacocoTestCoverageVerification {
        dependsOn test
        violationRules {
            rule {
                limit {
                    // SMS-28: Since project is in nascent stage setting code coverage ratio limit to 1%
                    minimum = 0.5
                }
            }
        }
    }
}
sonar.gradle
apply plugin: "org.sonarqube"
apply plugin: 'jacoco'
jacoco {
    toolVersion = "0.8.5"
    reportsDir = file("$buildDir/jacoco")
}
jacocoTestReport {
    reports {
        xml.enabled true
        html.enabled true
        csv.enabled false
    }
}
JenkinsBuildFile
pipeline {
    agent any
    environment {
        // TODO: Remove this
        GIT_BRANCH_LOCAL = sh (
            script: "echo $GIT_BRANCH | sed -e 's|origin/||g'",
            returnStdout: true
        ).trim()
        CURRENT_BUILD_DISPLAY="0.1.${BUILD_NUMBER}"
        PROJECT_FOLDER="."
        PROJECT_NAME="xyz"
        // Adding default values for env variables that sometimes get erased from GCP Jenkins
        GRADLE_JAVA_HOME="/opt/java/openjdk"
        GCP_SA="abc"
        GCP_PROJECT="efg"
        SONAR_JAVA_HOME="/usr/lib/jvm/java-17-openjdk-amd64"
        SONAR_HOST="http://sonar-sonarqube:9000/sonar"
    }
    stages {
        stage('Clean Workspace') {
            steps {
                echo "Setting current build to ${CURRENT_BUILD_DISPLAY}"
                script {
                    currentBuild.displayName = "${CURRENT_BUILD_DISPLAY}"
                    currentBuild.description = """Branch - ${GIT_BRANCH_LOCAL}"""
                }
                dir("${PROJECT_FOLDER}") {
                    echo "Changed directory to ${PROJECT_FOLDER}"
                    echo 'Cleaning up Work Dir...'
                    // Had to add below chmod command as Jenkins build was failing stating gradlew permission denied
                    sh 'chmod +x gradlew'
                    // gradlew clean means deletion of the build directory
                    sh './gradlew clean -PbuildName=${CURRENT_BUILD_DISPLAY} -Dorg.gradle.java.home=${GRADLE_JAVA_HOME}'
                    // mkdir -p creates subdirectories
                    // touch creates a new empty file
                    sh 'mkdir -p build/libs && touch build/libs/${PROJECT_NAME}-${CURRENT_BUILD_DISPLAY}.jar'
                }
            }
        }
        stage('Tests And Code Quality') {
            steps {
                dir("${PROJECT_FOLDER}") {
                    echo 'Running Tests and SonarQube Analysis'
                    withCredentials([string(credentialsId: 'sonar_key', variable: 'SONAR_KEY')]) {
                        sh '''
                            ./gradlew -i sonarqube -Dorg.gradle.java.home=${SONAR_JAVA_HOME} \
                                -Dsonar.host.url=${SONAR_HOST} \
                                -PbuildName=${CURRENT_BUILD_DISPLAY} \
                                -Dsonar.login=$SONAR_KEY \
                                -DprojectVersion=${CURRENT_BUILD_DISPLAY}
                        '''
                    }
                    echo 'Ran SonarQube Analysis successfully'
                }
            }
        }
        stage('ECRContainerRegistry') {
            steps {
                withCredentials([file(credentialsId: 'vsi-ops-gcr', variable: 'SECRET_JSON')]) {
                    echo 'Activating gcloud SDK Service Account...'
                    sh 'gcloud auth activate-service-account $GCP_SA --key-file $SECRET_JSON --project=$GCP_PROJECT'
                    sh 'gcloud auth configure-docker'
                    echo 'Activated gcloud SDK Service Account'
                    dir("${PROJECT_FOLDER}") {
                        echo "Pushing image to GCR with tag ${CURRENT_BUILD_DISPLAY}..."
                        sh './gradlew jib -PbuildName=${CURRENT_BUILD_DISPLAY} -PgcpProject=${GCP_PROJECT} -Dorg.gradle.java.home=${GRADLE_JAVA_HOME}'
                        echo "Pushed image to GCR with tag ${CURRENT_BUILD_DISPLAY} successfully"
                    }
                    echo 'Revoking gcloud SDK Service Account...'
                    sh "gcloud auth revoke ${GCP_SA}"
                    echo 'Revoked gcloud SDK Service Account'
                }
            }
        }
    }
    post {
        /*
        TODO: use cleanup block
        deleteDir is explicit in failure because the always block is run
        before success, causing archive failure. Also the cleanup block is not
        available in this version of Jenkins (ver. 2.164.2)
        */
        success {
            dir("${PROJECT_FOLDER}") {
                echo 'Archiving build artifacts...'
                archiveArtifacts artifacts: "build/libs/*.jar, config/**/*", fingerprint: true, onlyIfSuccessful: true
                echo 'Archived build artifacts successfully'
                echo 'Publishing Jacoco Reports...'
                jacoco(
                    execPattern: 'build/jacoco/*.exec',
                    classPattern: 'build/classes',
                    sourcePattern: 'src/main/java',
                    exclusionPattern: 'src/test*'
                )
                echo 'Published Jacoco Reports successfully'
            }
            echo 'Cleaning up workspace...'
            deleteDir()
        }
        failure {
            echo 'Cleaning up workspace...'
            deleteDir()
        }
        aborted {
            echo 'Cleaning up workspace...'
            deleteDir()
        }
    }
}
Below is the error I get in the Jenkins console:
> Task :sonarqube
JaCoCo report task detected, but XML report is not enabled or it was not produced. Coverage for this task will not be reported.
Caching disabled for task ':sonarqube' because:
Build cache is disabled
Task ':sonarqube' is not up-to-date because:
Task has not declared any outputs despite executing actions.
JaCoCo report task detected, but XML report is not enabled or it was not produced. Coverage for this task will not be reported.
User cache: /var/jenkins_home/.sonar/cache
Default locale: "en", source code encoding: "UTF-8"
Load global settings
Load global settings (done) | time=101ms
Server id: 4AE86E0C-AX63D7IJvl7jEHIL9nIz
User cache: /var/jenkins_home/.sonar/cache
Load/download plugins
Load plugins index
Load plugins index (done) | time=48ms
Load/download plugins (done) | time=91ms
Process project properties
Process project properties (done) | time=9ms
Execute project builders
Execute project builders (done) | time=1ms
Java "Test" source files AST scan (done) | time=779ms
No "Generated" source files to scan.
Sensor JavaSensor [java] (done) | time=8057ms
Sensor JaCoCo XML Report Importer [jacoco]
'sonar.coverage.jacoco.xmlReportPaths' is not defined. Using default locations: target/site/jacoco/jacoco.xml,target/site/jacoco-it/jacoco.xml,build/reports/jacoco/test/jacocoTestReport.xml
No report imported, no coverage information will be imported by JaCoCo XML Report Importer
Sensor JaCoCo XML Report Importer [jacoco] (done) | time=4ms
Can someone please guide me on how to resolve this issue of Sonar not calculating code coverage?
While it could be caused by a number of things, from the logs it seems likely that SonarQube couldn't find the report.
As the first comment says, check to verify that the report is being generated (but not found). If it is generated, then it's likely SonarQube can't find the XML JaCoCo report.
Note the error line, near the end:
'sonar.coverage.jacoco.xmlReportPaths' is not defined. Using default locations: target/site/jacoco/jacoco.xml,target/site/jacoco-it/jacoco.xml,build/reports/jacoco/test/jacocoTestReport.xml
An example of defining this location would be putting this line in sonar-project.properties:
sonar.coverage.jacoco.xmlReportPaths=build/reports/jacoco.xml
It can also be defined directly in a *.gradle file:
sonarqube {
    properties {
        property "sonar.coverage.jacoco.xmlReportPaths", "build/reports/jacoco.xml"
    }
}

Setup of Scala/Flink project using Bazel

I am trying to set up a simple Flink application from scratch using Bazel. I've bootstrapped the project by running
sbt new tillrohrmann/flink-project.g8
and after that I added some files in order for Bazel to take control of the build (i.e., migrate from sbt). This is what the WORKSPACE file looks like:
# WORKSPACE
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

skylib_version = "1.0.3"
http_archive(
    name = "bazel_skylib",
    sha256 = "1c531376ac7e5a180e0237938a2536de0c54d93f5c278634818e0efc952dd56c",
    type = "tar.gz",
    url = "https://mirror.bazel.build/github.com/bazelbuild/bazel-skylib/releases/download/{}/bazel-skylib-{}.tar.gz".format(skylib_version, skylib_version),
)

rules_scala_version = "5df8033f752be64fbe2cedfd1bdbad56e2033b15"
http_archive(
    name = "io_bazel_rules_scala",
    sha256 = "b7fa29db72408a972e6b6685d1bc17465b3108b620cb56d9b1700cf6f70f624a",
    strip_prefix = "rules_scala-%s" % rules_scala_version,
    type = "zip",
    url = "https://github.com/bazelbuild/rules_scala/archive/%s.zip" % rules_scala_version,
)

# Stores Scala version and other configuration
# 2.12 is the default version; other versions can be used by passing them explicitly:
load("@io_bazel_rules_scala//:scala_config.bzl", "scala_config")
scala_config(scala_version = "2.12.11")

load("@io_bazel_rules_scala//scala:scala.bzl", "scala_repositories")
scala_repositories()

load("@io_bazel_rules_scala//scala:toolchains.bzl", "scala_register_toolchains")
scala_register_toolchains()

load("@io_bazel_rules_scala//scala:scala.bzl", "scala_library", "scala_binary", "scala_test")

# optional: setup ScalaTest toolchain and dependencies
load("@io_bazel_rules_scala//testing:scalatest.bzl", "scalatest_repositories", "scalatest_toolchain")
scalatest_repositories()
scalatest_toolchain()

load("//vendor:workspace.bzl", "maven_dependencies")
maven_dependencies()

load("//vendor:target_file.bzl", "build_external_workspace")
build_external_workspace(name = "vendor")
and this is the BUILD file
package(default_visibility = ["//visibility:public"])

load("@io_bazel_rules_scala//scala:scala.bzl", "scala_library", "scala_test")

scala_library(
    name = "job",
    srcs = glob(["src/main/scala/**/*.scala"]),
    deps = [
        "@vendor//vendor/org/apache/flink:flink_clients",
        "@vendor//vendor/org/apache/flink:flink_scala",
        "@vendor//vendor/org/apache/flink:flink_streaming_scala",
    ],
)
I'm using bazel-deps for vendoring the dependencies (put in the vendor folder). I have this in my dependencies.yaml file:
options:
  buildHeader: [
    "load(\"@io_bazel_rules_scala//scala:scala_import.bzl\", \"scala_import\")",
    "load(\"@io_bazel_rules_scala//scala:scala.bzl\", \"scala_library\", \"scala_binary\", \"scala_test\")",
  ]
  languages: [ "java", "scala:2.12.11" ]
  resolverType: "coursier"
  thirdPartyDirectory: "vendor"
  resolvers:
    - id: "mavencentral"
      type: "default"
      url: https://repo.maven.apache.org/maven2/
  strictVisibility: true
  transitivity: runtime_deps
  versionConflictPolicy: highest
dependencies:
  org.apache.flink:
    flink:
      lang: scala
      version: "1.11.2"
      modules: [clients, scala, streaming-scala] # provided
    flink-connector-kafka:
      lang: java
      version: "0.10.2"
    flink-test-utils:
      lang: java
      version: "0.10.2"
For downloading the dependencies, I'm running
bazel run //:parse generate -- --repo-root ~/Projects/bazel-flink-scala --sha-file vendor/workspace.bzl --target-file vendor/target_file.bzl --deps dependencies.yaml
This runs just fine, but then when I try to build the project
bazel build //:job
I'm getting this error:
Starting local Bazel server and connecting to it...
ERROR: Traceback (most recent call last):
File "/Users/salvalcantara/Projects/me/bazel-flink-scala/WORKSPACE", line 44, column 25, in <toplevel>
build_external_workspace(name = "vendor")
File "/Users/salvalcantara/Projects/me/bazel-flink-scala/vendor/target_file.bzl", line 258, column 91, in build_external_workspace
return build_external_workspace_from_opts(name = name, target_configs = list_target_data(), separator = list_target_data_separator(), build_header = build_header())
File "/Users/salvalcantara/Projects/me/bazel-flink-scala/vendor/target_file.bzl", line 251, column 40, in list_target_data
"vendor/org/apache/flink:flink_clients": ["lang||||||scala:2.12.11","name||||||//vendor/org/apache/flink:flink_clients","visibility||||||//visibility:public","kind||||||import","deps|||L|||","jars|||L|||//external:jar/org/apache/flink/flink_clients_2_12","sources|||L|||","exports|||L|||","runtimeDeps|||L|||//vendor/commons_cli:commons_cli|||//vendor/org/slf4j:slf4j_api|||//vendor/org/apache/flink:force_shading|||//vendor/com/google/code/findbugs:jsr305|||//vendor/org/apache/flink:flink_streaming_java_2_12|||//vendor/org/apache/flink:flink_core|||//vendor/org/apache/flink:flink_java|||//vendor/org/apache/flink:flink_runtime_2_12|||//vendor/org/apache/flink:flink_optimizer_2_12","processorClasses|||L|||","generatesApi|||B|||false","licenses|||L|||","generateNeverlink|||B|||false"],
Error: dictionary expression has duplicate key: "vendor/org/apache/flink:flink_clients"
ERROR: error loading package 'external': Package 'external' contains errors
INFO: Elapsed time: 3.644s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
Why is that? Can anyone help? It would be great to have detailed instructions and project templates for Flink/Scala applications using Bazel. I've put everything together in the following repo: https://github.com/salvalcantara/bazel-flink-scala, feel free to send a PR or whatever.

How can I mark some ScalaTests so that they only execute when invoked explicitly

I'm new to sbt and ScalaTest but am wondering how to get some of my org.scalatest._ tests to only execute "on demand".
In sbt I can invoke all of the unit tests with sbt:Sprout> test, or all of the integration tests with sbt:Sprout> it:test. I need a way to annotate the tests so that an sbt:Sprout> test invocation skips them, but some other invocation executes ONLY these tests. The ScalaTest docs speak of a sbt:Sprout> test-only *RedSuite invocation that lets me "categorize" my tests, but it's not clear how to leverage that so they DON'T run as unit tests. org.scalatest.Tags alone do not get them off the "default" of being executed on sbt:Sprout> test. I need these to be ignored unless invoked explicitly.
Is this use case possible in ScalaTest through sbt?
You can specify tag names of tests to include or exclude from a run. To specify tags to include, use -n followed by a list of tag names to include. Similarly, to specify tags to exclude, use -l followed by a list of tag names to exclude
(see the official ScalaTest documentation for more info).
For example:
package com.test

import org.scalatest.FlatSpec
import org.scalatest.Tag

object IncludeTest extends Tag("com.tags.Include")
object ExcludeTest extends Tag("com.tags.Exclude")

class TestSuite extends FlatSpec {
  "Test1" should "be included" taggedAs(IncludeTest) in {
    val sum = 1 + 1
    assert(sum === 2)
  }

  "Test2" should "be excluded" taggedAs(ExcludeTest) in {
    val minus = 2 - 1
    assert(minus === 1)
  }
}
To include tests tagged com.tags.Include and exclude those tagged com.tags.Exclude, you would run:
test-only com.test.* -- -n com.tags.Include -l com.tags.Exclude
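If the goal is for a bare test invocation to skip such tests by default, the same -l flag can also be baked into the build via sbt's testOptions (a minimal sketch, assuming the com.tags.Exclude tag above):
// build.sbt -- keep tests tagged com.tags.Exclude out of the default `test` run
Test / testOptions += Tests.Argument(TestFrameworks.ScalaTest, "-l", "com.tags.Exclude")
Note that testOptions also applies to testOnly, so to run only the excluded tests you would temporarily override it, e.g. set Test / testOptions := Seq(Tests.Argument(TestFrameworks.ScalaTest, "-n", "com.tags.Exclude")) and then test.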
An assume assertion placed inside a fixture-context object can be used to implement conditional-ignore semantics dependent on an environment flag. For example, consider the following IfIgnored fixture
trait IfIgnored extends Assertions {
  assume(System.getenv("runIgnoredTest").toBoolean, "!!! TEST IGNORED !!!")
}
which can be instantiated like so
it should "not say goodbye" in new IfIgnored {
Hello.greeting shouldNot be ("goodbye")
}
Now if we define the following settings in build.sbt
Test / fork := true,
Test / envVars := Map("runIgnoredTest" -> "false")
and the following tests
class HelloSpec extends FlatSpec with Matchers {
  "The Hello object" should "say hello" in {
    Hello.greeting shouldEqual "hello"
  }

  it should "not say goodbye" in new IfIgnored {
    Hello.greeting shouldNot be ("goodbye")
  }

  it should "not say live long and prosper" in new IfIgnored {
    Hello.greeting shouldNot be ("live long and prosper")
  }
}
then executing sbt test should output
[info] HelloSpec:
[info] The Hello object
[info] - should say hello
[info] - should not say goodbye !!! CANCELED !!!
[info] scala.Predef.augmentString(java.lang.System.getenv("runIgnoredTest")).toBoolean was false !!! TEST IGNORED !!! (HelloSpec.scala:6)
[info] - should not say live long and prosper !!! CANCELED !!!
[info] scala.Predef.augmentString(java.lang.System.getenv("runIgnoredTest")).toBoolean was false !!! TEST IGNORED !!! (HelloSpec.scala:6)
[info] Run completed in 2 seconds, 389 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 2, ignored 0, pending 0
[info] All tests passed.
where we see only should say hello ran whilst the rest were ignored.
To execute only ignored tests we could define the following custom command testOnlyIgnored:
commands += Command.command("testOnlyIgnored") { state =>
  val ignoredTests = List(
    """"should not say goodbye"""",
    """"should not say live long and prosper""""
  ).mkString("-z ", " -z ", "")
  """set Test / envVars := Map("runIgnoredTest" -> "true")""" ::
    s"""testOnly -- $ignoredTests""" :: state
}
Note how we are making use of the -z runner argument to run particular tests, for example,
testOnly -- -z "should not say goodbye" -z "should not say live long and prosper"
Also note how we are manually adding test names to ignoredTests. Now executing sbt testOnlyIgnored should output
[info] HelloSpec:
[info] The Hello object
[info] - should not say goodbye
[info] - should not say live long and prosper
[info] Run completed in 2 seconds, 298 milliseconds.
[info] Total number of tests run: 2
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 2, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
where we see should say hello was not run whilst all the ignored tests ran.
If we drop the requirement of being able to run the ignored tests separately, then we can use ScalaTest's built-in ignore marker like so
ignore should "not say goodbye" in {
  Hello.greeting shouldNot be ("goodbye")
}
which on sbt test outputs
[info] HelloSpec:
[info] The Hello object
[info] - should say hello
[info] - should not say goodbye !!! IGNORED !!!
[info] - should not say live long and prosper !!! IGNORED !!!
[info] Run completed in 2 seconds, 750 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 2, pending 0

How to modify the jar name generated by the sbt package command

For example, if I run sbt package, sbt will generate a jar with a name like project_2.11-version.jar. How can I change this to a random name?
Here is what the docs say:
The generated artifact name is determined by the artifactName setting.
This setting is of type (ScalaVersion, ModuleID, Artifact) => String.
And the default implementation is:
artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
  artifact.name + "-" + module.revision + "." + artifact.extension
}
You can copy this code into your build.sbt or Build.scala file and change how it constructs the artifact name. For example:
artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
  java.util.UUID.randomUUID.toString + "." + artifact.extension
}
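If a completely random name is too opaque, the same setting can embed a timestamp instead (a sketch along the same lines; the date pattern is arbitrary):
artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
  // produces e.g. myproject-0.1.0-20240101-120000.jar
  val stamp = new java.text.SimpleDateFormat("yyyyMMdd-HHmmss").format(new java.util.Date)
  artifact.name + "-" + module.revision + "-" + stamp + "." + artifact.extension
}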

How to publish to multiple repositories in SBT?

I am in the middle of upgrading our Nexus version. As part of the process I've set up a new Nexus instance which will run in parallel with the older instance.
While migrating to the new instance I want to thoroughly test and vet it before pulling the plug on the older instance. This requires me to temporarily modify the publish workflow so that sbt publishes the artifacts to both Nexus instances.
I highly doubt the following code will actually work (the second publishTo assignment would simply override the first):
publishTo <<= (version) { version: String =>
  if (version.trim.endsWith("SNAPSHOT"))
    Some("snapshots" at "http://maven1.dev.net:8081/nexus/content/" + "repositories/snapshots/")
  else
    Some("releases" at "http://maven1.dev.net:8081/nexus/content/" + "repositories/releases/")
},
credentials += Credentials("Sonatype Nexus Repository Manager", "maven1.dev.net", "release-eng", "release"),
publishTo <<= (version) { version: String =>
  if (version.trim.endsWith("SNAPSHOT"))
    Some("snapshots" at "http://maven2.dev.net:8081/nexus/content/" + "repositories/snapshots/")
  else
    Some("releases" at "http://maven2.dev.net:8081/nexus/content/" + "repositories/releases/")
},
credentials += Credentials("Sonatype Nexus Repository Manager", "maven2.dev.net", "release-eng", "release"),
I also tried looking into a plugin called sbt-multi-publish but I couldn't compile and use it, either.
With Commands and How to change a version setting inside a single sbt command?, I could define a new command - myPublishTo - that changes the publishTo setting before executing the original publish task:
def myPublishTo = Command.command("myPublishTo") { state =>
  val extracted = Project.extract(state)
  Project.runTask(
    publish in Compile,
    extracted.append(List(publishTo := Some(Resolver.file("file", target.value / "xxx"))), state),
    true
  )
  Project.runTask(
    publish in Compile,
    extracted.append(List(publishTo := Some(Resolver.file("file", target.value / "yyy"))), state),
    true
  )
  state
}
commands += myPublishTo
With this, execute myPublishTo as any other command/task.
You could also define a couple of aliases - pxxx, pyyy and pxy - in build.sbt that would execute a series of commands using ;.
addCommandAlias("pxxx", "; set publishTo := Some(Resolver.file(\"file\", target.value / \"xxx\")) ; publish") ++
addCommandAlias("pyyy", "; set publishTo := Some(Resolver.file(\"file\", target.value / \"yyy\")) ; publish") ++
addCommandAlias("pxy", "; pxxx ; pyyy")
In sbt console you can execute them as any other commands/tasks.
[sbt-0-13-1]> alias
pxxx = ; set publishTo := Some(Resolver.file("file", target.value / "xxx")) ; publish
pyyy = ; set publishTo := Some(Resolver.file("file", target.value / "yyy")) ; publish
pxy = ; pxxx ; pyyy
[sbt-0-13-1]> pxy
[info] Defining *:publishTo
[info] The new value will be used by *:otherResolvers, *:publishConfiguration
[info] Reapplying settings...
[info] Set current project to sbt-0-13-1 (in build file:/Users/jacek/sandbox/so/sbt-0.13.1/)
...
[info] published sbt-0-13-1_2.10 to /Users/jacek/sandbox/so/sbt-0.13.1/target/xxx/default/sbt-0-13-1_2.10/0.1-SNAPSHOT/sbt-0-13-1_2.10-0.1-SNAPSHOT-javadoc.jar
[success] Total time: 1 s, completed Jan 9, 2014 11:20:48 PM
[info] Defining *:publishTo
[info] The new value will be used by *:otherResolvers, *:publishConfiguration
[info] Reapplying settings...
...
[info] published sbt-0-13-1_2.10 to /Users/jacek/sandbox/so/sbt-0.13.1/target/yyy/default/sbt-0-13-1_2.10/0.1-SNAPSHOT/sbt-0-13-1_2.10-0.1-SNAPSHOT-javadoc.jar
[success] Total time: 0 s, completed Jan 9, 2014 11:20:49 PM
This is an old question, but the problem persists. I tried to revive sbt-multi-publish, but it's really old (sbt-0.12) and uses some sbt internals that are hard to deal with. So I took another approach and wrote a new plugin: sbt-publish-more.
It doesn't involve any on-the-fly setting changes or custom commands as in the other answer.
After you add the plugin, just set the resolvers you want to publish to (taking your code as an example):
publishResolvers := {
  val suffix = if (isSnapshot.value) "snapshots" else "releases"
  Seq(
    s"Maven1 ${suffix}" at s"http://maven1.dev.net:8081/nexus/content/repositories/${suffix}/",
    s"Maven2 ${suffix}" at s"http://maven2.dev.net:8081/nexus/content/repositories/${suffix}/"
  )
}
Then call the publishAll task; it will publish to both repositories.
You can also publish to different repositories with different configurations. Check usage docs for details.