This is the first time I have ever tried to publish an artifact, and it could not be harder.
I am using sbt 0.13.8 with the plugins sbt-release 1.0.8, sbt-sonatype 2.0, and sbt-pgp 1.1.1.
The relevant part of my build.sbt looks like this:
pgpSecretRing := file("""C:\Users\kali\.sbt\gpg\secring.asc""")
pgpPublicRing := file("""C:\Users\kali\.sbt\gpg\pubring.asc""")
//pgpSecretRing := file("""C:\Users\kali\AppData\Local\lxss\home\kali\.gnupg\secring.gpg""")
//pgpPublicRing := file("""C:\Users\kali\AppData\Local\lxss\home\kali\.gnupg\pubring.gpg""")
usePgpKeyHex("c500a525a2efcb99")
name := "project-name"
organization := "com.mehmetyucel"
version := "0.0.2-SNAPSHOT"
scalaVersion := "2.12.2"
crossScalaVersions := Seq("2.11.11", "2.12.2")
lazy val projectName = project in file(".")
homepage := Some(url("https://some-github-url"))
scmInfo := Some(
ScmInfo(url(
"some-github-url"),
"some-github-url.git"))
developers := List(
Developer(
"mehmetyucel",
"Mehmet Yucel",
"mehmet#mehmetyucel.com",
url("some-github-url")))
licenses += ("MIT", url("http://opensource.org/licenses/MIT"))
publishMavenStyle := true
publishTo := Some(
if (isSnapshot.value)
Opts.resolver.sonatypeSnapshots
else
Opts.resolver.sonatypeStaging
)
releaseCrossBuild := true
releaseProcess := Seq[ReleaseStep](
checkSnapshotDependencies,
inquireVersions,
runClean,
runTest,
setReleaseVersion,
commitReleaseVersion,
tagRelease,
// For non cross-build projects, use releaseStepCommand("publishSigned")
releaseStepCommandAndRemaining("+publishSigned"),
setNextVersion,
commitNextVersion,
releaseStepCommand("sonatypeReleaseAll"),
pushChanges
)
I added the first 5 lines here out of desperation, because when I run sbt release it seems to sign my packages with a key I am not aware of.
The error message I get is:
[info] Evaluate: signature-staging
[info] Failed: signature-staging, failureMessage:No public key: Key with id: (5b09423c9d5fbb5d) was not able to be located on http://keyserver.ubuntu.com:11371/. Upload your public key and try the operation again.
But my key is already uploaded: I can find it on keyserver.ubuntu.com (http://keyserver.ubuntu.com/pks/lookup?op=get&search=0xC500A525A2EFCB99). Unfortunately, you will notice that the key id there is different, and I have no idea where this "5b09423c9d5fbb5d" is coming from.
I tried cloning the repository onto 3 different systems (macOS, Ubuntu and Windows 10), creating a brand new key, uploading it to the Ubuntu keyserver, and releasing again. The id in the error message is always the same (5b09423c9d5fbb5d). I have no idea where it comes from, or how it stays the same despite a totally different key and system.
I tried changing usePgpKeyHex("c500a525a2efcb99") to usePgpKeyHex("c500a525a2efcb91") (basically a key that does not exist) and I get
[error] Not a valid command: failure-- (similar: onFailure)
[error] Not a valid project ID: failure--
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: failure--
[error] failure--
[error] ^
at the signing stage. That is actually good news, because it means sbt is signing with my key. So where does 5b09423c9d5fbb5d come from?
Solved:
Apparently this key id was coming from the very first publishSigned I ran for this project, which failed for another reason. I kept running publishSigned and sonatypeRelease back to back quite a few times, and this left a lot of "open" staging repositories in Sonatype.
I went to https://oss.sonatype.org/, dropped all open staging repositories using the Nexus UI, restarted the release process, and everything worked.
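In case it helps others: depending on your sbt-sonatype version, the open staging repositories can also be inspected and dropped from the sbt shell instead of the Nexus UI. The command names below are an assumption based on later sbt-sonatype releases, so check the README of the plugin version you actually use:
> sonatypeStagingRepositoryProfiles   // list your staging repositories (assumed command name)
> sonatypeDrop <repository-id>        // drop a stuck or open staging repository (assumed command name)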
Related
I have defined a minimal build.sbt with two custom profiles ‘dev’ and ‘staging’ (what SBT seems to call Configurations). However, when I run SBT with the Configuration that was defined first in the file (dev), both Configuration blocks are executed - and if both modify the same setting, the last one wins (staging).
This seems to break any notion of conditional activation, so what am I doing wrong with SBT?
For reference, I want to emulate the conditionally activated Profiles concept of Maven e.g. mvn test -P staging.
SBT version: 1.2.1
build.sbt:
name := "example-project"
scalaVersion := "2.12.6"
...
fork := true
// Environment-independent JVM property (always works)
javaOptions += "-Da=b"
// Environment-specific JVM property (doesn’t work)
lazy val Dev = config("dev") extend Test
lazy val Staging = config("staging") extend Test
val root = (project in file("."))
.configs(Dev, Staging)
.settings(inConfig(Dev)(Seq(javaOptions in Test += "-Dfoo=bar")))
.settings(inConfig(Staging)(Seq(javaOptions in Test += "-Dfoo=qux")))
Command:
# Bad
sbt test
=> foo=qux
a=b
# Bad
sbt clean dev:test
=> foo=qux
a=b
# Good
sbt clean staging:test
=> foo=qux
a=b
Notice that despite the inConfig usage you're still setting javaOptions in Test, i.e. in the Test config. If you remove in Test, it works as expected:
...
.settings(inConfig(Dev)(javaOptions += "-Dfoo=bar"))
.settings(inConfig(Staging)(javaOptions += "-Dfoo=qux"))
(also, the Seq(...) wrapping is unnecessary)
Now in sbt:
> show Test/javaOptions
[info] *
> show Dev/javaOptions
[info] * -Dfoo=bar
> show Staging/javaOptions
[info] * -Dfoo=qux
You can achieve the same result by scoping each setting explicitly (without inConfig wrapping):
.settings(
Dev/javaOptions += "-Dfoo=bar",
Staging/javaOptions += "-Dfoo=qux",
...
)
(here Conf/javaOptions is the same as javaOptions in Conf)
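Putting the pieces together, a minimal build.sbt sketch of the corrected setup (sbt 1.x slash syntax; fork := true is carried over from the question):
lazy val Dev     = config("dev") extend Test
lazy val Staging = config("staging") extend Test

lazy val root = (project in file("."))
  .configs(Dev, Staging)
  .settings(
    fork := true,
    // scope each option to the custom config itself, not to Test
    Dev / javaOptions     += "-Dfoo=bar",
    Staging / javaOptions += "-Dfoo=qux"
  )
As above, show Dev/javaOptions and show Staging/javaOptions confirm which value each configuration picks up.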
I am trying to call sbt assembly from the command line, passing it the scalac compiler flag for eliding code (-Xelide-below 1).
I have managed to get the flag working by adding this line to my build.sbt:
scalacOptions ++= Seq("-Xelide-below", "1")
It also works fine when I start sbt and run the following:
$> sbt
$> set scalacOptions in ThisBuild ++= Seq("-Xelide-below", "0")
But I would like to know how to pass this in when starting sbt, so that my CI jobs can use it while doing different assembly targets (ie. dev/test/prod).
One way to pass the elide level as a command-line option is to use system properties:
scalacOptions ++= Seq("-Xelide-below", sys.props.getOrElse("elide.below", "0"))
and run sbt -Delide.below=20 assembly. Quick, dirty and easy.
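To double-check that the property is being picked up, you can inspect the resulting compiler options from the command line:
sbt -Delide.below=20 "show compile:scalacOptions"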
Another more verbose way to accomplish the same thing is to define different commands for producing test/prod artifacts.
lazy val elideLevel = settingKey[Int]("elide code below this level.")
elideLevel in Global := 0
scalacOptions ++= Seq("-Xelide-below", elideLevel.value.toString)
def assemblyCommand(name: String, level: Int) =
Command.command(s"${name}Assembly") { s =>
s"set elideLevel in Global := $level" ::
"assembly" ::
s"set elideLevel in Global := 0" ::
s
}
commands += assemblyCommand("test", 10)
commands += assemblyCommand("prod", 1000)
and you can run sbt testAssembly or sbt prodAssembly. This buys you a cleaner command name, and you don't have to exit an active sbt shell session to call, for example, testAssembly. My sbt shell sessions tend to live for a long time, so I personally prefer the second option.
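For reference, -Xelide-below removes calls to methods annotated with @elidable whose level is below the given threshold (calls to assert are elidable as well). A minimal sketch of the kind of code the flag affects (the Log object is a made-up example):
import scala.annotation.elidable
import scala.annotation.elidable._

object Log {
  // Calls to this method are removed when compiling with an -Xelide-below threshold above 500 (FINE),
  // e.g. by the prodAssembly command above (elide level 1000).
  @elidable(FINE)
  def debug(msg: String): Unit = println(s"DEBUG: $msg")

  // Kept unless the threshold is above 1000 (SEVERE).
  @elidable(SEVERE)
  def error(msg: String): Unit = println(s"ERROR: $msg")
}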
I'm a little confused on the Scala/SBT documentation for creating Scala tasks. Currently I can run the following from the command line:
sbt ";set target := file(\"$PWD/package/deb-upstart\"); set serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.Upstart; debian:packageBin; set target := file(\"$PWD/package/deb-systemv\"); set serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.SystemV; debian:packageBin; set target := file(\"$PWD/package/rpm-systemd\"); rpm:packageBin"
This resets my target each time to a different directory (deb-upstart, deb-systemv and rpm-systemd) and runs an sbt-native-packager task for each of those settings. (Yes, I realize I'm compiling it three times, but sbt-native-packager doesn't seem to have a setting for the artifact directory.)
This works fine from a bash prompt, but I've been trying to put the same target into Jenkins (replacing $PWD with $WORKSPACE) and I can't seem to get the quote escaping correct. I thought it might be easier just to have a task in either my build.sbt or project/Build.scala that runs all three of those tasks, changing the target variable each time (and replacing $PWD or $TARGET with the full path of the base directory).
I've attempted the following:
lazy val packageAll = taskKey[Unit]("Creates deb-upstart, deb-systemv and rpm-systemd packages")
packageAll := {
target := baseDirectory.value / "package" / "deb-upstart"
serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.Upstart
(packageBin in Debian).value
target := baseDirectory.value / "package" / "deb-systemv"
serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.SystemV
(packageBin in Debian).value
target := baseDirectory.value / "package" / "rpm-systemd"
(packageBin in Rpm).value
}
But the trouble is that the .value calls cause those tasks to be evaluated before my task even runs, so they never see the new target setting (as explained in this other question: How can I call another task from my SBT task?).
So, I figured this out for you :)
As you already mentioned, combining multiple tasks into a single one, where some of the tasks depend on the same setting, doesn't work out as expected.
Instead we do the following:
1. Create a task for each of our custom steps, e.g. packaging Debian for Upstart.
2. Define an alias that executes these commands in order.
Define tasks
lazy val packageDebianUpstart = taskKey[File]("creates deb-upstart package")
lazy val packageDebianSystemV = taskKey[File]("creates deb-systemv package")
lazy val packageRpmSystemD = taskKey[File]("creates rpm-systemd package")
Example task implementation
The implementation is pretty simple and the same for each of
the tasks.
// don't forget this
import com.typesafe.sbt.packager.archetypes._
packageDebianUpstart := {
// define where you want to put your stuff
val output = baseDirectory.value / "package" / "deb-upstart"
// run task
val debianFile = (packageBin in Debian).value
// place it where you want, keeping the original file name
IO.move(debianFile, output / debianFile.getName)
output
}
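The other two tasks follow the same pattern; for example, a hedged sketch of the RPM variant (only the output directory and the packaging scope change):
packageRpmSystemD := {
  // define where you want to put your stuff
  val output = baseDirectory.value / "package" / "rpm-systemd"
  // run the rpm packaging task
  val rpmFile = (packageBin in Rpm).value
  // place it where you want, keeping the original file name
  IO.move(rpmFile, output / rpmFile.getName)
  output
}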
Define alias
And now compose these tasks into a single alias with
addCommandAlias("packageAll", "; clean " +
"; set serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.SystemV" +
"; packageDebianSystemV " +
"; clean " +
"; set serverLoading in Debian := com.typesafe.sbt.packager.archetypes.ServerLoader.Upstart" +
"; packageDebianUpstart " +
"; packageRpmSystemD")
You can look at the complete code here.
Update
Setting the ServerLoader inside the alias seems to be the right way to solve this. A clean is unfortunately necessary between builds of the same output format.
Here is how forked test grouping was configured for sbt 0.12.x:
parallelExecution in Test := false
testGrouping in Test <<= definedTests in Test map { tests =>
tests.map { test =>
import Tests._
import scala.collection.JavaConversions._
new Group(
name = test.name,
tests = Seq(test),
runPolicy = SubProcess(javaOptions = Seq(
"-server", "-Xms4096m", "-Xmx4096m", "-XX:NewSize=3584m",
"-Xss256k", "-XX:+UseG1GC", "-XX:+TieredCompilation",
"-XX:+UseNUMA", "-XX:+UseCondCardMark",
"-XX:-UseBiasedLocking", "-XX:+AlwaysPreTouch") ++
System.getProperties.toMap.map {
case (k, v) => "-D" + k + "=" + v
}))
}.sortWith(_.name < _.name)
}
During migration to Sbt 0.13.x I get the following error:
[error] Could not accept connection from test agent: class java.net.SocketException: socket closed
java.net.SocketException: socket closed
at java.net.DualStackPlainSocketImpl.accept0(Native Method)
at java.net.DualStackPlainSocketImpl.socketAccept(DualStackPlainSocketImpl.java:131)
at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:398)
at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:199)
at java.net.ServerSocket.implAccept(ServerSocket.java:530)
at java.net.ServerSocket.accept(ServerSocket.java:498)
at sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:48)
at java.lang.Thread.run(Thread.java:745)
The migration changes are just updates to the sbt and plugin versions.
Are there any other approaches to forking and ordering of tests in Sbt 0.13.x to overcome that exception?
It works fine on Linux and Mac OS. The error occurs on Windows because the classpath length limit prevents the test agent instance from launching, with the following error in System.err:
Error: Could not find or load main class sbt.ForkMain
I also got this error when moving a Scala repo to sbt.version = 1.3.8 (1.2.8 was previously fine). Strangely, it worked fine on my Mac but failed on the TeamCity Linux build agents.
The fix for me was to set
fork := false,
in build.sbt.
I'm not sure why the repo previously had fork := true (my guess is that it was copied from somewhere else, as there is no strong reason for it in this repo), but this change resolved the issue. Locally on my Mac it also runs a few seconds faster now.
See here for background
https://www.scala-sbt.org/1.0/docs/Forking.html
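For completeness, if you do need forked, grouped tests on 0.13.x, a hedged sketch of how the migrated testGrouping definition might look (<<= replaced by := with .value; the JVM options are abbreviated here):
testGrouping in Test := (definedTests in Test).value.map { test =>
  Tests.Group(
    name = test.name,
    tests = Seq(test),
    // in 0.13.x SubProcess can take ForkOptions instead of a raw Seq of javaOptions
    runPolicy = Tests.SubProcess(
      ForkOptions(runJVMOptions = Seq("-server", "-Xms4096m", "-Xmx4096m"))))
}.sortWith(_.name < _.name)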
When using sbt with forking (fork in run := true), every output from my application to stdout is prefixed by [info]; output to stderr is prefixed with [error].
This behavior is somewhat annoying when using a Java logging framework which outputs to stderr. The resulting debug messages typically look like this:
[error] [main] INFO MyClass ...
[error] [main] DEBUG MyClass ...
I would like to suppress these prefixes, as when running the code without forking. What I have tried:
setting sbt -Dsbt.log.noformat=true in the sbt launch script. But this only removes colored ANSI output; the prefixes are still there, just without color
setting logLevel in run := Level.Error in build.sbt. This does not seem to have any influence on logging when forking.
Is there any way to suppress the prefixes?
You need to set the output strategy of your project.
In my extended build I have the following settings:
settings = Project.defaultSettings ++ Seq(
fork := true, // Fork to separate process
connectInput in run := true, // Connects stdin to sbt during forked runs
outputStrategy := Some(StdoutOutput) // Get rid of output prefix
// ... other settings
)
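With recent sbt versions the same three settings can usually go straight into build.sbt; a minimal sketch:
fork := true                          // fork to a separate process
connectInput in run := true           // connect stdin to sbt during forked runs
outputStrategy := Some(StdoutOutput)  // get rid of the [info]/[error] prefixes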
You can also run
sbt -error ...
or
sbt -warn ...