How can I create a custom sbt task from existing tasks - scala

I want to create a custom sbt task that runs publishLocal in Docker, login in ecr, and push in ecr.
Also note that I want to override the repositoryName in ecr setting only within this custom task.
I tried writing the code below but it did not work.
val releaseDev = taskKey[Unit]("push docker image")
releaseDev := (push in ecr).value
releaseDev := releaseDev.dependsOn(publishLocal in Docker).value
releaseDev := (projectSettings ++ Seq(repositoryName in ecr := (packageName in Docker).value + "-stg" + ":" + (version in Docker).value))

Here is how to trigger existing tasks from your custom one. From your question, you want them to be triggered in a particular sequence, so here is what you need to do:
lazy val releaseDev = taskKey[Unit]("push docker image")
releaseDev := Def.sequential(push in ecr, publishLocal in Docker).value
Now when you trigger releaseDev it will first run push in ecr and then publishLocal in Docker.
Note: don't forget the lazy in your taskKey definition, since omitting it may lead to some strange initialization issues.

Thank you marios.
I tried writing the code below and task dependencies were properly handled.
However, the repositoryName from the project settings is still used; the override scoped to the custom task is ignored.
// docker publish settings
import com.amazonaws.regions.{Region, Regions}
region in ecr := Region.getRegion(Regions.AP_NORTHEAST_1)
repositoryName in ecr := (packageName in Docker).value + ":" + (version in Docker).value
localDockerImage in ecr := (packageName in Docker).value + ":" + (version in Docker).value
// Publisher Setting
//~~~~~~~~~~~~~~~~~~~
import ReleaseTransformations._
releaseProcess := Seq[ReleaseStep](
  checkSnapshotDependencies,
  inquireVersions,
  runClean,
  runTest,
  setReleaseVersion,
  commitReleaseVersion,
  tagRelease,
  ReleaseStep(state => Project.extract(state).runTask(publishLocal in Docker, state)._1),
  ReleaseStep(state => Project.extract(state).runTask(login in ecr, state)._1),
  ReleaseStep(state => Project.extract(state).runTask(push in ecr, state)._1),
  setNextVersion,
  commitNextVersion,
  pushChanges
)
lazy val releaseDev = taskKey[Unit]("push docker image")
releaseDev := Def.sequential( publishLocal in Docker, login in ecr, push in ecr).value
(repositoryName in ecr) in releaseDev := (packageName in Docker).value + "-stg" + ":" + (version in Docker).value
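A possible way to make that override actually apply (a hedged sketch, not from the original thread): a task cannot change a setting, and push in ecr reads repositoryName in ecr in its own scope rather than in the releaseDev scope, so the task-scoped override above is never picked up. A custom command, here hypothetically named releaseDevStg, can re-apply the setting on the build state before running the three tasks:
// Sketch only: re-apply the ecr repository name for the staging release, then run the tasks in order.
// Assumes the sbt-ecr keys (repositoryName, login, push in ecr) and sbt-native-packager's Docker keys used above.
lazy val releaseDevStg = Command.command("releaseDevStg") { state =>
  val extracted = Project.extract(state)
  val stgState = extracted.append(
    Seq(repositoryName in ecr := (packageName in Docker).value + "-stg" + ":" + (version in Docker).value),
    state
  )
  val (s1, _) = Project.extract(stgState).runTask(publishLocal in Docker, stgState)
  val (s2, _) = Project.extract(s1).runTask(login in ecr, s1)
  val (s3, _) = Project.extract(s2).runTask(push in ecr, s2)
  s3
}
commands += releaseDevStg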

Related

How to modify the jar name generated by the sbt package command

For example, if I run sbt package, sbt will generate a jar named something like project_2.11-version.jar; how can I change this to a random name?
Here is what the docs say:
The generated artifact name is determined by the artifactName setting.
This setting is of type (ScalaVersion, ModuleID, Artifact) => String.
And the default implementation is:
artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
  artifact.name + "-" + module.revision + "." + artifact.extension
}
You can copy this code into your build.sbt or Build.scala file and change how it constructs the artifact name. For example:
artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
  java.util.UUID.randomUUID.toString + "." + artifact.extension
}

How can I provision Kubernetes with kops using multi document yaml file?

I've got a multi-document yaml file; how can I provision a Kubernetes cluster from it?
Using kops create -f only seems to import the cluster definition, and does not import the instance groups etc. which are defined in additional 'documents' within the yaml:
...
masters: private
nodes: private
---
apiVersion: kops/v1alpha2
kind: InstanceGroup
...
According to the kops source code, multiple sections in one file are supported.
Perhaps you need to check your file for mistakes.
Here is the part responsible for parsing the file and creating the resources. To keep the code short, I've deleted all the error checks.
For the full details, please see the original file:
func RunCreate(f *util.Factory, out io.Writer, c *CreateOptions) error {
  clientset, err := f.Clientset()
  // Codecs provides access to encoding and decoding for the scheme
  codecs := kopscodecs.Codecs //serializer.NewCodecFactory(scheme)
  codec := codecs.UniversalDecoder(kopsapi.SchemeGroupVersion)
  var clusterName = ""
  //var cSpec = false
  var sb bytes.Buffer
  fmt.Fprintf(&sb, "\n")
  for _, f := range c.Filenames {
    contents, err := vfs.Context.ReadFile(f)
    // TODO: this does not support a JSON array
    sections := bytes.Split(contents, []byte("\n---\n"))
    for _, section := range sections {
      defaults := &schema.GroupVersionKind{
        Group:   v1alpha1.SchemeGroupVersion.Group,
        Version: v1alpha1.SchemeGroupVersion.Version,
      }
      o, gvk, err := codec.Decode(section, defaults, nil)
      switch v := o.(type) {
      case *kopsapi.Cluster:
        // Adding a PerformAssignments() call here as the user might be trying to use
        // the new `-f` feature, with an old cluster definition.
        err = cloudup.PerformAssignments(v)
        _, err = clientset.CreateCluster(v)
      case *kopsapi.InstanceGroup:
        clusterName = v.ObjectMeta.Labels[kopsapi.LabelClusterName]
        cluster, err := clientset.GetCluster(clusterName)
        _, err = clientset.InstanceGroupsFor(cluster).Create(v)
      case *kopsapi.SSHCredential:
        clusterName = v.ObjectMeta.Labels[kopsapi.LabelClusterName]
        cluster, err := clientset.GetCluster(clusterName)
        sshCredentialStore, err := clientset.SSHCredentialStore(cluster)
        sshKeyArr := []byte(v.Spec.PublicKey)
        err = sshCredentialStore.AddSSHPublicKey("admin", sshKeyArr)
      default:
        glog.V(2).Infof("Type of object was %T", v)
        return fmt.Errorf("Unhandled kind %q in %s", gvk, f)
      }
    }
  }
  {
    // If there is a value in this sb, this should mean that we have something to deploy
    // so let's advise the user how to engage the cloud provider and deploy
    if sb.String() != "" {
      fmt.Fprintf(&sb, "\n")
      fmt.Fprintf(&sb, "To deploy these resources, run: kops update cluster %s --yes\n", clusterName)
      fmt.Fprintf(&sb, "\n")
    }
    _, err := out.Write(sb.Bytes())
  }
  return nil
}
This may not have been possible in the past, but it's possible now.
Here's how I'm currently IaC'ing my kops cluster.
I have 2 files:
devkube.mycompany.com.cluster.kops.yaml
instancegroups.kops.yaml
The 2nd file is a multi-document yaml file like you're requesting.
(with multiple InstanceGroup documents inside it, separated by ---).
Note: I've successfully combined both of these .yaml files into a single multi-document .yaml file and it works fine; I just do it this way so I can reuse instancegroups.kops.yaml on multiple clusters and keep the git repo DRY.
I generated the files by using:
Bash# kops get cluster --name devkube.mycompany.com -o yaml > devkube.mycompany.com.cluster.kops.yaml
Bash# kops get ig --name devkube.mycompany.com -o yaml > instancegroups.kops.yaml
Here's how I use these files to provision a kops cluster from git:
export AWS_PROFILE=devkube-kops
export KOPS_STATE_STORE=s3://kops-state-devkube.mycompany.com
clustername="devkube.mycompany.com"
kops replace --name $clustername -f ../kops/devkube.mycompany.com.cluster.kops.yaml --force
kops replace --name $clustername -f ../kops/instancegroups.kops.yaml --force
kops create secret --name $clustername sshpublickey admin -i ~/.ssh/id_rsa.pub #so the person who provisions the cluster can ssh into the nodes
kops update cluster --name $clustername --yes

sbt-release publish artifact to wrong repository

So I'm trying to use sbt-release, and I'm having issues where it's publishing the artifact to my snapshot repository, and not the release repository.
val Organization = "com.mycompany"
val Name = "My Service"
val Version = "0.1-SNAPSHOT"
...
settings = Defaults.coreDefaultSettings ++ Seq(
  name := Name,
  organization := Organization,
  version := Version,
  scalaVersion := ScalaVersion
  ...
  assemblyJarName in assembly := s"my-service-${Version}.jar",
  ...
)
publishTo := {
  val nexus = "my.nexus.url.com/repositories/"
  if (isSnapshot.value)
    Some("snapshots" at nexus + "snapshots/")
  else
    Some("releases" at nexus + "releases/")
},
credentials += Credentials(Path.userHome / ".ivy2" / ".credentials")
If I remove the -SNAPSHOT from the version, then it publishes it to the correct repository, but shouldn't sbt-release be telling it to do that by itself?
Also if I get rid of the if (isSnapshot.value) then sbt publish will also publish to the wrong repository.
If I could get some help on this, I would really appreciate it.
It was the version I had hard-coded here. It was overriding version.sbt, which is where 0.1-SNAPSHOT should be stored.
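In other words, the fix is to let sbt-release own the version. A minimal sketch of that layout (hedged; this reflects sbt-release's default convention rather than code from the original answer):
// version.sbt (project root) -- sbt-release reads and bumps this file during a release
version in ThisBuild := "0.1-SNAPSHOT"
// build definition -- remove the hard-coded `version := Version` so it no longer shadows version.sbt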

How to publish to multiple repositories in SBT?

I am in the middle of upgrading Nexus version. As part of the process I've set up a new Nexus instance which will run in parallel with the older Nexus instance.
While migrating to the new instance I want to thoroughly test and vet the new instance before pulling the plug on older instance. This requires me to temporarily modify the publish workflow in such a way that sbt publishes the artifacts to both the Nexus instances.
I highly doubt the following code will actually work:
publishTo <<= (version) {
  version: String =>
    if (version.trim.endsWith("SNAPSHOT")) Some("snapshots" at "http://maven1.dev.net:8081/nexus/content/" + "repositories/snapshots/")
    else Some("releases" at "http://maven1.dev.net:8081/nexus/content/" + "repositories/releases/")
},
credentials += Credentials("Sonatype Nexus Repository Manager", "maven1.dev.net", "release-eng", "release"),
publishTo <<= (version) {
  version: String =>
    if (version.trim.endsWith("SNAPSHOT")) Some("snapshots" at "http://maven2.dev.net:8081/nexus/content/" + "repositories/snapshots/")
    else Some("releases" at "http://maven2.dev.net:8081/nexus/content/" + "repositories/releases/")
},
credentials += Credentials("Sonatype Nexus Repository Manager", "maven2.dev.net", "release-eng", "release"),
I also tried looking into a plugin called sbt-multi-publish but I couldn't compile and use it, either.
With Commands and How to change a version setting inside a single sbt command?, I could define a new command - myPublishTo - that changes the publishTo setting before executing the original publish task:
def myPublishTo = Command.command("myPublishTo") { state =>
  val extracted = Project.extract(state)
  Project.runTask(
    publish in Compile,
    extracted.append(List(publishTo := Some(Resolver.file("file", target.value / "xxx"))), state),
    true
  )
  Project.runTask(
    publish in Compile,
    extracted.append(List(publishTo := Some(Resolver.file("file", target.value / "yyy"))), state),
    true
  )
  state
}
commands += myPublishTo
With this, you can execute myPublishTo as any other command/task.
You could also define a couple of aliases - pxxx, pyyy and pxy - in build.sbt that would execute a series of commands using ;.
addCommandAlias("pxxx", "; set publishTo := Some(Resolver.file(\"file\", target.value / \"xxx\")) ; publish") ++
addCommandAlias("pyyy", "; set publishTo := Some(Resolver.file(\"file\", target.value / \"yyy\")) ; publish") ++
addCommandAlias("pxy", "; pxxx ; pyyy")
In the sbt console you can execute them as any other commands/tasks.
[sbt-0-13-1]> alias
pxxx = ; set publishTo := Some(Resolver.file("file", target.value / "xxx")) ; publish
pyyy = ; set publishTo := Some(Resolver.file("file", target.value / "yyy")) ; publish
pxy = ; pxxx ; pyyy
[sbt-0-13-1]> pxy
[info] Defining *:publishTo
[info] The new value will be used by *:otherResolvers, *:publishConfiguration
[info] Reapplying settings...
[info] Set current project to sbt-0-13-1 (in build file:/Users/jacek/sandbox/so/sbt-0.13.1/)
...
[info] published sbt-0-13-1_2.10 to /Users/jacek/sandbox/so/sbt-0.13.1/target/xxx/default/sbt-0-13-1_2.10/0.1-SNAPSHOT/sbt-0-13-1_2.10-0.1-SNAPSHOT-javadoc.jar
[success] Total time: 1 s, completed Jan 9, 2014 11:20:48 PM
[info] Defining *:publishTo
[info] The new value will be used by *:otherResolvers, *:publishConfiguration
[info] Reapplying settings...
...
[info] published sbt-0-13-1_2.10 to /Users/jacek/sandbox/so/sbt-0.13.1/target/yyy/default/sbt-0-13-1_2.10/0.1-SNAPSHOT/sbt-0-13-1_2.10-0.1-SNAPSHOT-javadoc.jar
[success] Total time: 0 s, completed Jan 9, 2014 11:20:49 PM
This is an old question, but the problem persists. I tried to revive sbt-multi-publish, but it's really old (sbt-0.12) and uses some sbt internals that are hard to deal with. So I took another approach and wrote a new plugin: sbt-publish-more.
It doesn't involve any on-the-fly settings changes or custom commands like the other answer does.
After you add the plugin, just set the resolvers you want to publish to (taking your code as an example):
publishResolvers := {
  val suffix = if (isSnapshot.value) "snapshots" else "releases"
  Seq(
    s"Maven1 ${suffix}" at s"http://maven1.dev.net:8081/nexus/content/repositories/${suffix}/",
    s"Maven2 ${suffix}" at s"http://maven2.dev.net:8081/nexus/content/repositories/${suffix}/"
  )
}
Then call the publishAll task; it will publish to both repositories.
You can also publish to different repositories with different configurations. Check the usage docs for details.

Building paths in SBT for the packageMappings of the sbt-native-packager

I am very new to SBT and need to create an RPM package for one of my projects. The RPM contains only one file, which is a one-jar created by the sbt-onejar plugin. I want to use the sbt-native-packager plugin and have created a Packaging.scala file under the /project directory like this:
object Packaging {
  val settings: Seq[Setting[_]] = packagerSettings ++ deploymentSettings ++ mapGenericFilesToLinux ++ Seq(
    maintainer := "Team",
    packageSummary := "Summary",
    packageDescription := """Description""",
    mappings in Universal += {
      file("target/scala-2.10/projectname_2.10-0.1-one-jar.jar") -> "/opt/projectname/projectname-0.1.jar"
    },
    linuxPackageMappings in Rpm <+= (baseDirectory) map { _:File =>
      (packageMapping(file("target/scala-2.10/projectname_2.10-0.1-one-jar.jar") -> "/opt/projectname/projectname-0.1.jar")
        withUser "someusr" withGroup "somegroup" withPerms "0755")
    },
    name in Rpm := "projectname",
    version in Rpm <<= version apply { sv => sv split "[^\\d]" filterNot (_.isEmpty) mkString "." },
    rpmRelease := "1",
    rpmVendor := "Vendor",
    rpmUrl := Some("url"),
    rpmGroup := Some("group"),
    rpmLicense := Some("BSD")
  )
}
1) I don't want to hardcode the file names. Instead of having "target/scala-2.10/projectname_2.10-0.1-one-jar.jar" I need a way to use existing SettingKeys, i.e. target + "scala-" + scalaVersion + "/" + name + "_" + scalaVersion + "-" + version + "-one-jar.jar" - how do you do this?
2) For the value rpmRelease := "1" I want to use a system property, i.e. in Maven I would use ${rpm.buildNumber} - how does that work in SBT?
3) Is there anything I should do better in regards to the sbt-native-packager plugin?
1) You should always use task output in sbt rather than raw filesystem lookups. Because sbt has parallel execution, if you don't put an explicit dependency on the output of a task, then you have no guarantee that a file will be created before you run your task.
In that vein you want to change your package mappings line to be something like this:
mappings in Universal += {
  oneJar.value -> "/opt/projectname/projectname-0.1.jar"
},
Where the oneJar key is defined in the onejar plugin.
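Regarding the first point about hard-coded paths, the destination path can likewise be assembled from existing settings. A small sketch (an assumption on my part, not from the original answer; it reuses the oneJar key together with the standard name/version settings, and the /opt layout is just the example from the question):
mappings in Universal += {
  // build both sides from existing keys instead of hard-coding the jar path
  oneJar.value -> s"/opt/${name.value}/${name.value}-${version.value}.jar"
},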
2) Sbt just uses Scala for the build language, so you can grab system properties the same way you would in any Scala code (but please also provide a default):
rpmRelease := Option(sys.props("rpm.buildNumber")) getOrElse "1"
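As a usage note (hedged): if the property is passed on the sbt command line, e.g. sbt -Drpm.buildNumber=42 rpm:packageBin, the release number becomes 42 instead of the default "1".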
3) Right now you're defining a generic package and redefining the same file in the Rpm with a different user. The mapGenericFilesToLinux settings still lack a few customizations, but if you're not creating universal distributions, you should be able to drop that bit of settings and instead directly configure your linux package:
linuxPackageMappings in Rpm <+= (oneJar) map { jar: File =>
  (packageMapping(jar -> "/opt/projectname/projectname-0.1.jar")
    withUser "someusr" withGroup "somegroup" withPerms "0755")
},