Running sbt tasks from the command line without plugins.sbt

How do I run a task from a plugin that is not declared in plugins.sbt?
In Maven it's as easy as:
mvn groupId:artifactId:1.2.3:goal
e.g. mvn org.owasp:dependency-check-maven:7.1.1:check
In sbt I get:
$ sbt net.vonbuchholtz:sbt-dependency-check:4.1.0:check
...
[error] Expected ID character
[error] Not a valid command: net (similar: set, new, inspect)
[error] Expected project ID
[error] Expected configuration
[error] Expected ':'
[error] Expected key
[error] Not a valid key: net (similar: test, name, run)
[error] net.vonbuchholtz:sbt-dependency-check:4.1.0:check
[error] ^

You have to add the plugin to sbt to be able to call its tasks.
If you don't want to add it to the project, you can add it globally:
// Put things into
// ~/.sbt/1.0/plugins/plugins.sbt
// Actually, you can name the file differently as long as it's in
// ~/.sbt/1.0/plugins/
// and ends with .sbt, sbt will load all files ending with .sbt
// from there.
addSbtPlugin("net.vonbuchholtz" % "sbt-dependency-check" % "4.1.0")
Alternatively, you can define a globally ignored file in your global .gitignore (assuming you are using git and don't want to accidentally commit things):
// ~/.gitconfig
[core]
excludesfile = ~/.gitignore_global
// ~/.gitignore_global
local.sbt
Then you can add whatever you want to such a file, e.g. local.sbt - you can make ad hoc changes in your repo without worrying that they will be committed upstream.
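A sketch of what such an ignored local.sbt might contain; placing it under project/ is an assumption (sbt loads any *.sbt file in project/ for plugin definitions, and the local.sbt pattern above ignores it wherever it lives):
// project/local.sbt - ignored by git via ~/.gitignore_global,
// but loaded by sbt alongside project/plugins.sbt
addSbtPlugin("net.vonbuchholtz" % "sbt-dependency-check" % "4.1.0")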
As yet another option, you can put these plugin configs into some file, e.g. ~/extra_plugins.sbt, and pass it on the command line:
// ~/extra_plugins.sbt
addSbtPlugin("net.vonbuchholtz" % "sbt-dependency-check" % "4.1.0")
sbt --addPluginSbtFile=~/extra_plugins.sbt dependencyCheck
(Leave the ~ unquoted so the shell expands it, or spell out the full path.)
As far as I can tell, you cannot skip the step of creating the .sbt file. At best you could auto-generate it and populate it with a script:
# something like this
tmp_sbt=$(mktemp)
echo 'addSbtPlugin("net.vonbuchholtz" % "sbt-dependency-check" % "4.1.0")' > "$tmp_sbt"
sbt --addPluginSbtFile="$tmp_sbt" dependencyCheck
rm "$tmp_sbt"

Related

Build Scala Play app using different config file

I have a Scala Play application built using sbt.
When I build the package using the sbt dist command, the result is an archive under the target/universal path. The problem is that I have 3 configuration files:
application.conf
dev.conf
prod.conf
If I try to run sbt dist -Dconfig.resource=prod.conf, I get an error:
[error] Expected ID character
[error] Not a valid command: prod (similar: stopProd, runProd, reload)
[error] Expected project ID
[error] Expected configuration
[error] Expected ':' (if selecting a configuration)
[error] Expected key
[error] Not a valid key: prod (similar: products, projectId, project-id)
[error] prod.conf
Even if I use quotes, or -Dconfig.file, the result is the same. Where am I wrong? Thanks.
Late edit:
sbt dist will build the bin/*.bat files, but the rest of the files are left as they are. So, if I go for this .zip solution, the package will be the same for all environment types (dev, stage, prod), but I can choose the config file in the command that starts the application.
For example, for prod I can use:
nohup ./app-name-1.0/bin/app-name -J-Xms8g -J-Xmx8g -Dconfig.resource=prod.conf -Dhttp.port=7777 </dev/null >> logfile.log 2>&1 &
where -Dconfig.resource=prod.conf does what I asked for in this post.
The dist command does not take a config argument, which is why you are getting the "not a valid command" error.
You can use a system property to specify a config file: https://www.playframework.com/documentation/2.8.x/ConfigFile
This allows you to have a different file loaded for your production environment.
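If you would rather bake the choice into the package itself, one possible approach is to add the system property to the generated start scripts from build.sbt. This is only a sketch, assuming the sbt-native-packager setup that Play's dist task uses; check the key names against your plugin version:
// build.sbt - bake the config choice into the packaged start scripts
javaOptions in Universal ++= Seq(
  "-Dconfig.resource=prod.conf"
)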
I found a tutorial on scala-sbt.org for sbt-native-packager, but got strange errors.
For the moment, the only solution for using a different config file with a project packaged by sbt dist is to specify the config file in the start command, as I showed above (-Dconfig.resource=prod.conf):
nohup ./path -J-Xms8g -J-Xmx8g -Dconfig.resource=prod.conf -Dhttp.port=7777 </dev/null >> logfile.log 2>&1 &
So, the first step is to use sbt dist to create the zip package without worrying about the config file. Then unzip the file on the server and use the command above to start the application.

How do I publish a scala app, if I'm asked for credentials?

I issue the sbt publish command, and get a prompt asking me to enter a username and a password. Can I provide them in build.sbt or some place else, so I don't have to manually enter them?
In my build.sbt file I have this:
publishTo := Some(Resolver.sftp("Server", "url", "port"))
You can put the credentials in a file and reference it from credentials.sbt so that sbt loads it and uses it when publishing or downloading dependencies.
STEP1: set up the credentials file path in ~/.sbt/1.0/plugins/credentials.sbt
echo 'credentials += Credentials(Path.userHome / ".sbt" / ".credentials")' > ~/.sbt/1.0/plugins/credentials.sbt
Note: echo some-stuff > some-file redirects the content into that file.
STEP2: your ~/.sbt/.credentials would look like:
realm=Artifactory Realm
host=server.com
user=your.username.for.server.com
password=password.for.server.com
The realm might instead be, e.g., Sonatype Nexus Repository Manager, depending on your server, and the host must not include the http:// or https:// protocol.
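If you prefer not to keep a separate properties file, the same four values can also be passed to Credentials directly; a minimal sketch using the same placeholder values as above (avoid committing a real password in a tracked build file, of course):
// e.g. in ~/.sbt/1.0/plugins/credentials.sbt instead of the file-based variant
credentials += Credentials(
  "Artifactory Realm",             // realm
  "server.com",                    // host, without the protocol
  "your.username.for.server.com",  // user
  "password.for.server.com"        // password
)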
STEP3: set up the publish config in build.sbt, something like below:
publishTo in ThisBuild := {
  // include the protocol in the resolver URL (unlike the host entry in .credentials)
  if (isSnapshot.value)
    Some("Artifactory Realm" at "https://server.com" + "/artifactory/libs-snapshot-local")
  else
    Some("Artifactory Realm" at "https://server.com" + "/artifactory/libs-release-local")
}
STEP4: you can verify that credentials.sbt is picked up by sbt just by running sbt clean compile:
$ sbt clean compile
[info] Loading settings for project global-plugins from idea.sbt,credentials.sbt ...
Related resources:
Official documentation: https://www.scala-sbt.org/1.0/docs/Publishing.html
How to access a secured Nexus with sbt?
SBT publish to JFrog artifactory
Look closely in the logs for any location where sbt might be looking for the credentials file, e.g. Unable to find credentials file /root.ivy2/.credentials.
Notice that root and ivy2 are separated by a dot and not a slash. Create that directory if it does not exist and place the credentials, as shown above, in the .credentials file (using cat or any editor on Linux/macOS).
sbt version: 0.13

ignoreUntrackedFiles with sbt-release plugin

I'm trying to have the release ignore my unversioned files. This support ticket says that I just need to add the key ignoreUntrackedFiles := true to my Build.scala file.
When I do this, I get an error trying to build my project:
[error] /home/chris/dev/scalacoin/project/Build.scala:19: not found: value ignoreUntrackedFiles
[error] ignoreUntrackedFiles := true,
I've added the plugin to my project, but my Build.scala file cannot find that key. How do I tell my Build.scala file about that plugin's key?
There is a big green 'Open' button on that ticket, indicating that the feature is an open pull request, i.e. it doesn't exist yet in the released plugin.
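As a general note for keys that do exist in a released plugin: a .scala build definition does not get the automatic key imports that .sbt files do, so the plugin's keys usually need an explicit import at the top of Build.scala. A sketch only; the exact object name is an assumption and depends on the sbt-release version, so check the plugin's README:
// project/Build.scala - hypothetical import; object/key names vary by plugin version
import sbtrelease.ReleasePlugin.autoImport._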

How do I invoke a shell command during a Play 2.0 build (sbt)?

I would like to add a symlink from the .git/hooks directory to a file in my working tree during a regular Play! framework 2.0 build. According to the Play documentation, all sbt functionality is available as normal in a Play build. Based on google searches, I'm trying to add this code to the ApplicationBuild object in my project/Build.scala file:
val symlinkGitPrepushHookTask = compile in Compile <<= compile in Compile map { comp =>
  val output = "ln -sf ../../.hooks/pre-push.py .git/hooks/pre-push".!!
  print(output)
  comp
}
From my reading of the sbt docs, this should add a dependency to the compile task in the Compile scope: the new value depends on the existing one, with my additional function mapped over it, so whenever the compile task runs, my anonymous function should run too. However, it does not successfully create the symlink, and does not even seem to run.
Immediately after posting this, I thought I would try adding the example I had found to the project/plugins.sbt file. This works, although it seems an abuse of a file meant for specifying plugins.
... existing plugins.sbt content ...

compile in Compile <<= compile in Compile map { comp =>
  "ln -sf ../../.hooks/pre-push.py .git/hooks/pre-push".!!
  comp
}
The blank line is critical, as this is the delimiter in the .sbt format.
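For reference, the <<= operator was removed in later sbt versions. A sketch of the same idea in current sbt syntax, with the same assumed hook paths as in the question, placed in build.sbt:
// build.sbt - re-create the symlink whenever compile runs (sbt 1.x syntax)
import scala.sys.process._

Compile / compile := {
  val comp = (Compile / compile).value  // keep the original compile result
  val output = "ln -sf ../../.hooks/pre-push.py .git/hooks/pre-push".!!
  print(output)
  comp
}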

Scala sbt - Cleaning files with a wildcard or regex

Suppose I have a Scala program that creates files ending in .foo.
I'm building with sbt and want to remove these files whenever sbt clean is called.
Add additional directory to clean task in SBT build shows that a single file can be added with:
cleanFiles <+= baseDirectory { _ / "test.foo" }
However, it's unclear how to extend this to do:
cleanFiles <append> <*.foo>
All .foo files will be in the same directory, so I don't need to check directories recursively, though that would also be interesting to see.
How can I configure sbt to clean files matching a wildcard or regex? Is it a bad design decision to have sbt clean remove files my program generates? Should I instead use a flag in my program? Using sbt clean seems cleaner to me than having to call sbt clean and then sbt "run --clean".
This will find anything that matches *.foo in the base directory (but not child directories):
cleanFiles <++= baseDirectory (_ * "*.foo" get)
This works because the base directory (a File) gets implicitly converted to a PathFinder, which has methods such as * (match the pattern in that directory) and ** (match the pattern in that directory and its children). get then turns the result back into a Seq[File].
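For newer sbt versions, where the <++= operator no longer exists, an equivalent sketch of the same setting:
// build.sbt - sbt 1.x syntax for the same wildcard clean
cleanFiles ++= (baseDirectory.value * "*.foo").get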