container:start Not a valid key: start at Scalatra-website-examples/deployment - scala

I'm trying to follow this tutorial about Scalatra deployment.
But I get an error on > container:start. I'm very new to Scala and have no idea what to do; what I found by googling tells me to set up xsbt-web-plugin, but is that right? Shouldn't the examples be fully self-contained, without such custom dependencies, or is this the Scala way?
Update (console output):
➜ scalatra-heroku git:(master) sbt
[info] Loading project definition from /Users/user1/folder1/scalatra-website-examples/2.4/deployment/scalatra-heroku/project
[info] Compiling 1 Scala source to /Users/user1/folder1/scalatra-website-examples/2.4/deployment/scalatra-heroku/project/target/scala-2.10/sbt-0.13/classes...
[warn] there were 1 deprecation warning(s); re-run with -deprecation for details
[warn] one warning found
[info] Set current project to Heroku Example (in build file:/Users/user1/folder1/scalatra-website-examples/2.4/deployment/scalatra-heroku/)
> container:start
[error] Not a valid key: start (similar: state, startYear, target)
[error] container:start
[error]

It looks like that README is messed up. Try this example instead:
https://github.com/kissaten/scalatra-heroku
And you can always follow this guide:
http://scalatra.org/2.4/guides/deployment/heroku.html
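For reference, container:start only exists when a servlet-container plugin is on the build's classpath, and in newer xsbt-web-plugin releases the tasks live under jetty: rather than container:, which is one common reason container:start stops being a valid key. A minimal sketch of that setup, assuming a recent plugin version (the version number and task name are assumptions; the guide linked above is authoritative):
// project/plugins.sbt
addSbtPlugin("com.earldouglas" % "xsbt-web-plugin" % "4.2.4")

// build.sbt
enablePlugins(JettyPlugin)
With that in place, the command in the sbt shell would be jetty:start rather than container:start.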

Related

How to have a SBT subproject depending on the root project?

I'm trying to use mdoc in a simple SBT single-project build (link). mdoc requires me to create a subproject for the documentation, but I'd like to avoid moving all the code to a subfolder. I was trying to create a docs subproject that depends on the root project:
lazy val core = project.in(file("."))
lazy val docs = project.in(file("docs")).dependsOn(core)
However, this makes SBT try to find a JAR for my root project (obviously not finding it):
(sbt) core ❯ docs/compile
[info] Updating
[info] Resolved dependencies
[warn]
[warn] Note: Unresolved dependencies path:
[error] stack trace is suppressed; run last docs / update for the full output
[error] (docs / update) sbt.librarymanagement.ResolveException: Error downloading net.ruippeixotog:akka-testkit-specs2_2.12:0.3.0-SNAPSHOT
[error] Not found
[error] Not found
[error] not found: /Users/rui/.ivy2/local/net.ruippeixotog/akka-testkit-specs2_2.12/0.3.0-SNAPSHOT/ivys/ivy.xml
[error] not found: https://repo1.maven.org/maven2/net/ruippeixotog/akka-testkit-specs2_2.12/0.3.0-SNAPSHOT/akka-testkit-specs2_2.12-0.3.0-SNAPSHOT.pom
[error] Total time: 0 s, completed Dec 28, 2019 11:27:50 PM
With other subproject dependencies (e.g. if I make the core project point to another folder), SBT adds a direct classpath dependency on the subproject's target/ folder. Why is the root project handled differently? Is there another way to make this work?
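For comparison, a minimal sketch of the alternative layout the question mentions, with the code moved into a core/ subfolder and a plain aggregating root (the folder names are assumptions, not part of the original build):
// build.sbt: both projects live in subfolders; the root only aggregates them
lazy val core = project.in(file("core"))
lazy val docs = project.in(file("docs")).dependsOn(core)
lazy val root = project.in(file(".")).aggregate(core, docs)
With this layout, docs/compile picks up core's compiled classes as an internal classpath dependency rather than trying to resolve it as a published JAR.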

sbt Hello World raising UnknownFormatConversionException: Conversion = '0'

I tried to create a hello world following the instructions given in the sbt documentation.
$ sbt new sbt/scala-seed.g8
WARN: No sbt.version set in project/build.properties, base directory: /Users/pankaj/Work/Code/learn
[warn] Executing in batch mode.
[warn] For better performance, hit [ENTER] to switch to interactive mode, or
[warn] consider launching sbt without any commands, or explicitly passing 'shell'
[info] Set current project to learn (in build file:/Users/pankaj/Work/Code/learn/)
Minimum Scala build.
name [My Something Project]: hello
Template applied in ./hello
$
Now I go into the "hello" directory just created and run "sbt".
$ sbt
[info] Loading project definition from /Users/pankaj/Work/Code/learn/hello/project
[info] Updating {file:/Users/pankaj/Work/Code/learn/hello/project/}hello-build...
[info] Resolving org.scala-sbt.ivy#ivy;2.3.0-sbt-2cf13e211b2cb31f0d3b317289dca70
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/pankaj/Work/Code/learn/hello/project/target/scala-2.10/sbt-0.13/classes...
java.util.UnknownFormatConversionException: Conversion = '0'
at java.util.Formatter.checkText(Formatter.java:2579)
at java.util.Formatter.parse(Formatter.java:2565)
at java.util.Formatter.format(Formatter.java:2501)
…
…
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
[error] (compile:compile) java.util.UnknownFormatConversionException: Conversion = '0'
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q
$
First, I am clueless as to why the build tool is using scala-2.10 when the generated build.sbt file has "scalaVersion" 2.12.1 specified under the "inThisBuild" settings.
Second, and the main issue: how do I debug and fix the
UnknownFormatConversionException: Conversion = '0'
Not sure how to get past the "Hello, World!" example.
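For what it's worth, the scala-2.10 part is expected: with sbt 0.13 the build definition under project/ is compiled with the Scala version sbt itself runs on (2.10), regardless of the scalaVersion set in build.sbt, which only applies to your own sources. The "No sbt.version set" warning can be silenced by pinning the launcher version in project/build.properties; a minimal sketch, where the exact version number is an assumption:
# project/build.properties
sbt.version=0.13.18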

Scala SBT compile failing

I am trying to follow the code from the link below:
http://spark.apache.org/docs/latest/quick-start.html
But when I try to create the package, it fails. I want to know two things:
obviously, why it is failing
why it is resolving an older version of Scala when I specified 2.11
Below is the error message.
[info] Set current project to default-0464ce (in build file:/home/ubuntu/simple_sbt/)
[info] Updating {file:/home/ubuntu/simple_sbt/}default-0464ce...
[info] Resolving org.scala-lang#scala-library;2.9.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /home/ubuntu/simple_sbt/target/scala-2.9.1/classes...
[error] /home/ubuntu/simple_sbt/src/main/scala/SimpleApp.scala:1: object apache is not a member of package org
[error] import org.apache.spark.SparkContext
[error] ^
[error] /home/ubuntu/simple_sbt/src/main/scala/SimpleApp.scala:2: object apache is not a member of package org
[error] import org.apache.spark.SparkContext._
[error] ^
[error] two errors found
[error] {file:/home/ubuntu/simple_sbt/}default-0464ce/compile:compile: Compilation failed
[error] Total time: 2 s, completed Aug 30, 2016 3:19:18 AM
When you run sbt package, it sometimes fails because the dependencies for the imported files have not been downloaded and resolved yet.
Try running sbt run first and then sbt package. sbt run should bring in all the dependencies, on top of which packaging and compiling will be possible.
If the above does not solve your problem, you need to share your sbt build file and the environment you are using. The directory from which you run these commands also plays a role.
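If it helps, the Spark quick-start guide expects a small build file (e.g. simple.sbt) at the project root next to src/; a minimal sketch, where the Spark and Scala versions are assumptions to be adjusted to your installation:
name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2"
A resolution of scala-library 2.9.1 together with the generic default-0464ce project name usually means sbt never picked up a scalaVersion (or name) at all, typically because the build file is missing, misnamed, or not in the directory sbt was started from.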

Why is Typesafe activator command `activator dependencies` not working?

I created a new project using Typesafe Activator. In the command prompt I execute the command activator dependencies. This results in:
E:\sample_app>activator dependencies
[info] Loading project definition from E:\sample_app\project
[info] Updating {file:/E:/sample_app/project/}sample_app-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Set current project to sample_app (in build file:/E:/sample_app/)
[error] Not a valid command: dependencies
[error] Not a valid project ID: dependencies
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: dependencies (similar: all-dependencies, rpm-dependencies, allDependencies)
[error] dependencies
[error] ^
Versions used:
Java 1.8.0_51
Activator 1.3.6
Windows 8, 64-bit
Analysis
First, dependencies is not a valid sbt command. (All sbt commands can also be used in activator.)
Solution
Either you mean libraryDependencies (which is an sbt setting), in which case call
activator libraryDependencies
Or you want to see the classpath of the dependencies (an sbt task, so you need to use show to see the task's output), e.g.
activator "show dependencyClasspath"
Edit as of 2015-09-30, 3:50 AM:
When calling from the console, the combined command must be put into quotes, i.e. "show dependencyClasspath".
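For context, libraryDependencies is the setting you declare in build.sbt, whereas dependencyClasspath is a task computed from it; a minimal illustrative sketch (the dependency shown is just an example, not taken from the original project):
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"
activator libraryDependencies then prints the setting's value, while activator "show dependencyClasspath" runs the task and shows the resolved JARs.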

"./sbt/sbt assembly" errors "Not a valid command: assembly" for Apache Spark project

I'm having trouble installing Apache Spark on Ubuntu 13.04. I'm using spark-0.8.1-incubating, and both ./sbt/sbt update and ./sbt/sbt compile work fine. However, when I do ./sbt/sbt assembly I get the following error:
[info] Set current project to default-289e76 (in build file:/node-insights/server/lib/spark-0.8.1-incubating/sbt/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error]
I googled for stuff related to this but couldn't find anything useful. Any guidance would be much appreciated.
The "Set current project to default-289e76" message suggests that sbt was called from outside of the Spark sources directory:
$ /tmp ./spark-0.8.1-incubating/sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Set current project to default-d0f036 (in build file:/private/tmp/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error] ^
Running ./sbt/sbt assembly works fine from the spark-0.8.1-incubating directory (note the log output showing that the current project was set correctly):
$ spark-0.8.1-incubating sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project/project
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project
[info] Set current project to root (in build file:/private/tmp/spark-0.8.1-incubating/)
...
You typed "abt" twice, but shouldn't that be "sbt"? Apache Spark has its own copy of sbt, so make sure you're running Spark's version to pick up the "assembly" plugin among other customizations.
To run the Spark installation of sbt, go to the Spark directory and run ./sbt/sbt assembly.