Unresolved symbol s2 in Specs2 class - specs2

When I compile my specification, the compiler tells me
"error: value s2 is not a member of StringContext"
The salient portion of my specification class is:
import org.specs2._
import specification._
import mock._
class EnterpriseDirectoryActionSpec extends Specification { def is = s2"""
An enterprise directory action should provide enabled fields
after a call to doDefault ${c().e1}
after a call to doSearchPrevious ${c().e2}
after a call to doSearchNext ${c().e3}
after a call to doExecuteSearch ${c().e4}
"""
...
What is causing the error, and how can I correct it?
I'm using Specs2 (artifact specs2_2.10) version 1.14.

You need to use a later version of specs2: 2.0-RC1 or 2.0-RC2-SNAPSHOT. The s2 string interpolator is not available in 1.14, which is why the compiler reports that it is not a member of StringContext.
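If your build uses sbt rather than Maven, the equivalent dependency would look something like this (a sketch; adjust the version to whichever 2.0 release candidate you pick, and the resolver is only needed for -SNAPSHOT versions):

```scala
// build.sbt -- sketch, assuming specs2 2.0-RC1
libraryDependencies += "org.specs2" %% "specs2" % "2.0-RC1" % "test"

// only needed if you use a -SNAPSHOT version
resolvers += "sonatype snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
```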

For the benefit of others reading this, I put the following into my pom.xml:
<dependency>
  <groupId>org.specs2</groupId>
  <artifactId>specs2_2.10</artifactId>
  <version>2.0-RC2-SNAPSHOT</version>
  <scope>test</scope>
</dependency>
...along with the repository entry for snapshots:
<!--
  We need this repository in order to have access to a snapshot version of
  the Specs2 testing library for Scala. In particular, the snapshot version
  includes support for using string interpolation in test specifications.
-->
<repository>
  <id>oss.sonatype.org</id>
  <name>snapshots</name>
  <url>http://oss.sonatype.org/content/repositories/snapshots</url>
</repository>

Jacoco is analysing class twice and failing

I currently have this configuration for JaCoCo in my pom.xml:
<plugin>
  <!-- Maven JaCoCo Plugin configuration for Code Coverage Report -->
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <inherited>true</inherited>
  <configuration>
    <excludes>
      <exclude>**/*Test.class</exclude>
    </excludes>
  </configuration>
</plugin>
When I run mvn clean verify site, the build fails with the following warning (importantly, I have only one class, SayHello.scala):
Analyzed bundle 'dummy-project' with 2 classes
[WARNING] Rule violated for bundle dummy-project: classes missed count is 1, but expected maximum is 0
Failed to execute goal org.jacoco:jacoco-maven-plugin:0.8.5:check (default-check) on project dummy-project: Coverage checks have not been met. See log for details.
Finally, when I check the report, it is analysing the same class twice (the only difference is an extra "." at the end of the second entry), and one of the two fails:
Update
SayHello.scala
package com.dummy
object SayHello {
  def sayIt: String = "hello, world"
}
SayHelloTest.scala
package com.dummy
import org.scalatest.funsuite.AnyFunSuite
import org.scalatest.matchers.should.Matchers
class SayHelloTest extends AnyFunSuite with Matchers {
  test("SayHello says hello") {
    SayHello.sayIt shouldBe "hello, world"
  }
}
Anyone had a similar issue? Thank you.
JaCoCo analyzes .class files, not source files, and the Scala compiler may produce multiple .class files from a single source file. Your SayHello.scala defines an object: a Scala object is always compiled to an additional class of the same name with $ appended, which implements the singleton at the bytecode level. If you look in your target/classes directory, you'll most likely see two files there: SayHello.class and SayHello$.class.
Two records in the JaCoCo report correspond to those two class files. The dot at the end instead of a $ is most likely a JaCoCo report rendering issue.
To skip the companion object class from analyzing, just add it to your exclusion list:
<configuration>
  <excludes>
    <exclude>**/*Test.class</exclude>
    <exclude>**/*$.class</exclude>
  </excludes>
</configuration>
However, it seems that coverage of the methods in the object is attributed to SayHello$.class, not SayHello.class, so by excluding the $ class you essentially lose that coverage.
I'm not aware of any workarounds for the Maven+JaCoCo setup; apparently the jacoco-maven-plugin is not really Scala-aware. The sbt plugin, however, appears to have worked around these issues; see e.g. https://blog.developer.atlassian.com/using-jacoco-a-code-coverage-tool-for-scala/
If switching to sbt is not an option, you could take a look at Scala-specific code coverage tools such as scoverage, which also has a Maven plugin.
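For reference, a minimal scoverage-maven-plugin configuration might look like this (a sketch only: the coordinates are those published by the scoverage project, but the version numbers here are assumptions; check the latest release for your Scala version):

```xml
<plugin>
  <groupId>org.scoverage</groupId>
  <artifactId>scoverage-maven-plugin</artifactId>
  <!-- assumption: pick the latest release compatible with your setup -->
  <version>1.4.1</version>
  <configuration>
    <!-- assumption: match your project's Scala version -->
    <scalaVersion>2.12.12</scalaVersion>
  </configuration>
</plugin>
```

scoverage instruments Scala source rather than bytecode, so it does not suffer from the synthetic `$` classes described above.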

error: object Stemmer is not a member of package org.apache.spark.mllib.feature

Importing the package org.apache.spark.mllib.feature.Stemmer in Spark-shell using Scala returns the following error:
:47: error: object Stemmer is not a member of package org.apache.spark.mllib.feature
import org.apache.spark.mllib.feature.Stemmer
I am trying to apply stemming to my words using:
val stemmer_product_title = new Stemmer()
  .setInputCol("ngrams")
  .setOutputCol("stemmed")
  .setLanguage("English")
Here ngrams is a 1-gram transformed text. Could anyone help me with this please? I would be grateful.
Add the following dependency to your pom.xml
<dependency>
  <groupId>com.github.master</groupId>
  <artifactId>spark-stemming_2.10</artifactId>
  <version>0.2.0</version>
</dependency>
or to your build.sbt
libraryDependencies += "com.github.master" %% "spark-stemming" % "0.2.1"
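Note that sbt's `%%` operator appends your project's Scala binary version to the artifact name, so the Maven and sbt snippets above resolve to the same kind of artifact (a sketch; the exact suffix depends on your scalaVersion setting):

```scala
// sbt: %% adds the Scala binary suffix automatically...
libraryDependencies += "com.github.master" %% "spark-stemming" % "0.2.1"
// ...which, with scalaVersion := "2.10.x", is equivalent to the single-% form:
libraryDependencies += "com.github.master" % "spark-stemming_2.10" % "0.2.1"
```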

IllegalStateException when trying to query a MongoDB domain class using Grails 2.3.7

I am working on a legacy project that uses Grails 2.3.7 (with Maven) and Java 7, and I have to add a connection to a MongoDB database while keeping the existing Hibernate ones.
I have added the following to my pom.xml file:
<dependency>
  <groupId>org.grails.plugins</groupId>
  <artifactId>mongodb</artifactId>
  <type>zip</type>
  <version>3.0.2</version>
</dependency>
And this to the BuildConfig.groovy file:
plugins {
  compile ':mongodb:3.0.2'
  compile 'org.grails.plugins:mongodb:3.0.2'
}
(I have tried it both with and without the compile 'org.grails.plugins:mongodb:3.0.2' line)
On the DataSource.groovy file I have configured the db connection as follows:
grails {
  mongodb {
    host = "xxx.xxx.xxx.xxx"
    port = "27017"
    databaseName = "db"
    username = "user"
    password = "pass"
  }
}
and the connection itself seems to be working, because if I change any value in there the Grails application does not even start.
I have then created a simple Domain class, Thingy.groovy:
class Thingy {
  String identifier
  String description
  static mapWith = "mongo"
  static constraints = {
  }
}
And now, when I start the app, any call to methods of that class throws an IllegalStateException: "Method on class [Thingy] was used outside of a Grails application. If running in the context of a test using the mocking API or bootstrap Grails correctly.". However, if at the same place I call any methods of the old Domain classes that use the other datasource, they work like a charm.
Also, when starting the server I get another exception which I guess might be related, but I'm not sure what to do with it either: ERROR - Error configuring dynamic methods for plugin [mongodb:3.0.2]: org/grails/datastore/mapping/query/api/BuildableCriteria
java.lang.NoClassDefFoundError: org/grails/datastore/mapping/query/api/BuildableCriteria.
I have also tried using the MongoDB plugin 3.0.3, but with the same results.
This answer https://stackoverflow.com/a/35710495/451420 gave me a clue. I had to update the grails-datastore-core and grails-datastore-gorm versions manually as well:
<dependency>
  <groupId>org.grails</groupId>
  <artifactId>grails-datastore-gorm</artifactId>
  <version>3.1.4.RELEASE</version>
</dependency>
<dependency>
  <groupId>org.grails</groupId>
  <artifactId>grails-datastore-core</artifactId>
  <version>3.1.4.RELEASE</version>
</dependency>
In case it helps anyone else, I found out which versions to use by looking at the <dependencies> inside the POM file of the mongodb plugin (https://repo.grails.org/grails/plugins/org/grails/plugins/mongodb/3.0.3/mongodb-3.0.3.pom)

Spring Boot Application: No converter found for return value of type

I am writing a simple REST API according to this Spring-Boot tutorial. On my local dev machines (Ubuntu 15.04 and Windows 8.1) everything works like a charm.
I have an old 32-bit Ubuntu 12.04 LTS server lying around on which I wanted to deploy my REST service.
The starting log is ok, but as soon as I send a GET request to the /user/{id} endpoint, I get the following error:
java.lang.IllegalArgumentException: No converter found for return value of type: class ch.gmazlami.gifty.models.user.User
And then down the stacktrace:
java.lang.IllegalArgumentException: No converter found for return value of type: class java.util.LinkedHashMap
The entire stacktrace is posted here.
I looked into some answers referring to this error, but they don't seem to apply to my problem, since I'm using Spring Boot with no XML configuration whatsoever.
The affected controller is:
@RequestMapping(value = "/user/{id}", method = RequestMethod.GET)
public ResponseEntity<User> getUser(@PathVariable Long id) {
    try {
        return new ResponseEntity<User>(userService.getUserById(id), HttpStatus.OK);
    } catch (NoSuchUserException e) {
        return new ResponseEntity<>(HttpStatus.NOT_FOUND);
    }
}
Any help would be greatly appreciated. It is very weird since the exact same things work on other machines perfectly.
This happened to me on one resource only (one method) and I did not understand why. All methods within classes in the same package, with the same annotations, the same call to ResponseEntity.ok(...) etc. just worked.
But not this one.
It turned out I had forgotten to generate the getters on my POJO class!
As soon as I added them, it worked.
Hopefully this saves somebody some time.
You should make some changes to your pom.xml and mvc-dispatcher-servlet.xml files.
Add the following dependencies to your pom.xml:
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-core</artifactId>
  <version>2.4.3</version>
</dependency>
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>2.4.3</version>
</dependency>
and update your mvc-dispatcher-servlet.xml:
<mvc:annotation-driven>
  <mvc:message-converters>
    <bean class="org.springframework.http.converter.StringHttpMessageConverter"/>
    <bean class="org.springframework.http.converter.json.MappingJackson2HttpMessageConverter"/>
  </mvc:message-converters>
</mvc:annotation-driven>
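Note that the original question uses Spring Boot, where none of this XML is needed: the spring-boot-starter-web dependency already pulls in jackson-databind and auto-registers MappingJackson2HttpMessageConverter. A minimal sketch of the Boot-style dependency (the version is managed by the Boot parent POM):

```xml
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-web</artifactId>
</dependency>
```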
This happens when you forget the "build" call:
return ResponseEntity.status(HttpStatus.BAD_REQUEST);
should be:
return ResponseEntity.status(HttpStatus.BAD_REQUEST).build();
I ran into this problem because I had omitted the getter and setter methods.
To add to the rest of the answers: the methods must be public.
My IDE flagged that the methods could be package-private, prompting me to remove the "public" modifier (which I foolishly did).
I added public back to my methods and that solved the problem.
I was using IntelliJ IDEA and its auto-generated getters and setters. Since I had a boolean field called success, the getter was named isSucccess(). I renamed it to getSuccess() and the error went away.
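The previous few answers all come down to JavaBeans naming: Jackson discovers properties through public bean-style getters, so a misnamed or non-public getter means no property is found and nothing can be converted. A small sketch using the JDK's own Introspector (the class names here are hypothetical, not from the question) shows how the property name is derived from the getter, and how a typo in the getter changes it:

```java
import java.beans.Introspector;
import java.beans.PropertyDescriptor;

public class GetterNaming {
    // Hypothetical POJOs: same field, differently named getters.
    public static class Typo {
        private boolean success;
        public boolean isSucccess() { return success; } // note the extra "c"
    }

    public static class Fixed {
        private boolean success;
        public boolean isSuccess() { return success; }
    }

    // Returns the first bean property name a framework like Jackson would derive.
    public static String firstProperty(Class<?> beanClass) throws Exception {
        PropertyDescriptor[] props =
            Introspector.getBeanInfo(beanClass, Object.class).getPropertyDescriptors();
        return props.length > 0 ? props[0].getName() : "(none)";
    }

    public static void main(String[] args) throws Exception {
        // The typo'd getter yields property "succcess", not the field name "success".
        System.out.println(firstProperty(Typo.class));
        System.out.println(firstProperty(Fixed.class));
    }
}
```

Jackson's own introspection follows the same convention, which is why renaming the getter fixed the serialization.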

Embedded PostgreSQL for Java JUnit tests

Is there an embedded PostgreSQL we could use to unit test our PostgreSQL-driven application?
Since PostgreSQL has its own dialect, it's better to use an embedded PostgreSQL itself than some other embedded database.
Embedded does not necessarily mean it must be embedded in the JVM process. It also does not necessarily need to use in-memory persistence. It should be loaded automatically by the dependency management (Maven, Gradle), so that Unit tests can run on every machine without having to install and configure a local PostgreSQL server.
There is an "embedded" PostgreSQL server that has been designed for unit testing from Java:
https://github.com/yandex-qatools/postgresql-embedded
Embedded postgresql will provide a platform neutral way for running postgres binary in unit tests. Much of the code has been crafted from Flapdoodle OSS's embed process
As an aside, there also exists similar projects for Mongo, Redis, Memcached and nodejs.
No, there is no embedded PostgreSQL in the sense of an in-process-loadable database-as-a-library. PostgreSQL is process-oriented; each backend is a single-threaded process, and the server spawns multiple processes to do its work. It doesn't make sense as a library.
The H2 database supports a limited subset of the PostgreSQL SQL dialect and the use of the PgJDBC driver.
What you can do is initdb a new temporary database, start it with pg_ctl on a randomized port so it doesn't conflict with other instances, run your tests, then use pg_ctl to stop it and finally delete the temporary database.
I strongly recommend that you run the temporary postgres on a non-default port so you don't risk colliding with any locally installed PostgreSQL on the machine running the tests.
(There is "embedded PostgreSQL in the sense of ecpg, essentially a PostgreSQL client embedded in C source code as preprocessor based C language extensions. It still requires a running server and it's a bit nasty to use, not really recommended. It mostly exists to make porting from various other databases easier.)
I tried the project suggested by @btiernay (yandex-qatools). I spent a good few days with it and, without any offence, it's an over-engineered solution that didn't work in my case, as I wanted to download the binaries from an internal repository rather than the public internet. In theory it supports that, but in practice it doesn't.
OpenTable Embedded PostgreSQL Component
I ended up using otj-pg-embedded and it works like a charm. It was mentioned in comments so I thought I'll mention it here as well.
I used it as standalone DB and not via rule for both unit tests and local development.
Dependency:
<dependency>
  <groupId>com.opentable.components</groupId>
  <artifactId>otj-pg-embedded</artifactId>
  <version>0.7.1</version>
</dependency>
Code:
@Bean
public DataSource dataSource(PgBinaryResolver pgBinaryResolver) throws IOException {
    EmbeddedPostgres pg = EmbeddedPostgres.builder()
        .setPgBinaryResolver(pgBinaryResolver)
        .start();
    // It doesn't matter which database it is; we just use the default.
    return pg.getPostgresDatabase();
}

@Bean
public PgBinaryResolver nexusPgBinaryResolver() {
    return (system, machineHardware) -> {
        String url = getArtifactUrl(postgrePackage, system + SEPARATOR + machineHardware);
        log.info("Will download embedded Postgres package from: {}", url);
        return new URL(url).openConnection().getInputStream();
    };
}

private static String getArtifactUrl(PostgrePackage postgrePackage, String classifier) {
    // Your internal repo URL logic
}
You can use a container instance of PostgreSQL.
Since spinning up a container takes a matter of seconds, this should be good enough for unit tests.
Moreover, if you need to persist the data, e.g. for investigation, you don't need to save the entire container, only the data files, which can be mapped outside of the container.
One of example of how to do this can be found here.
If you are looking to run an in-process version of postgres from an Integration (or similar) test suite, the postgresql-embedded worked fine for me.
I wrote a small maven plugin that can be used as a maven wrapper around a forked version of postgresql-embedded.
I am using the container instance of PostgreSQL in the tests.
https://www.testcontainers.org/#about
https://www.testcontainers.org/modules/databases/jdbc/
dependencies:
<dependency>
  <groupId>org.junit.jupiter</groupId>
  <artifactId>junit-jupiter-api</artifactId>
  <version>5.7.2</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.junit.jupiter</groupId>
  <artifactId>junit-jupiter-params</artifactId>
  <version>5.7.2</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.junit.jupiter</groupId>
  <artifactId>junit-jupiter-engine</artifactId>
  <version>5.7.2</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.testcontainers</groupId>
  <artifactId>testcontainers</artifactId>
  <version>1.15.3</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.testcontainers</groupId>
  <artifactId>junit-jupiter</artifactId>
  <version>1.15.3</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>org.testcontainers</groupId>
  <artifactId>postgresql</artifactId>
  <version>1.15.3</version>
  <scope>test</scope>
</dependency>
And write the tests:
@SpringBootTest
@ActiveProfiles({"test"})
@Testcontainers
class ApplicationTest {

    @Container
    static PostgreSQLContainer<?> postgreSQL = new PostgreSQLContainer<>("postgres:12.7");

    @DynamicPropertySource
    static void postgreSQLProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.username", postgreSQL::getUsername);
        registry.add("spring.datasource.password", postgreSQL::getPassword);
    }

    @Test
    void someTests() {
    }
}
and in application-test.yml:
spring:
  datasource:
    url: jdbc:tc:postgresql:12.7:///databasename