I'm trying to fetch a list of objects from the database using Spring Boot WebFlux with the Postgres R2DBC driver, but I get an error saying:
value ignored org.springframework.transaction.reactive.TransactionContextManager$NoTransactionInContextException: No transaction in context Context1{reactor.onNextError.localStrategy=reactor.core.publisher.OnNextFailureStrategy$ResumeStrategy@7c18c255}
It seems all DatabaseClient operations need to be wrapped in a transaction.
I tried different combinations of the Spring Data and R2DBC dependencies, but none of them really worked.
Version:
<spring-boot.version>2.2.0.RC1</spring-boot.version>
<spring-data-r2dbc.version>1.0.0.BUILD-SNAPSHOT</spring-data-r2dbc.version>
<r2dbc-releasetrain.version>Arabba-M8</r2dbc-releasetrain.version>
Dependencies:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-r2dbc</artifactId>
<version>${spring-data-r2dbc.version}</version>
</dependency>
<dependency>
<groupId>io.r2dbc</groupId>
<artifactId>r2dbc-postgresql</artifactId>
</dependency>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
</dependency>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>io.r2dbc</groupId>
<artifactId>r2dbc-bom</artifactId>
<version>${r2dbc-releasetrain.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
fun findAll(): Flux<Game> {
    val games = client
        .select()
        .from(Game::class.java)
        .fetch()
        .all()
        .onErrorContinue { throwable, o -> System.out.println("value ignored $throwable $o") }
    games.subscribe()
    return Flux.empty()
}
#Table("game")
data class Game(#Id val id: UUID = UUID.randomUUID(),
#Column("guess") val guess: Int = Random.nextInt(500))
Github repo: https://github.com/odfsoft/spring-boot-guess-game/tree/r2dbc-issue
I expect read operations not to require @Transactional, or to be able to run the query without manually wrapping it in a transactional context.
UPDATE:
After a few tries with multiple versions, I managed to find a combination that works:
<spring-data-r2dbc.version>1.0.0.BUILD-SNAPSHOT</spring-data-r2dbc.version>
<r2dbc-postgres.version>0.8.0.RC2</r2dbc-postgres.version>
Dependencies:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-r2dbc</artifactId>
<version>${spring-data-r2dbc.version}</version>
</dependency>
<dependency>
<groupId>io.r2dbc</groupId>
<artifactId>r2dbc-postgresql</artifactId>
<version>${r2dbc-postgres.version}</version>
</dependency>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>io.r2dbc</groupId>
<artifactId>r2dbc-bom</artifactId>
<version>Arabba-RC2</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
It seems the r2dbc version notation went from 1.0.0.M7 to 0.8.x, as explained here:
https://r2dbc.io/2019/05/13/r2dbc-0-8-milestone-8-released
https://r2dbc.io/2019/10/07/r2dbc-0-8-rc2-released
but after updating to the latest version a new problem appeared: a transaction is required to run queries, as follows.
Updated configuration:
@Configuration
class PostgresConfig : AbstractR2dbcConfiguration() {

    @Bean
    override fun connectionFactory(): ConnectionFactory {
        return PostgresqlConnectionFactory(
            PostgresqlConnectionConfiguration.builder()
                .host("localhost")
                .port(5432)
                .username("root")
                .password("secret")
                .database("game")
                .build())
    }

    @Bean
    fun reactiveTransactionManager(connectionFactory: ConnectionFactory): ReactiveTransactionManager {
        return R2dbcTransactionManager(connectionFactory)
    }

    @Bean
    fun transactionalOperator(reactiveTransactionManager: ReactiveTransactionManager) =
        TransactionalOperator.create(reactiveTransactionManager)
}
Query:
fun findAll(): Flux<Game> {
    return client
        .execute("select id, guess from game")
        .`as`(Game::class.java)
        .fetch()
        .all()
        .`as`(to::transactional)
        .onErrorContinue { throwable, o -> System.out.println("value ignored $throwable $o") }
        .log()
}
Disclaimer: this is not meant to be used in production!! It is still before GA.
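For reference, a minimal Java sketch of the declarative alternative: annotating the query method with @Transactional instead of applying the TransactionalOperator by hand. This assumes @EnableTransactionManagement is added to the PostgresConfig above (which already exposes a ReactiveTransactionManager) and uses the package locations of the spring-data-r2dbc 1.0 RC/GA line; it is illustrative, not taken from the repo.

import org.springframework.data.r2dbc.core.DatabaseClient;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

import reactor.core.publisher.Flux;

@Service
public class GameService {

    private final DatabaseClient client;

    public GameService(DatabaseClient client) {
        this.client = client;
    }

    // The reactive transaction is opened and completed around the returned Flux,
    // so the query no longer fails with NoTransactionInContextException.
    @Transactional(readOnly = true)
    public Flux<Game> findAll() {
        return client.execute("select id, guess from game")
                .as(Game.class)
                .fetch()
                .all();
    }
}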
I'm using Shiro 1.7.1 and Guice 4.2.3; below is a snippet of my POM file:
<properties>
<shiro.version>1.7.1</shiro.version>
<guice.version>4.2.3</guice.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.shiro</groupId>
<artifactId>shiro-web</artifactId>
<version>${shiro.version}</version>
</dependency>
<dependency>
<groupId>org.apache.shiro</groupId>
<artifactId>shiro-guice</artifactId>
<version>${shiro.version}</version>
</dependency>
<dependency>
<groupId>org.apache.shiro</groupId>
<artifactId>shiro-ehcache</artifactId>
<version>${shiro.version}</version>
</dependency>
<dependency>
<groupId>com.google.inject</groupId>
<artifactId>guice</artifactId>
<version>${guice.version}</version>
</dependency>
<dependency>
<groupId>com.google.inject.extensions</groupId>
<artifactId>guice-servlet</artifactId>
<version>${guice.version}</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<version>3.1.0</version>
<scope>provided</scope>
</dependency>
...
</dependencies>
I'm customizing Shiro's LogoutFilter by creating a new class,
package com.myshiro.myshiro;
import org.apache.shiro.web.filter.authc.LogoutFilter;
public class MyLogoutFilter extends LogoutFilter {
}
and bind org.apache.shiro.web.filter.authc.LogoutFilter to the above customized MyLogoutFilter,
package com.myshiro.myshiro;
public class MyShiroModule extends ShiroWebModule {

    public MyShiroModule(ServletContext servletContext) {
        super(servletContext);
    }

    protected void configureShiroWeb() {
        try {
            bindRealm().toConstructor(IniRealm.class.getConstructor(Ini.class));
        } catch (NoSuchMethodException e) {
            addError(e);
        }
        bind(org.apache.shiro.web.filter.authc.LogoutFilter.class).to(MyLogoutFilter.class).in(Scopes.SINGLETON);
        addFilterChain("/logout", LOGOUT);
    }
}
and I try to create the Guice injector in the unit test class like this,
public class MyShiroModuleTest {

    @Mock
    private ServletContext servletContext;

    @Test
    public void test() {
        Guice.createInjector(new MyShiroModule(servletContext));
    }
}
and it failed with the following errors,
1) Binding to null instances is not allowed. Use toProvider(Providers.of(null)) if this is your intended behaviour.
at org.apache.shiro.guice.web.ShiroWebModule.configureShiro(ShiroWebModule.java:136)
2) A binding to org.apache.shiro.web.filter.authc.LogoutFilter was already configured at com.myshiro.myshiro.MyShiroModule.configureShiroWeb(MyShiroModule.java:25).
at org.apache.shiro.guice.web.ShiroWebModule.setupFilterChainConfigs(ShiroWebModule.java:209)
The second error above explains that a binding to org.apache.shiro.web.filter.authc.LogoutFilter is configured in both MyShiroModule and ShiroWebModule. Do you have any idea how to bind to my customized LogoutFilter?
This issue did not happen in Shiro 1.3.x.
My sample project is available here; you can see the error simply by running mvn clean install.
Sounds like your problem is related to Guice 4 rather than Shiro. Instead of re-using the same binding key, define a new one, something like:
bind(MyLogoutFilter.class).in(Scopes.SINGLETON);
addFilterChain("/logout", Key.get(MyLogoutFilter.class));
I need to set up embedded MongoDB in my Spring Boot project, but it shows endless error logs. Can someone help me?
I use these dependencies:
<!-- https://mvnrepository.com/artifact/de.flapdoodle.embed/de.flapdoodle.embed.mongo -->
<dependency>
<groupId>de.flapdoodle.embed</groupId>
<artifactId>de.flapdoodle.embed.mongo</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/cz.jirutka.spring/embedmongo-spring -->
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>bson</artifactId>
<version>3.8.0</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver</artifactId>
<version>3.8.0</version>
<exclusions>
<exclusion>
<groupId>org.mongodb</groupId> <!-- Exclude Project-E from Project-B -->
<artifactId>bson</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-core</artifactId>
<version>3.8.0</version>
</dependency>
<dependency>
<groupId>cz.jirutka.spring</groupId>
<artifactId>embedmongo-spring</artifactId>
<version>1.3.1</version>
</dependency>
Then I configure the MongoTemplate with this method in a configuration class:
private static final String MONGO_DB_URL = "localhost";
private static final String MONGO_DB_NAME = "embedded_db";
@Bean
public MongoTemplate mongoTemplate() throws IOException {
    EmbeddedMongoFactoryBean mongo = new EmbeddedMongoFactoryBean();
    mongo.setBindIp(MONGO_DB_URL);
    MongoClient mongoClient = (MongoClient) mongo.getObject();
    return new MongoTemplate(mongoClient, MONGO_DB_NAME);
}
But when I run my application it shows this error:
Exception in thread "main" 11:56:00.710 [Thread-0] DEBUG de.flapdoodle.embed.process.store.CachingArtifactStore - force delete for PRODUCTION:Windows:B64 and de.flapdoodle.embed.process.extract.ImmutableExtractedFileSet#545997b1
11:56:00.710 [Thread-1] DEBUG de.flapdoodle.embed.mongo.AbstractMongoProcess - try to stop mongod
java.lang.NoSuchMethodError: 'void com.mongodb.client.internal.MongoClientDelegate.<init>(com.mongodb.connection.Cluster, java.util.List, java.lang.Object)'
at com.mongodb.Mongo.<init>(Mongo.java:319)
at com.mongodb.Mongo.<init>(Mongo.java:291)
at com.mongodb.Mongo.<init>(Mongo.java:286)
at com.mongodb.Mongo.<init>(Mongo.java:282)
at com.mongodb.MongoClient.<init>(MongoClient.java:180)
at com.mongodb.MongoClient.<init>(MongoClient.java:155)
at com.mongodb.MongoClient.<init>(MongoClient.java:145)
at cz.jirutka.spring.embedmongo.EmbeddedMongoBuilder.build(EmbeddedMongoBuilder.java:104)
at cz.jirutka.spring.embedmongo.EmbeddedMongoFactoryBean.getObject(EmbeddedMongoFactoryBean.java:52)
at com.nextage.arcacrmconnector.commons.EmbeddedMongoDb.mongoTemplate(EmbeddedMongoDb.java:20)
at com.nextage.arcacrmconnector.commons.MongoTemplateSingleton.setMongoTemplate(MongoTemplateSingleton.java:20)
at com.nextage.arcacrmconnector.commons.MongoTemplateSingleton.getMongoTemplate(MongoTemplateSingleton.java:13)
at com.nextage.arcacrmconnector.services.CommonMongoService.<init>(CommonMongoService.java:12)
at com.nextage.arcacrmconnector.services.LogService.<init>(LogService.java:18)
at com.nextage.arcacrmconnector.consumer.QueueConsumerTimerTask.<init>(QueueConsumerTimerTask.java:23)
at com.nextage.arcacrmconnector.application.Application.<clinit>(Application.java:30)
11:56:00.716 [Thread-0] WARN de.flapdoodle.embed.process.io.file.Files - could not delete C:\Users\DONATE~1\AppData\Local\Temp\extract-70ac2cd1-bb5b-4276-9243-cdf6b52db3famongod.exe. Will try to delete it again when program exits.
Exception in thread "Thread-0" java.lang.IllegalStateException: Shutdown in progress
at java.base/java.lang.ApplicationShutdownHooks.add(ApplicationShutdownHooks.java:66)
at java.base/java.lang.Runtime.addShutdownHook(Runtime.java:213)
at de.flapdoodle.embed.process.io.file.FileCleaner.forceDeleteOnExit(FileCleaner.java:51)
at de.flapdoodle.embed.process.io.file.Files.forceDelete(Files.java:128)
at de.flapdoodle.embed.process.extract.ExtractedFileSets.delete(ExtractedFileSets.java:77)
at de.flapdoodle.embed.process.store.ArtifactStore.removeFileSet(ArtifactStore.java:90)
at de.flapdoodle.embed.process.store.CachingArtifactStore$FilesWithCounter.forceDelete(CachingArtifactStore.java:176)
at de.flapdoodle.embed.process.store.CachingArtifactStore.removeAll(CachingArtifactStore.java:100)
at de.flapdoodle.embed.process.store.CachingArtifactStore$CacheCleaner.run(CachingArtifactStore.java:196)
at java.base/java.lang.Thread.run(Thread.java:830)
How can I fix it? Is it a dependency version error?
String tempFile = System.getenv("temp") + File.separator + "extract-" + System.getenv("USERNAME") + "-extractmongod";
String executable;
if (System.getenv("OS") != null && System.getenv("OS").contains("Windows")) {
    executable = tempFile + ".exe";
} else {
    executable = tempFile + ".sh";
}
Files.deleteIfExists(new File(executable).toPath());
Files.deleteIfExists(new File(tempFile + ".pid").toPath());
Please try this temporary solution: write this in your Mongo config file. It should be executed before the embedded MongoDB is started.
The link is: https://github.com/flapdoodle-oss/de.flapdoodle.embed.mongo/issues/171
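To make the suggestion concrete, here is a hedged sketch (class and method names are illustrative) that places the cleanup in the same configuration class shown above, so it runs before the embedded MongoDB is extracted and started:

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;

import com.mongodb.MongoClient;

import cz.jirutka.spring.embedmongo.EmbeddedMongoFactoryBean;

@Configuration
public class EmbeddedMongoConfig {

    private static final String MONGO_DB_URL = "localhost";
    private static final String MONGO_DB_NAME = "embedded_db";

    @Bean
    public MongoTemplate mongoTemplate() throws IOException {
        // Workaround from the linked issue: remove stale extract files first.
        cleanUpStaleMongodFiles();

        EmbeddedMongoFactoryBean mongo = new EmbeddedMongoFactoryBean();
        mongo.setBindIp(MONGO_DB_URL);
        MongoClient mongoClient = (MongoClient) mongo.getObject();
        return new MongoTemplate(mongoClient, MONGO_DB_NAME);
    }

    private void cleanUpStaleMongodFiles() throws IOException {
        String tempFile = System.getenv("temp") + File.separator
                + "extract-" + System.getenv("USERNAME") + "-extractmongod";
        String executable = (System.getenv("OS") != null && System.getenv("OS").contains("Windows"))
                ? tempFile + ".exe"
                : tempFile + ".sh";
        Files.deleteIfExists(new File(executable).toPath());
        Files.deleteIfExists(new File(tempFile + ".pid").toPath());
    }
}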
I'm using
<dependency>
<groupId>org.lognet</groupId>
<artifactId>grpc-spring-boot-starter</artifactId>
<version>2.1.4</version>
<exclusions>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>io.zipkin.brave</groupId>
<artifactId>brave-instrumentation-grpc</artifactId>
<version>4.13.1</version>
</dependency>
<dependency>
<groupId>io.zipkin.brave</groupId>
<artifactId>brave</artifactId>
<version>4.13.1</version>
</dependency>
and I want to use brave-instrumentation-grpc to monitor my gRPC server application, so I followed the advice below:
https://github.com/LogNet/grpc-spring-boot-starter#interceptors-support
https://github.com/openzipkin/brave/tree/master/instrumentation/grpc
@Configuration
public class GrpcFilterConfig {

    @GRpcGlobalInterceptor
    @Bean
    public ServerInterceptor globalInterceptor() {
        Tracing tracing = Tracing.newBuilder().build();
        GrpcTracing grpcTracing = GrpcTracing.create(tracing);
        return grpcTracing.newServerInterceptor();
    }
}
The question is: the global interceptor I defined does not get bound to the gRPC server.
Debugging deep inside GRpcServerRunner.class, it seems that the following code does not return beansWithAnnotation:
Map<String, Object> beansWithAnnotation = this.applicationContext.getBeansWithAnnotation(annotationType);
beansWithAnnotation is null.
Is there something wrong with my usage?
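Not an answer to the binding problem itself, but a small diagnostic sketch (a hypothetical JUnit 4 test; class name and setup are assumptions) that performs the same getBeansWithAnnotation lookup GRpcServerRunner does. If the map also comes back empty here, the annotation metadata is not reaching the bean definition:

import static org.junit.Assert.assertFalse;

import java.util.Map;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.lognet.springboot.grpc.GRpcGlobalInterceptor;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.ApplicationContext;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class GlobalInterceptorPresenceTest {

    @Autowired
    private ApplicationContext applicationContext;

    @Test
    public void interceptorBeanCarriesTheAnnotation() {
        // Same lookup GRpcServerRunner performs for global interceptors.
        Map<String, Object> beans =
                applicationContext.getBeansWithAnnotation(GRpcGlobalInterceptor.class);
        assertFalse("No @GRpcGlobalInterceptor beans were found", beans.isEmpty());
    }
}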
So I know that in Mongo Shell, you use dot notation to get the field you want in any document.
How is dot notation achieved in the MongoDB Scala driver? I'm confused as to how it works. Here is the code that fetches a document from a collection:
val record = collection.find().projection(fields(include("offset"), excludeId())).limit(1)
EDIT:
I'm trying to build a mechanism to re-consume Kafka records from the point where the consumer was shut down. To do this, I store my Kafka records in an external database and then try to fetch the most recent offset from there and start consuming from that point. Here is my Scala method that should do that:
def getLatestCommitOffsetFromDB(collectionName: String): Long = {
  import com.mongodb.Block
  import org.bson.Document

  val printBlock = new Block[Document]() {
    override def apply(document: Document): Unit = {
      println(document.toJson)
    }
  }

  import com.mongodb.async.SingleResultCallback

  val callbackWhenFinished = new SingleResultCallback[Void]() {
    override def onResult(result: Void, t: Throwable): Unit = {
      System.out.println("Latest offset fetched from database.")
    }
  }

  var obj: String = " "
  try {
    val record = collection.find().projection(fields(include("offset"), excludeId())).limit(1)
    //TODO FIND A WAY TO GET THE VALUE AND STORE IT IN A VARIABLE
  } catch {
    case e: RuntimeException =>
      logger.error(s"MongoDB Server Error : Unable to fetch data from collection : $collection")
      logger.error(e.printStackTrace().toString())
  }
  obj.toLong
}
The problem isn't fetching documents from Mongo; it's that I'm trying to access a particular field. The document has four fields: topic, partition, message, and offset. I want to get the "offset" field and store it in a variable, so I can use it as a restarting point to re-consume Kafka records.
Where do I go from there?
POM.xml
<?xml version="1.0" encoding="UTF-8"?>
http://maven.apache.org/xsd/maven-4.0.0.xsd">
4.0.0
<groupId>OffsetManagementPoC</groupId>
<artifactId>OffsetManagementPoC</artifactId>
<version>1.0-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>1.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.12</artifactId>
<version>1.0.0</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<version>2.11.8</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.25</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-streams</artifactId>
<version>0.10.0.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-core -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.6.5</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.6.5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-annotations -->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.6.5</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>casbah_2.12</artifactId>
<version>3.1.1</version>
<type>pom</type>
</dependency>
<dependency>
<groupId>com.typesafe</groupId>
<artifactId>config</artifactId>
<version>1.2.1</version>
</dependency>
<dependency>
<groupId>org.mongodb.scala</groupId>
<artifactId>mongo-scala-driver_2.12</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<version>2.11.8</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>3.4.2</version>
</dependency>
<dependency>
<groupId>org.mongodb.scala</groupId>
<artifactId>mongo-scala-driver_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>bson</artifactId>
<version>3.3.0</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongodb-driver-async</artifactId>
<version>3.4.3</version>
</dependency>
<dependency>
<groupId>org.mongodb.scala</groupId>
<artifactId>mongo-scala-bson_2.11</artifactId>
<version>2.1.0</version>
</dependency>
</dependencies>
You can modify your query this way:
import com.mongodb.MongoClient
import com.mongodb.client.MongoCollection
import com.mongodb.client.model.Projections

def getLatestCommitOffsetFromDB(databaseName: String, collectionName: String): Long = {
  val mongoClient = new MongoClient("localhost", 27017)
  val collection = mongoClient.getDatabase(databaseName).getCollection(collectionName)
  val record = collection
    .find()
    .projection(Projections.fields(Projections.include("offset"), Projections.excludeId()))
    .first
  record.get("offset").asInstanceOf[Double].toLong
}
I think you were missing the com.mongodb.client.model.Projections import needed to use fields, include and excludeId.
I used first instead of limit(1) to make it easier to extract the result.
first returns a Document object on which you can call get to retrieve the value of the requested field.
But in fact, since you just want one record and one field, you can remove the projection:
val record = collection.find().first
According to the documentation, collection.find() accepts a com.mongodb.DBObject.
One of the implementations of that interface that you can use is BasicDBObject, which is basically like a mutable.Map[String, Object]. You can use the constructor which accepts a map, like:
import scala.collection.JavaConverters._

val query = new com.mongodb.BasicDBObject(Map[String, AnyRef](
  "foo.bar" -> "value1",
  "bar.foo" -> "value2"
).asJava)
val record = collection.find(query)....
Here is my application.properties:
spring.application.name=person
server.port=8080
eureka.client.service-url.defaultZone=http://localhost:8761/eureka
# this line of config doesn't work
person.ribbon.NFLoadBalancerRuleClassName=asdfasdfasdf
With person.ribbon.NFLoadBalancerRuleClassName set to asdfasdfasdf there should be errors in the console output, but there are none, which means this config doesn't work. I cannot tell what's going on.
Here are the dependencies:
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-eureka-server</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-hystrix</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-hystrix-dashboard</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-feign</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-ribbon</artifactId>
</dependency>
</dependencies>
The Spring Cloud version is Brixton.SR3.
I successfully configured Ribbon with the following configuration class:
@Configuration
@RibbonClient(name = "person", configuration = RibbonConfiguration.RibbonConfig.class)
public class RibbonConfiguration {

    static class RibbonConfig {

        @Bean
        public IRule rule() {
            return new WeightedResponseTimeRule();
        }
    }
}
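For completeness, a hedged sketch of how the "person" Ribbon client could be exercised so the configured rule actually takes effect; the RestTemplate bean and the /hello endpoint are illustrative, not part of the original setup:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.client.loadbalancer.LoadBalanced;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

@Configuration
class RestClientConfig {

    // A load-balanced RestTemplate resolves the "person" service id through
    // Ribbon, so requests go through the IRule configured above.
    @Bean
    @LoadBalanced
    RestTemplate restTemplate() {
        return new RestTemplate();
    }
}

@RestController
class PersonClientController {

    @Autowired
    private RestTemplate restTemplate;

    @RequestMapping("/call-person")
    public String callPerson() {
        // "person" is the service id; the path is a hypothetical endpoint.
        return restTemplate.getForObject("http://person/hello", String.class);
    }
}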