I have a read-side that is supposed to write entries to Cassandra. I would like to write a test that sends an event to the read-side and then checks in Cassandra that the row has indeed been written. How am I supposed to access a Cassandra session within the test?
I do it the following way:
class MyProcessorSpec extends AsyncWordSpec with BeforeAndAfterAll with Matchers {
private val server = ServiceTest.startServer(ServiceTest.defaultSetup.withCassandra(true)) { ctx =>
new MyApplication(ctx) {
override def serviceLocator = NoServiceLocator
override lazy val readSide: ReadSideTestDriver = new ReadSideTestDriver
}
}
override def afterAll(): Unit = server.stop()
private val testDriver = server.application.readSide
private val repository = server.application.repo
private val offset = new AtomicInteger()
"The event processor" should {
"create an entity" in {
for {
_ <- feed(createdEvent.id, createdEvent)
entity <- repository.getEntityIdByKey(createdEvent.keys.head)
entities <- repository.getAllEntities
} yield {
entity should be(Some(createdEvent.id))
entities.length should be(1)
}
}
}
private def feed(id: MyId, event: MyEvent): Future[Done] = {
testDriver.feed(id.underlying, event, Sequence(offset.getAndIncrement))
}
}
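If you want to check Cassandra directly instead of going through a repository, here is a minimal sketch of an additional test case, assuming MyApplication mixes in CassandraPersistenceComponents (which exposes cassandraSession); the table and column names are hypothetical:
"write the row to Cassandra" in {
  val session = server.application.cassandraSession
  for {
    _    <- feed(createdEvent.id, createdEvent)
    // hypothetical table/column names; adjust to your read-side schema
    rows <- session.selectAll("SELECT * FROM my_entities")
  } yield {
    rows.map(_.getString("id")) should contain(createdEvent.id.toString)
  }
}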
In my Scala application, I load the FXML file in the constructor of the controller and set the controller with fxmlLoader.setController(this).
UPDATE (1): A more comprehensive example:
abstract class Controller[A <: Parent] {
val root: A = loadRoot()
private val stage: Stage = new Stage()
def openWindow(): Unit = {
stage.setScene(new Scene(root))
stage.show()
stage.toFront()
}
private def loadRoot(): A = {
val loader = new FXMLLoader(getDefaultLocation())
loader.setController(this)
loader.load()
}
def getDefaultLocation(): URL = ???
}
--
class SampleController private() extends Controller[VBox] {
@FXML private var text: TextField = _
@FXML def initialize(): Unit = {
text.textProperty().set("That is some text.")
}
}
object SampleController {
def apply(): SampleController = new SampleController()
}
UPDATE (2): SampleController() is called within an Akka actor:
val controller = SampleController()
Platform.runLater(() => controller.openWindow())
I now experience that sometimes the initialize method is called before the @FXML-annotated variables are bound. Can anyone think of any circumstances when that can happen?
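For what it's worth, one way to rule out thread-related effects is to construct the controller (and therefore run FXMLLoader.load()) on the JavaFX Application Thread as well; a minimal sketch using the same names as above:
// Sketch: construct the controller inside Platform.runLater so that
// FXMLLoader.load() and initialize() both run on the JavaFX Application Thread.
Platform.runLater(() => {
  val controller = SampleController()
  controller.openWindow()
})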
I have a stream application that reads from a topic and stores the state in a global store. There are several processors that write to the same topic by reading from the state store, updating a particular field, and writing it back to the topic.
I've noticed that some writes contain stale data and overwrite a record that was previously updated. I want to know what techniques I can use to achieve this "locking" of the records, so that no processor updates a record while it is being read and processed by another. So far I believe this can be achieved by enabling exactly-once processing, but I would like your expert opinion on this and pointers to other things I might be missing.
streamBuilder.addGlobalStore(
  Stores
    .keyValueStoreBuilder(
      productGlobalState,
      keySerde,
      valueSerde
    )
    .withLoggingDisabled(),
  "product-source-1",
  consumedInstance,
  processorSupplier
)

// to(...) does not return a stream, so each pipeline is built as its own statement
streamBuilder
  .stream("product-source-1")
  .transform(transformer1)
  .to("product-sink")

streamBuilder
  .stream("product-source-2")
  .transform(transformer2)
  .to("product-sink")
//state store processor
val productProcessorSupplier: ProcessorSupplier[ProductId, ProductValue] =
() =>
new AbstractProcessor[ProductId, ProductValue] {
private var store: KeyValueStore[ProductId, ProductValue] = _
override def init(context: ProcessorContext): Unit = {
super.init(context)
this.store = context
.getStateStore(ProductGlobalStoreName)
.asInstanceOf[KeyValueStore[ProductId, ProductValue]]
}
override def process(key: ProductId, value: ProductValue): Unit = {
store.put(key, value)
context().commit()
}
}
//transformer1
val transformer1: ProductTransformer =
() =>
new Transformer[
ProductId,
ProductValue,
KeyValue[ProductId, ProductValue]
] {
private var prodStore: KeyValueStore[ProductId, ProductValue] = _
override def init(context: ProcessorContext): Unit = {
this.prodStore = context
.getStateStore(ProductGlobalStoreName)
.asInstanceOf[KeyValueStore[ProductId, ProductValue]]
}
override def transform(key: ProductId, value: ProductValue): KeyValue[ProductId, ProductValue] = {
val updatedProd = Option(prodStore.get(key)).map { p =>
value.copy(isSale = p.isSale, isNew = p.isNew)//attempt to preserve these fields and not override them
} getOrElse value
KeyValue.pair(key, updatedProd)
}
override def close(): Unit = ()
}
//transformer2
val transformer2 = () => new Transformer[ProductId, ProductValue, KeyValue[ProductId, ProductValue]] {
private var context: ProcessorContext = _
protected var prodStore: KeyValueStore[ProductId, ProductValue] = _
override def init(context: ProcessorContext): Unit = {
this.context = context
this.prodStore = context
.getStateStore(ProductGlobalStoreName)
.asInstanceOf[KeyValueStore[ProductId, ProductValue]]
}
override def transform(key: ProductId, value: ProductValue): KeyValue[ProductId, ProductValue] = {
val product = prodStore.get(key)
val newValue = product.copy(product.isNew = true)
KeyValue.pair(key, newValue)
}
}
//... other transformers similar to transformer2 but updating a different fields.
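For reference, a minimal sketch of what enabling the exactly-once processing mentioned above looks like; the application id and bootstrap servers are placeholders (StreamsConfig.EXACTLY_ONCE_V2 requires Kafka Streams 2.8+, older versions use StreamsConfig.EXACTLY_ONCE):
import java.util.Properties
import org.apache.kafka.streams.StreamsConfig

val props = new Properties()
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "product-app")       // placeholder
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092") // placeholder
// Turn on exactly-once semantics for the whole topology
props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE_V2)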
I have a graph that reads from SQS, writes to another system and then deletes from SQS. In order to delete from SQS I need a receipt handle on the SqsMessage object.
In the case of HTTP flows, the signature of the flow allows me to say which type gets emitted downstream from the flow:
Flow[(HttpRequest, T), (Try[HttpResponse], T), HostConnectionPool]
In this case I can set T to SqsMessage and I still have all the data I need.
However, some connectors, e.g. the Google Cloud Pub/Sub connector, emit a completely useless (to me) Pub/Sub id.
Downstream of the Pub/Sub flow I need to be able to access the SQS message id which I had prior to the Pub/Sub flow.
What is the best way to work around this without rewriting the Pub/Sub connector?
I conceptually want something a bit like this:
Flow[SqsMessage] // I have my data at this point
within(
.map(toPubSubMessage)
.via(pubSub))
... from here I have the same type I had before within; however, it still behaves like a linear graph with backpressure, etc.
You can use the PassThrough integration pattern.
As an example of its usage, look at akka-stream-kafka: class akka.kafka.scaladsl.Producer, method def flow[K, V, PassThrough].
So just implement your own stage with a PassThrough element; akka.kafka.internal.ProducerStage[K, V, PassThrough] is an example.
package my.package
import java.util.concurrent.atomic.AtomicInteger
import scala.concurrent.Future
import scala.util.{Failure, Success, Try}
import akka.stream._
import akka.stream.ActorAttributes.SupervisionStrategy
import akka.stream.stage._
final case class Message[V, PassThrough](record: V, passThrough: PassThrough)
final case class Result[R, PassThrough](result: R, message: PassThrough)
class PassThroughStage[R, V, PassThrough]
extends GraphStage[FlowShape[Message[V, PassThrough], Future[Result[R, PassThrough]]]] {
private val in = Inlet[Message[V, PassThrough]]("messages")
// the outlet element type matches the shape: a Future of the Result
private val out = Outlet[Future[Result[R, PassThrough]]]("result")
override val shape = FlowShape(in, out)
override protected def createLogic(inheritedAttributes: Attributes) = {
val logic = new GraphStageLogic(shape) with StageLogging {
lazy val decider = inheritedAttributes.get[SupervisionStrategy]
.map(_.decider)
.getOrElse(Supervision.stoppingDecider)
// execution context for the Future below, taken from the stream's materializer
implicit lazy val ec: scala.concurrent.ExecutionContext = materializer.executionContext
val awaitingConfirmation = new AtomicInteger(0)
@volatile var inIsClosed = false
var completionState: Option[Try[Unit]] = None
override protected def logSource: Class[_] = classOf[PassThroughStage[R, V, PassThrough]]
def checkForCompletion() = {
if (isClosed(in) && awaitingConfirmation.get == 0) {
completionState match {
case Some(Success(_)) => completeStage()
case Some(Failure(ex)) => failStage(ex)
case None => failStage(new IllegalStateException("Stage completed, but there is no info about status"))
}
}
}
val checkForCompletionCB = getAsyncCallback[Unit] { _ =>
checkForCompletion()
}
val failStageCb = getAsyncCallback[Throwable] { ex =>
failStage(ex)
}
setHandler(out, new OutHandler {
override def onPull() = {
tryPull(in)
}
})
setHandler(in, new InHandler {
override def onPush() = {
val msg = grab(in)
// increment before the asynchronous work starts so completion is not detected too early
awaitingConfirmation.incrementAndGet()
val f = Future[Result[R, PassThrough]] {
try {
Result(// TODO YOUR logic producing an R from msg.record
msg.record.asInstanceOf[R],
msg.passThrough)
} catch {
case exception: Exception =>
decider(exception) match {
case Supervision.Stop =>
failStageCb.invoke(exception)
case _ => // Resume/Restart: nothing extra to do here
}
// fail this element's Future so downstream can see the error
throw exception
} finally {
if (awaitingConfirmation.decrementAndGet() == 0 && inIsClosed) checkForCompletionCB.invoke(())
}
}
push(out, f)
}
override def onUpstreamFinish() = {
inIsClosed = true
completionState = Some(Success(()))
checkForCompletion()
}
override def onUpstreamFailure(ex: Throwable) = {
inIsClosed = true
completionState = Some(Failure(ex))
checkForCompletion()
}
})
override def postStop() = {
log.debug("Stage completed")
super.postStop()
}
}
logic
}
}
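To wire such a stage into the SQS/Pub/Sub graph described above, here is a hedged sketch; PubSubId, PubSubPayload, toPubSubMessage and sqsSource are hypothetical placeholders, and an implicit materializer/ActorSystem is assumed to be in scope:
import akka.stream.scaladsl.{Flow, Sink}

// The original SqsMessage rides along as the PassThrough element,
// so it is still available after the publish step (e.g. to delete it from SQS).
val publishFlow =
  Flow.fromGraph(new PassThroughStage[PubSubId, PubSubPayload, SqsMessage])

sqsSource                                   // Source[SqsMessage, _]
  .map(m => Message(toPubSubMessage(m), m)) // wrap the payload together with the original message
  .via(publishFlow)                         // emits Future[Result[PubSubId, SqsMessage]]
  .mapAsync(parallelism = 4)(identity)      // resolve the Future, preserving order
  .map(_.message)                           // back to the original SqsMessage
  .runWith(Sink.ignore)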
I'm using Akka in my project and pull config values into my MainActor class. I want to be able to use the commit, version, and author tags inside another file in order to build an Avro response, but I can't simply make MainActor the parent class of my Avro response interface. Is there a workaround?
My MainActor class
class MainActor extends Actor with ActorLogging with ConfigComponent with ExecutionContextComponent with DatabaseComponent with DefaultCustomerProfiles {
override lazy val config: Config = context.system.settings.config
override implicit lazy val executionContext: ExecutionContext = context.dispatcher
override val db: Database = Database.fromConfig(config.getConfig("com.ojolabs.customer-profile.database"))
private val avroServer = context.watch {
val binding = ReflectiveBinding[CustomerService.Async](customerProfileManager)
val host = config.getString("com.ojolabs.customer-profile.avro.bindAddress")
val port = config.getInt("com.ojolabs.customer-profile.avro.port")
context.actorOf(AvroServer.socketServer(binding, host, port))
}
val commit = config.getString("com.ojolabs.customer-profile.version.commit")
val author = config.getString("com.ojolabs.customer-profile.version.author")
val tag = config.getString("com.ojolabs.customer-profile.version.tag")
val buildId = config.getString("com.ojolabs.customer-profile.version.buildId")
override def postStop(): Unit = {
db.close()
super.postStop()
}
//This toplevel actor does nothing by default
override def receive: Receive = Actor.emptyBehavior
}
The class I want to pull values into
trait DefaultCustomerProfiles extends CustomerProfilesComponent {
self: DatabaseComponent with ExecutionContextComponent =>
lazy val customerProfileManager = new CustomerService.Async {
import db.api._
override def customerById(id: String): Future[AvroCustomer] = {
db.run(Customers.byId(UUID.fromString(id)).result.headOption)
.map(_.map(AvroConverters.toAvroCustomer).orNull)
}
override def customerByPhone(phoneNumber: String): Future[AvroCustomer] = {
db.run(Customers.byPhoneNumber(phoneNumber).result.headOption)
.map(_.map(AvroConverters.toAvroCustomer).orNull)
}
override def findOrCreate(phoneNumber: String, creationReason: String): Future[AvroCustomer] = {
db.run(Customers.findOrCreate(phoneNumber, creationReason)).map(AvroConverters.toAvroCustomer)
}
override def createEvent(customerId: String, eventType: String, version: Double, data: String, metadata: String): Future[AvroCustomerEvent] = {
val action = CustomerEvents.create(
UUID.fromString(customerId),
eventType,
Json.parse(data),
version,
Json.parse(metadata)
)
db.run(action).map(AvroConverters.toAvroEvent)
}
override def getVersion(): Version = {
// This is where I want to use the commit, author, tag and buildId values
???
}
}
}
Create another trait that defines the values, and mix it in with your MainActor and DefaultCustomerProfiles traits.
trait AvroConfig {
self: ConfigComponent =>

val commit = config.getString("com.ojolabs.customer-profile.version.commit")
val author = config.getString("com.ojolabs.customer-profile.version.author")
val tag = config.getString("com.ojolabs.customer-profile.version.tag")
val buildId = config.getString("com.ojolabs.customer-profile.version.buildId")
}
I think what you really need is an Akka Extension, which enables you to add features, like custom config, to your Akka system in an elegant way. This way, you would have access to those config values within all your actors from the actor system. As an example, check out this nice blog post.
As for the other class from your example, you should pass the values in as parameters; it should not be concerned with retrieving and parsing the config itself.
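To illustrate the Extension approach, a minimal sketch; the VersionInfo name is made up here and the config keys are reused from the question:
import akka.actor.{ExtendedActorSystem, Extension, ExtensionId, ExtensionIdProvider}

class VersionInfo(system: ExtendedActorSystem) extends Extension {
  private val config = system.settings.config
  val commit: String = config.getString("com.ojolabs.customer-profile.version.commit")
  val author: String = config.getString("com.ojolabs.customer-profile.version.author")
  val tag: String = config.getString("com.ojolabs.customer-profile.version.tag")
  val buildId: String = config.getString("com.ojolabs.customer-profile.version.buildId")
}

object VersionInfo extends ExtensionId[VersionInfo] with ExtensionIdProvider {
  override def createExtension(system: ExtendedActorSystem): VersionInfo = new VersionInfo(system)
  override def lookup = VersionInfo
}

// Usage anywhere an actor system is available:
// val commit = VersionInfo(context.system).commit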
I am not a Groovy expert, but I did read the book "Groovy in Action". In Groovy, each closure comes with a "context", where the items inside the closure can get access to pseudo-variables like "this", "owner", and "delegate", that let the items know who called the closure. This allows one to write DSLs like this (from Groovy in Action):
swing = new SwingBuilder()
frame = swing.frame(title:'Demo') {
menuBar {
menu('File') {
menuItem 'New'
menuItem 'Open'
}
}
panel {
// ...
}
}
Note that 'menuBar' "knows" that it belongs to 'frame' because it can get context information about the owner and delegate of the closure.
Is this possible to do in Scala? If so, how?
One way is to use a scala.util.DynamicVariable to track the context. Something like the SwingBuilder could be implemented as
import scala.util.DynamicVariable
import javax.swing._
object SwingBuilder {
case class Context(frame: Option[JFrame], parent: Option[JComponent])
}
class SwingBuilder {
import SwingBuilder._
val context = new DynamicVariable[Context](Context(None,None))
def frame(title: String)(f: =>Unit) = {
val res = new JFrame(title)
res.add(new JPanel())
context.withValue(Context(Some(res),context.value.parent)){f;res}
}
def menuBar(f: =>Unit) = {
val mb = new JMenuBar()
context.value.frame.foreach(_.setJMenuBar(mb))
context.withValue(Context(context.value.frame,Some(mb))){f;mb}
}
def menu(title: String)(f: =>Unit) = {
val m = new JMenu(title)
context.value.parent.foreach(_.asInstanceOf[JMenuBar].add(m))
context.withValue(Context(context.value.frame,Some(m))){f;m}
}
def menuItem(title: String) = {
val mi = new JMenuItem(title)
context.value.parent.foreach(_.asInstanceOf[JMenu].add(mi))
}
}
object Test {
def main(args: Array[String]) {
val builder = new SwingBuilder()
import builder._
val f = frame("Demo") {
val mb = menuBar {
menu("File") {
menuItem("New")
menuItem("Open")
}
}
}
f.setVisible(true)
}
}
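Note that DynamicVariable is dynamically scoped per thread: withValue installs the new Context only for the duration of the by-name block, so the nested menuBar, menu and menuItem calls see exactly the frame and parent that the enclosing builder call set up, which is what gives the closures their Groovy-like "context".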