Sergey Morgunov
And of course, Lagom PersistentEntity will be supported in some future versions.
Vasile Gorcinschi
Nice, thanks @ihostage - hopefully they will update the docs sometime soon. akka-projection is in akka's namespace. Would you recommend a Lagom repo that I could use as state-of-the-art Lagom practice (particularly around Aggregate Roots, whichever impl)?
Hello everyone,
Does anyone have any idea about the error below?
Oops, cannot start the server.
java.lang.RuntimeException: Old version of Akka Discovery from Akka Management found on the classpath. Remove com.lightbend.akka.discovery:akka-discovery from the classpath..
Why does this issue occur in Lagom 1.6?
1 reply
Can we use AWS Aurora instead of Cassandra when using Lagom persistence?
How can we define the Cassandra CloudSecureConnectBundle in application.conf?
Rajkumar Parthasarathi
Hi - I have deployed Lagom in OpenShift, with the Kafka connection successful, but the deployment fails when the Cassandra read-side tries to use contact-points. I have followed the configuration given in the documentation for Cassandra contact points. Is there anything special to be done for a Cassandra connection using contact points? Need help, I've been stuck for the past few days. I'm getting an error like "cannot connect to Contactpoints", but my Cassandra in OpenShift is running fine.
5 replies
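For comparison, the contact-points configuration shown in the Lagom 1.6 deployment docs looks roughly like the sketch below; the hostname is a placeholder, not your actual value:

```conf
cassandra.default {
  # placeholder hostname for the Cassandra service in the cluster
  contact-points = ["cassandra-0.cassandra.default.svc.cluster.local"]
  session-provider = akka.persistence.cassandra.ConfigSessionProvider
}

cassandra-journal {
  contact-points = ${cassandra.default.contact-points}
  session-provider = ${cassandra.default.session-provider}
}

cassandra-snapshot-store {
  contact-points = ${cassandra.default.contact-points}
  session-provider = ${cassandra.default.session-provider}
}

lagom.persistence.read-side.cassandra {
  contact-points = ${cassandra.default.contact-points}
  session-provider = ${cassandra.default.session-provider}
}
```

If the error says it cannot connect to the contact points, it is often a service-name or port resolution problem inside the cluster rather than the shape of the config itself.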
Jason Pickens
How can I override the scalaVersion on the extraProjects?
I’m adding a compiler plugin via another sbt plugin and this uses CrossVersion.full. This fails for the extra projects because the plugin doesn’t exist for Scala 2.12.10, but it does exist for 2.12.11, which is what I am using in all other projects.
It would also be really nice to be able to disable these extraProjects somehow. The Cassandra one (which we are not using) messes up our classpath.
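For what it's worth, the dev-mode Cassandra and Kafka servers behind those extra projects can be switched off with build settings; a sketch for build.sbt (this disables the embedded dev-mode servers, which may be enough to keep the unwanted artifacts off the classpath):

```scala
// build.sbt -- disable the embedded dev-mode servers
lagomCassandraEnabled in ThisBuild := false
lagomKafkaEnabled in ThisBuild := false
```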
Maximiliano Biandratti
Hello guys! I'm trying to use a plugin in Lagom for Sonar reports (SonarQube). I'm working with com.aol.sbt-sbt-sonarrunner-plugin and com.github.mwz-sbt-sonar but I didn't have any luck. Has anybody had the same problem and managed to resolve it? Thanks!
I am having this issue with a stateless service: Unhealthy (x1 just now) Readiness probe failed: HTTP probe failed with statuscode: 500
7 replies
I have disabled Akka Cluster:
lagom {
    cluster {
        exit-jvm-when-system-terminated = off
        bootstrap {
            enabled = off
        }
    }
}
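With the cluster disabled, the default readiness check from Akka Management (which requires healthy cluster membership) will keep returning 500. One possibility, assuming a recent Akka Management version (worth verifying against your version's docs), is to remove that check:

```conf
akka.management.health-checks {
  readiness-checks {
    # drop the cluster-membership readiness check for a non-clustered service
    cluster-membership = ""
  }
}
```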
Ivan Matala
Has anyone had success making Lagom work with Istio? My readiness probe always returns connection refused, even though I followed the instructions here, such as adding the annotation: https://doc.akka.io/docs/akka-management/current/bootstrap/istio.html
10 replies
Greetings everyone. I'm trying to deserialize a JSON message, but Lagom returns this: JsResultException(errors:List((/trustTypes,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/resonanceTypes,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/informationSources,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/informationTypes,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/municipalities,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/spreadTypes,List(JsonValidationError(List(error.path.missing),ArraySeq())))))
com.lightbend.lagom.scaladsl.api.transport.DeserializationException: JsResultException(errors:List((/trustTypes,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/resonanceTypes,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/informationSources,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/informationTypes,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/municipalities,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/spreadTypes,List(JsonValidationError(List(error.path.missing),ArraySeq()))))) (400/1003 Unsupported Data/Bad Request)
at com.lightbend.lagom.scaladsl.api.transport.DeserializationException$.apply(Exceptions.scala:351)
at com.lightbend.lagom.scaladsl.api.deser.LowPriorityMessageSerializerImplicits$$anon$15$JsValueFormatDeserializer.deserialize(MessageSerializer.scala:293)
at com.lightbend.lagom.scaladsl.api.deser.LowPriorityMessageSerializerImplicits$$anon$15$JsValueFormatDeserializer.deserialize(MessageSerializer.scala:287)
at com.lightbend.lagom.internal.scaladsl.client.ScaladslServiceApiBridge.negotiatedDeserializerDeserialize(ScaladslServiceApiBridge.scala:95)
at com.lightbend.lagom.internal.scaladsl.client.ScaladslServiceApiBridge.negotiatedDeserializerDeserialize$(ScaladslServiceApiBridge.scala:95)
at com.lightbend.lagom.internal.scaladsl.client.ScaladslClientServiceCallInvoker.negotiatedDeserializerDeserialize(ScaladslServiceClientInvoker.scala:147)
at com.lightbend.lagom.internal.scaladsl.client.ScaladslClientServiceCallInvoker.negotiatedDeserializerDeserialize(ScaladslServiceClientInvoker.scala:147)
at com.lightbend.lagom.internal.client.ClientServiceCallInvoker.$anonfun$makeStrictCall$3(ClientServiceCallInvoker.scala:285)
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:430)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:92)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:92)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:47)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:47)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1016)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1665)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1598)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:177)
Caused by: play.api.libs.json.JsResultException: JsResultException(errors:List((/trustTypes,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/resonanceTypes,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/informationSources,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/informationTypes,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/municipalities,List(JsonValidationError(List(error.path.missing),ArraySeq()))), (/spreadTypes,List(JsonValidationError(List(error.path.missing),ArraySeq())))))
... 20 more

Here is my code:

import java.time.LocalDateTime

import Message.Message
import akka.NotUsed
import com.lightbend.lagom.scaladsl.api.{Descriptor, Service, ServiceCall}
import com.lightbend.lagom.scaladsl.api.Service.{named, restCall}
import com.lightbend.lagom.scaladsl.api.transport.Method
import play.api.libs.json.{Format, Json, Reads, Writes}

trait RumorsAPI extends Service {

  def getDictionaryData: ServiceCall[NotUsed, AuditoryEnvironmentResponse]

  def uploadRumor: ServiceCall[String, AuditoryEnvironmentResponse]

  override def descriptor: Descriptor =
    // service name assumed; the original paste omitted named(...).withCalls(...)
    named("rumors").withCalls(
      restCall(Method.GET, "/dictionaries", getDictionaryData),
      restCall(Method.POST, "/rumor", uploadRumor)
    )
}

case class Rumor(createdAt: LocalDateTime, municipality: Int, theme: String, text: String, preconditions: String,
                 informationSource: Int, informationType: Int, resonanceType: Int, spreadType: Int, trustType: Int)

object Rumor {
  implicit val format: Format[Rumor] = Json.format[Rumor]
}
And my test:

"the rumor api test" should {
"add rumor data in database" in ServiceTest.withServer(ServiceTest.defaultSetup) {ctx =>
new AuditoryEnvironmentApplication(ctx) with LocalServiceLocator
} { server =>
val client = server.serviceClient.implement[RumorsAPI]
    .flatMap {
      access_token =>
          "Dissonance_DeviceID=8598ebdd-014b-4839-843a-59caee01a200; path=/; " +
            "expires=Mon, 01 Jan 2029 00:00:00 GMT;"))
            Rumor(LocalDateTime.now(), 23943, "Test", "Test", "Test", 24024, 24020, 24018,
              24037, 24014)
            case nd: NotificationData =>
              nd shouldBe a [_]
            case d: NotificationData =>
              d shouldBe a [_]
What am I doing wrong?
Lagom cannot deserialize the JSON (((
Instead of JSON, I use String.
Ignasi Marimon-Clos
Try with def uploadRumor: ServiceCall[Rumor, AuditoryEnvironmentResponse]. Lagom will pick up the JSON de/serializer from the companion object automatically.
Hi, when I use ReadSideTestDriver to test, it throws an exception: "java.util.concurrent.CompletionException: java.util.concurrent.ExecutionException: com.datastax.driver.core.exceptions.InvalidQueryException: Keyspace "xxx" does not exist". Do I need to create the keyspace explicitly, or will Lagom use cassandra-journal.keyspace as the default when testing a read-side processor?
Hi, I have a design question to model my persistence events so that the read-side can have the data to be updated.
Here's my example: suppose we have a State containing a "StockQuantity" field and we need on the read-side a table containing this "StockQuantity" field.
In this example, we have "QuantityIncreased" and "QuantityDecreased" events which contain only the variations.
However, with the Cassandra support on the read-side, we need the exact value of the field to update (it is not possible to do: StockQuantity = StockQuantity + variation).
Generally speaking, what is the good practice in this case: putting the state's information in the events (does it make sense to add it in the command handler?), or adding the previous value of the state to the events?
3 replies
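A minimal sketch of the trade-off described above, with hypothetical names (this is plain Scala, not Lagom API): a delta-only event versus an event the command handler enriches with the resulting absolute quantity, so the read-side can simply overwrite the column:

```scala
// Option A: delta-only -- the read-side cannot derive the new
// absolute quantity from this event alone.
final case class QuantityIncreased(delta: Int)

// Option B: the command handler reads the current state and stores
// the resulting quantity in the event.
final case class QuantityChanged(delta: Int, newQuantity: Int)

final case class Stock(quantity: Int) {
  // Command handler: compute the enriched event from the current state.
  def increase(delta: Int): (Stock, QuantityChanged) = {
    val evt = QuantityChanged(delta, quantity + delta)
    (Stock(evt.newQuantity), evt)
  }
}

val (updated, evt) = Stock(10).increase(5)
```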
Hello all. When running a new ReadSideProcessor on an entity (Akka persistence entity) that already has many events, will this processor start consuming from the first event, or from the next event?
2 replies
Let me put my question this way: I have an existing akka-persistence entity that has already persisted 5000 events.
When deploying a new ReadSideProcessor attached to that entity, will the processor start consuming at event index 0, or at event index 5001?
1 reply
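A toy in-memory model of the offset-store behaviour (hypothetical names; this is not Lagom code): each processor tag resumes after its stored offset, and a brand-new tag has no stored offset, so it replays from the first event:

```scala
// tag -> last processed event number (hypothetical in-memory offset store)
val offsetStore = Map("ExistingProcessor" -> 5000L)

def firstEventToProcess(tag: String): Long =
  offsetStore.getOrElse(tag, 0L) + 1

val resumed = firstEventToProcess("ExistingProcessor") // resumes at 5001
val fresh   = firstEventToProcess("NewProcessor")      // full replay from 1
```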
Hi, I have been trying to build Docker images for the Lagom services and run them in Kubernetes. No complete guide seems to be available. Can anyone help me with this? I am getting a "Could not find or load main class play.core.server.ProdServerStart" error when I follow the shopping-cart example.
2 replies
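That ProdServerStart error often means the image was not built from the sbt-native-packager output. Assuming the standard Lagom sbt setup, a hedged sketch of the usual flow (the service name and tag below are placeholders):

```shell
# Build a local Docker image per service via sbt-native-packager.
sbt docker:publishLocal

# Run it; the generated entrypoint launches play.core.server.ProdServerStart.
docker run --rm -p 9000:9000 shopping-cart-impl:1.0-SNAPSHOT
```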
Hey team,
I am looking to understand the pattern of migrating events in a journal from one type to another. Has anyone had this experience before? I'm wary of messing with the old existing data. What are some things you had to look out for to ensure a smooth migration?
4 replies
I downloaded the Scala template project from the site, but when running ./sbt I'm getting this error:
error: error while loading String, class file '/modules/java.base/java/lang/String.class' is broken
I get the same error on two different Windows 10 machines.
Nikhil Arora
Greetings everyone. I need some suggestions: I need to run a scheduler which would fetch values from a PostgreSQL DB in a reactive way and send commands to entities. Is there a long-term scheduler in Lagom/Akka, and a reactive driver to fetch values from PostgreSQL? Has anyone used R2DBC drivers?
I want to create a Kafka-consumer-only application with Lagom, but I am not able to find any good documentation for that: what to write in the loader, how to describe the Lagom service, and how to implement a Kafka consumer with Lagom. Please help; a demo project link in Scala or any other support would be appreciated.
5 replies

Please help me with this

Hello all, I have a question, please help.
I am trying to create an Akka actor scheduler for polling, using the Akka scheduler, and my design creates a new actor for each request.
Here is the code that creates the actor:
public ActorManager(
    ActorSystem actorSystem,
    CTLienOrderInterfaceRequest ctLienOrderInterfaceRequest,
    String orderRefId,
    String accessToken,
    CTLienClient ctLienClient,
    PersistentEntityRegistry persistentEntityRegistry) {
  this.actorSystem = actorSystem;
  this.ctLienOrderInterfaceRequest = ctLienOrderInterfaceRequest;
  this.orderRefId = orderRefId;
  this.accessToken = accessToken;
  this.ctLienClient = ctLienClient;
  this.persistentEntityRegistry = persistentEntityRegistry;
  // the actorOf call was missing in the original paste
  ActorRef actorRef =
      actorSystem.actorOf(SchedulerActor.props(actorSystem, new LienServiceConsumer()), orderRefId);
  System.out.println("actorRef :: " + actorRef.path());
  new SchedulerActor.Message(
I am executing this class constructor for each request, but it only works for a single request. If I send 2 different requests together, it takes only the latest data and processes that.
Hi, we implemented CQRS using Lagom, with an event processor for the read-side using a Cassandra read-side handler, projecting events to the write DB. We have a use case where we replay events from the start, so we delete the event processor's rows in the offset store table and the processor replays all events (but for this to happen, an event needs to be triggered). How can we replay events in a ReadSideProcessor without triggering any event?
David Leonhart
Is there a way to turn on debug logging for the events handled in a ReadSide? I looked a bit into the Lagom code but I didn't find anything. Do I need to do that myself in the event handler?


We are using Lagom in Scala and are stuck on the following issue:

We have a small service that gets, validates, and acknowledges some kind of reports in a synchronous way. Part of its responsibilities is sending received and refined reports to Kafka, to be asynchronously processed by a consumer.
We want to acknowledge a report as received only when the message has been sent to Kafka successfully.
The environment where the service is deployed, and the deployment pipeline itself, have some limitations that do not allow us to persist data to Cassandra so that events would eventually pass to Kafka. We write to Kafka using ActorSource.

Could somebody give me a hand here? Is there a way for us to be notified when a message is successfully written to the message broker?

Thank you!

1 reply
Srepfler Srdan
Any brave souls that would like to implement a Tapir-to-Lagom API binding? A number of frameworks have already contributed: https://github.com/softwaremill/tapir/tree/master/server
Renato Cavalcanti
@schrepfler, I think the easiest for Lagom is to use the Play impl and add it as an AdditionalRouter
much like we do for gRPC
the more we re-use Play / Akka Http stuff, the better
Srepfler Srdan
Thanks Renato
What are we missing if we use native Play/Akka Http?
Hello everybody, we are trying to configure our Lagom persistence read-side Cassandra connection to a Cassandra cluster (hosted by Aiven) that provides only a CA certificate (ca.pem). Could someone please tell us how to proceed? Thank you.
2 replies
Vincent canuel
I would like to set the log level to debug in the Maven plugin. Why? I have issues with Kafka and I would like to see where it stores its logs. Unfortunately /target/lagom-dynamic-projects/lagom-internal-meta-project-kafka/target/log4j_output does not exist :(
Sergey Morgunov

@/all Dear Lagom Community,

We are pleased to announce the release of Lagom Framework 1.6.3. This is the latest stable release of Lagom 1.6.x series.

More details about this release here: https://github.com/lagom/lagom/releases/tag/1.6.3

Enjoy :tada:

1 reply
Srepfler Srdan
Hello everybody @Lagom Community, I need to ask one stupid question. I am coming from a Spring Framework background; can we use JPA/Hibernate with Lagom on the write side? I have found examples for the read side but not for the write side with Java. Please also advise whether it is a good approach to use JPA, because we would have many tables for storage, so it's not easy to manage with native queries (insert into ... etc). Thanks for any suggestions.
2 replies