    Saif Addin

    Hi, I am nesting two futures to execute sequentially, but it looks like the second db insertion never occurs.

      def addEmployee(entryEmployee: EntryEmployee) = {
        val now = LocalDateTime.now()
        val newEmployee = Employee(0, now, now, entryEmployee.name, entryEmployee.teamId, entryEmployee.notes)
        val linkTeam = tables.db.run(tables.teams.filter(_.id === entryEmployee.teamId).map(_.name).result)
        val newEmployeeId = tables.db.run(
          (tables.employees returning tables.employees.map(_.id) into ((employee, newId) => employee.copy(id = newId)))
            += newEmployee)
        for {
          teamName <- linkTeam
          nei <- newEmployeeId
        } yield {
          addEvents(Seq(EntryEvent(Some(nei.id), Some(nei.teamId), None, None, EntryEvent.employeeToTeam(nei.name, teamName.head))))
        }
      }

    with addEvents as

      def addEvents(entryEvents: Seq[EntryEvent]) = {
        val now = LocalDateTime.now()
        val newEvents = entryEvents.map(e => Event(0, now, e.employeeId, e.teamId, e.projectId, e.companyId, e.notes))
        println(s"ADDING NEW EVENT ${newEvents.head}")
        Await.result(tables.db.run(tables.events ++= newEvents), Duration.Inf)
      }

    addEvents gets called and the print statement runs, but when I query the events table later, it comes back empty.

    Antoine Doeraene
    @saif-ellafi This is the trick with futures: as soon as you define them, they start running. In this case you run them concurrently, not sequentially. What you can do to fix that is:
    • define them as lazy val, or
    • write their implementations inside the for comprehension itself (not ideal), or
    • use flatMap instead (which is the desugared version of the previous suggestion)
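    The eager-start behaviour is easy to demonstrate with plain scala.concurrent, no Slick involved; this sketch contrasts two already-started futures with a flatMap chain that genuinely sequences them:

    ```scala
    import scala.concurrent.{Await, Future}
    import scala.concurrent.duration.Duration
    import scala.concurrent.ExecutionContext.Implicits.global

    // Eager: both futures begin running the moment the vals are defined,
    // so the for comprehension below merely awaits two concurrent tasks.
    val eagerLog = new java.util.concurrent.ConcurrentLinkedQueue[String]()
    val a = Future { Thread.sleep(50); eagerLog.add("a") }
    val b = Future { eagerLog.add("b") } // usually finishes before `a`
    Await.result(for { _ <- a; _ <- b } yield (), Duration.Inf)

    // Sequential: creating the second future inside flatMap (what the for
    // comprehension desugars to) defers it until the first one completes.
    val seqLog = new java.util.concurrent.ConcurrentLinkedQueue[String]()
    val sequenced = Future { Thread.sleep(50); seqLog.add("a") }.flatMap { _ =>
      Future { seqLog.add("b") } // not even created until "a" is logged
    }
    Await.result(sequenced, Duration.Inf)
    println(seqLog.toArray.mkString(",")) // always "a,b"
    ```

    Declaring the futures as lazy val gives the same deferral, provided they are first touched inside the comprehension, in order.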
    Saif Addin
    Thank you, let me think this through. Here I am creating a new event from an ID returned by the auto-increment insert (newEmployeeId) and from a query that fetches the team information (linkTeam)
    By the time the print ADDING NEW EVENT is called, it shows: ADDING NEW EVENT Event(0,2020-08-10T15:22:33.854,Some(1),Some(1),None,None,Employee: Robert Langdon joined team: Team Unassigned)
    that means the new Event entity has all it needs to be persisted
    yet Await.result(tables.db.run(tables.events ++= newEvents), Duration.Inf), even though it blocks (I blocked it purposely to make sure it is doing something, but I'll remove that later), doesn't seem to be persisting the new event
    GETTING EVENTS found z: Vector()
    Antoine Doeraene

    Ah yes, sorry, I didn't realize the problem was in addEvents. The problem here, though, is that you do nothing with the whole for comprehension, which means you will never know whether it succeeded or failed. My guess is that your call in the Await fails. To be sure of it, do

    (for [...] yield [...]).onComplete(println)

    if this prints an error, you'll know
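    A tiny stand-alone illustration of why attaching onComplete surfaces a swallowed failure (plain scala.concurrent; the exception here is just a stand-in for the real JDBC error):

    ```scala
    import scala.concurrent.{Await, Future}
    import scala.concurrent.duration.Duration
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.util.{Failure, Success}

    // Stand-in for a db.run(...) whose insert is rejected by the database.
    val failing: Future[Int] = Future { throw new RuntimeException("duplicate column") }

    // Without this callback the exception is silently dropped; with it,
    // the Failure is printed and the root cause becomes visible.
    failing.onComplete {
      case Success(rows) => println(s"inserted $rows rows")
      case Failure(ex)   => println(s"Failure($ex)")
    }

    Await.ready(failing, Duration.Inf) // keep the program alive for the callback
    ```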

    Saif Addin
    Thank you! I was on the right track there... I think there is something wrong specifically with the Event table. It has optional foreign keys
    will see if that yields something
    Thank you, that was helpful: Failure(org.h2.jdbc.JdbcSQLSyntaxErrorException: Duplicate column name "TEAM_ID"; SQL statement: insert into "EVENTS" ("CREATED","EMPLOYEE_ID","TEAM_ID","PROJECT_ID","TEAM_ID","NOTES") values (?,?,?,?,?,?) [42121-200])
    I guess as this app grows I need to make sure all of the Futures are handled
    now to find that duplicate column...
    Antoine Doeraene

    Ah nice. Glad it helped! Indeed, as a (very strong) rule of thumb, you should never leave a future unattended. As a matter of fact, if I can suggest something to you, I would make addEmployee return a Future[Employee], and do

    for {
      teamName <- linkTeam
      nei      <- newEmployeeId
      _        <- addEvents(...) // remove the Await in the function and have it return the future instead
    } yield newEmployee

    and leave to the caller the responsibility to look inside the future

    also note that you might be able to run the for comprehension at the level of Slick (DBIO) instead, so that you can call .transactionally and be sure to keep the database in a clean state. Don't take my word for it though, I didn't look at your code closely enough
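    Composing at the DBIO level might look like this sketch (untested; assumes the same `tables` object, Slick 3's `api._` in scope, and an `Event` constructor matching the one used in addEvents):

    ```scala
    // Sketch only: one DBIO for comprehension, run as a single transaction.
    def addEmployee(entryEmployee: EntryEmployee): Future[Employee] = {
      val now = LocalDateTime.now()
      val newEmployee = Employee(0, now, now, entryEmployee.name, entryEmployee.teamId, entryEmployee.notes)

      val action = (for {
        teamName <- tables.teams.filter(_.id === entryEmployee.teamId).map(_.name).result.head
        nei      <- (tables.employees returning tables.employees.map(_.id)
                      into ((e, newId) => e.copy(id = newId))) += newEmployee
        _        <- tables.events += Event(0, now, Some(nei.id), Some(nei.teamId), None, None,
                      EntryEvent.employeeToTeam(nei.name, teamName))
      } yield nei).transactionally // both inserts commit or roll back together

      tables.db.run(action)
    }
    ```

    Because everything is one DBIO, there is exactly one Future to hand back to the caller, and a failure in the event insert also rolls back the employee insert.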
    Saif Addin
    Thanks, I was returning local instances so they could be sent back to the REST endpoint as a "temporary response",
    so I was never returning Futures anywhere; they were just side effects of the functions..
    but your proposal is definitely much cleaner, I'll figure out a way of returning both the temporary result and the Future, perhaps a Tuple
    something like
      def addEvents(entryEvents: Seq[EntryEvent]) = {
        val now = LocalDateTime.now()
        val newEvents = entryEvents.map(e => Event(0, now, e.employeeId, e.teamId, e.projectId, e.companyId, e.notes))
        val task = tables.db.run(tables.events ++= newEvents)
        (newEvents, task)
      }
    That was really useful, appreciated

    Insert into a Table from Raw SQL Select

    val rawSql: DBIO[Vector[(String, String)]] = sql"SELECT id, name FROM SomeTable".as[(String, String)]
    val myTable: TableQuery[MyClass] // with columns id, name and some other columns

    Is there a way to use forceInsert functions to insert data from select into the tables?
    If not, Is there a way to generate a sql string by using forceInsertStatements?
    Something like:

    db.run {
      myTable.map(t => (t.id, t.name)).forceInsert????(rawSql)
    }

    Thanks for the help.
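    For reference, Slick 3 does ship forceInsertQuery, but it accepts a lifted Query rather than a plain-SQL DBIO, so the SELECT has to be expressed in the lifted API. A sketch, assuming SomeTable is also mapped in Slick (someTable is a hypothetical name):

    ```scala
    // Sketch: forceInsertQuery takes a Query, not a plain-SQL DBIO,
    // so the SELECT is written with the lifted API instead of sql"...".
    val selectQuery = someTable.map(s => (s.id, s.name))

    val insertAction = myTable
      .map(t => (t.id, t.name))
      .forceInsertQuery(selectQuery) // emits INSERT INTO ... SELECT ...

    db.run(insertAction)
    ```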

    Philipp Dörfler
    Hi! In SQL I can do UPDATE … SET something = (SELECT … FROM …). I don't see how I can do the same in Slick. Is that even possible and if so, how so? Thanks!
    Philipp Dörfler
    I should add that unfortunately I'm on Slick 2.1.0
    I understand that in Slick 3 there is forceInsertQuery, which seems to do this but for inserts, and there are DBIO actions which apparently let me update values based on a computation. But is there a forceUpdateQuery, or do I indeed use DBIO actions for this purpose?
    James Phillips
    Hi. I've a question about generating Shapes
    I have a very ordinary set up, with a case class User and a Users table
    However, I need to generate a Shape[FlatShapeLevel, Users, User, Users] outside of class Users
    I cannot summon it implicitly from the top - as the topmost level which accepts implicit parameters does not know about the existence of Users
    (probably not a "very ordinary" setup after all)

    So, how do I write this line:

      implicit def userShape: Shape[FlatShapeLevel, Users, User, Users] = ...


    Any help would be greatly appreciated!
    Note again at this point, we are outside of the Users table, and so cannot access the columns within it
    Alexandre Roba
    Hi all, I'm having some difficulties using groupBy with Slick.
    This is my slick code:
      override def getPaymentErrorsByBookingIds(bookingIds: Seq[UUID]): DBIO[Seq[(UUID, Seq[PaymentError])]] =
        paymentErrors
          .filter(b => b.bookingId inSet bookingIds)
          .groupBy(_.bookingId)
          .map {
            case (bookingId, errors) => bookingId -> errors
          }
          .result
    The Payment Error table looks like:
    trait PaymentErrorTable {
      class PaymentErrors(tag: Tag) extends Table[PaymentError](tag, "payment_error") {
        def newIdWhenInserting: UUID = UUID.randomUUID()
        def currentWhenInserting = new Timestamp((new Date).getTime)
        def id              = column[UUID]("id", O.Default(newIdWhenInserting), O.PrimaryKey)
        def bookingId       = column[UUID]("booking_id")
        def paymentMethodId = column[String]("payment_method_id")
        def error           = column[String]("error")
        def amount          = column[BigDecimal]("amount")
        def currency        = column[Currency]("currency")
        def createdAt       = column[Timestamp]("created_at", O.Default(currentWhenInserting))
        def * =
          (id, bookingId, paymentMethodId, error, amount, currency, createdAt) <> ((PaymentError.apply _).tupled, PaymentError.unapply)
      }
      protected val paymentErrors = TableQuery[PaymentErrors]
    }
    Unfortunately, when compiling I get the following error:
    No matching Shape found.
    Slick does not know how to map the given types.
    Possible causes: T in Table[T] does not match your * projection,
     you use an unsupported type in a Query (e.g. scala List),
     or you forgot to import a driver api into scope.
      Required level: slick.lifted.FlatShapeLevel
         Source type: (slick.lifted.Rep[java.util.UUID], slick.lifted.Query[JdbcPaymentErrorStorage.this.PaymentErrors,booking.PaymentError,[+A]Seq[A]])
       Unpacked type: T
         Packed type: G
          .map {
    Alexandre Roba
    I've got a feeling that groupBy can only be used with aggregate functions...
    Richard Dallaway
    Yeah, I'd do the aggregation into (what is effectively, kind of) a map client-side. You could totally count the errors, but I don't think you can aggregate like that. (I'm not sure what the SQL would be to return a row containing an id and a list of errors... although there's probably a way to do it in Postgres: it seems to have all the toys 😍)
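    The client-side aggregation Richard describes can be sketched with plain Scala collections: fetch the flat rows with an ordinary filter query, then group in memory. The PaymentError shape below is a cut-down stand-in for the mapped row type:

    ```scala
    import java.util.UUID

    // Stand-in for the mapped row type; only the fields needed for grouping.
    case class PaymentError(bookingId: UUID, error: String)

    // Imagine these came back from db.run(paymentErrors.filter(...).result)
    val b1 = UUID.randomUUID()
    val b2 = UUID.randomUUID()
    val rows = Seq(
      PaymentError(b1, "card declined"),
      PaymentError(b1, "timeout"),
      PaymentError(b2, "insufficient funds")
    )

    // groupBy on the Scala collection, not on the Slick query.
    val byBooking: Map[UUID, Seq[PaymentError]] = rows.groupBy(_.bookingId)

    println(byBooking(b1).map(_.error).mkString(",")) // card declined,timeout
    ```

    This keeps the SQL trivially simple at the cost of shipping one row per error over the wire, which is usually fine for a bounded set of booking ids.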
    vyshnavi-adusumelli
    Hi, in Slick we generally put the database credentials in application.conf. But is there any way to configure these credentials through code? Basically I want to retrieve the DB credentials from AWS Secrets Manager and then configure them in Slick. How should I proceed?
    @vyshnavi-adusumelli have you tried creating a Scala object to configure a HikariCP data source? It basically takes all the params, and you can import it from anywhere you want. Then use JdbcBackend.Database.forDataSource to create a db object.
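    A minimal sketch of that approach, assuming Slick 3 and HikariCP are on the classpath; the fetchDbSecret helper (reading from AWS Secrets Manager) is hypothetical and stands in for whatever secrets client you use:

    ```scala
    import com.zaxxer.hikari.{HikariConfig, HikariDataSource}
    import slick.jdbc.JdbcBackend

    val secret = fetchDbSecret() // hypothetical: returns (url, user, password)

    val hikariConfig = new HikariConfig()
    hikariConfig.setJdbcUrl(secret.url)
    hikariConfig.setUsername(secret.user)
    hikariConfig.setPassword(secret.password)
    hikariConfig.setMaximumPoolSize(10)

    val dataSource = new HikariDataSource(hikariConfig)

    // maxConnections should match the pool size so Slick doesn't over-schedule.
    val db = JdbcBackend.Database.forDataSource(dataSource, maxConnections = Some(10))
    ```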
    Konstantin Simeonov

    Hi everyone, I'm using Slick plain SQL queries and I'm running into the following issue:

    Caused by: org.postgresql.util.PSQLException: ERROR: column "expiry_date" is of type timestamp without time zone but expression is of type character varying

    The column in question is of type timestamp with timezone.

    Here's how I build my query:

        val query = sqlu"""INSERT INTO "offers"
        VALUES (
                  ${new Timestamp(Instant.now.toEpochMilli)},
                  ${new Timestamp(c.expiryDate)},
                  ${new Timestamp(c.startAt)},
                  ${new Timestamp(c.endAt)},
    Renato Cavalcanti
    Hi @/all
    Slick v3.3.3 is out: https://github.com/slick/slick/releases/tag/v3.3.3
    Georgios Larkou
    It seems that the following does not work; what is the suggested way to perform different queries based on the length of the previous one?

    Case.If(aquery.length > 0) Then bquery else cquery
    Antoine Doeraene
    @renatocaval thanks! Are there any open issues where it would be a good place to jump in and provide some help?
    Hello, can someone show me a full Slick reference.conf for a Postgres DB with a connection pool? The documentation seems a bit all over the place to me.
    Blaž Marinović

    hi all! :) I'm having a problem with values interpolation in raw SQL. This works in the PostgreSQL CLI:

    select file_name from (values ('file1'), ('file2'), ('file3')) AS t (file_name) except select file_name from processed_s3_files;

    I was expecting this to work:

    val fileNamesAsValues = fileNames.map(fn => s"('$fn')").mkString(", ")
    sql"""select "file_name" from (values $fileNamesAsValues) AS t ("file_name") except select "file_name" from "processed_s3_files"""".as[String]

    but I get org.postgresql.util.PSQLException: ERROR: syntax error at or near "$1"

    Antoine Doeraene
    You need # in front of the $ (i.e. #$fileNamesAsValues), which splices the string literally instead of binding it as a parameter. It's documented here: http://scala-slick.org/doc/3.3.2/sql.html
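    A sketch of the fixed query using the #$ splice (same code as above, one character changed):

    ```scala
    // #$ splices the string literally into the statement instead of turning
    // it into a bind parameter, which is what caused the "$1" syntax error.
    val fileNamesAsValues = fileNames.map(fn => s"('$fn')").mkString(", ")

    val action =
      sql"""select "file_name"
            from (values #$fileNamesAsValues) AS t ("file_name")
            except
            select "file_name" from "processed_s3_files"""".as[String]
    ```

    Caveat: literal splicing bypasses parameter escaping, so only use it with trusted or sanitized input to avoid SQL injection.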