    @sherpal noted, thanks!

    I think I am still not grasping how to use groupBy

    val balanceCreditDebitWithPersonQuery =
      creditsQuery // placeholder name for the first source query
        .groupBy(r => (r._1, r._2, r._3, r._4))
        .map { case (key, groupValue) => key -> groupValue.map(_._5).sum } // sum returns 0 if all values in the database are null
        .join(
          debitsQuery // placeholder name for the second source query
            .groupBy(r => (r._1, r._2, r._3, r._4))
            .map { case (key, groupValue) => key -> groupValue.map(_._5).sum }
        ).on(_._1._1 === _._1._1)
        .map(r => (r._1._1._2, r._1._1._3, r._1._1._4, r._1._2, r._2._2, r._1._2 - r._2._2))
        .result

    I want to group by the first four columns of each table: .groupBy(r => (r._1, r._2, r._3, r._4))
    and use the last column to sum over it: groupValue.map(_._5).sum

    Antoine Doeraene

    groupBy works roughly the same as for the usual standard collections. The difference is that you

    1. can't groupBy on absolutely anything; you have to group by a "primitive" value, or something with a primary key
    2. have to go back to a "flat" structure by calling map.

    maybe 1. is not fulfilled in your case. What does it say that is not working for you? (also you perhaps have to destructure the "key", but I'm not sure right now)
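    A minimal sketch of that pattern (table and column names here are hypothetical): group by a single column, then map back to a flat (key, aggregate) shape.

    ```scala
    import slick.jdbc.PostgresProfile.api._

    // hypothetical table of (personId, amount) rows
    class Payments(tag: Tag) extends Table[(Long, BigDecimal)](tag, "payments") {
      def personId = column[Long]("person_id")
      def amount   = column[BigDecimal]("amount")
      def *        = (personId, amount)
    }
    val payments = TableQuery[Payments]

    // group by a "primitive" column, then flatten with map
    val totalsPerPerson = payments
      .groupBy(_.personId)
      .map { case (personId, group) => personId -> group.map(_.amount).sum }
    ```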

    rimeh bennjima

    Hello @here, I have a question regarding the migration of Slick from 2.0.2 to 3.0.0.
    In Slick 2.0.2 I have:

        def forInsert = (name, fullName.orEmpty, email.orEmpty, password, active)
        private lazy val addUserC = users.map(_.forInsert).insertInvoker

    How can I migrate insertInvoker to 3.0.0 ?
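    In Slick 3 the Invoker API is gone; an insert is a DBIO action that you pass to db.run. A sketch of the equivalent, assuming the same users table and forInsert projection (the row values are made up):

    ```scala
    import scala.concurrent.Future

    // += on the projected query builds a DBIO[Int] (number of rows inserted)
    val insertAction =
      users.map(_.forInsert) += (("some name", "Some Full Name", "user@example.com", "secret", true))

    // nothing touches the database until the action is run
    val inserted: Future[Int] = db.run(insertAction)
    ```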

    @sherpal Thanks for the reply :). It does compile, but the output is just nothing. I double checked against my pure SQL query and I am certain that it is not the SQL query I am trying to abstract. What do you mean by "key" destructure?
    I changed the .join(...).on(...) to a .zip(...) and it is working now aaaand I have no clue why :D
    Hi, I have a question about shipping a sqlite db file in a package. I am developing a testkit for my team, and I want to include a binary sqlite db file within the package to provide sample test data. I put the db file in the resources folder and referred to it as var url = getClass.getResource("/database/data.db").getFile, however, client projects referencing this library are not able to load the file correctly. Do you know how I can package a binary db file with my library?
    What do you mean not able to load the file correctly? What exactly is the behavior?
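    One common workaround (a sketch, not the only option): a resource inside a jar is not a file on disk, so getResource(...).getFile breaks for client projects that depend on the jar. Copying the resource stream to a temp file yields a real path; resourceToTempFile is a name I made up.

    ```scala
    import java.io.InputStream
    import java.nio.file.{Files, Path, StandardCopyOption}

    // Copy an InputStream (e.g. from getClass.getResourceAsStream) to a temp
    // file and return its path, so code that needs a real file path can use it.
    def resourceToTempFile(in: InputStream, suffix: String = ".db"): Path = {
      val tmp = Files.createTempFile("testkit-", suffix)
      tmp.toFile.deleteOnExit()
      Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING)
      tmp
    }

    // usage, with the resource path from the question:
    // val dbPath = resourceToTempFile(getClass.getResourceAsStream("/database/data.db"))
    ```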
    Austin Steady

    hi all, I recently hit a performance issue with slick, but I think it's due to how we structure our queries

    to support soft-deletion, we have an active column in many tables. since we typically only want active records, we added a shorthand for that:

    lazy val baseQuery = TableQuery[MyTable]
    lazy val query = baseQuery.filter(_.active)

    The problem is that when we join a table against this, like otherTable.join(myTable.query).on(_.foreignKeyId === _.id), it generates SQL that eagerly applies the WHERE active in a subquery inside the join, which results in more rows being scanned.

    Can anyone recommend a pattern where we can continue to use query like this, so we don't have to remember to filter for active, but also don't have to worry about joins getting worse?
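    One pattern worth trying (a sketch using the names from the question): join against the unfiltered baseQuery and apply the active predicate after the join, so it lands in the outer WHERE clause instead of a subquery:

    ```scala
    val joined = otherTable
      .join(myTable.baseQuery).on(_.foreignKeyId === _.id)
      .filter { case (_, my) => my.active } // same rows, but filtered at the top level
    ```

    Whether this actually changes the plan depends on the database; many optimizers treat the two forms identically, so it's worth comparing the generated SQL and EXPLAIN output.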

    KiranKumar BS
    Is there any way to set application_name?
    Antoine Doeraene
    not sure what you mean by "application_name"?
    Boris Lopukhov
    Hi all!
    I have a question. DBIOAction has map/flatMap methods with implicit ExecutionContext. What execution context should be used here, database or app?
    Rohan Sircar
    @brs-lphv app/async EC
    Boris Lopukhov

    Then i have problem with my code:

    val appEC = scala.concurrent.ExecutionContext.Implicits.global
    val databaseEC = database.executor.executionContext
    val requestCount = 30
    def update(id: Long)(ex: ExecutionContext) = {
      val action = selectByIdAction(id).flatMap(x => updateAction(x))(ex)
      db.run(action) // assuming the action is run here
    }

    Future.sequence((1 to requestCount).map(_ => update(someId)(databaseEC))) // works fine
    Future.sequence((1 to requestCount).map(_ => update(someId)(appEC)))      // doesn't work, deadlock

    it happens when requestCount > numThreads of AsyncExecutor

    Rohan Sircar
    Strange. What's your queueSize in slick? Does it make a difference if you change its value?
    Boris Lopukhov
    queueSize is 1000
    Oliver Schrenk

    I use https://github.com/tototoshi/sbt-slick-codegen to generate my Tables.scala which I then intend to commit to git.

    I noticed that the database name is also part of the generated source code: Some("acme").

    class SomeTable(_tableTag: Tag) extends profile.api.Table[FlywaySchemaHistoryRow](_tableTag, Some("acme"), "some_table")

    Since production code might run against a different database name, is that an issue? Is there some way to not generate the database name? If so, are there any downsides?
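    One possible approach (a sketch against slick-codegen's model API; stripSchema is my name): rewrite the model before code generation so each table's schema is None, and the generated code won't hardcode "acme". The downside is that queries then rely on the connection's default schema or search path.

    ```scala
    import slick.model.Model

    // Drop the schema qualifier from every table in the model
    // before handing it to the source code generator.
    def stripSchema(model: Model): Model =
      model.copy(tables = model.tables.map { table =>
        table.copy(name = table.name.copy(schema = None))
      })
    ```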

    Clint Combs
    Hi all - I'm wondering about the status of Slick development and migration to Scala 3. It appears Lightbend has ended their support for the project(?) and it's being moved forward by "the community": https://scala-slick.org/news/2021/04/23/slick-maintenance.html but I'm not seeing any updates here: slick/slick#2198 I'm a long-time Slick user (mostly happy with it), but find myself considering other options. Anyone have insight into where Slick is headed? Is it time to move to https://tpolecat.github.io/doobie/ ... https://getquill.io/ ... or some other option?
    Antoine Doeraene
    Slick support for Scala 3 will arrive eventually. But there is no official roadmap, as you can imagine :)
    Julian Sarrelli
    Hi guys, I'm trying to have a collection of Long's as a pre-compiled query parameter so I can use it within an IN clause. Any way I can do something like this?
     protected val propertiesForProduct = for {
        (storeId, productIds) <- Parameters[(Long, Set[Int])]
        relation              <- productCategories.filter(_.storeId === storeId && product.id in productIds)
      } yield (relation.productId, relation.categoryId)
    It's for a SQL database
    inSet is the thing you're looking for @jsarrelli
    you can shove anything along the lines of Iterable[Long] into it
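    A sketch of inSet in that query (column names are guesses). Note that, as far as I know, inSet inlines the values into the SQL, so it doesn't combine with Compiled/Parameters and a varying-size collection; a plain method works instead:

    ```scala
    // hypothetical helper taking the ids directly instead of precompiling them
    def propertiesForProduct(storeId: Long, productIds: Set[Int]) =
      productCategories
        .filter(r => r.storeId === storeId && r.productId.inSet(productIds))
        .map(r => (r.productId, r.categoryId))
    ```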
    Krish Narukulla
    we are running into build performance issues with Slick when a table grows to more than 22 columns. Is there any workaround for this?
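    One documented workaround is Slick's HList-based projections, which aren't capped at 22 elements the way tuples are. A sketch with three columns just to show the shape (a real table would list all 22+); I believe slick-codegen already switches to HLists for such tables:

    ```scala
    import slick.jdbc.PostgresProfile.api._
    import slick.collection.heterogeneous.HNil
    import slick.collection.heterogeneous.syntax._

    class WideTable(tag: Tag) extends Table[Int :: String :: Boolean :: HNil](tag, "wide_table") {
      def id     = column[Int]("id", O.PrimaryKey)
      def name   = column[String]("name")
      def active = column[Boolean]("active")
      // HList projection: `::` chains have no 22-element limit
      def * = id :: name :: active :: HNil
    }
    ```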
    Matias Partanen


    Wondering about this issue as I've encountered it; is there any way to work around it? The only way I figured out is to change the Postgres time column to varchar and then use a custom mapper, which of course isn't optimal if I need any of the database's time operations.

      import java.time.LocalTime
      import java.time.format.DateTimeFormatter
      import slick.jdbc.PostgresProfile.api._

      val formatter: DateTimeFormatter = DateTimeFormatter.ISO_LOCAL_TIME
      implicit val localTimeColumnType: JdbcType[LocalTime] with BaseTypedType[LocalTime] =
        MappedColumnType.base[LocalTime, String](
          b => formatter.format(b), // map LocalTime to String
          i => LocalTime.parse(i)   // map String to LocalTime
        )

      def time = column[LocalTime]("time")(localTimeColumnType)

    With the custom mapper it doesn't work if I have used the time type in the schema, because then I'm getting the following error from Postgres:

    ERROR: column "time" is of type time without time zone but expression is of type character varying Hint: You will need to rewrite or cast the expression.
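    One way around that mismatch (a sketch; it keeps the column as time without time zone) is to map LocalTime through java.sql.Time instead of String, so the JDBC type Slick sends lines up with the Postgres column type:

    ```scala
    import java.time.LocalTime
    import slick.jdbc.PostgresProfile.api._

    implicit val localTimeAsSqlTime: BaseColumnType[LocalTime] =
      MappedColumnType.base[LocalTime, java.sql.Time](
        lt => java.sql.Time.valueOf(lt), // LocalTime -> java.sql.Time (second precision)
        t  => t.toLocalTime              // java.sql.Time -> LocalTime
      )
    ```

    Depending on your Slick version this may be unnecessary: Slick 3.3+ maps java.time.LocalTime natively in PostgresProfile.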

    Julian Sarrelli

    Hi guys! I'm trying to connect to a MySQL database following Slick Documentation
    But I keep running into the same problem

    com.zaxxer.hikari.pool.HikariPool HikariPool-1 - Exception during pool initialization. com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure\nThe last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    Does it have something to do with mysql-connector version? Currently using 6.0.6 (according to documentation)

    @jsarrelli isn't that old? Or am I mixed up?
    Also the obvious question, are you sure the server is running and slick is pointed to the right place? That is, can you confirm slick has the connection settings you think it has, and that those connection settings work with say the mysql CLI from the same computer (or VM/container) as slick is running in?
    @ClintCombs I'm trying to move it forward but my time is pretty limited. For a while I was spending some time on it every sunday but lately I have not been available on sundays. I plan to continue when I can, hopefully soon
    Right now the goal is to release master as 3.4.0, and then to focus on scala 3 support. There is a PR but I'm not sure exactly what state it's in.
    I have other improvements I would love to find time to make, too
    @ClintCombs also, I would have more time if someone wants to pair program with me (or just act as a sounding board)
    Alex Leeds

    Hi, I have a question around compiled queries, in regards to inserts / upserts.
    Is there a difference between these two statements?

    Assuming val ts = TableQuery[T]:

    • Compiled(ts) += row
    • ts += row
    Priyanka Gugale

    I am getting error Type CustomerRow does not conform to upper bound Product with Serializable of type parameter R
    This is with generated classes using slick-codegen utility. I have more than 22 fields in my table.

    Scala version: 2.13
    Slick version: 3.3.3

    @nafg: There are several issues that have stopped progressing, which ones do we need to resolve for the next release?
    j.kugiya: none are blocking a release. I just have to have time to switch over to sbt-ci-release IIRC (or something like that), as Seth requested. I haven't had a really free Sunday in a while so I haven't gotten to it yet. But if someone wants to pair program with me, or just be a sounding board while I work on it, I will find time a lot more easily. (When there are too many other things in front of me I can't concentrate otherwise.)
    Sunil Kumar Yadav
    Hi all, I am facing an issue when doing a groupBy on an id and trying to sum the amount field of a Cost case class, i.e. case class Booking(id, cost: Cost) with case class Cost(amount: Double). I get an error that Rep[Cost] can't be converted to Cost. Any idea how to do a groupBy with an aggregation on a case class field?
    val action = bookings.groupBy { booking =>
      booking.id
    }.map {
      case (id, group) =>
        // aggregate on the underlying amount column (hypothetical name costAmount),
        // not on the mapped Cost case class
        (id, group.size, group.map(_.costAmount).sum)
    }
    Gajendra Naidu Thalapaneni
    Need help: I have a list of database operations as Futures, and when I run them sequentially, the DB updates are not applied in the same order. Using AsyncExecutor, with Slick 3.3.3 and PostgreSQL.
    Antoine Doeraene
    Could you post the code?
    Gajendra Naidu Thalapaneni
    @sherpal The code below sequentializes the futures.
    def sequentializeFutures[A, B](l: Iterable[A])(fn: A => Future[B])(implicit ec: ExecutionContext): Future[List[B]] =
        l.foldLeft(Future(List.empty[B])) { (previousFuture, next) =>
          for {
            previousResults <- previousFuture
            result          <- fn(next)
          } yield previousResults :+ result
        }
    And those futures include database operations, like creating a row and deleting a row.
    Antoine Doeraene
    That is indeed weird. The only thing I can see is if you were to pass an Iterable[Future[A]] as l and the identity function as fn; in that case the futures are already running before the fold gets to sequence them.
    Need help: is TableQuery thread safe in the Play framework?
    can I just use one TableQuery object throughout my Play application?
    Antoine Doeraene
    Yes, it's an immutable object. Actually, everything in Slick is immutable, until the point where you execute db.run.

    In particular, you can do things like this

    val myTable = TableQuery[Stuff]
    val onlyActiveStuff = myTable.filter(_.active)

    and use onlyActiveStuff throughout your application freely.

    Got it, thanks