    Just use table.filter(_.name === "baz"). In fact, both are ok.
    Jens Grassel

    Hi, I'm running into a missing implicit value for parameter tt: slick.ast.TypedType[...] issue. The basic code is as follows:

    trait CustomPostgresDriver extends ExPostgresDriver with PgDate2Support with PgPostGISSupport {
      def pgjson = "jsonb" // For PostgreSQL 9.4 and higher use `jsonb`; versions below 9.4 must use `json`.
      protected override def computeCapabilities: Set[Capability] = super.computeCapabilities + JdbcProfile.capabilities.insertOrUpdate
      override val api: API = new API with DateTimeImplicits with PostGISImplicits with PostGISAssistants {}
    }
    object CustomPostgresDriver extends CustomPostgresDriver
    object Tables {
      import CustomPostgresDriver.api._
      class MyTable(tag: Tag) extends Table[TestCase](tag, "cases") {
        def z = column[ZonedDateTime]("z")
        def t = column[String]("t")
        def c = column[Int]("c")
        def p = column[Point]("p")
        override def * : ProvenShape[TestCase] =
          (z, t, c, p).shaped <> ((TestCase.apply _).tupled, TestCase.unapply)
      }
      val tq = TableQuery[MyTable]
    }

    The compiler complains about the ZonedDateTime and the Point column types. :-(

    Any ideas what I am doing wrong? I've raised an issue regarding that which links to a small repo that reproduces the case: tminglei/slick-pg#303

    @jan0sch resolved, pls check the issue comments and try it.
    Dave Nicponski

    Hi tminglei!
    I've been bashing my head against slick without much luck from that gitter room. I'm trying to use Compiled() with methods that take case classes which have a CaseClassShape, but of course that doesn't work due to the ShapeLevel of CaseClassShape (specifically, it's not a ColumnsShapeLevel).
    I noticed that using slick-pg (we use postgres) that i typically can compile queries that use Rep[List[T]] for primitive types T. Then, i stumbled upon something that is making me think i could use slick-pg's support for Composite (or at least, Composite Arrays) and might accomplish the same thing. Basically it would "fake" an encoding as string to become a primitive column type, however this would never actually be used since these types would only be deconstructed inside the query and never actually stored.

    Does this sound reasonable? Or am i way off base here? Do you have any advice for me on how to go about doing this? Slick codebase is unbelievably hard to understand for me, and online resources appear to be nonexistent to help understand it.

    Also (and unrelated), i've managed to use slick-pg's array support to get around the issue of precompiling queries that would naturally take a Set[T] for foo IN ( ... ) type queries, by instead doing something like Query(vals.lift.unnest) to produce a query from a list that i can then join against and precompile.
    However, this doesn't seem to work if T is of the form Option[S]. Basically, i'd like my array to potentially have NULLs in some entries such that i'd end up with NULLs in the unnested query. Is this meant to work? Do i need to do something magical to make it work, like adding additional implicit SimpleArrayJdbcType beyond the base vals in ArrayImplicits ?
    Any help you can provide is majorly appreciated - you seem by far the most responsive (and knowledgeable) person i've observed in the slick world.

    Hi @virusdave I don't exactly understand what you said. To help me understand better, maybe you can:

    • tell me what you really want to do, then I can check whether you chose a suitable way;
    • provide a sample project, then I can check the details and know what happened to you

    Anyway, I'd like to help if I can :-)

    Dave Nicponski

    thanks for responding! It's a little late at night now (4am for me) but i'll try to get a sample for you tomorrow. Meanwhile i can try to explain better what i want.

    I need all of my slick queries to be precompiled - the compilation overhead is basically always too large to be allowed in the serving path.

    My two difficult cases basically are:
    1) I want my query to take "a set of values" as a parameter. Effectively i want a query that does SELECT * WHERE val IN ( x, y, z, ...)
    Slick doesn't want to precompile this, so i'm "faking" it with the array type in slick-pg, which does allow me to precompile a query with List[T] as parameter type. However i can't seem to get this to work with List[Option[T]] types.
    2) Precompiled queries are only allowed to take primitive/scalar lifted types. This means that rather than passing a well-shaped case class, you instead have to pass each of its component data members individually. This is awful. I'm looking for a way to do this better.

    i'll try to get a minimal repro example to demonstrate both of these tomorrow.
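For reference, the unnest trick described in case 1 might look roughly like this. This is a hedged sketch, not code from the thread: it assumes a custom profile (here called `MyProfile`) mixing in slick-pg's `PgArraySupport` with `ArrayImplicits`, and the table and column names (`Items`, `name`) are hypothetical.

```scala
import MyProfile.api._ // hypothetical profile with PgArraySupport + ArrayImplicits

class Items(tag: Tag) extends Table[(Int, String)](tag, "items") {
  def id   = column[Int]("id")
  def name = column[String]("name")
  def *    = (id, name)
}
val items = TableQuery[Items]

// `name inSet xs` cannot be precompiled for a variable-length set, but a
// whole List can be bound as a single array parameter, unnested into a
// one-column query, and joined against:
val byNames = Compiled { (names: Rep[List[String]]) =>
  items.join(Query(names.unnest)).on(_.name === _).map(_._1)
}
```

The array is a single bind parameter, so the compiled query shape stays fixed regardless of how many values are passed.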
    Yeah, example repo is a better work base, I'll wait for it.
    :+1: wonderful slick-pg
    I have a column[Option[JsValue]] field; it looks like, to be able to query on it, I have to: (table.jsonColumnOpt+>>'value' === "the value").getOrElse(false:Rep[Boolean])
    but this is bad, as the resulting query creates an unneeded coalesce(..., false) and so does not use the index on jsonColumnOpt+>>'value'
    is there a way to re-write the query in scala such that it is not using coalesce ?
    it really just seems like table.jsonColumnOpt+>>'value' === "the value" should return a Rep[Boolean] as it does for every other Option type
    Juha Paananen
    Hi! Is there any chance to get the ilike operator to slick-pg?
    Yes, we can. Pls file an issue on github.
    Juha Paananen
    Joe Arasin
    Which version of this library should I use -- the 0.15 milestone or 0.14?
    Juha Paananen
    Is there a way to update a column to its default value as in UPDATE table SET update_timestamp=DEFAULT WHERE id=1? The case is that I'd like to re-calculate the value of an "update timestamp" field that has current_timestamp as its default value.
    What I'm looking for is the equivalent of the DEFAULT keyword in the Slick DSL
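This isn't answered in the thread, but one possible workaround is Slick's plain-SQL interpolator, which can emit the DEFAULT keyword directly since (to my knowledge) the lifted DSL has no DEFAULT literal. A sketch with hypothetical table and column names:

```scala
import MyProfile.api._ // any JdbcProfile api import

// Sketch: express the update as plain SQL; $id is bound as a parameter,
// while DEFAULT is emitted literally so the column's default applies.
def resetUpdateTimestamp(id: Int) =
  sqlu"UPDATE mytable SET update_timestamp = DEFAULT WHERE id = $id"
```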
    Juha Paananen
    While trying to update to the newest version, I run into an issue with bind.any: previously I was able to do val things: List[JValue] = ... ; things.bind.any and use this in a query. Now I get "value bind is not a member of List[org.json4s.JValue]". Any idea what could have happened?
    Gotta admit I haven't figured out where the implicit conversions that add bind to things are in Slick or Slick-pg codebase...
    Juha Paananen
    Thanks to @tminglei my problem was solved. With Scala 2.12 you have to explicitly declare an API trait in your custom PostgresProfile, instead of just declaring an anonymous class by extending API in a field. This tminglei/slick-pg@56ad9e0 summarizes the changes very well.
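In code, the change that commit describes amounts to something like the following sketch (support traits trimmed to one for brevity; the profile name is hypothetical):

```scala
import com.github.tminglei.slickpg._

trait MyPostgresProfile extends ExPostgresProfile with PgArraySupport {
  // Scala 2.12: declare a named API trait extending super.API plus the
  // implicit traits you need...
  trait API extends super.API with ArrayImplicits

  // ...and assign an instance of it, instead of the older anonymous-class
  // form `override val api = new API with ArrayImplicits {}`.
  override val api: API = new API {}
}
object MyPostgresProfile extends MyPostgresProfile
```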
    Brian Topping
    wow, sweet project, thanks!
    I have a table column declared as val stoppedAt = column[List[Instant]]("stopped_at") and I get the following exception when compiling: could not find implicit value for parameter tt: slick.ast.TypedType[List[java.time.Instant]] [error] val stoppedAt = column[List[Instant]]("stopped_at")
    I'm using slick-pg 0.15.0-M3 with scala 2.12.1 and my driver is declared as follows:
    import java.time.Instant
    import com.github.tminglei.slickpg._
    import slick.basic.Capability
    import slick.jdbc.JdbcCapabilities
    private[taskmanager] trait PgDriver
        extends ExPostgresProfile
        with PgArraySupport
        with PgDateSupport
        with PgDate2Support
        with PgJsonSupport
        with PgRangeSupport
        with PgHStoreSupport
        with PgCirceJsonSupport {
      def pgjson = "jsonb"
      // Add back `capabilities.insertOrUpdate` to enable native `upsert` support; for postgres 9.5+
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + JdbcCapabilities.insertOrUpdate
      override val api: API = new API {}
      trait API
          extends super.API
          with ArrayImplicits
          with SimpleDateTimeImplicits
          with DateTimeImplicits
          with SimpleJsonImplicits
          with RangeImplicits
          with HStoreImplicits
          with CirceImplicits
          with CirceJsonPlainImplicits
          with SimpleArrayPlainImplicits
          with Date2DateTimeImplicitsDuration
          with SimpleJsonPlainImplicits
          with SimpleRangePlainImplicits
          with SimpleHStorePlainImplicits {
    //    implicit val strListTypeMapper = new SimpleArrayJdbcType[String]("text").to(_.toList)
    //    implicit val simpleInstantListTypeMapper =
    //      new SimpleArrayJdbcType[Instant]("instant").to(_.toList)
      }
    }
    private[taskmanager] object PgDriver extends PgDriver
    I guess my driver is missing some implicit. Can you help me?
    Mario Pastorelli
    hey people, is PgDate2Support available for slick 3.1.1?
    I can see that it's available in master, but that works with slick 3.2.0
    it's not available in the slick3 branch
    so slick-pg 0.14.5 doesn't have the extension
    (the README for the slick3 branch seems broken)
    it doesn't have PgDate2Support
    I'm currently using
    "com.typesafe.slick" %% "slick" % "3.1.1",
      "com.typesafe.slick" %% "slick-hikaricp" % "3.1.1",
      "org.slf4j" % "slf4j-nop" % "1.6.4", // TODO change to appropriate logging library
      "com.github.tminglei" %% "slick-pg" % "0.14.5"
    Is there a way to do this: select ARRAY(select stringfield from foos);
    Pyry Kovanen

    Hi! I'm trying to use the type Point in my model, but I'm not able to make it work

    package models
    import java.sql.Timestamp
    import util.MyPostgresDriver.api._
    import play.api.libs.json._
    import play.api.libs.functional.syntax._
    object Company {
      implicit val messageReads: Reads[Company] = (
            (JsPath \ "id").readNullable[Int] and
            (JsPath \ "companyId").read[String] and
            (JsPath \ "name").read[String] and
              (JsPath \ "location").read[Point] and
            (JsPath \ "createdAt").read[Long].map{ long => new Timestamp(long) } and
            (JsPath \ "updatedAt").read[Long].map{ long => new Timestamp(long) }
        )(Company.apply _)
      implicit val messageWrites: Writes[Company] = (
        (JsPath \ "id").writeNullable[Int] and
          (JsPath \ "companyId").write[String] and
          (JsPath \ "name").write[String] and
          (JsPath \ "location").write[Point] and
          (JsPath \ "createdAt").write[Long].contramap{ (a: Timestamp) => a.getTime } and
          (JsPath \ "updatedAt").write[Long].contramap{ (a: Timestamp) => a.getTime }
        )(unlift(Company.unapply _))
    }
    case class Company(
      id: Option[Int],
      companyId: String,
      name: String,
      location: Point,
      createdAt: Timestamp,
      updatedAt: Timestamp
    )

    "Cannot resolve symbol Point"

    package util
    import com.github.tminglei.slickpg._
    trait MyPostgresDriver extends ExPostgresDriver
      with PgArraySupport
      with PgDate2Support
      with PgPlayJsonSupport
      with PgNetSupport
      with PgLTreeSupport
      with PgRangeSupport
      with PgHStoreSupport
      with PgPostGISSupport
      with PgSearchSupport {
      override val pgjson = "jsonb"
      override val api = new API with ArrayImplicits
        with DateTimeImplicits
        with PostGISImplicits
        with PlayJsonImplicits
        with NetImplicits
        with LTreeImplicits
        with RangeImplicits
        with HStoreImplicits
        with SearchImplicits
        with SearchAssistants {}
    }
    object MyPostgresDriver extends MyPostgresDriver

    Anything I'm doing wrong here?

    Thanks for your help!

    Pyry Kovanen
    ah, nevermind, import com.vividsolutions.jts.geom._ was missing
    Kentaro Kuwata
    I would like to create a PostGIS table including Polygon geometry.
    Is it possible to define class of table like the following codes?
    My IDE (IntelliJ) gives an error at <> and Geodummy.unapply
    import MyPostgresDriver.api._
    case class Geodummy(dummy_id: Int, geom: Polygon)
    class Geodummys(tag: Tag) extends Table[Geodummy](tag, "geodummy"){
      def dummy_id = column[Int]("dummy_id")
      def geom = column[Polygon]("geom")
      def * = (dummy_id, geom) <> (Geodummy.tupled, Geodummy.unapply)
    }
    hey everyone, just a noob question, I need the jsonb features of postgres
    i'm confused as to how to use the pgjson type
    i have a case class defining the table entity
    but what does the type of the json value need to be?
    case class MetadataEntity(id: Int, metadata: String, asset_id: String)
    metadata is supposed to be jsonb
    Idrees Khan
    @graffam I'm not sure I understand your question correctly, but I am using JsValue from play-json
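Putting that together, a jsonb-backed table with play-json could look roughly like this. A hedged sketch, not code from the thread: it assumes a custom profile (here `MyPostgresProfile`) mixing in PgPlayJsonSupport with PlayJsonImplicits and pgjson = "jsonb"; the table name is hypothetical.

```scala
import play.api.libs.json.JsValue
import MyPostgresProfile.api._ // hypothetical profile with PgPlayJsonSupport

// The entity carries a JsValue field instead of a raw String.
case class MetadataEntity(id: Int, metadata: JsValue, asset_id: String)

class MetadataTable(tag: Tag) extends Table[MetadataEntity](tag, "metadata") {
  def id       = column[Int]("id", O.PrimaryKey, O.AutoInc)
  def metadata = column[JsValue]("metadata") // stored as jsonb
  def asset_id = column[String]("asset_id")
  def *        = (id, metadata, asset_id) <> (MetadataEntity.tupled, MetadataEntity.unapply)
}
```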