    Dave Nicponski
    Any help you can provide is majorly appreciated - you seem by far the most responsive (and knowledgeable) person i've observed in the slick world.

    Hi @virusdave I don't exactly understand what you said. To help me understand better, maybe you can:

    • tell me what you really want to do, then I can check whether you chose a suitable way;
    • provide a sample project, then I can check the details and know what happened to you

    Anyway, I'd like to help if I can :-)

    Dave Nicponski

    thanks for responding! It's a little late at night now (4am for me) but i'll try to get a sample for you tomorrow. Meanwhile i can try to explain better what i want.

    I need all of my slick queries to be precompiled - the compilation overhead is basically always too large to be allowed in the serving path.

    My two difficult cases basically are:
    1) I want my query to take "a set of values" as a parameter. Effectively i want a query that does SELECT * WHERE val IN ( x, y, z, ...)
    Slick doesn't want to precompile this, so i'm "faking" it with the array type in slick-pg, which does allow me to precompile a query with List[T] as parameter type. However i can't seem to get this to work with List[Option[T]] types.
    2) Precompiled queries only are allowed to take primitive/scalar lifted types. This means that rather than pass a well-shaped case class, you instead have to pass each of its component data members individually. This is awful. I'm looking for a way to do this better.
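    For case (1), a minimal sketch of the array-parameter workaround described above (the profile name `MyProfile` and table `Vals` are hypothetical; it assumes slick-pg's `ArrayImplicits` are mixed into the profile's API):

```scala
import MyProfile.api._  // hypothetical profile mixing in slick-pg's ArrayImplicits

class Vals(tag: Tag) extends Table[Int](tag, "vals") {
  def value = column[Int]("val")
  def * = value
}
val vals = TableQuery[Vals]

// A literal IN (x, y, z) can't be precompiled because its arity varies,
// but binding the whole set as one array parameter keeps the SQL shape
// fixed: SELECT ... WHERE "val" = ANY(?)
val byValues = Compiled { (xs: Rep[List[Int]]) =>
  vals.filter(_.value === xs.any)
}
```

    Whether `.any` is available directly on a `Rep[List[Int]]` depends on the implicits your profile mixes in; the older idiom was `xs.bind.any` on a plain Scala `List`.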

    i'll try to get a minimal repro example to demonstrate both of these tomorrow.
    Yeah, example repo is a better work base, I'll wait for it.
    :+1: wonderful slick-pg
    I have a column[Option[JsValue]] field; it looks like, to be able to query on it, I have to: (table.jsonColumnOpt+>>'value' === "the value").getOrElse(false:Rep[Boolean])
    but this is bad, as the resulting query creates an unneeded coalesce(..., false) and so does not use the index on jsonColumn+>>'value'
    is there a way to rewrite the query in Scala such that it does not use coalesce?
    it really just seems like table.jsonColumnOpt+>>'value' === "the value" should return a Rep[Boolean], as it does for every other Option type
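    One possible workaround, assuming the table/column names from the snippet above: Slick's `filter` accepts a `Rep[Option[Boolean]]` condition directly (a `None` result is treated as false), so the `getOrElse` can often simply be dropped:

```scala
// Sketch: filter has a CanBeQueryCondition instance for Rep[Option[Boolean]],
// so passing the optional comparison directly avoids coalesce(..., false):
val q = tables.filter(t => t.jsonColumnOpt.+>>("value") === "the value")
```

    Whether the generated SQL then actually uses the expression index still depends on Postgres's planner.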
    Juha Paananen
    Hi! Is there any chance to get the ilike operator to slick-pg?
    Yes, we can. Pls file an issue on github.
    Juha Paananen
    Joe Arasin
    Which version of this library should I use -- the 0.15 milestone or 0.14?
    Juha Paananen
    Is there a way to update a column to its default value as in UPDATE table SET update_timestamp=DEFAULT WHERE id=1? The case is that I'd like to re-calculate the value of an "update timestamp" field that has current_timestamp as its default value.
    What I'm looking for is the equivalent of the DEFAULT keyword in the Slick DSL
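    As far as I know the lifted Slick DSL has no counterpart for SQL's DEFAULT keyword, so one workaround is plain SQL (the table and column names here are just examples):

```scala
import slick.jdbc.PostgresProfile.api._

// Falls back to the column's declared default (e.g. current_timestamp)
// for the given row; not expressible in the lifted update DSL:
val resetTimestamp: DBIO[Int] =
  sqlu"UPDATE mytable SET update_timestamp = DEFAULT WHERE id = 1"
```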
    Juha Paananen
    While trying to update to the newest version, I run into an issue with bind.any: previously I was able to do val things: List[JValue] = ... ; things.bind.any and use this in a query. Now I get "value bind is not a member of List[org.json4s.JValue]". Any idea what could have happened?
    Gotta admit I haven't figured out where the implicit conversions that add bind to things are in Slick or Slick-pg codebase...
    Juha Paananen
    Thanks to @tminglei my problem was solved. With Scala 2.12 you have to explicitly declare an API trait in your custom PostgresProfile, instead of just declaring an anonymous class by extending API in a field. This tminglei/slick-pg@56ad9e0 summarizes the changes very well.
    Brian Topping
    wow, sweet project, thanks!
    I have a table column declared as val stoppedAt = column[List[Instant]]("stopped_at") and I get the following exception when compiling: could not find implicit value for parameter tt: slick.ast.TypedType[List[java.time.Instant]] [error] val stoppedAt = column[List[Instant]]("stopped_at")
    I'm using slick-pg 0.15.0-M3 with Scala 2.12.1, and my driver is declared as follows:
    import java.time.Instant
    import com.github.tminglei.slickpg._
    import slick.basic.Capability
    import slick.jdbc.JdbcCapabilities
    private[taskmanager] trait PgDriver
        extends ExPostgresProfile
        with PgArraySupport
        with PgDateSupport
        with PgDate2Support
        with PgJsonSupport
        with PgRangeSupport
        with PgHStoreSupport
        with PgCirceJsonSupport {
      def pgjson = "jsonb"
      // Add back `capabilities.insertOrUpdate` to enable native `upsert` support; for postgres 9.5+
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + JdbcCapabilities.insertOrUpdate
      override val api: API = new API {}
      trait API
          extends super.API
          with ArrayImplicits
          with SimpleDateTimeImplicits
          with DateTimeImplicits
          with SimpleJsonImplicits
          with RangeImplicits
          with HStoreImplicits
          with CirceImplicits
          with CirceJsonPlainImplicits
          with SimpleArrayPlainImplicits
          with Date2DateTimeImplicitsDuration
          with SimpleJsonPlainImplicits
          with SimpleRangePlainImplicits
          with SimpleHStorePlainImplicits
    //    implicit val strListTypeMapper = new SimpleArrayJdbcType[String]("text").to(_.toList)
    //    implicit val simpleInstantListTypeMapper =
    //      new SimpleArrayJdbcType[Instant]("instant").to(_.toList)
    }

    private[taskmanager] object PgDriver extends PgDriver
    I guess my driver is missing some implicit. Can you help me?
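    The error says there is no implicit `TypedType[List[Instant]]` in scope. One sketch of a fix, mirroring the commented-out mappers in the paste above, is to declare an explicit array mapper inside the API trait (the `"timestamp"` base type and the direct element mapping are assumptions; depending on the JDBC driver an explicit element converter may be needed):

```scala
trait API extends super.API
    with ArrayImplicits
    /* ...other implicit traits as above... */ {
  // Assumes the Postgres JDBC driver yields array elements convertible
  // to Instant; otherwise supply explicit to/from conversions.
  implicit val instantListTypeMapper: DriverJdbcType[List[Instant]] =
    new SimpleArrayJdbcType[Instant]("timestamp").to(_.toList)
}
```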
    Mario Pastorelli
    hey people, is PgDate2Support available for slick 3.1.1?
    I can see that it's available in master, but that works with slick 3.2.0
    it's not available in the slick3 branch
    so slick-pg 0.14.5 doesn't have the extension
    (the README for the slick3 branch seems broken)
    it doesn't have PgDate2Support
    I'm currently using
    "com.typesafe.slick" %% "slick" % "3.1.1",
      "com.typesafe.slick" %% "slick-hikaricp" % "3.1.1",
      "org.slf4j" % "slf4j-nop" % "1.6.4", // TODO change to appropriate logging library
      "com.github.tminglei" %% "slick-pg" % "0.14.5"
    Is there a way to do this: select ARRAY(select stringfield from foos);
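    One sketch, assuming plain SQL with slick-pg's `SimpleArrayPlainImplicits` in scope to provide a `GetResult` for string sequences (the table/field names come from the question):

```scala
import MyProfile.api._  // hypothetical profile mixing in SimpleArrayPlainImplicits

// Runs the array sub-select as-is and reads the single array result back:
val action = sql"select ARRAY(select stringfield from foos)".as[Seq[String]].head
```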
    Pyry Kovanen

    Hi! I'm trying to use the type Point in my model, but I'm not able to make it work

    package models

    import java.sql.Timestamp
    import util.MyPostgresDriver.api._
    import play.api.libs.json._
    import play.api.libs.functional.syntax._

    case class Company(
      id: Option[Int],
      companyId: String,
      name: String,
      location: Point,
      createdAt: Timestamp,
      updatedAt: Timestamp
    )

    object Company {
      implicit val messageReads: Reads[Company] = (
        (JsPath \ "id").readNullable[Int] and
        (JsPath \ "companyId").read[String] and
        (JsPath \ "name").read[String] and
        (JsPath \ "location").read[Point] and
        (JsPath \ "createdAt").read[Long].map(long => new Timestamp(long)) and
        (JsPath \ "updatedAt").read[Long].map(long => new Timestamp(long))
      )(Company.apply _)

      implicit val messageWrites: Writes[Company] = (
        (JsPath \ "id").writeNullable[Int] and
        (JsPath \ "companyId").write[String] and
        (JsPath \ "name").write[String] and
        (JsPath \ "location").write[Point] and
        (JsPath \ "createdAt").write[Long].contramap((a: Timestamp) => a.getTime) and
        (JsPath \ "updatedAt").write[Long].contramap((a: Timestamp) => a.getTime)
      )(unlift(Company.unapply))
    }

    "Cannot resolve symbol Point"

    package util
    import com.github.tminglei.slickpg._
    trait MyPostgresDriver extends ExPostgresDriver
      with PgArraySupport
      with PgDate2Support
      with PgPlayJsonSupport
      with PgNetSupport
      with PgLTreeSupport
      with PgRangeSupport
      with PgHStoreSupport
      with PgPostGISSupport
      with PgSearchSupport {
      override val pgjson = "jsonb"
      override val api = new API with ArrayImplicits
        with DateTimeImplicits
        with PostGISImplicits
        with PlayJsonImplicits
        with NetImplicits
        with LTreeImplicits
        with RangeImplicits
        with HStoreImplicits
        with SearchImplicits
        with SearchAssistants {}
    }

    object MyPostgresDriver extends MyPostgresDriver

    Anything I'm doing wrong here?

    Thanks for your help!

    Pyry Kovanen
    ah, nevermind, import com.vividsolutions.jts.geom._ was missing
    Kentaro Kuwata
    I would like to create a PostGIS table including Polygon geometry.
    Is it possible to define class of table like the following codes?
    My IDE (IntelliJ) has given an error at <> and Geodummy.unapply
    import MyPostgresDriver.api._
    case class Geodummy(dummy_id: Int, geom: Polygon)
    class Geodummys(tag: Tag) extends Table[Geodummy](tag, "geodummy"){
      def dummy_id = column[Int]("dummy_id")
      def geom = column[Polygon]("geom")
      def * = (dummy_id, geom) <> (Geodummy.tupled, Geodummy.unapply)
    }
    hey everyone, just a noob question, I need the jsonb features of postgres
    i'm confused as to how to use the pgjson type
    i have a case class defining the table entity
    but what does the type of the json value need to be?
    case class MetadataEntity(id: Int, metadata: String, asset_id: String)
    metadata is supposed to be jsonb
    Idrees Khan
    @graffam I'm not sure I understand your question correctly, but I am using JsValue from play-json
    for jsonb. There are other possibilities listed here https://github.com/tminglei/slick-pg#configurable-typemappers
    I guess i'm having a hard time just using pure json strings
    trait CustomPostgresDriver extends PostgresProfile {
      def pgjson = "jsonb"
      override val api = CustomAPI
      object CustomAPI extends API
    }
    object CustomPostgresDriver extends CustomPostgresDriver
    I don't want to use a heavy json library since i'm not sure the remote client will have it, so i just want a string
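    A sketch of keeping jsonb as a raw `String`, using slick-pg's `GenericJdbcType` as a pass-through. The exact constructor arguments are an assumption, and the mapper is deliberately non-implicit so it doesn't shadow Slick's default `String` mapping:

```scala
import com.github.tminglei.slickpg._

trait CustomPostgresDriver extends ExPostgresProfile with utils.PgCommonJdbcTypes {
  def pgjson = "jsonb"
  override val api: API = new API {}
  trait API extends super.API {
    // Pass-through mapper: stores and reads the jsonb column as raw text.
    val jsonbStringType: GenericJdbcType[String] =
      new GenericJdbcType[String]("jsonb", identity, identity, hasLiteralForm = false)
  }
}
object CustomPostgresDriver extends CustomPostgresDriver
```

    It would then be passed explicitly at the column definition site, e.g. `column[String]("metadata")(CustomPostgresDriver.api.jsonbStringType)`.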
    Dmitry Smirnov
    Hi everybody, I have a question about java.time.LocalDateTime support: when I compose my postgres profile with PgDate2Support, I can't work with LocalDateTime, even if I write a custom mapper; but if I wrap it in my own custom class, it works.