    I’m using slick-pg 0.15.0-M3 with Scala 2.12.1 and my driver is declared as follows:
    import java.time.Instant
    import com.github.tminglei.slickpg._
    import slick.basic.Capability
    import slick.jdbc.JdbcCapabilities
    private[taskmanager] trait PgDriver
        extends ExPostgresProfile
        with PgArraySupport
        with PgDateSupport
        with PgDate2Support
        with PgJsonSupport
        with PgRangeSupport
        with PgHStoreSupport
        with PgCirceJsonSupport {
      def pgjson = "jsonb"
      // Add back `capabilities.insertOrUpdate` to enable native `upsert` support; for postgres 9.5+
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + JdbcCapabilities.insertOrUpdate
      override val api: API = new API {}
      trait API
          extends super.API
          with ArrayImplicits
          with SimpleDateTimeImplicits
          with DateTimeImplicits
          with SimpleJsonImplicits
          with RangeImplicits
          with HStoreImplicits
          with CirceImplicits
          with CirceJsonPlainImplicits
          with SimpleArrayPlainImplicits
          with Date2DateTimeImplicitsDuration
          with SimpleJsonPlainImplicits
          with SimpleRangePlainImplicits
          with SimpleHStorePlainImplicits {
    //    implicit val strListTypeMapper = new SimpleArrayJdbcType[String]("text").to(_.toList)
    //    implicit val simpleInstantListTypeMapper =
    //      new SimpleArrayJdbcType[Instant]("instant").to(_.toList)
      }
    }
    private[taskmanager] object PgDriver extends PgDriver
    I guess my driver is missing some implicits. Can you help me?
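[Editor's note: with `capabilities.insertOrUpdate` enabled as in the profile above, Slick can compile `insertOrUpdate` into a native `INSERT ... ON CONFLICT` upsert on PostgreSQL 9.5+. A hedged usage sketch — the `Task`/`Tasks` names are illustrative, not from the thread:]

```scala
import PgDriver.api._

case class Task(id: Int, name: String)

class Tasks(tag: Tag) extends Table[Task](tag, "tasks") {
  def id   = column[Int]("id", O.PrimaryKey)
  def name = column[String]("name")
  def *    = (id, name) <> (Task.tupled, Task.unapply)
}

val tasks = TableQuery[Tasks]

// Inserts the row if id=1 is absent, updates it otherwise (native upsert):
// db.run(tasks.insertOrUpdate(Task(1, "rename me")))
```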
    Mario Pastorelli
    hey people, is PgDate2Support available for slick 3.1.1?
    I can see that it's available in master, but master works with slick 3.2.0
    it's not available in the slick 3 branch
    so slick-pg 0.14.5 doesn't have the extension
    (the README for the slick 3 branch seems broken)
    it doesn't have PgDate2Support
    I'm currently using
    "com.typesafe.slick" %% "slick" % "3.1.1",
      "com.typesafe.slick" %% "slick-hikaricp" % "3.1.1",
      "org.slf4j" % "slf4j-nop" % "1.6.4", // TODO change to appropriate logging library
      "com.github.tminglei" %% "slick-pg" % "0.14.5"
    Is there a way to do this: select ARRAY(select stringfield from foos);
    Pyry Kovanen

    Hi! I'm trying to use type Point in my model, but I'm not able to make it work

    package models
    import java.sql.Timestamp
    import util.MyPostgresDriver.api._
    import play.api.libs.json._
    import play.api.libs.functional.syntax._
    object Company {
      implicit val messageReads: Reads[Company] = (
            (JsPath \ "id").readNullable[Int] and
            (JsPath \ "companyId").read[String] and
            (JsPath \ "name").read[String] and
              (JsPath \ "location").read[Point] and
            (JsPath \ "createdAt").read[Long].map{ long => new Timestamp(long) } and
            (JsPath \ "updatedAt").read[Long].map{ long => new Timestamp(long) }
        )(Company.apply _)
      implicit val messageWrites: Writes[Company] = (
        (JsPath \ "id").writeNullable[Int] and
          (JsPath \ "companyId").write[String] and
          (JsPath \ "name").write[String] and
          (JsPath \ "location").write[Point] and
          (JsPath \ "createdAt").write[Long].contramap{ (a: Timestamp) => a.getTime } and
          (JsPath \ "updatedAt").write[Long].contramap{ (a: Timestamp) => a.getTime }
        )(unlift(Company.unapply _))
    }
    case class Company(
      id: Option[Int],
      companyId: String,
      name: String,
      location: Point,
      createdAt: Timestamp,
      updatedAt: Timestamp
    )

    "Cannot resolve symbol Point"

    package util
    import com.github.tminglei.slickpg._
    trait MyPostgresDriver extends ExPostgresDriver
      with PgArraySupport
      with PgDate2Support
      with PgPlayJsonSupport
      with PgNetSupport
      with PgLTreeSupport
      with PgRangeSupport
      with PgHStoreSupport
      with PgPostGISSupport
      with PgSearchSupport {
      override val pgjson = "jsonb"
      override val api = new API with ArrayImplicits
        with DateTimeImplicits
        with PostGISImplicits
        with PlayJsonImplicits
        with NetImplicits
        with LTreeImplicits
        with RangeImplicits
        with HStoreImplicits
        with SearchImplicits
        with SearchAssistants {}
    }
    object MyPostgresDriver extends MyPostgresDriver

    Anything I'm doing wrong here?

    Thanks for your help!

    Pyry Kovanen
    ah, nevermind, import com.vividsolutions.jts.geom._ was missing
    Kentaro Kuwata
    I would like to create a PostGIS table including Polygon geometry.
    Is it possible to define class of table like the following codes?
    My IDE (IntelliJ) gives errors at <> and Geodummy.unapply
    import MyPostgresDriver.api._
    case class Geodummy(dummy_id: Int, geom: Polygon)
    class Geodummys(tag: Tag) extends Table[Geodummy](tag, "geodummy"){
      def dummy_id = column[Int]("dummy_id")
      def geom = column[Polygon]("geom")
      def * = (dummy_id, geom) <> (Geodummy.tupled, Geodummy.unapply)
    }
    hey everyone, just a noob question, I need the jsonb features of postgres
    i'm confused as to how to use the pgjson type
    i have a case class defining the table entity
    but what does the type of the json value need to be?
    case class MetadataEntity(id: Int, metadata: String, asset_id: String)
    metadata is supposed to be jsonb
    Idrees Khan
    @graffam I'm not sure I understand your question correctly, but I am using JsValue from play-json
    for jsonb. There are other possibilities listed here https://github.com/tminglei/slick-pg#configurable-typemappers
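[Editor's note: as a concrete illustration of the JsValue approach — a sketch assuming a profile that mixes in PgPlayJsonSupport with PlayJsonImplicits; the MetadataRow/MetadataTable names are illustrative:]

```scala
import MyPostgresDriver.api._            // a profile mixing in PgPlayJsonSupport
import play.api.libs.json.JsValue

case class MetadataRow(id: Int, metadata: JsValue, assetId: String)

class MetadataTable(tag: Tag) extends Table[MetadataRow](tag, "metadata") {
  def id       = column[Int]("id", O.PrimaryKey)
  def metadata = column[JsValue]("metadata")   // stored as jsonb via PlayJsonImplicits
  def assetId  = column[String]("asset_id")
  def *        = (id, metadata, assetId) <> (MetadataRow.tupled, MetadataRow.unapply)
}
```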
    I guess i'm having a hard time just using pure json strings
    trait CustomPostgresDriver extends PostgresProfile {
      def pgjson = "jsonb"
      override val api = CustomAPI
      object CustomAPI extends API
    }
    object CustomPostgresDriver extends CustomPostgresDriver
    I don't want to use a heavy json library since i'm not sure the remote client will have it, so i just want a string
    Dmitry Smirnov
    Hi everybody, I have a question about java.time.LocalDateTime support. When I compose my postgres profile with PgDate2Support I can't work with LocalDateTime, even if I write a custom mapper; but if I wrap it in my own custom class, it works.
      case class MyType(value: LocalDateTime)
      class TestTableWithOptMyType(tag: Tag)
        extends Table[Option[MyType]](tag, "TestTableOptMyType") {
        def optMyTypeColumn = column[Option[MyType]]("opt_my_type")
    def * = optMyTypeColumn
  }
    It works
      class TestTableWithOptDt(tag: Tag)
        extends Table[Option[LocalDateTime]](tag, "TestTableOptDt") {
        def optDtTypeColumn = column[Option[LocalDateTime]]("opt_dt_type")
    def * = optDtTypeColumn
  }
    it doesn't
    Here are my mappers:
        implicit def LocalDateTimeMapper: BaseColumnType[LocalDateTime] =
          MappedColumnType.base[LocalDateTime, Timestamp](
            x => Timestamp.valueOf(x),
            x => x.toLocalDateTime
          )
        implicit def MyTypeMapper: BaseColumnType[MyType] =
          MappedColumnType.base[MyType, Timestamp](
            x => Timestamp.valueOf(x.value),
            x => MyType(x.toLocalDateTime)
          )
    When I used a plain LocalDateTime instead of Option[LocalDateTime] I had the same issue
    Dmitry Smirnov
    Oh, i'm sorry, my bad
    I solved the problem, there was a conflict between my implicits and DateTimeImplicits
    Dave Nicponski

    Hi all! Multi part question for y'all.

    1) In a function where I have a scalar array parameter p of type Rep[List[X]], what's the "preferred" way of turning this into a Query[X, Rep[X], Seq]? It isn't just p.unnest, apparently, which has type Rep[X].
    Looking at examples, it looks like Query(true).map(_ => p.unnest), which does have the right signature (albeit is a little weird).

    2) Let's say I have 2 array parameters p1 and p2 of the same cardinality. I want a zip join of the unnested arrays: pairs (a, b) with a from p1 and b from p2, where the indices of a and b match. How should I do this? Hint: it's not:

    Query(true).map(_ => p1.unnest).zip(Query(true).map(_ => p2.unnest))

    since this leads to rather surprising SQL behavior (at least on postgres) related to the ROW_NUMBER function being used for the zip join.

    Dave Nicponski
    ^^ anyone?
    Dave Nicponski
    @tminglei ^
    Dave Nicponski
    Actually, what i'm really wanting to get at is WITH ORDINALITY https://www.postgresql.org/docs/current/static/functions-srf.html
    because then i can do a join on the ordinality (index) columns, to achieve the zip-like behavior i want
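[Editor's note: a hedged sketch of what the WITH ORDINALITY approach could look like as plain SQL dropped into Slick. Identifiers are illustrative, not from the thread, and this assumes a profile's api (here called MyPostgresDriver) is in scope:]

```scala
import MyPostgresDriver.api._

// WITH ORDINALITY exposes the 1-based array index, so two unnested arrays
// can be joined on it to get zip semantics:
val zipped = sql"""
  SELECT a.elem, b.elem
  FROM unnest(ARRAY['x','y']) WITH ORDINALITY AS a(elem, idx)
  JOIN unnest(ARRAY[1,2])     WITH ORDINALITY AS b(elem, idx)
    ON a.idx = b.idx
""".as[(String, Int)]

// Note: PostgreSQL 9.4+ can also zip arrays directly with multi-argument
// unnest in the FROM clause:
//   SELECT * FROM unnest(ARRAY['x','y'], ARRAY[1,2]);
```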
    Dave Nicponski
    I take it that this room is dead?
    Naftoli Gugenheim
    Hi @tminglei you around?
    @melrief PgDate2Support is available for slick 3.1.1 as an addon; you can use libraryDependencies += "com.github.tminglei" %% "slick-pg_date2" % "0.14.6" to depend on it. But in master, for slick 3.2, I merged it into the main jar.
    @njouanin you need to define a TypeMapper for Instant lists with AdvancedArrayJdbcType. And you should use timestamp instead of instant as its pg type.
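[Editor's note: a sketch of what such a mapper might look like, following the AdvancedArrayJdbcType pattern from the slick-pg README. The Instant.parse/toString codecs are placeholder assumptions — verify the helper names and timestamp formatting against your slick-pg version:]

```scala
import java.time.Instant
import com.github.tminglei.slickpg._
import com.github.tminglei.slickpg.utils.SimpleArrayUtils

trait PgDriverWithInstantArrays extends ExPostgresProfile with PgArraySupport {
  override val api: API = new API {}
  trait API extends super.API with ArrayImplicits {
    // Map List[Instant] onto a pg timestamp[] column
    implicit val instantListTypeMapper =
      new AdvancedArrayJdbcType[Instant]("timestamp",
        s => SimpleArrayUtils.fromString[Instant](Instant.parse)(s).orNull,
        v => SimpleArrayUtils.mkString[Instant](_.toString)(v)
      ).to(_.toList)
  }
}
object PgDriverWithInstantArrays extends PgDriverWithInstantArrays
```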
    @arussel slick/slick-pg can't generate select ARRAY(select stringfield from foos) yet.
    @Leonhalt3141 yes, you can give it a try.
    @graffam use def pgjson = "json" instead of def pgjson = "jsonb" in the CustomPostgresDriver, to tell slick-pg that you want to use a json string instead of json binary.
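[Editor's note: for the plain-string use case, slick-pg's core offers a simple string-based mapping via its JsonString wrapper, with no heavy JSON library required. A sketch — verify the trait and implicit names against your slick-pg version:]

```scala
import com.github.tminglei.slickpg._

trait StringJsonPostgresDriver extends ExPostgresProfile with PgJsonSupport {
  def pgjson = "json"   // plain json; switch to "jsonb" for binary

  override val api: API = new API {}
  trait API extends super.API with SimpleJsonImplicits
}
object StringJsonPostgresDriver extends StringJsonPostgresDriver

// Usage: declare columns of type JsonString, e.g.
//   def metadata = column[JsonString]("metadata")
// and read the raw JSON text back via row.metadata.value
```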
    @virusdave replied to you in the issue.
    Hi all, sorry for the late replies. I haven't been able to receive notifications for a long time because of GFW blocking.