    Juha Paananen
    Hi! Is there any chance to get the ilike operator to slick-pg?
    Yes, we can. Please file an issue on GitHub.
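Until it lands in the library, a custom operator can be sketched with Slick's SimpleBinaryOperator; the object and method names below are hypothetical:

```scala
import slick.jdbc.PostgresProfile.api._
import slick.lifted.SimpleBinaryOperator

object ILikeSupport {
  // Hypothetical extension: expose PostgreSQL's case-insensitive ILIKE
  // as a method on Rep[String].
  implicit class ILikeOps(val s: Rep[String]) extends AnyVal {
    def ilike(pattern: Rep[String]): Rep[Boolean] =
      SimpleBinaryOperator[Boolean]("ilike").apply(s, pattern)
  }
}
```

With this in scope, people.filter(_.name ilike "jo%") should render as WHERE "name" ilike 'jo%'.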
    Joe Arasin
    Which version of this library should I use -- the 0.15 milestone or 0.14?
    Juha Paananen
    Is there a way to update a column to its default value as in UPDATE table SET update_timestamp=DEFAULT WHERE id=1? The case is that I'd like to re-calculate the value of an "update timestamp" field that has current_timestamp as its default value.
    What I'm looking for is the equivalent of the DEFAULT keyword in the Slick DSL
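For what it's worth, the lifted DSL doesn't seem to have a counterpart for DEFAULT, so a plain-SQL fallback is one option — a sketch with a hypothetical table name (mytable):

```scala
import slick.jdbc.PostgresProfile.api._

// Plain-SQL workaround: let PostgreSQL re-evaluate the column's default
// (here current_timestamp) instead of sending a value from the client.
def touchUpdateTimestamp(id: Int): DBIO[Int] =
  sqlu"UPDATE mytable SET update_timestamp = DEFAULT WHERE id = $id"
```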
    Juha Paananen
    While trying to update to the newest version, I run into an issue with bind.any: previously I was able to do val things: List[JValue] = ... ; things.bind.any and use this in a query. Now I get "value bind is not a member of List[org.json4s.JValue]". Any idea what could have happened?
    Gotta admit I haven't figured out where the implicit conversions that add bind to things are in Slick or Slick-pg codebase...
    Juha Paananen
    Thanks to @tminglei my problem was solved. With Scala 2.12 you have to explicitly declare an API trait in your custom PostgresProfile, instead of just declaring an anonymous class by extending API in a field. This tminglei/slick-pg@56ad9e0 summarizes the changes very well.
    Brian Topping
    wow, sweet project, thanks!
    I have a table column declared as val stoppedAt = column[List[Instant]]("stopped_at") and I get the following exception when compiling: could not find implicit value for parameter tt: slick.ast.TypedType[List[java.time.Instant]] [error] val stoppedAt = column[List[Instant]]("stopped_at")
    I'm using slick-pg 0.15.0-M3 with scala 2.12.1 and my driver is declared as follows:
    import java.time.Instant
    import com.github.tminglei.slickpg._
    import slick.basic.Capability
    import slick.jdbc.JdbcCapabilities
    private[taskmanager] trait PgDriver
        extends ExPostgresProfile
        with PgArraySupport
        with PgDateSupport
        with PgDate2Support
        with PgJsonSupport
        with PgRangeSupport
        with PgHStoreSupport
        with PgCirceJsonSupport {
      def pgjson = "jsonb"
      // Add back `capabilities.insertOrUpdate` to enable native `upsert` support; for postgres 9.5+
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + JdbcCapabilities.insertOrUpdate
      override val api: API = new API {}
      trait API
          extends super.API
          with ArrayImplicits
          with SimpleDateTimeImplicits
          with DateTimeImplicits
          with SimpleJsonImplicits
          with RangeImplicits
          with HStoreImplicits
          with CirceImplicits
          with CirceJsonPlainImplicits
          with SimpleArrayPlainImplicits
          with Date2DateTimeImplicitsDuration
          with SimpleJsonPlainImplicits
          with SimpleRangePlainImplicits
          with SimpleHStorePlainImplicits {
        // implicit val strListTypeMapper = new SimpleArrayJdbcType[String]("text").to(_.toList)
        // implicit val simpleInstantListTypeMapper =
        //   new SimpleArrayJdbcType[Instant]("instant").to(_.toList)
      }
    }
    private[taskmanager] object PgDriver extends PgDriver
    I guess my driver is missing some implicit. Can you help me?
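The commented-out mapper lines look like the right direction: without an array mapper in the API trait there is no TypedType[List[Instant]]. A sketch along the lines of the slick-pg README, assuming a timestamptz[] column — the string codecs are naive placeholders and may need adjusting to PostgreSQL's actual text format:

```scala
import java.time.Instant
import com.github.tminglei.slickpg._

trait PgDriverWithInstantArrays extends ExPostgresProfile with PgArraySupport {
  override val api = InstantArrayAPI
  object InstantArrayAPI extends API with ArrayImplicits {
    // Sketch: bind List[Instant] to timestamptz[]; Instant.parse / toString
    // are placeholder codecs for the array's element representation.
    implicit val instantListTypeMapper =
      new AdvancedArrayJdbcType[Instant](
        "timestamptz",
        s => utils.SimpleArrayUtils.fromString[Instant](Instant.parse)(s).orNull,
        v => utils.SimpleArrayUtils.mkString[Instant](_.toString)(v)
      ).to(_.toList)
  }
}
object PgDriverWithInstantArrays extends PgDriverWithInstantArrays
```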
    Mario Pastorelli
    hey people, is PgDate2Support available for slick 3.1.1?
    I can see that it's available in master, but master works with slick 3.2.0
    it's not available in the slick 3 branch
    so slick-pg 0.14.5 doesn't have the extension
    (the README for the slick 3 branch seems broken)
    it doesn't have PgDate2Support
    I'm currently using
    "com.typesafe.slick" %% "slick" % "3.1.1",
      "com.typesafe.slick" %% "slick-hikaricp" % "3.1.1",
      "org.slf4j" % "slf4j-nop" % "1.6.4", // TODO change to appropriate logging library
      "com.github.tminglei" %% "slick-pg" % "0.14.5"
    Is there a way to do this: select ARRAY(select stringfield from foos);
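Regarding select ARRAY(select stringfield from foos): I don't believe the lifted DSL can express the ARRAY(...) constructor, so plain SQL is the likely route — a sketch, assuming the profile's plain-API array implicits (e.g. SimpleArrayPlainImplicits) are imported so the result decodes:

```scala
// Plain-SQL sketch; the GetResult for Seq[String] is assumed to come
// from slick-pg's SimpleArrayPlainImplicits.
val stringsAsArray =
  sql"select array(select stringfield from foos)".as[Seq[String]]
```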
    Pyry Kovanen

    Hi! I'm trying to use type Point in my model, but I'm not able to make it work

    package models
    import java.sql.Timestamp
    import util.MyPostgresDriver.api._
    import play.api.libs.json._
    import play.api.libs.functional.syntax._
    object Company {
      implicit val messageReads: Reads[Company] = (
            (JsPath \ "id").readNullable[Int] and
            (JsPath \ "companyId").read[String] and
            (JsPath \ "name").read[String] and
              (JsPath \ "location").read[Point] and
            (JsPath \ "createdAt").read[Long].map{ long => new Timestamp(long) } and
            (JsPath \ "updatedAt").read[Long].map{ long => new Timestamp(long) }
        )(Company.apply _)
      implicit val messageWrites: Writes[Company] = (
        (JsPath \ "id").write[Int] and
          (JsPath \ "companyId").write[String] and
          (JsPath \ "name").write[String] and
          (JsPath \ "location").write[Point] and
          (JsPath \ "createdAt").write[Long].contramap{ (a: Timestamp) => a.getTime } and
          (JsPath \ "updatedAt").write[Long].contramap{ (a: Timestamp) => a.getTime }
        )(unlift(Company.unapply _))
    }

    case class Company(
      id: Option[Int],
      companyId: String,
      name: String,
      location: Point,
      createdAt: Timestamp,
      updatedAt: Timestamp
    )

    "Cannot resolve symbol Point"

    package util
    import com.github.tminglei.slickpg._
    trait MyPostgresDriver extends ExPostgresDriver
      with PgArraySupport
      with PgDate2Support
      with PgPlayJsonSupport
      with PgNetSupport
      with PgLTreeSupport
      with PgRangeSupport
      with PgHStoreSupport
      with PgPostGISSupport
      with PgSearchSupport {
      override val pgjson = "jsonb"
      override val api = new API with ArrayImplicits
        with DateTimeImplicits
        with PostGISImplicits
        with PlayJsonImplicits
        with NetImplicits
        with LTreeImplicits
        with RangeImplicits
        with HStoreImplicits
        with SearchImplicits
        with SearchAssistants {}
    }
    object MyPostgresDriver extends MyPostgresDriver

    Anything I'm doing wrong here?

    Thanks for your help!

    Pyry Kovanen
    ah, nevermind, the import com.vividsolutions.jts.geom._ was missing
    Kentaro Kuwata
    I would like to create a PostGIS table including Polygon geometry.
    Is it possible to define the table class like the following code?
    My IDE (IntelliJ) flags errors at <> and Geodummy.unapply
    import MyPostgresDriver.api._
    case class Geodummy(dummy_id: Int, geom: Polygon)
    class Geodummys(tag: Tag) extends Table[Geodummy](tag, "geodummy"){
      def dummy_id = column[Int]("dummy_id")
      def geom = column[Polygon]("geom")
      def * = (dummy_id, geom) <> (Geodummy.tupled, Geodummy.unapply)
    }
    hey everyone, just a noob question, I need the jsonb features of postgres
    i'm confused as to how to use the pgjson type
    i have a case class defining the table entity
    but what does the type of the json value need to be?
    case class MetadataEntity(id: Int, metadata: String, asset_id: String)
    metadata is supposed to be jsonb
    Idrees Khan
    @graffam I'm not sure I understand your question correctly, but I am using JsValue from play-json
    for jsonb. There are other possibilities listed here https://github.com/tminglei/slick-pg#configurable-typemappers
    I guess i'm having a hard time just using pure json strings
    trait CustomPostgresDriver extends PostgresProfile {
      def pgjson = "jsonb"
      override val api = CustomAPI
      object CustomAPI extends API
    }
    object CustomPostgresDriver extends CustomPostgresDriver
    I don't want to use a heavy json library since i'm not sure the remote client will have it, so i just want a string
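If a full JSON library is overkill, slick-pg's PgJsonSupport has a thin JsonString wrapper over a plain String that maps to json/jsonb — a sketch, with hypothetical profile and table names (the exact implicits trait may differ across versions):

```scala
import com.github.tminglei.slickpg._

// Minimal profile sketch: JsonString is slick-pg's String wrapper for json/jsonb.
trait StringJsonProfile extends ExPostgresProfile with PgJsonSupport {
  def pgjson = "jsonb"
  override val api = StringJsonAPI
  object StringJsonAPI extends API with SimpleJsonImplicits
}
object StringJsonProfile extends StringJsonProfile

import StringJsonProfile.api._

// metadata is stored as jsonb but surfaces as JsonString(value: String),
// so no heavy JSON library is needed on the client side.
class MetadataTable(tag: Tag)
    extends Table[(Int, JsonString, String)](tag, "metadata") {
  def id       = column[Int]("id", O.PrimaryKey)
  def metadata = column[JsonString]("metadata")
  def assetId  = column[String]("asset_id")
  def *        = (id, metadata, assetId)
}
```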
    Dmitry Smirnov
    Hi everybody, I have a question about java.time.LocalDateTime support. When I compose my postgres profile with PgDate2Support I can't work with LocalDateTime, even if I write a custom mapper; but if I wrap it in my own custom class, it works.
      case class MyType(value: LocalDateTime)
      class TestTableWithOptMyType(tag: Tag)
        extends Table[Option[MyType]](tag, "TestTableOptMyType") {
        def optMyTypeColumn = column[Option[MyType]]("opt_my_type")
        def * = optMyTypeColumn
      }
    It works
      class TestTableWithOptDt(tag: Tag)
        extends Table[Option[LocalDateTime]](tag, "TestTableOptDt") {
        def optDtTypeColumn = column[Option[LocalDateTime]]("opt_dt_type")
        def * = optDtTypeColumn
      }
    it doesn't
    Here are my mappers:
        implicit def LocalDateTimeMapper: BaseColumnType[LocalDateTime] =
          MappedColumnType.base[LocalDateTime, Timestamp](
            x => Timestamp.valueOf(x),
            x => x.toLocalDateTime
          )
        implicit def MyTypeMapper: BaseColumnType[MyType] =
          MappedColumnType.base[MyType, Timestamp](
            x => Timestamp.valueOf(x.value),
            x => MyType(x.toLocalDateTime)
          )
    When I used plain LocalDateTime instead of Option[LocalDateTime], I had the same issue
    Dmitry Smirnov
    Oh, i'm sorry, my bad
    I solved the problem, there was a conflict between my implicits and DateTimeImplicits
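For anyone hitting the same thing: PgDate2Support's DateTimeImplicits already binds the java.time types, so a hand-written LocalDateTime mapper competes with it in implicit scope. The usual fix is to drop the custom mapper and rely on the profile — a sketch, assuming slick-pg's trait names:

```scala
import com.github.tminglei.slickpg._

// Sketch: DateTimeImplicits (from PgDate2Support) supplies the column
// types for LocalDateTime, LocalDate, etc., so no custom
// MappedColumnType is needed -- and none should be in implicit scope.
trait MyPgProfile extends ExPostgresProfile with PgDate2Support {
  override val api = MyAPI
  object MyAPI extends API with DateTimeImplicits
}
object MyPgProfile extends MyPgProfile
```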
    Dave Nicponski

    Hi all! Multi-part question for y'all.

    1) In a function where I have a scalar array parameter p of type Rep[List[X]], what's the "preferred" way of turning this into a Query[Rep[X], X, Seq]? It isn't just p.unnest, apparently, which has type Rep[X].
    Looking at examples, it looks like Query(true).map(_ => p.unnest), which does have the right signature (albeit is a little weird).
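The Query(true) trick can be wrapped once so call sites stay tidy — a hypothetical helper with a concrete element type, assuming a profile API with slick-pg's ArrayImplicits is imported so .unnest resolves:

```scala
// Hypothetical helper: lift a scalar array parameter into a one-column
// query by unnesting it inside a dummy single-row query.
def unnestedStrings(p: Rep[List[String]]): Query[Rep[String], String, Seq] =
  Query(true).map(_ => p.unnest)
```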

    2) Let's say I have 2 array parameters p1 and p2 of the same cardinality. I want a zip join of the unnested arrays: pairs (a, b) where a comes from p1 and b from p2, and the indices of a and b match. How should I do this? Hint: it's not

    Query(true).map(_ => p1.unnest).zip(Query(true).map(_ => p2.unnest))

    since this leads to rather surprising SQL behavior (at least on postgres) related to the ROW_NUMBER function being used for the zip join.
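One way to sidestep the ROW_NUMBER surprise is to push the zip into SQL: PostgreSQL's multi-argument unnest pairs arrays element-by-element (NULL-padding the shorter one). A plain-SQL sketch with concrete String arrays; the SetParameter for Seq[String] is assumed to come from slick-pg's plain-API array implicits:

```scala
// Multi-argument unnest zips by position, so no window function
// is involved in the pairing.
def zipArrays(p1: Seq[String], p2: Seq[String]) =
  sql"""
    select u.a, u.b
    from unnest($p1, $p2) as u(a, b)
  """.as[(String, String)]
```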

    Dave Nicponski
    ^^ anyone?