    Jason Cohen
    does the makePoint() function use lat,lon or lon,lat ordering of the coordinates?
    hello, is there a way I use postgis ST_Extent?
    hi can I do with slick such query
    drop table if exists orders cascade;
    CREATE TABLE orders (
      info json NOT NULL
    );
    INSERT INTO orders (info)
    VALUES
      ('{"interestedIn":[11],"countries":["IT", "UK"]}'),
      ('{"interestedIn":[12],"countries":["US", "UK"]}'),
      ('{"interestedIn":[14,15,16],"countries":["DE", "UK"]}');
    select *
    from orders
    where info::jsonb -> 'countries' ? 'US';
    Jethro Gillgren

    Hi All, we're trying to get upserts working with slick. Trying to use slick-pg to enable this, but we can't find how to access the new methods. We created our own profile that extends ExPostgresProfile, and it looks like it's configured OK in application.conf. We replaced import profile.api._ with import MySlickPostgresProfile.api._

    However, our methods that use testtable.insertOrUpdate still seem to use the old version, and .insertOrUpdateAll isn't found. Is there some weird implicit Scala thing we need to do?
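
    For what it's worth, a common cause of this is that the upsert capability is not enabled on the custom profile: a profile extending ExPostgresProfile usually has to add `JdbcCapabilities.insertOrUpdate` back explicitly. A minimal sketch, reusing the `MySlickPostgresProfile` name from the message above (everything else is the standard slick-pg README pattern, not your actual code):

    ```scala
    import com.github.tminglei.slickpg.ExPostgresProfile
    import slick.basic.Capability
    import slick.jdbc.JdbcCapabilities

    trait MySlickPostgresProfile extends ExPostgresProfile {
      // Re-enable native upsert (Postgres 9.5+); without this,
      // insertOrUpdate falls back to Slick's emulated behaviour.
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + JdbcCapabilities.insertOrUpdate
    }
    object MySlickPostgresProfile extends MySlickPostgresProfile
    ```

    Every file that builds queries then needs `import MySlickPostgresProfile.api._` instead of the stock profile's `api`.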

    Ivano Pagano
    Hello to everyone here, I'm trying to check if anyone can confirm that generating the Slick database model with the codegen tool might fail to compile when using custom postgres [slick-pg-supported] types.
    I was thinking of defining a custom sbt task using a modified version of the generator, based on the pg-driver class... but I'm not yet convinced that this would solve the issue, or that the issue is even related to that...
    To give some more context: trying to convert a char varying column to an array of char varying resulted in a compilation loop when including the generated Slick Tables.scala file.
    Any help or suggestion is highly appreciated, thank you folks
    Ivano Pagano
    I'm trying to understand how the custom postgres driver generates array columns in the schema generation. If I declare a schema file and use the driver as-is with the standard generator, I end up with a - nullable - column type of Option[scala.collection.Seq], which is a type constructor and not a type, hence compilation fails.
    If I override the generator to map any array of text/varchar/... to Option[List[String]] - which seems consistent with the array-column declarations for slick-pg, as of https://github.com/tminglei/slick-pg#configurable-typemappers - then the generated code comes out as expected, but the compiler hangs in a non-terminating loop when compiling the project.
    Did anyone experience a similar issue?
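
    In case it helps anyone hitting the same thing: one way to experiment with the mapping is to override the generator's per-column type. The `Table`/`Column`/`rawType` hooks below are real `slick.codegen.SourceCodeGenerator` extension points, but the mapping itself is only a sketch and would need to match your actual column types:

    ```scala
    import slick.codegen.SourceCodeGenerator
    import slick.model.Model

    // Sketch: emit a concrete element type for array columns instead of
    // the bare Seq type constructor the default generator produces.
    class ArrayAwareGenerator(model: Model) extends SourceCodeGenerator(model) {
      override def Table = new Table(_) {
        override def Column = new Column(_) {
          override def rawType: String =
            if (model.tpe == "scala.collection.Seq") "List[String]" // assumed: all arrays are text[]
            else super.rawType
        }
      }
    }
    ```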
    EatPray Bong

    I am new to Slick and I am trying to convert a SQL query to Slick; I'd appreciate help with the following.

    My table is:

        final case class TableExpRow(
            id: String,
            key: String,
            value: String
        )

        final class TableExp(tag: Tag)(implicit val schema: String)
          extends Table[TableExpRow](tag, Some(schema), "table") {
          def id = column[String]("id")
          def key = column[String]("key")
          def value = column[String]("value")
          def * =
            (id, key, value).shaped <> (TableExpRow.tupled, TableExpRow.unapply)
        }

        def getData(id: String) =
          TableQuery[TableExp].filter(res => res.id === id || res.id === "id1").map(_.key).distinct

    And the sql query is:

    select distinct key, first_value(value) over (partition by key order by case when t.id = $id then 0 else 1 end) as value from table t where t.id in ($id, 'id1')

    So far i have tried this:

         def getData(id: String) =
           TableQuery[TableExp]
             .filter(res => res.id === id || res.id === "id1")
             .groupBy(_.key)
             .map { case (key, group) => Over.partitionBy(group.map(_.key)) }
    Altern Egro
    @tminglei have you tested slick-pg with Play 2.8.2? Play is unable to load the custom Postgres Profile configured in application.conf. tminglei/slick-pg#484
    Has anyone here been successful with 0.19 on Play 2.8.2?
    Douglas Bett

    hi, has anyone ever managed to create a BTree index like below:

    def nameIndex = index("name_idx", name, unique=true)
    def compoundIndex = index("c_idx", (name, age), unique=true)

    but using a BTree index?
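
    As far as I know, Slick's index DSL has no USING clause, but Postgres uses btree as its default access method, so the `index(...)` definitions above should already produce btree indexes. If an explicit (or different) access method is ever needed, dropping to plain SQL is one option; the table name below is a placeholder:

    ```scala
    import MyProfile.api._  // hypothetical profile import

    // Explicit USING clause via a plain-SQL action; Postgres would
    // pick btree anyway if USING were omitted.
    val createIdx =
      sqlu"""CREATE UNIQUE INDEX name_idx ON "users" USING btree (name)"""
    ```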

    Michael Ahlers
    Is it possible to make a Slick expression using arrayElements (representing the Postgres function jsonb_array_elements) that creates a JOIN?
    Crudely illustrated:
    CREATE TABLE mytable(myelements JSONB DEFAULT '[]'::JSONB);
    SELECT myelement
    FROM mytable,
         jsonb_array_elements(mytable.myelements) myelement;
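    I don't think the lifted API can express that implicit lateral join through arrayElements alone; a plain-SQL fallback for the query above might look like this (untested sketch, result mapping simplified to String for illustration):

    ```scala
    import MyProfile.api._  // hypothetical profile import
    import slick.jdbc.GetResult

    // Read each unnested JSONB element back as its text representation.
    implicit val getElem: GetResult[String] = GetResult(_.nextString())

    val elements = sql"""
      SELECT myelement
      FROM mytable, jsonb_array_elements(mytable.myelements) myelement
    """.as[String]
    ```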
    Marek Kidon
    Hello, recently I stumbled on what I think is a bug in escaping quotes in plain array serialization. The quotes are escaped twice. @tminglei PTAL tminglei/slick-pg#492
    Henri Cook
    Hi everyone, I was really happy to find slick-pg today to give me JsValue support with postgres on my Scala project. Thanks for such a great library.
    Yingpeng Xu
    Has anyone used the set function from PgJsonExtensions.scala with circe? I'm trying to use it inside a compiled query, but I'm getting "cannot be compiled" errors.

    Hi @all, I am getting this very weird exception after adding

    val slickPgCirceJson = "com.github.tminglei" %% "slick-pg_circe-json" % "0.19.3"
    [error] java.lang.NoSuchMethodError: com.github.tminglei.slickpg.array.PgArrayJdbcTypes$SimpleArrayJdbcType.to(Lscala/Function1;Lscala/reflect/ClassTag;)Lslick/jdbc/JdbcTypesComponent$DriverJdbcType;

    and this is how my database class looks -

    import com.github.tminglei.slickpg._
    import slick.basic.Capability
    import slick.jdbc.JdbcCapabilities
    import org.postgresql.util.PGInterval

    trait DatabaseApi extends ExPostgresProfile
      with PgArraySupport
      with PgDate2Support
      with PgRangeSupport
      with PgHStoreSupport
      with PgCirceJsonSupport
      with PgSearchSupport
      with PgPostGISExtensions
      with PgNetSupport
      with PgLTreeSupport {
      override val api = PostgresAPI
      object PostgresAPI
        extends API
          with ArrayImplicits
          with DateTimeImplicits
          with JsonImplicits
          with NetImplicits
          with LTreeImplicits
          with RangeImplicits
          with HStoreImplicits
          with SearchImplicits
          with SearchAssistants
          with CirceImplicits {
        val coalesce: (Rep[Option[Long]], Rep[Option[Long]]) => Rep[Option[Long]] =
          SimpleFunction.binary[Option[Long], Option[Long], Option[Long]]("coalesce")
        implicit val strListTypeMapper: DriverJdbcType[List[String]] =
          new SimpleArrayJdbcType[String]("text").to(_.toList)
        def pgIntervalStr2Interval(intervalStr: String): Interval =
          Interval.fromPgInterval(new PGInterval(intervalStr))
      }
    }

    object DatabaseApi extends DatabaseApi {
      def pgjson = "jsonb" // jsonb support is in postgres 9.4.0 onward; for 9.3.x use "json"
      // Add back capabilities.insertOrUpdate to enable native upsert support; for postgres 9.5+
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + JdbcCapabilities.insertOrUpdate
    }
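
    A NoSuchMethodError like the one above usually indicates mismatched binary versions between slick-pg core and its add-on modules; it may be worth pinning them all to the same version. A build.sbt sketch, with the version number taken from the message above:

    ```scala
    // build.sbt fragment: keep slick-pg and its add-on modules in lockstep
    val slickPgVersion = "0.19.3"
    libraryDependencies ++= Seq(
      "com.github.tminglei" %% "slick-pg"            % slickPgVersion,
      "com.github.tminglei" %% "slick-pg_circe-json" % slickPgVersion
    )
    ```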
    Zacharie Silverstein

    :wave: - I ran into an issue with \u0000 characters not being stripped out when using the default PgJsonSupport.

    I noticed that the serde-specific artifacts (play-json, circe, etc.) already strip this character out via JsonUtils.clean, would it make sense to add that same stripping logic to PgJsonSupport as well? I'd be happy to open a PR to make that change!

    Hi everybody, I am struggling a lot to create a SetParameter of type SetParameter[Seq[(String, String)]]. Has anybody done that before?
    I think I need my MyAPI object to extend the trait PgCompositeSupport and do something with createCompositeSetParameter, but:
    • I am not sure that's the correct path
    • PgCompositeSupport needs my MyAPI object to extend PostgresProfile, but mine already extends ExPostgresProfile.API, which conflicts with PostgresProfile:
    import com.github.tminglei.slickpg._
    import com.github.tminglei.slickpg.window.PgWindowFuncSupport
    import slick.basic.Capability
    import slick.jdbc.JdbcType
    import slick.lifted.ExtensionMethodConversions
    import spray.json.{DefaultJsonProtocol, JsValue, _}
    import java.sql.Timestamp
    import java.time.ZonedDateTime
    trait PlusSlick extends ExPostgresProfile
      with DefaultJsonProtocol
      with PgArraySupport
      with PgDate2Support
      with PgRangeSupport
      with PgHStoreSupport
      with PgSprayJsonSupport
      with PgSearchSupport
      with PgWindowFuncSupport
      with PgNetSupport
      with PgPosixSupport
      with PgLTreeSupport {
      def pgjson = "jsonb" // jsonb support is in postgres 9.4.0 onward; for 9.3.x use "json"
      // Add back `capabilities.insertOrUpdate` to enable native `upsert` support; for postgres 9.5+
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + slick.jdbc.JdbcCapabilities.insertOrUpdate
      override val api = MyAPI
      object MyAPI extends API
        with DateTimeImplicits
        with ArrayImplicits
        with JsonImplicits
        with NetImplicits
        with LTreeImplicits
        with RangeImplicits
        with SprayJsonPlainImplicits
        with HStoreImplicits
        with SearchImplicits
        with ExtensionMethodConversions
        with WindowFunctions
        with Date2DateTimePlainImplicits
        with PgPosixImplicits
        with SimpleArrayPlainImplicits
        with SearchAssistants {
        // weirdly, that works fine
        implicit val simpleZonedDateTimeListTypeMapper: JdbcType[List[ZonedDateTime]] =
          new SimpleArrayJdbcType[Timestamp]("timestamp with time zone").to(_.toList).asInstanceOf[JdbcType[List[ZonedDateTime]]]
        implicit val strListTypeMapper = new SimpleArrayJdbcType[String]("text").to(_.toList)
        implicit val playJsonArrayTypeMapper =
          new AdvancedArrayJdbcType[JsValue](pgjson,
            (s) => utils.SimpleArrayUtils.fromString[JsValue](_.parseJson)(s).orNull,
            (v) => utils.SimpleArrayUtils.mkString[JsValue](_.toString())(v)
          ).to(_.toList)
      }
    }
    object PlusSlick extends PlusSlick
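
    If the composite-type route turns out to be a dead end, one simpler alternative (a hypothetical sketch, not the PgCompositeSupport approach) is a hand-rolled SetParameter that binds each pair as two consecutive placeholders, which is enough when the surrounding SQL has a matching number of `?` slots:

    ```scala
    import slick.jdbc.SetParameter

    // Binds 2 * pairs.length parameters in order; the SQL statement
    // must contain the same number of '?' placeholders.
    implicit val pairSeqSetter: SetParameter[Seq[(String, String)]] =
      SetParameter { (pairs, pp) =>
        pairs.foreach { case (a, b) =>
          pp.setString(a)
          pp.setString(b)
        }
      }
    ```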
    Andy Czerwonka
    if anyone is around, having trouble with could not find implicit value for parameter tt: slick.ast.TypedType[io.circe.Json]
    I think I'm pulling in the right deps, but apparently not
    Artie Pesh-Imam
    @andyczerwonka I ran into the same issue and found a test case that seemed relevant. I haven't actually tested this, but the implicit resolution works:
    trait GleanPostgresProfile extends PostgresProfile
      with PgCirceJsonSupport
      with array.PgArrayJdbcTypes {
      override val pgjson = "jsonb"
      override val api: API = new API {}
      trait API extends super.API with JsonImplicits {
        implicit val strListTypeMapper: DriverJdbcType[List[String]] =
          new SimpleArrayJdbcType[String]("text").to(_.toList)
      }
    }
    object GleanPostgresProfile extends GleanPostgresProfile