    Gavin Bisesi
    @Daenyth
    you could always use sql if you don't need to compose Query objects
    Pyry-Samuli Lahti
    @Pyppe
    Yeah, that would be my fallback. But in my use-case I’m joining many tables. I’d like to use a type-safe approach, if at all possible.
    Francesco Levorato
    @flevour
    Hi everyone!
    I'm trying to upgrade my project to slick 3.3.0, so I set slick-pg to 0.17.2, but unfortunately in slick-pg 0.17.1 a few deps, including json4s, were upgraded to their latest versions. Alas, json4s 3.6.x versions have a longstanding bug for case classes with type constructors (json4s/json4s#507). My project also depends on json4s and thus breaks when upgrading. Is there any way this dependency hell can be circumvented?
    Gavin Bisesi
    @Daenyth
    @flevour if slick-pg is using only things that are binary compatible with the old one, it should work to add an exclude to your build.sbt
    but also you might not need to bump slick-pg
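    (A minimal sketch of that exclude in build.sbt, assuming the clash is the transitive json4s pulled in by slick-pg; adjust the organization/module to whatever actually conflicts in your dependency tree:)

    // build.sbt (sketch): drop the json4s that slick-pg drags in transitively,
    // so the version this project declares directly is the only one on the classpath
    libraryDependencies += "com.github.tminglei" %% "slick-pg" % "0.17.2" excludeAll (
      ExclusionRule(organization = "org.json4s")
    )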
    Francesco Levorato
    @flevour
    @Daenyth even if a bit late, I want to send you a wholehearted thanks for pointing towards SBT's exclude directive! That's what I needed to get this sorted out. Thanks a lot!
    Gavin Bisesi
    @Daenyth
    np
    ochmist
    @ochmist
    hey guys, is there any good documentation on how to get started with slick-pg?
    If I use the sample on the README I get a lot of undefined types
    ochmist
    @ochmist

    import com.github.tminglei.slickpg._

    trait MyPostgresDriver extends ExPostgresProfile
      with PgArraySupport
      with PgDateSupportJoda
      with PgEnumSupport
      with PgRangeSupport
      with PgHStoreSupport
      with PgSearchSupport
      with PgPostGISSupport {

      override val api = new MyAPI {}

      trait MyAPI extends API
        with ArrayImplicits
        with DateTimeImplicits
        with RangeImplicits
        with HStoreImplicits
        with SearchImplicits
        with PostGISImplicits
        with SearchAssistants
    }

    object MyPostgresDriver extends MyPostgresDriver

    Basically, none of the PG types here are found.

    Gavin Bisesi
    @Daenyth
    Did you import MyPostgresDriver.api._ ?
    ochmist
    @ochmist
    No, I did not... but the api is defined in this file itself, no?
    Gavin Bisesi
    @Daenyth
    sure, but your db tables etc won't be
    queries, etc
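    (For illustration, a hedged sketch of that import in a file that defines tables; MyTable and its columns are made up:)

    // bring the custom profile's api into scope wherever tables/queries are defined,
    // instead of the stock slick.jdbc.PostgresProfile.api._
    import MyPostgresDriver.api._

    class MyTable(tag: Tag) extends Table[(Long, List[String])](tag, "my_table") {
      def id   = column[Long]("id", O.PrimaryKey, O.AutoInc)
      def tags = column[List[String]]("tags") // resolves because ArrayImplicits is part of the custom api
      def *    = (id, tags)
    }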
    ochmist
    @ochmist
    I see, I will import them in the tables.
    ochmist
    @ochmist
    hey guys, so I used the examples from GitHub and kinda got things to compile
    However, when actually querying, i run into a problem
    org.postgresql.util.PSQLException: ERROR: function st_dwithin(point, bytea, double precision) does not exist [info] Hint: No function matches the given name and argument types. You might need to add explicit type casts. [info] Position: 140
    Here is my call into slick-pg:
    def byDistance(point: Point, distance: Double): Future[Seq[House]] = db run {
      houses
        .filter(r => r.location.dWithin(point.bind, distance.bind))
        .result
    }
    Can anyone please help me figure out what is going on?
    ochmist
    @ochmist
    Another error I am getting when writing to the db is shown below. Can anyone please point out if I am doing anything wrong:
    due to: org.postgresql.util.PSQLException: ERROR: column "location" is of type point but expression is of type bytea [info] Hint: You will need to rewrite or cast the expression. [info] Position: 157
    Is my column type in the database not supposed to be of type point?
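    (Both errors read as if the database column is the built-in Postgres point type while the bound JTS value goes over the wire as bytea. A hedged sketch of the usual setup, assuming the profile from the README above and a PostGIS geometry column; the JTS package is com.vividsolutions.jts or org.locationtech.jts depending on which slick-pg PostGIS artifact you use:)

    import MyPostgresDriver.api._
    import com.vividsolutions.jts.geom.Point // or org.locationtech.jts.geom.Point

    // DDL side (outside Slick): declare the column as a PostGIS geometry,
    //   location geometry(Point, 4326)
    // rather than the native Postgres "point" type, so ST_DWithin resolves.
    class Houses(tag: Tag) extends Table[(Long, Point)](tag, "houses") {
      def id       = column[Long]("id", O.PrimaryKey, O.AutoInc)
      def location = column[Point]("location") // mapped by PostGISImplicits
      def *        = (id, location)
    }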
    Nader Ghanbari
    @naderghanbari
    Hi, thanks for creating this awesome project! I'm using it and everything works fine; just a minor question. Is there a way to get ST_Distance(geom1, geom2, spheroid=true) with slick-pg? The distance function does not support the boolean flag.
    def distance[P2, R](geom: Rep[P2])(implicit om: o#to[Float, R]) = {
      om.column(GeomLibrary.Distance, n, geom.toNode)
    }
    I couldn't find any GH issue related to this. If this is the case, would a PR be welcome?
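    (Until the flag is supported, one hedged workaround is to declare the three-argument form yourself via SimpleFunction; whether the three-argument ST_Distance applies to your column types is a Postgres question, and the names below are illustrative:)

    import MyPostgresDriver.api._
    import com.vividsolutions.jts.geom.Geometry // or org.locationtech.jts.geom.Geometry

    // sketch: expose ST_Distance(g1, g2, use_spheroid) as a custom SQL function
    val stDistanceSpheroid =
      SimpleFunction.ternary[Geometry, Geometry, Boolean, Float]("ST_Distance")

    // hypothetical usage: places.map(p => stDistanceSpheroid(p.geom, other.bind, true.bind))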
    Rutvik Patel
    @heyrutvik
    Hey folks, I need your help with tsquery/tsvector. My question is related more to PostgreSQL than to slick-pg, but I assume you folks can help me with that as well. :)
    Rather than repeating myself here, I'll post my tweet https://twitter.com/heyrutvik/status/1182521309991002112 which describes the problem of using <-> with a hyphenated string in the to_tsquery function.
    Please take a look, thanks!
    Jason Cohen
    @jdcohen220
    does the makePoint() function use lat,lon or lon,lat ordering of the coordinates?
    Yakov
    @yabushraber
    hello, is there a way I can use PostGIS ST_Extent?
    slava
    @slmzig_gitlab
    hi, can I do a query like this with slick?
    drop table if exists orders cascade;
    CREATE TABLE orders (
      ID serial NOT NULL PRIMARY KEY,
      info json NOT NULL
    );
    
    INSERT INTO orders (info)
    VALUES
    ('{"interestedIn":[11],"countries":["IT", "UK"]}'),
    ('{"interestedIn":[12],"countries":["US", "UK"]}'),
    ('{"interestedIn":[1,2,3],"countries":["UK"]}'),
    ('{"interestedIn":[14,15,16],"countries":["DE", "UK"]}');
    
    
    select *
    from orders
    where info::jsonb -> 'countries' ? 'US'
    ;
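    (If composing that with lifted queries gets awkward, a plain-SQL fallback keeps the jsonb operator logic intact, as suggested earlier in the channel; a sketch, assuming your profile's api._ is imported for the sql interpolator:)

    // jsonb_exists(x, 'US') is the function behind the ? operator; using it avoids
    // the clash between ? and JDBC bind-parameter placeholders
    val usOrders =
      sql"""
        select id, info
        from orders
        where jsonb_exists(info::jsonb -> 'countries', 'US')
      """.as[(Int, String)]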
    Jethro Gillgren
    @jethrogillgren

    Hi All, we're trying to get upserts working with Slick, and we're trying to use slick-pg to enable this, but we can't find how to access the new methods. We created our own profile that extends ExPostgresProfile, and it looks like it's configured OK in application.conf. We replaced import profile.api._ with import MySlickPostgresProfile.api._

    However, our methods that use testtable.insertOrUpdate still seem to use the old version, and .insertOrUpdateAll isn't found. Is there some weird implicit Scala thing we need to do?
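    (For what it's worth, a hedged sketch of the piece that is usually missing for native upserts, the same capability override that appears in a later snippet in this channel; whether .insertOrUpdateAll exists additionally depends on your Slick version:)

    import com.github.tminglei.slickpg._
    import slick.basic.Capability
    import slick.jdbc.JdbcCapabilities

    trait MySlickPostgresProfile extends ExPostgresProfile {
      // re-add capabilities.insertOrUpdate so Slick emits native
      // INSERT ... ON CONFLICT upserts on Postgres 9.5+
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + JdbcCapabilities.insertOrUpdate
    }
    object MySlickPostgresProfile extends MySlickPostgresProfile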

    Ivano Pagano
    @ivanopagano
    Hello to everyone here. I'm trying to check if anyone can confirm that generating Slick's database model with the codegen tool might fail to compile when using custom postgres [slick-pg-supported] types.
    I was thinking of defining a custom sbt task using a modified version of the generator that would be based on the pg-driver class... but I'm not yet convinced that this would solve the issue, or if the issue is even related to that...
    To give some more context: trying to convert a char varying column to an array of char varying resulted in a compilation loop when including the generated slick Tables.scala file.
    Any help or suggestion is highly appreciated, thank you folks
    Ivano Pagano
    @ivanopagano
    I'm trying to understand how the custom postgres driver generates array columns in the schema generation. If I declare a schema file and use the driver as-is with the standard generator, I end up with a (nullable) column type of Option[scala.collection.Seq], which is a type constructor and not a type, hence the compilation fails.
    If I override the generator to define any array of text/varchar/... as an Option[List[String]] - which seems consistent with the array column declarations for slick-pg, as of https://github.com/tminglei/slick-pg#configurable-typemappers - then the generated code comes out as expected, but the compiler hangs in some non-terminating loop when trying to compile the project.
    Did anyone experience a similar issue?
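    (A hedged sketch of the kind of generator override meant above, using the documented SourceCodeGenerator customization hooks; the string matched against model.tpe is an assumption, so check what the default generator actually reports for your array columns:)

    import slick.codegen.SourceCodeGenerator
    import slick.model.Model

    class ArrayAwareCodegen(model: Model) extends SourceCodeGenerator(model) {
      override def Table = new Table(_) {
        override def Column = new Column(_) {
          // rewrite the Scala type emitted for text[] columns to a concrete List[String]
          override def rawType: String =
            if (model.tpe.startsWith("scala.collection.Seq")) "List[String]"
            else super.rawType
        }
      }
    }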
    EatPray Bong
    @EatPrayBong_gitlab

    I am new to Slick and I am trying to convert a SQL query to Slick; I need some help with it.

    My table is:

    
        final case class TableExp(
          id: String,
          key: String,
          value: String
        )

        final class TableExps(tag: Tag)(implicit val schema: String)
          extends Table[TableExp](tag, Some(schema), "table") {
          def id = column[String]("id")

          def key = column[String]("key")

          def value = column[String]("value")

          def * =
            (id, key, value).shaped <> (TableExp.tupled, TableExp.unapply)

          def getData(id: String) =
            TableQuery[TableExps].filter(res => res.id === id || res.id === "id1").map(_.key).distinct
        }

    And the sql query is:

    select distinct key, first_value(value) over (partition by key order by case when t.id=$id then 0 else 1 end) value from table t where t.id in ($id, 'id1')

    So far I have tried this:

    
         def getData(id: String) =
           TableQuery[TableExps]
             .filter(res => res.id === id || res.id === "id1")
             .groupBy(_.key)
             .map { case (key, group) => Over.partitionBy(group.map(_.key)) }
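    (Lifted Slick has no first_value built in, so one hedged option is to keep this particular query in plain SQL via the sql interpolator, assuming your profile's api._ is in scope; slick-pg's window-function support is another route, but its API differs from groupBy:)

    def getData(id: String) =
      sql"""
        select distinct key,
               first_value(value) over (partition by key
                                        order by case when t.id = $id then 0 else 1 end) as value
        from "table" t
        where t.id in ($id, 'id1')
      """.as[(String, String)]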
    Altern Egro
    @alternegro
    @tminglei have you tested slick-pg with Play 2.8.2? Play is unable to load the custom Postgres Profile configured in application.conf. tminglei/slick-pg#484
    Has anyone here been successful with 0.19 on Play 2.8.2?
    Douglas Bett
    @bettdouglas

    hi, has anyone ever managed to create a BTree index like below:

    def nameIndex = index("name_idx", name, unique=true)
    def compoundIndex = index("c_idx", (name, age), unique=true)

    but using a BTree index?
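    (One hedged note: Postgres builds btree indexes by default, so the index(...) definitions above should already come out as btree. To spell out the access method, or pick a different one, a plain-SQL DDL statement is an option; the table and column names here are made up:)

    val createNameIdx =
      sqlu"""create unique index if not exists "name_idx" on "users" using btree ("name")"""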

    Michael Ahlers
    @michaelahlers
    Is it possible to make a Slick expression using arrayElements (representing the Postgres function jsonb_array_elements) that creates a JOIN?
    Crudely illustrated:
    CREATE TABLE mytable(myelements JSONB DEFAULT '[]'::JSONB);
    SELECT myelement
    FROM mytable,
         jsonb_array_elements(mytable.myelements) myelement;
    Marek Kidon
    @kidonm
    Hello, recently I stumbled on what I think is a bug in escaping quotes in plain array serialization. The quotes are escaped twice. @tminglei PTAL tminglei/slick-pg#492
    Henri Cook
    @henricook
    Hi everyone, I was really happy to find slick-pg today to give me JsValue support with postgres on my Scala project. Thanks for such a great library.
    Yingpeng Xu
    @ypxu
    Has anyone used the set function from PgJsonExtensions.scala with circe? I'm trying to use it inside a compiled query, but I'm getting "cannot be compiled" errors.
    tgampiyush
    @tgampiyush

    Hi all, I am getting this very weird exception when I added

    val slickPgCirciJson = "com.github.tminglei" %% "slick-pg_circe-json" % "0.19.3"
    [error] java.lang.NoSuchMethodError: com.github.tminglei.slickpg.array.PgArrayJdbcTypes$SimpleArrayJdbcType.to(Lscala/Function1;Lscala/reflect/ClassTag;)Lslick/jdbc/JdbcTypesComponent$DriverJdbcType;

    and this is how my database class looks:

    trait DatabaseApi extends ExPostgresProfile
      with PgArraySupport
      with PgDate2Support
      with PgRangeSupport
      with PgHStoreSupport
      with PgCirceJsonSupport
      with PgSearchSupport
      with PgPostGISExtensions
      with PgNetSupport
      with PgLTreeSupport {
    
      override val api = PostgresAPI
    
      object PostgresAPI
        extends API
          with ArrayImplicits
          with DateTimeImplicits
          with JsonImplicits
          with NetImplicits
          with LTreeImplicits
          with RangeImplicits
          with HStoreImplicits
          with SearchImplicits
          with SearchAssistants
          with CirceImplicits {
    
        val coalesce: (Rep[Option[Long]], Rep[Option[Long]]) => Rep[Option[Long]] =
          SimpleFunction.binary[Option[Long], Option[Long], Option[Long]](
            "coalesce")
    
        implicit val strListTypeMapper: DriverJdbcType[List[String]] =
          new SimpleArrayJdbcType[String]("text")
            .to(_.toList)
    
        def pgIntervalStr2Interval(intervalStr: String): Interval =
          Interval.fromPgInterval(new PGInterval(intervalStr))
    
      }
    }
    object DatabaseApi extends DatabaseApi {
      def pgjson = "jsonb" // jsonb support is in postgres 9.4.0 onward; for 9.3.x use "json"
    
      // Add back capabilities.insertOrUpdate to enable native upsert support; for postgres 9.5+
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + JdbcCapabilities.insertOrUpdate
    
    }
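    (A NoSuchMethodError like that usually means two different slick-pg binary versions ended up on the classpath, e.g. slick-pg_circe-json 0.19.3 running against an older slick-pg core; a hedged sketch of keeping them aligned in build.sbt:)

    // build.sbt (sketch): keep the core artifact and every slick-pg addon on one version
    val slickPgVersion = "0.19.3"
    libraryDependencies ++= Seq(
      "com.github.tminglei" %% "slick-pg"            % slickPgVersion,
      "com.github.tminglei" %% "slick-pg_circe-json" % slickPgVersion
    )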
    Zacharie Silverstein
    @ihasfrozen

    :wave: - I ran into an issue with \u0000 characters not being stripped out when using the default PgJsonSupport.

    I noticed that the serde-specific artifacts (play-json, circe, etc.) already strip this character out via JsonUtils.clean, would it make sense to add that same stripping logic to PgJsonSupport as well? I'd be happy to open a PR to make that change!

    LUC DUZAN
    @strokyl
    Hi everybody, I am struggling a lot to create a SetParameter of type SetParameter[Seq[(String, String)]]. Has anybody done that before?
    I think I need my "MyAPI" object to extend the trait PgCompositeSupport and do something with createCompositeSetParameter, but:
    • I am not sure that's the correct path
    • PgCompositeSupport needs my MyAPI object to implement PostgresProfile, but this one already implements ExPostgresProfile.API, which conflicts with PostgresProfile:
    import com.github.tminglei.slickpg._
    import com.github.tminglei.slickpg.window.PgWindowFuncSupport
    import slick.basic.Capability
    import slick.jdbc.JdbcType
    import slick.lifted.ExtensionMethodConversions
    import spray.json.{DefaultJsonProtocol, JsValue, _}
    
    import java.sql.Timestamp
    import java.time.ZonedDateTime
    
    trait PlusSlick extends ExPostgresProfile
      with DefaultJsonProtocol
      with PgArraySupport
      with PgDate2Support
      with PgRangeSupport
      with PgHStoreSupport
      with PgSprayJsonSupport
      with PgSearchSupport
      with PgWindowFuncSupport
      with PgNetSupport
      with PgPosixSupport
      with PgLTreeSupport {
      def pgjson = "jsonb" // jsonb support is in postgres 9.4.0 onward; for 9.3.x use "json"
    
      // Add back `capabilities.insertOrUpdate` to enable native `upsert` support; for postgres 9.5+
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + slick.jdbc.JdbcCapabilities.insertOrUpdate
    
      override val api = MyAPI
    
      object MyAPI extends API
        with DateTimeImplicits
        with ArrayImplicits
        with JsonImplicits
        with NetImplicits
        with LTreeImplicits
        with RangeImplicits
        with SprayJsonPlainImplicits
        with HStoreImplicits
        with SearchImplicits
        with ExtensionMethodConversions
        with WindowFunctions
        with Date2DateTimePlainImplicits
        with PgPosixImplicits
        with SimpleArrayPlainImplicits
        with SearchAssistants {
    
        // weirdly, that works fine
        implicit val simpleZonedDateTimeListTypeMapper: JdbcType[List[ZonedDateTime]] =
          new SimpleArrayJdbcType[Timestamp]("timestamp with time zone").to(_.toList).asInstanceOf[JdbcType[List[ZonedDateTime]]]
    
        implicit val strListTypeMapper = new SimpleArrayJdbcType[String]("text").to(_.toList)
        implicit val playJsonArrayTypeMapper =
          new AdvancedArrayJdbcType[JsValue](pgjson,
            (s) => utils.SimpleArrayUtils.fromString[JsValue](_.parseJson)(s).orNull,
            (v) => utils.SimpleArrayUtils.mkString[JsValue](_.toString())(v)
          ).to(_.toList)
      }
    
    }
    
    object PlusSlick extends PlusSlick
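    (If each pair only needs to fill its own pair of placeholders in a plain-SQL statement, a hand-rolled SetParameter is enough and avoids PgCompositeSupport entirely; a minimal sketch, which does not build a Postgres composite or array value:)

    import slick.jdbc.{PositionedParameters, SetParameter}

    // writes each (String, String) pair into two consecutive ? placeholders
    implicit val setStringPairs: SetParameter[Seq[(String, String)]] =
      SetParameter[Seq[(String, String)]] { (pairs, pp: PositionedParameters) =>
        pairs.foreach { case (a, b) =>
          pp.setString(a)
          pp.setString(b)
        }
      }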