    ochmist
    @ochmist
    Hey guys, so I used the examples from GitHub and kind of got things to compile.
    However, when actually querying, I run into a problem:
    org.postgresql.util.PSQLException: ERROR: function st_dwithin(point, bytea, double precision) does not exist [info] Hint: No function matches the given name and argument types. You might need to add explicit type casts. [info] Position: 140
    Here is my call into slick-pg:
    def byDistance(point: Point, distance: Double): Future[Seq[House]] = db run {
      houses
        .filter(r => r.location.dWithin(point.bind, distance.bind))
        .result
        .map(t => t)
    }
    Can anyone please help me figure out what is going on?
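    For reference, that error usually means the location column was created as the built-in postgres point type rather than a PostGIS geometry, so ST_DWithin can't resolve. A minimal sketch of a setup where dWithin does resolve, loosely following the slick-pg README's PostGIS addon (the profile, table, and SRID below are assumptions, and the jts package differs between the slick-pg_jts and slick-pg_jts_lt addons):

    import com.github.tminglei.slickpg._
    import com.vividsolutions.jts.geom.Point

    // Hypothetical profile mixing in the PostGIS support (slick-pg_jts addon).
    trait MyPostgresProfile extends ExPostgresProfile with PgPostGISSupport {
      override val api = MyAPI
      object MyAPI extends API with PostGISImplicits with PostGISAssistants
    }
    object MyPostgresProfile extends MyPostgresProfile

    import MyPostgresProfile.api._

    // The column must be a PostGIS geometry, not the built-in postgres `point` type.
    class Houses(tag: Tag) extends Table[(Long, Point)](tag, "houses") {
      def id       = column[Long]("id", O.PrimaryKey, O.AutoInc)
      def location = column[Point]("location", O.SqlType("geometry(Point, 4326)"))
      def *        = (id, location)
    }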
    ochmist
    @ochmist
    Another error I am getting when writing to the db is shown below. Can anyone please point out if I am doing anything wrong:
    due to: org.postgresql.util.PSQLException: ERROR: column "location" is of type point but expression is of type bytea [info] Hint: You will need to rewrite or cast the expression. [info] Position: 157
    Is my column type in the database not supposed to be of type point?
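    That second error points at the same root cause: the column was created as the postgres point type, while the slick-pg mapper sends a PostGIS geometry value, so the column should be geometry. A hedged migration sketch (table/column names and the SRID are assumptions):

    // Plain-SQL sketch: convert the built-in `point` column into a PostGIS geometry.
    // Assumes SRID 4326 and that the profile api (for `sqlu`) is in scope.
    val convertLocation = sqlu"""
      ALTER TABLE houses
        ALTER COLUMN location TYPE geometry(Point, 4326)
        USING ST_SetSRID(ST_MakePoint(location[0], location[1]), 4326)
    """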
    Nader Ghanbari
    @naderghanbari
    Hi, thanks for creating this awesome project! I'm using it and everything works fine; just a minor question: is there a way to get ST_Distance(geom1, geom2, spheroid=true) with slick-pg? The distance function does not support the boolean flag.
    def distance[P2, R](geom: Rep[P2])(implicit om: o#to[Float, R]) = {
      om.column(GeomLibrary.Distance, n, geom.toNode)
    }
    I couldn't find any GH issue related to this. If this is the case, is a PR welcome?
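    Until something like that lands, a hedged workaround is to bind the three-argument form yourself with Slick's SimpleFunction, bypassing slick-pg's GeomLibrary (names below are placeholders; the jts package depends on which addon you use):

    import slick.lifted.SimpleFunction
    import com.vividsolutions.jts.geom.Geometry
    import MyPostgresProfile.api._ // placeholder: your custom profile, for the column types

    // Hypothetical binding of ST_Distance(geom, geom, use_spheroid).
    val stDistanceSpheroid =
      SimpleFunction.ternary[Geometry, Geometry, Boolean, Double]("ST_Distance")

    // Usage sketch inside a query (`houses` and `point` are assumed to exist):
    // houses.map(h => stDistanceSpheroid(h.location, point.bind, true.bind))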
    Rutvik Patel
    @heyrutvik
    Hey folks, I need your help with tsquery/tsvector. My question is more related to PostgreSQL than to slick-pg, but I assume you folks can help me with that as well. :)
    Rather than repeating myself here, I'll post my tweet, https://twitter.com/heyrutvik/status/1182521309991002112, which describes the problem of using <-> with a hyphenated string in the to_tsquery function.
    Please take a look, thanks!
    jdcohen220
    @jdcohen220
    does the makePoint() function use lat,lon or lon,lat ordering of the coordinates?
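    If makePoint maps straight onto PostGIS's ST_MakePoint (which takes x first), then the ordering is (lon, lat). A tiny sketch under that assumption:

    // PostGIS ST_MakePoint(x, y): x is longitude, y is latitude, so lon comes first.
    // Assumes the PostGIS assistants from your profile's api import are in scope.
    def locationOf(lon: Double, lat: Double) = makePoint(lon.bind, lat.bind)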
    Yakov
    @yabushraber
    Hello, is there a way to use PostGIS's ST_Extent?
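    ST_Extent is an aggregate, which the lifted embedding doesn't cover out of the box; one hedged fallback is plain SQL (table/column names here are made up, and the box2d result is cast to text):

    // Sketch: call the ST_Extent aggregate via plain SQL.
    // Assumes the profile api (for the sql interpolator) is imported.
    val extent = sql"SELECT ST_Extent(location)::text FROM houses".as[Option[String]].head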
    slava
    @slmzig_gitlab
    Hi, can I do a query like this with Slick?
    drop table if exists orders cascade;
    CREATE TABLE orders (
      ID serial NOT NULL PRIMARY KEY,
      info json NOT NULL
    );
    
    INSERT INTO orders (info)
    VALUES
    ('{"interestedIn":[11],"countries":["IT", "UK"]}'),
    ('{"interestedIn":[12],"countries":["US", "UK"]}'),
    ('{"interestedIn":[1,2,3],"countries":["UK"]}'),
    ('{"interestedIn":[14,15,16],"countries":["DE", "UK"]}');
    
    
    select *
    from orders
    where info::jsonb -> 'countries' ? 'US'
    ;
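    A hedged plain-SQL way to run that last query from Slick: jsonb_exists is the function behind the ? operator, which conveniently avoids a literal ? being mistaken for a JDBC placeholder (the row class and names below just mirror the DDL above):

    import slick.jdbc.GetResult

    case class OrderRow(id: Int, info: String)
    implicit val getOrderRow: GetResult[OrderRow] = GetResult(r => OrderRow(r.<<, r.<<))

    // jsonb_exists(js, key) is the function form of the `?` operator.
    // Assumes the profile api (for the sql interpolator) is imported.
    def ordersForCountry(country: String) =
      sql"""
        SELECT id, info
        FROM orders
        WHERE jsonb_exists(info::jsonb -> 'countries', $country)
      """.as[OrderRow]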
    Jethro Gillgren
    @jethrogillgren

    Hi all, we're trying to get upserts working with Slick. We're trying to use slick-pg to enable this, but we can't find how to access the new methods. We created our own profile that extends ExPostgresProfile, and it looks like it's configured OK in application.conf. We replaced import profile.api._ with import MySlickPostgresProfile.api._

    However, our methods that use testtable.insertOrUpdate still seem to use the old version, and .insertOrUpdateAll isn't found. Is there some weird implicit Scala thing we need to do?
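    The usual missing piece is adding the insertOrUpdate capability back in the custom profile, per the slick-pg README (the profile name here is a placeholder):

    import com.github.tminglei.slickpg._

    trait MySlickPostgresProfile extends ExPostgresProfile {
      // Re-enable native upserts (INSERT ... ON CONFLICT, PostgreSQL 9.5+).
      override protected def computeCapabilities: Set[slick.basic.Capability] =
        super.computeCapabilities + slick.jdbc.JdbcCapabilities.insertOrUpdate
    }
    object MySlickPostgresProfile extends MySlickPostgresProfile

    Also worth checking: insertOrUpdateAll only appeared in newer Slick releases (3.3.3, if I remember right), and everything has to import this profile's api rather than the stock one.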

    Ivano Pagano
    @ivanopagano
    Hello everyone, I'm trying to check whether anyone can confirm that generating Slick's database model with the codegen tool can fail to compile when using custom postgres (slick-pg-supported) types.
    I was thinking of defining a custom sbt task using a modified version of the generator, based on the pg-driver class... but I'm not yet convinced that this would solve the issue, or that the issue is even related to that...
    To give some more context: trying to convert a char varying column to an array of char varying resulted in a compilation loop when including the generated Slick Tables.scala file.
    Any help or suggestion is highly appreciated, thank you folks
    Ivano Pagano
    @ivanopagano
    I'm trying to understand how the custom postgres driver generates array columns in the schema generation. If I declare a schema file and use the driver as-is with the standard generator, I end up with a (nullable) column type of Option[scala.collection.Seq], which is a type constructor and not a type, so compilation fails.
    If I override the generator to define any array of text/varchar/... as an Option[List[String]] - which seems consistent with the array column declarations for slick-pg, as per https://github.com/tminglei/slick-pg#configurable-typemappers - then the generated code comes out as expected, but the compiler hangs in a non-terminating loop when compiling the project.
    Did anyone experience a similar issue?
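    For the generator side, the customization pattern from the Slick codegen docs can force array columns onto a concrete type; a hedged sketch (the blanket List[String] mapping is an assumption that every array column is text[]/varchar[]):

    import slick.codegen.SourceCodeGenerator
    import slick.{model => m}

    // Sketch: map array columns, which the default model surfaces as the bare
    // type constructor "scala.collection.Seq", onto a concrete Scala type.
    class MyCodeGenerator(model: m.Model) extends SourceCodeGenerator(model) {
      override def Table = new Table(_) {
        override def Column = new Column(_) {
          override def rawType = model.tpe match {
            case "scala.collection.Seq" => "List[String]" // crude: assumes text[]/varchar[]
            case _                      => super.rawType
          }
        }
      }
    }

    Whether that also gets rid of the non-terminating compile is a separate question, though.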
    EatPray Bong
    @EatPrayBong_gitlab

    I am new to Slick, and I am trying to convert a SQL query to Slick; I need some help with it.

    My table is:

    
        // Row type; it needs a different name than the Table class below.
        final case class TableExpRow(
          id: String,
          key: String,
          value: String
        )

        final class TableExp(tag: Tag)(implicit val schema: String)
          extends Table[TableExpRow](tag, Some(schema), "table") {

          def id = column[String]("id")

          def key = column[String]("key")

          def value = column[String]("value")

          def * =
            (id, key, value).shaped <> (TableExpRow.tupled, TableExpRow.unapply)

          def getData(id: String) =
            TableQuery[TableExp].filter(res => res.id === id || res.id === "id1").map(_.key).distinct
        }

    And the sql query is:

    select distinct key, first_value(value) over (partition by key order by case when t.id = $id then 0 else 1 end) value
    from table t
    where t.id in ($id, 'id1')

    So far I have tried this:

    
         def getData(id: String) = {
           TableQuery[TableExp]
             .filter(res => res.id === id || res.id === "id1")
             .groupBy(_.key)
             .map {
               case (a, b) => Over.partitionBy(b.map(_.key))
             }
         }
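    groupBy won't produce a window function; slick-pg does ship window-function support worth a look, but a hedged fallback that matches the SQL above is plain SQL (this assumes the literal "table" name with no schema prefix, and the id/key/value columns from your definition):

    import slick.jdbc.GetResult

    case class KeyValue(key: String, value: String)
    implicit val getKeyValue: GetResult[KeyValue] = GetResult(r => KeyValue(r.<<, r.<<))

    // Plain-SQL sketch of the first_value window query; $id is bound as a parameter.
    def getData(id: String) =
      sql"""
        SELECT DISTINCT key,
               first_value(value) OVER (
                 PARTITION BY key
                 ORDER BY CASE WHEN id = $id THEN 0 ELSE 1 END
               ) AS value
        FROM "table"
        WHERE id IN ($id, 'id1')
      """.as[KeyValue]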
    Altern Egro
    @alternegro
    @tminglei have you tested slick-pg with Play 2.8.2? Play is unable to load the custom Postgres Profile configured in application.conf. tminglei/slick-pg#484
    Has anyone here been successful with 0.19 on Play 2.8.2?
    Douglas Bett
    @bettdouglas

    Hi, has anyone ever managed to create a BTree index like below:

    def nameIndex = index("name_idx", name, unique=true)
    def compoundIndex = index("c_idx", (name, age), unique=true)

    but using a BTree index?
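    On PostgreSQL a plain CREATE INDEX is already a btree (btree is the default access method), and Slick's index(...) doesn't expose USING; if you want it spelled out anyway, a hedged plain-SQL sketch (the users table name is made up):

    // Slick's `index` DDL can't specify the access method; plain SQL can.
    // Assumes the profile api (for `sqlu`) is imported; names are placeholders.
    val createNameIdx = sqlu"""CREATE UNIQUE INDEX IF NOT EXISTS name_idx ON users USING btree (name)"""
    val createCompoundIdx = sqlu"""CREATE UNIQUE INDEX IF NOT EXISTS c_idx ON users USING btree (name, age)"""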

    Michael Ahlers
    @michaelahlers
    Is it possible to make a Slick expression using arrayElements (representing the Postgres function jsonb_array_elements) that creates a JOIN?
    Crudely illustrated:
    CREATE TABLE mytable(myelements JSONB DEFAULT '[]'::JSONB);
    SELECT myelement
    FROM mytable,
         jsonb_array_elements(mytable.myelements) myelement;
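    The lifted embedding doesn't really express that implicit lateral join, so a hedged fallback for this particular query is plain SQL (names follow your example; each row comes back as one array element rendered as text):

    // Plain-SQL sketch of the jsonb_array_elements "join".
    // Assumes the profile api (for the sql interpolator) is imported.
    val elements =
      sql"""
        SELECT myelement::text
        FROM mytable,
             jsonb_array_elements(mytable.myelements) AS myelement
      """.as[String]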
    Marek Kidon
    @kidonm
    Hello, recently I stumbled on what I think is a bug in escaping quotes in plain array serialization. The quotes are escaped twice. @tminglei PTAL tminglei/slick-pg#492
    Igor
    @ishubelko

    Hi all,
    I have a JSONB column mapped to Vector[Address]:

    case class Address(city: String, country: String)
    val addresses = column[Vector[Address]]("addresses", O.SqlType("JSONB"))

    I added these mappers:

    implicit val addressMapper: JdbcType[Address] with BaseTypedType[Address] = {
      val emptyAddress = Address("", "")
      MappedColumnType.base[Address, Json](
        addr => addr.asJson,
        json => json.as[Address].fold(_ => emptyAddress, identity)
      )
    }
    
    implicit val seqOfAddressesMapper: JdbcType[Vector[Address]] with BaseTypedType[Vector[Address]] =
      MappedColumnType.base[Vector[Address], Json](
        addresses => if (addresses.isEmpty) Json.arr() else addresses.asJson,
        json => json.as[Vector[Address]].fold(_ => Vector.empty, identity)
      )

    I'm looking to filter by city and country - something like filter(_.addresses +> "city" === "Zion").
    As tminglei suggested here: tminglei/slick-pg#486
    I added these extensions:

    implicit def AddressJsonColumnExtensionMethods(c: Rep[Address]): JsonColumnExtensionMethods[Address, Address] = {
      new JsonColumnExtensionMethods[Address, Address](c)
    }
    implicit def AddressJsonOptionColumnExtensionMethods(c: Rep[Vector[Address]]): JsonColumnExtensionMethods[Address, Vector[Address]] = {
      new JsonColumnExtensionMethods[Address, Vector[Address]](c)
    }

    but it complains: No implicit found for parameter om: OptionMapper2[Address, String, Address, Vector[Address], String, R_]
    Has anyone here tried something like this?

    Igor
    @ishubelko
    Is it even possible? @tminglei, can you please help?
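    One pattern that sidesteps the OptionMapper complaint is to keep the Vector[Address] mapping for reads/writes and add a second, Json-typed definition of the same column purely for querying, then filter with jsonb containment. A hedged sketch, assuming circe's Json, the @> extension method from the profile's JSON support, and a made-up Users table:

    import io.circe.Json

    class Users(tag: Tag) extends Table[(Long, Vector[Address])](tag, "users") {
      def id        = column[Long]("id", O.PrimaryKey, O.AutoInc)
      def addresses = column[Vector[Address]]("addresses", O.SqlType("JSONB"))

      // Same DB column, typed as raw Json purely for filtering.
      def addressesJson = column[Json]("addresses", O.SqlType("JSONB"))

      def * = (id, addresses)
    }
    val users = TableQuery[Users]

    // Rows whose addresses array contains an object with the given city.
    // `@>` is assumed to come from the profile's circe JsonImplicits.
    def byCity(city: String) =
      users.filter(_.addressesJson @> Json.arr(Json.obj("city" -> Json.fromString(city))).bind)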