yanly007
@yanly007
So far I have tried to use pattern matching and dynamicQuery, but I get a rather hard-to-decipher error output:
    val q = quote {
      for {
        user <- dynamicQuerySchema[User]("user")

        _ <- dynamicQuerySchema[UserToContract]("userToContrat").join { utc =>
          {
            contractIdOpt match {
              case Some(id) => quote(utc.contractId == lift(id))
              case None     => quote(true)
            }
          } && utc.userId == user.id
        }

      } yield user
    }
rom1dep
@rom1dep:kde.org
[m]
@yanly007: would something like that work?
quote {
      query[User]
        .join(query[UserToContract])
        .on((u: User, utc: UserToContract) => utc.contractId == lift(id) && utc.userId == u.id)
        //.map { case (u, _) => u }
    }
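A possible direction, as an untested sketch: the dynamic-query API has filterOpt, which applies a predicate only when the Option is defined, so the pattern match on contractIdOpt is not needed. Names mirror the snippets above.

    // untested sketch: filterOpt adds the contractId predicate only when contractIdOpt is defined
    val usersWithContract =
      dynamicQuery[User].flatMap { user =>
        dynamicQuerySchema[UserToContract]("userToContrat")
          .filter(utc => quote(utc.userId == user.id))
          .filterOpt(contractIdOpt)((utc, id) => quote(utc.contractId == id))
          .map(_ => user)
      }

    run(usersWithContract)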
anjalibelani
@anjalibelani

Hi, I am using the quill-cassandra-zio library.
I have a simple query like this:

private def itemItemsetQuery(itemId: Long, catalog: String) = quote(
  querySchemaItemItemsets.filter(c => c.itemId == lift(itemId)).filter(_.catalog == lift(catalog)).map(_.itemsets)
)

run(itemItemsetQuery(itemId, catalog))

This compiles fine and works fine too but it gives me the below warning every time it is run:
Re-preparing already prepared query is generally an anti-pattern and will likely affect performance. Consider preparing the statement only once. Query='SELECT itemsets FROM item_itemsets WHERE item_id = ? AND catalog = ?

Does someone know what is wrong with my query? This is how the documentation suggests it should be done too, so I have not been able to fix this yet.

ElectricWound
@ElectricWound
What is the currently recommended way to integrate Postgres with Akka Streams? Since there are several ways to connect Quill to Postgres, I am a bit lost on which to choose and which solution is going to be the most future-proof, especially with a Scala 3 migration ahead. Monix, ZIO, JAsync, JDBC even, ...?
Mehmet
@mehmetcc
Is quill compatible with ZIO 2? I tried to do some experiments with both and it seems like quill heavily uses the now-deprecated ZManaged.
anjalibelani
@anjalibelani
Hi,
I have a set of integers that I need to append to using quill. This is the query(without quill):
update targeting_history set audiences=audiences+123 where userId='abc'
The quill query that I am using looks like this:
    querySchema[PcaidTargetingHistory](
      "pcaid_targeting_history",
      _.userId -> "pcaid",
      _.audiences -> "audiences"
    ).filter(c => c.userId == lift(userId)).update(d => d.audiences -> (d.audiences + 1))
However, this fails with the error: Tree 'd.audiences.+(1)' can't be parsed to 'Ast'
Can someone please help with the best way to update a collection (Set, List, etc.) using Quill?
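One untested sketch of a workaround: the parser rejects the + on the collection column, so the append can be kept opaque to it with a raw fragment (infix"..." here; it is sql"..." on Quill 4.x). The Set[Int] element type and the appended value are assumptions.

    val q = quote {
      querySchema[PcaidTargetingHistory](
        "pcaid_targeting_history",
        _.userId -> "pcaid",
        _.audiences -> "audiences"
      ).filter(c => c.userId == lift(userId))
        // raw CQL fragment so Quill does not try to parse "audiences + ..."
        .update(d => d.audiences -> infix"audiences + ${lift(Set(123))}".as[Set[Int]])
    }
    run(q)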
Luis Furnas
@luisfurnas-pma
Is there any way of removing the aliasing part in delete actions? It's not working well with MariaDB
Luis Furnas
@luisfurnas-pma

This here:

def delete(slug: String) = quote { query[Tenant].filter(t => t.slug contains lift(slug)).delete }

generates this query:

DELETE FROM tenant AS t WHERE t.slug = ?

when what I really want is:

DELETE FROM tenant WHERE slug = ?

Need some help here
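One possible workaround, as an untested sketch: issue the whole statement as a raw action so no alias is generated at all (sql"..." is the Quill 4.x interpolator; older versions use infix"...").

    def deleteBySlug(slug: String) = quote {
      // raw action: rendered as written, with the slug bound as a parameter
      sql"DELETE FROM tenant WHERE slug = ${lift(slug)}".as[Action[Tenant]]
    }

    run(deleteBySlug(slug))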

ldeck
@ldeck
Hi all, I'm new to quill. I'm wondering, for MySQL's onConflictUpdate, is it possible to replace all attributes except the PK without specifying every attribute in its own lambda?
Or is there a facility in quill to use REPLACE INTO? Thanks
yanly007
@yanly007

Hello
I use a tagged type for my UUIDs with quill, and the MappedEncodings I defined work well until the id field in question is optional.
These are the mappings I use:

  implicit def idEncoding[T]: MappedEncoding[Id[T], UUID] = MappedEncoding[Id[T], UUID](identity)

  implicit def idDecoding[T]: MappedEncoding[UUID, Id[T]] = MappedEncoding[UUID, Id[T]](Id[T])

When the id field in question is optional I get an error:

exception during macro expansion: 
scala.reflect.macros.TypecheckException: Can't find implicit `Decoder ...

It says it cannot find an implicit decoder for whatever Id tag I'm working with.

Alexander Ioffe
@deusaquilus
Hey Guys, again. I am not supporting this channel anymore. Please go to the zio-quill channel in https://discord.gg/MYErPQgkHH
Elmira Khodaie
@Elmiira_gitlab

Hi,
I have a simple DAO Class as follows:

class AnomalyDao(ctx: PostgresContext) {

  import ctx._

  def deleteTo( temp: LocalDateTime): Unit = run {
    query[Anomaly].filter(anomaly => anomaly.time_inserted.compareTo(lift(temp)) > 0).delete
  }
}

case class Anomaly(
    offset: Long,
    data: Json,
    time_inserted: LocalDateTime
)

When I want to insert a LocalDateTime value or select a SQL timestamp value with Quill, I receive this error:

Tree 'anomaly.time_inserted' can't be parsed to 'Ast'

First, I assumed that maybe the LocalDateTime custom/raw Decoder/Encoder had not been added, so I tried to define them explicitly:

  implicit val localDateTimeDecoder: Decoder[LocalDateTime] =
    decoder((index, row) => row.getTimestamp(index, Calendar.getInstance(dateTimeZone)).toLocalDateTime)

  implicit val localDateTimeEncoder: Encoder[LocalDateTime] =
    encoder(
      Types.TIMESTAMP,
      (index, value, row) => row.setTimestamp(index, Timestamp.valueOf(value), Calendar.getInstance(dateTimeZone))
    )

But then I got a conflicting-members error:

class PostgresContext inherits conflicting members:
[error] value localDateTimeDecoder in trait Decoders of type PostgresContext.this.Decoder[java.time.LocalDateTime]  and
[error]   value localDateTimeDecoder in trait PostgresDateSupport of type PostgresContext.this.Decoder[java.time.LocalDateTime]

So I realized that I have the appropriate decoder and encoder at compile time, but none of the following solutions could solve the "Tree can't be parsed to 'Ast'" problem:

I changed the DAO class to this version:

class AnomalyDao(ctx: PostgresContext) {

  import ctx._

  def deleteTo(to: Long): Unit = run {
    query[Anomaly].filter(anomaly => anomaly.time_inserted.toEpochSecond(ZoneOffset.UTC) >(lift(to))).delete
  }
}

case class Anomaly(
    offset: Long,
    data: Json,
    time_inserted: LocalDateTime
)

But the parsing error is still there. Any ideas?
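One hedged workaround sketch: the parser rejects method calls such as compareTo or toEpochSecond on the column, so the comparison can be kept opaque to it with a raw fragment (infix"..." here; sql"..." on Quill 4.x).

    def deleteTo(temp: LocalDateTime): Unit = run {
      query[Anomaly]
        // raw SQL comparison, so Quill only has to encode the lifted LocalDateTime
        .filter(anomaly => infix"${anomaly.time_inserted} > ${lift(temp)}".as[Boolean])
        .delete
    }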

qiudaozhang
@qiudaozhang
How do I deal with a database field named after a Scala keyword?
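Two common options, as an illustrative sketch (the column name "type", the case classes, and the table names are made up; assumes the context is imported): backtick-quote the field, or keep an idiomatic field name and map it with a schemaMeta.

    // Option 1: a Scala identifier in backticks can match the keyword-named column
    case class Event(id: Long, `type`: String)

    // Option 2: idiomatic field name, mapped to the column via schemaMeta
    case class AuditEvent(id: Long, kind: String)
    implicit val auditEventSchema = schemaMeta[AuditEvent]("audit_event", _.kind -> "type")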
greatmengqi
@greatmengqi
Why are the 'returning' and 'returningMany' examples in the quill docs the same?
image.png
Also, I cannot find the returningMany method.
Does anyone know the reason?
psoloveichik
@psoloveichik

Hi, I am using quill with SQL Server, and I am trying to query a table function and order the results.
my code is:

      val rawQuery = quote {
        ( id: Int ) => infix"""select * from dbo.MyFunction($id)""".as[ Query[ MyCaseClass] ]
      }
      ctx.run( rawQuery( lift( id ) ).sortBy(_.dateTime))

I see in the console that the SQL generated for the sortBy is:
ORDER BY x6."dateTime" ASC NULLS FIRST
The NULLS FIRST syntax isn't supported in SQL Server and it's throwing an error... any way to get around it?
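A hedged sketch of one thing to try: pass an explicit ordering instead of the implicit default; Ord.asc should render a plain ASC without the NULLS clause (worth verifying against the SQL Server dialect).

      ctx.run( rawQuery( lift( id ) ).sortBy(_.dateTime)(Ord.asc) )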

davidvirgilnaranjo
@davidvirgilnaranjo
Hello, we are considering using quill, but we need to have async results to include in our for comprehension. Is there a way to do this without using zio?
Juliano Alves
@juliano
hey @Elmiira_gitlab @qiudaozhang @greatmengqi @psoloveichik @davidvirgilnaranjo and everyone else, as mentioned by Alex before, this channel is not being supported anymore. Please bring your questions to the zio-quill channel in https://discord.gg/MYErPQgkHH
Chuong Ngo
@cngo-github:matrix.org
[m]
Hi guys. I am using Quill dynamic queries with SQLite. I need access to the SQLite "LIMIT" and "OFFSET". How?
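A hedged sketch: take and drop compile to LIMIT and OFFSET. Item, limit and offset below are illustrative; the dynamic-query API also has takeOpt/dropOpt for Option values.

    val page = quote {
      // LIMIT / OFFSET with runtime values bound as parameters
      query[Item].drop(lift(offset)).take(lift(limit))
    }
    ctx.run(page)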
Naftoli Gugenheim
@nafg
@cngo-github:matrix.org have you searched the docs?
Maciej Majka
@mmajka_gitlab
Hi, is it possible to define a Query[MyModel] as the return value of a method, def myQuery: Query[MyModel], and then use it inside a quote {}?
as in val query = quote { myQuery() }
MachineElf
@zacwolfe
Hey all, what is the cleanest way to inject FORCE INDEX(my_index) into my EntityQuery? Is doing a raw query my only option?
MachineElf
@zacwolfe
is this thing on?
Obafemi Teminife
@AndySakov

Hello everyone, how do I do this with quill

case class Todo(id: Long, info: String, list_id: Long)
case class TodoList(id: Long, name: String)

I'd like to get all todo lists with their corresponding todos
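One common approach, as a hedged sketch: fetch the pairs with a left join and group them in memory (the grouping part is plain Scala, not Quill).

    val pairs = quote {
      query[TodoList]
        .leftJoin(query[Todo])
        .on((list, todo) => todo.list_id == list.id)
    }

    // List[(TodoList, Option[Todo])] -> Map[TodoList, List[Todo]]
    val listsWithTodos: Map[TodoList, List[Todo]] =
      ctx.run(pairs)
        .groupBy { case (list, _) => list }
        .view
        .mapValues(_.flatMap { case (_, todo) => todo })
        .toMap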

Naftoli Gugenheim
@nafg
@zacwolfe @AndySakov try Discord
Maciej Majka
@mmajka_gitlab

Hi, I am getting an error when trying to run a query:

No Decoder found for LocalDateTime and it is not a class representing a group of columns

anyone?
Mathieu Prevel
@mprevel
@mmajka_gitlab I don't know if it is expected to be natively supported. If not, you have to create a codec for this field. Check "custom encoding" in the docs.
You have to map to a known type. Date is supported, and maybe some other date-like types.
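A minimal sketch of that mapping, assuming UTC as the offset (unverified):

    import java.time.{LocalDateTime, ZoneOffset}
    import java.util.Date
    import io.getquill.MappedEncoding

    // go through java.util.Date, which Quill encodes natively
    implicit val ldtToDate: MappedEncoding[LocalDateTime, Date] =
      MappedEncoding(ldt => Date.from(ldt.toInstant(ZoneOffset.UTC)))
    implicit val dateToLdt: MappedEncoding[Date, LocalDateTime] =
      MappedEncoding(d => LocalDateTime.ofInstant(d.toInstant, ZoneOffset.UTC))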
Maciej Majka
@mmajka_gitlab
@mprevel - I have tried implementing custom codecs following the "custom encoding" topic from the docs but with no luck. Could you give some hints on how to do it properly?
Maciej Majka
@mmajka_gitlab

Where do I put those custom encodings? Let's say I have this query:

inline given MyTable.schema
def myQuery = for myTable <- query[MyTable] yield MyCustomModel(created = myTable.created)

where MyTable looks like:

case class MyTable(id: Int, created: LocalDateTime)
object MyTable{
val schema = schemaMeta[MyTable](
"my_custom_name",
_.id -> "my_id",
_.created -> "creation_date"
)
}

and MyCustomModel looks like:
case class MyCustomModel(created: LocalDateTime)

Where do I define the custom encoder, and how do I use/import it in the proper place?

Mathieu Prevel
@mprevel

@mmajka_gitlab Writing a schema is not needed if the field names match between the case class and the db.
You can put the encodings in the same class or in another one and import them.
I have a QuillEncoders object where I put the mappings; then I import the members: QuillEncoders.*; QuillEncoders.given
Then I write the encoders for my custom types, for example:

import wvlet.airframe.ulid.ULID
import io.lemonlabs.uri.Url

  implicit val encodeULID: MappedEncoding[ULID, String]               = MappedEncoding[ULID, String](_.toString)
  implicit val decodeULID: MappedEncoding[String, ULID]               = MappedEncoding[String, ULID](ULID.fromString)
  implicit val encodeUrl: MappedEncoding[Url, String]                 = MappedEncoding[Url, String](_.toString)
  implicit val decodeUrl: MappedEncoding[String, Url]                 = MappedEncoding[String, Url](Url.parse(_))

For dates I use java.util.Date in the case class binding since it is natively supported.

k0ala
@k0ala

@mprevel is String the standard type to map from/to? For instance, I want to map a PostgreSQL 'geometry' column to an org.locationtech.jts.geo.Geometry field. The following does not work:

  given MappedEncoding[String, Geometry] = MappedEncoding[String, Geometry](s => WKTReader().read(s))
  given MappedEncoding[Geometry, String] = MappedEncoding[Geometry, String](g => g.toString)

I get "No Decoder found for Geometry and it is not a class representing a group of columns".
However, the following does work, but I need the Quill.Postgres to be available:

private given Decoder[Geometry] =
    decoder((row) => (index) => { // very inefficient? object -> String -> Array[Byte] -> WKT
      WKBReader().read(WKBReader.hexToBytes(row.getObject(index).toString))
    })
Mathieu Prevel
@mprevel

@k0ala I don't think it has to be String, but it has to be a type that is known by quill, so it can read/write it to the db.
If you provide a decoder (and it is in scope) you don't have to provide a mapping. Just use the type for which you have a decoder in the case class binding.
So it probably depends on the selected module and the db you have.

I can also say that, depending on the type of the column, a string will be rejected by the database (e.g. jsonb).

For example, I wrote a small codec for circe Json and jdbc a few days ago (this is currently just an experiment and is not battle tested).
I can now use circe Json as a field of a case class, and I'll probably be able to use a case class in place of this json field, provided I build an implicit MappedEncoding from the case class to Json (but I have not tried that yet).

  implicit val jsonEncoder: JdbcEncoder[Json] =
    encoder[Json](
      Types.OTHER,
      (idx: Int, js: Json, ps: PreparedStatement) => {
        val jsonObject: PGobject = new PGobject()
        jsonObject.setType("json")
        jsonObject.setValue(js.toString)
        ps.setObject(idx, jsonObject)
      }
    )

  val decodeFn: (Int, ResultRow, Session) => Json =
    (idx: Int, row: ResultRow, session: Session) => {
      val jsonObject: PGobject = row.getObject(idx, classOf[PGobject])
      io.circe.parser.parse(jsonObject.getValue).getOrElse(throw new RuntimeException("db value is not a valid json"))
    }

  implicit val jsonDecoder: JdbcDecoder[Json] =
    decoder[Json](decodeFn)
akash1987
@akash1987

Hello, I have a quill query which would almost work if it didn't insert single quotes around the passed-in 'idsConcat'.

If anyone has experience with this, please help. Much appreciated. Using quill-jdbc-zio 4.6.0.

    val idsConcat = someIntIdsList.mkString(",")

    val query = quote { (idsConcat: String) => 
      sql"""
           select name from users
           where 
               id in ($idsConcat)
           """.as[Query[(String)]]
    }

    run(query(lift(idsConcat)))

Here is a quill-printed example of the query with single quotes around the comma separated ids.

select name from users
where
id in ('1,2,3,4')

This is a made-up example to simplify.


I have also tried doing it like below...

 val query = quote { (rawQuery: String) =>
      sql"#$rawQuery".as[Query[Int]]
    }

  run(query(rawQuery))

.. but I get an error there also:
scalac: Error while emitting Users.scala
value rawQuery

akash1987
@akash1987

I have found a workaround in PostgreSQL.

val idsConcat = "{" + someIntIdsList.mkString(",") + "}"  // {1, 2, 3, 4}

 val query = quote { (idsConcat: String) => 
      sql"""
           select name from users
           where 
               id in (select unnest($idsConcat::int[]))
           """.as[Query[(String)]]
    }

.. which becomes..

select name from users
where
id in (select unnest('{1,2,3,4}'::int[]))
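A hedged alternative sketch: lifting the collection itself avoids the string round-trip entirely, since liftQuery(...).contains expands to an IN (...) clause. The User case class and its mapping are assumptions.

    case class User(id: Int, name: String)

    val q = quote {
      querySchema[User]("users")
        .filter(u => liftQuery(someIntIdsList).contains(u.id))
        .map(_.name)
    }
    run(q)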

oleg chernykh
@nighttrain18_gitlab
Hi, I am trying to get a prepared statement at runtime, but interpreting TranslateResult[_] returns the SQL query with the actual parameter values substituted.
Is there a suitable way to achieve the desired behavior?
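A hedged, unverified sketch: the JDBC contexts expose a prepare(...) counterpart to run(...) that yields a statement with bound placeholders instead of an interpolated string (Person and name here are illustrative).

    // prep: java.sql.Connection => java.sql.PreparedStatement on a plain JDBC context
    val prep = ctx.prepare(quote {
      query[Person].filter(p => p.name == lift(name))
    })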
Aleksey Anosov
@Grednoud
Hello. I have a dynamic query. I need to add the number of rows in this query to it. What is the best way to do this?
I think it should be something like this, but I still can't get it to work.
val q = dynamicQuery[AccountEntry]

run(q.map(a => (a, q.size)))
discobaba
@uncleweirdo_twitter

I'm getting a compile-time stack trace telling me that quill isn't getting along with the Java module system in Java 17:

java.lang.RuntimeException: java.lang.reflect.InaccessibleObjectException: Unable to make field protected transient int java.util.AbstractList.modCount accessible: module java.base does not "opens java.util" to unnamed module @62585f3f

Google told me I might need an option like "--add-opens=java.base/java.util=ALL-UNNAMED" or "--illegal-access=permit", although they didn't work for me. Maybe I was an incompetent mill user. Anybody know the cure?
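Since the failure is at compile time (macro expansion), the flag has to reach the JVM that runs the compiler, which with mill is normally mill's own process. A hedged sketch, assuming mill's .mill-jvm-opts support, is a file at the repo root:

    # .mill-jvm-opts: options passed to the JVM that runs mill itself
    --add-opens=java.base/java.util=ALL-UNNAMED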

Saumya Jain
@saumyaj92
Hi, I am getting "Can't find an implicit `SchemaMeta` for type".
Does anyone know why I would need to add an explicit schemaMeta?
Saumya Jain
@saumyaj92
Is JSON not supported for quill-jdbc module?
Mathieu Prevel
@mprevel
@saumyaj92 not yet, as far as I know. Just check a little bit higher up in the chat, my message from Dec 06.
Saumya Jain
@saumyaj92
@mprevel I was able to get the JSON parsing done using Encoders and Decoders. Thanks!
Saumya Jain
@saumyaj92
Can I use Quill to run CTEs? This is a sample CTE query that I want to use:
WITH cte (id, parent_id, name)
as (select id, parent_id, name from my_object where parent_id IS NULL AND type = 'Database'
UNION ALL
SELECT o.id, o.parent_id, o.name from my_object o JOIN cte ON o.parent_id = cte.id)
Select * from my_object where id NOT IN (select id from cte)
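A hedged sketch (I am not aware of first-class CTE support, so the usual escape hatch is to run the whole statement raw; MyObject and the column mapping are assumptions, and sql"..." is infix"..." on Quill 3.x):

    case class MyObject(id: Long, parentId: Option[Long], name: String)

    val outsideCte = quote {
      // the whole CTE statement is passed through as-is and decoded into MyObject rows
      sql"""WITH cte (id, parent_id, name)
            AS (SELECT id, parent_id, name FROM my_object WHERE parent_id IS NULL AND type = 'Database'
                UNION ALL
                SELECT o.id, o.parent_id, o.name FROM my_object o JOIN cte ON o.parent_id = cte.id)
            SELECT id, parent_id, name FROM my_object WHERE id NOT IN (SELECT id FROM cte)"""
        .as[Query[MyObject]]
    }
    ctx.run(outsideCte)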