Adriani Furtado
Hey all, I have come across the Codegen and have a few questions on it.
  1. I am using new SimpleJdbcCodegen(datasource, "my.package"); my datasource points to a specific schema. However, when the code gets generated it pulls in every schema on my server. Am I doing something wrong here?
  2. My next question is, assuming I have solved issue 1: is it possible to filter which tables in a schema I would like to generate? We have a schema of about 20 tables and I would like to generate case classes for only 4 of them.
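For question 2, the shape of the restriction is a whitelist over (schema, table) pairs, plugged into whatever filter hook the codegen exposes. A plain-Scala sketch of just that whitelist logic (the schema and table names below are hypothetical placeholders, and the exact quill-codegen hook is not reproduced here):

```scala
// Hypothetical whitelist to plug into the generator's table filter;
// "my_schema" and the table names are placeholders, not real values.
val wantedTables = Set("users", "accounts", "orders", "invoices")

def keep(schema: String, table: String): Boolean =
  schema == "my_schema" && wantedTables.contains(table)
```

The same predicate also addresses question 1, since it rejects tables from every other schema.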
Mathieu Prevel

Hi everyone,

I'm trying to use Quill with Monix to get a stream of DB objects where rows are filtered on an enum column.

When not using a stream I am able to get the expected results from the database.

.filter(_.myField == lift(value))

Stream 1:
When filtering the query before using it as a stream it compiles but crashes at runtime.

.filter(_.myField == lift(value))
'ERROR: operator does not exist: t_my_enum = character varying
Indice : No operator matches the given name and argument types. You might need to add explicit type casts.
Position : 198'. Aborting connection.
Caused by: [CIRCULAR REFERENCE: org.postgresql.util.PSQLException: ERROR: operator does not exist: t_my_enum = character varying

Stream 2:
When filtering the stream (not at the query level) it works as expected.

.filter(_.myField == value)

Is it expected for stream 1 to fail?
Should I use the stream 2 version? Does that mean the whole table will be streamed to the application and filtered by the application?
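A hedged note on stream 1: the error says the lifted value arrives as character varying, so Postgres cannot match t_my_enum = varchar. One community-standard workaround (an assumption here, not something confirmed in this thread) is a custom JDBC encoder that sends the value as java.sql.Types.OTHER, letting the server infer the enum type; MyEnum is a placeholder for your Scala type:

```scala
// Sketch for a Quill JDBC Postgres context; MyEnum stands in for the
// Scala type mapped to t_my_enum. Types.OTHER lets Postgres cast the
// string to the enum type server-side.
implicit val myEnumEncoder: Encoder[MyEnum] =
  encoder(java.sql.Types.OTHER, (index, value, row) =>
    row.setObject(index, value.toString, java.sql.Types.OTHER))
```

As for stream 2: yes, filtering the Observable rather than the query filters on the client, so every row of the table crosses the wire; fixing the encoder so stream 1 works is the more scalable route.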

How can I use usingTtl for my Cassandra inserts when I am using dynamicQuery?
Akhil Kodali

Hi, I am using MysqlMonixJdbcContext and trying to do a batch insert with

      liftQuery(images).map(i => query[Image].insert(i).onConflictIgnore)

But I get the following error

exception during macro expansion: 
scala.reflect.macros.TypecheckException: Case class type io.getquill.Insert[Image] has no values
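Quill's documented batch-action syntax uses liftQuery with foreach rather than map, which is one likely cause of the macro error above. A sketch reusing the names from the question (whether onConflictIgnore composes with MySQL batch inserts in this version is an assumption):

```scala
// Batch insert: foreach produces a BatchAction; map does not.
// `ctx`, `images`, and `Image` are as in the question above.
ctx.run(
  liftQuery(images).foreach(i => query[Image].insert(i).onConflictIgnore)
)
```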
Artūras Šlajus
Is it possible to write extension methods for Quill contexts?
implicit class MyDBContextExts(val ctx: MyDBContext) extends AnyVal {
  def doSomething2(): ctx.IO[Unit, ctx.Effect.Read] = ctx.doSomething()
}
If I try to use doSomething2() from somewhere where I have the ctx, the types don't match; I can't pass the returned IO to ctx.performIO:
[error]  found   : qual$1.ctx.IO[Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],qual$1.ctx.Effect.Read with qual$1.ctx.Effect.Write with qual$1.ctx.Effect.Transaction]
[error]  required: dbCtx.IO[?, _]
[error] Error occurred in an application involving default arguments.
[error]           )(
[error]            ^
[info] qual$1.ctx.IO[Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],qual$1.ctx.Effect.Read with qual$1.ctx.Effect.Write with qual$1.ctx.Effect.Transaction] <: dbCtx.IO[?, _]?
[info]   qual$1.ctx.type = dbCtx.type?
[info]   false
[info] false
Oleg Pyzhcov
what if you try an empty refinement, i.e. val ctx: MyDbContext { }?
or even implicit class MyDBContextExts[T <: MyDbContext](t: T { }) extends AnyVal?
Artūras Šlajus
let me try
What do empty braces here do? t: T { }
Oleg Pyzhcov
it's a refinement type, like T { type Res = Int ; val foo: Long }, except it's empty
Artūras Šlajus
  implicit class MyDBContextExts[T <: MyDBContext](val ctx: T { }) extends AnyVal {
    import ctx._

    def createNewUserLoginResponseIO(args: CreateNewUserArgs)(implicit runtimeCfg: RuntimeAppConfig): IO[
      Either[String, CreateNewUserIOResult[Login.Response.Success]],
      Effect.Read with Effect.Write with Effect.Transaction
    ] = {
          val io_ = dbCtx.createNewUserLoginResponseIO(args)
          val result = performIO(io_).getOrThrowLeft
[error] W:\work\unity\sob-review\server\app\src\main\scala\app\commands\DevGenerateUsers.scala:33:34: type mismatch;
[error]  found   : _1.ctx.IO[scala.util.Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],_1.ctx.Effect.Read with _1.ctx.Effect.Write with _1.ctx.Effect.Transaction] where val _1: app.db.MyDBContextExts.MyDBContextExts[app.db.MyDBContext]
[error]  required: dbCtx.IO[?, _]
[error] Error occurred in an application involving default arguments.
[error]           val result = performIO(io_).getOrThrowLeft
[error]                                  ^
[info] _1.ctx.IO[scala.util.Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],_1.ctx.Effect.Read with _1.ctx.Effect.Write with _1.ctx.Effect.Transaction] <: dbCtx.IO[?, _]?
[info]   _1.ctx.type = dbCtx.type?
[info]   false
[info] false
Oleg Pyzhcov
well, I'm out of dark magics that could help :c
Artūras Šlajus
Heh, thanks for trying though :)
I can always design it the other way around, but I'm just very confused why the type system thinks the type suddenly changes. I thought returning a dependent type should make it the same type.
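The type "changes" because the implicit class introduces a new stable prefix (qual$1.ctx), and path-dependent types compare by prefix, not by runtime value, so ctx.IO and dbCtx.IO never unify. A plain method whose result type depends on its parameter keeps the path tied to the caller's context. A minimal self-contained sketch (Ctx and IO here are stand-ins, not Quill's types):

```scala
// Stand-in for a Quill-like context with a path-dependent IO type.
class Ctx {
  case class IO[A](value: A)
  def doSomething(): IO[Unit] = IO(())
  def performIO[A](io: IO[A]): A = io.value
}

object CtxOps {
  // Dependent method type: the result is typed against the *argument's*
  // path, so it unifies with the caller's own context.
  def doSomething2(ctx: Ctx): ctx.IO[Unit] = ctx.doSomething()
}

val dbCtx = new Ctx
val io: dbCtx.IO[Unit] = CtxOps.doSomething2(dbCtx) // compiles
dbCtx.performIO(io)                                 // accepted: io IS dbCtx.IO
```

The val ctx field of an AnyVal wrapper cannot give you this: inside the wrapper, this.ctx is a different prefix from dbCtx even when the two are the same object at runtime.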
Any examples of how to integrate quill-ndbc and cats-effect?
Alexis Durieux
Hello! I have a List[(Float, Float)] representing an array of numrange in Postgres. I am having an issue with the decoder: the insertion works fine but the returning is not working. Any examples I could take inspiration from? Thank you and happy Easter!
implicit def arrayNumrangeEncoder[Col <: Seq[(Float, Float)]]
        : Encoder[Col] = arrayRawEncoder[(Float, Float), Col]
    implicit def arrayNumrangeDecoder[Col <: Seq[(Float, Float)]](
        implicit bf: CBF[(Float, Float), Col]
    ): Decoder[Col] =
      arrayRawEncoder[(Float, Float), Col]
Just getting started here; where do I put the sortBy(d.processDt), filter, etc.?
val q = quote {
  for {
    z <- query[ZipIndex]
    f <- query[GeneFile] if f.geneFileId == z.geneFileId
    d <- query[GeneDir] if d.geneDirId == f.geneDirId
  } yield z
}
Oleg Pyzhcov
you can't sort inside a for comprehension; if corresponds to filter
so what's the correct approach with a bunch of joins?
Oleg Pyzhcov
you can still do e.g. q.sortBy(_.something) on the quoted bit, you can use explicit .join(...).on(...) syntax instead of a for comprehension, or you can wrap it all in giant parentheses.
How do you do table aliases where that is required, e.g. sortBy(t.something)?
Oleg Pyzhcov
quill automatically uses aliases
you refer to stuff via variables in for comprehensions/lambdas
I'm not getting how I would distinguish between fields with the same name in ZipIndex and GeneFile in the 3rd line, for instance, or how I would tell it which table processDt is coming from in the sortBy.
    join(query[GeneFile]).on((z, f) => z.geneFileId == f.geneFileId).
    join(query[GeneDir]).on((f, d) => f.geneDirId == d.geneDirId).
    filter(uid == lift(uid)).
Oleg Pyzhcov
that won't compile because join gives you a tuple2. Join twice gives you a tuple inside a tuple. You would be writing something like e.g. .filter { case ((z, f), d) => d.uid == lift(uid) }.sortBy { case ((z, f), d) => d.processDt }(Ord.desc)
ah, that gives me a little traction. Thanks
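Oleg's nested-tuple shape can be sketched without Quill at all: two .join calls produce ((z, f), d), and every subsequent lambda pattern-matches on exactly that shape. A plain-collections analog (case-class fields mirror the question):

```scala
case class ZipIndex(geneFileId: Int)
case class GeneFile(geneFileId: Int, geneDirId: Int)
case class GeneDir(geneDirId: Int, uid: String, processDt: Long)

val zs = List(ZipIndex(1))
val fs = List(GeneFile(1, 10))
val ds = List(GeneDir(10, "u1", 42L))

// Two joins -> a tuple inside a tuple: ((ZipIndex, GeneFile), GeneDir)
val joined: List[((ZipIndex, GeneFile), GeneDir)] =
  for {
    z <- zs
    f <- fs if z.geneFileId == f.geneFileId
    d <- ds if f.geneDirId == d.geneDirId
  } yield ((z, f), d)

// Later filter/sortBy lambdas destructure the same nested shape,
// which is how you say which table each field comes from:
val result = joined
  .filter { case ((_, _), d) => d.uid == "u1" }
  .sortBy { case ((_, _), d) => d.processDt }
```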
N.S. Cutler
Been quite a while, but I believe you can just write:
val q = quote {
  (for {
    (zi, gf) <- ZipIndex join GeneFile on(_.geneFileId == _.geneFileId)
         gd  <- GeneDir join(_.geneDirId == gf.geneDirId) if (uid == lift(uid))
  } yield (zi, gf, gd)).sortBy({ case (_, _, d) => d.processDt })(Ord.desc)
}
Is there a plan to implement common table expressions? getquill/quill#580
Does Quill support filtering on parameters of type Option[A] such that if the type is Some[A] a filter condition is added, but condition is omitted when it's None? Similar to e.g. filterOpt in Slick.
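One common shape for this is Option.forall: with None the predicate is vacuously true (no constraint), with Some it filters. A plain-collections sketch below; inside a quotation the analogous form would be .filter(p => lift(nameOpt).forall(_ == p.name)), and the dynamic-query DSL also documents a filterOpt combinator, though both of those Quill specifics are assumptions about your version:

```scala
case class Person(name: String)

// None -> keep everything; Some(n) -> keep only matching rows.
def byOptionalName(people: List[Person], nameOpt: Option[String]): List[Person] =
  people.filter(p => nameOpt.forall(_ == p.name))
```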
I would like to refactor an application towards processing the results of a selection from a large Postgres view by streaming, to reduce the memory footprint. My application is built on quill-async-postgres and Akka Streams. Is switching to quill-jdbc-monix and Monix Connect the most concise way to provide the table rows as an Akka Streams source? Can anybody share their experience on additional refactoring I will have to apply to other existing DAOs that use the quill-async-postgres context but require no streaming?

I know Quill deeply analyzes what's used in map, but is there a way to turn that off? E.g. in the following case it should just recognize that the result must contain columns id, colA, colB:

case class A(id: Int, colA: String)

case class B(colB: Int, as: Seq[A])

case class Table(id: Int, colB: Int, colA: String)

ctx.run( query[Table].map(r => B(r.colB, Seq(A(r.id, r.colA)))))

I get instead:
exception during macro expansion: scala.reflect.macros.TypecheckException: package scala.collection is not a value
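One workaround (an assumption on my part, not taken from Quill docs): keep the quotation flat and build the nested structure in ordinary Scala after running the query, so the macro never has to typecheck the Seq(...) construction:

```scala
case class A(id: Int, colA: String)
case class B(colB: Int, as: Seq[A])
case class Table(id: Int, colB: Int, colA: String)

// `rows` stands in for the result of ctx.run(query[Table])
val rows = List(Table(1, 7, "x"))

// Assemble B outside the quotation, in plain Scala.
val bs: List[B] = rows.map(r => B(r.colB, Seq(A(r.id, r.colA))))
```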

Remi Guittaut
Hi guys. I was wondering about the benefit of using zio-quill: it just removes the need for zio-interop-cats, doesn't it?
case class Person(lastVisit: String)
java.lang.UnsupportedOperationException: Can't read LocalDateValue [value=2000-07-23] as String
but in the database this column is of type Date; what type must I use, or how do I convert it to String?
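The driver is handing back a LocalDate, so the straightforward fix is to model the column as a date type. If the field must stay a String, Quill's MappedEncoding is the documented conversion mechanism; the commented lines below assume a Quill context in scope and a decode-side mapping, so treat them as a sketch:

```scala
import java.time.LocalDate

// Preferably: match the column type in the case class.
case class Person(lastVisit: LocalDate)

// If the field must stay a String, map the decoded LocalDate instead
// (sketch; MappedEncoding comes from the Quill context's imports):
// implicit val localDateDecode: MappedEncoding[LocalDate, String] =
//   MappedEncoding(_.toString)
```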
Simon Parten

Hi, I'm trying to learn Quill (3.7.0) against a Postgres database.

val ctx = new PostgresJdbcContext(LowerCase, new HikariDataSource(config))
import ctx._ 
val check = ctx.quote(query[City])


val mexCity = 
    .filter(c => c.name like "%mex%" )


The first of these queries works well. The second fails at runtime with the following message.

Found the following free variables: IdentName(mexCity).
Quotations can't reference values outside their scope directly.
In order to bind runtime values to a quotation, please use the method `lift`.
Example: `def byName(n: String) = quote(query[Person].filter(_.name == lift(n)))`

As far as I can tell, it is consistent with the docs...
Anyone have ideas on what I might be doing wrong here?

Urgh. Foolishness...

val mexCity = 
    .filter(c => c.name like "%mex%" ))

The error message was in fact a good one, telling me I had the brackets in the wrong place. Sorry for spam.

Janghwan Lee

hi I would like to chain filter clauses like

query[Person].filter(p => 
    case ("name", names)  => liftQuery(names).contains(p.name)
    case ("city", cities) => liftQuery(cities).contains(p.city)
  }).foldLeft(lift(false))(_ || _)

but this causes Tree 'map.collect ... ' can't be parsed to 'Ast'
Is there any other way to combine filter clauses dynamically?
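A plain-Scala sketch of the collect-and-fold idea follows; inside a quotation this shape trips the parser because collect/foldLeft over a runtime list are not quotable, so Quill's dynamic-query DSL (dynamicQuery with filter/filterOpt/filterIf) is the usual escape hatch for runtime-assembled filters. The criteria value below is a placeholder:

```scala
case class Person(name: String, city: String)

// Hypothetical runtime criteria: (field name, allowed values)
val criteria: List[(String, List[String])] =
  List("name" -> List("ann", "bob"), "city" -> List("rome"))

// Build one predicate per recognized criterion...
val preds: List[Person => Boolean] = criteria.collect {
  case ("name", names)  => (p: Person) => names.contains(p.name)
  case ("city", cities) => (p: Person) => cities.contains(p.city)
}

// ...then OR them together, starting from false (mirrors the question's fold).
val combined: Person => Boolean =
  p => preds.foldLeft(false)((acc, f) => acc || f(p))
```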

Luis Gustavo

Hi all!
I need to implement a sortBy with dynamic runtime parameters. Currently I'm using:

val query = dSchema.sortBy(item => {
  orderBy.flatMap(_.headOption.map(_.columnName)) match {
    case Some("createdAt") => item.audit.createdAt
    case Some("updatedAt") => item.audit.modifiedAt
    case _                 => item.audit.modifiedAt
  }
}) {
  orderBy.flatMap(_.headOption.map(_.direction)) match {
    case Some(OrderDirection.Asc)  => Ord.asc
    case Some(OrderDirection.Desc) => Ord.desc
    case _                         => Ord.asc
  }
}
But we will need to sort by multiple columns at the same time, so the match/case approach seems unviable.
I've been experimenting with infix. Is there a way to use infix with sortBy? Can anyone provide an example?
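Before reaching for infix, the multi-column case can be expressed by folding per-column orderings; the sketch below is plain Scala (Quill's dynamic sortBy and any infix fragments are deliberately not used here, since those calls aren't verified). Item/Audit mirror the question; Dir stands in for OrderDirection:

```scala
case class Audit(createdAt: Long, modifiedAt: Long)
case class Item(audit: Audit)

sealed trait Dir
case object Asc  extends Dir
case object Desc extends Dir

// One ordering per (column, direction) pair.
def orderingFor(column: String, dir: Dir): Ordering[Item] = {
  val base: Ordering[Item] = column match {
    case "createdAt" => Ordering.by(_.audit.createdAt)
    case _           => Ordering.by(_.audit.modifiedAt)
  }
  if (dir == Desc) base.reverse else base
}

// Combine many specs: the first non-equal comparison wins.
def combined(specs: List[(String, Dir)]): Ordering[Item] =
  new Ordering[Item] {
    def compare(a: Item, b: Item): Int =
      specs.iterator
        .map { case (c, d) => orderingFor(c, d).compare(a, b) }
        .find(_ != 0)
        .getOrElse(0)
  }
```

If the ordering must happen SQL-side, the same fold-over-specs structure would apply to whatever the dynamic-query DSL or an infix fragment provides, but that mapping is left as an exercise here.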

Yisrael Union
I'm having some trouble with the SQL that is generated when using an implicit class to compare dates.
I ended up with SQL that looks like this:
  1 = event_datetime >= ?
I'm using the comparison operators suggested on the Quill docs page. The error I receive states this is invalid SQL.
Simon Parten
When using Quill in Metals / VS Code, Quill's query compilation to SQL shows up as a "problem" in the compiler window
is this intentional?

Hello, I'm using applicative joins (version 2.5.4).
Somehow the outer select does not contain all columns of the table but the inner select (in the join statements) does, and as a result I get "column ... does not exist".

Is there a way to force selecting all columns from a big applicative join with filters?

or the joins have columns that are not in the scope of a particular join(...) statement
Alexander Ioffe
@Zulek There were various bugs in older versions of Quill with columns being excluded. Are these columns used in a Group By (or other aggregations) by any chance?
Alexander Ioffe
@all Is anyone using AnyVal encoders? Are your AnyVal instances also case classes or not? In Dotty-Quill they might need to be.