David Bouyssié
I'm wondering why it is expressed as a List instead of an Iterable.
It makes me think that all the records are gathered after ctx.run, whereas I would prefer lazy loading/iteration.
Philippe Derome
Can I use dynamic queries over JDBC (preferably in stream mode, but that's an orthogonal concern) where the number of columns and their types are well defined at runtime for each row, but not at compile time? I'd know which binding to do for each column on a java.sql.ResultSet, and from it I would get a List[Any] to represent each row. Is there support for that?
Philippe Derome
Maybe the solution would entail casting to ::text within the SQL, so that we have a uniform list of strings to return (as I am not interested in the actual type-safe values, just the string representations). So something like .as[Query[List[String]]]
Hi everybody

I am testing the new implementation with ZIO, but I have the problem that the implicit val organizationSchemaMeta, with which I have mapped the database columns, does not work.

Unlike the other implementations I am using context.stream:

context.stream(query[Organization].filter(_.organizationId == lift(organizationId)))
Any idea how to solve this? Here is my code:

implicit val organizationSchemaMeta: context.SchemaMeta[Organization] =
  schemaMeta[Organization](
    "AD_Org",
    _.tenantId              -> "AD_Client_ID",
    _.organizationId        -> "AD_Org_ID",
    _.isActive              -> "IsActive",
    _.created               -> "Created",
    _.createdBy             -> "CreatedBy",
    _.updated               -> "Updated",
    _.updatedBy             -> "UpdatedBy",
    _.value                 -> "Value",
    _.name                  -> "Name",
    _.description           -> "Description",
    _.replicationStrategyId -> "AD_ReplicationStrategy_ID",
    _.parentOrganizationId  -> "Parent_Org_ID",
    _.uuid                  -> "UUID"
  )
def getById(organizationId: Id): ZStream[OrganizationZioRepository with QConnection, Throwable, Organization] =
  context.stream(query[Organization].filter(_.organizationId == lift(organizationId)))
The error is:
╠─ A checked error was not handled.
║ org.postgresql.util.PSQLException: ERROR: column x14.clientid does not exist
║ Position: 8
so it is not mapping the database columns
Oleg Pyzhcov
That could happen if, e.g., "clientid" is defined as a field of the Organization case class but you don't have it remapped in this schema meta
ok let me check
@oleg-py thanks a lot, my mapping had an error, I've solved it now
Adriani Furtado
Hey all, I have come across the Codegen and have a few questions on it.
  1. I am using new SimpleJdbcCodegen(datasource, "my.package"), and my datasource points to a specific schema. However, when the code gets generated it pulls every schema on my server. Am I doing something wrong here?
  2. My next question: assuming I have solved issue 1, is it possible for me to filter which tables in a schema I would like to generate? We have a schema of about 20 tables and I would like to generate case classes for only 4 of those.
Mathieu Prevel

Hi everyone,

I am trying to use quill with monix to get a stream of db objects where rows are filtered on an enum column.

When not using a stream I am able to get the expected results from the database.

.filter(_.myField == lift(value))

Stream 1:
When filtering the query before using it as a stream it compiles but crashes at runtime.

.filter(_.myField == lift(value))
'ERROR: operator does not exist: t_my_enum = character varying
Hint: No operator matches the given name and argument types. You might need to add explicit type casts.
Position: 198'. Aborting connection.
Caused by: [CIRCULAR REFERENCE: org.postgresql.util.PSQLException: ERROR: operator does not exist: t_my_enum = character varying

Stream 2:
When filtering the stream (not at the query level) it works as expected.

.filter(_.myField == value)

Is it expected for stream 1 to fail?
Should I use the stream 2 version? Does that mean the whole table will be streamed to the application and filtered by the application?
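[Editor's note] The stream 1 failure is the classic Postgres enum mismatch: the lifted value is bound as varchar while the column's type is t_my_enum. The workaround shown in the Quill docs for JDBC contexts is a custom encoder that binds the value with java.sql.Types.OTHER so the driver casts it. A sketch, with MyEnum as a hypothetical stand-in for the actual enum type:

```scala
// Assumes a JDBC-based Postgres context in scope, e.g.:
//   val ctx = new PostgresMonixJdbcContext(SnakeCase, "ctx"); import ctx._
sealed trait MyEnum
case object Foo extends MyEnum
case object Bar extends MyEnum

// Binding with java.sql.Types.OTHER lets Postgres cast the string to t_my_enum,
// so `.filter(_.myField == lift(value))` no longer fails with
// "operator does not exist: t_my_enum = character varying".
implicit val myEnumEncoder: Encoder[MyEnum] =
  encoder(
    java.sql.Types.OTHER,
    (index, value, row) => row.setObject(index, value.toString, java.sql.Types.OTHER)
  )
```

With such an encoder in scope, stream 1's query-level filter is pushed down to the database, so stream 2 (which does stream the whole table and filter in the application) is no longer needed.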

How can I use usingTtl for my Cassandra inserts when I am using dynamicQuery?
Akhil Kodali

Hi, I am using MysqlMonixJdbcContext and trying to do a batch insert with

      liftQuery(images).map(i => query[Image].insert(i).onConflictIgnore)

But I get the following error

exception during macro expansion: 
scala.reflect.macros.TypecheckException: Case class type io.getquill.Insert[Image] has no values
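[Editor's note] Mapping over a lifted query produces a projection rather than a batch action, which is why the macro reports that Insert[Image] has no values. The documented batch-insert shape uses foreach; a sketch against the same (assumed) Image case class and context:

```scala
// Batch insert: foreach over the lifted collection yields a BatchAction,
// which ctx.run executes as a prepared-statement batch.
ctx.run(
  liftQuery(images).foreach(i => query[Image].insert(i).onConflictIgnore)
)
```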
Artūras Šlajus
Is it possible to write extension methods for Quill contexts?
implicit class MyDBContextExts(val ctx: MyDBContext) extends AnyVal {
  def doSomething2(): ctx.IO[Unit, ctx.Effect.Read] = ctx.doSomething()
}
If I try to use doSomething2() from somewhere where I have the ctx, the types don't match: I can't pass the returned IO to ctx.performIO
[error]  found   : qual$1.ctx.IO[Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],qual$1.ctx.Effect.Read with qual$1.ctx.Effect.Write with qual$1.ctx.Effect.Transaction]
[error]  required: dbCtx.IO[?, _]
[error] Error occurred in an application involving default arguments.
[error]           )(
[error]            ^
[info] qual$1.ctx.IO[Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],qual$1.ctx.Effect.Read with qual$1.ctx.Effect.Write with qual$1.ctx.Effect.Transaction] <: dbCtx.IO[?, _]?
[info]   qual$1.ctx.type = dbCtx.type?
[info]   false
[info] false
Oleg Pyzhcov
what if you try an empty refinement, i.e. val ctx: MyDbContext { }?
or even implicit class MyDBContextExts[T <: MyDbContext](t: T { }) extends AnyVal?
Artūras Šlajus
let me try
What do empty braces here do? t: T { }
Oleg Pyzhcov
it's a refinement type, like T { type Res = Int ; val foo: Long }, except it's empty
Artūras Šlajus
  implicit class MyDBContextExts[T <: MyDBContext](val ctx: T { }) extends AnyVal {
    import ctx._

    def createNewUserLoginResponseIO(args: CreateNewUserArgs)(implicit runtimeCfg: RuntimeAppConfig): IO[
      Either[String, CreateNewUserIOResult[Login.Response.Success]],
      Effect.Read with Effect.Write with Effect.Transaction
    ] = {
          val io_ = dbCtx.createNewUserLoginResponseIO(args)
          val result = performIO(io_).getOrThrowLeft
[error] W:\work\unity\sob-review\server\app\src\main\scala\app\commands\DevGenerateUsers.scala:33:34: type mismatch;
[error]  found   : _1.ctx.IO[scala.util.Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],_1.ctx.Effect.Read with _1.ctx.Effect.Write with _1.ctx.Effect.Transaction] where val _1: app.db.MyDBContextExts.MyDBContextExts[app.db.MyDBContext]
[error]  required: dbCtx.IO[?, _]
[error] Error occurred in an application involving default arguments.
[error]           val result = performIO(io_).getOrThrowLeft
[error]                                  ^
[info] _1.ctx.IO[scala.util.Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],_1.ctx.Effect.Read with _1.ctx.Effect.Write with _1.ctx.Effect.Transaction] <: dbCtx.IO[?, _]?
[info]   _1.ctx.type = dbCtx.type?
[info]   false
[info] false
Oleg Pyzhcov
well, I'm out of dark magics that could help :c
Artūras Šlajus
Heh, thanks for trying though :)
I can always design it the other way around, but I'm just very confused why the type system thinks the type suddenly changes. I thought returning a dependent type should make it the same type.
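[Editor's note] One way to "design it the other way around", sketched under the assumption that MyDBContext is a class you control: mix the helpers into the context via a self-typed trait, so every IO stays rooted at the single context path the compiler can unify.

```scala
// The extension class introduces a fresh prefix (`qual$1.ctx`) that the
// compiler cannot prove equal to `dbCtx`, so the path-dependent IO types
// differ. Mixing helpers into the context avoids the extra prefix entirely.
trait MyDBContextHelpers { this: MyDBContext =>
  def doSomething2(): IO[Unit, Effect.Read] = doSomething()
}

// val dbCtx = new MyDBContext(...) with MyDBContextHelpers
// dbCtx.performIO(dbCtx.doSomething2())  // both sides rooted at dbCtx.type
```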
Any examples on how to integrate quill-ndbc and cats effect?
Alexis Durieux
Hello! I have a List[(Float, Float)] representing an array of numrange in Postgres. I am having an issue with the decoder: the insertion works fine, but the returning is not working. Any examples I could get inspired by? Thank you and happy Easter!
implicit def arrayNumrangeEncoder[Col <: Seq[(Float, Float)]]: Encoder[Col] =
  arrayRawEncoder[(Float, Float), Col]

implicit def arrayNumrangeDecoder[Col <: Seq[(Float, Float)]](
    implicit bf: CBF[(Float, Float), Col]
): Decoder[Col] =
  arrayRawEncoder[(Float, Float), Col]
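[Editor's note] The decoder above delegates to arrayRawEncoder, which would explain inserts working while reads fail. A sketch of the presumably intended decoder, assuming quill-jdbc's arrayRawDecoder mirrors arrayRawEncoder:

```scala
// The decoder should delegate to the raw *decoder*, not the encoder:
implicit def arrayNumrangeDecoder[Col <: Seq[(Float, Float)]](
    implicit bf: CBF[(Float, Float), Col]
): Decoder[Col] =
  arrayRawDecoder[(Float, Float), Col]
```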
Just getting started here; where do I put the sortBy(d.processDt), filter, etc.?
val q = quote {
  for {
    z <- query[ZipIndex]
    f <- query[GeneFile] if (f.geneFileId == z.geneFileId)
    d <- query[GeneDir]  if (d.geneDirId == f.geneDirId)
  } yield (z)
}
Oleg Pyzhcov
you can't sort in a for comprehension; an if there just desugars to a filter
so what's the correct approach with a bunch of joins?
Oleg Pyzhcov
you can still do e.g. q.sortBy(_.something) on the quoted bit, use explicit .join(...).on(...) syntax instead of a for comprehension, or wrap it all in giant parentheses.
How do you do table aliases where that is required, e.g. sortBy(t.something)?
Oleg Pyzhcov
quill automatically uses aliases
you refer to stuff via variables in for comprehensions/lambdas
I'm not getting how I would distinguish between fields with the same name in ZipIndex and GeneFile in the 3rd line, for instance, or how I would tell it which table processDt is coming from in the sortBy.
    join(query[GeneFile]).on((z, f) => z.geneFileId == f.geneFileId).
    join(query[GeneDir]).on((f, d) => f.geneDirId == d.geneDirId).
    filter(uid == lift(uid)).
Oleg Pyzhcov
That won't compile, because join gives you a Tuple2, and joining twice gives you a tuple inside a tuple. You would be writing something like .filter { case ((z, f), d) => d.uid == lift(uid) }.sortBy { case ((z, f), d) => d.processDt }(Ord.desc)
ah, that gives me a little traction. Thanks
N.S. Cutler
Been quite a while, but I believe you can just write:
val q = quote {
  (for {
    (zi, gf) <- query[ZipIndex].join(query[GeneFile]).on(_.geneFileId == _.geneFileId)
    gd       <- query[GeneDir].join(_.geneDirId == gf.geneDirId) if (gd.uid == lift(uid))
  } yield (zi, gf, gd)).sortBy { case (_, _, d) => d.processDt }(Ord.desc)
}
Is there a plan to implement common table expressions (CTEs)? getquill/quill#580
Does Quill support filtering on parameters of type Option[A], such that a filter condition is added when the value is Some[A] but omitted when it's None? Similar to e.g. filterOpt in Slick.
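[Editor's note] A pattern that often serves as Quill's filterOpt equivalent is lifting the Option and using forall; a sketch, assuming a hypothetical Person table and an optional name parameter:

```scala
// forall is true when nameOpt is None (so every row passes) and applies the
// comparison when it is Some(n). Unlike Slick's filterOpt, the clause is not
// dropped from the SQL; the lifted value is checked for NULL in the query.
case class Person(name: String, age: Int)

def findPeople(nameOpt: Option[String]) =
  ctx.run(
    query[Person].filter(p => lift(nameOpt).forall(_ == p.name))
  )
```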
I would like to refactor an application to process the results of a selection from a large Postgres view by streaming, in order to reduce the memory footprint. My application is built on quill-async-postgres and Akka Streams. Is switching to quill-jdbc-monix and Monix Connect the most concise way to provide the table rows as an Akka Streams source? Can anybody share their experience on the additional refactoring I will have to apply to other existing DAOs that use the quill-async-postgres context but require no streaming?

I know quill deeply analyzes what's used in map, but is there a way to turn that off? E.g., in the following case it should just recognize that the result must contain the columns id, colA, colB:

case class A(id: Int, colA: String)

case class B(colB: Int, as: Seq[A])

case class Table(id: Int, colB: Int, colA: String)

ctx.run( query[Table].map(r => B(r.colB, Seq(A(r.id, r.colA)))))

Instead I get:
exception during macro expansion: scala.reflect.macros.TypecheckException: package scala.collection is not a value