e-Evolution
@e-Evolution
Hi everybody

I am testing the new ZIO implementation, but I have the problem that the implicit val organizationSchemaMeta, with which I have mapped the database columns, is not picked up.

Unlike the other implementations I am using context.stream:

context.stream(query[Organization].filter(_.organizationId == lift(organizationId)))
Any idea how to solve this? Here is my code:

implicit val organizationSchemaMeta: context.SchemaMeta[Organization] =
  schemaMeta[Organization](
    "AD_Org",
    _.tenantId -> "AD_Client_ID",
    _.organizationId -> "AD_Org_ID",
    _.isActive -> "IsActive",
    _.created -> "Created",
    _.createdBy -> "CreatedBy",
    _.updated -> "Updated",
    _.updatedBy -> "UpdatedBy",
    _.value -> "Value",
    _.name -> "Name",
    _.description -> "Description",
    _.replicationStrategyId -> "AD_ReplicationStrategy_ID",
    _.parentOrganizationId -> "Parent_Org_ID",
    _.uuid -> "UUID"
  )
def getById(organizationId: Id): ZStream[OrganizationZioRepository with QConnection, Throwable, Organization] =
  context.stream(query[Organization].filter(_.organizationId == lift(organizationId)))
The error is:
╠─A checked error was not handled.
║ org.postgresql.util.PSQLException: ERROR: column x14.clientid does not exist
║ Position: 8
So it is not mapping the database columns.
Oleg Pyzhcov
@oleg-py
could happen if e.g. a "clientId" field is defined on the Organization case class but you don't have it remapped in this schema meta
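(A minimal sketch of that point: if the case class field is actually named clientId rather than tenantId, the schemaMeta entry has to use that exact field name, otherwise Quill falls back to the naming strategy for the unmapped field. The clientId name here is an assumption based on the x14.clientid error above.)

implicit val organizationSchemaMeta: context.SchemaMeta[Organization] =
  schemaMeta[Organization](
    "AD_Org",
    _.clientId -> "AD_Client_ID" // the field name must match the case class exactly
    // ... remaining field mappings as in the snippet above
  )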
e-Evolution
@e-Evolution
ok let me check
e-Evolution
@e-Evolution
@oleg-py thanks a lot, my mapping had an error, I've solved it now
Adriani Furtado
@Adriani277
Hey all, I have come across the codegen and have a few questions about it.
  1. I am using new SimpleJdbcCodegen(datasource, "my.package"), and my datasource points to a specific schema. However, when the code gets generated it pulls in every schema on my server. Am I doing something wrong here?
  2. My next question is, assuming I have solved issue 1, is it possible for me to filter which tables in a schema I would like to generate? We have a schema of about 20 tables and I would like to generate case classes for only 4 of them.
Mathieu Prevel
@mprevel

Hi everyone,

I'm trying to use Quill with Monix to get a stream of DB objects where rows are filtered on an enum column.

When not using a stream I am able to get the expected results from the database.

query[MyTable]
.filter(_.myField == lift(value))

Stream 1:
When filtering the query before using it as a stream, it compiles but crashes at runtime.

stream(
  query[MyTable].filter(_.myField == lift(value))
)
ERROR: operator does not exist: t_my_enum = character varying
Hint: No operator matches the given name and argument types. You might need to add explicit type casts.
Position: 198. Aborting connection.
...
Caused by: [CIRCULAR REFERENCE: org.postgresql.util.PSQLException: ERROR: operator does not exist: t_my_enum = character varying
...

Stream 2:
When filtering the stream (not at the query level), it works as expected.

stream(query[MyTable])
.filter(_.myField == value)

Is it expected for stream 1 to fail?
Should I use the stream 2 version? Does it mean that the whole table will be streamed to the application and filtered there?
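(For the stream 1 error, one common fix is a raw encoder that binds the lifted value as java.sql.Types.OTHER so Postgres can cast it to the enum type. A sketch only, assuming a JDBC-based context and a hypothetical MyEnum with a string representation:)

import java.sql.Types

// Sketch: MyEnum and MyEnum.fromString are assumptions, not the actual types above.
implicit val myEnumEncoder: Encoder[MyEnum] =
  encoder(Types.OTHER, (index, value, row) =>
    row.setObject(index, value.toString, Types.OTHER)) // lets Postgres cast the bind to t_my_enum

implicit val myEnumDecoder: Decoder[MyEnum] =
  decoder((index, row) => MyEnum.fromString(row.getObject(index).toString))

With something like that in scope (after import ctx._), the filtered query in stream 1 should bind correctly; stream 2 does stream the whole table to the application and filters it there.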

dassum
@dassum
How can I use usingTtl for my Cassandra inserts when I am using dynamicQuery?
Akhil Kodali
@akhil

Hi, I am using MysqlMonixJdbcContext and trying to do a batch insert with

run(
  liftQuery(images).map(i => query[Image].insert(i).onConflictIgnore)
)

But I get the following error

exception during macro expansion: 
scala.reflect.macros.TypecheckException: Case class type io.getquill.Insert[Image] has no values
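(A sketch of the batch form I would expect to work: batch inserts go through liftQuery(...).foreach rather than .map; whether onConflictIgnore composes with the batch form may depend on the Quill version.)

run(
  liftQuery(images).foreach(i => query[Image].insert(i).onConflictIgnore)
)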
Artūras Šlajus
@arturaz
Is it possible to write extension methods for Quill contexts?
implicit class MyDBContextExts(val ctx: MyDBContext) extends AnyVal {
   def doSomething2(): ctx.IO[Unit, ctx.Effect.Read] = ctx.doSomething()
}
If I try to use doSomething2() from somewhere where I have the ctx, the types don't match: I can't pass the returned IO to ctx.performIO
[error]  found   : qual$1.ctx.IO[Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],qual$1.ctx.Effect.Read with qual$1.ctx.Effect.Write with qual$1.ctx.Effect.Transaction]
[error]  required: dbCtx.IO[?, _]
[error] Error occurred in an application involving default arguments.
[error]           )(
[error]            ^
[info] qual$1.ctx.IO[Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],qual$1.ctx.Effect.Read with qual$1.ctx.Effect.Write with qual$1.ctx.Effect.Transaction] <: dbCtx.IO[?, _]?
[info]   qual$1.ctx.type = dbCtx.type?
[info]   false
[info] false
Oleg Pyzhcov
@oleg-py
what if you try an empty refinement, i.e. val ctx: MyDbContext { }?
or even implicit class MyDBContextExts[T <: MyDbContext](t: T { }) extends AnyVal?
Artūras Šlajus
@arturaz
let me try
What do empty braces here do? t: T { }
Oleg Pyzhcov
@oleg-py
it's a refinement type, like T { type Res = Int ; val foo: Long }, except it's empty
Artūras Šlajus
@arturaz
  implicit class MyDBContextExts[T <: MyDBContext](val ctx: T { }) extends AnyVal {
    import ctx._

    def createNewUserLoginResponseIO(args: CreateNewUserArgs)(implicit runtimeCfg: RuntimeAppConfig): IO[
      Either[String, CreateNewUserIOResult[Login.Response.Success]],
      Effect.Read with Effect.Write with Effect.Transaction
    ] = {
          val io_ = dbCtx.createNewUserLoginResponseIO(args)
          val result = performIO(io_).getOrThrowLeft
[error] W:\work\unity\sob-review\server\app\src\main\scala\app\commands\DevGenerateUsers.scala:33:34: type mismatch;
[error]  found   : _1.ctx.IO[scala.util.Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],_1.ctx.Effect.Read with _1.ctx.Effect.Write with _1.ctx.Effect.Transaction] where val _1: app.db.MyDBContextExts.MyDBContextExts[app.db.MyDBContext]
[error]  required: dbCtx.IO[?, _]
[error] Error occurred in an application involving default arguments.
[error]           val result = performIO(io_).getOrThrowLeft
[error]                                  ^
[info] _1.ctx.IO[scala.util.Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],_1.ctx.Effect.Read with _1.ctx.Effect.Write with _1.ctx.Effect.Transaction] <: dbCtx.IO[?, _]?
[info]   _1.ctx.type = dbCtx.type?
[info]   false
[info] false
Oleg Pyzhcov
@oleg-py
well, I'm out of dark magics that could help :c
Artūras Šlajus
@arturaz
Heh, thanks for trying though :)
I can always design it the other way around, but I'm just very confused why the type system thinks the type suddenly changes. I thought returning a dependent type should keep it the same type.
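(One way around it, sketched with the names from the snippets above: take the context as a plain method parameter instead of wrapping it in an implicit class, so the return type stays path-dependent on the caller's instance rather than on the wrapper's ctx field.)

// Sketch: a dependent method type instead of an implicit-class extension.
def createNewUserLoginResponseIO(ctx: MyDBContext)(args: CreateNewUserArgs)(
    implicit runtimeCfg: RuntimeAppConfig
): ctx.IO[
  Either[String, CreateNewUserIOResult[Login.Response.Success]],
  ctx.Effect.Read with ctx.Effect.Write with ctx.Effect.Transaction
] = {
  import ctx._
  ??? // build the IO against ctx here
}

// At the call site the result is typed as dbCtx.IO[...], so this type-checks:
// dbCtx.performIO(createNewUserLoginResponseIO(dbCtx)(args))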
Zett98
@Zett98
Any examples on how to integrate quill-ndbc and cats effect?
Alexis Durieux
@alexisdurieux
Hello! I have a List[(Float, Float)] representing an array of numrange in Postgres. I am having an issue with the decoder: the insertion works fine, but the returning is not working. Any examples I could take inspiration from? Thank you and happy Easter!
implicit def arrayNumrangeEncoder[Col <: Seq[(Float, Float)]]: Encoder[Col] =
  arrayRawEncoder[(Float, Float), Col]

implicit def arrayNumrangeDecoder[Col <: Seq[(Float, Float)]](
    implicit bf: CBF[(Float, Float), Col]
): Decoder[Col] =
  arrayRawEncoder[(Float, Float), Col]
2 replies
discobaba
@uncleweirdo_twitter
Just getting started here; where do I put the sortBy(d.processDt), filter, etc.?
val q = quote {
  for {
    z <- query[ZipIndex]
    f <- query[GeneFile] if (f.geneFileId == z.geneFileId)
    d <- query[GeneDir]  if (d.geneDirId == f.geneDirId)
  } yield (z)
}
Oleg Pyzhcov
@oleg-py
you can't sort in a for comprehension; an if guard corresponds to a filter
discobaba
@uncleweirdo_twitter
so what's the correct approach with a bunch of joins?
Oleg Pyzhcov
@oleg-py
you can still do e.g. q.sortBy(_.something) on the quoted bit, you can use explicit .join(...).on(...) syntax instead of a for comprehension, or you can wrap it all in giant parentheses.
discobaba
@uncleweirdo_twitter
How do you do table aliases where that is required, e.g. sortBy(t.something)?
Oleg Pyzhcov
@oleg-py
quill automatically uses aliases
you refer to stuff via variables in for comprehensions/lambdas
discobaba
@uncleweirdo_twitter
I'm not getting how I would distinguish between fields with the same name in ZipIndex and GeneFile in the 3rd line, for instance, or how I would tell it which table processDt is coming from in the sortBy.
run(query[ZipIndex]
  .join(query[GeneFile]).on((z, f) => z.geneFileId == f.geneFileId)
  .join(query[GeneDir]).on((f, d) => f.geneDirId == d.geneDirId)
  .filter(uid == lift(uid))
  .sortBy(_.processDt)(Ord.desc)
  .take(1))
Oleg Pyzhcov
@oleg-py
that won't compile because join gives you a tuple2. Join twice gives you a tuple inside a tuple. You would be writing something like e.g. .filter { case ((z, f), d) => d.uid == lift(uid) }.sortBy { case ((z, f), d) => d.processDt }(Ord.desc)
discobaba
@uncleweirdo_twitter
ah, that gives me a little traction. Thanks
N.S. Cutler
@godenji
Been quite a while, but I believe you can just write:
val q = quote {
  (for {
    (zi, gf) <- query[ZipIndex].join(query[GeneFile]).on(_.geneFileId == _.geneFileId)
    gd       <- query[GeneDir].join(_.geneDirId == gf.geneDirId) if (gd.uid == lift(uid))
  } yield (zi, gf, gd)).sortBy { case (_, _, gd) => gd.processDt }(Ord.desc)
}
run(q)
b-gyula
@b-gyula
Is there a plan to implement common table expressions (CTEs)? getquill/quill#580
corentin
@corenti13711539_twitter
Does Quill support filtering on parameters of type Option[A], such that if the value is Some[A] a filter condition is added, but the condition is omitted when it's None? Similar to e.g. filterOpt in Slick.
5 replies
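(As far as I recall, the dynamic query API has filterOpt for exactly this; a sketch only, assuming a Person table with a name column:)

import ctx._

// Adds the predicate only when nameOpt is defined; otherwise leaves the query unfiltered.
def people(nameOpt: Option[String]) =
  run(dynamicQuery[Person].filterOpt(nameOpt)((p, name) => quote(p.name == name)))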
ElectricWound
@ElectricWound
I would like to refactor an application so that results selected from a large Postgres view are processed as a stream, to reduce the memory footprint. My application is built on quill-async-postgres and Akka Streams. Is switching to quill-jdbc-monix and Monix Connect the most concise way to expose the table rows as an Akka Streams source? Can anybody share their experience with the additional refactoring I will have to apply to other existing DAOs that use the quill-async-postgres context but require no streaming?
b-gyula
@b-gyula

I know Quill deeply analyzes what's used in map, but is there a way to turn that off? E.g. in the following case it should just recognize that the result must contain the columns id, colA, colB.

case class A(id: Int, colA: String)

case class B(colB: Int, as: Seq[A])

case class Table(id: Int, colB: Int, colA: String)

ctx.run( query[Table].map(r => B(r.colB, Seq(A(r.id, r.colA)))))

I get instead:
exception during macro expansion: scala.reflect.macros.TypecheckException: package scala.collection is not a value
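(I'm not aware of a switch for this, but a common workaround, sketched with the classes above, is to keep the quotation to plain columns and assemble the nested structure outside the query.)

// Query only the flat columns, then build B/A on the Scala side.
val rows: List[(Int, Int, String)] =
  ctx.run(query[Table].map(r => (r.id, r.colB, r.colA)))

val result: List[B] =
  rows.map { case (id, colB, colA) => B(colB, Seq(A(id, colA))) }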

Remi Guittaut
@remiguittaut
Hi guys. I was wondering about the benefit of using zio-quill; does it just remove the need for zio-interop-cats?
redk0m
@redk0m_twitter
case class Person(lastVisit: String)
java.lang.UnsupportedOperationException: Can't read LocalDateValue [value=2000-07-23] as String
but in the database this column is of type Date; what type should I use, or how do I convert it to a String?
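(The usual fix is to let the field type match the column instead of forcing a String; a sketch, with the caveat that the exact date type Quill decodes depends on the module in use, e.g. java.time.LocalDate for the JDBC contexts.)

import java.time.LocalDate
import java.time.format.DateTimeFormatter

case class Person(lastVisit: LocalDate) // matches the DATE column

// Convert to String on the Scala side when needed:
def formatted(p: Person): String =
  p.lastVisit.format(DateTimeFormatter.ISO_LOCAL_DATE) // e.g. "2000-07-23"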
Simon Parten
@Quafadas

Hi, I'm trying to learn Quill (3.7.0) against a Postgres database.


val ctx = new PostgresJdbcContext(LowerCase, new HikariDataSource(config))
import ctx._ 
val check = ctx.quote(query[City]
    .sortBy(_.population)(Ord.descNullsLast).take(5))

ctx.run(check)

val mexCity = 
    ctx.quote(query[City])
    .filter(c => c.name like "%mex%" )

ctx.run(mexCity)

The first of these queries works well. The second fails at runtime with the following message.

Found the following free variables: IdentName(mexCity).
Quotations can't reference values outside their scope directly.
In order to bind runtime values to a quotation, please use the method `lift`.
Example: `def byName(n: String) = quote(query[Person].filter(_.name == lift(n)))`

As far as I can tell, it is consistent with the docs...
https://getquill.io/#quotation-sql-specific-operations-like
Anyone have ideas on what I might be doing wrong here?

Urgh. Foolishness...

val mexCity = 
    ctx.quote(query[City]
    .filter(c => c.name like "%mex%" ))

The error message was in fact a good one, telling me I had the brackets in the wrong place. Sorry for the spam.