Artūras Šlajus
@arturaz
          val io_ = dbCtx.createNewUserLoginResponseIO(args)
          val result = performIO(io_).getOrThrowLeft
[error] W:\work\unity\sob-review\server\app\src\main\scala\app\commands\DevGenerateUsers.scala:33:34: type mismatch;
[error]  found   : _1.ctx.IO[scala.util.Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],_1.ctx.Effect.Read with _1.ctx.Effect.Write with _1.ctx.Effect.Transaction] where val _1: app.db.MyDBContextExts.MyDBContextExts[app.db.MyDBContext]
[error]  required: dbCtx.IO[?, _]
[error] Error occurred in an application involving default arguments.
[error]           val result = performIO(io_).getOrThrowLeft
[error]                                  ^
[info] _1.ctx.IO[scala.util.Either[String,app.db.CreateNewUserIOResult[app.verticles.http.Routes.Login.Response.Success]],_1.ctx.Effect.Read with _1.ctx.Effect.Write with _1.ctx.Effect.Transaction] <: dbCtx.IO[?, _]?
[info]   _1.ctx.type = dbCtx.type?
[info]   false
[info] false
Oleg Pyzhcov
@oleg-py
well, I'm out of dark magics that could help :c
Artūras Šlajus
@arturaz
Heh, thanks for trying though :)
I can always design it the other way around, but I'm just very confused about why the type system thinks the type suddenly changes. I thought returning a dependent type should make it the same type.
Zett98
@Zett98
Any examples on how to integrate quill-ndbc and cats effect?
Alexis Durieux
@alexisdurieux
Hello! I have a List[(Float, Float)] representing an array of numrange in Postgres. I am having an issue with the decoder: the insertion works fine, but the returning is not working. Any examples I could take inspiration from? Thank you and happy Easter!
implicit def arrayNumrangeEncoder[Col <: Seq[(Float, Float)]]: Encoder[Col] =
  arrayRawEncoder[(Float, Float), Col]

implicit def arrayNumrangeDecoder[Col <: Seq[(Float, Float)]](
    implicit bf: CBF[(Float, Float), Col]
): Decoder[Col] =
  arrayRawEncoder[(Float, Float), Col]
2 replies
discobaba
@uncleweirdo_twitter
Just getting started here; where do I put the sortBy(d.processDt), filter, etc.?
val q = quote {
  for {
    z <- query[ZipIndex]
    f <- query[GeneFile] if (f.geneFileId == z.geneFileId)
    d <- query[GeneDir] if (d.geneDirId == f.geneDirId)
  } yield (z)
}
Oleg Pyzhcov
@oleg-py
you can't sort in a for comprehension; the if there describes a filter
discobaba
@uncleweirdo_twitter
so what's the correct approach with a bunch of joins?
Oleg Pyzhcov
@oleg-py
you can still do e.g. q.sortBy(_.something) on the quoted bit, you can use explicit .join(...).on(...) syntax instead of the for comprehension, or you can wrap it all in giant parentheses.
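A rough sketch of the "wrap it all in parentheses" approach, reusing the entity and field names from the snippets above (they may not match the real schema) and assuming the usual import ctx._ is in scope:

val q = quote {
  (for {
    z <- query[ZipIndex]
    f <- query[GeneFile] if (f.geneFileId == z.geneFileId)
    d <- query[GeneDir] if (d.geneDirId == f.geneDirId)
  } yield (z, d))                                     // keep d reachable for the sort
    .sortBy { case (z, d) => d.processDt }(Ord.desc)  // sort on the quoted result
    .map { case (z, d) => z }                         // then project back to ZipIndex
}
// ctx.run(q)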
discobaba
@uncleweirdo_twitter
How do you do table aliases where that is required, e.g. sortBy(t.something)?
Oleg Pyzhcov
@oleg-py
quill automatically uses aliases
you refer to stuff via variables in for comprehensions/lambdas
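A small illustration of that (someId is a hypothetical runtime value; the field name follows the snippets above): the lambda variable is what becomes the table alias in the generated SQL.

val q = quote {
  query[GeneFile].filter(f => f.geneFileId == lift(someId))
}
// ctx.run(q) emits roughly: SELECT f.* FROM GeneFile f WHERE f.geneFileId = ?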
discobaba
@uncleweirdo_twitter
I'm not getting how I would distinguish between fields with the same name in ZipIndex and GeneFile in the 3rd line, for instance, or how I would tell it which table processDt is coming from in the sortBy.
run(query[ZipIndex].
    join(query[GeneFile]).on((z, f) => z.geneFileId == f.geneFileId).
    join(query[GeneDir]).on((f, d) => f.geneDirId == d.geneDirId).
    filter(uid == lift(uid)).
    sortBy(_.processDt)(Ord.desc).take(1))
Oleg Pyzhcov
@oleg-py
that won't compile because join gives you a tuple2. Join twice gives you a tuple inside a tuple. You would be writing something like e.g. .filter { case ((z, f), d) => d.uid == lift(uid) }.sortBy { case ((z, f), d) => d.processDt }(Ord.desc)
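Putting those pieces together, a hedged sketch of the whole chain (uid is assumed to be a runtime value in scope, and the uid column is assumed to live on GeneDir, as in the reply above):

val q = quote {
  query[ZipIndex]
    .join(query[GeneFile]).on((z, f) => z.geneFileId == f.geneFileId)
    .join(query[GeneDir]).on { case ((z, f), d) => f.geneDirId == d.geneDirId }
    .filter { case ((z, f), d) => d.uid == lift(uid) }
    .sortBy { case ((z, f), d) => d.processDt }(Ord.desc)
    .take(1)
}
// ctx.run(q)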
discobaba
@uncleweirdo_twitter
ah, that gives me a little traction. Thanks
N.S. Cutler
@godenji
Been quite awhile, but I believe you can just write:
val q = quote {
  (for {
    (zi, gf) <- ZipIndex join GeneFile on(_.geneFileId == _.geneFileId)
         gd  <- GeneDir join(_.geneDirId == gf.geneDirId) if (uid == lift(uid))
  } yield (zi, gf, gd)).sortBy({ case (_,_,d) => d.processDt })(Ord.desc)
}
run(q)
b-gyula
@b-gyula
Is there a plan to implement common table expressions? getquill/quill#580
corentin
@corenti13711539_twitter
Does Quill support filtering on parameters of type Option[A], such that if the value is Some[A] a filter condition is added, but the condition is omitted when it's None? Similar to e.g. filterOpt in Slick.
5 replies
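A hedged sketch of two ways this is commonly handled (Person is a stand-in entity, and import ctx._ is assumed): either pick the quotation at runtime, or use the dynamic query DSL, whose filterOpt is the closest analogue to Slick's.

case class Person(name: String, age: Int)

// option 1: choose between two quotations at runtime (executed as a dynamic query)
def byOptionalName(nameOpt: Option[String]) = nameOpt match {
  case Some(n) => quote(query[Person].filter(p => p.name == lift(n)))
  case None    => quote(query[Person])
}

// option 2: the dynamic query DSL's filterOpt adds the condition only when the Option is defined
def byOptionalName2(nameOpt: Option[String]) =
  dynamicQuery[Person].filterOpt(nameOpt)((p, n) => quote(p.name == n))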
ElectricWound
@ElectricWound
I would like to refactor an application so that the results of a selection from a large Postgres view are processed as a stream, to reduce the memory footprint. My application is built on quill-async-postgres and Akka Streams. Is switching to quill-jdbc-monix and Monix Connect the most concise way to provide the table rows as an Akka Streams source? Can anybody share their experience of the additional refactoring I would have to apply to other existing DAOs that use the quill-async-postgres context but don't require streaming?
b-gyula
@b-gyula

I know quill deeply analyzes what's used in map, but is there a way to turn that off? E.g. in the following case it should just recognize that the result must contain the columns id, colA, colB

case class A(id: Int, colA: String)

case class B(colB: Int, as: Seq[A])

case class Table(id: Int, colB: Int, colA: String)

ctx.run( query[Table].map(r => B(r.colB, Seq(A(r.id, r.colA)))))

I get instead:
exception during macro expansion: scala.reflect.macros.TypecheckException: package scala.collection is not a value

Remi Guittaut
@remiguittaut
Hi guys. I was wondering about the benefit of using zio-quill; it just removes the need for zio-interop-cats, doesn't it?
redk0m
@redk0m_twitter
case class Person(lastVisit: String)
java.lang.UnsupportedOperationException: Can't read LocalDateValue [value=2000-07-23] as String
but in the database this column has a Date type; what type must I use, or how do I convert it to a String?
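A hedged suggestion, keeping the names from the snippet above: mirror the DATE column with a date type in the case class and convert to String in plain Scala afterwards.

import java.time.LocalDate

case class Person(lastVisit: LocalDate)  // matches the DATE column directly

// format after the query has run, if a String is really needed
def lastVisitString(p: Person): String = p.lastVisit.toString  // e.g. "2000-07-23"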
Quafadas
@Quafadas

Hi, I'm trying to learn quill (3.7.0) against a postgres database.


val ctx = new PostgresJdbcContext(LowerCase, new HikariDataSource(config))
import ctx._ 
val check = ctx.quote(query[City]
    .sortBy(_.population)(Ord.descNullsLast).take(5))

ctx.run(check)

val mexCity = 
    ctx.quote(query[City])
    .filter(c => c.name like "%mex%" )

ctx.run(mexCity)

The first of these queries works well. The second fails at runtime with the following message.

Found the following free variables: IdentName(mexCity).
Quotations can't reference values outside their scope directly.
In order to bind runtime values to a quotation, please use the method `lift`.
Example: `def byName(n: String) = quote(query[Person].filter(_.name == lift(n)))`

As far as I can tell, it is consistent with the docs...
https://getquill.io/#quotation-sql-specific-operations-like
Anyone have ideas on what I might be doing wrong here?

Urgh. Foolishness...

val mexCity = 
    ctx.quote(query[City]
    .filter(c => c.name like "%mex%" ))

The error message was in fact a good one, telling me I had the brackets in the wrong place. Sorry for the spam.

Janghwan Lee
@janghwan

hi I would like to chain filter clauses like

query[Person].filter(p => 
  map.collect({
    case ("name", names)  => liftQuery(names).contains(p.name)
    case ("city", cities) => liftQuery(cities).contains(p.city)
  }).foldLeft(lift(false))(_ || _)
)

but this causes Tree 'map.collect ... ' can't be parsed to 'Ast'
is there any other way to combine filter clauses dynamically?
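One hedged alternative: do the collect/fold in plain Scala over quotations instead of inside the quotation (assuming map is a Map[String, Seq[String]] and Person is the entity from the snippet above); the combined quotation then runs as a dynamic query.

val base = quote(query[Person])

val byName = map.get("name").fold(base) { names =>
  quote(base.filter(p => liftQuery(names).contains(p.name)))
}
val byCity = map.get("city").fold(byName) { cities =>
  quote(byName.filter(p => liftQuery(cities).contains(p.city)))
}
// ctx.run(byCity)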

Luis Gustavo
@lgos44

Hi all!
I need to implement a sortBy with dynamic runtime parameters. Currently I'm using:

val query = dSchema.sortBy(item => {
      orderBy.flatMap(_.headOption.map(_.columnName)) match {
        case Some("createdAt") => item.audit.createdAt
        case Some("updatedAt") => item.audit.modifiedAt
        case _ => item.audit.modifiedAt
      }
    }) {
      orderBy.flatMap(_.headOption.map(_.direction)) match {
        case Some(OrderDirection.Asc) => Ord.asc
        case Some(OrderDirection.Desc) => Ord.desc
        case _ => Ord.asc
      }
    }

But we will need multiple columns in the sortBy at the same time, so the match/case approach seems unviable.
I've been experimenting with infix. Is there a way to use infix with sortBy? Can anyone provide an example?
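Not the infix answer, but a hedged, partial sketch following the snippet above: sortBy also accepts a tuple with a matching Ord(...), so one option is to build one quotation per supported column/direction combination and pick it at runtime.

val byCreatedThenModified =
  dSchema.sortBy(item => (item.audit.createdAt, item.audit.modifiedAt))(Ord(Ord.asc, Ord.desc))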

Yisrael Union
@yisraelU
I'm having some trouble with the SQL that is generated when using an implicit class to compare dates.
I ended up with SQL that looks like this:
  1 = event_datetime >= ?
I'm using the comparison operators suggested on the Quill docs page. The error I receive states that this is invalid SQL.
Quafadas
@Quafadas
When using quill with Metals in VS Code, the quill query compilation to SQL shows up as a "problem" in the Problems window
is this intentional?
Zulek
@Zulek

Hello, I'm using applicative joins (ver 2.5.4).
Somehow the outer select does not contain all the columns of the table, but the inner select (in the join statements) does, and as a result I get "column ... does not exist".

Is there a way to force selecting all columns from a big applicative join with filters?

Zulek
@Zulek
or the joins reference columns that are not in the scope of a particular join(...) statement
Alexander Ioffe
@deusaquilus
@Zulek There were various bugs in older versions of Quill with columns being excluded. Are these columns used in a Group By (or other aggregations), by any chance?
5 replies
Alexander Ioffe
@deusaquilus
@all Is anyone using AnyVal encoders? Are your AnyVal instances also case classes or not? In Dotty-Quill they might need to be.
1 reply
SemanticBeeng
@SemanticBeeng
What is the best way to use quill-spark with Spark 3.1.1?
(forgive me if already discussed)
Lachezar Yankov
@lachezar

Hi, if I have two value types which have the same underlying type (String), is there any way to perform "column update" (assign the value from one of the columns to the other)?

case class Email(email: String) extends AnyVal
case class EmailVerified(emailVerified: String) extends AnyVal

...
... .update(table => table.emailVerified -> table.email, ...

besides infix

Hee Yeon Cho
@aposto
Hello, how do I get a list-based update to compile?
dynamicQuery[TestEntity].update(set("i", 1), set("l", 2L)) 
// compile error
dynamicQuery[TestEntity].update(list.map(a => set(a._1, a._2):_*))
Alexander Ioffe
@deusaquilus
@lachezar This is basically an update-with-join use case which unfortunately we don't support yet.
Oleg Pyzhcov
@oleg-py
How do I select by a collection of compound keys? I have something like a runtime Vector[((UUID, Int), (UUID, Int))] and need to match on 4 corresponding columns. Using jdbc-monix and postgres.
Attempting to liftQuery that vector fails with "Can't tokenize a non-scalar lifting".
Dan Ellis
@danellis

I'm trying to use a model that has a java.time.Instant field. I've put this in my context class:

        implicit val encodeInstant = MappedEncoding[Instant, LocalDateTime](i => LocalDateTime.ofInstant(i, ZoneOffset.UTC))
        implicit val decodeInstant = MappedEncoding[LocalDateTime, Instant](_.toInstant(ZoneOffset.UTC))

but I'm still getting the "Can't find implicit Decoder[java.time.Instant]" error

I've also tried putting it after my import ctx._
Remi Guittaut
@remiguittaut
Hi guys, in quill 3.7.0, for whatever reason, the generated SQL queries are not displayed at compile time anymore. Do you have the same problem? (I don't have -Dquill.macro.log=false in my build definition.)
nafg
@nafg
Are there any sample projects besides quill-example and play-quill-jdbc repos? How do you maintain sample projects? (In Slick there's a lot of code in the SBT build dealing with it, which I'd like to simplify or eliminate, and I'm trying to find out what other people do.) Do you automate keeping them up to date, and if so, how? /cc @deusaquilus
Philippe Derome
@phderome
I am getting a macro TypecheckException for no implicit Decoder when using H2ZioJdbcContext. What is it generally indicative of?
[error] scala.reflect.macros.TypecheckException: Can't find implicit `Decoder[Foo]`. Please, do one of the following things:
[error] 1. ensure that implicit `Decoder[Foo]` is provided and there are no other conflicting implicits;
[error] 2. make `Foo` `Embedded` case class or `AnyVal`.
[error]                    
[error]         at scala.reflect.macros.contexts.Typers.$anonfun$typecheck$3(Typers.scala:44)
[error]         at scala.reflect.macros.contexts.Typers.$anonfun$typecheck$2(Typers.scala:38)
[error]         at scala.reflect.macros.contexts.Typers.doTypecheck$1(Typers.scala:37)
[error]         at scala.reflect.macros.contexts.Typers.$anonfun$typecheck$7(Typers.scala:50)
[error]         at scala.reflect.internal.Trees.wrappingIntoTerm(Trees.scala:1891)
[error]         at scala.reflect.internal.Trees.wrappingIntoTerm$(Trees.scala:1888)
[error]         at scala.reflect.internal.SymbolTable.wrappingIntoTerm(SymbolTable.scala:28)
[error]         at scala.reflect.macros.contexts.Typers.typecheck(Typers.scala:50)
[error]         at scala.reflect.macros.contexts.Typers.typecheck$(Typers.scala:32)
[error]         at scala.reflect.macros.contexts.Context.typecheck(Context.scala:18)
[error]         at scala.reflect.macros.contexts.Context.typecheck(Context.scala:18)
[error]         at io.getquill.context.QueryMacro.expandQueryWithMeta(QueryMacro.scala:125)
[error]         at io.getquill.context.QueryMacro.expandQuery(QueryMacro.scala:51)
[error]         at io.getquill.context.QueryMacro.runQuery(QueryMacro.scala:34)
never mind that, I have some more basic questions/issues to address/understand. I'll ask something simpler if required.
Philippe Derome
@phderome
It seems to be about invalid usage of nested case classes; it makes sense that I'd need a Decoder or Encoder for nested case classes.
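For reference, a minimal sketch of the "Embedded" route that the error message suggests (Foo, Bar and their fields are placeholders):

import io.getquill.Embedded

case class Foo(a: String, b: Int) extends Embedded
case class Bar(id: Int, foo: Foo)  // rows flatten to the columns id, a, b

// ctx.run(query[Bar]) can now derive the encoders/decoders for the nested Foo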