Alexander Ioffe
@deusaquilus
I think a sophisticated system of query hinting would solve 95% of problems, i.e. something like:
select foo, bar from #cache(select bar, baz from someplace) as s join something sn on #index(sn.foo = bar)
Alexander Ioffe
@deusaquilus
Anyhow, I really don't like liftQuery, actually. I think it should be replaced with liftDataset in Spark, plus an inLifted operator
(e.g. people.filter(p => p.name inLifted set))
Maybe inLiftedSet
Then there should be a liftUnest which does something like we did above
It would be really nice to have just people.filter(p => p.name inSet lift(set)) though
Maybe for Dotty I could do that
I already have multiple interpretations of lift in Dotty
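For reference, the membership test that inLifted would sugar over can already be written with liftQuery plus contains, which Quill renders as a SQL IN clause. A minimal sketch against a mirror context:

import io.getquill._

val ctx = new SqlMirrorContext(PostgresDialect, Literal)
import ctx._

case class Person(name: String, age: Int)

val names = Set("Joe", "Jill")

// Today's spelling: lift the collection with liftQuery and use
// `contains`, which becomes a `WHERE p.name IN (...)` clause in SQL.
val q = quote {
  query[Person].filter(p => liftQuery(names).contains(p.name))
}
ctx.run(q)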
Li Haoyi
@lihaoyi-databricks
lol, I feel like 80% of my Postgres performance-optimization efforts are just splitting up big queries into smaller ones and suffering the additional (unnecessary) round trips just so the query planner doesn't do something stupid again
I just had to do that again
"filtering a few dozen items from one table and joining into two other tables, time to do three huge table scans"
Alexander Ioffe
@deusaquilus
Reposting here. Quill ZIO JDBC and Quill ZIO Cassandra are both out. They will become available on repo1.maven.org as soon as their next re-index happens.
(right now you can get them from https://oss.sonatype.org/)
Alexander Ioffe
@deusaquilus

doing something like infix"""SELECT * from employee where name = ppp for update""".as[Query[Employee]] maybe?

@juanux Have a look at infix. There's a FOR UPDATE example in our example code.
https://getquill.io/#extending-quill-infix
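For convenience, the FOR UPDATE example from those docs looks roughly like this (a sketch assuming import ctx._ and an Employee case class in scope):

// Extend Query with a forUpdate combinator via infix.
implicit class ForUpdate[T](q: Query[T]) {
  def forUpdate = quote(infix"$q FOR UPDATE".as[Query[T]])
}

val locked = quote {
  query[Employee].filter(e => e.name == "Joe").forUpdate
}
// renders as: SELECT ... FROM Employee e WHERE e.name = 'Joe' FOR UPDATE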

@TheMover Contains on multiple columns doesn't work. Sorry.
druhinsgoel
@druhinsgoel
@deusaquilus I have the following dynamic-query function that worked fine on 3.5.3 but throws a nasty error at runtime in any subsequent version:
val ctx = new SqlMirrorContext(PostgresDialect, SnakeCase)

import ctx._

case class Person(id: Long, name: String, age: Int)

def queryPerson(
    idOpt: Option[Long] = None
) = {
  ctx.run(
    dynamicQuery[Person]
      .filterOpt(idOpt)((person, id) => quote(person.id == id))
  )
}
scala.ScalaReflectionException: class io.getquill.Udt in JavaMirror with ScalaReflectClassLoader(...) not found.
    at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:145)
    at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:29)
    at io.getquill.quat.QuatMakingBase$DefiniteValue$1$$typecreator1$1.apply(QuatMaking.scala:238)
    at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:237)
    ...
That's the error ^^
Am I doing something wrong or is this a regression?
Alexander Ioffe
@deusaquilus
Ouch!
I think that's an issue trying to figure out the Quat
it's probably a dynamicQuery regression
could you file it as an issue?
wait a second
try lifting idOpt
def queryPerson(
    idOpt: Option[Long] = None
) = {
  ctx.run(
    dynamicQuery[Person]
      .filterOpt(lift(idOpt))((person, id) => quote(person.id == id))
  )
}
druhinsgoel
@druhinsgoel
That doesn't work either
druhinsgoel
@druhinsgoel
I'll file it as an issue
druhinsgoel
@druhinsgoel
@deusaquilus What is the correct way to create an infix for a dynamic query? For example, if I want to do a FOR UPDATE on a dynamic query:
def queryPerson(
    idOpt: Option[Long] = None
) = {
  ctx.run(
    dynamicQuery[Person]
      .filterOpt(lift(idOpt))((person, id) => quote(person.id == id))
      .forUpdate
  )
}
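One possible workaround while the dynamic-query answer is unclear (a sketch, not an official recipe): if the dynamic API isn't strictly required, the optional filter can be emulated statically with Option.forall over a lifted option, and the ForUpdate infix extension sketched above then applies. The forall encoding changes the generated WHERE clause slightly, so verify the SQL it produces:

def queryPerson(idOpt: Option[Long] = None) =
  ctx.run(
    query[Person]
      // static stand-in for filterOpt: no-op filter when idOpt is None
      .filter(p => lift(idOpt).forall(_ == p.id))
      .forUpdate // the infix extension sketched above
  )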
David Bouyssié
@david-bouyssie
Hi there. I'm searching for documentation about RunQueryResult.
I'm wondering why it is expressed as a List instead of an Iterable.
It makes me think that all the records are gathered eagerly by ctx.run, whereas I would prefer lazy loading/iteration.
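For what it's worth, run does materialize the full result before returning it; the lazy alternative in the Monix (and ZIO) contexts is the stream method. A sketch, with the context constructor elided:

import io.getquill._
import monix.reactive.Observable

// assuming a configured Monix JDBC context
lazy val ctx: PostgresMonixJdbcContext[SnakeCase] = ???
import ctx._

case class Person(name: String, age: Int)

// stream(...) yields rows incrementally as an Observable instead of
// building the full List that run(...) returns.
val people: Observable[Person] = ctx.stream(query[Person])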
Philippe Derome
@phderome
Can I use dynamic queries over JDBC (preferably in stream mode, though that's an orthogonal concern) where the number of columns and their types are well defined at runtime for each row, but not at compile time? I'd know which binding to do for each column on a java.sql.ResultSet, and from each row I would get a List[Any] to represent the row. Is there support for that?
Philippe Derome
@phderome
Maybe the solution would entail casting to ::text within the SQL, so we'd have a uniform list of strings to return (I'm not interested in the actual type-safe values, just the string representations). So, something like .as[Query[List[String]]]
e-Evolution
@e-Evolution
Hi everybody,

I am testing the new ZIO implementation, but I have a problem: the implicit val organizationSchemaMeta with which I mapped the database columns does not work.

Unlike the other implementations, I am using context.stream:

context.stream(query[Organization].filter(_.organizationId == lift(organizationId)))
Any idea how to solve this? Here is my code:

implicit val organizationSchemaMeta: context.SchemaMeta[Organization] =
  schemaMeta[Organization](
    "AD_Org",
    _.tenantId -> "AD_Client_ID",
    _.organizationId -> "AD_Org_ID",
    _.isActive -> "IsActive",
    _.created -> "Created",
    _.createdBy -> "CreatedBy",
    _.updated -> "Updated",
    _.updatedBy -> "UpdatedBy",
    _.value -> "Value",
    _.name -> "Name",
    _.description -> "Description",
    _.replicationStrategyId -> "AD_ReplicationStrategy_ID",
    _.parentOrganizationId -> "Parent_Org_ID",
    _.uuid -> "UUID"
  )

def getById(organizationId: Id): ZStream[OrganizationZioRepository with QConnection, Throwable, Organization] =
  context.stream(query[Organization].filter(_.organizationId == lift(organizationId)))
The error is:
╠─A checked error was not handled.
║ org.postgresql.util.PSQLException: ERROR: column x14.clientid does not exist
║ Position: 8
so it is not mapping the database columns
Oleg Pyzhcov
@oleg-py
That could happen if, e.g., clientid is defined as a field of the Organization case class but you don't have it remapped in this schema meta
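Concretely, with clientId as a hypothetical field name for illustration: any field whose column name departs from the naming strategy must appear in the schemaMeta, otherwise Quill falls back to the strategy-derived name (here "clientid"):

implicit val organizationSchemaMeta: context.SchemaMeta[Organization] =
  schemaMeta[Organization](
    "AD_Org",
    _.clientId -> "AD_Client_ID", // hypothetical: the field the error points at
    _.organizationId -> "AD_Org_ID"
    // ... remaining mappings as in the snippet above
  )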
e-Evolution
@e-Evolution
ok let me check
e-Evolution
@e-Evolution
@oleg-py Thanks a lot, my mapping had an error; I've fixed it now.
Adriani Furtado
@Adriani277
Hey all, I have come across the Codegen and have a few questions about it.
  1. I am using new SimpleJdbcCodegen(datasource, "my.package"), and my datasource points at a specific schema. However, when the code gets generated it pulls in every schema on my server. Am I doing something wrong here?
  2. My next question, assuming I have solved issue 1: is it possible to filter which tables in a schema get generated? We have a schema of about 20 tables and I would like to generate case classes for only 4 of them (see the sketch below).
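Not an authoritative answer, but the generator exposes customization hooks that should cover question 2; the member names below (filter, RawSchema, JdbcTableMeta, JdbcColumnMeta) are from memory and should be treated as assumptions to verify against the quill-codegen sources:

import io.getquill.codegen.jdbc.SimpleJdbcCodegen

val wantedTables = Set("person", "address", "invoice", "order_line") // hypothetical names

// Sketch under the stated assumptions: restrict generation to one schema
// and a whitelist of tables by overriding the generator's filter hook.
val gen = new SimpleJdbcCodegen(datasource, "my.package") {
  override def filter(tc: RawSchema[JdbcTableMeta, JdbcColumnMeta]): Boolean =
    tc.table.tableSchem.contains("my_schema") &&
      wantedTables.contains(tc.table.tableName.toLowerCase)
}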
Mathieu Prevel
@mprevel

Hi everyone,

I'm trying to use Quill with Monix to get a stream of DB objects where rows are filtered on an enum column.

When not using a stream I am able to get the expected results from the database.

query[MyTable]
  .filter(_.myField == lift(value))

Stream 1:
When filtering the query before using it as a stream, it compiles but crashes at runtime.

stream(
  query[MyTable]
    .filter(_.myField == lift(value))
)
'ERROR: operator does not exist: t_my_enum = character varying
Hint: No operator matches the given name and argument types. You might need to add explicit type casts.
Position: 198'. Aborting connection.
...
Caused by: [CIRCULAR REFERENCE: org.postgresql.util.PSQLException: ERROR: operator does not exist: t_my_enum = character varying
...

Stream 2:
When filtering the stream itself (not at the query level), it works as expected.

stream(query[MyTable])
  .filter(_.myField == value)

Is it expected for stream 1 to fail?
Should I use the stream 2 version? Does that mean the whole table will be streamed to the application and filtered by the application?
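On stream 2: yes, the filter there runs on the Observable, so the whole table is fetched and filtered application-side; prefer filtering in the query. The usual fix for the operator-does-not-exist error in stream 1 is to bind the parameter as java.sql.Types.OTHER so Postgres resolves it against the enum type instead of character varying. A sketch, assuming MyEnum's toString matches the enum labels (a matching Decoder is needed for reads):

import java.sql.Types

// Inside the context (import ctx._): bind MyEnum by name with Types.OTHER
// so the parameter arrives as t_my_enum rather than character varying.
implicit val myEnumEncoder: Encoder[MyEnum] =
  encoder(Types.OTHER, (index, value, row) =>
    row.setObject(index, value.toString, Types.OTHER))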

dassum
@dassum
How can I use usingTtl for my Cassandra inserts when I am using dynamicQuery?
Akhil Kodali
@akhil

Hi, I am using MysqlMonixJdbcContext and trying to do a batch insert with

run(
  liftQuery(images).map(i => query[Image].insert(i).onConflictIgnore)
)

But I get the following error

exception during macro expansion: 
scala.reflect.macros.TypecheckException: Case class type io.getquill.Insert[Image] has no values
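The documented batch-insert shape uses foreach rather than map; with map, Quill tries to treat Insert[Image] as a result row to decode, which is consistent with the "has no values" macro error. A sketch with the same names:

run(
  liftQuery(images).foreach(i => query[Image].insert(i).onConflictIgnore)
)

On MySQL, onConflictIgnore renders as INSERT IGNORE, and it stays on the per-row insert inside the batch.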
Artūras Šlajus
@arturaz
Is it possible to write extension methods for Quill contexts?
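One pattern that works is the "modular contexts" approach: write the extension methods in a trait whose self-type is Context, then mix it into a concrete context. A sketch against a mirror context:

import io.getquill._
import io.getquill.context.Context
import io.getquill.idiom.Idiom

case class Person(name: String, age: Int)

// quote/query/lift are all in scope through the self-type,
// without committing to a particular dialect or naming strategy.
trait PersonQueries[I <: Idiom, N <: NamingStrategy] { this: Context[I, N] =>
  def peopleOlderThan = quote { (age: Int) =>
    query[Person].filter(p => p.age > age)
  }
}

// Mix into any concrete context with matching type parameters:
val ctx = new SqlMirrorContext[PostgresDialect, Literal](PostgresDialect, Literal)
  with PersonQueries[PostgresDialect, Literal]
import ctx._

ctx.run(peopleOlderThan(lift(18)))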