Li Haoyi
@lihaoyi-databricks
I don't really see how a DBA would deal with this stuff any better than I would though
Alexander Ioffe
@deusaquilus
He wouldn't. That's the problem
Li Haoyi
@lihaoyi-databricks
in the end it's still "postgres decided to go haywire in production because N rows became N+1"
I complain a lot about postgres but I don't imagine MySQL is any better, it's not known for having a lack of footguns
Alexander Ioffe
@deusaquilus
I think MySQL is worse
nafg
@nafg
Postgres is way better than MySQL
Also, MySQL is way better than it used to be...
At least that's my understanding
Alexander Ioffe
@deusaquilus
SQL Server and Oracle are a bit more consistent in how they handle workloads, but they'll cost you an arm and a leg (and your immortal soul for the latter as well)
I think a sophisticated system of query hinting would solve 95% of problems. I.e. something like:
select foo, bar from #cache(select bar, baz from someplace) as s join something sn on #index(sn.foo = bar)
Alexander Ioffe
@deusaquilus
Anyhow, I really don't like liftQuery actually. I think it should be replaced with liftDataset in Spark, and the inLifted operator
(e.g. people.filter(p => p.name inLifted (set)) )
Maybe inLiftedSet
Then there should be liftUnest which does something like we did above
it would be really nice to have just people.filter(p => p.name inSet (lift(set))) though
Maybe for Dotty I could do that
I already have multiple interpretations of lift in Dotty
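For context, the spelling that exists in Quill today goes through liftQuery plus contains, which Quill compiles down to a SQL IN clause; the proposed inLifted/inLiftedSet would be sugar over this pattern. A minimal sketch against a mirror context (Person and the names parameter are illustrative, not from the chat):

```scala
import io.getquill.{PostgresDialect, SnakeCase, SqlMirrorContext}

val ctx = new SqlMirrorContext(PostgresDialect, SnakeCase)
import ctx._

case class Person(id: Long, name: String)

// Today's spelling: lift the whole collection, then test membership
// with `contains`; Quill translates this into `name IN (?)`.
def byNames(names: Set[String]) =
  ctx.run(query[Person].filter(p => liftQuery(names).contains(p.name)))
```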
Li Haoyi
@lihaoyi-databricks
lol i feel like 80% of my postgres performance optimization efforts are just splitting up big queries into smaller ones and suffering the additional/unnecessary round trips just so the query planner doesn't do something stupid again
I just had to do that
again
"filtering a few dozen items from one table and joining into two other tables, time to do three huge table scans"
Alexander Ioffe
@deusaquilus
Reposting here. Quill ZIO JDBC and Quill ZIO Cassandra are both out. They will become available on repo1.maven.org as soon as their next re-index happens.
(right now you can get them from https://oss.sonatype.org/)
Alexander Ioffe
@deusaquilus

doing something like infix"""SELECT * from employee where name = ppp for update""".as[Query[Employee]] maybe?

@juanux Have a look at infix. There's a FOR UPDATE example in our example code.
https://getquill.io/#extending-quill-infix

@TheMover Contains on multiple columns doesn't work. Sorry.
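For reference, the FOR UPDATE infix pattern the extending-quill-infix page describes can be sketched against a mirror context like this (the Employee case class is illustrative):

```scala
import io.getquill.{PostgresDialect, SnakeCase, SqlMirrorContext}

val ctx = new SqlMirrorContext(PostgresDialect, SnakeCase)
import ctx._

case class Employee(id: Long, name: String)

// Implicit-class infix in the style of the Quill docs: wraps any
// query and appends FOR UPDATE to the generated SQL.
implicit class ForUpdate[T](q: Query[T]) {
  def forUpdate = quote(infix"$q FOR UPDATE".as[Query[T]])
}

val locked = quote(query[Employee].filter(e => e.name == "Joe").forUpdate)
```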
druhinsgoel
@druhinsgoel
@deusaquilus I have the following dynamic query function that worked fine on 3.5.3 but throws a nasty error at runtime in any subsequent version:
val ctx = new SqlMirrorContext(PostgresDialect, SnakeCase)

import ctx._

case class Person(id: Long, name: String, age: Int)

def queryPerson(
    idOpt: Option[Long] = None
) = {
  ctx.run(
    dynamicQuery[Person]
      .filterOpt(idOpt)((person, id) => quote(person.id == id))
   )
}
scala.ScalaReflectionException: class io.getquill.Udt in JavaMirror with ScalaReflectClassLoader(file:/home/sbtRunnerContainer/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-reflect/2.13.4/scala-reflect-2.13.4.jar parent = WrappedClassLoader(CachedClassloader {
  parent: TestInterfaceLoader(/home/sbtRunnerContainer/.sbt/boot/scala-2.12.10/org.scala-sbt/sbt/1.3.13/test-interface-1.0.jar,xsbt.boot.BootFilteredLoader@74e28667)
  urls:
    file:/home/sbtRunnerContainer/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.4/scala-library-2.13.4.jar
})) of type class sbt.internal.ScalaReflectClassLoader with classpath [file:/home/sbtRunnerContainer/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-reflect/2.13.4/scala-reflect-2.13.4.jar] and parent being WrappedClassLoader(CachedClassloader {
  parent: TestInterfaceLoader(/home/sbtRunnerContainer/.sbt/boot/scala-2.12.10/org.scala-sbt/sbt/1.3.13/test-interface-1.0.jar,xsbt.boot.BootFilteredLoader@74e28667)
  urls:
    file:/home/sbtRunnerContainer/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.4/scala-library-2.13.4.jar
}) of type class sbt.internal.classpath.WrappedLoader with classpath [file:/home/sbtRunnerContainer/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.4/scala-library-2.13.4.jar] and parent being CachedClassloader {
  parent: TestInterfaceLoader(/home/sbtRunnerContainer/.sbt/boot/scala-2.12.10/org.scala-sbt/sbt/1.3.13/test-interface-1.0.jar,xsbt.boot.BootFilteredLoader@74e28667)
  urls:
    file:/home/sbtRunnerContainer/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.4/scala-library-2.13.4.jar
} of type class sbt.internal.classpath.ClassLoaderCache$Key$CachedClassLoader with classpath [file:/home/sbtRunnerContainer/.cache/coursier/v1/https/repo1.maven.org/maven2/org/scala-lang/scala-library/2.13.4/scala-library-2.13.4.jar] and parent being TestInterfaceLoader(/home/sbtRunnerContainer/.sbt/boot/scala-2.12.10/org.scala-sbt/sbt/1.3.13/test-interface-1.0.jar,xsbt.boot.BootFilteredLoader@74e28667) of type class xsbt.boot.Launch$TestInterfaceLoader$3 with classpath [file:/home/sbtRunnerContainer/.sbt/boot/scala-2.12.10/org.scala-sbt/sbt/1.3.13/test-interface-1.0.jar] and parent being xsbt.boot.BootFilteredLoader@74e28667 of type class xsbt.boot.BootFilteredLoader with classpath [<unknown>] and parent being sun.misc.Launcher$AppClassLoader@70dea4e of type class sun.misc.Launcher$AppClassLoader with classpath [file:/app/sbt/sbt/bin/sbt-launch.jar] and parent being sun.misc.Launcher$ExtClassLoader@3c3d9b6b of type class sun.misc.Launcher$ExtClassLoader with classpath [file:/usr/lib/jvm/java-1.8-openjdk/jre/lib/ext/sunec.jar,file:/usr/lib/jvm/java-1.8-openjdk/jre/lib/ext/sunpkcs11.jar,file:/usr/lib/jvm/java-1.8-openjdk/jre/lib/ext/sunjce_provider.jar,file:/usr/lib/jvm/java-1.8-openjdk/jre/lib/ext/dnsns.jar,file:/usr/lib/jvm/java-1.8-openjdk/jre/lib/ext/cldrdata.jar,file:/usr/lib/jvm/java-1.8-openjdk/jre/lib/ext/jaccess.jar,file:/usr/lib/jvm/java-1.8-openjdk/jre/lib/ext/nashorn.jar,file:/usr/lib/jvm/java-1.8-openjdk/jre/lib/ext/zipfs.jar,file:/usr/lib/jvm/java-1.8-openjdk/jre/lib/ext/localedata.jar] and parent being primordial classloader with boot classpath [/usr/lib/jvm/java-1.8-openjdk/jre/lib/resources.jar:/usr/lib/jvm/java-1.8-openjdk/jre/lib/rt.jar:/usr/lib/jvm/java-1.8-openjdk/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-1.8-openjdk/jre/lib/jsse.jar:/usr/lib/jvm/java-1.8-openjdk/jre/lib/jce.jar:/usr/lib/jvm/java-1.8-openjdk/jre/lib/charsets.jar:/usr/lib/jvm/java-1.8-openjdk/jre/lib/jfr.jar:/usr/lib/jvm/java-1.8-openjdk/jre/classes] not found.
    at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:145)
    at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:29)
    at io.getquill.quat.QuatMakingBase$DefiniteValue$1$$typecreator1$1.apply(QuatMaking.scala:238)
    at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:237)
    ...
That's the error ^^
Am I doing something wrong or is this a regression?
Alexander Ioffe
@deusaquilus
Ouch!
I think that's an issue trying to figure out the Quat
it's probably a dynamicQuery regression
could you file it as an issue?
wait a second
try lifting idOpt
def queryPerson(
    idOpt: Option[Long] = None
) = {
  ctx.run(
    dynamicQuery[Person]
      .filterOpt(lift(idOpt))((person, id) => quote(person.id == id))
   )
}
druhinsgoel
@druhinsgoel
That doesn't work either
druhinsgoel
@druhinsgoel
I'll file it as an issue
druhinsgoel
@druhinsgoel
@deusaquilus What is the correct way to create an infix for a dynamic query? For example, if I want to do a FOR UPDATE on a dynamic query:
def queryPerson(
    idOpt: Option[Long] = None
) = {
  ctx.run(
    dynamicQuery[Person]
      .filterOpt(lift(idOpt))((person, id) => quote(person.id == id))
      .forUpdate
   )
}
David Bouyssié
@david-bouyssie
Hi there. I'm searching for documentation about RunQueryResult.
I'm wondering why it is expressed as a List instead of an Iterable.
This makes me think that all the records are gathered eagerly after ctx.run, whereas I would prefer lazy loading/iteration.
Philippe Derome
@phderome
Can I use dynamic queries over JDBC (in stream mode preferably, but that's an orthogonal concern) where the number of columns and their types are well defined at runtime for each row, but not at compile time? I'd know which binding to do for each column on a Sql.ResultSet, and from each row I would get a List[Any] to represent the row. Is there support for that?
Philippe Derome
@phderome
Maybe the solution would entail casting to ::text within the SQL, so we'd have a uniform list of strings to return (as I am not interested in the actual type-safe values, just the string representations). So something like .as[Query[List[String]]]
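As far as I know Quill can't decode a row into a List[String] of unknown arity, but when the column count is fixed per query, the ::text idea above can be sketched with an infix that decodes a tuple of strings (table and column names here are made up):

```scala
import io.getquill.{PostgresDialect, SnakeCase, SqlMirrorContext}

val ctx = new SqlMirrorContext(PostgresDialect, SnakeCase)
import ctx._

// Cast each selected column to text on the SQL side, then decode the
// row as a fixed-arity tuple of Strings (a List[String] row type
// would need an arity known at compile time anyway).
val asText = quote {
  infix"SELECT id::text, name::text, age::text FROM person"
    .as[Query[(String, String, String)]]
}
```

A truly runtime-variable column list would still have to drop down to raw JDBC, since the tuple arity here is fixed at compile time.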
e-Evolution
@e-Evolution
Hi everybody

I am testing the new implementation with ZIO, but I have the problem that the implicit val organizationSchemaMeta with which I have mapped the database columns does not work.

Unlike the other implementations I am using context.stream:

context.stream(query[Organization].filter(_.organizationId == lift(organizationId)))
Any idea how to solve, here is my code:

implicit val organizationSchemaMeta: context.SchemaMeta[Organization] =
  schemaMeta[Organization](
    "AD_Org",
    _.tenantId -> "AD_Client_ID",
    _.organizationId -> "AD_Org_ID",
    _.isActive -> "IsActive",
    _.created -> "Created",
    _.createdBy -> "CreatedBy",
    _.updated -> "Updated",
    _.updatedBy -> "UpdatedBy",
    _.value -> "Value",
    _.name -> "Name",
    _.description -> "Description",
    _.replicationStrategyId -> "AD_ReplicationStrategy_ID",
    _.parentOrganizationId -> "Parent_Org_ID",
    _.uuid -> "UUID"
  )
def getById(organizationId: Id): ZStream[OrganizationZioRepository with QConnection, Throwable, Organization] =
  context.stream(query[Organization].filter(_.organizationId == lift(organizationId)))
The error is:
A checked error was not handled.
org.postgresql.util.PSQLException: ERROR: column x14.clientid does not exist
  Position: 8