Li Haoyi
@lihaoyi-databricks
can we use .as[Query[Unnest]] to do that
Alexander Ioffe
@deusaquilus
Hum...
Li Haoyi
@lihaoyi-databricks
like
val strings = quote(infix"UNNEST(${liftQuery(Seq("...", "..."))})".as[Query[String]])
Alexander Ioffe
@deusaquilus
Yeah
that produces the same query:
@ case class Unnest(unnest: String)

@ val strings = quote {
    infix"UNNEST(${lift(Seq("foo","bar"))})".as[Query[Unnest]].nested
  }
@ run {
    query[ResultName]
      .rightJoin(strings)
      .on(_.text == _.unnest)
      .map{case (rnOpt, n) => rnOpt.map(_.id)}
  }
cmd67.sc:1: SELECT x1.id FROM ResultName x1 RIGHT JOIN (SELECT x.unnest FROM (UNNEST(?)) AS x) AS x2 ON x1.text = x2.unnest
I over-complicated it
Yup, that's the best variation so far
oh
wait
I'm not sure if that will work
Li Haoyi
@lihaoyi-databricks
this gives a syntax error
Alexander Ioffe
@deusaquilus
Yeah
in that case you need to manually specify the "SELECT" part
This should work:
@ case class Unnest(unnest: String)

@ val strings = quote {
    infix"SELECT UNNEST(${lift(Seq("foo","bar"))})".as[Query[Unnest]].nested
  }

@ run {
  query[ResultName]
    .rightJoin(strings)
    .on(_.text == _.unnest)
    .map{case (rnOpt, n) => rnOpt.map(_.id)}
}
It yields:
 SELECT x1.id FROM ResultName x1 RIGHT JOIN (SELECT x.unnest FROM (SELECT UNNEST(?)) AS x) AS x2 ON x1.text = x2.unnest
Li Haoyi
@lihaoyi-databricks
yeah that works
what's that .nested thing do?
is it to force a subquery
Alexander Ioffe
@deusaquilus
yup
let me double-check if it's needed
Li Haoyi
@lihaoyi-databricks
seems to work without it
Alexander Ioffe
@deusaquilus
Yup
all good
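A minimal sketch of what .nested changes in this setup, assuming the same ctx, Unnest, and ResultName definitions as above; the two query shapes are the ones already printed in this thread:

val inner = quote {
  infix"SELECT UNNEST(${lift(Seq("foo", "bar"))})".as[Query[Unnest]]
}

// Joined directly, the infix itself becomes the subquery:
//   ... RIGHT JOIN (SELECT UNNEST(?)) AS x2 ON x1.text = x2.unnest
// With .nested, Quill wraps it in an extra subquery first:
//   ... RIGHT JOIN (SELECT x.unnest FROM (SELECT UNNEST(?)) AS x) AS x2 ON x1.text = x2.unnest
val wrapped = quote { inner.nested }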
Li Haoyi
@lihaoyi-databricks
so this is the state of the art
@ ctx.run {
      query[db.ResultName]
        .rightJoin(infix"SELECT UNNEST(${lift(Seq("foo","test-shard-local-database"))})".as[io.getquill.Query[Unnest]])
        .on(_.text == _.unnest)
        .map{case (rnOpt, n) => rnOpt.map(_.id)}
    }
cmd12.sc:1: SELECT x1.id FROM result_name x1 RIGHT JOIN (SELECT UNNEST(?)) AS x2 ON x1.text = x2.unnest
val res12 = ctx.run {
                    ^
res12: List[Option[Long]] = List(None, Some(2674443566L))
Alexander Ioffe
@deusaquilus
lol
Li Haoyi
@lihaoyi-databricks
basically the only thing I was missing is the UNNEST thing to turn the array into a table
Alexander Ioffe
@deusaquilus
yeah
I don't think I could adapt liftQuery to this kind of functionality
The problem is that postgres expects a query coming out of unnest, not a scalar
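To illustrate the scalar-vs-table distinction, a sketch only (it assumes the same ctx and the Postgres array encoders that the lift(Seq(...)) calls above already rely on):

val tags = Seq("foo", "bar")

// Scalar position: the lifted Seq binds as a single array parameter,
// so it can only be compared against, e.g. via ANY(...):
ctx.run(query[ResultName].filter(rn => infix"${rn.text} = ANY(${lift(tags)})".as[Boolean]))

// Table position: to use the array in a FROM/JOIN it has to go through
// UNNEST first, which is what the SELECT UNNEST(...) infix above provides.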
Li Haoyi
@lihaoyi-databricks
a bit of a weird postgres quirk but I guess not the worst one I've hit
Alexander Ioffe
@deusaquilus
yeah, lots of that in postgres
Li Haoyi
@lihaoyi-databricks
doesn't compare to the time where deleting old records made the query planner go haywire and start doing table scans
Alexander Ioffe
@deusaquilus
heh, maybe liftUnnestQuery(list) (see the sketch after this message)
SQL query planners are the bane of my existence
half the time, if they'd just cache a complex sub-view the entire problem would be solved, but there's no directive in SQL to do that. You'd think that's what CTEs do, but it's not
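Coming back to the liftUnnestQuery(list) idea above, a purely hypothetical sketch (this is not an existing Quill API, just the name floated in this thread) of how such a helper could wrap the same infix:

case class Unnest(unnest: String)

// Hypothetical helper: lift the Seq and expose it as a query via UNNEST
def liftUnnestQuery(xs: Seq[String]) = quote {
  infix"SELECT UNNEST(${lift(xs)})".as[Query[Unnest]]
}

// Usage mirroring the join above:
ctx.run {
  query[ResultName]
    .rightJoin(liftUnnestQuery(Seq("foo", "bar")))
    .on(_.text == _.unnest)
    .map { case (rnOpt, _) => rnOpt.map(_.id) }
}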
Li Haoyi
@lihaoyi-databricks
basically SQL is the wrong level of abstraction. It tries to hide the implementation, but whether a query runs in 40 milliseconds or 40 minutes actually matters for a lot of use cases...
Most of the time I would be happier writing query plans directly
Alexander Ioffe
@deusaquilus
Lol, welcome to my life
Li Haoyi
@lihaoyi-databricks
Like I want to specify what index the query will use, and if I want a table scan I'll ask for it, thank you very much
Alexander Ioffe
@deusaquilus
The problem is, if we start doing that we're basically back to writing stored-procs... that's essentially what they do
Li Haoyi
@lihaoyi-databricks
sounds good to me
I think this query plan funkiness is a large reason why "dumber" databases like Mongo took off
Alexander Ioffe
@deusaquilus
Nah, stored-procs are a nightmare to maintain. They're too low level.
Li Haoyi
@lihaoyi-databricks
sure, talking to mongo may involve over-fetching tons of data and lots of round-trips, but at least it's a predictable amount of over-fetching and round-trips
whereas with postgres, things hum along nicely until suddenly your query plan crosses some heuristic and all hell breaks loose
and naturally it only happens in production
hooray
Alexander Ioffe
@deusaquilus
... and that's databases in a nutshell!
that's why databases are a sub-speciality
that's why an entire class of pseudo-engineer was created to manage them