Li Haoyi
@lihaoyi-databricks
@ val strings = quote {
    querySchema[Now]("now()")
      .map(t => infix"UNNEST(${lift(Seq("foo","bar"))})".pure.as[String])
  }
strings: AnyRef with Quoted[io.getquill.EntityQuery[String]]{def quoted: io.getquill.ast.Map;def ast: io.getquill.ast.Map;def id1347240670(): Unit;val liftings: AnyRef{val scala.collection.immutable.Seq.apply[String]("foo", "bar"): io.getquill.quotation.ScalarValueLifting[Seq[String],Seq[String]]}} = `querySchema`("now()").map(t => infix"UNNEST(${?})")

@ run {
    query[db.ResultName]
      .rightJoin(infix"${strings}".pure.as[io.getquill.Query[String]])
      .on(_.text == _)
      .map{case (rnOpt, n) => rnOpt.map(_.id)}
  }
cmd5.sc:1: SELECT x1.id FROM result_name x1 RIGHT JOIN (SELECT UNNEST(?) FROM now() t) AS t ON x1.text = t
val res5 = run {
               ^
org.postgresql.util.PSQLException: ERROR: operator does not exist: text = record
  Hint: No operator matches the given name and argument types. You might need to add explicit type casts.
  Position: 94
  org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2440)
  org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2183)
  org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:308)
  org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
  org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
  org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:143)
  org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:106)
  com.zaxxer.hikari.pool.ProxyPreparedStatement.executeQuery(ProxyPreparedStatement.java:52)
  com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeQuery(HikariProxyPreparedStatement.java)
  io.getquill.context.jdbc.JdbcContextBase.$anonfun$executeQuery$1(JdbcContextBase.scala:41)
  io.getquill.context.jdbc.JdbcContextBase.$anonfun$withConnectionWrapped$2(JdbcContextBase.scala:28)
  io.getquill.context.jdbc.JdbcContext$$anon$1.wrap(JdbcContext.scala:31)
  io.getquill.context.jdbc.JdbcContextBase.$anonfun$withConnectionWrapped$1(JdbcContextBase.scala:28)
  io.getquill.context.jdbc.JdbcContext.$anonfun$withConnection$1(JdbcContext.scala:61)
  scala.Option.getOrElse(Option.scala:201)
  io.getquill.context.jdbc.JdbcContext.withConnection(JdbcContext.scala:59)
  io.getquill.context.jdbc.JdbcContextBase.withConnectionWrapped(JdbcContextBase.scala:28)
  io.getquill.context.jdbc.JdbcContextBase.withConnectionWrapped$(JdbcContextBase.scala:27)
  io.getquill.context.jdbc.JdbcContext.withConnectionWrapped(JdbcContext.scala:15)
  io.getquill.context.jdbc.JdbcContextBase.executeQuery(JdbcContextBase.scala:38)
  io.getquill.context.jdbc.JdbcContextBase.executeQuery$(JdbcContextBase.scala:37)
  io.getquill.context.jdbc.JdbcContext.executeQuery(JdbcContext.scala:40)
  ammonite.$sess.cmd5$.<clinit>(cmd5.sc:1)
seems like this is a lot harder than I thought haha
don't worry too much if it's not possible, we'll find some other way around the problem
Alexander Ioffe
@deusaquilus
interesting
we're getting closer
Naftoli Gugenheim
@nafg
The problem is that t is a record
Alexander Ioffe
@deusaquilus
yup
Naftoli Gugenheim
@nafg
not a single value
What is a Query[String]?
Is there such a thing as a resultset of scalars? Each row has to be a record, no?
Alexander Ioffe
@deusaquilus
there is actually
@lihaoyi-databricks I'm pretty sure I got it this time:
case class Unnest(unnest: String)

val strings = quote { 
  querySchema[Unnest]("now()")
    .map(t => infix"UNNEST(${lift(Seq("foo","bar"))})".pure.as[String]) 
}

run {
  query[ResultName]
    .rightJoin(infix"${strings}".pure.as[Query[Unnest]])
    .on(_.text == _.unnest)
    .map{case (rnOpt, n) => rnOpt.map(_.id)}
}
Alexander Ioffe
@deusaquilus
@lihaoyi-databricks ?
I checked the query that came out of this, it should really work...
This variation should also work and it's a bit less hacky:
case class Unnest(unnest: String)

val strings = quote { 
  querySchema[Unnest]("now()")
    .map(t => Unnest(infix"UNNEST(${lift(Seq("foo","bar"))})".as[String])).nested
}

run {
  query[ResultName]
    .rightJoin(strings)
    .on(_.text == _.unnest)
    .map{case (rnOpt, n) => rnOpt.map(_.id)}
}
Both of these queries produce this:
SELECT x1.id FROM ResultName x1 RIGHT JOIN (SELECT UNNEST(?) AS unnest FROM now() t) AS x2 ON x1.text = x2.unnest
the first variation produces that without the AS unnest part
Li Haoyi
@lihaoyi-databricks
lemme take a look
hah seems to work
unfortunately I have no idea how this works :/ not sure if I dare start using it in production lol
Alexander Ioffe
@deusaquilus
This is an interesting oddity of Postgres: when you unnest an array it returns a record with a single column named 'unnest'
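(A minimal sketch of that behavior, assuming the same Postgres Quill context as in the snippets above; the generated-SQL comment is approximate:)

case class Unnest(unnest: String) // field name matches the column Postgres assigns

val justTheValues = run {
  querySchema[Unnest]("now()")
    .map(t => Unnest(infix"UNNEST(${lift(Seq("foo", "bar"))})".as[String]))
}
// Generated SQL, roughly: SELECT UNNEST(?) AS unnest FROM now() t
// justTheValues: List[Unnest] = List(Unnest("foo"), Unnest("bar"))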
Li Haoyi
@lihaoyi-databricks
is the issue that we cannot join on an array of strings, and instead have to convert it to a table before we can join on it?
Alexander Ioffe
@deusaquilus
what I did was to simulate that by creating an 'Unnest' row type
yup
Li Haoyi
@lihaoyi-databricks
what's the querySchema and now() thing all about
Alexander Ioffe
@deusaquilus
That's to force Quill to treat the unnest(array) as a table so it can be right-joined
now() is just a function in Postgres that returns the current time
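(Side note, a hedged sketch: the dummy table just has to produce exactly one row, so the mapped UNNEST projection is evaluated exactly once, and now() is a convenient zero-setup one-row source. The Now shape here is hypothetical and its fields are never actually selected:)

case class Now(now: java.time.LocalDateTime) // hypothetical shape; never read

val oneRow = quote { querySchema[Now]("now()") }
// Mapping over oneRow emits: SELECT <projection> FROM now() t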
Li Haoyi
@lihaoyi-databricks
can we use .as[Query[Unnest]] to do that
Alexander Ioffe
@deusaquilus
Hum...
Li Haoyi
@lihaoyi-databricks
like
val strings = quote(infix"UNNEST(${liftQuery(Seq("...", "..."))})".as[Query[String]])
Alexander Ioffe
@deusaquilus
Yeah
that produces the same query:
@ case class Unnest(unnest: String)

@ val strings = quote {
    infix"UNNEST(${lift(Seq("foo","bar"))})".as[Query[Unnest]].nested
  }
@ run {
    query[ResultName]
      .rightJoin(strings)
      .on(_.text == _.unnest)
      .map{case (rnOpt, n) => rnOpt.map(_.id)}
  }
cmd67.sc:1: SELECT x1.id FROM ResultName x1 RIGHT JOIN (SELECT x.unnest FROM (UNNEST(?)) AS x) AS x2 ON x1.text = x2.unnest
I over-complicated it
Yup, that's the best variation so far
oh
wait
I'm not sure if that will work
Li Haoyi
@lihaoyi-databricks
this gives a syntax error
Alexander Ioffe
@deusaquilus
Yeah
in that case you need to manually specify the "SELECT" part
This should work:
@ case class Unnest(unnest: String)

@ val strings = quote {
    infix"SELECT UNNEST(${lift(Seq("foo","bar"))})".as[Query[Unnest]].nested
  }

@ run {
  query[ResultName]
    .rightJoin(strings)
    .on(_.text == _.unnest)
    .map{case (rnOpt, n) => rnOpt.map(_.id)}
}
It yields:
SELECT x1.id FROM ResultName x1 RIGHT JOIN (SELECT x.unnest FROM (SELECT UNNEST(?)) AS x) AS x2 ON x1.text = x2.unnest
Li Haoyi
@lihaoyi-databricks
yeah that works
what's that .nested thing do?
is it to force a subquery
Alexander Ioffe
@deusaquilus
yup
let me double-check if it's needed
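(For reference, a minimal sketch of what .nested does, assuming the same context as above; the SQL comments are approximate:)

val plain = quote { query[ResultName] }
// roughly: SELECT x.id, x.text FROM ResultName x
val wrapped = quote { query[ResultName].nested }
// roughly: SELECT x.id, x.text FROM (SELECT x.id, x.text FROM ResultName x) AS x
// Above it was used defensively, to make sure the infix SELECT landed in its own subquery.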
Li Haoyi
@lihaoyi-databricks
seems to work without it
Alexander Ioffe
@deusaquilus
Yup
all good
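For reference, here is the whole working pattern as one self-contained sketch. Hedged: the context construction, the config name, and ResultName's field types are illustrative, inferred from the queries above rather than stated in the chat.

import io.getquill._

case class ResultName(id: Int, text: String) // field types assumed
case class Unnest(unnest: String) // matches the column name Postgres gives UNNEST output

lazy val ctx = new PostgresJdbcContext(Literal, "ctx") // illustrative context and config name
import ctx._

// lift(Seq(...)) binds the whole Seq as a single Postgres array parameter, and
// the explicit SELECT makes the infix a self-contained subquery for the join.
val strings = quote {
  infix"SELECT UNNEST(${lift(Seq("foo", "bar"))})".as[Query[Unnest]]
}

// Right-join the table against the unnested values; .nested proved unnecessary.
def matchedIds: List[Option[Int]] = run {
  query[ResultName]
    .rightJoin(strings)
    .on(_.text == _.unnest)
    .map { case (rnOpt, n) => rnOpt.map(_.id) }
}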