@deusaquilus
Also, it should be possible to create something like quill-generic in Scala 3. It's just a matter of time and labor.
Proto Quill basically has a bus factor of 1 (you being the 1 :)), so the nice-to-haves might not come soon enough for those looking to use Proto Quill in the near term. That said, I noticed yesterday that there's a code generator for Quill that could be used to generate DAO boilerplate without relying on an external library.
Hopefully Scala 3 macros will be able to replicate the full Scala 2 Quill feature set without too many workarounds or concessions.
Impressive work, btw; you've clearly taken the deep dive into Scala 3 macros (I'm just testing the waters here for now).
Any resources the community can access that reveal the inner workings of the Scala 3 compiler vis-à-vis macros would be greatly appreciated. Through much trial and error, tree printing, etc., I managed to convert a complex (for me) Scala 2 macro to its Scala 3 equivalent. I learned a lot in the process, but it was more challenging than it needed to be. If we had thorough documentation and real-world examples to work from, the Scala 2 to Scala 3 macro migration would go much more smoothly, not to mention faster.
At any rate Quill is the gold standard of Scala 2/3 macros.
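For anyone heading down the same migration path, here is a minimal sketch of the tree-printing debugging mentioned above (plain Scala 3, nothing Quill-specific; the names are made up):

import scala.quoted.*

// Print the structure of whatever tree the macro receives, then return it unchanged.
// Useful for learning what the Scala 3 compiler actually hands you.
inline def showTree[T](inline expr: T): T = ${ showTreeImpl('expr) }

def showTreeImpl[T: Type](expr: Expr[T])(using Quotes): Expr[T] =
  import quotes.reflect.*
  report.info(expr.asTerm.show(using Printer.TreeStructure))
  expr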
Also, in regard to the implicit conversion from a case class companion object Foo to query[Foo], I suspect that until lampepfl/dotty#7000 is resolved, getting that functionality working in Scala 3 will be quite the challenge. You're basically flying blind: all you get from Foo.join(x => ...) at the macro level is a measly Product, no companion object Mirror, no case class ClassDef, nada.
extension [T](inline ent: T.type)
  inline def insert(inline f: (T => (Any, Any)), inline f2: (T => (Any, Any))*): Insert[T] = query[T].insert(f, f2: _*)
  inline def update(inline f: (T => (Any, Any)), inline f2: (T => (Any, Any))*): Update[T] = query[T].update(f, f2: _*)
  inline def insert(inline value: T): Insert[T] = query[T].insert(value)
  inline def update(inline value: T): Update[T] = query[T].update(value)
  inline def map[R](f: T => R): Query[R] = query[T].map(f)
  inline def flatMap[R](f: T => Query[R]): Query[R] = query[T].flatMap(f)
  inline def concatMap[R, U](f: T => U)(implicit ev: U => Iterable[R]): Query[R] = query[T].concatMap(f)
  inline def withFilter(f: T => Boolean): Query[T] = query[T].withFilter(f)
  inline def filter(f: T => Boolean): Query[T] = query[T].filter(f)
  // etc... look into everything in Model.scala in regular-Quill
@deusaquilus T.type isn't valid Scala syntax, but it would be convenient if it were :)
Examining the compiler-generated source for Scala 3 case classes in more detail, it's worse than I thought -- all we have to work with in the companion object is java.lang.Object. Still, that's a start: we can at least trigger macro invocation with a static Foo.filter(...) call. From there I think we should be able to add a second type parameter to represent the companion class, T <: Product, check whether it's a case class within the body of the macro (if not, error and abort), and otherwise proceed with constructing the (x: U) => query[T] body for a Conversion[U, T] macro return type.
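On the "check whether it's a case class in the body of the macro" step, here is a minimal Scala 3 sketch of such a check via a product Mirror (the names are hypothetical, and this is plain Scala 3, not Quill API):

import scala.quoted.*
import scala.deriving.Mirror

// Aborts compilation unless T has a product Mirror (i.e. is a case class or similar)
inline def requireCaseClass[T]: Unit = ${ requireCaseClassImpl[T] }

def requireCaseClassImpl[T: Type](using Quotes): Expr[Unit] =
  import quotes.reflect.*
  Expr.summon[Mirror.ProductOf[T]] match
    case Some(_) => '{ () }
    case None    => report.errorAndAbort(s"${Type.show[T]} is not a case class")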
I noticed some confusing (at least to me) behaviour when trying out Quill with Spark. I have this small example (Scala 2.12.13, Spark 3.1.2, Quill 3.11.0):
import io.getquill.QuillSparkContext._
import io.getquill.{MappedEncoding, Quoted}
import org.apache.spark.sql.{Dataset, SQLContext}
import java.time.Instant

object TestQuill {
  // encoder: Instant -> underlying Long column; decoder: Long -> Instant
  implicit val instantEncoder: MappedEncoding[Instant, Long] = MappedEncoding[Instant, Long](_.toEpochMilli)
  implicit val instantDecoder: MappedEncoding[Long, Instant] = MappedEncoding[Long, Instant](Instant.ofEpochMilli)

  implicit class InstantQuotes(left: Instant) {
    def <(right: Instant): Quoted[Boolean] = quote(infix"$left < $right".as[Boolean])
  }

  implicit val sqlCtx: SQLContext = ???
  import sqlCtx.implicits._

  val ds: Dataset[Instant] = ???
  val ts: Instant = ???

  run {
    liftQuery(ds).filter(_ < lift(ts))
  }
}
If I specify the output type for the extension method explicitly, def <(right: Instant): Quoted[Boolean] = ..., it doesn't compile:
The query definition must happen within a `quote` block.
[error] liftQuery(ds).filter(_ < lift(ts))
[error] ^
If I leave the type off, def <(right: Instant) = ..., it compiles fine and generates the expected query. I'm just curious whether someone could explain this difference.
I need to accept a dynamic list of sort fields and orderings as input and translate that into a database query. I've seen a number of questions around how to accomplish a dynamic sortBy block, but I still haven't seen a working example that doesn't use infix to dynamically generate the ORDER BY section of a query. Has anyone managed to accomplish this or have an example?
I've managed something like this, but it generates a nested SQL query for each element in my sortFields list rather than a single query with a single ORDER BY block:
val sortFields: Seq[SortField] = ???
val qry = myTable.filter(…)…..

sortFields.foldLeft(qry) { (q, field) =>
  field match {
    case NameAsc  => q.sortBy(_.name)(Ord.asc)
    case NameDesc => q.sortBy(_.name)(Ord.desc)
    case IdAsc    => q.sortBy(_.id)(Ord.asc)
    case IdDesc   => q.sortBy(_.id)(Ord.desc)
    ...
  }
}
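Not a full answer for the fully dynamic case, but for reference: when the candidate columns are known at compile time, a single sortBy over a tuple with a matching Ord produces one ORDER BY clause. A minimal sketch, assuming a hypothetical MyRow table and a mirror context:

import io.getquill._

case class MyRow(id: Int, name: String) // hypothetical table

val ctx = new SqlMirrorContext(PostgresDialect, SnakeCase)
import ctx._

// Produces a single "ORDER BY name ASC, id DESC" rather than nested queries
val q = quote {
  query[MyRow].sortBy(r => (r.name, r.id))(Ord(Ord.asc, Ord.desc))
}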
Here's the route in the web server, which is based on Akka HTTP:
~ path(IntNumber) { id =>
  concat(
    get(complete(Future(DB.aTodo(id)))),
    delete(complete(Future(DB.deleteTodo(id))))
  )
}
And here's the DB "implementation":
val ctx = new PostgresJdbcContext(SnakeCase, new HikariDataSource(config))
import ctx._

def aTodo(id: Int): Seq[Todo] = ctx.run { query[Todo].filter(_.todoId == lift(id)) }

inline def updateTodo(todo: Todo) = ctx.run {
  query[Todo].filter(_.todoId == lift(todo.todoId)).update(lift(todo))
}
Issue 1
Looking at the docs, the "canonical compile time" solution for aTodo would be inline def. If I do that, I get a message about "could not summon a parser factory" and "cannot find a scala.Int Encoder of Id", so I figured I'd leave that without the inline.
Issue 2
"io.getquill" %% "quill-jdbc" % "3.10.0.Beta1.6",
Hi,
I would like to make use of the "insert on conflict update where is distinct from" clause in Postgres in a generic fashion. To that end, I want to create a macro so that I can write something like this:
case class Domains(id: Int, domain: String)

quote {
  insert[Domains](Domains(1, "a"), _.id)
}
Where the macro should be something like this:
def insert[A](row: A, idColumns: (A => Any)*): Insert[A] = macro InsertMacro.insertImpl[A]

def insertImpl[A: c.WeakTypeTag](c: Context)(row: c.Expr[A], idColumns: c.Expr[A => Any]*): c.universe.Tree = {
  import c.universe._
  val t = weakTypeOf[A]
  val r = q"""
    val q = query[$t].insert($row)
    val tableName = naming.table(${t.typeSymbol.name.toString})
    infix"$$q ON CONFLICT (id) DO UPDATE SET id = excluded.id WHERE (#$$tableName.*) IS DISTINCT FROM (excluded.*)".as[Insert[${t}]]
  """
  r
}
Unfortunately, compilation fails with Tree 'context.naming.table("Domains")' can't be parsed to 'Ast'.
Is there a way to translate the case class name to the table name and use it in the infix part within the quote{...} block?
@andyfr Is naming available at compile time? If so, create the tableName variable in the macro and try something like this:
def insertImpl[A: c.WeakTypeTag](c: Context)(row: c.Expr[A], idColumns: c.Expr[A => Any]*): c.universe.Tree = {
  import c.universe._
  val t = weakTypeOf[A]
  val tableName = naming.table(t.typeSymbol.name.toString)
  val r = q"""
    val q = query[$t].insert($row)
    infix"$$q ON CONFLICT (id) DO UPDATE SET id = excluded.id WHERE (${io.getquill.ast.EntityQuery($tableName)}.*) IS DISTINCT FROM (excluded.*)".as[Insert[${t}]]
  """
  r
}
That way you can splice the tableName into the query as a static string, and the whole thing should be able to produce a compile-time query.
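For the compile-time table-name piece, the naming strategy can be applied directly as plain code, assuming the strategy is statically known; a small sketch (SnakeCase here is an assumption):

import io.getquill.{NamingStrategy, SnakeCase}

// NamingStrategy.table applies the naming convention to a type name,
// e.g. "Domains" -> "domains" under SnakeCase
val naming: NamingStrategy = SnakeCase
val tableName: String = naming.table("Domains")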
Hello there!
I'm totally new to Quill and playing a bit with Proto Quill on top of a toy SQLite database. I can run simple queries fine but struggle with putting joins together; here's the error I'm getting:
Main.scala:43:17: Exception occurred while executing macro expansion.
scala.MatchError: TypedOrTest(Unapply(TypeApply(Select(Ident("Some"), "unapply"), List(Inferred())), Nil, List(Bind("it", Wildcard()))), Inferred()) (of class java.lang.String)
Here are my dependencies:
ivy"org.xerial:sqlite-jdbc:3.36.0.3",
ivy"io.getquill::quill-jdbc:3.17.0.Beta3.0-RC2"
and here is, in substance, the code:
import io.getquill._
import io.getquill.QueryDsl.like

object Main {
  case class Artist(name: String)
  case class BuilderID(id: Int)
  case class Builder(id: BuilderID, name: String)
  case class InstrumentTypeID(id: Int)
  case class InstrumentType(id: InstrumentTypeID, description: String)
  case class InstrumentID(id: Int)
  case class Instrument(
      id: InstrumentID,
      typeId: InstrumentTypeID,
      builder: BuilderID,
      name: String
  )

  val ctx = new SqliteJdbcContext(SnakeCase, "ctx")
  import ctx._

  def main(args: Array[String]): Unit = {
    inline def someInstruments = quote {
      query[Instrument]
        .leftJoin(query[InstrumentType])
        .on((i: Instrument, it: InstrumentType) => i.typeId == it.id)
        .filter { case (i, Some(it)) => it.description.startsWith("Ukulele") }
        .map((i: Instrument, it: Option[InstrumentType]) => i)
    }
    println(run(someInstruments))
  }
}
Interestingly, it seems to be an idiosyncratic thing:
inline def someInstruments = quote {
  query[Instrument]
    .leftJoin(query[InstrumentType])
    .on((i: Instrument, it: InstrumentType) => i.typeId == it.id)
    .filter(_._2.map(_.description.startsWith("Ukulele")).isDefined)
    .map((i: Instrument, _) => i.name)
}
would work, on the other hand. So I guess one has to find a syntax that pleases the compiler and stick to it.
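Another spelling that avoids the Some(...) pattern entirely is to lean on the Option operators Quill supports on the joined side (exists, map, etc.). A sketch against the same case classes, untested here:

inline def ukuleleInstruments = quote {
  query[Instrument]
    .leftJoin(query[InstrumentType])
    .on((i, it) => i.typeId == it.id)
    // Option.exists keeps the left join's null-safety without pattern matching
    .filter(row => row._2.exists(_.description.startsWith("Ukulele")))
    .map(row => row._1)
}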
val q = quote {
  for {
    user <- dynamicQuerySchema[User]("user")
    _ <- dynamicQuerySchema[UserToContract]("userToContrat").join { utc =>
      {
        contractIdOpt match {
          case Some(id) => quote(utc.contractId == lift(id))
          case None     => quote(true)
        }
      } && utc.userId == user.id
    }
  } yield user
}
quote {
  query[User]
    .join(query[UserToContract])
    .on((u: User, utc: UserToContract) => utc.contractId == lift(id) && utc.userId == u.id)
    // .map((u: User, _) => u)
}
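For the optional contractId condition in the dynamic version above, the dynamic query API also has filterOpt, which applies the predicate only when the Option is defined, so the Some/None match may not be needed. A rough, untested sketch along the lines of the first snippet (assumes an Encoder for the id type is in scope):

val q = for {
  user <- dynamicQuerySchema[User]("user")
  _ <- dynamicQuerySchema[UserToContract]("userToContrat")
         // correlate with the outer query
         .filter(utc => utc.userId == user.id)
         // only added to the WHERE clause when contractIdOpt is Some(...)
         .filterOpt(contractIdOpt)((utc, id) => utc.contractId == id)
} yield user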
Hi, I am using the quill-cassandra-zio library.
I have a simple query like this:
private def itemItemsetQuery(itemId: Long, catalog: String) = quote(
  querySchemaItemItemsets.filter(c => c.itemId == lift(itemId)).filter(_.catalog == lift(catalog)).map(_.itemsets)
)

run(itemItemsetQuery(itemId, catalog))
This compiles fine and works fine too, but it gives me the warning below every time it runs:
Re-preparing already prepared query is generally an anti-pattern and will likely affect performance. Consider preparing the statement only once. Query='SELECT itemsets FROM item_itemsets WHERE item_id = ? AND catalog = ?
Does someone know what is wrong with my query? This is how the documentation suggests it should be done, so I have not been able to fix it yet.