Compile-time Language Integrated Query for Scala - Deprecated. Please move to zio-quill: https://discord.gg/MYErPQgkHH
val q = quote {
  for {
    user <- dynamicQuerySchema[User]("user")
    _ <- dynamicQuerySchema[UserToContract]("userToContract").join { utc =>
      {
        contractIdOpt match {
          case Some(id) => quote(utc.contractId == lift(id))
          case None     => quote(true)
        }
      } && utc.userId == user.id
    }
  } yield user
}
quote {
  query[User]
    .join(query[UserToContract])
    .on((u: User, utc: UserToContract) => utc.contractId == lift(id) && utc.userId == u.id)
    // .map { case (u, _) => u }
}
Hi, I am using the quill-cassandra-zio library.
I have a simple query like this:
private def itemItemsetQuery(itemId: Long, catalog: String) = quote(
  querySchemaItemItemsets
    .filter(c => c.itemId == lift(itemId))
    .filter(c => c.catalog == lift(catalog))
    .map(_.itemsets)
)
run(itemItemsetQuery(itemId, catalog))
This compiles and runs fine, but it logs the warning below every time it executes:
Re-preparing already prepared query is generally an anti-pattern and will likely affect performance. Consider preparing the statement only once. Query='SELECT itemsets FROM item_itemsets WHERE item_id = ? AND catalog = ?'
Does anyone know what is wrong with my query? This is how the documentation suggests it should be done, so I have not been able to fix it yet.
update targeting_history set audiences=audiences+123 where userId='abc'
quote(
  querySchema[PcaidTargetingHistory](
    "pcaid_targeting_history",
    _.userId -> "pcaid",
    _.audiences -> "audiences"
  ).filter(c => c.userId == lift(userId)).update(d => d.audiences -> (d.audiences + 1))
)
Tree 'd.audiences.+(1)' can't be parsed to 'Ast'
Hello
I use a tagged type for my UUIDs with Quill, and the MappedEncodings I defined work well until the id field in question is optional.
These are the mappings I use:
implicit def idEncoding[T]: MappedEncoding[Id[T], UUID] = MappedEncoding[Id[T], UUID](identity)
implicit def idDecoding[T]: MappedEncoding[UUID, Id[T]] = MappedEncoding[UUID, Id[T]](Id[T])
When the id field in question is optional, I get an error:
exception during macro expansion:
scala.reflect.macros.TypecheckException: Can't find implicit `Decoder ...
It says it cannot find an implicit decoder for whatever Id tag I'm working with.
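One workaround I have seen for this kind of failure, sketched below and not verified against this exact setup: pin a monomorphic MappedEncoding val per tag, since the macro's implicit search for the Option case can fail to apply the generic def. `Id` and `User` here are hypothetical stand-ins matching the names in the question, and `MappedEncoding` is defined locally with quill's actual shape (`case class MappedEncoding[I, O](f: I => O)`) so the sketch stands alone; in a real project import `io.getquill.MappedEncoding` instead.

```scala
import java.util.UUID

// Local stand-in with quill's real definition, so this sketch is self-contained.
case class MappedEncoding[I, O](f: I => O)

// Hypothetical tagged-id setup matching the question above.
case class Id[T](value: UUID)
trait User

// Generic mappings, as in the question.
implicit def idEncoding[T]: MappedEncoding[Id[T], UUID] = MappedEncoding(_.value)
implicit def idDecoding[T]: MappedEncoding[UUID, Id[T]] = MappedEncoding(Id[T](_))

// Workaround sketch (an assumption, not a verified fix): concrete instances per tag,
// so the macro can resolve the decoder for Option[Id[User]] fields.
implicit val userIdEncoding: MappedEncoding[Id[User], UUID] = idEncoding[User]
implicit val userIdDecoding: MappedEncoding[UUID, Id[User]] = idDecoding[User]
```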
Hi,
I have a simple DAO class as follows:
class AnomalyDao(ctx: PostgresContext) {
  import ctx._
  def deleteTo(temp: LocalDateTime): Unit = run {
    query[Anomaly].filter(anomaly => anomaly.time_inserted.compareTo(lift(temp)) > 0).delete
  }
}
case class Anomaly(
  offset: Long,
  data: Json,
  time_inserted: LocalDateTime
)
When I want to insert a LocalDateTime value or select a SQL timestamp value with Quill, I receive this error:
Tree 'anomaly.time_inserted' can't be parsed to 'Ast'
At first I assumed that maybe the custom/raw LocalDateTime Decoder/Encoder had not been added, so I tried to define them explicitly:
implicit val localDateTimeDecoder: Decoder[LocalDateTime] =
  decoder((index, row) => row.getTimestamp(index, Calendar.getInstance(dateTimeZone)).toLocalDateTime)
implicit val localDateTimeEncoder: Encoder[LocalDateTime] =
  encoder(
    Types.TIMESTAMP,
    (index, value, row) => row.setTimestamp(index, Timestamp.valueOf(value), Calendar.getInstance(dateTimeZone))
  )
But then I got a conflicting members error:
class PostgresContext inherits conflicting members:
[error] value localDateTimeDecoder in trait Decoders of type PostgresContext.this.Decoder[java.time.LocalDateTime] and
[error] value localDateTimeDecoder in trait PostgresDateSupport of type PostgresContext.this.Decoder[java.time.LocalDateTime]
So I realized that I already had the appropriate decoder and encoder at compile time; but none of the following attempts could solve the Tree can't be parsed to 'Ast' problem:
I changed the DAO class to this version:
class AnomalyDao(ctx: PostgresContext) {
  import ctx._
  def deleteTo(to: Long): Unit = run {
    query[Anomaly].filter(anomaly => anomaly.time_inserted.toEpochSecond(ZoneOffset.UTC) > lift(to)).delete
  }
}
case class Anomaly(
  offset: Long,
  data: Json,
  time_inserted: LocalDateTime
)
But the parsing error is still there. Any ideas?
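For what it's worth, both failing versions call a JVM method (`compareTo`, `toEpochSecond`) inside the quotation, which quill's parser cannot translate to SQL. One commonly used escape hatch, sketched here and not verified against this schema, is to push the comparison into raw SQL with quill's infix interpolator:

```scala
import java.time.LocalDateTime

// Quill sketch (depends on a quill context, so shown as comments only):
//
//   def deleteTo(temp: LocalDateTime): Unit = run {
//     query[Anomaly]
//       .filter(a => infix"${a.time_inserted} > ${lift(temp)}".as[Boolean])
//       .delete
//   }
//
// The SQL-side `>` matches the plain-Scala ordering on LocalDateTime:
val cutoff   = LocalDateTime.of(2023, 1, 1, 0, 0)
val inserted = LocalDateTime.of(2023, 6, 1, 0, 0)
assert(inserted.isAfter(cutoff)) // a row like this would be deleted by deleteTo(cutoff)
```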
Hi, I am using Quill with SQL Server, and I am trying to query a table function and order the results.
My code is:
val rawQuery = quote { (id: Int) =>
  infix"""select * from dbo.MyFunction($id)""".as[Query[MyCaseClass]]
}
ctx.run(rawQuery(lift(id)).sortBy(_.dateTime))
I see in the console that the SQL generated for the sortBy is: ORDER BY x6."dateTime" ASC NULLS FIRST
The NULLS FIRST syntax isn't supported in SQL Server, and it's throwing an error. Any way to get around it?
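A possible workaround, sketched below and untested: since the NULLS FIRST clause is appended by quill's sortBy, moving the ordering into the raw SQL keeps quill from adding it, at the cost of losing composable sorting on the quill side. `MyCaseClass` and `dbo.MyFunction` are the names from the question above.

```scala
// Quill sketch (requires a quill context, so shown as comments):
//
//   val rawQuery = quote { (id: Int) =>
//     infix"""select * from dbo.MyFunction($id) order by dateTime""".as[Query[MyCaseClass]]
//   }
//   ctx.run(rawQuery(lift(id)))
//
// Shape of the SQL quill would then pass through unchanged, with no NULLS FIRST:
val passthrough = "select * from dbo.MyFunction(?) order by dateTime"
assert(passthrough.contains("order by"))
assert(!passthrough.contains("NULLS FIRST"))
```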
Where do I put those custom encodings? Let's say I have this query:
inline given MyTable.schema
def myQuery = for myTable <- query[MyTable] yield MyCustomModel(created = myTable.created)
where MyTable looks like:
case class MyTable(id: Int, created: LocalDateTime)
object MyTable {
  val schema = schemaMeta[MyTable](
    "my_custom_name",
    _.id -> "my_id",
    _.created -> "creation_date"
  )
}
and MyCustomModel looks like:
case class MyCustomModel(created: LocalDateTime)
Where should I define the custom encoder, and how do I import it into the proper place?
@mmajka_gitlab Writing a schema is not needed if the name of the fields matches between the case class and the db.
You can put the encodings in the same class or in another one and import them.
I have a QuillEncoders object where I put the mappings; then I import the members with QuillEncoders.*; QuillEncoders.given
Then I write the encoders for my custom types, for example:
import wvlet.airframe.ulid.ULID
import io.lemonlabs.uri.Url
implicit val encodeULID: MappedEncoding[ULID, String] = MappedEncoding[ULID, String](_.toString)
implicit val decodeULID: MappedEncoding[String, ULID] = MappedEncoding[String, ULID](ULID.fromString)
implicit val encodeUrl: MappedEncoding[Url, String] = MappedEncoding[Url, String](_.toString)
implicit val decodeUrl: MappedEncoding[String, Url] = MappedEncoding[String, Url](Url.parse(_))
For dates I use java.util.Date in the case class binding since it is natively supported.
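To make the layout described above concrete, here is a hypothetical sketch of such an object in Scala 3. `MappedEncoding` is defined locally with quill's actual shape (`case class MappedEncoding[I, O](f: I => O)`) so the sketch stands alone, and `UserRef` is an invented stand-in for custom types like ULID or Url; in a real project import `io.getquill.MappedEncoding` and the real types.

```scala
// Local stand-in with quill's real definition, so this sketch is self-contained.
case class MappedEncoding[I, O](f: I => O)

object QuillEncoders:
  // Stand-in for a custom type (imagine ULID, Url, ...).
  case class UserRef(raw: String)

  // One pair of givens per custom type, as in the ULID/Url examples above.
  given encodeUserRef: MappedEncoding[UserRef, String] = MappedEncoding(_.raw)
  given decodeUserRef: MappedEncoding[String, UserRef] = MappedEncoding(UserRef(_))

// At the use site, bring both the members and the givens into scope:
import QuillEncoders.*
import QuillEncoders.given
```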
@mprevel is String the standard type to map from/to? For instance, I want to map a PostgreSQL 'geometry' column to an org.locationtech.jts.geom.Geometry field. The following does not work:
given MappedEncoding[String, Geometry] = MappedEncoding[String, Geometry](s => WKTReader().read(s))
given MappedEncoding[Geometry, String] = MappedEncoding[Geometry, String](g => g.toString)
I get "No Decoder found for Geometry and it is not a class representing a group of columns".
However, the following does work, but I need the Quill.Postgres to be available:
private given Decoder[Geometry] =
  decoder((row) => (index) => {
    // very inefficient? object -> String -> Array[Byte] -> WKT
    WKBReader().read(WKBReader.hexToBytes(row.getObject(index).toString))
  })
@k0ala I don't think it has to be String, but it has to be a type that quill knows, so it can read from and write to the db.
If you provide a decoder (and it is in scope) you don't have to provide a mapping; just use the type for which you have a decoder in the case class binding.
So it probably depends on the selected module and the db you have.
I can also say that depending on the type of the column, a string may be rejected by the database (e.g. jsonb).
For example, I wrote a small codec for circe Json and JDBC a few days ago (it is currently just an experiment and not battle tested).
I can now use circe Json as a field of a case class, and I'll probably be able to use a case class in place of this json field, provided I build an implicit MappedEncoding from the case class to Json (not tried yet).
import java.sql.{PreparedStatement, Types}
import io.circe.Json
import org.postgresql.util.PGobject

implicit val jsonEncoder: JdbcEncoder[Json] =
  encoder[Json](
    Types.OTHER,
    (idx: Int, js: Json, ps: PreparedStatement) => {
      // Wrap the serialized circe Json in a PGobject of type "json"
      val jsonObject: PGobject = new PGobject()
      jsonObject.setType("json")
      jsonObject.setValue(js.toString)
      ps.setObject(idx, jsonObject)
    }
  )

val decodeFn: (Int, ResultRow, Session) => Json =
  (idx: Int, row: ResultRow, session: Session) => {
    val jsonObject: PGobject = row.getObject(idx, classOf[PGobject])
    io.circe.parser.parse(jsonObject.getValue).getOrElse(throw new RuntimeException("db value is not a valid json"))
  }

implicit val jsonDecoder: JdbcDecoder[Json] =
  decoder[Json](decodeFn)
Hello, I have a quill query which would almost work if it didn't insert single quotes around the passed-in idsConcat.
If anyone has experience with this, please help. Much appreciated. Using quill-jdbc-zio 4.6.0.
val idsConcat = someIntIdsList.mkString(",")
val query = quote { (idsConcat: String) =>
  sql"""
    select name from users
    where
      id in ($idsConcat)
  """.as[Query[String]]
}
run(query(lift(idsConcat)))
Here is the quill-printed query, with single quotes around the comma-separated ids:
select name from users
where
id in ('1,2,3,4')
This is a made-up example to simplify.
I have also tried doing it like below...
val query = quote { (rawQuery: String) =>
  sql"#$rawQuery".as[Query[Int]]
}
run(query(rawQuery))
...but I get an error there too:
scalac: Error while emitting Users.scala
value rawQuery
I have found a workaround in PostgreSQL.
val idsConcat = "{" + someIntIdsList.mkString(",") + "}" // {1,2,3,4}
val query = quote { (idsConcat: String) =>
  sql"""
    select name from users
    where
      id in (select unnest($idsConcat::int[]))
  """.as[Query[String]]
}
.. which becomes..
select name from users
where
id in (select unnest('{1,2,3,4}'::int[]))
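A note in case it's useful: quill also supports binding a collection directly with liftQuery, which avoids the string concatenation and the quoting problem entirely. A sketch follows; the `User` case class is an assumption standing in for the users table.

```scala
// Quill sketch (needs a quill context, so shown as comments):
//
//   case class User(id: Int, name: String)
//   val ids = List(1, 2, 3, 4)
//   val q = quote {
//     query[User].filter(u => liftQuery(ids).contains(u.id)).map(_.name)
//   }
//   run(q)
//
// liftQuery expands the collection into individual bind parameters, roughly:
//   SELECT u.name FROM users u WHERE u.id IN (?, ?, ?, ?)
// rather than a single quoted string like in ('1,2,3,4'):
val ids = List(1, 2, 3, 4)
val placeholders = ids.map(_ => "?").mkString(", ")
assert(placeholders == "?, ?, ?, ?")
```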
I'm getting a compile-time stack trace telling me that quill isn't getting along with the Java module system on Java 17:
java.lang.RuntimeException: java.lang.reflect.InaccessibleObjectException: Unable to make field protected transient int java.util.AbstractList.modCount accessible: module java.base does not "opens java.util" to unnamed module @62585f3f
Google told me I might need an option like "--add-opens=java.base/java.util=ALL-UNNAMED" or "--illegal-access=permit", although they didn't work for me. Maybe I'm an incompetent mill user. Anybody know the cure?
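A thought that might help (unverified for this particular error): the InaccessibleObjectException happens at macro expansion, i.e. inside the JVM running the compiler, so the flag has to reach mill's own JVM rather than the application's fork args. mill reads per-project JVM flags from a .mill-jvm-opts file in the repo root, one option per line, so a sketch of that file would be:

```
--add-opens=java.base/java.util=ALL-UNNAMED
```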
WITH cte (id, parent_id, name) AS (
  SELECT id, parent_id, name FROM my_object WHERE parent_id IS NULL AND type = 'Database'
  UNION ALL
  SELECT o.id, o.parent_id, o.name FROM my_object o JOIN cte ON o.parent_id = cte.id
)
SELECT * FROM my_object WHERE id NOT IN (SELECT id FROM cte)