Not sure why we don't have a high-level API for CallableStatement; this looks compilable at least:
val write: Write[A] = ???
val item: A = ???
FCS.raw(write.unsafeSet(_, index, item))
Does Read represent the scenario of multiple result sets returned from multiple queries / a stored procedure?
Or is it just a generalization for cases when you return rows of different types (but how is that possible?)
Rob Norris
Read decodes a column vector into a value.
Get decodes a single element of a column vector into a value.
Nothing in doobie's high-level API can deal with multiple resultsets. You have to use the low-level API for that.
The problem is that callable statements have in and out parameters that need to be registered, can return atomic values or cursors or other things that are lifetime-managed in a vendor-specific way and may require cleanup.
It's just so general there's not a lot you can say about the "common" use case.
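That column-vs-row split can be sketched with hand-rolled toy types (illustrative only, not doobie's actual Get/Read, which carry JDBC metadata, offsets, and null handling):

```scala
// Toy model of the idea: a Get decodes one column value, and a Read
// composes Gets positionally across a row vector.
final case class Get[A](decode: String => A)
final case class Read[A](decode: Vector[String] => A)

object Read {
  // Zip two single-column decoders into a two-column row decoder,
  // the second one reading the next position over.
  def tuple2[A, B](ga: Get[A], gb: Get[B]): Read[(A, B)] =
    Read(row => (ga.decode(row(0)), gb.decode(row(1))))
}

val getInt    = Get[Int](_.toInt)
val getString = Get[String](s => s)

val row: Read[(Int, String)] = Read.tuple2(getInt, getString)
val decoded = row.decode(Vector("42", "Paris"))
```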
hm.. interesting!
Rob Norris
You may be able to write something for your use case though. And you do this by building up combinators using the F* API
About Read - most of the time users don't have to worry about constructing it, right?
One use case for Read customization I can see is when I have a case class where I want to exclude some fields from being selected into / updated from.
And how does it work - Write[T] with auto-incremented columns (say, non-updatable columns)?
Rob Norris
Normally the user never sees Write. You say sql".. $foo ... $bar" and assuming foo: Int and bar: String doobie will forge a Write[Int :: String :: HNil] under the covers and that's what is used to set values for the statement parameters.
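The "set values positionally" part can be sketched with a toy Write (illustrative only; doobie's real Write targets a java.sql.PreparedStatement and is derived generically):

```scala
// Toy Write: knows how to set a value of A into a parameter slot, and
// composes by shifting the second component's index by the first's width.
final case class Write[A](set: (Array[Any], Int, A) => Unit)

val writeInt    = Write[Int]((ps, i, a) => ps(i) = a)
val writeString = Write[String]((ps, i, a) => ps(i) = a)

def tuple2[A, B](wa: Write[A], wb: Write[B]): Write[(A, B)] =
  Write { (ps, i, ab) =>
    wa.set(ps, i, ab._1)
    wb.set(ps, i + 1, ab._2)
  }

// Conceptually what happens under the covers for sql"... $foo ... $bar"
// with foo: Int and bar: String:
val params = new Array[Any](2)
tuple2(writeInt, writeString).set(params, 0, (42, "foo"))
```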
The reason I'm asking - if I'm planning to use Write for calling stored procedures with the low-level API, I wonder how best to expose the parameter types.
it could be Write[()], Write[A], Write[(A, B)], .... Write22[(...)]
or just Write[A] (but then it's harder to skip non-updatable columns..)

@tpolecat I've noticed that for

def getWrite[T: Write](item: T): Write[T] = Write[T]
getWrite((Some(123), None)) // 1 parameter
getWrite[(Option[Int], Option[String])]((Some(123), None)) // 2 parameters

in the first case it will create a Write without the None / null; so when I try to assign values to a PreparedStatement or stored procedure, it won't assign the second parameter.
is it a well-known behavior?

Rob Norris
The way to summon an instance is Write[T].
I’m surprised it works at all for None, which is probably turned into HNil which has width zero and is skipped.
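The surprise is really Scala inference, visible without doobie: the literal tuple types as (Some[Int], None.type), and an instance for None.type can legitimately have width zero. A toy "column width" typeclass (hypothetical, not doobie's encoding) reproduces the 1-vs-2 behavior:

```scala
// Toy width typeclass: None.type gets width 0 (like an HNil-shaped
// instance being skipped), while Option[A] keeps the width of A.
trait Width[A] { def value: Int }
object Width {
  implicit val int: Width[Int]        = new Width[Int]       { val value = 1 }
  implicit val str: Width[String]     = new Width[String]    { val value = 1 }
  implicit val none: Width[None.type] = new Width[None.type] { val value = 0 }
  implicit def some[A](implicit w: Width[A]): Width[Some[A]] =
    new Width[Some[A]] { val value = w.value }
  implicit def opt[A](implicit w: Width[A]): Width[Option[A]] =
    new Width[Option[A]] { val value = w.value }
  implicit def tup[A, B](implicit wa: Width[A], wb: Width[B]): Width[(A, B)] =
    new Width[(A, B)] { val value = wa.value + wb.value }
}

def width[T](t: T)(implicit w: Width[T]): Int = w.value

// T is inferred as (Some[Int], None.type) => width 1
val inferred = width((Some(123), None))
// T is ascribed, so both slots count => width 2
val ascribed = width[(Option[Int], Option[String])]((Some(123), None))
```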
Stacy Curl

Hello, I’m thinking of writing a ‘doobie for AWS’; unless a miracle occurs it will be a noddy project for a long time.

What would be your reaction if it were suspiciously similar to doobie? i.e. if I copy ’n’ pasted ’n’ modified lots of code from doobie? I’m happy to use the same license, attribute doobie, etc.

Rob Norris
You mean a free monad wrapper for the AWS API?
Stacy Curl
Rob Norris
If you’re going to generate code I suggest generating tagless style instead. It’s just a lot easier.
I think there’s a branch out there that does it. Will be named tagless something.
Stacy Curl
I’m not fond of tagless
Rob Norris
Stacy Curl
So is it ok if I copy liberally from doobie? I will attribute doobie, of course.
Would you require anything else ?
IANAL and I don’t know if you can legally require stuff, but I want to be thoughtful.
Rob Norris
Well in any case feel free to copy anything you like. If you want to say that portions are derived with permission from doobie that’s cool but I’m not worried about it.
Stacy Curl
Thanks !
Rob Norris
Sure, good luck!
Because AWS is a much bigger and much deeper API you may find that you’ll need to invent some new machinery.
Stacy Curl
I’m cranking through the S3 api and it is big.
Rob Norris
The basic idea of making the bottom layer map 1:1 with the underlying Java code is sound I think. Then derive safer stuff at a higher level.
It’s kept doobie relatively simple.
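The 1:1 bottom-layer idea can be sketched without any real AWS types (the S3-like operations and FakeClient here are entirely hypothetical):

```scala
// One constructor per underlying client method, nothing smarter:
sealed trait S3Op[A]
final case class PutObject(bucket: String, key: String, body: String) extends S3Op[Unit]
final case class GetObject(bucket: String, key: String) extends S3Op[Option[String]]

// An interpreter folds each constructor into exactly one "client" call;
// here the client is an in-memory map standing in for the SDK. Safer,
// higher-level combinators would then be derived on top of this layer.
final class FakeClient {
  private val store = scala.collection.mutable.Map.empty[(String, String), String]
  def run[A](op: S3Op[A]): A = op match {
    case PutObject(b, k, v) => store((b, k)) = v
    case GetObject(b, k)    => store.get((b, k))
  }
}

val client = new FakeClient
client.run(PutObject("bucket", "key", "hello"))
val got = client.run(GetObject("bucket", "key"))
```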
Hi, is there any way to achieve this?
  def makePerson(age: Long, name: String): Either[Throwable, Person] = ???
  implicit val read: Read[Person] = Read[(Long, String)].map { // I want something like .temap here
    case (age, name) => makePerson(age, name)
  }
Swapnil S.

Hi ,

I have written a function to create a db transactor using doobie-hikari, code below:

def createTransactor(db: JDBC, pool_name: String, pool_size: Int): ZLayer[Blocking, Throwable, TransactorEnv] = ZLayer.fromManaged {
    val config = new HikariConfig()
    for {
      rt         <- Task.runtime.toManaged_
      transactor <- HikariTransactor.fromHikariConfig[Task](config, rt.platform.executor.asEC).toManagedZIO
    } yield transactor
  }

This code works perfectly with Scala 2.12.14 but fails with Scala 3.0.0 with the error below:

[error] -- Error: /Users/sonawanes/Desktop/swapnil/etlflow-scala3/src/main/scala/db/Transactor.scala:29:3 
[error] 29 |  }
[error]    |   ^
[error]    |Exception occurred while executing macro expansion.
[error]    |java.lang.RuntimeException: TYPEREPR, UNSUPPORTED: class dotty.tools.dotc.core.Types$CachedRefinedType - RefinedType(AppliedType(TypeRef(ThisType(TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class doobie)),object util),module class transactor$)),class Transactor),List(TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class zio)),object package),type Task))),A,TypeBounds(TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class zaxxer)),object hikari),HikariDataSource),TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class zaxxer)),object hikari),HikariDataSource)))
[error]    |    at izumi.reflect.dottyreflection.Inspector.inspectTypeRepr(Inspector.scala:96)
[error]    |    at izumi.reflect.dottyreflection.Inspector.buildTypeRef(Inspector.scala:22)
[error]    |    at izumi.reflect.dottyreflection.TypeInspections$.apply(TypeInspections.scala:10)
[error]    |    at izumi.reflect.dottyreflection.Inspect$.inspectAny(Inspect.scala:17)
[error]    |
[error]    | This location contains code that was inlined from Transactor.scala:29
[error]    | This location contains code that was inlined from Tags.scala:141

Any idea what I am doing wrong here?

Rob Norris
Looks like a compiler bug.
Rob Norris
Or a zio macro bug.
Rob Norris
I'm no longer monitoring this channel. Please move to https://sca.la/typeleveldiscord
Lawrence Wagerfield
Given a case class Foo(..) and a Meta[Foo] that maps to a jsonb, how do I get a Meta[Array[Foo]] that maps to a jsonb[]? It doesn't seem to come "for free", leading me to believe there's a utility method somewhere for producing a Meta[Array[A]] from a Meta[A]..

Hi, I'm trying to upgrade my project to doobie version 0.13.4. I started getting this error on every SQL query defined in the project (via the sql interpolator):

java.lang.ClassCastException: doobie.util.fragment$Fragment cannot be cast to doobie.syntax.SqlInterpolator$SingleFragment

does anyone know what could be causing this exception?

Geovanny Junio
Deprecated. Please move to https://sca.la/typeleveldiscord

Hi all, is there a way to try multiple Get decoders in the form:

Get[XYZ] = Get[Int].temap(....) orElse Get[Long].temap(...) orElse Get[String].temap(...)

We have a scenario where we have very similar queries that differ across tables because there is a boolean field encoded in different ways at the DB level.

Circe has something like this:
 or[AA >: A](d: => Decoder[AA]): Decoder[AA]
Unfortunately we can't change the db column types :(
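The fallback shape itself is easy to hand-roll. A self-contained sketch (a toy Dec, not doobie's Get, whose combinator set I won't assume) for a boolean stored as 0/1 in one table and "t"/"f" in another:

```scala
// Toy decoder with a circe-style `or`: try the left decoder, and fall
// back to the right one on failure.
final case class Dec[A](run: String => Either[String, A]) {
  def or(other: => Dec[A]): Dec[A] =
    Dec(s => run(s).left.flatMap(_ => other.run(s)))
  def temap[B](f: A => Either[String, B]): Dec[B] =
    Dec(s => run(s).flatMap(f))
}

val int: Dec[Int]    = Dec(s => s.toIntOption.toRight(s"not an int: $s"))
val str: Dec[String] = Dec(s => Right(s))

val bool: Dec[Boolean] =
  int.temap(i => Right(i != 0)).or(str.temap {
    case "t"   => Right(true)
    case "f"   => Right(false)
    case other => Left(s"not a boolean: $other")
  })

val a = bool.run("1") // decoded via the Int branch
val b = bool.run("f") // Int branch fails, String branch succeeds
```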
Brandon Brown
Question: with cats-effect 2 I have some code like currentTime <- Async[ConnectionIO].liftIO(clock.realTime(MILLISECONDS)). I'm not exactly sure how I would convert that to cats-effect 3. My initial naive approach was WeakAsyncConnectionIO.pure(clock.realTime.map(_.toMillis)), but that gives me an IO[ConnectionIO[Long]] and not ConnectionIO[Long]. I then thought of clock.realTime.map(_.toMillis).to[ConnectionIO], but that requires an implicit LiftIO[ConnectionIO].
Any pointers would be greatly appreciated

Or if possible try abstracting over the column type in your case class.

case class QueryResult[A](
  id: String,
  value: A
)

and then you can do fr"...".query[QueryResult[Int]], fr"...".query[QueryResult[Long]], etc. depending on the table

import doobie.implicits._
import doobie.util.{Put, Read}
import org.scalatest.freespec.AnyFreeSpec

case class TempId2(id: Long, targetType: String)

trait TempId {
  val id: Long
  val targetType: String
}

object TempId {
  def apply(_id: Long, _targetType: String): TempId = new TempId {
    override val id: Long = _id
    override val targetType: String = _targetType
  }

  implicit val put: Put[TempId] = ???

  implicit val read: Read[TempId] = ???
}

object TempDao {
  case class Row(
    temp1: TempId,
//    temp1o: Option[TempId],
    temp2: Option[TempId2],
    temp2o: Option[TempId2],
  )

  def read[R: Read](): Unit = ???
}

Uncommenting the temp1o line results in the following:

Cannot find or construct a Read instance for type:


Along with the standard error: "Refer to Chapter 12 of the book of doobie for more information."

It appears the issue is related to having an Option of a trait... there is a Read instance for the trait, and the trait is fine if it's not inside an Option.