Rob Norris
@tpolecat
The problem is that callable statements have in and out parameters that need to be registered, can return atomic values or cursors or other things that are lifetime-managed in a vendor-specific way and may require cleanup.
It's just so general there's not a lot you can say about the "common" use case.
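For reference, the general shape with raw JDBC looks something like this (a sketch only; the procedure name get_name and its parameters are made up for illustration, and cursor-valued OUT parameters would need extra vendor-specific handling):

```scala
import java.sql.{CallableStatement, Connection, Types}

// Rough sketch of calling a stored procedure with raw JDBC (not a doobie API).
// "get_name" and its parameter layout are hypothetical.
def callGetName(conn: Connection, id: Long): String = {
  val cs: CallableStatement = conn.prepareCall("{ call get_name(?, ?) }")
  try {
    cs.setLong(1, id)                          // IN parameter
    cs.registerOutParameter(2, Types.VARCHAR)  // OUT parameter must be registered up front
    cs.execute()
    cs.getString(2)                            // read the OUT value back
  } finally cs.close()                         // statement resources need cleanup
}
```

Each OUT parameter has to be registered with its java.sql.Types code before execution, and the statement has to be closed afterwards, which is the lifetime-management point being made above.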
eugeniyk
@eugeniyk
hm.. interesting!
Rob Norris
@tpolecat
You may be able to write something for your use case though. And you do this by building up combinators using the F* API
eugeniyk
@eugeniyk
About Read - most of the time users don't have to worry about constructing it, right?
One of the use cases for Read customization I can see is when I have a case class where I want some of the fields to be excluded from being selected into / updated from
and how does it work - Write[T] with autoincremented columns (say, non-updatable columns)?
Rob Norris
@tpolecat
Normally the user never sees Write. You say sql"... $foo ... $bar" and, assuming foo: Int and bar: String, doobie will forge a Write[Int :: String :: HNil] under the covers, and that's what is used to set values for the statement parameters.
eugeniyk
@eugeniyk
The reason I'm asking - if I'm planning to use Write for calling stored procedures with the low-level api, I wonder how best to expose the parameter types
it could be Write[()], Write[A], Write[(A, B)], .... Write22[(...)]
or just Write[A] (but then it's harder to skip non-updatable columns..)
eugeniyk
@eugeniyk

@tpolecat I've noticed that for

def getWrite[T: Write](item: T)
...
getWrite((Some(123), None)) // 1 parameter
getWrite[(Option[Int], Option[String])]((Some(123), None)) // 2 parameters

in the first case it will create a Write without the None / null; so when I'm trying to assign values to a PreparedStatement or stored procedure, it won't assign the second parameter
is it a well-known behavior?

Rob Norris
@tpolecat
The way to summon an instance is Write[T].
I’m surprised it works at all for None, which is probably turned into HNil which has width zero and is skipped.
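That width-zero behaviour can be illustrated with a toy typeclass in plain Scala (no doobie; Width is a made-up stand-in for Write that only counts how many parameter slots a type contributes):

```scala
// Toy stand-in for Write (hypothetical, for illustration only): it records
// how many statement parameters a type would contribute.
trait Width[A] { def slots: Int }
object Width {
  def apply[A](implicit w: Width[A]): Width[A] = w
  private def of[A](n: Int): Width[A] = new Width[A] { def slots = n }
  implicit val int: Width[Int]    = of(1)
  implicit val str: Width[String] = of(1)
  // Option[A] is one nullable slot per underlying column...
  implicit def opt[A](implicit w: Width[A]): Width[Option[A]] = of(w.slots)
  implicit def some[A](implicit w: Width[A]): Width[Some[A]]  = of(w.slots)
  // ...but a bare None is inferred as None.type, which has width zero.
  implicit val none: Width[None.type] = of(0)
  implicit def tup[A, B](implicit wa: Width[A], wb: Width[B]): Width[(A, B)] =
    of(wa.slots + wb.slots)
}

def width[T: Width](t: T): Int = Width[T].slots

// width((Some(123), None))                                 -> 1 parameter
// width[(Option[Int], Option[String])]((Some(123), None))  -> 2 parameters
```

Without the type annotation the tuple is inferred as (Some[Int], None.type), and the None.type column vanishes, which matches the one-parameter behaviour observed above.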
Stacy Curl
@stacycurl

Hello, I’m thinking of writing a ‘doobie for aws’, unless a miracle occurs it will be a noddy project for a long time.

What would be your reaction if it were suspiciously similar to doobie ? i.e. I copy ’n’ pasted ’n’ modified lots of code from doobie ? I’m happy to use the same license, attribute doobie, etc.

Rob Norris
@tpolecat
You mean a free monad wrapper for the AWS API?
Stacy Curl
@stacycurl
Yes
Rob Norris
@tpolecat
If you’re going to generate code I suggest generating tagless style instead. It’s just a lot easier.
I think there’s a branch out there that does it. Will be named tagless something.
Stacy Curl
@stacycurl
I’m not fond of tagless
Rob Norris
@tpolecat
Ok.
Stacy Curl
@stacycurl
So is it ok if I copy liberally from doobie ? I will attribute doobie, of course.
Would you require anything else ?
IANAL and I don’t know if you can legally require stuff, but I want to be thoughtful.
Rob Norris
@tpolecat
Well in any case feel free to copy anything you like. If you want to say that portions are derived with permission from doobie that’s cool but I’m not worried about it.
Stacy Curl
@stacycurl
Thanks !
Rob Norris
@tpolecat
Sure, good luck!
Because AWS is a much bigger and much deeper API you may find that you’ll need to invent some new machinery.
Stacy Curl
@stacycurl
I’m cranking through the S3 api and it is big.
Rob Norris
@tpolecat
The basic idea of making the bottom layer map 1:1 with the underlying Java code is sound I think. Then derive safer stuff at a higher level.
It’s kept doobie relatively simple.
truongio
@truongio
Hi, is there any way to achieve this?
  def makePerson(age: Long, name: String): Either[Throwable, Person] = ???
  implicit val read: Read[Person] = Read[(Long, String)].map { // I want something like .temap here
    case (age, name) => makePerson(age, name)
  }
Swapnil S.
@Iamswapnil619

Hi,

I have written a function to create a db transactor using doobie-hikari (code below):

def createTransactor(db: JDBC, pool_name: String , pool_size: Int): ZLayer[Blocking, Throwable, TransactorEnv] = ZLayer.fromManaged {
    val config = new HikariConfig()
    config.setDriverClassName(db.driver)
    config.setJdbcUrl(db.url)
    config.setUsername(db.user)
    config.setPassword(db.password)
    config.setMaximumPoolSize(pool_size)
    config.setPoolName(pool_name)
    for {
      rt <- Task.runtime.toManaged_
      transactor <- HikariTransactor.fromHikariConfig[Task](config, rt.platform.executor.asEC).toManagedZIO
    } yield transactor
  }

This code works perfectly with Scala 2.12.14 but fails with Scala 3.0.0 with the error below:

[error] -- Error: /Users/sonawanes/Desktop/swapnil/etlflow-scala3/src/main/scala/db/Transactor.scala:29:3 
[error] 29 |  }
[error]    |   ^
[error]    |Exception occurred while executing macro expansion.
[error]    |java.lang.RuntimeException: TYPEREPR, UNSUPPORTED: class dotty.tools.dotc.core.Types$CachedRefinedType - RefinedType(AppliedType(TypeRef(ThisType(TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class doobie)),object util),module class transactor$)),class Transactor),List(TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class zio)),object package),type Task))),A,TypeBounds(TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class zaxxer)),object hikari),HikariDataSource),TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class zaxxer)),object hikari),HikariDataSource)))
[error]    |    at izumi.reflect.dottyreflection.Inspector.inspectTypeRepr(Inspector.scala:96)
[error]    |    at izumi.reflect.dottyreflection.Inspector.buildTypeRef(Inspector.scala:22)
[error]    |    at izumi.reflect.dottyreflection.TypeInspections$.apply(TypeInspections.scala:10)
[error]    |    at izumi.reflect.dottyreflection.Inspect$.inspectAny(Inspect.scala:17)
[error]    |
[error]    | This location contains code that was inlined from Transactor.scala:29
[error]    | This location contains code that was inlined from Tags.scala:141

Any idea what I am doing wrong here?

Rob Norris
@tpolecat
Looks like a compiler bug.
Rob Norris
@tpolecat
Or a zio macro bug.
Rob Norris
@tpolecat
I'm no longer monitoring this channel. Please move to https://sca.la/typeleveldiscord
discobaba
@uncleweirdo_twitter
Rob Norris
@tpolecat
I'm no longer monitoring this channel. Please switch to Typelevel Discord at https://sca.la/typeleveldiscord
Lawrence Wagerfield
@ljwagerfield
Given a case class Foo(..) and a Meta[Foo] that maps to a jsonb, how do I get a Meta[Array[Foo]] that maps to a jsonb[]? It doesn't seem to come "for free", leading me to believe there's a utility method somewhere for producing a Meta[Array[A]] from a Meta[A]..
dan-ilin
@dan-ilin

Hi, I'm trying to upgrade my project to use doobie version 0.13.4. I started getting this error on every sql query that I have defined in the project (via the sql interpolator):

java.lang.ClassCastException: doobie.util.fragment$Fragment cannot be cast to doobie.syntax.SqlInterpolator$SingleFragment

does anyone know what could be causing this exception?

Geovanny Junio
@geovannyjs
Deprecated. Please move to https://sca.la/typeleveldiscord
sanjivsahayamrea
@sanjivsahayamrea

Hi all, is there a way to try multiple Get decoders in the form:

Get[XYZ] = Get[Int].temap(....) orElse Get[Long].temap(...) orElse Get[String].temap(...)

We have a scenario where we have very similar queries that differ across tables because there is a boolean field encoded in different ways at the DB level.

Circe has something like this:
 or[AA >: A](d: => Decoder[AA]): Decoder[AA]
sanjivsahayamrea
@sanjivsahayamrea
Unfortunately we can't change the db column types :(
Brandon Brown
@brbrown25
Question: with cats effect 2 I have some code like currentTime <- Async[ConnectionIO].liftIO(clock.realTime(MILLISECONDS)). I'm not exactly sure how I would convert that to cats effect 3. My initial naive approach was WeakAsyncConnectionIO.pure(clock.realTime.map(_.toMillis)), but that gives me an IO[ConnectionIO[Long]] and not a ConnectionIO[Long]. I then thought of clock.realTime.map(_.toMillis).to[ConnectionIO], but that requires an implicit LiftIO[ConnectionIO].
Any pointers would be greatly appreciated
jatcwang
@jatcwang:matrix.org
[m]

Or if possible try abstracting over the column type in your case class.

case class QueryResult[A](
  id: String,
  value: A
)

and then you can do fr"...".query[QueryResult[Int]], fr"...".query[QueryResult[Long]], etc., depending on the table

PsyfireX
@PsyfireX
import doobie.implicits._
import doobie.util.{Put, Read}
import org.scalatest.freespec.AnyFreeSpec

case class TempId2(id: Long, targetType: String)

trait TempId {
  val id: Long
  val targetType: String
}

object TempId{
  def apply(_id: Long, _targetType: String): TempId = new TempId {
    override val id: Long = _id
    override val targetType: String = _targetType
  }

  implicit val put: Put[TempId] = ???

  implicit val read: Read[TempId] = ???
}

object TempDao {
  case class Row(
    temp1: TempId,
//    temp1o: Option[TempId],
    temp2: Option[TempId2],
    temp2o: Option[TempId2],
  )

  read[Row]()

  def read[R: Read](): Unit = ???
}

Uncommenting the temp1o line results in the following:

Cannot find or construct a Read instance for type:

  ai.tagr.tagrdaos.daos.content.TempDao.Row

Along with the standard error text: "Refer to Chapter 12 of the book of doobie for more information."

It appears the issue is related to having an Option of a trait ... there is a Read instance for the trait, and the trait is fine if it's not inside an Option.

Any ideas?
PsyfireX
@PsyfireX
Oh, I think it might have something to do with Read vs Get
Maxim Ivanov
@redbaron
Hi All. I am trying to retry transactions on certain SQLSTATE codes. For queries returning ConnectionIO I managed to do that with exceptSomeSqlState, but it doesn't compile for queries returning a Stream. What would you recommend to get it working?
import doobie._
import doobie.implicits._
import cats.implicits._
import doobie.postgres.sqlstate.class40


// this is how I retry queries returning ConnectionIO
def insertManyWithRetry[P: Write](
    q: doobie.Update[P],
    elems: List[P]
): ConnectionIO[Int] = {
  q.updateMany(elems)
    .exceptSomeSqlState({
      case class40.TRANSACTION_ROLLBACK |
          class40.TRANSACTION_INTEGRITY_CONSTRAINT_VIOLATION |
          class40.SERIALIZATION_FAILURE |
          class40.STATEMENT_COMPLETION_UNKNOWN | class40.DEADLOCK_DETECTED => {
        HC.rollback *> insertManyWithRetry(q, elems)
      }
    })
}

// trying to do the same for Stream[ConnectionIO, _]
def insertManyReturningWithRetry[K: Read, P: Write](q: Update[P], elems: List[P], columns: Seq[String]): fs2.Stream[ConnectionIO, K] = {
   q.updateManyWithGeneratedKeys(columns: _*)(elems).exceptSomeSqlState({
     case class40.TRANSACTION_ROLLBACK |
          class40.TRANSACTION_INTEGRITY_CONSTRAINT_VIOLATION |
          class40.SERIALIZATION_FAILURE |
          class40.STATEMENT_COMPLETION_UNKNOWN |
          class40.DEADLOCK_DETECTED => {
       // Doesn't work, because ConnectionIO[Unit] doesn't compose with Stream[ConnectionIO, K]
       HC.rollback *> insertManyReturningWithRetry(q, elems, columns)
     }
   })
 }
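One building block for this (a sketch, not tested against a database; retryStates and isRetryable are made-up helpers) is to classify errors by their raw SQLSTATE and then use fs2's handleErrorWith on the stream, since exceptSomeSqlState is only defined for ConnectionIO:

```scala
import java.sql.SQLException

// PostgreSQL class-40 (transaction rollback) SQLSTATE codes, matching the
// constants used above from doobie.postgres.sqlstate.class40.
val retryStates: Set[String] =
  Set(
    "40000", // TRANSACTION_ROLLBACK
    "40001", // SERIALIZATION_FAILURE
    "40002", // TRANSACTION_INTEGRITY_CONSTRAINT_VIOLATION
    "40003", // STATEMENT_COMPLETION_UNKNOWN
    "40P01"  // DEADLOCK_DETECTED
  )

def isRetryable(e: Throwable): Boolean = e match {
  case sql: SQLException => retryStates.contains(sql.getSQLState)
  case _                 => false
}

// Sketch of the stream wiring (doobie types, shown as a comment):
//   def withRetry[K](s: fs2.Stream[ConnectionIO, K]): fs2.Stream[ConnectionIO, K] =
//     s.handleErrorWith {
//       case e if isRetryable(e) => fs2.Stream.exec(HC.rollback) ++ withRetry(s)
//       case e                   => fs2.Stream.raiseError[ConnectionIO](e)
//     }
```

Note that if the stream has already emitted rows before the failure, a naive retry will emit them again, so it may be safer to retry the whole transaction from the caller instead.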
jatcwang
@jatcwang:matrix.org
[m]
I'm unsure what the JDBC behaviour around streams is if an error happens, though, so you'll need to test that it actually works too
Meriam Lachkar
@mlachkar

Hello, I have a question about Read instances and how they compose.

  final case class Full(dependency: NewDependency, release: Option[NewRelease])
  //  I have a custom reader for NewRelease 
  // But I notice that Read[Full] uses a generic/derived reader for NewRelease

So the code compiles fine, but reading fails, since it's not using my custom reader.
Right now, to avoid this, I have re-written a Read for Full, which copies the NewRelease reader with some differences to deal with the option.

      implicit val fullReader: Read[Full] = Read[(NewDependency, Option[String], ..........)].map(...)

Am I doing something wrong? How can I fix this, please? Thanks