eugeniyk
@eugeniyk
Hello guys
Please suggest how to use Put[A] instances with the low-level API (if I understand the low-level API correctly, we are using HC.prepareCall, setObject and other free monads), especially with CallableStatementIO.
The idea is to make the method below more type safe; as I understand it, Put encapsulates the ability to unsafely set an object on a PreparedStatement. We're currently on version 0.9.2.
  // Sets each parameter positionally via the raw setObject, losing type safety
  // (everything goes through Any / AnyRef).
  private def setParams(params: Vector[Any]): CallableStatementIO[Vector[Unit]] = {
    params.zipWithIndex.traverse[CallableStatementIO, Unit] {
      case (p, i) =>
        setObject(i + 1, p.asInstanceOf[AnyRef])
    }
  }
eugeniyk
@eugeniyk
Not sure why we don't have a high-level API for CallableStatement; this at least looks compilable:
val write: Write[A] = ???
val item: A = ???
val index: Int = ???
FCS.raw(write.unsafeSet(_, index, item))
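A possible generalization of that snippet, sketched here under the assumption that Write#unsafeSet and FCS.raw behave as in doobie 0.9.x (the helper name setParamsTyped and its shape are illustrative, not doobie API):
  import doobie._

  // Hypothetical helper: sets all IN parameters of a callable statement from
  // a single value via its Write instance, starting at the given JDBC index.
  def setParamsTyped[A](startIndex: Int, params: A)(implicit write: Write[A]): FCS.CallableStatementIO[Unit] =
    FCS.raw(cs => write.unsafeSet(cs, startIndex, params))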
eugeniyk
@eugeniyk
https://tpolecat.github.io/doobie/docs/12-Custom-Mappings.html#column-vector-mappings
Does Read represent the scenario of multiple result sets returned from multiple queries / a stored procedure?
Or is it just a generalization for cases where you return rows of different types (but how would that even be possible?)
Rob Norris
@tpolecat
Read decodes a column vector into a value.
Get decodes a single element of a column vector into a value.
Nothing in doobie's high-level API can deal with multiple resultsets. You have to use the low-level API for that.
The problem is that callable statements have IN and OUT parameters that need to be registered, and they can return atomic values, cursors, or other things that are lifetime-managed in a vendor-specific way and may require cleanup.
It's just so general there's not a lot you can say about the "common" use case.
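To make that concrete, here is a hedged sketch of dropping to the low-level API for two result sets from one statement (assuming FPS.execute/getResultSet/getMoreResults/embed and HRS.list work as in doobie 0.9.x; the function name and SQL are placeholders):
  import doobie._

  // Placeholder: run one statement that yields two result sets and decode
  // each with its own Read instance.
  def twoResultSets[R1: Read, R2: Read](sql: String): ConnectionIO[(List[R1], List[R2])] =
    HC.prepareStatement(sql) {
      for {
        _   <- FPS.execute                   // execute the statement
        rs1 <- FPS.getResultSet              // first result set
        as  <- FPS.embed(rs1, HRS.list[R1])  // decode rows via Read[R1]
        _   <- FPS.getMoreResults            // advance to the next result set
        rs2 <- FPS.getResultSet
        bs  <- FPS.embed(rs2, HRS.list[R2])
      } yield (as, bs)
    }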
eugeniyk
@eugeniyk
hm.. interesting!
Rob Norris
@tpolecat
You may be able to write something for your use case though. And you do this by building up combinators using the F* API.
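For example, a combinator for a procedure with IN parameters and a single VARCHAR OUT parameter might be sketched like this (the call shape and names are assumptions, not an existing doobie helper):
  import java.sql.Types
  import doobie._

  // Hypothetical combinator: set IN parameters via a Write, register one
  // OUT parameter, execute, and read the OUT value back.
  def callReturningString[A: Write](sql: String, in: A, outIndex: Int): ConnectionIO[String] =
    HC.prepareCall(sql) {
      for {
        _ <- FCS.raw(cs => Write[A].unsafeSet(cs, 1, in))
        _ <- FCS.registerOutParameter(outIndex, Types.VARCHAR)
        _ <- FCS.execute
        s <- FCS.getString(outIndex)
      } yield s
    }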
eugeniyk
@eugeniyk
About Read: most of the time users don't have to worry about constructing it, right?
One use case for Read customization I can see is when I have a case class where I want some fields to be ignored when selecting into / updating from it.
And how does Write[T] work with auto-incremented columns (say, non-updatable columns)?
Rob Norris
@tpolecat
Normally the user never sees Write. You say sql"... $foo ... $bar", and assuming foo: Int and bar: String, doobie will forge a Write[Int :: String :: HNil] under the covers; that's what is used to set values for the statement parameters.
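A small illustration of that (the table and column names are made up):
  import doobie._
  import doobie.implicits._

  val foo: Int = 42
  val bar: String = "hello"

  // doobie derives the Write for the interpolated values behind the scenes;
  // you never name or summon it yourself.
  val insert: Update0 = sql"insert into example (n, s) values ($foo, $bar)".update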
eugeniyk
@eugeniyk
The reason I'm asking: if I'm planning to use Write for calling stored procedures with the low-level API, I wonder what the best way is to expose the parameter types.
It could be Write[Unit], Write[A], Write[(A, B)], ... up to 22-ary tuples,
or just Write[A] (but then it's harder to skip non-updatable columns...).
eugeniyk
@eugeniyk

@tpolecat I've noticed that for

def getWrite[T: Write](item: T)
...
getWrite((Some(123), None)) // 1 parameter
getWrite[(Option[Int], Option[String])]((Some(123), None)) // 2 parameters

In the first case it will create a Write without the None / null, so when I try to assign values to a PreparedStatement or stored procedure, it won't set the second parameter.
Is this well-known behavior?

Rob Norris
@tpolecat
The way to summon an instance is Write[T].
I’m surprised it works at all for None, which is probably turned into HNil which has width zero and is skipped.
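So, per the example a few messages up, the second call site is the intended one: summon the instance explicitly and ascribe the Option types so both columns are represented (illustrative only):
  import doobie._

  def getWrite[T: Write](item: T): Write[T] = Write[T]

  // Ascribing the full tuple type means both Option columns are in the instance.
  val w = getWrite[(Option[Int], Option[String])]((Some(123), None))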
Stacy Curl
@stacycurl

Hello, I’m thinking of writing a ‘doobie for AWS’; unless a miracle occurs it will be a noddy project for a long time.

What would be your reaction if it were suspiciously similar to doobie? I.e., if I copied ’n’ pasted ’n’ modified lots of code from doobie? I’m happy to use the same license, attribute doobie, etc.

Rob Norris
@tpolecat
You mean a free monad wrapper for the AWS API?
Stacy Curl
@stacycurl
Yes
Rob Norris
@tpolecat
If you’re going to generate code I suggest generating tagless style instead. It’s just a lot easier.
I think there’s a branch out there that does it. Will be named tagless something.
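For reference, "tagless style" here just means generating an interface parameterized on the effect type, roughly like this (hypothetical S3 operations, not any existing library's API):
  trait S3Ops[F[_]] {
    def listBuckets: F[List[String]]
    def getObject(bucket: String, key: String): F[Array[Byte]]
  }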
Stacy Curl
@stacycurl
I’m not fond of tagless
Rob Norris
@tpolecat
Ok.
Stacy Curl
@stacycurl
So is it ok if I copy liberally from doobie? I will attribute doobie, of course.
Would you require anything else?
IANAL and I don’t know if you can legally require stuff, but I want to be thoughtful.
Rob Norris
@tpolecat
Well in any case feel free to copy anything you like. If you want to say that portions are derived with permission from doobie that’s cool but I’m not worried about it.
Stacy Curl
@stacycurl
Thanks!
Rob Norris
@tpolecat
Sure, good luck!
Because AWS is a much bigger and much deeper API you may find that you’ll need to invent some new machinery.
Stacy Curl
@stacycurl
I’m cranking through the S3 API and it is big.
Rob Norris
@tpolecat
The basic idea of making the bottom layer map 1:1 with the underlying Java code is sound I think. Then derive safer stuff at a higher level.
It’s kept doobie relatively simple.
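A rough sketch of that layering, with made-up operation names (not doobie's or any AWS library's actual code): the bottom layer is a free algebra with one constructor per underlying Java method, and safer combinators are derived on top.
  import cats.free.Free

  object S3Sketch {
    // Low level: one constructor per underlying Java client method.
    sealed trait S3Op[A]
    case object ListBuckets extends S3Op[List[String]]
    final case class GetObject(bucket: String, key: String) extends S3Op[Array[Byte]]

    type S3IO[A] = Free[S3Op, A]

    // Higher level: a combinator derived from the raw operations.
    def firstBucket: S3IO[Option[String]] =
      Free.liftF[S3Op, List[String]](ListBuckets).map(_.headOption)
  }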
truongio
@truongio
Hi, is there any way to achieve this?
  def makePerson(age: Long, name: String): Either[Throwable, Person] = ???
  implicit val read: Read[Person] = Read[(Long, String)].map { // I want something like .temap here
    case (age, name) => makePerson(age, name)
  }
Swapnil S.
@Iamswapnil619

Hi,

I have written a function to create a DB transactor using doobie-hikari (code below):

def createTransactor(db: JDBC, pool_name: String, pool_size: Int): ZLayer[Blocking, Throwable, TransactorEnv] =
  ZLayer.fromManaged {
    val config = new HikariConfig()
    config.setDriverClassName(db.driver)
    config.setJdbcUrl(db.url)
    config.setUsername(db.user)
    config.setPassword(db.password)
    config.setMaximumPoolSize(pool_size)
    config.setPoolName(pool_name)
    for {
      rt         <- Task.runtime.toManaged_
      transactor <- HikariTransactor.fromHikariConfig[Task](config, rt.platform.executor.asEC).toManagedZIO
    } yield transactor
  }

This code works perfectly with Scala 2.12.14 but fails with Scala 3.0.0 with the error below:

[error] -- Error: /Users/sonawanes/Desktop/swapnil/etlflow-scala3/src/main/scala/db/Transactor.scala:29:3 
[error] 29 |  }
[error]    |   ^
[error]    |Exception occurred while executing macro expansion.
[error]    |java.lang.RuntimeException: TYPEREPR, UNSUPPORTED: class dotty.tools.dotc.core.Types$CachedRefinedType - RefinedType(AppliedType(TypeRef(ThisType(TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class doobie)),object util),module class transactor$)),class Transactor),List(TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class zio)),object package),type Task))),A,TypeBounds(TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class zaxxer)),object hikari),HikariDataSource),TypeRef(TermRef(ThisType(TypeRef(NoPrefix,module class zaxxer)),object hikari),HikariDataSource)))
[error]    |    at izumi.reflect.dottyreflection.Inspector.inspectTypeRepr(Inspector.scala:96)
[error]    |    at izumi.reflect.dottyreflection.Inspector.buildTypeRef(Inspector.scala:22)
[error]    |    at izumi.reflect.dottyreflection.TypeInspections$.apply(TypeInspections.scala:10)
[error]    |    at izumi.reflect.dottyreflection.Inspect$.inspectAny(Inspect.scala:17)
[error]    |
[error]    | This location contains code that was inlined from Transactor.scala:29
[error]    | This location contains code that was inlined from Tags.scala:141

Any idea what I am doing wrong here?

Rob Norris
@tpolecat
Looks like a compiler bug.
Rob Norris
@tpolecat
Or a zio macro bug.
Rob Norris
@tpolecat
I'm no longer monitoring this channel. Please move to https://sca.la/typeleveldiscord
discobaba
@uncleweirdo_twitter
Rob Norris
@tpolecat
I'm no longer monitoring this channel. Please switch to Typelevel Discord at https://sca.la/typeleveldiscord
Lawrence Wagerfield
@ljwagerfield
Given a case class Foo(..) and a Meta[Foo] that maps to a jsonb, how do I get a Meta[Array[Foo]] that maps to a jsonb[]? It doesn't seem to come "for free", leading me to believe there's a utility method somewhere for producing a Meta[Array[A]] from a Meta[A]..
dan-ilin
@dan-ilin

Hi, I'm trying to upgrade my project to use doobie version 0.13.4. I started getting this error on every SQL query I have defined in the project (via the sql interpolator):

java.lang.ClassCastException: doobie.util.fragment$Fragment cannot be cast to doobie.syntax.SqlInterpolator$SingleFragment

Does anyone know what could be causing this exception?

Geovanny Junio
@geovannyjs
Deprecated. Please move to https://sca.la/typeleveldiscord
sanjivsahayamrea
@sanjivsahayamrea

Hi all, is there a way to try multiple Get decoders in the form:

Get[XYZ] = Get[Int].temap(....) orElse Get[Long].temap(...) orElse Get[String].temap(...)

We have a scenario where we have very similar queries that differ across tables because there is a boolean field encoded in different ways at the DB level.

Circe has something like this:
 or[AA >: A](d: => Decoder[AA]): Decoder[AA]
sanjivsahayamrea
@sanjivsahayamrea
Unfortunately we can't change the db column types :(
Brandon Brown
@brbrown25
Question: with cats effect 2 I have some code like currentTime <- Async[ConnectionIO].liftIO(clock.realTime(MILLISECONDS)). I'm not exactly sure how I would convert that to cats effect 3. My initial naive approach was WeakAsyncConnectionIO.pure(clock.realTime.map(_.toMillis)), but that gives me an IO[ConnectionIO[Long]] and not a ConnectionIO[Long]. I then thought of clock.realTime.map(_.toMillis).to[ConnectionIO], but that requires an implicit LiftIO[ConnectionIO].
Any pointers would be greatly appreciated.
jatcwang
@jatcwang:matrix.org

Or if possible try abstracting over the column type in your case class.

case class QueryResult[A](
  id: String,
  value: A
)

and then you can do fr"...".query[QueryResult[Int]], fr"...".query[QueryResult[Long]], etc., depending on the table.
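Fleshed out slightly (the SQL and table names are placeholders):
  import doobie._
  import doobie.implicits._

  case class QueryResult[A](id: String, value: A)

  // Same query shape, different value type per table.
  def intFlagRows: ConnectionIO[List[QueryResult[Int]]] =
    sql"select id, value from table_with_int_flag".query[QueryResult[Int]].to[List]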