Alfonso Nishikawa
@alfonsonishikawa

Is something like this correct? Would it be memory-safe?

// for simplicity, insertValIntoOtherTable: Int => ConnectionIO[Unit]

def insertAll(intStream: fs2.Stream[ConnectionIO, Int]): IO[Unit] = {
    intStream.compile.fold(connection.unit)(
      (accConnectionIO, intVal) => {
        accConnectionIO *> insertValIntoOtherTable(intVal)
      }
    ).flatten.transact(transactor)
}

That last flatten... :/
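For what it's worth, a common way to avoid both the fold and the trailing flatten is evalMap plus compile.drain. A sketch, passing in insertValIntoOtherTable and a transactor: Transactor[IO] explicitly so it stands alone (names as in the snippet above):

```scala
import cats.effect.IO
import doobie._
import doobie.implicits._

// Sketch only: insertValIntoOtherTable and transactor are the same
// values assumed in the snippet above, taken here as parameters.
def insertAllViaEvalMap(
    intStream: fs2.Stream[ConnectionIO, Int]
)(insertValIntoOtherTable: Int => ConnectionIO[Unit],
  transactor: Transactor[IO]): IO[Unit] =
  intStream
    .evalMap(insertValIntoOtherTable) // one insert per element, in order
    .compile
    .drain                            // ConnectionIO[Unit], no fold needed
    .transact(transactor)             // the whole stream runs in one transaction
```

Semantically this is the same single-transaction run, just without accumulating one big ConnectionIO by hand.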

Ian Agius
@ianagius
Hi,
I'm trying to write an INSERT ON DUPLICATE KEY UPDATE query in doobie. Usually when doing a plain INSERT I use withUniqueGeneratedKeys[Int]("id") to get the auto-generated ID of the inserted row. I tried doing the same for the INSERT ON DUPLICATE KEY UPDATE query, but when the query updates instead of inserting it returns the following error: Expected ResultSet exhaustion, but more rows were available. I think this occurs because on UPDATE MySQL reports 2 affected rows to indicate that there was an update instead of an insert. Does this error happen because withUniqueGeneratedKeys expects exactly 1 row? I 'fixed' it by using update.run instead, but I would still like to get the original id in the update case. Do you have any recommendations on how to work around it, please?
Rob Norris
@tpolecat
MySQL may not support updates returning keys.
I don't know, sorry. If you can figure out how you would do it with JDBC we can figure out how it would look in doobie.
Ian Agius
@ianagius
I will research a bit tomorrow and if I find something I will share. Thanks a lot for your time and prompt response.
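In case it helps future readers: MySQL has a well-known workaround for exactly this, assigning id = LAST_INSERT_ID(id) in the update branch so that the generated-keys result carries the existing row's id on the UPDATE path too. An untested sketch (the users table is hypothetical):

```scala
import doobie._
import doobie.implicits._

// Hypothetical table: users(id AUTO_INCREMENT PRIMARY KEY, email UNIQUE, name).
// The id = LAST_INSERT_ID(id) assignment makes MySQL report the existing
// row's id through getGeneratedKeys when the statement takes the UPDATE
// path, so withUniqueGeneratedKeys sees exactly one key in both cases.
def upsert(email: String, name: String): ConnectionIO[Int] =
  sql"""INSERT INTO users (email, name) VALUES ($email, $name)
        ON DUPLICATE KEY UPDATE
          id   = LAST_INSERT_ID(id),
          name = VALUES(name)"""
    .update
    .withUniqueGeneratedKeys[Int]("id")
```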
Anton Solovyev
@Rosteelton
Hi! Can I get Meta automatically for opaque type in scala 3? (without writing Meta[Int].timap(identity)(identity))
daenyth
@daenyth:matrix.org
If it doesn't happen out of the box, it seems reasonable that the feature should be added - open a github ticket?
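Until something like that exists, the usual workaround is a one-line given in the opaque type's companion. A sketch, where UserId is a hypothetical example:

```scala
import doobie.Meta

opaque type UserId = Int

object UserId:
  def apply(i: Int): UserId = i
  extension (id: UserId) def value: Int = id
  // Inside the companion the opaque type is known to equal Int,
  // so the Int instance can be reused directly, with no timap at all:
  given Meta[UserId] = Meta[Int]
```

Outside the companion you would still need Meta[Int].timap(UserId.apply)(_.value), which is presumably what an automatic derivation feature would generate.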
Sebastian Voss
@sebastianvoss

Hi! I'm using the notify/listen functionality of Postgres. The notify is executed in a DB trigger. It seems after some time my program stops receiving notifications but the stream does not stop with an error:

def notificationStream(
      channelName: String,
      pollingInterval: FiniteDuration
  ): Stream[F, PGNotification] = {
    val inner: Pipe[ConnectionIO, FiniteDuration, PGNotification] = ticks =>
      for {
        _  <- Stream.resource(channel(channelName))
        _  <- ticks
        ns <- Stream.eval(PHC.pgGetNotifications <* HC.commit)
        _  <- Stream.eval(logger.info(s"Received ${ns.size} notifications"))
        n  <- Stream.emits(ns)
      } yield n
    awakeEvery[F](pollingInterval).through(inner.transact(xa))
  }

Does somebody have a hint what could be wrong?

Rob Norris
@tpolecat
I have wondered about the reliability of this mechanism for a while. I don’t have an answer but would be interested if you find anything out elsewhere. JDBC users may have run into this.
Sebastian Voss
@sebastianvoss
thx @tpolecat, I will try to reproduce also with psql. Will report back.
Sebastian Voss
@sebastianvoss
@tpolecat , I found impossibl/pgjdbc-ng#517 which sounds similar. They are blaming it on GC. Could this be the right direction?
Rob Norris
@tpolecat
On my phone, I’ll have a look later. Thanks for digging!
Sebastian Voss
@sebastianvoss
A few more observations: In psql I can see a similar situation. After some time I get The connection to the server was lost. Attempting reset: Succeeded. and need to execute listen <channel> again to continue to receive notifications. As a brute force attempt I added Stream.eval(PHC.pgListen(channelName)) to the inner pipe. This seems to work. But I will continue to look for a smarter way.
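For reference, the brute-force version described above looks roughly like this: the earlier inner pipe with the extra pgListen added (channel, channelName, logger and the surrounding types are as in the original snippet):

```scala
// Sketch: same pipe as before, but LISTEN is re-issued on every tick,
// so a connection that was silently reset starts delivering again.
val inner: Pipe[ConnectionIO, FiniteDuration, PGNotification] = ticks =>
  for {
    _  <- Stream.resource(channel(channelName))
    _  <- ticks
    _  <- Stream.eval(PHC.pgListen(channelName)) // defensive re-LISTEN
    ns <- Stream.eval(PHC.pgGetNotifications <* HC.commit)
    _  <- Stream.eval(logger.info(s"Received ${ns.size} notifications"))
    n  <- Stream.emits(ns)
  } yield n
```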
Walter Chang
@weihsiu
Trying to use a Scala 3 opaque type with Put and Get. The code compiles, but querying Country never terminates.
opaque type Code = String
object Code:
  def apply(code: String): Code = code
extension (code: Code) def value: String = code
given Get[Code] = Get[String].tmap[Code](x => Code(x))
given Put[Code] = Put[String].tcontramap[Code](x => x.value)

case class Country(code: Code, name: String, pop: Int, gnp: Option[Double])
Rob Norris
@tpolecat
Hm seems like that should work.
bentucker
@bentucker

after updating a project to cats-effect version 3.1.1 & doobie 1.0.0-M5 trying to lift IO to ConnectionIO with this helper

val liftToConnIO: FunctionK[IO, doobie.ConnectionIO] = LiftIO.liftK[doobie.ConnectionIO]

generates an error could not find implicit value for parameter F: cats.effect.LiftIO[doobie.ConnectionIO]

I'm having this same problem too ...

I noticed the signature of LiftIO.liftK changed from liftK[F[_]: LiftIO]: IO ~> F to liftK[F[_]](implicit F: LiftIO[F]): IO ~> F
Davis Zanot
@dzanot
Working on this issue too :point_up: I think the cause is that Async no longer extends LiftIO, so AsyncConnectionIO no longer satisfies the requirements of LiftIO.liftK... Which leads us to the question: is there a way to get a FunctionK[IO, ConnectionIO] with the new CE3 hierarchy? (P.S. I think I and others came to this solution via https://stackoverflow.com/questions/59657203/doobie-lifting-arbitrary-effect-into-connectionio)
The ultimate goal of ours is to use our Logger[F] within the ConnectionIO context
Davis Zanot
@dzanot
Hmm, I may have found a solution via WeakAsync (for posterity, this was answered in Discord: https://discord.com/channels/632277896739946517/632727524434247691/851530913341898853)
Though I don't fully understand WeakAsync 🤔
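For anyone landing here later, the pattern in the doobie 1.0 milestones is roughly the following (a sketch from memory; check the WeakAsync scaladoc for the exact signature in your version):

```scala
import cats.~>
import cats.effect.{IO, Resource}
import doobie._
import doobie.util.WeakAsync

// liftK yields a natural transformation IO ~> ConnectionIO, scoped as
// a Resource because the lifting machinery needs a finite lifetime.
val lifted: Resource[IO, IO ~> ConnectionIO] =
  WeakAsync.liftK[IO, ConnectionIO]

// Usage sketch, with a hypothetical logger: Logger[IO] and insert program:
// lifted.use { fk =>
//   (fk(logger.info("about to insert")) *> insertProgram).transact(xa)
// }
```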
danveer4686
@danveer4686
Hi Team, Below code is throwing compile error:
import doobie.hikari.HikariTransactor
import doobie.util.ExecutionContexts
import cats.effect.{ Resource, Blocker}
import zio.interop.catz._
import zio.Task

object SampleDoobie {

  def dbResource: Resource[Task, HikariTransactor[Task]] = {
    for {
      connectEC <- ExecutionContexts.fixedThreadPool[Task](20)
      xa        <- HikariTransactor.newHikariTransactor[Task](
        "org.postgresql.Driver", // driver classname
        "url", // connect URL
        "db_usr", // username
        "db_pass", // password
        connectEC,                              // await connection here
        Blocker.liftExecutionContext(connectEC) // transactEC // execute JDBC operations here
      )
    } yield xa
  }
}
Below is my sbt:
scalaVersion := "2.12.10"

val doobieVersion = "1.0.0-M3"
val calibanVersion= "0.10.1"
val zhttpVersion  = "1.0.0.0-RC17"
val zioInteropVersion = "3.1.1.0"

val libs = List(
  "org.tpolecat" %% "doobie-core"     % doobieVersion ,
  "org.tpolecat" %% "doobie-postgres" % doobieVersion ,
  "org.tpolecat" %% "doobie-h2"       % doobieVersion ,
  "org.tpolecat" %% "doobie-hikari"   % doobieVersion ,

  "com.github.ghostdogpr" %% "caliban" % calibanVersion,
  "com.github.ghostdogpr" %% "caliban-zio-http" %  calibanVersion,
  "io.d11" %% "zhttp" % zhttpVersion,
  "dev.zio" %% "zio-interop-cats" % zioInteropVersion
)

libraryDependencies ++= libs
Rob Norris
@tpolecat
What is the error?
danveer4686
@danveer4686
@tpolecat
error1 in creation of connectEC: No implicits found for parameter sf: Sync[Task]
error2 in import cats.effect.Blocker
Rob Norris
@tpolecat
It sounds like something may be pulling in both CE2 and CE3.
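That diagnosis fits the code above: doobie 1.0.0-M3 is built against cats-effect 3, where Blocker no longer exists and newHikariTransactor no longer takes one. A sketch of the CE3-style call (signature from memory, so double-check against the M3 scaladoc, and make sure zio-interop-cats is a CE3-compatible release):

```scala
import cats.effect.Resource
import doobie.hikari.HikariTransactor
import doobie.util.ExecutionContexts
import zio.Task
import zio.interop.catz._

object SampleDoobie {
  // CE3-era doobie: no Blocker parameter; blocking JDBC work is
  // scheduled internally, and only the connect EC is passed in.
  def dbResource: Resource[Task, HikariTransactor[Task]] =
    for {
      connectEC <- ExecutionContexts.fixedThreadPool[Task](20)
      xa <- HikariTransactor.newHikariTransactor[Task](
        "org.postgresql.Driver", // driver classname
        "url",                   // connect URL
        "db_usr",                // username
        "db_pass",               // password
        connectEC                // await connection here
      )
    } yield xa
}
```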
Witold Soczek
@v-tec2706
@danveer4686 did you manage to resolve that?
1 reply
eugeniyk
@eugeniyk
Hello guys
Please suggest how to use Put[A] instances with the low-level API (if I understand correctly what the low-level API is: we are using HC.prepareCall, setObject and the other free monads), specifically with CallableStatementIO.
The idea is to make the method below more type-safe. As I understand it, Put encapsulates the ability to unsafely set an object on a prepared statement. We are currently on version 0.9.2.
  private def setParams(params: Vector[Any]): CallableStatementIO[Vector[Unit]] = {
    params.zipWithIndex.traverse[CallableStatementIO, Unit] {
      case (p, i) =>
        setObject(i + 1, p.asInstanceOf[AnyRef])
    }
  }
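One way to bring Put into that method is to carry the evidence alongside each value instead of using Vector[Any]. A sketch (Param is a hypothetical helper, and unsafeSetNonNullable is the Put member as of 0.9.x, worth double-checking):

```scala
import cats.syntax.all._
import doobie._

// Hypothetical wrapper: a value packaged with its Put instance, so the
// vector stays type-safe at the point where each Param is constructed.
final case class Param[A](value: A)(implicit put: Put[A]) {
  def set(ps: java.sql.PreparedStatement, index: Int): Unit =
    put.unsafeSetNonNullable(ps, index, value)
}

def setParams(params: Vector[Param[_]]): CallableStatementIO[Unit] =
  params.zipWithIndex.traverse_ { case (p, i) =>
    FCS.raw(cs => p.set(cs, i + 1)) // CallableStatement <: PreparedStatement
  }
```

Callers then write setParams(Vector(Param(42), Param("x"))), and anything without a Put instance fails to compile instead of blowing up at runtime.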
eugeniyk
@eugeniyk
Not sure why we don't have a high-level API for CallableStatement; this at least looks compilable:
write: Write[A] = ???
item: A = ???
FCS.raw(write.unsafeSet(_, index, item))
eugeniyk
@eugeniyk
https://tpolecat.github.io/doobie/docs/12-Custom-Mappings.html#column-vector-mappings
Does Read represent the scenario of multiple result sets returned from multiple queries / a stored procedure?
Or is it just a generalization for cases where you get back rows of different types (but how would that be possible?)
Rob Norris
@tpolecat
Read decodes a column vector into a value.
Get decodes a single element of a column vector into a value.
Nothing in doobie's high-level API can deal with multiple resultsets. You have to use the low-level API for that.
The problem is that callable statements have in and out parameters that need to be registered, can return atomic values or cursors or other things that are lifetime-managed in a vendor-specific way and may require cleanup.
It's just so general there's not a lot you can say about the "common" use case.
eugeniyk
@eugeniyk
hm.. interesting!
Rob Norris
@tpolecat
You may be able to write something for your use case though. And you do this by building up combinators using the F* API
eugeniyk
@eugeniyk
About Read: most of the time users don't have to worry about constructing it, right?
One use case for Read customization I can see is a case class where I want some fields to be excluded from being selected into / updated from.
And how does that work for Write[T] with auto-incremented columns (say, non-updatable columns)?
Rob Norris
@tpolecat
Normally the user never sees Write. You say sql".. $foo ... $bar" and assuming foo: Int and bar: String doobie will forge a Write[Int :: String :: HNil] under the covers and that's what is used to set values for the statement parameters.
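To make that concrete: the instance the interpolator forges can also be summoned and used explicitly, e.g. through Update for batch inserts (table t is hypothetical):

```scala
import doobie._
import doobie.implicits._

// The same Write the interpolator would derive for ($foo, $bar):
val w: Write[(Int, String)] = Write[(Int, String)]

// Used explicitly via Update, e.g. for updateMany batches:
val ins: Update[(Int, String)] =
  Update[(Int, String)]("INSERT INTO t (a, b) VALUES (?, ?)")

// ins.updateMany(rows) sets each (Int, String) with the derived Write.
```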
eugeniyk
@eugeniyk
Reason I'm asking - if I'm planning to use Write for calling store procedures with low level api, wonder how is better to expose the parameter types
it could be Write[()], Write[A], Write[(A, B)], .... Write22[(...)]
or just Write[A] (but then it's harder to skip non-updatable columns..)
eugeniyk
@eugeniyk

@tpolecat I've noticed that for

def getWrite[T: Write](item: T)
...
getWrite((Some(123), None)) // 1 parameter
getWrite[(Option[Int], Option[String])]((Some(123), None)) // 2 parameters

in the first case it creates a Write without the None / null, so when I try to assign values to a prepared statement or stored procedure, it won't set the second parameter
is this well-known behavior?

Rob Norris
@tpolecat
The way to summon an instance is Write[T].
I’m surprised it works at all for None, which is probably turned into HNil which has width zero and is skipped.
Stacy Curl
@stacycurl

Hello, I’m thinking of writing a ‘doobie for AWS’; unless a miracle occurs it will be a noddy project for a long time.

What would be your reaction if it were suspiciously similar to doobie ? i.e. I copy ’n’ pasted ’n’ modified lots of code from doobie ? I’m happy to use the same license, attribute doobie, etc.

Rob Norris
@tpolecat
You mean a free monad wrapper for the AWS API?
Stacy Curl
@stacycurl
Yes
Rob Norris
@tpolecat
If you’re going to generate code I suggest generating tagless style instead. It’s just a lot easier.
I think there’s a branch out there that does it. Will be named tagless something.