    Swoorup Joshi
    it appears to work, not sure if this can be simplified
    Rob Norris
    If you want to execute each chunk in its own transaction then you can do s.groupWithin(1000, 5.seconds).evalMap { chunk => ... } where ... is chunk => F[Whatever].
    Since this looks like a long-running thing that's how I would do it.
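    Rob's suggestion, sketched out in context. This is a hedged illustration, not the asker's actual code: `insertChunk`, the element type `Int`, and the shape of `pool` are assumed names for the sake of a self-contained example; the fs2/skunk calls (`groupWithin`, `evalMap`, `Resource#use`, `Session#transaction`) are the real API.

```scala
import scala.concurrent.duration._
import cats.effect._
import fs2.{Chunk, Pipe}
import skunk._

// Sketch only: `insertChunk` stands in for whatever F[Unit] performs
// the batched insert for one chunk.
def sink[F[_]: Temporal](
    pool: Resource[F, Session[F]],
    insertChunk: (Session[F], Chunk[Int]) => F[Unit]
): Pipe[F, Int, Unit] =
  _.groupWithin(1000, 5.seconds)   // emit when 1000 elements or 5s elapse
    .evalMap { chunk =>
      pool.use { s =>              // fresh session per chunk
        s.transaction.use { _ =>   // each chunk in its own transaction
          insertChunk(s, chunk)
        }
      }
    }
```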
    Swoorup Joshi
    [info] 🔥
    [info] 🔥 Problem: Statement has more than 32767 parameters.
    [info] 🔥 Hint: Postgres can't handle this many parameters. Execute multiple
    [info] 🔥 statements instead.
    [info] 🔥
    why can’t it handle more parameters?
    Swoorup Joshi
    it's an array of elements, each consisting of 6 parameters
    about 1000 elements
    Rob Norris
    It expands to a statement with more than 32k parameters. That’s the most PG can handle.
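    The 32767 limit is the maximum value of the 16-bit parameter count in the Postgres wire protocol, so the safe batch size follows from simple arithmetic (plain Scala, numbers from the conversation above):

```scala
// Postgres encodes the bind-parameter count as a 16-bit integer,
// so a single statement can carry at most 32767 parameters.
val maxParams      = 32767
val fieldsPerRow   = 6                    // parameters per element, per above
val maxRowsPerStmt = maxParams / fieldsPerRow

println(maxRowsPerStmt)                   // 5461 rows fit in one statement
println(1000 * fieldsPerRow)              // 1000 elements would be 6000 params
```

So 1000 elements of 6 fields each (6000 parameters) should actually fit; the error suggests the chunking isn't applying where expected, which is what the thread goes on to discover.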
    Swoorup Joshi
    i see
    Swoorup Joshi
    so the library is pretty barebones :smile:
    Swoorup Joshi
    still getting the exception
    val sink: Pipe[F, MarketData, Unit] = s =>
      s.groupWithin(PostgresConstant.MARKET_DATA_INSERT_CHUNK_SIZE, 5.seconds)
        .covary[F]
        .evalMap { chunk =>
          pool.use {
            case ex =>
              println(s"Exception: ${ex}")
    Rob Norris
    How big is the chunk?
    And how many fields does each element in the chunk have? This product can’t be more than 32k
    Swoorup Joshi
    max of 256
    yeah it does seem weird
    val marketDataRecord: SkunkCodec[RawMarketDataRecord] =
        *: varchar(32)
        *: timestamptz
        *: varchar(64)
        *: bool
        *: bool
        *: bool.opt
        *: numeric(30,6)
        *: numeric(18,6)
    256 * 9 = 2304 max
    never mind, ugh
    the groupWithin should be after the flatten thing
    Rob Norris
    Print out the length of the chunk just to be sure.
    Pardon me for being a total beginner. After going through the example docs, I'm able to make my service work with skunk (service-based tagless final approach, passing the session pool resource). However, I'm not able to get queries to run in sequence, e.g. getIDQuery and then use its result in doAnotherQuery. How do I run queries in sequence inside the context F? My output type is F[Option[Item]] and I can't get it to compile. I'm using a Query type.
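    Sequencing one query's result into another is ordinary monadic composition in F, i.e. flatMap. A hedged sketch of the shape the question describes: `findId`, `findItem`, and `Item` are hypothetical stand-ins for the asker's prepared queries, not real API beyond cats-effect basics.

```scala
import cats.effect._
import skunk._

case class Item(id: Long, name: String)

// Hypothetical signatures matching the question's getIDQuery / doAnotherQuery.
def findId(s: Session[IO], name: String): IO[Option[Long]] = ???
def findItem(s: Session[IO], id: Long): IO[Option[Item]] = ???

// The second query depends on the first, so chain them with flatMap:
def lookup(s: Session[IO], name: String): IO[Option[Item]] =
  findId(s, name).flatMap {
    case Some(id) => findItem(s, id)
    case None     => IO.pure(None)
  }
```

cats' `OptionT` can express the same chain more compactly (`OptionT(findId(...)).flatMap(id => OptionT(findItem(...))).value`).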
    Fabio Epifani

    hello! I'm apparently having an issue decoding a field. in particular I have a query like this:
    SELECT 2::float8 + 3::numeric
    I'd expect the result to be numeric, but I get a

    [info] 🔥  The actual and asserted output columns are
    [info] 🔥  
    [info] 🔥    ?column?  float8  ->  numeric  ── type mismatch

    I take it this is how postgres works, and nothing to do with Skunk specifically?

    Fabio Epifani

    hi, me again. I've got this:

    val codec = varchar(16)
    sql"SELECT $codec".query(codec)

    if I run it I get a ?column? varchar -> varchar(16) type mismatch. is this expected?

    Swapnil S.
    Hi, is there any module for testing skunk queries, like doobie's doobie.scalatest.IOChecker?
    Swapnil S.

    Hi,

    I have the case class below:

    case class StepRun(job_run_id: String, step_name: String, properties: Json)

    where properties is a json type. How do I write a codec for this?

    I have written the codec below, but it is failing on the jsonb type. Error: Not found: jsonb

    def stepRunCodec: Codec[StepRun] = (varchar ~ varchar ~ jsonb).gimap[StepRun]

    Any help?

    Fabio Epifani
    if you're using circe you can import "org.tpolecat" %% "skunk-circe" % "0.2.0"
    then you have jsonb in skunk.circe.codec.all
    I think you can decode it directly into the final type rather than a generic json
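    Putting Fabio's pointers together, roughly. A sketch assuming the skunk-circe module of that era (0.2.x) is on the classpath; the `jsonb` codec really does live in skunk.circe.codec.all, and `gimap` is the shapeless-based mapping onto a case class.

```scala
import io.circe.Json
import skunk._
import skunk.codec.all._
import skunk.circe.codec.all.jsonb   // Codec[Json] backed by circe

case class StepRun(jobRunId: String, stepName: String, properties: Json)

// Compose the three column codecs and map them onto the case class.
val stepRunCodec: Codec[StepRun] =
  (varchar ~ varchar ~ jsonb).gimap[StepRun]
```

As Fabio notes, there is also a parameterized form (`jsonb[A]` given circe Encoder/Decoder instances) that decodes straight into a domain type instead of a raw Json.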
    Swapnil S.
    @epifab Got the issue.. Thanks for the help
    Swapnil S.

    Hi, is there a way to run dynamic queries in skunk? All the examples I have seen define the queries and their codecs at compile time, but is there any provision in skunk to execute dynamic queries directly, without knowing the codecs?

    Like in doobie we can do this :

     def executeQuery(query: String): IO[DBException, Unit] = {

    Any help?

    Rob Norris
    I'm no longer monitoring this channel. Please switch to Typelevel Discord at https://sca.la/typeleveldiscord
    Michal Lacko
    Hi, I don't quite get how to deal with postgres arrays. I see Arr, but I don't see how to get a List out of an Arr?
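    If memory serves, skunk's Arr (which models possibly multi-dimensional Postgres arrays) can be flattened into an ordinary collection via `flattenTo`. A hedged sketch, worth double-checking against the skunk version in use:

```scala
import skunk.data.Arr

val arr: Arr[Int] = Arr(1, 2, 3)

// Arr may be multi-dimensional; flattenTo walks the elements
// in row-major order into the collection you ask for.
val asList: List[Int] = arr.flattenTo(List)
```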

    Hi everyone, I have a session pool, poolResource : Resource[IO, Resource[IO, Session[IO]]]. I use Session.pooled(...).use to get the actual pool, pool: Resource[IO, Session[IO]] and all my postgres services use this pool. However, I get an error that says the following:

    🔥  EofException
    🔥    Problem: EOF was reached on the network socket.
    🔥     Detail: Attempt to read 5 byte(s) failed after 0 bytes(s) were read, because
    🔥             the connection had closed.
    🔥       Hint: Discard this session and retry with a new one.

    When I use pool.use wouldn't that make a new session for me?

    How can I resolve this issue? Does it involve a certain config?

    Alexandr Oshlakov
    Hi everyone. Does anyone know how to work with a serial id in skunk, when you need to insert an object without an id and get it back with the id?
    Alexandr Oshlakov
    CREATE TABLE users(
      id        SERIAL NOT NULL,
      email     VARCHAR(255) NOT NULL,
      age       SMALLINT DEFAULT NULL,
    case class UserInsert(email: String, age: Option[Int])
    case class UserEntity(id: Long, email: String, age: Option[Int], isBanned: Boolean)
    trait UserRepository[F[_]] {
      def save(data: UserInsert): F[UserEntity]
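    The usual Postgres answer here is INSERT ... RETURNING, which in skunk is a Query (it produces rows) rather than a Command. A sketch against the schema above; the `is_banned` column is assumed from UserEntity since the CREATE TABLE paste is truncated, and SMALLINT decodes as Short (int2), hence the conversions.

```scala
import skunk._
import skunk.codec.all._
import skunk.implicits._

case class UserEntity(id: Long, email: String, age: Option[Int], isBanned: Boolean)

// RETURNING makes the insert yield the generated row, so we use .query,
// decode the columns, and map onto the entity.
val insertUser: Query[String ~ Option[Short], UserEntity] =
  sql"""
    INSERT INTO users (email, age)
    VALUES ($varchar, ${int2.opt})
    RETURNING id, email, age, is_banned
  """
    .query(int4 ~ varchar ~ int2.opt ~ bool)
    .map { case id ~ email ~ age ~ banned =>
      UserEntity(id.toLong, email, age.map(_.toInt), banned)
    }
```

Usage would be along the lines of `session.prepare(insertUser).use(_.unique(email ~ age))` to get back the freshly inserted UserEntity.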
    Gökmen Nişancı
    Hi everyone! :beers:
    When you want to use UPDATE *** RETURNING ***, should you use a command or a query?
    Felix Bjært Hargreaves
    In doobie, there was a way to typecheck your queries against the database -- does skunk offer something similar?
    Harry Chen
    Hi there, I'm trying to find an example of writing an upsert command using Postgres's ON CONFLICT statement. Can anyone help?
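    ON CONFLICT is plain SQL, so in skunk it is just part of the statement text; an upsert that returns nothing is an ordinary Command. A sketch with a made-up table and columns for illustration:

```scala
import skunk._
import skunk.codec.all._
import skunk.implicits._

// Upsert: insert a row, or update the existing one on a unique-key conflict.
// `counters(name, value)` with a unique index on `name` is assumed here.
val upsert: Command[String ~ Int] =
  sql"""
    INSERT INTO counters (name, value)
    VALUES ($varchar, $int4)
    ON CONFLICT (name)
    DO UPDATE SET value = EXCLUDED.value
  """.command
```

Adding a RETURNING clause would turn this into a `.query(...)` instead, as with any statement that produces rows.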
    Hi! In doobie I was able to construct a fragment from a string (from a file).
    But how can I do it in skunk?
    Fragment.apply now takes 3 arguments; what should I pass as the third?
    Oliver Winks
    @Truedrog I run sql files to initialise the db before tests
    val sqlStr = Source.fromResource(path).getLines.mkString("\n")
    val sql    = sql"#$sqlStr".command
    I do it like that :point_up_2:
    if there are multiple queries in your sql file you have to wrap them in a DO block https://www.postgresql.org/docs/current/sql-do.html
    Oliver Winks
    I'm a bit confused about tracing/logging in Skunk. I'm not used to tracing. When I execute queries I get super detailed (and very useful) logs in the terminal about each query and result. This is awesome, but during tests it gets super noisy. Is there a way to silence (or at least be selective about) the output?
    @tpolecat I have tried to refactor and update the commented-out Http4sExample in "modules/example/src/main/scala-2/Http4s.scala" and came up with two versions in a gist, whereby the second one, Http4sExample2.scala, seemed to me to be the more "Skunk-idiomatic". Unfortunately only the first version, Http4sExample1.scala, works; the second throws a skunk.exception.PostgresErrorException ("..prepared statement doesn't exist"). Can you please give me a hint as to what's wrong with the second version?