Nick
@gurinderu
@mohanraj-nagasamy don't use exceptions dude) but if you can't, you can wrap your code in Try and then transform that into OptionT or EitherT
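For reference, a minimal sketch of that suggestion, assuming cats and cats-effect IO (parsePort is just an illustrative function that may throw):

  import scala.util.Try
  import cats.data.{EitherT, OptionT}
  import cats.effect.IO

  def parsePort(s: String): Int = s.toInt // may throw NumberFormatException

  // capture the exception with Try, then lift into the transformer you need
  val asEitherT: EitherT[IO, Throwable, Int] =
    EitherT(IO(Try(parsePort("8080")).toEither))

  val asOptionT: OptionT[IO, Int] =
    OptionT(IO(Try(parsePort("8080")).toOption))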
Dominic Egger
@GrafBlutwurst
Hey I am working on a very simple PR for easily wrapping caches in Resource[F[_], E] from cats
GrafBlutwurst/scalacache@b2f55bd
but I'm struggling with coming up with a clean solution for partial application of types to avoid having to specify the V parameter
does anyone have a good idea?
maybe it'd be better to pass F[Cache[V]] rather than thunk: => Cache[V]. Though ideally I'd like to enable the following syntax:
resourceCache[IO] {
    //cache init
    Cache[V]
}
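A common way to get that syntax is the partially-applied-type trick: fix F explicitly and let V be inferred from the block. A sketch under that assumption (the release action is left as a parameter rather than guessing at the PR's actual cleanup logic):

  import cats.effect.{Resource, Sync}
  import scalacache.Cache

  object resourceCache {
    def apply[F[_]]: PartiallyApplied[F] = new PartiallyApplied[F]

    final class PartiallyApplied[F[_]] {
      // V is inferred from `acquire`, so callers only spell out F
      def apply[V](acquire: => Cache[V])(release: Cache[V] => F[Unit])(implicit F: Sync[F]): Resource[F, Cache[V]] =
        Resource.make(F.delay(acquire))(release)
    }
  }

  // usage sketch: resourceCache[IO](CaffeineCache[String])(_ => IO.unit)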
Anton Kuroedov
@atk91
Hi everyone! Is it possible to get the total weight of the objects in a cache backed by Caffeine?
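Not something the scalacache API exposes as far as I can tell, but if you keep a handle on the Caffeine cache you pass in, its eviction policy can report the weighted size. A sketch (weightedSize is only present when the Caffeine builder was given maximumWeight and a weigher):

  import com.github.benmanes.caffeine.cache.Caffeine
  import scalacache.Entry
  import scalacache.caffeine.CaffeineCache

  val underlying = Caffeine.newBuilder.build[String, Entry[String]]()
  val cache = CaffeineCache(underlying)

  // total weight of all entries, if the cache tracks weights at all
  val totalWeight: Option[Long] = {
    val eviction = underlying.policy().eviction()
    if (eviction.isPresent && eviction.get.weightedSize().isPresent)
      Some(eviction.get.weightedSize().getAsLong)
    else None
  }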
liadkaradi
@liadkaradi
Hi All,
I'm trying to use a custom cacheConfig object (in order to control cache key creation). I've created an instance of it and have added it to my scope as an implicit value. So far I see that my custom implementation is not being taken into account (my breakpoint is being ignored). Can someone give me a concrete example of such a custom use? Thanks in advance.
Brian P. Holt
@bpholt
Is the scalacache-twitter-util module published anywhere? I don't see it in Maven Central
Paulo "JCranky" Siqueira
@jcranky
Hi. I just tried to update scalacache to 0.28.0 and noticed it is already depending on cats 2 milestones... is that on purpose?
Wojtek Pituła
@Krever
@jcranky can't answer the question, but why is that a problem? cats 2 should be bin-compat with cats 1
Paulo "JCranky" Siqueira
@jcranky
should, but why take the risk in a production system, while it is still a Milestone?
Arunav Sanyal
@Khalian
Is there documentation on how to create a Caffeine ScalaCache? I am looking at https://cb372.github.io/scalacache/docs/index.html and it only says "use caffeine if you want to use a high performance cache"
Arunav Sanyal
@Khalian
nvm, I figured it out: private val accountSPCache = CaffeineCache(Caffeine.newBuilder.build[String, Entry[List[String]]]). Can someone please add this (and every other version) to the documentation so that people do not have to go look at the unit tests? Thanks
Arunav Sanyal
@Khalian
In the example val result = caching("benjamin")(ttl = None), what does ttl = None mean? Does it mean the TTL falls back to whatever the underlying cache was initialised with, OR does it imply that there is no TTL and the entry is meant to live forever?
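A hedged reading of the two forms, side by side (assuming an implicit Cache[String] and the sync mode):

  import scala.concurrent.duration._
  import scalacache._
  import scalacache.modes.sync._

  // As I understand it: ttl = None means scalacache attaches no expiry of its own,
  // so eviction falls back to however the underlying cache was configured;
  // ttl = Some(...) makes scalacache treat the entry as expired after that duration.
  def example(implicit cache: Cache[String]): Unit = {
    val a = caching("benjamin")(ttl = None)("some value")
    val b = caching("benjamin")(ttl = Some(10.minutes))("some value")
    ()
  }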
Sean Kwak
@cosmir17

Hi :) I was using scalacache version 0.10.0 for scalacache-core & scalacache-caffeine. I upgraded to 0.28.0.

The following code stopped compiling:

  implicit private val inMemoryCache: ScalaCache[InMemoryRepr] = ScalaCache(CaffeineCache())
  private val CacheTime = 10.seconds
  def myMethod: Future[Boolean] = memoizeSync(CacheTime) {.....}

Can I ask how I can migrate to 0.28.0?

I have changed the first line to

  implicit private val inMemoryCache: Cache[Future[Boolean]] = CaffeineCache[Future[Boolean]]

It compiles but I don't feel that it is the right approach..
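For comparison, a sketch of the usual 0.28-style shape: the cache is typed on the value (Boolean), not on Future[Boolean], and memoizeF wraps the Future-returning body (the body below is a stand-in):

  import scala.concurrent.Future
  import scala.concurrent.duration._
  import scala.concurrent.ExecutionContext.Implicits.global
  import scalacache._
  import scalacache.caffeine.CaffeineCache
  import scalacache.memoization._
  import scalacache.modes.scalaFuture._

  object MigratedExample {
    implicit val inMemoryCache: Cache[Boolean] = CaffeineCache[Boolean]
    private val CacheTime = 10.seconds

    def myMethod: Future[Boolean] = memoizeF[Future, Boolean](Some(CacheTime)) {
      Future.successful(true) // the real computation goes here
    }
  }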

Bijan Chokoufe Nejad
@bijancn
Hey @here . Is there a release planned with cats effect 2.0 ?
Matthew Tovbin
@tovbinm
Howdy, folks! I just started using your library and it's absolutely amazing! Simple and clean API, easy integrations with Redis and others. Thank you!! ;))
Roberto Leibman
@rleibman
Hey... I'm having an issue... I'm trying to use memoizeF and for some reason it's not working: the cache keeps getting "missed" according to the logs. My code looks like this:
    private case object UserCache {

      import scalacache.ZioEffect.modes._

      private implicit val userCache: Cache[Option[User]] = CaffeineCache[Option[User]]

      private[LiveRecipeDAO] def get(userId: Int): Task[Option[User]] = memoizeF[Task, Option[User]](Option(1 hour)) {
        val zio: Task[Option[User]] = fromDBIO(for {
          userOpt <- UserQuery.filter(u => u.id === userId && !u.deleted).result.headOption
          accountOpt <- DBIO
            .sequence(
              userOpt.toSeq.map(
                user =>
                  AccountQuery.filter(account => account.id === user.accountId).result.headOption
              )
            )
            .map(_.flatten.headOption)
        } yield for {
          account <- accountOpt
          user    <- userOpt
        } yield user.toUser(account))

        val runtime = new DefaultRuntime {}
        for {
          _ <- console.putStrLn(s"Retrieving user ${userId}").provide(runtime.environment)
          zio <- zio.provide(self): Task[Option[User]]
        } yield zio
      }
    }
Roberto Leibman
@rleibman
The log does show that the value is inserted into the cache, but the cache consistently misses:
*** (s.caffeine.CaffeineCache) Cache miss for key dao.LiveRecipeDAO.$anon.UserCache.get(1)
*** (s.caffeine.CaffeineCache) Inserted value into cache with key dao.LiveRecipeDAO.$anon.UserCache.get(1) with TTL 3600000 ms
Roberto Leibman
@rleibman
... answering my own self...
Turns out that private inner objects are not "static", so each of my outer objects was getting its own cache... I moved all the caches out to a global object and that worked!
Roberto Leibman
@rleibman
How do I remove from a cache? In the example above, I've tried removing by the user, and by the userId, but neither of those worked. Only removeAll worked
eltherion
@eltherion

Hi, I'm evaluating ScalaCache usage, but I've got one problem. In order to avoid accidental flushing of a Redis prod database we have this configuration enabled (example aliases, ofc):

rename-command FLUSHALL FLUSHALLNEW
rename-command FLUSHDB FLUSHDBNEW

That means standard:

removeAll[T]()

will not work, because it uses the hardcoded FLUSHDB command, resulting in this exception:

redis.clients.jedis.exceptions.JedisDataException: ERR unknown command `FLUSHDB`, with args beginning with: 
  redis.clients.jedis.Protocol.processError(Protocol.java:130)
  redis.clients.jedis.Protocol.process(Protocol.java:164)
  redis.clients.jedis.Protocol.read(Protocol.java:218)
  redis.clients.jedis.Connection.readProtocolWithCheckingBroken(Connection.java:341)
  redis.clients.jedis.Connection.getStatusCodeReply(Connection.java:240)
  redis.clients.jedis.BinaryJedis.flushDB(BinaryJedis.java:361)
  scalacache.redis.RedisCache.$anonfun$doRemoveAll$1(RedisCache.scala:20)
  scalacache.AsyncForId$.delay(Async.scala:48)
  scalacache.redis.RedisCache.doRemoveAll(RedisCache.scala:17)
  scalacache.AbstractCache.removeAll(AbstractCache.scala:76)
  scalacache.AbstractCache.removeAll$(AbstractCache.scala:75)
  scalacache.redis.RedisCache.removeAll(RedisCache.scala:12)
  scalacache.package$RemoveAll.apply(package.scala:63)

Is there any workaround for that? I don't want to mix libraries, and I would like to avoid low-level calls, but I might accept them if that's inevitable.
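One possible workaround, sketched here rather than taken from scalacache itself: skip removeAll and delete the keys through the same JedisPool the cache was built from. KEYS blocks Redis, so in practice you would narrow the pattern to your key prefix (or switch to SCAN):

  import scala.collection.JavaConverters._
  import redis.clients.jedis.{Jedis, JedisPool}

  def removeAllByPattern(pool: JedisPool, pattern: String = "*"): Unit = {
    val jedis: Jedis = pool.getResource
    try {
      val keys = jedis.keys(pattern).asScala
      if (keys.nonEmpty) jedis.del(keys.toSeq: _*)
    } finally jedis.close()
  }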

Roberto Leibman
@rleibman
I don't know if this thing is on ;) I'm working with the zio branch from @dieproht. I'm trying to memoizeF a ZIO[SomeResource, Throwable, Something]; I do include the zio modes, but I still get this error:
Error:(123, 62) Could not find a Mode for type dao.RepositoryIO.
If you want synchronous execution, try importing the sync mode:
import scalacache.modes.sync._
If you are working with Scala Futures, import the scalaFuture mode
and don't forget you will also need an ExecutionContext:
import scalacache.modes.scalaFuture._
import scala.concurrent.ExecutionContext.Implicits.global

        memoizeF[RepositoryIO, Option[User]](Option(1.hour)) {
I have no problem if it's a Task[Something], only if it's a more complicated ZIO that requires resources
Roberto Leibman
@rleibman
I guess the mode only supports Task... ok... so I put my cache closer to the edge of the app and got it to work.
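For reference, a sketch of that "closer to the edge" shape: provide the environment first so the memoized effect is a plain Task, which is the only shape the mode covers. SomeEnv, User and underlyingGet are illustrative stand-ins:

  import scala.concurrent.duration._
  import zio.{Task, ZIO}
  import scalacache._
  import scalacache.memoization._

  final case class User(id: Int)
  trait SomeEnv
  def underlyingGet(userId: Int): ZIO[SomeEnv, Throwable, Option[User]] =
    ZIO.succeed(Some(User(userId))) // stand-in for the real lookup

  def cachedGet(env: SomeEnv, userId: Int)(
      implicit cache: Cache[Option[User]], mode: Mode[Task]): Task[Option[User]] =
    memoizeF[Task, Option[User]](Some(1.hour)) {
      underlyingGet(userId).provide(env) // now a Task[Option[User]]
    }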
Roberto Leibman
@rleibman
@dieproht?
@dieproht did your zio interop ever make it into the main repo?
Roberto Leibman
@rleibman
I hope you don't mind... I'm actually using it, so I put in a PR for it.
Roberto Leibman
@rleibman
@cb372 Can you look at my PR please?
Roberto Leibman
@rleibman
bump
bengraygh
@bengraygh

How do I remove from a cache? In the example above, I've tried removing by the user, and by the userId, but neither of those worked. Only removeAll worked

I think I have figured out why this is happening to you, but I'm not sure what the best way to fix it is. Did you find a solution?

I think the problem is that the method call is part of the key stored by scalacache, and calling it from a different method causes a cache miss.
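A sketch of what that implies for removal, using the key format visible in the log lines earlier ("dao.LiveRecipeDAO.$anon.UserCache.get(1)"): the memoized key is the fully qualified enclosing method call, so removing by userId alone targets a different key. Whether the string below matches exactly also depends on your CacheConfig's key prefix:

  import scalacache._
  import scalacache.modes.sync._

  // remove the entry that memoizeF stored for get(userId)
  def evictMemoizedUser[V](userId: Int)(implicit cache: Cache[V]): Any =
    remove(s"dao.LiveRecipeDAO.$$anon.UserCache.get($userId)")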

Miguel Vilá
@miguel-vila
hello :wave: , we have observed some strange behavior when using scalacache along with ZIO. We are using a Task mode and we noticed that the parameters were not being included as part of the key. We factored out the code by extracting the underlying operation and it worked as expected; I'll link a gist.
You can see the differences between the working version and the non-working version there.
So I'm wondering: is this something that's expected? Were we misusing scalacache?
Roberto Leibman
@rleibman
@miguel-vila I didn't know there was even an official zio mode. I put a PR back in April but haven't heard anything since.
sc6l6d3v
@sc6l6d3v
Is there any possibility of caching an fs2 stream?
Simon Redfern
@simonredfern_gitlab
Folks, newbie here. How do I make the following work with Maven?: import scalacache.serialization.binary._
PawelJ-PL
@PawelJ-PL
Hi, I'm wondering why the traits Cache and CacheAlg don't include an effect type parameter (it was moved to the method declarations). I've just created an issue for this (cb372/scalacache#417), where I've explained my doubts more precisely.
SWAPNIL SONAWANE
@iamswapnil44_twitter
Folks, is there any way to retrieve all the keys from a Caffeine cache?
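Not via the scalacache API as far as I know, but if you keep a handle on the Caffeine cache you built the wrapper from, its asMap() view exposes the current keys. A sketch:

  import scala.collection.JavaConverters._
  import com.github.benmanes.caffeine.cache.Caffeine
  import scalacache.Entry
  import scalacache.caffeine.CaffeineCache

  val underlying = Caffeine.newBuilder.build[String, Entry[String]]()
  val cache = CaffeineCache(underlying)

  // all keys currently held by the underlying Caffeine cache
  val allKeys: Set[String] = underlying.asMap().keySet().asScala.toSet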
Brendan Maguire
@brendanmaguire

Hi. Hoping someone can help with a question I have regarding using scalacache with cats.effect.IO. If I run the following code:

import cats.effect.{ExitCode, IO, IOApp}
import cats.implicits._
import scalacache.guava.GuavaCache
import scalacache.{cachingF, CacheConfig, Flags}

import scala.concurrent.duration.DurationLong

object CacheTest extends IOApp {

  override def run(args: List[String]): IO[ExitCode] =
    (
      cached("abc"),
      cached("abc"),
      cached("def")
    ).parMapN(_ + _ + _) *> IO.pure(ExitCode.Success)

  private val cache = GuavaCache[String](CacheConfig.defaultCacheConfig)

  private def cached(key: String) =
    cachingF(key)(ttl = None)(loadValue(key))(cache, scalacache.CatsEffect.modes.async, Flags.defaultFlags)

  private def loadValue(key: String) =
    IO(println(s"Loading $key")) *>
      IO.sleep(1.second) *>
      IO {
        println(s"Loaded $key")
        key
      }
}

I get the output:

Loading abc
Loading abc
Loading def
Loaded abc
Loaded abc
Loaded def

cachingF invokes the loadValue function twice with "abc". Instead I would like it to only invoke it once and use the resulting IO for both calls to cached. Is this possible using scalacache or do I need to implement this manually by storing an actual IO in the cache and removing the key if the IO fails at a later stage?
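Scalacache stores the computed value, not the in-flight computation, so the two concurrent "abc" loads do run independently here. A sketch of the manual route mentioned at the end of the question, independent of scalacache (illustrative names, cats-effect 2 style):

  import cats.effect.Concurrent
  import cats.effect.concurrent.{Deferred, Ref}
  import cats.implicits._

  // The first caller for a key runs `load`; concurrent callers for the same key
  // wait on the same Deferred instead of starting their own load.
  final class SingleFlight[F[_], K, V](
      state: Ref[F, Map[K, Deferred[F, Either[Throwable, V]]]])(implicit F: Concurrent[F]) {

    def apply(key: K)(load: F[V]): F[V] =
      Deferred[F, Either[Throwable, V]].flatMap { fresh =>
        state
          .modify { inFlight =>
            inFlight.get(key) match {
              case Some(existing) => (inFlight, (existing, false))              // someone is already loading
              case None           => (inFlight + (key -> fresh), (fresh, true)) // we are the leader
            }
          }
          .flatMap {
            case (d, isLeader) =>
              if (isLeader)
                load.attempt
                  .flatTap(d.complete)                 // wake up any waiters
                  .flatTap(_ => state.update(_ - key)) // allow future reloads of this key
                  .rethrow
              else d.get.rethrow
          }
      }
  }

  object SingleFlight {
    def create[F[_]: Concurrent, K, V]: F[SingleFlight[F, K, V]] =
      Ref.of[F, Map[K, Deferred[F, Either[Throwable, V]]]](Map.empty).map(new SingleFlight(_))
  }

Wiring it in would mean wrapping the load, e.g. handing singleFlight(key)(loadValue(key)) to cachingF, so concurrent calls for the same key share one load while it is in flight.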

SWAPNIL SONAWANE
@iamswapnil44_twitter

Hello All,

Is there any way to get Redis cache stats like hit rate and hit count, as we can for Caffeine by using caffeineCache.stats.hitCount()?
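Not through scalacache's RedisCache as far as I know; Redis itself only exposes server-wide counters (keyspace_hits / keyspace_misses) via INFO. A sketch of reading them through the same Jedis pool:

  import redis.clients.jedis.JedisPool

  def redisHitRate(pool: JedisPool): Option[Double] = {
    val jedis = pool.getResource
    try {
      // INFO stats comes back as "key:value" lines
      val stats = jedis.info("stats")
        .split("\r\n")
        .collect { case line if line.contains(":") =>
          val Array(k, v) = line.split(":", 2)
          k -> v
        }
        .toMap
      for {
        hits   <- stats.get("keyspace_hits").map(_.toLong)
        misses <- stats.get("keyspace_misses").map(_.toLong)
        if hits + misses > 0
      } yield hits.toDouble / (hits + misses)
    } finally jedis.close()
  }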

SWAPNIL SONAWANE
@iamswapnil44_twitter

Hello all,

Is there any way to provide custom codecs for a Redis cache?

Here is my code:

def creatCache[T](implicit  codec: Codec[T]) = {
      val jedisPool = new JedisPool(new JedisPoolConfig(), "localhost", 6379, 20000)
      val customisedRedisCache: Cache[T] = RedisCache[T](jedisPool)
      customisedRedisCache
}

and calling this function as

 implicit val jobPropsCache = creatCache[Map[String,String]]

The code is failing when calling the function. The error is:

 No implicits provides for codec: Codec[Map[String,String]]
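One way to satisfy the missing implicit, assuming the scalacache-circe module is added as a dependency: it derives a Codec from circe's Encoder/Decoder, which already exist for Map[String, String]. The exact import below is from memory, so double-check it against the docs:

  import redis.clients.jedis.{JedisPool, JedisPoolConfig}
  import scalacache.Cache
  import scalacache.redis.RedisCache
  import scalacache.serialization.circe._ // derives a Codec from circe's Encoder/Decoder

  val jedisPool = new JedisPool(new JedisPoolConfig(), "localhost", 6379, 20000)

  // circe provides Encoder/Decoder for Map[String, String] out of the box,
  // so RedisCache finds its Codec implicitly
  implicit val jobPropsCache: Cache[Map[String, String]] =
    RedisCache[Map[String, String]](jedisPool)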
Yoann Guyot
@ygu

Hi there, I'm migrating from Scala 2.12 to 2.13, including code that was written using scalacache and guava. But now the compiler rejects that code. These are the issues:

import scalacache._
import scalacache.serialization.InMemoryRepr
import guava._

class MyClass(cache: ScalaCache[InMemoryRepr]) {...}

gives me: object InMemoryRepr is not a member of package scalacache.serialization, and: not found: type ScalaCache

Also:
sync.cachingWithTTL(key)(Duration.apply(1, DAYS)) { ... }

gives me: value cachingWithTTL is not a member of object scalacache.sync

Finally:
ScalaCache(GuavaCache())

gives me:

overloaded method apply with alternatives:
[error]   [V](underlying: com.google.common.cache.Cache[String,scalacache.Entry[V]])(implicit config: scalacache.CacheConfig): scalacache.guava.GuavaCache[V] <and>
[error]   [V](implicit config: scalacache.CacheConfig): scalacache.guava.GuavaCache[V]
[error]  cannot be applied to ()

Can anyone help with migrating to the latest version of scalacache? (I can't find any scaladoc or changelog to help replace the old code.)
So, what should I use instead of scalacache.serialization.InMemoryRepr, which seems to have been removed from the library?
Why is ScalaCache not found?
What should I use instead of sync.cachingWithTTL?
How do I use the new constructors of GuavaCache to get the same as the old GuavaCache()? Isn't there a default one?

@cosmir17 it seems you've experienced the same kind of issue, have you found out how to fix it ?
Yoann Guyot
@ygu

I may have found answers:

  • I use GuavaCache[MyClass] instead of ScalaCache[InMemoryRepr]
  • I use sync.caching(key)(Some(Duration.apply(1, DAYS))) { ... }(cache = myImplicitCache, mode = mode, flags = new Flags()) instead of sync.cachingWithTTL(key)(Duration.apply(1, DAYS)) { ... }
  • I use GuavaCache(config) with implicit val config = CacheConfig() instead of ScalaCache(GuavaCache())

I hope this is correct.
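Pulled together, the new-API shape from those bullets looks roughly like this (MyClass is illustrative, and the sync mode import stands in for the explicit mode argument):

  import scala.concurrent.duration._
  import scalacache._
  import scalacache.guava.GuavaCache
  import scalacache.modes.sync._

  final case class MyClass(name: String)

  implicit val cacheConfig: CacheConfig = CacheConfig()
  implicit val guavaCache: Cache[MyClass] = GuavaCache[MyClass]

  // equivalent of the old sync.cachingWithTTL(key)(Duration(1, DAYS)) { ... }
  val loaded: MyClass = caching("key")(ttl = Some(1.day)) {
    MyClass("computed")
  }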

Roberto Leibman
@rleibman
Are there any facilities for multi-level layering of cache systems, in either scalacache or any of the cache systems that scalacache uses? Basically I'd like a short TTL for an in-memory cache and a longer TTL for a cloud-level cache, before I hit the actual process that gets the data.
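Not a built-in feature as far as I know; below is a hand-rolled read-through sketch over two Cache[V] levels (a local one with a short TTL and a remote one with a longer TTL), using the cats-effect mode that appears earlier in this channel. layeredGet and its parameters are illustrative:

  import scala.concurrent.duration._
  import cats.effect.IO
  import cats.implicits._
  import scalacache.Cache
  import scalacache.CatsEffect.modes._

  def layeredGet[V](key: String)(local: Cache[V], remote: Cache[V])(
      localTtl: FiniteDuration, remoteTtl: FiniteDuration)(load: IO[V]): IO[V] =
    local.get[IO](key).flatMap {
      case Some(v) => IO.pure(v)
      case None =>
        remote.get[IO](key).flatMap {
          case Some(v) => local.put[IO](key)(v, Some(localTtl)).as(v)
          case None =>
            load.flatMap { v =>
              remote.put[IO](key)(v, Some(remoteTtl)) *> local.put[IO](key)(v, Some(localTtl)).as(v)
            }
        }
    }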