Hi :) I was using scalacache version 0.10.0 for scalacache-core & scalacache-caffeine, and I upgraded to 0.28.0.
The following code stopped compiling:
implicit private val inMemoryCache: ScalaCache[InMemoryRepr] = ScalaCache(CaffeineCache())
private val CacheTime = 10.seconds
def myMethod: Future[Boolean] = memoizeSync(CacheTime) {.....}
Can I ask how I can migrate to 0.28.0?
I have changed the first line to
implicit private val inMemoryCache: Cache[Future[Boolean]] = CaffeineCache[Future[Boolean]]
It compiles, but I don't feel it's the right approach..
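In case it helps, here is a minimal sketch of how the 0.28.x API is usually used with Caffeine and Scala Futures (method and import names follow the 0.28 docs as I remember them, so treat this as a sketch rather than gospel): the cache is typed by the value, not by Future[...], and the effect is picked via a Mode import.
import scala.concurrent.Future
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

import scalacache._
import scalacache.caffeine._
import scalacache.memoization._
import scalacache.modes.scalaFuture._   // Mode[Future]; needs an ExecutionContext in scope

object MyService {
  // the cache holds the value type, not Future[Boolean]
  implicit private val inMemoryCache: Cache[Boolean] = CaffeineCache[Boolean]
  private val CacheTime = 10.seconds

  // memoize is the 0.28 counterpart of the old memoizeSync call; under the Future mode it returns Future[Boolean]
  def myMethod: Future[Boolean] = memoize[Future, Boolean](Some(CacheTime)) {
    // ... compute the Boolean here ...
    true
  }
}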
private case object UserCache {
import scalacache.ZioEffect.modes._
private implicit val userCache: Cache[Option[User]] = CaffeineCache[Option[User]]
private[LiveRecipeDAO] def get(userId: Int): Task[Option[User]] = memoizeF[Task, Option[User]](Option(1 hour)) {
val zio: Task[Option[User]] = fromDBIO(for {
userOpt <- UserQuery.filter(u => u.id === userId && !u.deleted).result.headOption
accountOpt <- DBIO
.sequence(
userOpt.toSeq.map(
user =>
AccountQuery.filter(account => account.id === user.accountId).result.headOption
)
)
.map(_.flatten.headOption)
} yield for {
account <- accountOpt
user <- userOpt
} yield user.toUser(account))
val runtime = new DefaultRuntime {}
for {
_ <- console.putStrLn(s"Retrieving user ${userId}").provide(runtime.environment)
zio <- zio.provide(self): Task[Option[User]]
} yield zio
}
}
*** (s.caffeine.CaffeineCache) Cache miss for key dao.LiveRecipeDAO.$anon.UserCache.get(1)
*** (s.caffeine.CaffeineCache) Inserted value into cache with key dao.LiveRecipeDAO.$anon.UserCache.get(1) with TTL 3600000 ms
Hi, I'm evaluating ScalaCache, but I've hit one problem. To avoid accidental flushing of a Redis prod database we have this configuration enabled (example aliases, of course):
rename-command FLUSHALL FLUSHALLNEW
rename-command FLUSHDB FLUSHDBNEW
That means the standard
removeAll[T]()
will not work, because it uses the hardcoded FLUSHDB command, resulting in this exception:
redis.clients.jedis.exceptions.JedisDataException: ERR unknown command `FLUSHDB`, with args beginning with:
redis.clients.jedis.Protocol.processError(Protocol.java:130)
redis.clients.jedis.Protocol.process(Protocol.java:164)
redis.clients.jedis.Protocol.read(Protocol.java:218)
redis.clients.jedis.Connection.readProtocolWithCheckingBroken(Connection.java:341)
redis.clients.jedis.Connection.getStatusCodeReply(Connection.java:240)
redis.clients.jedis.BinaryJedis.flushDB(BinaryJedis.java:361)
scalacache.redis.RedisCache.$anonfun$doRemoveAll$1(RedisCache.scala:20)
scalacache.AsyncForId$.delay(Async.scala:48)
scalacache.redis.RedisCache.doRemoveAll(RedisCache.scala:17)
scalacache.AbstractCache.removeAll(AbstractCache.scala:76)
scalacache.AbstractCache.removeAll$(AbstractCache.scala:75)
scalacache.redis.RedisCache.removeAll(RedisCache.scala:12)
scalacache.package$RemoveAll.apply(package.scala:63)
Is there any workaround for that? I don't want to mix libraries and I would like to avoid low-level calls, but I can accept that if it's inevitable.
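One possible workaround, as a sketch rather than anything scalacache offers out of the box: keep a handle on the JedisPool you pass to RedisCache and clear keys yourself instead of calling removeAll, so the renamed FLUSHDB is never issued. The helper below is hypothetical and uses only standard Jedis calls (keys/del); on a large keyspace you would want SCAN instead of KEYS.
import redis.clients.jedis.{Jedis, JedisPool}
import scala.collection.JavaConverters._

object RedisMaintenance {
  // Hypothetical helper: delete every key matching a pattern instead of issuing FLUSHDB.
  // Note: KEYS blocks the Redis server on big keyspaces; prefer SCAN in production.
  def clearByPattern(pool: JedisPool, pattern: String = "*"): Unit = {
    val jedis: Jedis = pool.getResource
    try {
      val keys = jedis.keys(pattern).asScala
      if (keys.nonEmpty) jedis.del(keys.toSeq: _*)
    } finally jedis.close()
  }
}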
Error:(123, 62) Could not find a Mode for type dao.RepositoryIO.
If you want synchronous execution, try importing the sync mode:
import scalacache.modes.sync._
If you are working with Scala Futures, import the scalaFuture mode
and don't forget you will also need an ExecutionContext:
import scalacache.modes.scalaFuture._
import scala.concurrent.ExecutionContext.Implicits.global
memoizeF[RepositoryIO, Option[User]](Option(1.hour)) {
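The missing piece is an implicit scalacache.Mode for your effect type. If dao.RepositoryIO has (or can be given) a cats-effect Async instance, the cats-effect integration derives the Mode for you, much like the ZIO modes import in the snippet further up. A self-contained sketch with cats.effect.IO standing in for RepositoryIO (the cache and lookup are illustrative):
import cats.effect.IO
import scala.concurrent.duration._
import scalacache._
import scalacache.caffeine._
import scalacache.memoization._
import scalacache.CatsEffect.modes._   // derives a Mode[F] from cats.effect.Async[F]

object ModeExample {
  implicit val userAgeCache: Cache[Int] = CaffeineCache[Int]

  // With the Mode in scope, memoizeF compiles; swap IO for RepositoryIO if it has an Async instance.
  def userAge(userId: Int): IO[Int] =
    memoizeF[IO, Int](Some(1.hour)) {
      IO(42) // illustrative lookup
    }
}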
How do I remove from a cache? In the example above, I've tried removing by the user and by the userId, but neither of those worked. Only removeAll worked.
I think I have figured out why this is happening to you, but am not sure what is the best way to fix it. Did you find a solution?
I think the problem is that the method call is part of the key stored by scalacache, and calling it from a different method causes a cache miss.
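For what it's worth, that matches the log lines further up: memoize builds the key from the enclosing class, method and arguments (e.g. dao.LiveRecipeDAO.$anon.UserCache.get(1)), so remove(userId) points at a different key. A sketch of one way around it, assuming the 0.2x API: use explicit key parts with cachingF, and pass the same key parts to remove (the cache setup and types below are illustrative).
import cats.effect.IO
import scala.concurrent.duration._
import scalacache._
import scalacache.caffeine._
import scalacache.CatsEffect.modes._

object UserCacheByKey {
  final case class User(id: Int, name: String)

  implicit val userCache: Cache[Option[User]] = CaffeineCache[Option[User]]

  // cachingF uses the key parts you pass in, not the enclosing method name...
  def get(userId: Int): IO[Option[User]] =
    cachingF("user", userId)(ttl = Some(1.hour)) {
      IO(Some(User(userId, "someone"))) // illustrative lookup
    }

  // ...so the same key parts can be used to invalidate that entry.
  def evict(userId: Int): IO[Any] =
    remove("user", userId)
}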
Cache and CacheAlg don't include an effect type parameter (it was moved to the method declarations). I've just created an issue for this (cb372/scalacache#417), where I've explained my doubts more precisely.
Hi. Hoping someone can help with a question I have regarding using scalacache with cats.effect.IO. If I run the following code:
import cats.effect.{ExitCode, IO, IOApp}
import cats.implicits._
import scalacache.guava.GuavaCache
import scalacache.{cachingF, CacheConfig, Flags}
import scala.concurrent.duration.DurationLong
object CacheTest extends IOApp {
override def run(args: List[String]): IO[ExitCode] =
(
cached("abc"),
cached("abc"),
cached("def")
).parMapN(_ + _ + _) *> IO.pure(ExitCode.Success)
private val cache = GuavaCache[String](CacheConfig.defaultCacheConfig)
private def cached(key: String) =
cachingF(key)(ttl = None)(loadValue(key))(cache, scalacache.CatsEffect.modes.async, Flags.defaultFlags)
private def loadValue(key: String) =
IO(println(s"Loading $key")) *>
IO.sleep(1.second) *>
IO {
println(s"Loaded $key")
key
}
}
I get the output:
Loading abc
Loading abc
Loading def
Loaded abc
Loaded abc
Loaded def
cachingF invokes the loadValue function twice with "abc". Instead I would like it to only invoke it once and use the resulting IO for both calls to cached. Is this possible using scalacache, or do I need to implement this manually by storing an actual IO in the cache and removing the key if the IO fails at a later stage?
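Not an authoritative answer, but here is a minimal sketch of the manual approach you describe: store the IO itself (made idempotent with Concurrent.memoize) and insert it atomically, so concurrent callers for the same key share one execution, and drop the entry if that IO fails. It assumes cats-effect 2.x and uses Guava directly; the class and method names are made up.
import cats.effect.{Concurrent, ContextShift, IO}
import cats.implicits._
import com.google.common.cache.CacheBuilder

// Hypothetical single-flight cache: at most one in-flight load per key.
class IoCache[V](implicit cs: ContextShift[IO]) {
  private val underlying = CacheBuilder.newBuilder().build[String, IO[V]]()

  def cached(key: String)(load: IO[V]): IO[V] =
    // Concurrent.memoize gives an IO that runs `load` at most once and replays the result
    Concurrent.memoize(load).flatMap { once =>
      // computeIfAbsent is atomic, so concurrent callers get the same memoized IO
      IO(underlying.asMap().computeIfAbsent(key, _ => once)).flatten
        .onError { case _ => IO(underlying.invalidate(key)) } // don't keep failed loads
    }
}
With something along these lines, the two cached("abc") calls in your example should print "Loading abc" only once.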
Hello all, is there any way to provide custom codecs for the Redis cache?
Here is my code:
def creatCache[T](implicit codec: Codec[T]) = {
val jedisPool = new JedisPool(new JedisPoolConfig(), "localhost", 6379, 20000)
val customisedRedisCache: Cache[T] = RedisCache[T](jedisPool)
customisedRedisCache
}
and calling this function as
implicit val jobPropsCache = creatCache[Map[String,String]]
The code fails when calling the function. The error is:
No implicits provides for codec: Codec[Map[String,String]]
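In case it helps: the implicit Codec has to be resolvable at the call site, and you can supply your own instance. Below is a sketch of a hand-rolled Codec[Map[String, String]] using plain Java serialization; the Codec shape (encode/decode returning Codec.DecodingResult) and the tryDecode helper follow the 0.28 sources as I remember them, so please double-check the names against your version.
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}
import scalacache.serialization.Codec
import scalacache.serialization.Codec.DecodingResult

object MapCodec {
  // Hand-rolled codec for Map[String, String] based on Java serialization.
  implicit val mapCodec: Codec[Map[String, String]] = new Codec[Map[String, String]] {

    def encode(value: Map[String, String]): Array[Byte] = {
      val bos = new ByteArrayOutputStream()
      val oos = new ObjectOutputStream(bos)
      try oos.writeObject(value) finally oos.close()
      bos.toByteArray
    }

    def decode(bytes: Array[Byte]): DecodingResult[Map[String, String]] =
      Codec.tryDecode {
        val ois = new ObjectInputStream(new ByteArrayInputStream(bytes))
        try ois.readObject().asInstanceOf[Map[String, String]] finally ois.close()
      }
  }
}
With MapCodec.mapCodec imported at the call site, creatCache[Map[String, String]] should resolve its implicit codec.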
Hi there, I'm migrating from Scala 2.12 to 2.13, including code that was written using scalacache and Guava, but the compiler now rejects that code. These are the issues:
import scalacache._
import scalacache.serialization.InMemoryRepr
import guava._
class MyClass(cache: ScalaCache[InMemoryRepr]) {...}
gives me: object InMemoryRepr is not a member of package scalacache.serialization, and: not found: type ScalaCache
Also: sync.cachingWithTTL(key)(Duration.apply(1, DAYS)) { ... }
gives me: value cachingWithTTL is not a member of object scalacache.sync
Finally: ScalaCache(GuavaCache())
gives me:
overloaded method apply with alternatives:
[error] [V](underlying: com.google.common.cache.Cache[String,scalacache.Entry[V]])(implicit config: scalacache.CacheConfig): scalacache.guava.GuavaCache[V] <and>
[error] [V](implicit config: scalacache.CacheConfig): scalacache.guava.GuavaCache[V]
[error] cannot be applied to ()
Can anyone help with migrating to the latest version of scalacache? (I can't find any scaladoc or changelog to help replace the old code.)
So, what should I use instead of scalacache.serialization.InMemoryRepr, which seems to have been removed from the library?
Why is ScalaCache not found?
What should I use instead of sync.cachingWithTTL?
How do I use the new constructors of GuavaCache to get the same as the old GuavaCache()? Isn't there a default one?
I may have found answers:
GuavaCache[MyClass] instead of ScalaCache[InMemoryRepr]
sync.caching(key)(Some(Duration.apply(1, DAYS))) { ... }(cache = myImplicitCache, mode = mode, flags = new Flags()) instead of sync.cachingWithTTL(key)(Duration.apply(1, DAYS)) { ... }
GuavaCache(config) with implicit val config = CacheConfig() instead of ScalaCache(GuavaCache())
I hope this is correct.
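That matches my understanding of the newer API. For completeness, here is a sketch of what the migrated class might look like (a recent 0.2x scalacache assumed, synchronous mode shown; the class and method names are illustrative):
import scala.concurrent.duration._
import scalacache._
import scalacache.guava.GuavaCache
import scalacache.modes.sync._   // Mode[Id] for synchronous calls

class MyClass {
  // the cache is now parameterised by the value type; ScalaCache[InMemoryRepr] is gone
  implicit val config: CacheConfig = CacheConfig()
  implicit val cache: Cache[String] = GuavaCache[String]

  // sync.caching(key)(ttl)(block) replaces sync.cachingWithTTL(key)(ttl)(block)
  def lookup(key: String): String =
    sync.caching(key)(Some(1.day)) {
      // ... compute the value here ...
      "computed-" + key
    }
}
With the cache, mode and flags resolved implicitly, you don't need to pass them explicitly as in your snippet.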