Hi, I'm evaluating ScalaCache, but I've got one problem. In order to avoid accidental flushing of a Redis prod database we have this configuration enabled (example aliases, of course):
rename-command FLUSHALL FLUSHALLNEW
rename-command FLUSHDB FLUSHDBNEW
That means the standard removeAll[T]() will not work, because it uses the hardcoded FLUSHDB command, resulting in this exception:
redis.clients.jedis.exceptions.JedisDataException: ERR unknown command `FLUSHDB`, with args beginning with:
redis.clients.jedis.Protocol.processError(Protocol.java:130)
redis.clients.jedis.Protocol.process(Protocol.java:164)
redis.clients.jedis.Protocol.read(Protocol.java:218)
redis.clients.jedis.Connection.readProtocolWithCheckingBroken(Connection.java:341)
redis.clients.jedis.Connection.getStatusCodeReply(Connection.java:240)
redis.clients.jedis.BinaryJedis.flushDB(BinaryJedis.java:361)
scalacache.redis.RedisCache.$anonfun$doRemoveAll$1(RedisCache.scala:20)
scalacache.AsyncForId$.delay(Async.scala:48)
scalacache.redis.RedisCache.doRemoveAll(RedisCache.scala:17)
scalacache.AbstractCache.removeAll(AbstractCache.scala:76)
scalacache.AbstractCache.removeAll$(AbstractCache.scala:75)
scalacache.redis.RedisCache.removeAll(RedisCache.scala:12)
scalacache.package$RemoveAll.apply(package.scala:63)
Is there any workaround for that? I don't want to mix libraries and I would like to avoid low-level calls, but I might accept that if it's inevitable.
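One workaround sketch (just an assumption about your setup, not something the library offers): since ScalaCache's doRemoveAll is hard-wired to Jedis' flushDB, you can clear the keys yourself through the same JedisPool you hand to RedisCache. Note that KEYS is O(n) and blocks Redis, so on a large production database prefer SCAN, or send your renamed FLUSHDB alias through Jedis' generic sendCommand if your Jedis version exposes it.
import redis.clients.jedis.JedisPool

// Hedged workaround: reuse the JedisPool that backs the RedisCache and delete keys by
// pattern instead of calling the (renamed) FLUSHDB command.
def removeAllByPattern(pool: JedisPool, pattern: String = "*"): Unit = {
  val jedis = pool.getResource
  try {
    val keys = jedis.keys(pattern).toArray(Array.empty[String])
    if (keys.nonEmpty) jedis.del(keys: _*)
  } finally jedis.close()
}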
Error:(123, 62) Could not find a Mode for type dao.RepositoryIO.
If you want synchronous execution, try importing the sync mode:
import scalacache.modes.sync._
If you are working with Scala Futures, import the scalaFuture mode
and don't forget you will also need an ExecutionContext:
import scalacache.modes.scalaFuture._
import scala.concurrent.ExecutionContext.Implicits.global
memoizeF[RepositoryIO, Option[User]](Option(1.hour)) {
How do I remove an entry from a cache? In the example above, I've tried removing by the user and by the userId, but neither of those worked. Only removeAll worked.
I think I have figured out why this is happening to you, but I'm not sure what the best way to fix it is. Did you find a solution?
I think the problem is that the enclosing method name is part of the key stored by scalacache, so calling memoize from a different method produces a different key and therefore a cache miss.
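For illustration (the names here are made up, not from this thread): scalacache.memoization derives the cache key from the enclosing class name, the method name, and the argument values, so the same lookup issued from two different methods lands on two different keys.
import scala.concurrent.duration._
import scalacache._
import scalacache.guava.GuavaCache
import scalacache.memoization._
import scalacache.modes.sync._

class UserRepo {
  implicit val cache: Cache[String] = GuavaCache[String]

  // key is roughly "UserRepo.findUser(42)"
  def findUser(userId: Long): String =
    memoize(Some(1.hour)) { loadFromDb(userId) }

  // key is roughly "UserRepo.fetchUser(42)" -- it will never hit the entry written above
  def fetchUser(userId: Long): String =
    memoize(Some(1.hour)) { loadFromDb(userId) }

  private def loadFromDb(userId: Long): String = s"user-$userId"
}
That would also explain the remove question above: a plain remove(user) or remove(userId) targets that literal key, not the generated method-based key, so only removeAll (or removing the exact generated key) clears memoized entries.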
Cache and CacheAlg don't include an effect type parameter (it was moved to the method declarations). I've just created an issue for this (cb372/scalacache#417), where I've explained my doubts more precisely.
Hi. Hoping someone can help with a question I have regarding using scalacache with cats.effect.IO. If I run the following code:
import cats.effect.{ExitCode, IO, IOApp}
import cats.implicits._
import scalacache.guava.GuavaCache
import scalacache.{cachingF, CacheConfig, Flags}
import scala.concurrent.duration.DurationLong
object CacheTest extends IOApp {

  override def run(args: List[String]): IO[ExitCode] =
    (
      cached("abc"),
      cached("abc"),
      cached("def")
    ).parMapN(_ + _ + _) *> IO.pure(ExitCode.Success)

  private val cache = GuavaCache[String](CacheConfig.defaultCacheConfig)

  private def cached(key: String) =
    cachingF(key)(ttl = None)(loadValue(key))(cache, scalacache.CatsEffect.modes.async, Flags.defaultFlags)

  private def loadValue(key: String) =
    IO(println(s"Loading $key")) *>
      IO.sleep(1.second) *>
      IO {
        println(s"Loaded $key")
        key
      }
}
I get the output:
Loading abc
Loading abc
Loading def
Loaded abc
Loaded abc
Loaded def
cachingF invokes the loadValue function twice with "abc". Instead I would like it to only invoke it once and use the resulting IO for both calls to cached. Is this possible using scalacache, or do I need to implement this manually by storing an actual IO in the cache and removing the key if the IO fails at a later stage?
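As far as I know cachingF won't deduplicate in-flight loads for you, but here is a rough sketch of the manual approach you describe, assuming cats-effect 2.x: keep a memoized IO per key in a Ref so concurrent callers share one in-flight load. The name singleFlight is made up, and failed loads are not evicted here; a real version would remove the key on failure.
import cats.effect.{Concurrent, IO}
import cats.effect.concurrent.Ref
import cats.implicits._

def singleFlight[K, V](load: K => IO[V])(implicit F: Concurrent[IO]): IO[K => IO[V]] =
  Ref.of[IO, Map[K, IO[V]]](Map.empty).map { state => (key: K) =>
    // Concurrent.memoize ensures the underlying load runs at most once per stored IO
    Concurrent.memoize(load(key)).flatMap { candidate =>
      state.modify { m =>
        m.get(key) match {
          case Some(existing) => (m, existing)                      // another caller got there first
          case None           => (m.updated(key, candidate), candidate)
        }
      }.flatten                                                     // run whichever memoized IO won
    }
  }
Inside the IOApp you could build it once with singleFlight(loadValue) and route cached through the returned function, so the second concurrent "abc" call waits for the first load instead of starting its own.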
Hello all, is there any way to provide custom codecs for the Redis cache? Here is my code:
def creatCache[T](implicit codec: Codec[T]) = {
  val jedisPool = new JedisPool(new JedisPoolConfig(), "localhost", 6379, 20000)
  val customisedRedisCache: Cache[T] = RedisCache[T](jedisPool)
  customisedRedisCache
}
and I am calling this function as
implicit val jobPropsCache = creatCache[Map[String,String]]
The code fails at the call to this function; the error is:
No implicits provides for codec: Codec[Map[String,String]]
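A minimal sketch of a hand-rolled codec, assuming the ScalaCache 0.x serialization API (a Codec with encode: A => Array[Byte] and decode: Array[Byte] => Codec.DecodingResult[A], plus the FailedToDecode wrapper); Java serialization is used purely for illustration:
import java.io._
import scalacache.serialization.{Codec, FailedToDecode}

implicit val mapCodec: Codec[Map[String, String]] = new Codec[Map[String, String]] {

  def encode(value: Map[String, String]): Array[Byte] = {
    val bytes = new ByteArrayOutputStream()
    val out   = new ObjectOutputStream(bytes)
    try out.writeObject(value) finally out.close()
    bytes.toByteArray
  }

  def decode(data: Array[Byte]): Codec.DecodingResult[Map[String, String]] =
    try {
      val in = new ObjectInputStream(new ByteArrayInputStream(data))
      try Right(in.readObject().asInstanceOf[Map[String, String]]) finally in.close()
    } catch {
      case e: Exception => Left(FailedToDecode(e))
    }
}
With that implicit in scope, creatCache[Map[String,String]] should resolve its Codec; if I remember correctly, import scalacache.serialization.binary._ may also give you a ready-made Java-serialization codec.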
Hi there, I'm migrating from Scala 2.12 to 2.13, including code that was written using scalacache and guava, but now the compiler rejects that code. These are the issues:
import scalacache._
import scalacache.serialization.InMemoryRepr
import guava._
class MyClass(cache: ScalaCache[InMemoryRepr]) {...}
gives me: object InMemoryRepr is not a member of package scalacache.serialization, and: not found: type ScalaCache
Also: sync.cachingWithTTL(key)(Duration.apply(1, DAYS)) { ... }
gives me: value cachingWithTTL is not a member of object scalacache.sync
Finally: ScalaCache(GuavaCache())
gives me:
overloaded method apply with alternatives:
[error] [V](underlying: com.google.common.cache.Cache[String,scalacache.Entry[V]])(implicit config: scalacache.CacheConfig): scalacache.guava.GuavaCache[V] <and>
[error] [V](implicit config: scalacache.CacheConfig): scalacache.guava.GuavaCache[V]
[error] cannot be applied to ()
Can anyone help with migrating to the latest version of scalacache? (I can't find any scaladoc or changelog to help replace the old code.)
So, what should I use instead of scalacache.serialization.InMemoryRepr, which seems to have been removed from the library?
Why is ScalaCache not found?
What should I use instead of sync.cachingWithTTL?
How do I use the new constructors of GuavaCache to get the same as the old GuavaCache()? Isn't there a default one?
I may have found answers:
GuavaCache[MyClass] instead of ScalaCache[InMemoryRepr]
sync.caching(key)(Some(Duration.apply(1, DAYS))) { ... }(cache = myImplicitCache, mode = mode, flags = new Flags()) instead of sync.cachingWithTTL(key)(Duration.apply(1, DAYS)) { ... }
GuavaCache(config) with implicit val config = CacheConfig() instead of ScalaCache(GuavaCache())
I hope this is correct.
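That matches my reading of the 0.2x API; putting it together, a minimal sketch (the key and value here are placeholders, and CacheConfig's implicit default is assumed to come from its companion object):
import scala.concurrent.duration._
import scalacache._
import scalacache.guava.GuavaCache
import scalacache.modes.sync._

implicit val cache: Cache[String] = GuavaCache[String]   // builds its own underlying Guava cache

// old: sync.cachingWithTTL(key)(Duration.apply(1, DAYS)) { ... }
def lookup(key: String): String =
  sync.caching(key)(Some(1.day)) {
    s"expensive value for $key"  // stand-in for the real computation
  }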
When using ScalaCache with cats-effect and Caffeine, is it a good idea to set the Caffeine executor field to the cats-effect Blocker being used?
e.g.
Caffeine.newBuilder().maximumSize(10000L).executor(blocker.blockingContext.execute(_)).build[String, Entry[String]]
Is there a Mode for ZIO effects?
Hi! If I use a Caffeine cache with expireAfterWrite set, I'm not sure how the ttl param would behave in cachingF?
private val underlyingCaffeineCache = Caffeine.newBuilder()
  .initialCapacity(1000)
  .maximumSize(10000L)
  .expireAfterWrite(60, TimeUnit.SECONDS)
  .build[String, Entry[T]]()
implicit val caffeineCache: Cache[T] = CaffeineCache(underlyingCaffeineCache)
cachingF(key)(ttl.some)
Hi team,
We are using the library dependencies below in our Scala app:
"com.github.cb372" %% "scalacache-redis" % "0.28.0",
"com.github.cb372" %% "scalacache-cats-effect" % "0.28.0"
We are planning to upgrade our Scala app from 2.12 to 3.0.0. Are there any plans on scalacache's end to release a next version compatible with Scala 3?
import scalacache.memoization._
import scala.concurrent.duration._
import scalacache.caffeine._
import scala.concurrent.Future
implicit val cache: CaffeineCache[String] = CaffeineCache[String]
import scalacache.modes.scalaFuture._
import scala.concurrent.ExecutionContext.Implicits.global
def memFut(s: String): Future[String] = memoize(Option(30.seconds)) {
  Future(s) // I want to be able to cache an async call here
}
How should this work when the value is a Future[A]? Do I need a CaffeineCache[Future[String]]?
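If I understand the question correctly, you shouldn't need to store the Future itself: memoizeF takes the effectful block, so with the scalaFuture mode the cache holds the String result while the lookup stays asynchronous. A sketch assuming the same 0.x API as the snippet above:
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Future
import scala.concurrent.duration._
import scalacache.caffeine._
import scalacache.memoization._
import scalacache.modes.scalaFuture._

implicit val cache: CaffeineCache[String] = CaffeineCache[String]

def memFut(s: String): Future[String] =
  memoizeF(Some(30.seconds)) {
    Future(s) // the asynchronous computation; only its String result is cached
  }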