Cache
A good overview can be found here.
Caches are used to manage state for stateless applications, to reduce load on databases for read-intensive workloads, and more.
Some common strategies, based on the operations performed against the system, can be found here.
The most popular strategies are (a minimal sketch of both follows this list):
Lazy Loading/Cache Aside/Lazy Population
Write Through
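Below is a minimal sketch of both strategies using redis-py; the endpoint hostname, the key naming, and the in-process dictionary standing in for a real database are placeholders, not an official pattern.

```python
import json
import redis

# Placeholder ElastiCache endpoint; replace with your cluster's address.
r = redis.Redis(host="my-cache.example.cache.amazonaws.com", port=6379)

_fake_db = {}  # stand-in for a real database, so the sketch is self-contained

def get_user_from_db(user_id):
    return _fake_db.get(user_id, {"id": user_id})

def save_user_to_db(user_id, user):
    _fake_db[user_id] = user

def get_user(user_id):
    """Lazy Loading / Cache Aside: check the cache first, populate it on a miss."""
    cached = r.get(f"user:{user_id}")
    if cached is not None:
        return json.loads(cached)                        # cache hit
    user = get_user_from_db(user_id)                     # cache miss: read the database
    r.setex(f"user:{user_id}", 3600, json.dumps(user))   # populate with a 1-hour TTL
    return user

def update_user(user_id, user):
    """Write Through: write the database and the cache in the same code path."""
    save_user_to_db(user_id, user)
    r.setex(f"user:{user_id}", 3600, json.dumps(user))
```

Cache Aside only pays the cache-write cost on misses but can serve stale data until the TTL expires, while Write Through keeps the cache fresh at the cost of an extra write on every update.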
Caches are in-memory databases with high performance and low latency.
Being a managed service, the following tasks are handled by AWS:
OS maintenance/patching
Optimizations
Setup
Configuration
Monitoring
Failure recovery and backups
Offers managed Redis or Memcached variants.
For Redis in cluster mode, there can be up to 5 read replicas per shard.
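A minimal connection sketch, assuming a cluster-mode-enabled endpoint (the hostname below is a placeholder); with redis-py, `read_from_replicas=True` lets reads be served by the replicas rather than only the primaries.

```python
from redis.cluster import RedisCluster

rc = RedisCluster(
    host="my-cluster.example.clustercfg.cache.amazonaws.com",  # placeholder endpoint
    port=6379,
    read_from_replicas=True,  # spread reads across the shard's read replicas
)

rc.set("greeting", "hello")
print(rc.get("greeting"))
```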
Cache eviction can occur in 3 ways
The item is deleted explicitly from the cache.
The item is evicted because memory is full and the item has not been used recently.
The item's time-to-live (TTL) expires; TTL can range from a few seconds to hours or days.
If too many evictions happen due to memory pressure, you should scale up or out; the sketch below shows setting a TTL and checking the eviction counter.
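A short sketch, assuming the same placeholder endpoint: it sets a per-item TTL and reads the `evicted_keys` counter from `INFO stats`, which climbs when items are evicted under memory pressure.

```python
import redis

r = redis.Redis(host="my-cache.example.cache.amazonaws.com", port=6379)

r.setex("session:42", 300, "some-state")   # explicit TTL of 300 seconds
print(r.ttl("session:42"))                 # remaining time-to-live in seconds

stats = r.info("stats")
print(stats.get("evicted_keys", 0))        # grows when memory-pressure evictions occur
```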
Encryption at rest and in transit is available to keep data secure.
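As a sketch, if in-transit encryption and an AUTH token are enabled on the cache, a client connects over TLS; the endpoint and token below are placeholders.

```python
import redis

r = redis.Redis(
    host="my-cache.example.cache.amazonaws.com",  # placeholder endpoint
    port=6379,
    ssl=True,                  # TLS for encryption in transit
    password="my-auth-token",  # placeholder AUTH token
)
print(r.ping())
```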