In-memory caching in minimal APIs
ASP.NET Core provides two abstractions for working with caching: IMemoryCache and IDistributedCache. Whereas the former is used to implement in-memory caching, the latter is used to implement distributed caching.
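Before either interface can be injected into an endpoint, the corresponding services must be registered. The following is a minimal sketch of that registration in Program.cs; AddDistributedMemoryCache is used here only as an in-process stand-in for a real distributed cache provider.
var builder = WebApplication.CreateBuilder(args);

// Registers IMemoryCache for in-memory caching.
builder.Services.AddMemoryCache();

// Registers an IDistributedCache implementation. AddDistributedMemoryCache stores
// entries in process memory and is convenient for development; a production setup
// would typically plug in a provider such as Redis or SQL Server instead.
builder.Services.AddDistributedMemoryCache();

var app = builder.Build();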
The following use of IMemoryCache shows how to retrieve data from the cache when the requested data is available. If the requested data is not present in the in-memory cache, the application retrieves it from the data store (using a repository), stores it in the in-memory cache, and returns it.
app.MapGet("authors/getall", (IMemoryCache cache,
IAuthorRepository authorRepository) =>
{
if (!cache.TryGetValue("get-authors",
out Listing authors))
{
authors = authorRepository.GetAll();
var cacheEntryOptions = new MemoryCacheEntryOptions()
.SetAbsoluteExpiration(TimeSpan.FromMinutes(5))
.SetSlidingExpiration(TimeSpan.FromMinutes(1));
cache.Set("get-authors", authors, cacheEntryOptions);
}
return Outcomes.Okay(authors);
});
As you can see in the preceding code snippet, the cached content will stay in memory for a maximum of 5 minutes, and it will be evicted earlier if it is not requested again within the 1-minute sliding expiration window.
Distributed caching in minimal APIs
Distributed caching improves the performance and scalability of applications by distributing the load across multiple nodes or servers. The servers can be located either in the same network or in different networks spread across geographical distances.
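In ASP.NET Core, a distributed cache is consumed through the IDistributedCache interface, and the concrete backing store is chosen at registration time. The following is a minimal sketch of registering Redis as the backing store via the Microsoft.Extensions.Caching.StackExchangeRedis package; the connection string and instance name are placeholder values.
// Requires the Microsoft.Extensions.Caching.StackExchangeRedis NuGet package.
builder.Services.AddStackExchangeRedisCache(options =>
{
    // Placeholder connection string; point this at your Redis server.
    options.Configuration = "localhost:6379";
    // Optional prefix applied to all cache keys created by this application.
    options.InstanceName = "minimal-api:";
});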
The following code demonstrates how to implement distributed caching in a minimal API endpoint in ASP.NET Core. In this example, the endpoint returns all the author records from the distributed cache if the data is available there. If the requested data is not available in the distributed cache, the endpoint adds the data to the cache and then returns it.
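A minimal sketch of such an endpoint is shown below. It assumes the same Author type and IAuthorRepository as before, uses the hypothetical route authors/getall-distributed, and serializes the list with System.Text.Json, since IDistributedCache stores strings or byte arrays rather than objects.
// Requires: using System.Text.Json; and using Microsoft.Extensions.Caching.Distributed;
app.MapGet("authors/getall-distributed", async (IDistributedCache cache,
    IAuthorRepository authorRepository) =>
{
    // Cache hit: deserialize the stored JSON and return it.
    var cachedJson = await cache.GetStringAsync("get-authors");
    if (cachedJson is not null)
    {
        return Results.Ok(JsonSerializer.Deserialize<List<Author>>(cachedJson));
    }

    // Cache miss: load from the data store, cache the serialized result, and return it.
    var authors = authorRepository.GetAll();
    var cacheEntryOptions = new DistributedCacheEntryOptions()
        .SetAbsoluteExpiration(TimeSpan.FromMinutes(5))
        .SetSlidingExpiration(TimeSpan.FromMinutes(1));
    await cache.SetStringAsync("get-authors",
        JsonSerializer.Serialize(authors), cacheEntryOptions);
    return Results.Ok(authors);
});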