
I've been trying to build a web API throttling mechanism in ASP.NET Core that "penalizes" excess usage with an exponentially growing denial window, e.g. the first 3 tries are allowed, the 4th is blocked if it falls within 60 seconds, the 5th is blocked if within 120s, the 6th is blocked for 240s, etc.
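To make the intended behaviour concrete, here is roughly the per-key logic I'm after (a minimal sketch only, ignoring concurrency; the type names, the reset policy and the exact cache calls are illustrative):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

// Illustrative sketch of the desired behaviour, not production code.
// Tracks attempts per client key and doubles the penalty window on each
// excess attempt: 4th -> 60s, 5th -> 120s, 6th -> 240s, ...
public sealed record ThrottleState(int Attempts, DateTimeOffset BlockedUntil);

public sealed class ExponentialThrottle
{
    private const int FreeAttempts = 3;
    private static readonly TimeSpan BasePenalty = TimeSpan.FromSeconds(60);
    private readonly IMemoryCache _cache;

    public ExponentialThrottle(IMemoryCache cache) => _cache = cache;

    public bool TryAcquire(string clientKey)
    {
        var now = DateTimeOffset.UtcNow;
        var state = _cache.Get<ThrottleState>(clientKey)
                    ?? new ThrottleState(0, DateTimeOffset.MinValue);

        if (now < state.BlockedUntil)
            return false;                                   // still inside a penalty window

        var attempts = state.Attempts + 1;
        if (attempts <= FreeAttempts)
        {
            // Free attempts; the entry expires after 60s so quiet clients reset naturally.
            _cache.Set(clientKey, state with { Attempts = attempts }, BasePenalty);
            return true;
        }

        // Excess attempt: the penalty doubles each time (60s, 120s, 240s, ...).
        var exponent = Math.Min(attempts - FreeAttempts - 1, 20); // capped to avoid overflow
        var penalty = TimeSpan.FromTicks(BasePenalty.Ticks << exponent);

        // Keep the entry around a bit longer than the block itself; the exact
        // reset policy is a design choice and not the point here.
        _cache.Set(clientKey, new ThrottleState(attempts, now + penalty), penalty + BasePenalty);
        return false;
    }
}
```

The per-key logic itself is easy enough; the trouble starts with how the underlying cache behaves once it is full, as described below.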

The new API rate-limiting in .NET 7 is great, but rate limiting isn't throttling, and I couldn't find a way to augment it enough, either by partitioning or otherwise, since it lacks that kind of growing penalty window.
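For comparison, this is roughly what the built-in limiter offers (policy name and numbers are illustrative): a fixed quota per window, with no mechanism for the window itself to grow after repeated violations.

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRateLimiter(options =>
{
    // 3 requests per 60-second window; further requests are rejected,
    // but the window never widens for repeat offenders.
    options.AddFixedWindowLimiter("per-client", o =>
    {
        o.PermitLimit = 3;
        o.Window = TimeSpan.FromSeconds(60);
    });
});

var app = builder.Build();
app.UseRateLimiter();
app.MapGet("/api/resource", () => "ok").RequireRateLimiting("per-client");
app.Run();
```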

I tried different options using LazyCache, directly with IMemoryCache, and my favourite, IDistributedCache - and looked at what others did, such as what @JsAndDotNet suggested here. However, all these variants suffer from the same security flaw when the cache is full (a flaw for throttling purposes, that is): adding a cache entry when the cache is full doesn't return any error (as discussed here). In fact, the call continues as if the entry had been added successfully. You can catch the rejection asynchronously through an eviction callback - but by that point, access has already been granted.
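A minimal repro of the behaviour I mean (illustrative only; keys and sizes are made up). With a SizeLimit set, the third Set silently does nothing, and the only signal is the eviction callback, which fires later on a background thread:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions { SizeLimit = 2 });

for (var i = 0; i < 3; i++)
{
    var options = new MemoryCacheEntryOptions { Size = 1 }
        .RegisterPostEvictionCallback((key, _, reason, _) =>
            Console.WriteLine($"{key} rejected/evicted: {reason}")); // fires asynchronously, after the fact

    cache.Set($"client-{i}", DateTimeOffset.UtcNow, options);        // no error, even for the 3rd entry
}

Console.WriteLine(cache.Count); // 2 - the 3rd entry was silently dropped (and compaction may shrink this further, asynchronously)
```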

An attacker could simply flood the cache memory until it's full, effectively bypassing the throttling mechanism for all subsequent requests. When the cache memory is full, compaction is triggered - which is another bad thing for throttling security, because it gets rid of potentially relevant entries in the cache.

My question is: how do I avoid that "cache full" vulnerability and detect it synchronously, so I can block all transactions until memory is available again?

  • If the attacker can spoof the metadata that the cache key is made of, doesn't this mean that they already hacked the system? Commented Dec 5, 2022 at 7:37
  • No spoofing is needed in the attack scenario I mentioned. I meant that an attacker who wants to circumvent the throttling can simply make enough API calls to fill the cache (if the SizeLimit is set to 1024, for example, it will only take 1024 calls), and once that happens, no subsequent call will enter the cache, thereby bypassing the throttling. To make it clearer: assume the system is open source - so the attacker knows to make 1024 calls. Commented Dec 5, 2022 at 7:58
  • Is each request stored in the cache?! Commented Dec 5, 2022 at 10:49
  • No. Cache key can be session (IP:port), user ID, API key, etc. So assume a new record entry for every IP/port attempted, for example. Commented Dec 5, 2022 at 15:25
  • For these types of keys how would the attacker make 1024 different entries? Commented Dec 5, 2022 at 21:25
