Thursday, November 17, 2016

CDN benchmarking

Today, when you want to compare the performance of different CDN providers in a specific region, your first reflex is to check public Real User Monitoring (RUM) data, with Cedexis being one of the best-known RUM providers. This data is very useful, and some CDN providers buy it in order to benchmark themselves against competitors and work on closing performance gaps.

In the following, I will highlight what exactly RUM measures, so that you do not jump to imprecise conclusions. Let's focus on the latency KPI and list the different components that contribute to it:
  • Last mile network latency from the user to the CDN edge on the ISP network, which reflects how close the edge is to the user.
  • Caching latency, incurred when the CDN edge does not have the content and must go back to the origin to fetch it.
  • Connectivity latency from the CDN edge to the origin.
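The components above can be combined into a simple latency model, sketched below. This is only an illustration with hypothetical numbers (the function name and all millisecond values are my own, not from any real CDN): a cache hit is served entirely from the edge, while a cache miss adds the edge-to-origin fetch on top of the last mile.

```python
# Hypothetical latency model for a single CDN request.
# All values are illustrative milliseconds, not measurements of any real CDN.

def request_latency(last_mile_ms: float,
                    origin_fetch_ms: float,
                    cache_hit: bool) -> float:
    """Latency seen by the user for one request.

    A cache hit is served from the edge (last mile only); a cache miss
    adds the edge-to-origin fetch (caching + connectivity latency).
    """
    if cache_hit:
        return last_mile_ms
    return last_mile_ms + origin_fetch_ms

# A RUM probe object is always cached, so it only ever sees the hit path:
print(request_latency(last_mile_ms=20, origin_fetch_ms=80, cache_hit=True))   # 20
# A long-tail object may miss and pay the full path:
print(request_latency(last_mile_ms=20, origin_fetch_ms=80, cache_hit=False))  # 100
```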

In general, RUM measurements are based on the time (RTD) it takes to serve the user a predefined object from a CDN edge. Since the object is the same and never changes, it is always cached on the edges, so the measurements reflect only the last mile network latency. But that is not the whole picture, because in real life CDN edges need to fill content from the origin:
  • Depending on the cache eviction policy and the disk space available, a request may be a cache miss. The less storage capacity an edge has, the higher the caching latency of the CDN.
  • Depending on the CDN backbone, the more hops you need to cross to reach the origin, the higher the connectivity latency. On this aspect, for example, Tier 1 IP networks that provide CDN services are well optimized.
For highly cacheable content, a comparison based only on last mile latency makes sense, but it has limits when that is not the case, such as for long-tail video streaming or dynamic content.
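To illustrate that limit, here is a sketch comparing two fictional CDNs with made-up numbers (both providers and all figures are hypothetical): CDN A has a faster last mile but a longer path back to the origin, CDN B the opposite. The winner flips as the cache hit ratio drops, which a RUM probe on an always-cached object cannot show.

```python
# Hypothetical comparison of two fictional CDNs (illustrative numbers only).

def effective_latency_ms(last_mile_ms: float,
                         origin_fetch_ms: float,
                         hit_ratio: float) -> float:
    """Expected per-request latency, weighting hits and misses by hit ratio."""
    return (hit_ratio * last_mile_ms
            + (1 - hit_ratio) * (last_mile_ms + origin_fetch_ms))

# CDN A: nearer edges (fast last mile) but a longer fetch back to the origin.
# CDN B: slower last mile but a much shorter origin path.
for hit_ratio in (0.99, 0.80, 0.50):
    a = effective_latency_ms(last_mile_ms=15, origin_fetch_ms=120, hit_ratio=hit_ratio)
    b = effective_latency_ms(last_mile_ms=25, origin_fetch_ms=40, hit_ratio=hit_ratio)
    print(f"hit ratio {hit_ratio:.2f}: CDN A {a:.1f} ms, CDN B {b:.1f} ms")
```

At a 99% hit ratio CDN A wins (16.2 ms vs 25.4 ms), but at 80% the ranking already reverses (39.0 ms vs 33.0 ms), which is exactly the long-tail scenario the RUM probe misses.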
