
Thursday, March 2, 2017

Video streaming security

How can I protect my OTT platform from attacks? How can I stop bad actors from stealing my content and monetizing it instead of me? Every OTT player should be asking these questions about video streaming security. In this post, I will share some answers from a delivery (CDN) perspective, building on my customers' experiences. Video streaming security is tackled with respect to two components: the OTT platform and the content.


Typically, the OTT platform (origin) sits behind a CDN. As a result, bad actors will either attack the origin through the CDN or completely bypass it and attack the origin directly. I suggest the following solutions and best practices to enhance your platform security:
  • Isolate the origin on an infrastructure separate from other services, such as the email server.
  • Avoid using an easy-to-discover FQDN for the origin (such as my-origin.ottdomain.tv) and do not expose it publicly (better yet, use a completely different domain name).
  • On your firewall (a cloud-based firewall is even better), whitelist only CDN IP ranges. Most CDNs can ensure that origin fill is done from a predefined IP range.
  • Use CDN parent caches to reduce traffic back to the origin. When an asset has a widely distributed audience, all CDN edge servers may otherwise go back to the origin to fill, which can bring it down like a DDoS attack!
  • Use CDN-based geoblocking to block traffic from countries where you have no audience. For example, if you do not operate in Latin America, you would do well to block that region, since it has a considerable concentration of DDoS botnets.
  • Understand how your CDN is capable of mitigating layer 3/4/7 DDoS attacks.
  • Implement restrictions on the CDN: block HTTP POST requests if they are not used, and ignore query strings that are not part of your normal usage.
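The request-filtering ideas above can be sketched as a small origin-side check. This is a minimal illustration, not a production firewall: the IP ranges, methods, and query parameters below are hypothetical placeholders you would replace with your CDN provider's published ranges and your platform's real usage.

```python
import ipaddress

# Hypothetical allowlists -- substitute your CDN's published ranges and the
# query parameters your platform actually uses.
CDN_RANGES = [ipaddress.ip_network("203.0.113.0/24"),   # example range (TEST-NET-3)
              ipaddress.ip_network("198.51.100.0/24")]  # example range (TEST-NET-2)
ALLOWED_METHODS = {"GET", "HEAD"}
ALLOWED_PARAMS = {"token", "bitrate"}

def accept_request(client_ip, method, query_params):
    """Accept only requests that come from a CDN range, use an allowed
    HTTP method, and carry no unexpected query parameters."""
    ip = ipaddress.ip_address(client_ip)
    if not any(ip in net for net in CDN_RANGES):
        return False                      # origin reached directly: drop
    if method not in ALLOWED_METHODS:
        return False                      # e.g. block POST if unused
    return set(query_params) <= ALLOWED_PARAMS

print(accept_request("203.0.113.10", "GET", {"token": "abc"}))   # True
print(accept_request("192.0.2.1", "GET", {}))                    # False: not a CDN IP
print(accept_request("203.0.113.10", "POST", {}))                # False: POST blocked
```

In practice you would enforce this at the firewall or web server layer, but the decision logic is the same.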

I have seen many cases where customers suffered from their content appearing on third-party websites, thus losing potential revenue. The following solutions can help protect against content theft:
  • Apply simple HTTP best practices on the CDN, like enforcing a cross-origin policy and blocking requests based on the Referer header.
  • Authenticate streams on the CDN using cookie-based token authentication (pay attention to cookie device compatibility and the applicable legal framework) or path-based token authentication.
  • Add DRM protection to the video workflow. This is the ultimate solution but comes with cost and complexity drawbacks, since the DRM industry is still very fragmented and poorly standardized. Make sure that your CDN is compatible with the chosen DRM technology (for example, Widevine's WVM format requires that the CDN support byte-range requests).
  • Use TLS for video delivery to reduce the risk of a third party sniffing your content on clear, unencrypted channels. Make sure that your CDN is up to date with the latest TLS security best practices (secure cipher suites, OCSP stapling, keep-alive, False Start...).
  • Contact the CDN and IP providers of websites that are stealing your content in order to get the illegal content blocked and dissuade them from continuing.
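To make path-based token authentication concrete, here is a minimal sketch of the usual HMAC scheme: the origin issues a token binding a URL path to an expiry time, and the CDN edge recomputes the HMAC to reject expired or forged tokens. The secret, paths, and token format are illustrative assumptions; real CDN token schemes (Akamai, CloudFront, etc.) each define their own format.

```python
import hashlib
import hmac
import time

SECRET = b"shared-secret-between-origin-and-cdn"  # hypothetical key

def make_token(path, expiry):
    """Token = expiry timestamp + HMAC-SHA256 over 'expiry:path'."""
    mac = hmac.new(SECRET, f"{expiry}:{path}".encode(), hashlib.sha256).hexdigest()
    return f"{expiry}-{mac}"

def check_token(path, token, now=None):
    """The edge recomputes the HMAC; a changed path, a tampered expiry,
    or a past expiry all invalidate the token."""
    now = int(time.time()) if now is None else now
    try:
        expiry_str, mac = token.split("-", 1)
        expiry = int(expiry_str)
    except ValueError:
        return False
    if now > expiry:
        return False
    expected = hmac.new(SECRET, f"{expiry}:{path}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(mac, expected)

tok = make_token("/vod/movie1/master.m3u8", expiry=2_000_000_000)
print(check_token("/vod/movie1/master.m3u8", tok, now=1_700_000_000))  # True
print(check_token("/vod/other/master.m3u8", tok, now=1_700_000_000))   # False: wrong path
```

The same construction works for cookie-based tokens; only the transport of the token changes.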

One last piece of advice, common to both platform and content security: monitor CDN logs and build relevant security-oriented analytics around them to gain better insight into your streaming and act quickly to mitigate any abnormal behavior.

Thursday, November 17, 2016

CDN benchmarking

Today, when we want to compare the performance of different CDN providers in a specific region, the first reflex is to check public Real User Monitoring (RUM) data, Cedexis being one of the best-known RUM providers. RUM data is very useful, and many CDN providers buy it in order to benchmark against competitors and continuously work on improving performance.



In the following, I will highlight what exactly community RUM measures, so that you do not jump to wrong conclusions. Let's focus on the latency KPI and list the different components that contribute to it:
  • Last-mile latency to the CDN edge, which reflects how close the edge is to the user from a network perspective.
  • Cache-width latency, incurred when the CDN edge does not have the content locally and must fetch it from elsewhere (a peer edge, a parent cache, or the origin).
  • Connectivity latency from the CDN to the origin when a cache fill is needed.


In general, community RUM measurements are based on calculating the time (round-trip delay, RTD) it takes to serve users a predefined object from CDN edges. Since the object is the same and never changes, it is always cached on the edges. Consequently, community RUM measures only last-mile network latency, which adequately reflects the latency performance of highly popular objects already in cache.

Nevertheless, that's only part of the picture. In real life, CDNs have different capabilities and strategies for storing content beyond the edges and filling it from the origin:
  • Depending on content popularity, the CDN cache purge policy, the disk space available on the edge (cache width), and the parent caching architecture, a request will be a cache miss or a cache hit, with an impact on performance. VoD providers with large video libraries know this topic very well.
  • Depending on CDN upstream connectivity, the number of hops needed to fill from the origin impacts connectivity latency. CDNs that built their own backbone benefit from good upstream connectivity. Dynamic content is very sensitive to this aspect.
As a final word, we also need to be aware that CDNs tend to optimize the configuration used by RUM measurements for this specific use case.
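A back-of-the-envelope model makes the gap between RUM figures and real-world latency visible. This is a rough sketch under assumed numbers (all RTT values are hypothetical), not a measurement methodology: it simply weights the fill penalty by the miss ratio.

```python
def expected_latency_ms(hit_ratio, edge_rtt, parent_rtt, origin_rtt, parent_hit_ratio):
    """Rough decomposition: a hit is served at edge RTT; a miss adds the
    fill time from a cache parent or, failing that, from the origin."""
    miss = 1.0 - hit_ratio
    fill = parent_hit_ratio * parent_rtt + (1.0 - parent_hit_ratio) * origin_rtt
    return hit_ratio * edge_rtt + miss * (edge_rtt + fill)

# RUM's always-cached test object: hit ratio is 1.0, so only edge RTT is seen.
print(expected_latency_ms(1.0, 20, 60, 150, 0.5))   # 20.0
# A long-tail VoD asset with a 70% edge hit ratio looks very different:
print(expected_latency_ms(0.7, 20, 60, 150, 0.5))   # 51.5
```

Even with these toy numbers, the second case is more than twice the RUM figure, which is the point of the paragraph above.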

Monday, December 21, 2015

Increase QoE of video streaming by optimizing the encoding of the video library

A week ago, Netflix engineers revealed a project they have been working on for some time to optimize the QoE of the videos they stream to their subscribers. Their approach is based on video encoding settings in the back end, rather than focusing only on the front end (peering, codec, streaming format, CDN...). The expected gain is 20% on overall traffic, which is huge for a player accounting for more than a third of global Internet traffic. Additionally, Netflix will deliver better quality at the same connection bandwidth, which is critical for addressing emerging markets.

Content providers encode several representations of the same video asset, where each representation is a combination of a bitrate and a resolution. Below is an example of representations for a 4:3 main-profile video, according to Apple's recommendations:


The goal of having several representations is to serve the best one to each user according to their screen (TV vs. mobile) and connection (4G, ADSL, FTTH...). With adaptive streaming, the same user can switch representations during playback in order to adapt to bandwidth variations.
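The player-side selection logic is simple to sketch. The ladder below is a hypothetical set of representations (not Apple's or Netflix's actual ladder), and the 0.8 safety margin is an illustrative choice; real ABR algorithms also use buffer level and throughput history.

```python
# Hypothetical ladder: (bitrate_kbps, resolution) pairs, sorted ascending.
LADDER = [(400, "416x234"), (1200, "640x360"), (2500, "960x540"),
          (4500, "1280x720"), (7800, "1920x1080")]

def pick_representation(measured_kbps, safety=0.8):
    """Pick the highest-bitrate rung that fits within a safety margin of
    the measured throughput; fall back to the lowest rung otherwise."""
    budget = measured_kbps * safety
    candidates = [rep for rep in LADDER if rep[0] <= budget]
    return candidates[-1] if candidates else LADDER[0]

print(pick_representation(6000))   # (4500, '1280x720')
print(pick_representation(300))    # (400, '416x234') -- lowest-rung fallback
```

The adaptive part is simply re-running this choice as the bandwidth estimate changes from segment to segment.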

So far, content providers have encoded all their video assets with the same set of representations. Netflix noticed that this doesn't make sense, because at the same quality/resolution a cartoon requires less bitrate than an action film. Each video asset has its own "entropy" that should be taken into account when generating the representation set. This is what Netflix does with its per-title encoding approach.

To find the best representation set for a title, Netflix encodes it at different resolutions (480p, 720p, 1080p...), then for each resolution draws the curve of quality (PSNR) versus encoding bitrate (the black, green, and blue curves). Notice that a 720p representation at 400 Kbps will have worse quality than a 480p representation encoded at the same bitrate and upscaled to 720p. The optimal representation set is the set of points closest to the red curve.
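The "red curve" selection above amounts to taking, at each bitrate, the resolution with the best quality, i.e. the upper envelope of the per-resolution curves. The sketch below illustrates this with made-up PSNR samples (not Netflix data); it assumes PSNR is measured after upscaling every encode to the display resolution, so the curves are comparable.

```python
# Hypothetical per-resolution rate-quality samples: resolution -> [(kbps, PSNR dB)].
CURVES = {
    "480p":  [(200, 32.0), (400, 35.0), (800, 36.5)],
    "720p":  [(400, 33.5), (800, 37.0), (1600, 39.5)],
    "1080p": [(800, 36.0), (1600, 40.0), (3200, 42.5)],
}

def best_per_bitrate(curves):
    """For each sampled bitrate, keep the (resolution, PSNR) pair lying on
    the upper envelope -- the per-title ladder candidates."""
    by_rate = {}
    for res, points in curves.items():
        for kbps, psnr in points:
            if kbps not in by_rate or psnr > by_rate[kbps][1]:
                by_rate[kbps] = (res, psnr)
    return dict(sorted(by_rate.items()))

for kbps, (res, psnr) in best_per_bitrate(CURVES).items():
    print(kbps, res, psnr)
```

With these toy numbers, 480p wins at 400 Kbps and 720p only takes over at 800 Kbps, matching the observation in the paragraph above; a title with different "entropy" would shift those crossover points.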


Of course, this approach costs more computing resources in the video preparation workflow, but the gains are worth it.

On the same subject, I read an interesting scientific article proposing to look not only at the type of the title (cartoon vs. action film...) but also at its popularity, the limits of contracted CDN capacity, users' screen resolutions, users' connections, video storage... The authors make interesting findings on the optimal representation set:
  • Titles with high "entropy", like action films, require more representations than low-"entropy" titles like cartoons.
  • The number of representations per resolution depends on the distribution of devices: HDTVs vs. mobile phones.
  • For a given resolution, lower bitrates are closer to one another than higher bitrates.
Of course, these conclusions hold for the test conditions; they could differ, for example, for a content provider targeting only mobile devices in emerging markets.

As the content provider's environment is constantly changing (proliferation of mobile devices, expansion strategies in emerging markets with low-bandwidth connections, shifting title popularity, evolving peering agreements...), I believe it would be interesting to continuously re-encode the representations of the video library in order to guarantee the best overall user QoE under the constraints imposed by this moving environment.


Monday, July 15, 2013

Content Delivery Networks

As promised previously, I'll dedicate this post to Content Delivery Networks. The goal of a CDN is to improve the performance of networked applications, whether on a private network or over the Internet.
A CDN is a distributed set of interconnected surrogate servers that offloads traffic from the origin server. When a user makes a request, the intelligent CDN platform redirects it to the best server capable of answering, in order to deliver the best performance and user experience.


A CDN's performance is measured by the hit ratio: the fraction of requests served by surrogates instead of the origin server. The hit ratio depends on the nature of the data: CDNs are well suited to static and basic dynamic data, but not to security-sensitive data or data modified concurrently by different users.
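The hit ratio is easy to grasp by replaying a request trace through a cache. The sketch below uses a simple LRU eviction policy and a made-up trace; real CDN caches use more sophisticated policies, so this is only an illustration of how the metric is computed.

```python
from collections import OrderedDict

def simulate_hit_ratio(requests, cache_size):
    """Replay a request trace through an LRU cache of the given size and
    return the fraction of requests served from cache."""
    cache = OrderedDict()
    hits = 0
    for obj in requests:
        if obj in cache:
            hits += 1
            cache.move_to_end(obj)        # refresh recency on a hit
        else:
            cache[obj] = True
            if len(cache) > cache_size:
                cache.popitem(last=False) # evict the least recently used
    return hits / len(requests)

# A popularity-skewed trace (one hot object) caches well even in a tiny cache.
skewed = ["a", "a", "b", "a", "c", "a", "b", "a"]
print(simulate_hit_ratio(skewed, cache_size=2))   # 0.5
```

Intuitively, the more skewed the popularity distribution, the higher the hit ratio for a given cache size, which is why CDNs shine on static, widely shared content.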

A CDN is a great solution for businesses that depend on networked applications, typically e-commerce websites. First, a CDN improves and harmonizes the customer experience, leading to more loyal customers and a better conversion rate, thus generating more revenue. Second, a CDN helps simplify and improve the efficiency of the web server infrastructure:
  • Reduce infrastructure (Internet access, servers...) by offloading traffic to the CDN provider's surrogate servers.
  • Avoid sizing the infrastructure for peak traffic, making it more efficient.
  • Avoid the complexity of scaling the infrastructure to absorb more traffic (load-balancing systems, heavy investments...).
  • Increase the availability of the infrastructure by leveraging a redundant and distributed CDN platform.
In the following Prezi slides, I try to explain the different components that make up a CDN system, the different possible architectures, and their stakes. This presentation is inspired by Dinesh Verma's great book on CDNs.




Personally, my first experience with CDNs was at Orange Labs, where we tried to leverage a CDN for a Video on Demand service. Now, I am excited to design and propose CDN solutions to my customers, since Orange Business Services and Akamai (a leading global CDN provider) signed a strategic partnership in November 2012: Orange will commercialize Akamai's solutions and extend Akamai's CDN into its networks.

Achraf