Multi-Tier CDN Caching

Caching on the Web

Caching is a vital part of web and mobile services. Without it, every request would travel all the way to the origin server, piling up traffic and creating bottlenecks. Caching keeps traffic local: only the requests for data that must come from the origin server actually go there.

Breaking it down

A business hosts an application on an origin server. The business's customers navigate to its website and are routed through caching servers that are local to them, often in the same city, while the origin server itself is not local. These caching servers can respond to most customer requests with cached data on their own, and only forward the requests whose responses cannot be cached (because they contain dynamic or sensitive information).
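
As a rough sketch of that edge behavior, the Go program below checks a local in-memory cache first and only contacts the origin on a miss. The origin URL, the GET-only policy, and the in-memory map are illustrative assumptions, not a description of any particular CDN's implementation.

```go
// Minimal sketch of an edge cache, assuming a placeholder origin URL and a
// simple GET-only, in-memory cache policy. Not a production implementation.
package main

import (
	"io"
	"net/http"
	"sync"
)

const origin = "https://origin.example.com" // hypothetical origin server

var (
	mu    sync.Mutex
	cache = map[string][]byte{} // request path -> cached response body
)

func handler(w http.ResponseWriter, r *http.Request) {
	// Dynamic or secure requests (anything other than GET here) are not
	// cached; a real edge would proxy them straight through to the origin.
	if r.Method != http.MethodGet {
		http.Error(w, "only GET is handled in this sketch", http.StatusMethodNotAllowed)
		return
	}

	mu.Lock()
	body, ok := cache[r.URL.Path]
	mu.Unlock()
	if ok {
		w.Write(body) // cache hit: answered locally, no origin traffic
		return
	}

	// Cache miss: fetch from the origin, store the response, then reply.
	resp, err := http.Get(origin + r.URL.Path)
	if err != nil {
		http.Error(w, "origin unreachable", http.StatusBadGateway)
		return
	}
	defer resp.Body.Close()
	body, _ = io.ReadAll(resp.Body)

	if resp.StatusCode == http.StatusOK {
		mu.Lock()
		cache[r.URL.Path] = body // the next request for this path is a hit
		mu.Unlock()
	}
	w.Write(body)
}

func main() {
	http.HandleFunc("/", handler)
	http.ListenAndServe(":8080", nil)
}
```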

Awesome and efficient, but there's one issue: each caching server has to fetch a response from the origin at least once before it can store that response in its cache. That's fine when there are only a few caching servers, but with hundreds or thousands of them, the origin ends up serving the same content hundreds or thousands of times just to warm every cache.

Multi-tier Caching

Multi-tier caching adds one more hop between the business origin server and the caching servers. This additional hop acts as the "on-ramp" for the origin: it is physically located near it, often in the same city, and it caches data too. The on-ramp can cache the responses from the origin and answer the caching servers directly, further reducing the number of requests that reach the origin.
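
The toy Go program below sketches that two-tier lookup under assumed in-memory tiers: each edge checks its own cache, misses fall through to the on-ramp, and only the on-ramp's misses reach the origin. The types and counters are illustrative, not part of any real CDN API.

```go
// Toy model of multi-tier caching: edges -> on-ramp -> origin.
package main

import "fmt"

// Fetcher is anything that can answer a request for a path.
type Fetcher interface {
	Fetch(path string) string
}

// Origin is the business's origin server: it always computes the response.
type Origin struct{ hits int }

func (o *Origin) Fetch(path string) string {
	o.hits++
	return "response for " + path
}

// CacheTier is either an edge cache or the on-ramp; next is the next hop
// (the on-ramp for an edge, the origin for the on-ramp).
type CacheTier struct {
	name  string
	store map[string]string
	next  Fetcher
}

func (c *CacheTier) Fetch(path string) string {
	if body, ok := c.store[path]; ok {
		return body // hit: answered from this tier's cache
	}
	body := c.next.Fetch(path) // miss: forward to the next hop
	c.store[path] = body
	return body
}

func main() {
	origin := &Origin{}
	onRamp := &CacheTier{name: "on-ramp", store: map[string]string{}, next: origin}

	// Three edge caches in different cities, all fronted by the same on-ramp.
	for i := 0; i < 3; i++ {
		edge := &CacheTier{name: fmt.Sprintf("edge-%d", i), store: map[string]string{}, next: onRamp}
		edge.Fetch("/index.html")
	}

	// Only the first edge miss reaches the origin; the on-ramp absorbs the rest.
	fmt.Println("origin requests:", origin.hits) // prints 1
}
```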

What We Do

Except in the case of serverless origins, Skip2 uses multi-tier caching exactly as described. Every server on our network can act both as an "on-ramp" for our customers' origins and as a caching server providing data to their users. This keeps the cache hit ratio high and requests to the origins low, reducing server load and increasing application capacity.
