Run DCR in Fastly Edge Compute #1233
Replies: 4 comments 9 replies
-
we'd need a careful look at what Frontend does – iirc it does a lot more than routing, e.g. header management etc. At this point moving to something like Next.js maybe makes more sense than effectively writing the Guardian's own version of Next. I'm personally tempted by that proposition, but I wonder if we can get edge to intervene without actually needing to render every page in full?
-
also, did you mean to raise this in the source repo?
-
Interesting idea! AIUI, if DCR renders at-edge this would happen post-cache, and hence the time taken to render would be added to all requests? Some points/questions:
-
Why cache at all?
Creating a new thread for this idea from @arelra
-
DCR in Fastly Edge Compute
What?
Run DCR - a node server - as a function in Fastly Edge Compute.
https://docs.fastly.com/products/compute-at-edge
Instead of having a farm of EC2 servers that provide a rendering service for requests that get routed to origin, we move that logic closer to the browser.
To achieve this, each request would still need CAPI data. Instead of Frontend making the request for the article's CAPI data, Fastly would make that request, and it is this CAPI request that Fastly would cache.
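The flow described above can be sketched as a plain function with injected dependencies. The names (`fetchFromCapi`, `renderArticle`) are hypothetical stand-ins for the real CAPI client and DCR render call, and the `Map` stands in for Fastly's edge cache:

```javascript
// Sketch only: capiCache stands in for the Fastly edge cache, and
// fetchFromCapi / renderArticle are hypothetical stand-ins for the
// real CAPI client and DCR render call.
const capiCache = new Map();

async function handleArticleRequest(path, fetchFromCapi, renderArticle) {
  // Cache the CAPI data, not the rendered HTML: the data does not
  // vary per reader, so it caches far better than the HTML response.
  let capiData = capiCache.get(path);
  if (capiData === undefined) {
    capiData = await fetchFromCapi(path); // origin request, only on a miss
    capiCache.set(path, capiData);
  }
  // Render on every request, close to the browser.
  return renderArticle(capiData);
}
```

With this shape an origin request happens only on a CAPI cache miss; every subsequent request for the same article renders from cached data at the edge.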
Why?
Performance
The request for CAPI data is much more cacheable than the one for the article HTML. We vary the HTML response based on a number of things right now (AB tests, region, etc.) but the CAPI data does not vary at all (is this true?). This means far fewer origin requests and therefore faster response times.
The flip side is that every request will be 'rendered' by DCR, so the time taken to render must be fast. Fastly's pitch for Compute@Edge, and reports from other adopters, suggest that it is, but this would need careful testing and monitoring.
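The cacheability difference can be illustrated with hypothetical cache-key builders. The vary dimensions (AB tests, region) mirror the ones mentioned above, but the exact keys are assumptions, not Frontend's real implementation:

```javascript
// Hypothetical cache-key builders; illustrative only.
function htmlCacheKey(path, { abTests, region }) {
  // Every combination of AB-test bucket and region gets its own
  // cached copy of the HTML, sharding the cache into many entries.
  return `${path}|ab=${[...abTests].sort().join(',')}|region=${region}`;
}

function capiCacheKey(path) {
  // If the CAPI data truly does not vary, the path alone is the key,
  // so all readers share a single cached entry per article.
  return path;
}
```

Every extra vary dimension multiplies the number of HTML cache entries, while the CAPI key stays one-per-article; that is where the "far fewer origin requests" claim comes from.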
Flexibility
Right now we carefully, reluctantly split the cache to provide a varied response to different types of request. We want to do this more: we want a more customised response to readers based on their individual state, but providing this is expensive, complex, and does not scale.
But if we put DCR at the edge then we can provide a much higher level of customisation with zero added cost. In addition, the code required to do this would be a lot simpler, easier to reason about, and, crucially, much easier to test. Instead of complex conditional logic living in different platforms, the conditionals that decide what a user is served would all live in one place: DCR.
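As a sketch of what "conditionals in one place" could look like inside DCR, assuming hypothetical reader-state fields and variant names:

```javascript
// Hypothetical reader-state conditionals, all in one function in DCR
// rather than spread across platforms. Field and variant names are
// illustrative assumptions, not real product logic.
function selectSupportMessage(reader) {
  if (reader.isSupporter) return 'thank-you';
  if (reader.articlesThisMonth >= 5) return 'support-ask';
  return 'default-banner';
}
```

Because this is plain data-in, data-out logic, it is trivially unit-testable, which is the "much easier to test" point above.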
But DCR isn't a web server
Right now DCR only accepts a POST request from Frontend. It lives behind Frontend as a rendering service.
To make this idea work, DCR would need to become a web server that accepts a GET request. This is simple to do on the DCR side of things but would need some routing work in Frontend.