Tutorial: How to Set Up an API Load Balancer Using Cloudflare Workers
- Time:2020-09-07 12:03:44
- Class:Weblog
Cloudflare Workers is a serverless platform (similar to AWS Lambda and Google Cloud Functions) that we can use to deploy functions on the edge network. Serverless is the future, with two big advantages: there are no servers to maintain (no DevOps costs), and it is highly scalable, because the serverless functions are automatically scaled horizontally across nodes. Latency is also low, because requests are served by a node geographically close to the visitor.
A load balancer is a server that distributes incoming load across multiple backend servers, which enables horizontal scaling. However, a load balancer can itself become a single point of failure if there is only one physical load-balancing server. With Cloudflare Workers, the worker code runs distributed across the edge network, so we can deploy a load-balancing script on Cloudflare's network and get high durability and availability.
Deploying the load balancer with serverless technology on distributed network nodes is therefore quite advantageous.
Distributed Load Balancer by CloudFlare Worker
The cost of setting up a distributed load balancer is affordable. Cloudflare Workers has a free plan that gives you 100K API calls per day with a maximum of 10ms CPU time per request. The paid plan's monthly quota is 10 million requests with a maximum of 50ms CPU time per request.
For example, let’s first define a list of servers (origins) behind the distributed load balancer that actually do the work.
let nodes = [
  "https://api.justyy.com",
  "https://api.steemyy.com"
];
We also need to implement Promise.any ourselves, as it is not supported in the Cloudflare Workers runtime. Promise.any resolves as soon as the first promise in the iterable is fulfilled.
function reverse(promise) {
  return new Promise((resolve, reject) =>
    Promise.resolve(promise).then(reject, resolve));
}

function promiseAny(iterable) {
  return reverse(Promise.all([...iterable].map(reverse)));
}
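A quick self-contained check of this trick (plain Promises, no Worker APIs): one input rejects early and another fulfills later, yet promiseAny still yields the fulfilled value.

```javascript
function reverse(promise) {
  return new Promise((resolve, reject) =>
    Promise.resolve(promise).then(reject, resolve));
}

function promiseAny(iterable) {
  return reverse(Promise.all([...iterable].map(reverse)));
}

// A server that fails fast and one that answers slowly.
const fast = new Promise((_, reject) =>
  setTimeout(() => reject(new Error("down")), 10));
const slow = new Promise(resolve =>
  setTimeout(() => resolve("https://api.justyy.com"), 50));

// promiseAny ignores the early rejection and yields the fulfilled value.
promiseAny([fast, slow]).then(winner => {
  console.log(winner); // the slow server's value, despite "fast" rejecting first
});
```

(The definitions are repeated here so the snippet runs standalone.)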
On the other hand, Promise.race settles with the first promise that is either fulfilled or rejected. Here we need Promise.any, because we want the first (fastest) server that responds successfully. The following function sends a request to a server and resolves with the server's name:
async function contactServer(server) {
  return new Promise((resolve, reject) => {
    fetch(server, {
      method: "GET"
    }).then(response => {
      resolve({
        "server": server,
      });
    }).catch(function(error) {
      reject(error);
    });
  });
}
We could further improve the serverless load balancer by asking each origin for more information, such as its load average (this requires a load-average API at the origins), and then choosing the least-loaded one.
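As a sketch of that idea: suppose each origin exposed a hypothetical GET /load endpoint returning JSON like {"load": 0.42} (this endpoint and response shape are assumptions, not part of the original setup). The selection step could then look like:

```javascript
// Hypothetical: query an origin's load-average endpoint (assumed to be
// GET <server>/load returning {"load": <number>}).
async function getLoad(server) {
  const response = await fetch(server + "/load");
  const stats = await response.json();
  return { server: server, load: stats.load };
}

// Pure selection step: pick the origin reporting the smallest load.
function leastLoaded(stats) {
  return stats.reduce((best, s) => (s.load < best.load ? s : best));
}

// Inside the worker, the selection might then be:
// const stats = await Promise.all(nodes.map(getLoad));
// const forwardedURL = leastLoaded(stats).server;
```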
Handling the CORS and Headers
The entry point of a Cloudflare Worker should handle CORS and OPTIONS requests.
// CORS response headers used by handleOptions (a minimal example set;
// adjust the allowed methods and headers to your needs).
const corsHeaders = {
  'Access-Control-Allow-Origin': '*',
  'Access-Control-Allow-Methods': 'GET, HEAD, POST, OPTIONS',
  'Access-Control-Allow-Headers': 'Content-Type',
};

function handleOptions(request) {
  // Make sure the necessary headers are present
  // for this to be a valid pre-flight request
  if (
    request.headers.get('Origin') !== null &&
    request.headers.get('Access-Control-Request-Method') !== null &&
    request.headers.get('Access-Control-Request-Headers') !== null
  ) {
    // Handle CORS pre-flight request.
    // If you want to check the requested method + headers
    // you can do that here.
    return new Response(null, {
      headers: corsHeaders,
    })
  } else {
    // Handle standard OPTIONS request.
    // If you want to allow other HTTP Methods, you can do that here.
    return new Response(null, {
      headers: {
        Allow: 'GET, HEAD, POST, OPTIONS',
      },
    })
  }
}

addEventListener('fetch', event => {
  const request = event.request;
  const method = request.method.toUpperCase();
  if (method === 'OPTIONS') {
    // Handle CORS preflight requests
    event.respondWith(handleOptions(request))
  } else if (
    method === 'GET' || method === 'HEAD' || method === 'POST'
  ) {
    // Handle requests to the API server
    event.respondWith(handleRequest(request))
  } else {
    event.respondWith(
      new Response(null, {
        status: 405,
        statusText: 'Method Not Allowed',
      }),
    )
  }
});
Forwarding Requests
Once we know which (fastest) server should serve the current request, we need to forward the request to that origin server and, once we have the result, forward it back to the user. The following two functions forward GET and POST requests respectively; you might want to add other methods such as PUT, PATCH, DELETE, etc.
async function forwardRequestGET(apiURL) {
  return new Promise((resolve, reject) => {
    fetch(apiURL, {
      method: "GET",
      headers: {
        'Content-Type': 'application/json'
      },
      redirect: "follow"
    }).then(response => {
      resolve(response.text());
    }).catch(function(error) {
      reject(error);
    });
  });
}

async function forwardRequestPOST(apiURL, body) {
  return new Promise((resolve, reject) => {
    fetch(apiURL, {
      method: "POST",
      redirect: "follow",
      headers: {
        'Content-Type': 'application/json'
      },
      body: body
    }).then(response => {
      resolve(response.text());
    }).catch(function(error) {
      reject(error);
    });
  });
}
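Since the two functions differ only in the HTTP method and body, they could also be folded into a single helper. The generalization below, including how PUT, PATCH and DELETE would be handled, is a sketch of that idea, not part of the original worker:

```javascript
// Generic forwarder covering GET/POST, and extendable to PUT/PATCH/DELETE.
// The body is attached only for methods that may carry one.
async function forwardRequest(method, apiURL, body) {
  const init = {
    method: method,
    redirect: "follow",
    headers: { 'Content-Type': 'application/json' }
  };
  if (body !== undefined && method !== "GET" && method !== "HEAD") {
    init.body = body;
  }
  const response = await fetch(apiURL, init);
  return response.text();
}
```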
Load Balancer Implementation using CloudFlare Worker
Finally, below is the main implementation of the load balancer as a Cloudflare Worker script, which runs on the distributed edge network.
/**
 * Respond to the request
 * @param {Request} request
 */
async function handleRequest(request) {
  const country = request.headers.get('cf-ipcountry');
  const servers = [];
  for (const server of nodes) {
    servers.push(contactServer(server));
  }
  const load = await promiseAny(servers);
  const forwardedURL = load['server'];
  const method = request.method.toUpperCase();
  let result;
  let res;
  let version = "";
  try {
    version = await getVersion(load['server']);
  } catch (e) {
    version = JSON.stringify(e);
  }
  try {
    if (method === "POST") {
      const body = await request.text();
      result = await forwardRequestPOST(forwardedURL, body);
    } else if (method === "GET") {
      result = await forwardRequestGET(forwardedURL);
    } else {
      res = new Response(null, {
        status: 405,
        statusText: 'Method Not Allowed',
      });
      res.headers.set('Access-Control-Allow-Origin', '*');
      res.headers.set('Cache-Control', 'max-age=3600');
      res.headers.set("Origin", load['server']);
      res.headers.set("Country", country);
      return res;
    }
    res = new Response(result, {status: 200});
    res.headers.set('Content-Type', 'application/json');
    res.headers.set('Access-Control-Allow-Origin', '*');
    res.headers.set('Cache-Control', 'max-age=3');
    res.headers.set("Origin", load['server']);
    res.headers.set("Version", version);
    res.headers.set("Country", country);
  } catch (e) {
    res = new Response(JSON.stringify(result), {status: 500});
    res.headers.set('Content-Type', 'application/json');
    res.headers.set('Access-Control-Allow-Origin', '*');
    res.headers.set('Cache-Control', 'max-age=3');
    res.headers.set("Origin", load['server']);
    res.headers.set("Version", version);
    res.headers.set("Country", country);
    res.headers.set("Error", JSON.stringify(e));
  }
  return res;
}
Please note that the load balancer can add custom headers before returning the response; here we add a Version header that is obtained via a separate API call to the origin server:
async function getVersion(server) {
  return new Promise((resolve, reject) => {
    fetch(server, {
      method: "POST",
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({"id": 0, "jsonrpc": "2.0", "method": "call", "params": ["login_api", "get_version", []]})
    }).then(response => {
      resolve(response.text());
    }).catch(function(error) {
      reject(error);
    });
  });
}
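The raw JSON-RPC response could also be trimmed to a compact string before being attached as a header. In the sketch below, the result.blockchain_version field is an assumption about the origin's get_version response shape; adjust it to match your API:

```javascript
// Extract a short version string from a raw JSON-RPC get_version response.
// The "result.blockchain_version" path is an assumed response shape.
function extractVersion(rawText) {
  try {
    const parsed = JSON.parse(rawText);
    return (parsed.result && parsed.result.blockchain_version) || "unknown";
  } catch (e) {
    // Fall back gracefully if the origin returned something unexpected.
    return "unknown";
  }
}
```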
By deploying such a distributed load-balancer node, we improve availability and durability at a low cost: thanks to serverless technology, we do not need to maintain (monitor, upgrade, and patch) a server ourselves. Meanwhile, the Cloudflare Worker forwards requests to the 'fastest' origin server and runs geographically close to the users, keeping latency low.
–EOF (The Ultimate Computing & Technology Blog) —