Going Serverless
Being on the internet, I’ve started to hear this word “serverless” a lot. And like most people, I pointed out the weird, obvious issue of it still having servers. But more to the point, it’s not that there are no servers; it’s that you don’t have to manage one. Instead of a server that runs your app continuously with a defined start and stop, it’s a single lambda.
Lambda wasn’t really a term I had heard or understood. A lambda is a lot like an anonymous function. Your server becomes a single function that responds to an event with a context, like a request. That function executes only when it’s called and doesn’t need to be run continuously waiting to be pinged.
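For illustration, a lambda in the AWS style is nothing more than an exported async function that takes an event and a context and returns a result (the types here are simplified placeholders of my own, not a real SDK):

// A lambda: receives an event and a context, returns a result.
// Nothing keeps running between invocations.
export const handler = async (
  event: { path: string; body?: string },
  context: { requestId?: string }
) => {
  return { statusCode: 200, body: `You hit ${event.path}` };
};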
So, We Might Have a Problem Here
Zeit’s project “Now”, which I’ve used almost since the beginning, moved to this serverless concept for deployments, offering cheaper, faster deployments all around with Now v2. I use this service heavily for Downwrite (which is forever in alpha / beta).
Side note: one of their other big projects, Next.js, also migrated to a serverless pattern for server-side rendering your React application.
This presented a challenge because my API server uses Hapi.js, which doesn’t run as a single function by default. You instantiate a server and call start():
import * as Hapi from "hapi";

const server = new Hapi.Server();

const init = async () => {
  await server.start();
  console.log(`Server running at: ${server.info.uri}`);
};

init();
You attach routes and controllers to that server, and when you ping one of those routes, Hapi responds. Hapi genuinely makes working on a REST API server very organized and very easy to grow, while being an absolute joy to use. Routes are easy to define as objects, controllers are usually very simple, authentication strategies attach to each route, and you just pass an array of those route objects to your server:
server.route([
  {
    method: "POST",
    path: "/settings/{id}",
    handler: async (request, h) => {
      const id = request.params.id;
      const { user } = request.auth.credentials;
      return await db.find(id);
    },
    config: {
      cors: {
        origin: ["*"],
        credentials: true
      },
      auth: {
        strategy: "jwt"
      }
    }
  }
]);
And all of this works great if your server starts and stops. But serverless functions are reactive: they’re called and they return a value. Serverless functions, especially on Now, work more like this:
import { IncomingMessage, ServerResponse } from "http";

export default async (request: IncomingMessage, response: ServerResponse) => {
  response.end();
};
They respond to an event versus running and waiting to be called.
With the way things stood before, running that code on these deployments would cause every request to time out while the server tried to run continuously, driving up my bill and not serving my users.
Oh Wait, It Doesn’t Need to Be
Zeit also has a framework for Node servers called micro which, for all intents and purposes, follows this serverless model and provides helper functions for processing the body of an incoming request.
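For example (a sketch of my own, not code from Downwrite), a micro handler is just an async function that uses those helpers:

import { json, send } from "micro";
import { IncomingMessage, ServerResponse } from "http";

export default async (req: IncomingMessage, res: ServerResponse) => {
  // json() reads and parses the incoming request body
  const body = await json(req);

  // send() writes a status code and a serialized response
  send(res, 200, { received: body });
};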
Hapi has an inject() method: you pass your server instance a request payload, and it calls the specified route and that route’s controller and returns the expected result.
Injects a request into the server simulating an incoming HTTP request without making an actual socket connection. Injection is useful for testing purposes as well as for invoking routing logic internally without the overhead and limitations of the network stack. — Hapi API documentation
Even though it’s “useful for testing purposes,” it really fits my needs for working with my serverless deployment target.
import { send } from "micro";
import * as Hapi from "hapi";
import { IncomingMessage, ServerResponse } from "http";
import routes from "./routes";

async function createServer(port?: number): Promise<Hapi.Server> {
  const server = new Hapi.Server({
    port,
    routes: { cors: true }
  });

  server.route(routes);

  return server;
}

export default async (req: IncomingMessage, res: ServerResponse) => {
  const server: Hapi.Server = await createServer();

  const injection: Hapi.ServerInjectOptions = {
    method: req.method,
    url: req.url,
    headers: req.headers
  };

  const response = await server.inject(injection);

  send(res, response.statusCode, response.result);
};
Now, adding micro allowed me to make LITERALLY NO CHANGES to my routes or controllers; I could use all my original server code while responding more nimbly to each request.
My tests were easier to write now that my server was an instance returned by a function:
describe("Endpoints", () => {
beforeAll(async () => {
db = await prepareDB();
server = await createServer(9999);
server.start();
});
it("can create a user", async () => {
const R: Axios.AxiosResponse<ICreateResponse> = await Axios.default.post(
"http://localhost:9999/api/users",
{
...createdUser,
}
);
token = (R.data as ICreateResponse).id_token;
user = (R.data as ICreateResponse).userID;
expect(R.status).toBeLessThanOrEqual(300);
});
afterAll(async () => {
await server.stop();
db.disconnect();
});
});
My only real gotcha after this was Mongoose (a MongoDB ODM for Node): its connection needs to be opened and closed on each request.
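Roughly, that means wrapping each invocation in a connect/disconnect pair; the withDB helper and MONGODB_URI variable below are my own names for the sketch, not part of the original code:

import * as mongoose from "mongoose";

// Connect for the duration of one invocation, then disconnect so the
// lambda isn't left holding an open socket between calls.
async function withDB<T>(work: () => Promise<T>): Promise<T> {
  await mongoose.connect(process.env.MONGODB_URI as string);

  try {
    return await work();
  } finally {
    await mongoose.disconnect();
  }
}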
Conclusion
Serverless was a new concept for me, but it wasn’t as hard as I was making it out to be. Finding the right tools enabled me to leverage all the code I already had and ultimately made for a better experience to work with. You can see the source code for this and the rest of Downwrite on GitHub.