For a while, whenever I updated the ArkScript website, I had to remember to update the copy on my VPS so that the changes would be visible to the world. Since I’m quite lazy, I’d like to automate this!
# The solutions
The first solution would be to use crontabs. Easy to set up: find a schedule, write a bash script that `cd`s and `git pull`s, and you’re done. However, it has some caveats:
- the changes are not reflected immediately, as you need to wait for your schedule to hit
- we could use a schedule that hits every 5 minutes, but then we would run many useless `git pull`s (which could be bad for our account/server, as GitHub could see this as bad automated behavior and ban us)
- permissions? I tried to set this up on my server and probably messed something up somewhere, because my script started (it shows up in syslog) but did nothing (the project was lagging behind and I had to `git pull` myself)
The second, funnier solution is to use webhooks. Basically, every event (a push, star, fork, action running…) can be filtered and sent as JSON or XML to a URL of your choice. You might have already used them to set up channels on Slack or Discord with push/merge/star/whatever events, but did it occur to you that we could use them to detect changes to a project and run automated actions on a deployed version of said project?
# Our needs
What we want:
- upon pushing to the `master` (or `main`) branch, GitHub will send an event to our webhook
- check that the event is really coming from GitHub (and not from a script kiddie abusing our webhook)
- update our project (which can come in the form of a `git pull`, or something more elaborate like pulling and building our webapp)
# Registering our webhook on GitHub
The first step is trivial: by going to https://github.com/USERNAME/REPOSITORY/settings/hooks, we can create a new webhook. We will need:
- a payload URL: `https://server.com/PROJECT`
- a content type: `application/json` (so that if we need to read values from the payload, it will be easier than parsing `x-www-form-urlencoded`)
- a secret: that’s like a password GitHub will use to hash the payload (HMAC SHA256), which can be used later to ensure that the payload is coming from GitHub and not from someone else
- the type of event(s) we want: the `push` event is more than enough for me, so that’s what I’ll focus on
Our 3rd point here solved our 2nd problem, neat!
For the payload URL, I put something like `webhooks.myserver.com/project-name`. Since I have a domain, I can create subdomains for free (and also request SSL certificates with Let’s Encrypt), and adding the project name in the URL lets me have multiple webhooks on the same server, to update different projects without spinning up a new server.
# Checking the event origin
As said before, we need to check that the event is coming from GitHub, and not from someone trying to make us pull our project(s) indefinitely to waste precious bandwidth.
That’s where the secret comes in! GitHub uses an HMAC SHA256 digest to compute the hash of the payload it sends us. Said hash is computed from our secret and the payload, which means we can compute it on our side and compare it with the hash GitHub sends us; since our secret is… well, secret, only GitHub and we know its value.
I’ll then be using NodeJS and express, as it feels like the easiest way to spin up a server.
Given a secret, a payload and a computed hash, we can check the signature in JavaScript.
To decompose the second algorithm:
- we stringify the request body (JSON) to be able to compute its hash
- we get the signature from the headers
- we create an HMAC SHA256 from our secret
- we create a digest of our data, using the HMAC
- we compare the lengths of the given signature and our digest, and if they match, we compare the hashes using `crypto.timingSafeEqual` to perform a constant-time comparison, which helps mitigate timing attacks (and thus avoid potentially leaking our secret)
# Registering our routes
We can then implement this check as an express middleware, and use it in our routes.
To make registering routes easier, and to avoid having to go back to the code every time I need to add a new webhook, we can make use of JSON files as configuration files.
Once we put all that together, we’ll need an error handler for the bad-signature / unsigned case, to provide an answer to the caller (since our middleware calls `next()` with an error when the provided signature and the computed digest do not match).
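Express recognizes error handlers by their four-argument signature; a sketch of one answering that case (the status code and response shape are my choices):

```javascript
// Express treats any middleware with the (err, req, res, next) signature
// as an error handler: when a route or middleware calls next(err), the
// remaining normal handlers are skipped and the request lands here.
const onError = (err, req, res, next) => {
  res.status(401).json({ error: err.message });
};

// registered last, after all the routes:
// app.use(onError);
```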
# Conclusion
We’re now ready to go: we can easily create new webhooks and register them on GitHub. We just have to make more projects and deploy them on the same VPS, under a `docker/` folder for example (so that we can mount `./docker:/docker` in our webhooks server container, to run the `git pull` commands in the projects).
This setup is pretty basic though: it only pulls changes when an event is received, and it assumes the project is served by something like Apache HTTPD with no additional steps needed (eg building with `npm run build`). Hopefully it can get you started!
# Links
- GitHub repository: SuperFola/webhooks
- GitHub — Validating webhook deliveries