How does this website work?
July 21, 2021
I have a few websites which I've moved around and hosted in different ways over the years. Initially they were hosted on a Linux server with a traditional CMS like WordPress. After a time, I no longer wanted to pay for the server, so I went looking for an alternative.
I moved one of the sites to GitHub Pages, which is nice (and free — you just pay for the domain and DNS routing), but it requires anyone who wants to edit the site to know how to use git... works for me, but not my wife.
I switched a couple other sites to static hosting in AWS. The architecture was pretty standard: an S3 bucket to host the assets, connected to a CloudFront distribution which took care of routing, SSL, and caching, and a Route53 DNS entry to point my domains to the right CloudFront distribution. Simple system diagram below:
For managing content, I would either write my own raw HTML files and upload them to S3, or else use a simple CMS like Publii. The issue with this method is it's hard to collaborate... theoretically you can sync the assets for Publii across computers using Dropbox or something, but it's cumbersome.
Enter Notion.
I've been using Notion for about a year now to organize my life. It's fantastic. You should use it too.
I got the idea that it could be easier to build public, static websites from pages in Notion. Apparently this is not a novel idea. There are several paid services that do this (like super.so and hostnotion.co). There are also a few projects on GitHub that go in this direction; they work in one of three ways:
Download content from Notion using the same (not-publicly-documented) API that Notion uses internally to render their app and render with React (which Notion is built with). An example is react-notion (a Node.js app), which was one of the first attempts at generating Notion-based static sites. The major downside of this approach is that it relies on Notion's internal API which could change (or be locked down) at any moment.
Download content from Notion using their recently-released API beta and render it using React. This Next.js template, notion-blog-nextjs, and react-notion-x are examples of this approach. They have the advantage of using Notion's official (supported) API, but unfortunately this API doesn't yet support all block types — it doesn't support blocks I use frequently, like equation, table, or embed. Some of these systems fall back to approach 1, or else just don't display unsupported block types.
Scrape public Notion pages using a headless browser like Chromium and save the HTML and assets locally. An example is loconotion. The advantage here is that all block types are supported, and if Notion changes their official API, or revokes access from free accounts (it could happen...), this should still work. loconotion has some particularly nice features like recursive crawling of subpages, ability to customize site description and page slugs, and ability to add custom CSS and Javascript.
Approach 3 looked most promising to me, but I wanted to be able to do a quick, one-click site update, and ideally share that with a few trusted, non-technical people. Google Colab is a free service (requires a Google account) where you can run Python code. It's typically used by data scientists for small projects. But notebooks can be shared like Google Docs! So it looked like a promising option.
Here's the system diagram I had in mind:
After some tinkering, I cooked up a Google Colab notebook with a Python script that gets the job done. Below are the steps to replicate my setup:
Create a page in Notion which will become your website root. (In my Notion, I've set up a page called "Websites," under which I have a separate page for each public Notion-driven website I've set up.)
Make the page you created in the previous step public (Share → Share to web) and note its public URL.
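As an aside: public Notion URLs end in a 32-character hex page ID. If you ever need the bare ID (for example, to use with Notion's official API), you can pull it out of the URL with a small helper — this is just a convenience sketch, not part of the setup:

```python
import re

def notion_page_id(public_url: str) -> str:
    """Extract the 32-character hex page ID from a public Notion URL."""
    match = re.search(r'([0-9a-f]{32})/?$', public_url)
    if not match:
        raise ValueError(f"No Notion page ID found in {public_url!r}")
    return match.group(1)

# using the example URL from the site.toml later in this post:
url = "https://example.notion.site/Welcome-0782ef638dd547bba989fe3a244c9c8c"
print(notion_page_id(url))  # → 0782ef638dd547bba989fe3a244c9c8c
```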
Log into your AWS account.
Request an SSL certificate from AWS Certificate Manager. This can take some time, which is why we do it first.
Create an S3 bucket in AWS with the same name as the domain where you want to host the site (like example.com).
(Optional) Create a favicon.ico file and upload it to your S3 bucket's root.
(Optional) Create a site.css file with any custom CSS you want on your site (I have the following to hide the Notion top-bar) and upload it to your S3 bucket's root:
```css
.notion-topbar { display: none !important; }
.notion-frame > .notion-scroller > div:first-child { display: none !important; }
```
(Optional) Create a site.js file with any custom Javascript you want on your site — if you want Google Analytics on your site, add it like below (customize with your site ID):
```javascript
window.dataLayer = window.dataLayer || [];
function gtag(){ dataLayer.push(arguments); }
gtag('js', new Date());
gtag('config', 'UA-XXXXXXXX-1');
```
Create a site.toml configuration file like the one below (customize it with your own settings, remove scripts and CSS if you didn't create them in steps 6-8 above) and upload it to your S3 bucket:
```toml
# See loconotion docs (https://github.com/leoncvlt/loconotion) and
# example .toml file (https://github.com/leoncvlt/loconotion/blob/master/example/example_site.toml)
# for further options
name = "example.com"
page = "https://example.notion.site/Welcome-0782ef638dd547bba989fe3a244c9c8c"

[site]

[[site.meta]]
name = "title"
content = "Welcome to my Example Site"

[[site.meta]]
name = "description"
content = "An example site built with Notion, Loconotion, Google Colab, and AWS."

# (Optional, see step 6)
[[site.inject.head.link]]
rel = "icon"
type = "image/x-icon"
href = "/favicon.ico"

# (Optional, see step 7)
[[site.inject.head.link]]
rel = "stylesheet"
type = "text/css"
href = "/site.css"

# (Optional, include if you plan to use Google Analytics)
[[site.inject.head.script]]
type = "text/javascript"
async = ""
src = "/ga.js"

# (Optional, see step 8)
[[site.inject.head.script]]
type = "text/javascript"
src = "/site.js"
```
Create a CloudFront Distribution in AWS which is backed by the S3 bucket from step 5. Note the CloudFront distribution's ID and ARN.
Make your CloudFront distribution redirect http traffic to https and configure it to use your Amazon-issued SSL certificate from step 4.
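Since the IAM policy below grants cloudfront:ListDistributions, you can also look up the distribution's ID and ARN programmatically rather than copying them from the console. A sketch, assuming boto3 and the standard list_distributions response shape (the values in the demo response are made up):

```python
def find_distribution(list_response: dict, domain: str):
    """Return (Id, ARN) of the distribution serving `domain`,
    given a CloudFront list_distributions-style response dict."""
    for dist in list_response.get('DistributionList', {}).get('Items', []):
        if domain in dist.get('Aliases', {}).get('Items', []):
            return dist['Id'], dist['ARN']
    return None

# demo with a hand-built response (real values would come from
# boto3.Session(...).client('cloudfront').list_distributions()):
fake_response = {'DistributionList': {'Items': [{
    'Id': 'E123EXAMPLE',
    'ARN': 'arn:aws:cloudfront::111122223333:distribution/E123EXAMPLE',
    'Aliases': {'Items': ['example.com']},
}]}}
print(find_distribution(fake_response, 'example.com'))
```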
Create an AWS IAM user with programmatic access. Note the Access Key ID and Secret Access Key.
Add a custom access policy to your IAM user as shown below. This allows the user to upload files to your S3 bucket and then invalidate the CloudFront cache. Update example.com with your S3 bucket name and AWS_CLOUDFRONT_DISTRIBUTION_ARN with your CloudFront distribution's ARN from step 10.
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::example.com"]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": ["arn:aws:s3:::example.com/*"]
    },
    {
      "Effect": "Allow",
      "Action": ["cloudfront:ListDistributions"],
      "Resource": ["*"]
    },
    {
      "Effect": "Allow",
      "Action": ["cloudfront:CreateInvalidation"],
      "Resource": ["AWS_CLOUDFRONT_DISTRIBUTION_ARN"]
    }
  ]
}
```
Point your domain at your CloudFront distribution.
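Concretely, this means creating an alias A record in your Route53 hosted zone that targets the distribution's *.cloudfront.net domain. You can do it in the console, or with boto3 — here's a sketch of the change batch, assuming the standard Route53 API shape (Z2FDTNDATAQYW2 is the fixed hosted-zone ID Amazon uses for all CloudFront alias targets; the domain names in the usage comment are placeholders):

```python
CLOUDFRONT_HOSTED_ZONE_ID = 'Z2FDTNDATAQYW2'  # fixed value for all CloudFront targets

def alias_change_batch(domain: str, cloudfront_domain: str) -> dict:
    """Build a Route53 change batch aliasing `domain` to a CloudFront distribution."""
    return {
        'Changes': [{
            'Action': 'UPSERT',
            'ResourceRecordSet': {
                'Name': domain,
                'Type': 'A',
                'AliasTarget': {
                    'HostedZoneId': CLOUDFRONT_HOSTED_ZONE_ID,
                    'DNSName': cloudfront_domain,
                    'EvaluateTargetHealth': False,
                },
            },
        }]
    }

# with real credentials you'd apply it like:
#   r53 = boto3.Session(...).client('route53')
#   r53.change_resource_record_sets(
#       HostedZoneId='YOUR_HOSTED_ZONE_ID',
#       ChangeBatch=alias_change_batch('example.com', 'dxxxxxxxxxxxxx.cloudfront.net'))
```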
Create a Google Colab notebook with the following code. Replace the settings at the top with your own.
```python
# configuration:
local_dir = 'loconotion/dist/example.com/'
aws_key = 'AKXXXXXXXXXXXXXXXXXX'
aws_secret = 'XXXXXXXX+XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX'
aws_bucket = 'example.com'
aws_cloudfront_distribution_id = 'XXXXXXXXXXXXX'

# download a config file:
!rm -f site.toml
!wget https://example.com/site.toml

# install dependencies:
!apt-get update
!apt install chromium-chromedriver
!cp /usr/lib/chromium-browser/chromedriver /usr/bin
!pip install selenium

# install loconotion:
!rm -rf loconotion/
!git clone https://github.com/leoncvlt/loconotion.git
!pip install -r loconotion/requirements.txt

# download assets for loconotion to inject:
!cd loconotion && wget https://example.com/favicon.ico
!cd loconotion && wget https://example.com/site.js
!cd loconotion && wget https://example.com/site.css
!cd loconotion && wget https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXXX-1 -O ga.js

# run loconotion to generate static site copy:
!cd loconotion && python loconotion ../site.toml --chromedriver /usr/bin/chromedriver

# install AWS Python library:
!pip install boto3

# import libraries
import os
import sys
import boto3
import mimetypes
from time import time

# recursively upload static files to S3 bucket
session = boto3.Session(
    aws_access_key_id=aws_key,
    aws_secret_access_key=aws_secret,
)
s3 = session.client('s3')
for root, dirs, files in os.walk(local_dir):
    for filename in files:
        local_path = os.path.join(root, filename)
        relative_path = os.path.relpath(local_path, local_dir)
        s3_path = os.path.join('', relative_path)

        # ensure files are uploaded to S3 with correct mimetype
        mimetype, _ = mimetypes.guess_type(local_path)
        if mimetype is None:
            mimetype = 'text/x-python'
        s3.upload_file(
            Filename=local_path,
            Bucket=aws_bucket,
            Key=s3_path,
            ExtraArgs={"ContentType": mimetype},
        )

# invalidate CloudFront cache to finish updating the site:
# (borrowed from https://gist.github.com/jolexa/e58ea2ec19cf3067d0ddfbdc98bbaf6d)
cf = session.client('cloudfront')
response = cf.create_invalidation(
    DistributionId=aws_cloudfront_distribution_id,
    InvalidationBatch={
        'Paths': {
            'Quantity': 1,
            'Items': ['/*'],
        },
        'CallerReference': str(time()).replace(".", ""),
    },
)
```
Basically, this Python script is:
Installing all needed software and packages:
Chromium, the headless browser used to download Notion pages
loconotion, the package that turns a Notion page into static assets in a local folder
boto3, the AWS client Python library
mimetypes, the Python library that ensures assets are uploaded to S3 with correct types
Downloading the Notion page(s) and creating the static version
Uploading static assets to the S3 bucket
Invalidating the CloudFront distribution's cache, so the updated site appears immediately
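The mimetype step matters because S3 serves each object with whatever Content-Type it was uploaded with — get it wrong and browsers will download your HTML instead of rendering it. The notebook's key/type logic can be checked locally with just the standard library (the paths here are illustrative):

```python
import mimetypes
import os

def s3_key_and_type(local_path: str, local_dir: str):
    """Mirror the notebook's logic: compute the S3 key and Content-Type for one file."""
    key = os.path.relpath(local_path, local_dir)
    mimetype, _ = mimetypes.guess_type(local_path)
    return key, mimetype or 'text/x-python'  # the notebook's fallback for unknown types

print(s3_key_and_type('dist/example.com/index.html', 'dist/example.com'))
# → ('index.html', 'text/html')
```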
One caution here: every time you run the Google Colab notebook, you're re-downloading the entire website, re-uploading it to S3, and purging the entire CloudFront cache... as your site gets bigger, or if you update it frequently, all of this will take longer, incur more AWS costs, and may run into rate limits from Notion. Running it for a pretty simple one-page site takes just over 1 minute. Running a bigger site with a couple dozen pages takes maybe 5 minutes.
But yes, you can use Notion as a CMS for your website.
The costs involved are:
domain name registration, maybe $15/year
$0.50/month for a Hosted Zone in AWS Route53
$0.023/GB for storing assets in AWS S3 — realistically, for a small site, you're unlikely to exceed the free usage tier
$0.085/GB data transferred out to the Internet — realistically, for a small site, you're unlikely to exceed the free usage tier
$0.005 per CloudFront cache invalidation path (/* counts as one path covering all assets) after the first 1,000 free paths — again, for a small site, you're unlikely to exceed 1,000
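Assuming a small site that stays inside the free tiers, the fixed pieces above pencil out like this (the domain price is a typical figure and varies by registrar and TLD):

```python
domain_per_year = 15.00        # assumed typical .com registration
hosted_zone_per_month = 0.50   # Route53 hosted zone

annual_total = domain_per_year + hosted_zone_per_month * 12
print(f"${annual_total:.2f}/year")  # → $21.00/year
```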
Google Colab, Loconotion, and AWS ACM certificates are all free. So for a small site, you can expect to pay around $20-$25 per year all in, which is pretty cheap!