Growing up, I’d stay up until 3 or 4am on IRC, chatting with people from Europe, South America, Australia/NZ. We built websites, set up servers, shared knowledge, hosted radio shows together. No rails with technology. Fun times. That time is entirely why I’m able to support myself now. Sadly, I lost touch with most of them. I wonder how they’re all doing now. This post brought to you by some MP3s of “The Streets” that I got from a Swede in those days.

I bought a new (shorter) domain for my new email address. One advantage of migrating away from gmail that I hadn’t anticipated is how much calmer I feel.

You see, Gmail technically supports IMAP, but it’s more of a shim. You’re not really supposed to use IMAP with Gmail. And as such I never felt comfortable using a regular email client, instead opting to check mail via the web-app.

Checking mail via a browser is fine but being in a browser switches your mind to a different context. Browsers are meant for consuming. The entire internet is just a simple cmd-T away. So “checking email” became a mental excuse to open my web browser. And then Twitter. And then Hacker News. And then Reddit. Oh, I wonder if I got any new email? And repeat.

Now with a provider where IMAP is a first-class citizen, I can use Mail again. Mail is set to pull in new messages once an hour. No more temptation from a web browser. And an unexpected sense of calm.

I’m back in control.

Each time I watch “Rams”, a documentary about Dieter Rams, something different gets stuck in my head.

This time it was a brief interaction at the very start of the film. A designer asks how he can propel his work beyond mediocrity. Rams’ response is simple: find people with whom you can collaborate, and use that collaboration to move beyond mediocrity.

The shift in ability required to elevate your craft is something that often can’t happen alone. Collaborating is the most effective way to improve your work. Looking back, I can pinpoint exactly when and where my sense of design went from that of a typical engineer to something closer to a designer’s.

I was moonlighting on a project with a designer, who was also the project lead. I’d submit a revision and she noticed immediately when my implementation wasn’t perfect. A section was a bit too tall, or a line off by a pixel. Through this back and forth, I began to catch things I hadn’t before. My eyes began to see what she saw.

Before I was blind and, through collaboration, I could see.

I’ve been using my gmail account since a few months after the beta started. I’ve moved a dozen times since then, but my email stayed the same.

However, over the years Google has lost my confidence that they’ll do the right thing and do no evil. It’s for this reason I don’t use their apps, don’t invest in tweaking gmail, or even (especially) sync my contacts.

As a Mac user for almost 20 years, I’d like to use iCloud for my email, but I can’t use custom domains with Apple. While I don’t foresee Apple losing my trust and confidence, I can’t be sure.

Tying my email to a third party domain will lock me in to their ecosystem, for better or worse. Moreover, I could lose it all in an instant by the whim of an algorithm with little to no recourse.

With Gmail, I’m not the customer, the advertisers are. And because our interests are not aligned, I have no idea how my data will actually be used.

What to do?

The obvious answer is to move my email to a domain I own. Then find a provider that supports open protocols and that I pay at a regular interval.

I’m leaning towards Fastmail. They’ve got a nice detailed migration guide, I’ve been a customer on the business side for a number of years, it’s time to renew, and most importantly their systems behave in ways that I expect.

The main blocker isn’t even money, it’s updating every account that uses my Gmail address as a login to point at my new address. Lock-in, albeit de facto and of my own doing, is a bitch.

The mantra in bootstrapping circles for the past while has been “charge more”. And the best way to charge more, over time, is a SaaS. So it’s natural that most bootstrappers default to a SaaS pricing model when starting their new projects and companies.

I’m no different. I build web-apps professionally and have for the past 10 years. Web apps are my bread and butter.

But when I compare my successful SaaS projects to my successful desktop app projects, no matter the metric, I’ve always made more when I charge less and charge it once.

And since I’ve been so focused on SaaS and this charge more mentality, I’ve automatically dismissed ideas that I had that weren’t SaaS.

I’ve attempted to build a number of web apps independently, but mostly stopped midway through. The slog of getting the basics perfect, managing servers, dealing with recurring payments — it’s too much like my day job.

And so I find myself considering going back to my old bread and butter for side-projects: native apps for the Macintosh.

So far I’ve got a few ideas for small utility apps. The ones I’m most interested in are apps that fit the open web and apps that can help increase privacy for their users.

It’s been a breath of fresh air and I’m excited to be having fun making things again.

Django has a nice security feature that verifies the request’s Host header against the ALLOWED_HOSTS whitelist and returns an error if the requesting host is not in the list. Often you’ll see this when first setting up an app, where you only expect requests to your domain but some bot makes a request to <server ip address>.
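For context, the Django side of this check is just a settings list; a minimal sketch, where example.com is a placeholder domain:

```python
# -- minimal sketch; example.com is a placeholder domain
ALLOWED_HOSTS = ["example.com", "www.example.com"]

# Any request whose Host header isn't in this list is rejected by Django
# with a 400 Bad Request before it reaches a view. Note the server's bare
# IP address is deliberately not listed.
```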

While it’s not strictly harmful to add your server IP to your ALLOWED_HOSTS, doing so allows bots to easily fire requests at your Django app, which needlessly consumes resources on your app server. It’s better to filter out these requests before they ever reach your app server.

For HTTP requests, you can block them by adding a default_server block that acts as a catch-all. Your app server proxy then sets its server_name to a domain in your ALLOWED_HOSTS. This simple configuration prevents http://<server ip address> requests from ever reaching your app server.

# default.conf
server {
  listen 80 default_server;
  return 444;
}

# app.conf
upstream app_server {
  server {{ APP_SERVER_ADDRESS }} fail_timeout=0;
}

server {
  listen 80;
  server_name {{ WEB_SERVER_NAME }};

  access_log /var/log/nginx/access.log access_json;
  error_log /var/log/nginx/error.log warn;

  location /static/ {
    alias /var/app/static/;
  }

  location / {
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Request-Id $request_id;
    proxy_redirect off;
    proxy_pass http://app_server;
  }
}

However, once you enable SSL with Let’s Encrypt, there’s a wrinkle: because there is only one SSL server configuration by default, nginx routes all HTTPS traffic to that one server regardless of host matching. What this means is that while requests made to http://<server ip address> will continue to be blocked, requests to https://<server ip address> will be forwarded to your Django app server, resulting in errors. Yikes!

The solution is to add a default SSL-enabled server, much like your HTTP configuration. The only tricky bit is that every SSL server block must have a valid SSL certificate configuration as well. Rather than making a self-signed certificate, I reused my Let’s Encrypt SSL configuration.

# default.conf
server {
  listen 80 default_server;
  return 444;
}

server {
  listen 443 ssl default_server;
  ssl_certificate /etc/letsencrypt/live/{{ WEB_SERVER_NAME }}/fullchain.pem;
  ssl_certificate_key /etc/letsencrypt/live/{{ WEB_SERVER_NAME }}/privkey.pem;
  include /etc/letsencrypt/options-ssl-nginx.conf;
  ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;

  if ($host != {{ WEB_SERVER_NAME }}) {
    return 444;
  }
}
By adding a default SSL server to your nginx config your server_name settings will be respected and requests that do not match your host name will no longer be forwarded to your app server.

Recently at work I’ve been doing quite a bit with Django and GraphQL. There doesn’t seem to be much written about best practices for organizing Graphene-Django projects, so I’ve decided to document what’s working for me. In this example I have three Django apps: common, foo, and hoge.

There are two main goals for this architecture:

  1. Minimize importing from “outside” apps.
  2. Keep testing simple.

Queries and Mutations Package

Anything beyond a simple query (i.e. a query that just returns all records of a given model) is implemented in its own file in the queries or mutations sub-package. Each file is as self-contained as possible and contains any type definitions specific to that query, forms for validation, and an object that can be imported by the app’s schema.py.

Input Validation

All input validation is performed by a classic Django form instance. For ease of use, the Django form input does not necessarily match the GraphQL input. Consider a mutation that sends a list of dictionaries, each with an object id.

  {
    "foos": [
      {"id": 1, "name": "Bumble"},
      {"id": 2, "name": "Bee"}
    ]
  }
Before processing the request, you want to validate that the ids passed actually exist and are referenceable by the user making the request. Writing a custom Django form field to handle this input would be time consuming and potentially error prone. Instead, each form has a class method called convert_graphql_input_to_form_input which takes the mutation input object and returns a dictionary that can be passed to the form to clean and validate.

from django import forms

from foo import models

class UpdateFooForm(forms.Form):
    foos = forms.ModelMultipleChoiceField(queryset=models.Foo.objects)

    @classmethod
    def convert_graphql_input_to_form_input(cls, graphql_input: "UpdateFooInput"):
        return {"foos": [foo["id"] for foo in graphql_input.foos]}
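To illustrate the conversion in isolation, here’s a hedged sketch where SimpleNamespace stands in for the graphene input object and the data is made up:

```python
from types import SimpleNamespace
from typing import Any, Dict, List

def convert_graphql_input_to_form_input(graphql_input: Any) -> Dict[str, List[int]]:
    # Same flattening as the form's class method: keep only the ids,
    # which is the shape ModelMultipleChoiceField expects as input.
    return {"foos": [foo["id"] for foo in graphql_input.foos]}

# Stand-in for the GraphQL input object (hypothetical data)
graphql_input = SimpleNamespace(
    foos=[{"id": 1, "name": "Bumble"}, {"id": 2, "name": "Bee"}]
)

print(convert_graphql_input_to_form_input(graphql_input))  # {'foos': [1, 2]}
```

From here, form.is_valid() takes care of verifying that every id maps to an existing row.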

Extra Processing

Extra processing before save is handled by the form in a prepare_data method. The role of this method is to prepare any data prior to, and without, saving. Usually I’ll prepare model instances, set values on existing instances, and so forth. This allows the save() method to use bulk_create() and bulk_update() easily, keeping save doing just that – saving.

Objects and lists of objects that are going to be saved / bulk_created / bulk_updated in save are stored on the form. Each attribute is defined and set in __init__ with full type hints. Example:

from typing import List, Optional

from django import forms

class UpdateFooForm(forms.Form):
    foos = forms.ModelMultipleChoiceField(queryset=models.Foo.objects)

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.foo_bars: List[FooBar] = []
        self.bar: Optional[Bar] = None

Type Definition Graduation

Types are defined in each query/mutation file where possible. As the schema grows and multiple queries/mutations, or other apps’ queries/mutations, reference the same type, the location where the type is defined changes. This is partially for a cleaner architecture, but also to avoid circular import errors.

└── apps
    ├── common
    │   ├──
    │   └──       # global types used by multiple apps are defined here
    └── hoge
        ├── mutations
        │   ├──  # types only used by create_hoge are in here
        │   └──
        ├── queries
        │   └──
        └──       # types used by create/update_hoge and/or complex_query are defined here

Example Mutation

The logic kept inside a query/mutation is as minimal as possible. This is because it’s difficult to test logic inside the mutation without writing a full-blown end-to-end test.

import graphene
from graphene_django.types import ErrorType

class UpdateHogeReturnType(graphene.Union):
    class Meta:
        types = (HogeType, ErrorType)

class UpdateHogeMutationType(graphene.Mutation):

    class Meta:
        output = graphene.NonNull(UpdateHogeReturnType)

    class Arguments:
        update_hoge_input = UpdateHogeInputType()

    def mutate(root, info, update_hoge_input: UpdateHogeInputType) -> UpdateHogeReturnType:
        data = UpdateHogeForm.convert_graphql_input_to_form_input(update_hoge_input)
        form = UpdateHogeForm(data=data)
        if form.is_valid():
            return form.save()  # the updated Hoge, resolved as HogeType
        errors = ErrorType.from_errors(form.errors)
        return ErrorType(errors=errors)

Adding Queries/Mutations to your Schema

This architecture tries to consistently follow the graphene standard for defining a schema: you create a class Query and a class Mutation, then pass those to your schema with schema = graphene.Schema(query=Query, mutation=Mutation).

Each app should build its own Query and Mutation objects. These are then imported in common/, combined into new Query and Mutation classes, and passed to the schema.

# hoge/mutations/

class UpdateHogeMutation:

    update_hoge = UpdateHogeMutationType.Field()

# hoge/

from .mutations import update_hoge, create_hoge

class Mutation(update_hoge.UpdateHogeMutation, create_hoge.CreateHogeMutation):
    pass

# common/

import graphene

import foo.schema
import hoge.schema

class Query(hoge.schema.Query, foo.schema.Query, graphene.ObjectType):
    pass

class Mutation(hoge.schema.Mutation, foo.schema.Mutation, graphene.ObjectType):
    pass

schema = graphene.Schema(query=Query, mutation=Mutation)

Directory Tree Overview

└── apps
    ├── common
    │   ├──
    │   └──
    ├── foo
    │   ├── mutations
    │   │   └──
    │   ├── queries
    │   │   └──
    │   └──
    └── hoge
        ├── mutations
        │   ├──
        │   ├──
        │   └──
        ├── queries
        │   └──