If you’re relatively new to the statically-typed functional programming scene, there’s a good chance you’ve encountered these fancy-sounding things called “algebraic data types” (often abbreviated as ADTs), and have been left feeling both a little intimidated by them and confused about what they have to do with algebra. That was my initial experience with them, anyway. I’d like to try offering a beginner-friendly introduction to them in a fashion that I think would have helped me understand them sooner.
Algebraic data types without `Maybe`, `Either`, `These`, or type classes
I’m willing to bet you were first exposed to the concept of ADTs through a data type known as `Maybe` (a.k.a. `Option`). Then you encountered `Either` (a.k.a. `Result`), and then perhaps you saw `These` mentioned somewhere. These are indeed algebraic data types, but it’s not immediately obvious what is algebraic about them, and, to add to the confusion, they are often presented as examples of other mathematical-sounding concepts that one might mistakenly interpret as the source of their algebraic-ness. I, for one, thought they were algebraic because they have type classes like functor and monad with special “laws” and what-have-you, and I didn’t realize until much later that those concepts are not directly related to algebraic data types.
The reality is that types like `Maybe` and `Either` aren’t any more algebraic than most of the other types you see every day in TypeScript (or whatever language you’re using). For example, this union of string literals is an algebraic data type:
```typescript
type Size
  = "small"
  | "medium"
  | "large"
```
And so is this interface for a customer:
```typescript
interface Customer {
  isPremiumMember: boolean
  shirtSize: Size
}
```
Actually, if my understanding is correct, almost every type in TypeScript (or a similar language) technically qualifies as an algebraic data type. If that’s the case, what is this “algebra” we keep referring to? What is algebraic about algebraic data types?
Cardinality: the numerical value of a type
The first thing you need to know is that there is a numerical relationship between types and the runtime values they represent. This numerical relationship is simple: there is a quantifiable number of runtime values that inhabit every type, anywhere from zero to positive infinity.
For example, consider these primitive TypeScript types:
- `never` has zero possible runtime values
- `null` has one possible runtime value, namely, `null`
- `boolean` has two possible runtime values: `true` and `false`
- `number` and `string` have, for practical purposes, infinitely many possible runtime values
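To make that concrete, here’s a minimal sketch (the variable names are just for illustration) of what these cardinalities look like at the value level:

```typescript
// never: uninhabited; no value can ever be assigned to it.
// const impossible: never = ...  // nothing could go here

// null: exactly one inhabitant.
const n: null = null

// boolean: exactly two inhabitants.
const t: boolean = true
const f: boolean = false

// string: an unbounded number of inhabitants.
const s: string = "any of infinitely many strings"
```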
The case is the same for user-defined types:
- `type Size = "small" | "medium" | "large"` has three runtime values, namely, the string literals you see in the type declaration.
- `Customer` (defined above) has six possible runtime values, which you can see if you list out every possible combination of values for its two fields.
The number of values that inhabit a type is referred to as the type’s cardinality, a term borrowed from set theory, where it denotes the number of members in a set. Once we recognize that types have a numerical value by way of cardinality, their algebraic-ness begins to come into focus. Algebraic data types are algebraic because you can reason about their cardinality with basic algebra. That’s all.
Sums and Products
There are two main kinds of algebraic data types that you will use day in and day out; in fact, you’re already using them: sum types and product types. Their names give you a hint as to how their cardinalities are calculated.
Sum types and product types both combine other types to form a new type, but they do so in very different ways, which significantly affects how we calculate their cardinalities.
Sum Types
Sum types are so called because they combine other types in such a way that their cardinality is the sum of the cardinalities of the types they combine (minus any duplicate members that might exist). The TypeScript documentation calls this a union type, which is a type that “describes a value that can be one of several types.” We call the “several types” that make up a sum type its variants. The canonical sum type is `boolean`, whose members belong to either the literal type `true` or the literal type `false`.
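If we could write `boolean` ourselves, it would look something like the sketch below (`MyBoolean` is a made-up name, since `boolean` is built into the language):

```typescript
// A user-defined equivalent of the built-in boolean:
type MyBoolean = true | false
// 2 = 1 + 1
```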
Our `Size` type from earlier is also an example of a sum (or union) type:
```typescript
type Size
  = "small"
  | "medium"
  | "large"
```
We might read this type declaration as follows: a value of type `Size` is a member of either the literal type `"small"`, the literal type `"medium"`, or the literal type `"large"`. The logic of either-or means each variant’s members are joined with the others' to form a larger set, whose total number of members is the cardinality of the sum type. Since each of these variants is a string literal type, each has a cardinality of 1. To find the cardinality of `Size`, we perform simple addition:
```typescript
type Size = "small" | "medium" | "large"
// 3 = 1 + 1 + 1
```
This is the algebra of sum types. It holds up no matter what types you throw into a sum type. Here’s a slightly more complex example, and it still just amounts to basic addition:
```typescript
type Size = RegularSize | TallSize | XtraSize
// 9 = 3 + 3 + 3

type RegularSize = "small" | "medium" | "large"
type TallSize = "small-tall" | "medium-tall" | "large-tall"
type XtraSize = "xsmall" | "xlarge" | "xxlarge"
```
Now here’s a curveball: What would happen if we threw `string` into the `Size` union type? Is it still an algebraic data type? What would be its cardinality?
```typescript
type Size = RegularSize | TallSize | XtraSize | string
// ? = 3 + 3 + 3 + ???
```
Well, what is the cardinality of `string`? If we use our imaginations a little, we can actually view `string` as a sum type whose variants are every possible combination of string characters, like this:
```typescript
type string
  = ""
  | " "
  | // ...spaces to infinity
  | "a"
  | "aa"
  | // ..."a"s to infinity
  | // ...and so on
```
It would seem, then, that `string` is a sum type with an infinite cardinality. So the algebra of sum types still applies to `Size`, even if it’s a bit odd:
```typescript
type Size = RegularSize | TallSize | XtraSize | string
// Infinity = 3 + 3 + 3 + Infinity
```
Of course, this is a pretty bogus type that you’d never write. But it shows that the algebra of sum types holds even if we’re dealing with types of infinite cardinality.
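One TypeScript-specific wrinkle worth noting: because each string literal type is a subtype of `string`, the compiler will simplify this union to plain `string`. That’s the “minus any duplicate members” caveat from earlier playing out at infinite scale:

```typescript
type Size = RegularSize | TallSize | XtraSize | string
// TypeScript collapses this union to just `string`: all nine literal
// variants are already members of `string`, so they add nothing new.
```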
With this understanding of sum types, we can now see what is algebraic about data types like `Maybe<T>` and `Either<L, R>`. They are just examples of sum types. However, because they have generic type parameters, we don’t have enough information to determine their cardinality right away.
```typescript
interface Nothing { tag: "Nothing" }
interface Just<T> { tag: "Just", value: T }

type Maybe<T> = Nothing | Just<T>
// ? = 1 + ?
```
Because we don’t know what `T` is, we can’t know its cardinality, and therefore we can’t know the cardinality of `Maybe<T>`. The best we can do is express it as an equation with unknowns:
```typescript
type Maybe<T> = Nothing | Just<T>
// y = 1 + x
```
Once `Maybe` is supplied with a type argument, we can then plug in the numbers to solve for the cardinality. For example, `Maybe<Size>` would work out like so:
```typescript
Maybe<Size> = Nothing | Just<Size>
// 4 = 1 + 3
```
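We can sanity-check that by listing all four inhabitants by hand (using the original three-member `Size`):

```typescript
const m1: Maybe<Size> = { tag: "Nothing" }
const m2: Maybe<Size> = { tag: "Just", value: "small" }
const m3: Maybe<Size> = { tag: "Just", value: "medium" }
const m4: Maybe<Size> = { tag: "Just", value: "large" }
```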
`Either` is similarly straightforward:
```typescript
interface Left<L> { tag: "Left", left: L }
interface Right<R> { tag: "Right", right: R }

type Either<L, R> = Left<L> | Right<R>
// z = x + y
```
If we had `Either<boolean, Size>`, then finding its cardinality is as simple as 2 + 3 = 5.
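The same hand-count works here; five values, no more, no less (again assuming the original three-member `Size`):

```typescript
const e1: Either<boolean, Size> = { tag: "Left", left: true }
const e2: Either<boolean, Size> = { tag: "Left", left: false }
const e3: Either<boolean, Size> = { tag: "Right", right: "small" }
const e4: Either<boolean, Size> = { tag: "Right", right: "medium" }
const e5: Either<boolean, Size> = { tag: "Right", right: "large" }
```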
That about sums it up for sum types.
Product Types
Product types are so called because they combine other types in such a way that their cardinality is the product of the cardinalities of the types they combine. They do this by combining multiple types into a structure. In TypeScript, the structures we can use for product types are tuples and object types with static keys (which can be expressed as interfaces, classes, and object literal type aliases). Our `Customer` interface from earlier is an example of a product type:
```typescript
interface Customer {
  isPremiumMember: boolean
  shirtSize: Size
}
```
It combines `Size` (cardinality 3) and `boolean` (cardinality 2) into a structure by specifying these types on the keys of an object. We can count the number of values that inhabit `Customer` by listing out all the possibilities:
```typescript
{ isPremiumMember: true, shirtSize: "small" }
{ isPremiumMember: true, shirtSize: "medium" }
{ isPremiumMember: true, shirtSize: "large" }
{ isPremiumMember: false, shirtSize: "small" }
{ isPremiumMember: false, shirtSize: "medium" }
{ isPremiumMember: false, shirtSize: "large" }
```
However, on a more complicated product type, it would be unwieldy, if not impossible, to find the cardinality by listing out all possible values. Instead, we can use the algebra of product types to calculate its cardinality:
```typescript
// Size x boolean = Customer
// 3 x 2 = 6
```
If `Customer` had another field, or ten more, it’s still just as easy to calculate its cardinality:
```typescript
interface Customer {
  favoriteFood: Food
  isPremiumMember: boolean
  shirtSize: Size
}

type Food
  = "hotdog"
  | "burger"
  | "pizza"
  | "salad"
  | "pretzel"
  | "nachos"
```
Now `Customer` combines three types into a structure, and look how much its cardinality has grown:
```typescript
// Food x Size x boolean = Customer
// 6 x 3 x 2 = 36
```
The algebra of product types works the same regardless of the structure we choose. If for some reason we decided to represent `Customer` as a tuple, the math would work out the same:
```typescript
type Customer = [Food, Size, boolean]
// 36 = 6 x 3 x 2
```
This makes sense because we’ve simply replaced the keys of an object/record/struct with the indexes of a tuple.
When the product type we’re dealing with is generic (i.e., it has type parameters, like `Maybe<T>` and `Either<L, R>`), we are left with a simple equation to be solved when the type is fully resolved. As an example, we can look at the canonical product type known as `Pair<A, B>`, a simple structure over two types. Again, it could be implemented as either a tuple or an object type. For this example we’ll use a tuple:
```typescript
type Pair<A, B> = [A, B]
```
Since we do not know what `A` and `B` are yet, all we know is the equation to solve for the cardinality of `Pair<A, B>`:
```typescript
type Pair<A, B> = [A, B]
// z = x * y
```
So if we had a function that returned `Pair<boolean, Size>`, we just plug in A = 2 and B = 3 and find that our function can only return 2 * 3, or 6, possible values.
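As a sketch of that point (the function here is hypothetical, and `Size` is the original three-member version), the type signature alone tells us how many return values are possible:

```typescript
// Hypothetical: pair a shirt size with whether it's in stock.
// Pair<boolean, Size> has cardinality 2 * 3 = 6, so only six
// distinct values can ever come out of this function.
function withStockStatus(size: Size, inStock: boolean): Pair<boolean, Size> {
  return [inStock, size]
}
```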
Take a look at these other `Pair`s and use the algebra of product types to determine their cardinality:

- `Pair<null, null>`
- `Pair<never, Food>`
- `Pair<boolean, Maybe<boolean>>`
- `Pair<Either<Error, Customer>, Maybe<Size>>` (use the original `Customer`)
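If you want to check your work, here’s the arithmetic I get, with one judgment call: I treat `Error` as having effectively infinite cardinality, since its `message` field is a `string`:

```typescript
// Pair<null, null>                           -> 1 * 1 = 1
// Pair<never, Food>                          -> 0 * 6 = 0 (uninhabited!)
// Pair<boolean, Maybe<boolean>>              -> 2 * (1 + 2) = 6
// Pair<Either<Error, Customer>, Maybe<Size>> -> (Infinity + 6) * (1 + 3) = Infinity
```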
By understanding the algebra of sums and products, one can easily calculate the number of possibilities these types encode. That’s the power of algebraic data types!
Why the Algebra of Types Matters
After getting through all that, it’s reasonable to wonder why any of this matters. So types have cardinality and I can calculate the cardinality of a type with some basic math. So what?
In my own experience, the main advantage of understanding the algebra of types is that it makes me much more aware of how software complexity is directly tied to the aggregate complexity of the data models we choose throughout our programs, and it enables me to design better data models that reduce the cognitive complexity of writing correct, error-free programs.
Think about what a type’s cardinality translates to in a program: cases or branches that need to be handled, and we all know how tough heavily branching code is to deal with. Imagine implementing a function whose argument has a cardinality of 4 versus a function whose argument has a cardinality of 36 (or Infinity!). That function needs to handle every possible case in order to avoid a runtime error or some other higher-level business logic error. That’s much easier to do when the cardinality is low. Oftentimes, a high cardinality is indicative of a poorly designed data model, and understanding the algebra of types can help you design a simpler or more accurate one that makes your program less complicated to reason about.
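As a quick sketch of what low cardinality buys you (the function and prices here are made up), a three-member `Size` lets the compiler verify that every case is handled, something it could never do for a plain `string` argument:

```typescript
function shirtPrice(size: Size): number {
  switch (size) {
    case "small":  return 10
    case "medium": return 12
    case "large":  return 14
    default: {
      // If a variant is ever added to Size, `size` no longer narrows
      // to `never` here and this assignment stops compiling, forcing
      // us to handle the new case.
      const unhandled: never = size
      return unhandled
    }
  }
}
```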
What We Learned
Algebraic data types sound complicated, but they’re really quite simple and surprisingly ubiquitous. We can view types in terms of the set of values that the compiler says are valid members of that type, which means we can think about types in terms of a set’s cardinality, that is, the number of members in the set. When we see types in terms of cardinality, it’s easier to see what math might have to do with types.
Sum types and product types are the two kinds of algebraic data types we commonly make use of in TypeScript (and many other typed languages).
- A sum type is a type whose members may be members of one of its variant types, and its cardinality is the sum of the cardinalities of its variants. `boolean` is the canonical example of a sum type. The well-known `Maybe` and `Either` data types are examples of sum types as well.
- A product type is a type that combines other types into a structure, like a tuple or object/struct. The cardinality of a product type is the product of the cardinalities of the types it combines. The canonical example of a product type is `Pair<A, B>`, which is a simple structure that holds two values, one of any type `A` and the other of some other type `B`.
Understanding the algebra of types helps us to avoid accidental complexity in our programs by designing better data types to model the problems we’re solving.