Unfortunately, I don’t have a result to work towards with this post. Instead, I want to give a rough idea of what forcing is, by analogy with other extensions of structures, in particular, field extensions and Cauchy completions.

The usual way to construct an extension of a base field $k$ is to construct it as a quotient of the polynomial ring $k[x]$ by some (irreducible) ideal $(p)$. Abstracting a bit, the sketch of this construction is:

- Start with a base structure. (In this case, our base field $k$.)
- Find an object which approximates some gap in our structure. (In this case, the ideal $(p)$ represents an approximation of a solution to some equation (or set of equations) over $k$.)
- Use some common operation to generate a new structure which contains the approximated object. (In this case, take the quotient $k[x]/(p)$.)
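To make the field-extension example concrete, here is a rough sketch in Python (the pair representation and helper name are my own, purely for illustration): elements of $\mathbb{Q}[x]/(x^2 - 2)$ can be coded as pairs $(a, b)$ standing for $a + bx$, with the relation $x^2 = 2$ used to reduce products. The class of $x$ then behaves as the square root of $2$ that $\mathbb{Q}$ lacks.

```python
from fractions import Fraction

# Elements of Q[x]/(x^2 - 2) as pairs (a, b) representing a + b*x.
def mul(u, v):
    a, b = u
    c, d = v
    # (a + b x)(c + d x) = ac + (ad + bc) x + bd x^2, and x^2 reduces to 2
    return (a * c + 2 * b * d, a * d + b * c)

root2 = (Fraction(0), Fraction(1))  # the class of x: a "new" square root of 2
print(mul(root2, root2))            # (Fraction(2, 1), Fraction(0, 1)), i.e. x^2 = 2
```

The quotient "fills the gap": the equation $x^2 = 2$, which has no solution in $\mathbb{Q}$, is solved by the class of $x$ in the quotient.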

A similar thing happens when we complete a metric space:

- Start with a metric space $X$.
- Take the set of Cauchy sequences in $X$. Here, Cauchy sequences are approximations of points which don’t exist in $X$.
- Quotient by Cauchy equivalence.
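For the metric-space case, a quick sketch (the Babylonian iteration is my own choice of example): the rationals computed below form a Cauchy sequence in $\mathbb{Q}$ whose limit, $\sqrt{2}$, is a missing point that the completion supplies.

```python
from fractions import Fraction

# Babylonian iteration: a Cauchy sequence of rationals approximating sqrt(2),
# a "gap" in Q.  Every term is rational, but the limit is not.
x = Fraction(1)
seq = []
for _ in range(5):
    x = (x + 2 / x) / 2
    seq.append(x)
print(float(seq[-1]))  # approximately 1.4142135...
```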

Compactification is another example of this type of construction. The basic idea of forcing is similar, but we need to figure out:

- What is our base structure?
- How do we approximate an object which doesn’t exist?
- What operation do we use to finish the approximation?

The base structure is easy: we start with a model of set theory, $M$. *For technical reasons, we need this to be a model of a “sufficiently large fragment of ZF”. What this means isn’t important, because we’ll just tacitly assume we’re always working with models of ZF (even though this isn’t* quite *justified.) We also want this to be a* transitive *model, because certain technical lemmas (that we’ll avoid) require this. Sorry, this is all a little sloppy; we’ll go into more detail next time.*

What are the approximations? Forcing makes use of a poset $\mathbb{P}$ (called a *forcing notion*) which exists in $M$ (that is, $\mathbb{P} \in M$). The elements of $\mathbb{P}$ are the approximations of an object which doesn’t exist in $M$, and we want to add to $M$ the “limit” (which won’t exist) of a set of elements of $\mathbb{P}$. *(edit: I said supremum before, but that’s not really what we’re looking for.)*

The subset of $\mathbb{P}$ we use is what’s called a *generic filter*. A *filter* in $\mathbb{P}$ is a subset $F \subseteq \mathbb{P}$ such that

- $F$ is non-empty;
- $F$ is upward closed. That is, if $p \in F$ and $p \leq q$, then $q \in F$;
- Every two elements of $F$ are compatible. That is, for every $p, q \in F$, there is some $r \in F$ such that $r \leq p$ and $r \leq q$.
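These three axioms are easy to check mechanically. Here is a hypothetical checker over a finite toy poset (the helper name and the example poset are my own): I order finite binary sequences so that longer sequences are *stronger*, anticipating the Cohen forcing discussed later.

```python
from itertools import product

# Verify the three filter axioms for a subset F of a finite poset P,
# where leq(p, q) means p <= q.
def is_filter(P, F, leq):
    if not F:
        return False                           # F must be non-empty
    for p in F:                                # F must be upward closed
        for q in P:
            if leq(p, q) and q not in F:
                return False
    for p in F:                                # every two elements of F must be
        for q in F:                            # compatible, with a witness in F
            if not any(leq(r, p) and leq(r, q) for r in F):
                return False
    return True

# Toy poset: binary sequences of length <= 2; p <= q iff p end-extends q.
P = [()] + [t for n in (1, 2) for t in product((0, 1), repeat=n)]
leq = lambda p, q: p[:len(q)] == q

print(is_filter(P, {(), (0,), (0, 1)}, leq))  # True: a branch through the tree
print(is_filter(P, {(0,), (1,)}, leq))        # False: not upward closed, and
                                              # (0,) and (1,) are incompatible
```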

*Aside: this is nothing more than a generalization of the notion of filter for Boolean algebras (or lattices in general). Since in a Boolean algebra, a filter is dual to an ideal, by taking a filter we are morally doing the same thing as taking an ideal in a ring.*

A subset $D$ of $\mathbb{P}$ is called *dense* if for every $p \in \mathbb{P}$ there is some $q \in D$ with $q \leq p$ (that is, you can always find an element of $D$ below any element of $\mathbb{P}$), and $D$ is called *open* if it is downward closed.

A filter $G$ is called *generic over $M$* if additionally, $G$ intersects every dense open subset of $\mathbb{P}$ which **is an element of $M$**.

Genericity expresses “non-existence of an object” in much the same way that non-convergence of a Cauchy sequence or irreducibility of an ideal expresses non-existence of an object.

It’s a fact (which I won’t prove) that for every poset $\mathbb{P}$, there is a generic filter $G$ over $M$. However, we want $G$ to not exist in $M$, and to ensure this, we require $\mathbb{P}$ to be *perfect*; that is, for every $p \in \mathbb{P}$, there are $q, r \leq p$ which are incompatible. If $\mathbb{P}$ is perfect, it is in fact the case that there are generic filters over $M$ that are not elements of $M$.

To complete the picture, we need to know how to generate the new structure. For this, we use relative constructibility, which requires a more thorough explanation than I’m prepared to give in this post. We’ll leave a discussion of that, as well as some other important details, for the next post. At any rate, when we’re finished, we’ll have a structure denoted $M[G]$, which has $M$ as a subset and $G$ as an element. Usually, by letting $x = \bigcup G$, we have that $x \in M[G]$, but $x \notin M$. Then $x$ is a new element of our universe, and is the element “approximated” by $G$.

Before I leave off, I want to give a sketch of the first forcing argument: adding a new real number.

As our notion of forcing, we take what’s called the *Cohen forcing* (denoted $\mathbb{C}$), which is the set of all finite sequences of $0$s and $1$s ordered by *reverse inclusion*; that is, $\mathbb{C}$ is the complete binary tree turned upside down. The idea is that the elements of $\mathbb{C}$ represent approximations of subsets of $\mathbb{N}$, where as we move down the tree (away from the root) we get a better approximation. Observe that $\mathbb{C}$ is perfect: we can extend any sequence by appending either $1$ or $0$, and these two extensions will be incompatible.

A filter in $\mathbb{C}$ is a branch in the tree (being just a filter, it need not be a *complete* branch; it may cut off somewhere). A dense subset $D$ is one which extends every possible sequence. That is, given *any* sequence, say $\langle 0, 1, 1 \rangle$, we have some sequence in $D$ which extends it, maybe something like $\langle 0, 1, 1, 0, 1 \rangle$.
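Here is a small sketch of the order, compatibility, and density just described (the representation is my own: conditions as Python tuples of $0$s and $1$s).

```python
# Cohen conditions as finite 0/1 tuples; p <= q means p end-extends q,
# i.e. p is a *stronger* (better) approximation than q.
def leq(p, q):
    return p[:len(q)] == q

def compatible(p, q):
    # In Cohen forcing, two conditions are compatible iff one extends the other.
    return leq(p, q) or leq(q, p)

p = (0, 1, 1)
q = p + (0, 1)                          # an extension of p, further down the tree
print(leq(q, p), compatible(p, q))      # True True
# C is perfect: the two one-step extensions of any condition are incompatible.
print(compatible(p + (0,), p + (1,)))   # False
# D_n = {q : len(q) >= n} is dense open: any condition can be extended into it
# (pad with 0s, say), and extending a member keeps you inside.
in_D = lambda q, n: len(q) >= n
print(in_D(p + (0, 0), 5))              # True
```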

Genericity ensures two things:

- The branch is a “complete” branch: it is a path all the way from the root to “the end”.
- The branch does not exist in .

Once we have a generic filter $G$, when we construct $M[G]$, we get an *infinite* sequence $c = \bigcup G$, which is the limit of the sequences in $G$. By genericity, we must have that $c \notin M$. But this sequence corresponds to a real number; if that real number existed in $M$, then so would $c$, so we know that it is a **new** real number. Such a new real number is called a *Cohen real*.
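The construction of this limit sequence can be mimicked in spirit (only in spirit: a true generic filter meets *every* dense open set in $M$, which no finite computation can do; the toy version below meets only the sets $D_n$ of conditions of length at least $n$).

```python
# Meet the dense sets D_1, ..., D_5 one at a time, extending the condition
# at each step; the union of the resulting chain is an initial segment of
# the would-be Cohen real.  (Padding with 0s is an arbitrary choice here;
# genericity would make the choices for us.)
def extend_into_D(p, n):
    return p + (0,) * max(0, n - len(p))   # some extension of p of length >= n

p = ()
chain = []
for n in range(1, 6):
    p = extend_into_D(p, n)
    chain.append(p)
print(chain[-1])   # (0, 0, 0, 0, 0)
```

A real genericity argument runs the same loop over *all* dense open sets in $M$ (there are infinitely many), which is exactly what forces the resulting infinite sequence out of $M$.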

We can use this idea to construct a model of ZF in which the continuum hypothesis fails: instead of just adding *one* real number, we can add *as many real numbers as we want*.

Next time, after we discuss relative constructibility and some other minutiae of forcing, we will see exactly how to “refute” the continuum hypothesis.