# WPGMA

WPGMA (Weighted Pair Group Method with Arithmetic Mean) is a simple agglomerative (bottom-up) hierarchical clustering method, generally attributed to Sokal and Michener.[1]

The WPGMA method is similar to its unweighted variant, the UPGMA method.

## Algorithm

The WPGMA algorithm constructs a rooted tree (dendrogram) that reflects the structure present in a pairwise distance matrix (or a similarity matrix). At each step, the nearest two clusters, say ${\displaystyle i}$ and ${\displaystyle j}$, are combined into a higher-level cluster ${\displaystyle i\cup j}$. Its distance to another cluster ${\displaystyle k}$ is then simply the arithmetic mean of the distances from ${\displaystyle k}$ to ${\displaystyle i}$ and from ${\displaystyle k}$ to ${\displaystyle j}$:

${\displaystyle d_{(i\cup j),k}={\frac {d_{i,k}+d_{j,k}}{2}}}$

The WPGMA algorithm produces rooted dendrograms and relies on a constant-rate assumption: it produces an ultrametric tree in which the distances from the root to every branch tip are equal. This ultrametricity assumption is called the molecular clock when the tips are DNA, RNA or protein sequences.
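As a sketch, the merge loop and the update rule above can be written in a few lines of Python (the data representation and the function name here are illustrative, not from any particular library):

```python
def wpgma_cluster(labels, dist):
    """Sketch of WPGMA agglomerative clustering.

    labels: list of leaf names.
    dist: dict mapping frozenset({x, y}) -> pairwise distance; mutated in place.
    Returns the nested-tuple tree and the list of (cluster, merge distance) pairs.
    """
    clusters = list(labels)
    merges = []
    while len(clusters) > 1:
        # Find the closest pair of current clusters.
        i, j = min(
            ((p, q) for n, p in enumerate(clusters) for q in clusters[n + 1:]),
            key=lambda pq: dist[frozenset(pq)],
        )
        d_ij = dist[frozenset((i, j))]
        merged = (i, j)
        # WPGMA update: the new distance to every other cluster k is the plain
        # average of the two old distances, regardless of cluster sizes.
        for k in clusters:
            if k not in (i, j):
                dist[frozenset((merged, k))] = (
                    dist[frozenset((i, k))] + dist[frozenset((j, k))]
                ) / 2
        clusters = [c for c in clusters if c not in (i, j)]
        clusters.append(merged)
        merges.append((merged, d_ij))
    return clusters[0], merges
```

Running this on the five-element matrix of the worked example below reproduces its merge distances 17, 22, 28 and 35 (i.e. merge heights 8.5, 11, 14 and 17.5, since each new node sits at half the merge distance).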

## Working example

This working example is based on a JC69 genetic distance matrix computed from the 5S ribosomal RNA sequence alignment of five bacteria: Bacillus subtilis (${\displaystyle a}$), Bacillus stearothermophilus (${\displaystyle b}$), Lactobacillus viridescens (${\displaystyle c}$), Acholeplasma modicum (${\displaystyle d}$), and Micrococcus luteus (${\displaystyle e}$).[2][3]

### First step

• First clustering

Let us assume that we have five elements ${\displaystyle (a,b,c,d,e)}$ and the following matrix ${\displaystyle D_{1}}$ of pairwise distances between them:

|   | a | b | c | d | e |
|---|---|---|---|---|---|
| a | 0 | 17 | 21 | 31 | 23 |
| b | 17 | 0 | 30 | 34 | 21 |
| c | 21 | 30 | 0 | 28 | 39 |
| d | 31 | 34 | 28 | 0 | 43 |
| e | 23 | 21 | 39 | 43 | 0 |

In this example, ${\displaystyle D_{1}(a,b)=17}$ is the smallest value of ${\displaystyle D_{1}}$, so we join elements ${\displaystyle a}$ and ${\displaystyle b}$.

• First branch length estimation

Let ${\displaystyle u}$ denote the node to which ${\displaystyle a}$ and ${\displaystyle b}$ are now connected. Setting ${\displaystyle \delta (a,u)=\delta (b,u)=D_{1}(a,b)/2}$ ensures that elements ${\displaystyle a}$ and ${\displaystyle b}$ are equidistant from ${\displaystyle u}$. This corresponds to the expectation of the ultrametricity hypothesis. The branches joining ${\displaystyle a}$ and ${\displaystyle b}$ to ${\displaystyle u}$ then have lengths ${\displaystyle \delta (a,u)=\delta (b,u)=17/2=8.5}$ (see the final dendrogram).

• First distance matrix update

We then proceed to update the initial distance matrix ${\displaystyle D_{1}}$ into a new distance matrix ${\displaystyle D_{2}}$ (see below), reduced in size by one row and one column because of the clustering of ${\displaystyle a}$ with ${\displaystyle b}$. Bold values in ${\displaystyle D_{2}}$ correspond to the new distances, calculated by averaging distances between each element of the first cluster ${\displaystyle (a,b)}$ and each of the remaining elements:

${\displaystyle D_{2}((a,b),c)=(D_{1}(a,c)+D_{1}(b,c))/2=(21+30)/2=25.5}$

${\displaystyle D_{2}((a,b),d)=(D_{1}(a,d)+D_{1}(b,d))/2=(31+34)/2=32.5}$

${\displaystyle D_{2}((a,b),e)=(D_{1}(a,e)+D_{1}(b,e))/2=(23+21)/2=22}$

Italicized values in ${\displaystyle D_{2}}$ are not affected by the matrix update as they correspond to distances between elements not involved in the first cluster.
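These averages are easy to verify mechanically; a throwaway snippet with the values copied from ${\displaystyle D_{1}}$ above:

```python
# Distances from D1 between the members of the new cluster (a,b)
# and each remaining element.
D1 = {('a', 'c'): 21, ('b', 'c'): 30,
      ('a', 'd'): 31, ('b', 'd'): 34,
      ('a', 'e'): 23, ('b', 'e'): 21}

# WPGMA update: plain average of the two old distances.
D2 = {k: (D1[('a', k)] + D1[('b', k)]) / 2 for k in 'cde'}
print(D2)  # {'c': 25.5, 'd': 32.5, 'e': 22.0}
```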

### Second step

• Second clustering

We now reiterate the three previous steps, starting from the new distance matrix ${\displaystyle D_{2}}$:

|   | (a,b) | c | d | e |
|---|---|---|---|---|
| (a,b) | 0 | **25.5** | **32.5** | **22** |
| c | **25.5** | 0 | *28* | *39* |
| d | **32.5** | *28* | 0 | *43* |
| e | **22** | *39* | *43* | 0 |

Here, ${\displaystyle D_{2}((a,b),e)=22}$ is the smallest value of ${\displaystyle D_{2}}$, so we join cluster ${\displaystyle (a,b)}$ and element ${\displaystyle e}$.

• Second branch length estimation

Let ${\displaystyle v}$ denote the node to which ${\displaystyle (a,b)}$ and ${\displaystyle e}$ are now connected. Because of the ultrametricity constraint, the branches joining ${\displaystyle a}$ or ${\displaystyle b}$ to ${\displaystyle v}$, and ${\displaystyle e}$ to ${\displaystyle v}$, are equal and have the following length: ${\displaystyle \delta (a,v)=\delta (b,v)=\delta (e,v)=22/2=11}$

We deduce the missing branch length: ${\displaystyle \delta (u,v)=\delta (e,v)-\delta (a,u)=\delta (e,v)-\delta (b,u)=11-8.5=2.5}$ (see the final dendrogram).
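As a quick numeric check of the branch-length bookkeeping so far (all values are taken from the two steps above):

```python
d_ab = 17    # D1(a, b): first merge distance
d_ab_e = 22  # D2((a,b), e): second merge distance

delta_a_u = d_ab / 2               # 8.5, depth of a and b below u
delta_e_v = d_ab_e / 2             # 11.0, depth of a, b and e below v
delta_u_v = delta_e_v - delta_a_u  # 2.5, internal branch from u up to v
```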

• Second distance matrix update

We then proceed to update the ${\displaystyle D_{2}}$ matrix into a new distance matrix ${\displaystyle D_{3}}$ (see below), reduced in size by one row and one column because of the clustering of ${\displaystyle (a,b)}$ with ${\displaystyle e}$:

${\displaystyle D_{3}(((a,b),e),c)=(D_{2}((a,b),c)+D_{2}(e,c))/2=(25.5+39)/2=32.25}$

Of note, this average calculation of the new distance does not account for the larger size of the ${\displaystyle (a,b)}$ cluster (two elements) with respect to ${\displaystyle e}$ (one element). Similarly:

${\displaystyle D_{3}(((a,b),e),d)=(D_{2}((a,b),d)+D_{2}(e,d))/2=(32.5+43)/2=37.75}$

The averaging procedure therefore gives differential weight to the initial distances of matrix ${\displaystyle D_{1}}$. This is the reason why the method is weighted, not with respect to the mathematical procedure but with respect to the initial distances.
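The contrast with UPGMA makes this concrete: for the same update, UPGMA would weight each old distance by the size of its cluster, giving every initial distance equal influence. A quick comparison using the numbers above:

```python
d_ab_c, d_e_c = 25.5, 39  # D2((a,b), c) and D2(e, c)
n_ab, n_e = 2, 1          # cluster sizes

# WPGMA ignores the cluster sizes ...
d_wpgma = (d_ab_c + d_e_c) / 2                          # 32.25
# ... while UPGMA weights by them, recovering the plain average
# of the three original distances D1(a,c), D1(b,c), D1(e,c).
d_upgma = (n_ab * d_ab_c + n_e * d_e_c) / (n_ab + n_e)  # 30.0
```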

### Third step

• Third clustering

We again reiterate the three previous steps, starting from the updated distance matrix ${\displaystyle D_{3}}$.

|   | ((a,b),e) | c | d |
|---|---|---|---|
| ((a,b),e) | 0 | 32.25 | 37.75 |
| c | 32.25 | 0 | 28 |
| d | 37.75 | 28 | 0 |

Here, ${\displaystyle D_{3}(c,d)=28}$ is the smallest value of ${\displaystyle D_{3}}$, so we join elements ${\displaystyle c}$ and ${\displaystyle d}$.

• Third branch length estimation

Let ${\displaystyle w}$ denote the node to which ${\displaystyle c}$ and ${\displaystyle d}$ are now connected. The branches joining ${\displaystyle c}$ and ${\displaystyle d}$ to ${\displaystyle w}$ then have lengths ${\displaystyle \delta (c,w)=\delta (d,w)=28/2=14}$ (see the final dendrogram).

• Third distance matrix update

There is a single entry to update: ${\displaystyle D_{4}((c,d),((a,b),e))=(D_{3}(c,((a,b),e))+D_{3}(d,((a,b),e)))/2=(32.25+37.75)/2=35}$

### Final step

The final ${\displaystyle D_{4}}$ matrix is:

|   | ((a,b),e) | (c,d) |
|---|---|---|
| ((a,b),e) | 0 | 35 |
| (c,d) | 35 | 0 |

So we join clusters ${\displaystyle ((a,b),e)}$ and ${\displaystyle (c,d)}$.

Let ${\displaystyle r}$ denote the (root) node to which ${\displaystyle ((a,b),e)}$ and ${\displaystyle (c,d)}$ are now connected. The branches joining ${\displaystyle ((a,b),e)}$ and ${\displaystyle (c,d)}$ to ${\displaystyle r}$ then have lengths:

${\displaystyle \delta (((a,b),e),r)=\delta ((c,d),r)=35/2=17.5}$

We deduce the two remaining branch lengths:

${\displaystyle \delta (v,r)=\delta (((a,b),e),r)-\delta (e,v)=17.5-11=6.5}$

${\displaystyle \delta (w,r)=\delta ((c,d),r)-\delta (c,w)=17.5-14=3.5}$

### The WPGMA dendrogram

The dendrogram is now complete. It is ultrametric because all tips (${\displaystyle a}$ to ${\displaystyle e}$) are equidistant from ${\displaystyle r}$:

${\displaystyle \delta (a,r)=\delta (b,r)=\delta (e,r)=\delta (c,r)=\delta (d,r)=17.5}$
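This can be checked by summing the estimated branch lengths along each tip-to-root path (all values come from the steps above):

```python
# Branch lengths estimated in the worked example.
branch = {('a', 'u'): 8.5, ('b', 'u'): 8.5, ('e', 'v'): 11.0,
          ('c', 'w'): 14.0, ('d', 'w'): 14.0,
          ('u', 'v'): 2.5, ('v', 'r'): 6.5, ('w', 'r'): 3.5}

# Path from each tip up to the root r.
paths = {'a': [('a', 'u'), ('u', 'v'), ('v', 'r')],
         'b': [('b', 'u'), ('u', 'v'), ('v', 'r')],
         'e': [('e', 'v'), ('v', 'r')],
         'c': [('c', 'w'), ('w', 'r')],
         'd': [('d', 'w'), ('w', 'r')]}

depths = {tip: sum(branch[edge] for edge in path) for tip, path in paths.items()}
print(depths)  # every tip sits 17.5 units from the root
```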

The dendrogram is therefore rooted by ${\displaystyle r}$, its deepest node.

Alternative linkage schemes include single linkage clustering, complete linkage clustering, and UPGMA average linkage clustering. Implementing a different linkage is simply a matter of using a different formula to calculate inter-cluster distances during the distance matrix update steps of the above algorithm. Complete linkage clustering avoids a drawback of the alternative single linkage method: the so-called chaining phenomenon, where clusters formed via single linkage may be forced together because single elements are close to each other, even though many of the elements in each cluster may be very distant from each other. Complete linkage tends to find compact clusters of approximately equal diameter.[4]
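In code, switching linkage scheme amounts to swapping one update function in the distance-matrix update step; a sketch of the four rules mentioned above (the cluster sizes ${\displaystyle n_{i}}$ and ${\displaystyle n_{j}}$ are only needed by UPGMA):

```python
def single(d_ik, d_jk, ni, nj):
    # Single linkage: distance between the two closest members.
    return min(d_ik, d_jk)

def complete(d_ik, d_jk, ni, nj):
    # Complete linkage: distance between the two furthest members.
    return max(d_ik, d_jk)

def wpgma(d_ik, d_jk, ni, nj):
    # WPGMA: plain average, ignoring cluster sizes.
    return (d_ik + d_jk) / 2

def upgma(d_ik, d_jk, ni, nj):
    # UPGMA: average weighted by cluster sizes.
    return (ni * d_ik + nj * d_jk) / (ni + nj)
```

For reference, SciPy's `scipy.cluster.hierarchy.linkage` exposes these same rules as `method='single'`, `'complete'`, `'weighted'` and `'average'`, respectively.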