I use the word “orthogonal” a lot — and have been asked before what I meant by it. The best way for me to explain is as follows: *orthogonal is when two concepts are related, and yet independent.*
Orthogonal has its roots in mathematics, so let’s start there.
- In *geometry*, orthogonal means “at right angles” and is thus synonymous with *perpendicular* (the dot product makes this precise; see the short derivation after this list). The most obvious illustration is a point on the 2D plane. That point has a value on the $x$ axis, and another on the $y$ axis, which together define its coordinates on the plane. The $x$ value can change (shifting the point left and right) independently of the $y$ value, and vice-versa; they are *degrees of freedom*.
- In *statistics*, orthogonal means that two variables are independent. Bobby Henderson (of Pastafarian fame) cheekily used the example of [a near-perfect correlation](https://www.spaghettimonster.org/pages/about/open-letter/) between the decline in the number of active pirates and the increase in the global mean surface temperature of Earth; obviously, those are independent *and* unrelated variables, and the correlation is purely coincidental (a quick simulation after this list makes the point numerically).
- In *computer science*, orthogonality is the ability to change one thing in a program without affecting others — again conveying the notion of independence; a small code sketch after this list shows what that looks like in practice.
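
Formally, two vectors are orthogonal exactly when their dot product is zero; the unit vectors along the $x$ and $y$ axes are the canonical example:

$$
\hat{x} \cdot \hat{y} = (1, 0) \cdot (0, 1) = 1 \cdot 0 + 0 \cdot 1 = 0
$$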
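For the statistical sense, here is a minimal sketch (assuming Python with NumPy; the variable names are just a nod to Henderson's example, not real data) showing that the sample correlation of two independently drawn variables hovers near zero:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Two independent variables: neither is computed from the other.
pirates = rng.normal(loc=0.0, scale=1.0, size=10_000)
temperature = rng.normal(loc=0.0, scale=1.0, size=10_000)

# The sample correlation of independent variables converges to zero
# as the sample grows; any residual is sampling noise.
r = np.corrcoef(pirates, temperature)[0, 1]
print(f"sample correlation: {r:.4f}")  # close to 0.0
```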
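And for the computer-science sense, a hypothetical sketch (the `Shape` type and its fields are made up for illustration): *what* is drawn and *where* it is drawn are orthogonal, so either can change without touching the other.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Shape:
    kind: str  # what to draw: "circle", "square", ...
    x: float   # where to draw it (horizontal)
    y: float   # where to draw it (vertical)

s = Shape(kind="circle", x=0.0, y=0.0)

# Orthogonal changes: moving the shape never alters its kind,
# and changing its kind never alters its position.
moved = replace(s, x=3.0)             # only x changes
restyled = replace(s, kind="square")  # only kind changes

print(moved)     # Shape(kind='circle', x=3.0, y=0.0)
print(restyled)  # Shape(kind='square', x=0.0, y=0.0)
```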
But what if two variables are independent but still related in some way?
An example that comes to mind is the position $x$ and momentum $p$ of a particle. Knowing where the particle is located tells us nothing about its momentum, and vice-versa; they are *independent* variables. But they are also *related* through the uncertainty principle ($\sigma_{x} \cdot \sigma_{p} \geq \frac{\hbar}{2}$, meaning that the product of their standard deviations is greater than or equal to half of the reduced Planck constant). Those variables are not as foreign to each other as, say, the number of active pirates and the mean surface temperature of Earth.
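To get a feel for the numbers, here is a quick sketch (assuming Python with SciPy for the value of $\hbar$; the nanometre-scale $\sigma_x$ is just a hypothetical input): given an uncertainty on position, the principle puts a floor under the uncertainty on momentum.

```python
from scipy.constants import hbar  # reduced Planck constant, in J*s

# Hypothetical position uncertainty: 1 nanometre.
sigma_x = 1e-9  # metres

# Rearranging sigma_x * sigma_p >= hbar / 2 gives the floor on sigma_p.
sigma_p_min = hbar / (2 * sigma_x)
print(f"sigma_p >= {sigma_p_min:.3e} kg*m/s")  # ~5.273e-26 kg*m/s
```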