Let's stick with Edwin Hubble for a second. While the 1920s roared past and the Great Depression limped by, Hubble was performing groundbreaking astronomical research. Hubble not only proved that there were other galaxies besides the Milky Way, he also discovered that these galaxies were zipping away from our own, a motion he called recession.
To quantify the velocity of this galactic movement, Hubble proposed Hubble's Law of Cosmic Expansion, aka Hubble's law, an equation that states: velocity = H × distance. Velocity represents the galaxy's recessional velocity; H is the Hubble constant, the parameter that indicates the rate at which the universe is expanding; and distance is the galaxy's distance from the observer — for Hubble, our own Milky Way.
Hubble's constant has been calculated at different values over time, but the current accepted value is 70 kilometers/second per megaparsec, the latter being a unit of distance in intergalactic space [source: White]. For our purposes, the exact number isn't so important. What matters most is that Hubble's law provides a concise method for measuring a galaxy's velocity in relation to our own. And perhaps most significantly, the law established that the universe is expanding — with the galaxies' motions tracing back to the big bang.
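The arithmetic behind the law is simple enough to sketch in a few lines of code. This is just an illustration using the 70 km/s per megaparsec figure quoted above; the function name and the example distance are made up for the demonstration.

```python
def recessional_velocity(distance_mpc, hubble_constant=70.0):
    """Hubble's law: velocity = H x distance.

    distance_mpc: the galaxy's distance in megaparsecs.
    hubble_constant: km/s per megaparsec (70 is the accepted
    value cited in the article).
    Returns the recessional velocity in km/s.
    """
    return hubble_constant * distance_mpc

# A galaxy 10 megaparsecs away recedes at roughly:
print(recessional_velocity(10))  # → 700.0 km/s
```

Notice that doubling the distance doubles the velocity — the farther away a galaxy is, the faster it appears to recede, which is exactly the pattern Hubble observed.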