We all make mistakes. Sometimes, if you play your cards right, they can become valuable learning opportunities. It's called "human error" for a reason; even the best of us leave a "t" uncrossed or an "i" undotted now and then. Such is life.
Before trying to correct a blooper, gaffe or snafu (did you know a thesaurus makes a great gift?), it's usually a good idea to find out what went wrong in the first place.
The size of the error is a key detail. How badly did you miss the mark? Was it a close shave or wildly off-base?
Picture a violinist in a philharmonic orchestra. On the night of a big concert, he misses an important cue and plays some notes too late. If he missed the cue by half a second, it might not be a huge deal. But if he missed it by half a minute, that's a different can of worms.
When there's a difference between the value you expected and the value you actually got, and you express that difference as a percentage of the expected value, it's called a percent error or percentage error. To calculate it, subtract the expected value from the actual value, take the absolute value of that difference, divide it by the expected value, and multiply the result by 100.
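As a quick sketch of that calculation in Python (the function name and the thermometer example are our own illustration, not from the article):

```python
def percent_error(expected, actual):
    """Return how far `actual` deviates from `expected`, as a percentage."""
    if expected == 0:
        raise ValueError("expected value must be nonzero")
    return abs(actual - expected) / abs(expected) * 100

# Example: a thermometer reads 99.5 degrees when the true value is 100 degrees.
print(percent_error(100, 99.5))  # roughly 0.5, i.e. a 0.5% error
```

Taking the absolute value is the common convention, since percent error usually reports the size of the miss rather than its direction.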
Today, we're going to take the mystery out of calculating and reporting percent error correctly and show you how to use it in real life.