History
The German psychologist William Stern formulated the basic definition of IQ in 1912, defining the intelligence quotient as the ratio of an estimated "mental age" to the actual chronological age, multiplied by 100:

IQ = (mental age / chronological age) × 100
For instance, if a ten-year-old boy has the intellectual capabilities of a thirteen-year-old, his IQ equals 130 (100 × 13/10).
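Stern's ratio can be restated as a small function. This is only a sketch of the arithmetic described above; the function name `ratio_iq` is ours, not a standard term:

```python
def ratio_iq(mental_age: float, chronological_age: float) -> float:
    """Stern's 1912 ratio IQ: mental age over chronological age, times 100."""
    if chronological_age <= 0:
        raise ValueError("chronological age must be positive")
    return 100 * mental_age / chronological_age

# The example from the text: a ten-year-old with the abilities of a thirteen-year-old.
print(ratio_iq(13, 10))  # → 130.0
```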
Testing uses tasks grouped by the age at which individuals are, on average, able to complete them. Mental age is then determined from the most complicated tasks that the tested individual is able to adequately complete.
An IQ in the range of 90 to 110 is considered average.
Stern's equation, however, only makes sense for children; for adults, a derived quotient known as a deviation IQ is used, which compares the level of an individual's mental abilities with the mean level of the population. Approximately 50% of the population has an average IQ (i.e. 90–110). Roughly 13% score between 110 and 139, and about 1.5% of the earth's inhabitants score at the genius level. Low intelligence is defined by scores in the 80–89 range. Scores below 70 were historically labeled, in descending order, moronity, imbecility, and idiocy: morons were considered teachable and trainable, imbeciles trainable but not teachable, and idiots neither teachable nor trainable. The fact that you possess a certain level of intelligence, however, does not necessarily play a significant role in your life; blue-collar workers, for example, can have an IQ over 125.
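The deviation IQ places a raw score on a scale relative to the population mean. The text does not specify the scaling, so this sketch assumes the common modern convention of mean 100 and standard deviation 15 (used, e.g., by the Wechsler tests); under that assumption the normal distribution reproduces the roughly 50% average band cited above:

```python
from statistics import NormalDist

def deviation_iq(raw_score: float, population_mean: float, population_sd: float) -> float:
    """Deviation IQ: map a raw test score onto a scale with mean 100 and SD 15.

    The mean-100 / SD-15 scaling is an assumption (a common modern convention);
    the text itself does not state which scaling its figures use.
    """
    z = (raw_score - population_mean) / population_sd
    return 100 + 15 * z

# Share of the population falling in the "average" 90-110 band under this scaling:
share = NormalDist(100, 15).cdf(110) - NormalDist(100, 15).cdf(90)
print(round(share, 3))  # → 0.495, close to the ~50% cited in the text
```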
IQ measurement has developed over the years, and you have just been introduced to one of the most exact IQ tests. Because some things have not changed over all those years, however, do not be surprised if you are asked to enter your age during the course of this test. This is only to ensure that the resulting score is as exact as possible. Good luck!