In the 1980s, James Flynn found that, on average, human intelligence quotient (IQ) scores increase by 3 points every 10 years. Known as the “Flynn effect”, this rise equates to roughly 9 points per generation (assuming a generation of about 30 years) and means that someone with an average IQ today, scoring some 30 points above the norms of a century ago, would have been considered a genius back then. But why? What caused these increases in IQ?
The onset of the industrial revolution seems to be the biggest factor, affecting IQs in myriad ways. In particular, changes in formal schooling have likely played a big role, not only because more and more children have attended school, but also because learning methods have changed. Where schooling was once built around rote learning, promoting a very utilitarian approach to knowledge acquisition, in the West at least it now demands more abstract thinking, focusing less on acquiring knowledge and more on processing it.
Such differences in schooling have also led more people to pursue cognitively demanding professions, including tech-related work, teaching, medicine and law, in which higher IQs are beneficial. For example, while in 1950 only 12% of Americans had some tertiary or post-high school education, today the same figure stands at 52%.
Evolving concepts of leisure time may also be responsible. Whereas leisure was once mainly seen as a chance to relax and recover from weekly duties, more mentally taxing activities are now considered an important part of it, from playing video games to rock climbing and traveling. The added mental stimulation from these activities may have passively inflated IQ scores along the way.
So does all of this mean that we are now smarter than our grandparents? Not necessarily. According to Flynn, rather than becoming “smarter” or “more intelligent”, our brains have simply become more modern: across different periods, different mental functions have gained and lost priority. For example, although map-reading was a highly important skill for Americans driving cars in 1950, the advent of Google Maps has rendered that skill, and its corresponding “measure of intellect”, practically obsolete.
Another good example appears in the subtests of the IQ test and the gains made within each. As modern society generally demands more abstract thinking, average scores on the subtest known as “similarities”, which asks questions such as “What do cats and dogs have in common?”, increased by 25 points between 1950 and 2000. The arithmetic subtest, by contrast, saw only minute increases, likely because, rather than working out simple math problems in our heads, we now tend to use calculators.
In fact, such results in IQ subtest scores have led some to suggest that we may even be hitting “peak intelligence”. Since the 1990s, for example, IQ scores in Finland, Norway and Denmark have been falling by an average of 0.2 points per year. Whether this is due to less stimulating education practices, or simply to IQ tests no longer measuring the mental skills that matter today, is up for debate.
To conclude, the dramatic rise in average IQ scores over the last century does not mean humans have gotten smarter so much as that we have become more modern. The modern world has different cognitive priorities than earlier times, so different neurological muscles have been stimulated and developed, while the need for others has been reduced by technology and more advanced societies.
Sources: BBC, Smithsonian Mag, European Scientist, Big Think and Merion West