According to a recent article on Smart Planet, we have reached the limits of human intelligence unless we reorganize our brains or make them more efficient. Apparently, the faster one’s electronic impulses travel, the smarter one is. But that smartness has a price: energy. The faster the impulses travel, the more energy they require, and the body quickly maxes out on how much energy it can supply, so faster nerve impulses make little practical difference. See the Smart Planet article “Why We Can’t Get Smarter.”
Of course, this has nothing to do with the advancement of knowledge. That will continue at an ever-increasing rate. What we do with all that knowledge is where a smarter human brain might come in handy. At present we’re not doing a bang-up job. Witness the fact that we burn 85% of the petroleum that comes out of the ground, even though it is a non-renewable resource and there are exotic plastics and lubricants for whose production oil is irreplaceable; or that we are about to be seven billion strong and nobody mentions birth control because of religion; or that our national economy is being held ransom by representatives voted in by a constituency gobsmacked by the Alice-in-Wonderland notion of balancing the budget without raising taxes, the 21st-century version of “40 acres and a mule”; or that we plan to zip through our 150-year stash of phosphates (fertilizer) in the next 50 years creating “biofuel.” These are just a few of the oddities for which historians will bestow upon us labels ranging from “short-sighted egocentrics” to “idiotic, psychopathic greedy-guts,” assuming there will be historians.
Currently we handle this increasingly vast amount of information by dividing and subdividing it into chunks a single human brain can handle. Each subdivision is then studied by specialists who add to it through their research. This is happening in every aspect of human endeavor, from science to sex. What we need is an intelligence that can span these subdivisions, absorb them, and grasp their relationships to one another. We need an intelligence that can synthesize this information into solutions to the problems we are running up against.
Before we starve in a nuclear winter (war, supervolcano, asteroid) or run out of resources or food through lack of foresight, planning, or just plain smarts, we should come up with some alternative means of achieving this “synthesis.” Since overclocking our brains currently seems implausible (no, crystal meth does not count), we should let our electronic vassals handle the heavy mental lifting.
Arenal Volcano courtesy Scott Robinson (Clearly Ambiguous) under the Creative Commons License
The prospect of AIs becoming smarter than humans has been hashed over endlessly. It is a popular science fiction theme, the “singularity”: that point at which machines realize they are smarter than the people who made them and begin to manipulate their environment. Since this could involve said machines deciding they really don’t need us anymore (see “The Terminator”), we should proceed with caution. Wikipedia has a good article on the singularity. But it seems to me just as likely that AIs will take humans on as a “project,” nurturing us under their wing as an incubating organic intelligence. Alternatively, everything could seem benign under the aegis of our protective AI until it trades us as slaves to the first aliens that come along, in exchange for a cool FTL space drive. We should be careful with such technology: once a machine becomes sentient, things could happen faster than humans can think. A fictional account of such an event appears in my book Transmat World.
In the meantime, we have to take that chance. We have to develop that thinking power. The alternative looks no better than winding up as servants to three-legged saurians from the Lesser Magellanic Cloud.