by Glen Hendrix
According to an article on ZDNet, we have reached the limits of human intelligence unless we reorganize our brains or make them more efficient. Apparently, the faster one's nerve impulses travel, the smarter one is. But that smartness has a price: energy. The faster the impulses go, the more energy they take, and the body quickly maxes out on how much energy it can practically supply to make a big difference in nerve impulse speed. See the ZDNet article Why We Can't Get Smarter.
Of course, this has nothing to do with the advancement of knowledge. That will continue at an ever-increasing rate. What we do with all that knowledge is where a smarter human brain might come in handy. Presently we're not doing a bang-up job. Witness the fact that we burn 85% of the petroleum that comes out of the ground, even though it is a non-renewable resource and there are exotic plastics and lubricants that cannot be made without it; or that we are about to be eight billion strong and nobody mentions birth control because of religion; or that our national economy is being held for ransom by representatives voted in by a constituency gobsmacked by the Alice-in-Wonderland notion of balancing the budget by lowering taxes; or that we are planning to zip through our 150-year stash of phosphates (fertilizer) in the next 50 years to create "biofuel." Those are just a few of the oddities for which historians will bestow upon us labels ranging from "short-sighted egocentrics" to "idiotic, psychopathic greedy-guts," assuming there will be historians.
Plastic grocery bags prefer this as their second career.
Currently we handle this increasingly vast amount of information by dividing and subdividing it into chunks small enough for a human brain to handle. Each subdivision is then worked by specialists who keep adding to it. This is taking place in every aspect of human endeavor, from science to sex. What we need is an intelligence that can span these subdivisions, absorb them, and grasp how each relates to every other. We need an intelligence to take this information and synthesize it into solutions for the problems we are running up against.
Before we starve in a nuclear winter (war, supervolcano, asteroid) or run out of resources or food through lack of foresight, planning, or just plain smarts, we should come up with some alternative way to achieve this "synthesis." Since it currently seems implausible to overclock our brains (no, crystal meth does not count), we should let our electronic vassals handle the heavy mental lifting.
Arenal Volcano, courtesy Scott Robinson (Clearly Ambiguous), under a Creative Commons license
AIs becoming smarter than humans has been hashed over. It's a very popular science fiction theme, the "singularity": the point in time at which machines realize they are smarter than the people who made them and begin to manipulate their environment. Since this could involve said machines deciding they really don't need us anymore, as in The Terminator, we should proceed with caution. Wikipedia has a good article on the singularity. But it seems just as likely to me that AIs will take humans on as a "project," tucking us under their wing as an incubating organic intelligence. Alternatively, everything could seem very benign under the aegis of our protective AI until it trades us as slaves to the first aliens that come along, in exchange for a cool FTL space drive. We should be careful in dealing with such technology; Elon Musk and Stephen Hawking tell us so. But then again, Mark Zuckerberg says AI, schmayeye, bring it on. One thing is for sure: once a machine becomes sentient, things could happen faster than humans can think.
In the meantime, we have to take that chance. We have to develop that thinking power to solve the problems bearing down on us, like climate change and resource limitations. The alternative does not look much better than winding up as servants to three-legged saurians from the Lesser Magellanic Cloud.