(This is Part 1 of However-Many-I-Feel-Like-Writing)
I know what you’re thinking… Another frigging age, already!?
Afraid so. It can’t be helped. It’s been foisted upon us like my pink umbrella was foisted upon me by my mom.
“Don’t be a hero,” she said. “It’s pouring out there. And don’t worry about the color. I hear pink is now fashionable for men.”
Sigh, okay mom. The worst part is that it’s really a great umbrella. How did I not know how great umbrellas truly are? They keep you, like, dry…
But I digress. Back to cognition.
Yes, it’s the “Age of Cognition” because cognition memes are everywhere these days. Books, articles, TV shows, social media, movies, workplaces, you name it. You can’t escape them. They’re like elephant poo after the circus leaves town, like white cat hair on your black jacket just before the big meeting, like John McCain on Sunday morning shows. They’re just inevitable.
There are two prongs to the Age of Cognition. Prong one concentrates on the pudding in our noggins, that mysterious goo responsible for logic and love, math and poetry, atomic bombs and nerve gas, Laurel and Hardy, pet rocks and the iFart apps.
Prong two delves, like a wet finger in the socket of uncertainty, into computer science. It covers everything from ditsy Siri to dangerous HAL 9000 to jeopardizing Watson and the adorably precocious siblings ConceptNet 4 and ConceptNet 5.
And then there’s that place where the two prongs converge (or diverge? can you picture this weird fork?). It’s where people become machines (or vice versa) and where the wet dreams of Singularity zealots reside, determined to one day be downloaded like the latest version of Fantasy Fantasy by the great-great-grandkids. Or maybe it’s where we all have chips in our brains or high-tech hats. Whichever boosts our IQs by, like, a bajillion.
But seriously, it’s hard to overstate the sheer number of threads that make up this Age of Cognition. Here are just a few of the more accessible axons weaving together in bizarre ways to make up the Age:
BRAIN Initiative: Yes, the entity that can’t even balance a spreadsheet, the U.S. government, is going to figure out the brain. It’s laid out a massive research effort to “revolutionize our understanding of the human mind and uncover new ways to treat, prevent, and cure brain disorders like Alzheimer’s, schizophrenia, autism, epilepsy, and traumatic brain injury.” Awesome. Better than a moonshot, right? Except, well, why is DARPA the first government agency mentioned on the website?
Brain Games: A reality show about the brain? Have we really gone that far to the geeky side? Housewives-watchers beware! This one’s a tad more highbrow, showing you “what is going on in there with an intricate series of interactive experiments designed to mess with your mind and reveal the inner workings of your brain.” Has there ever been another age when this stuff wouldn’t have been DOA? I doubt it. It’s a 21st Century thing.
Neuroleadership: Management is the type of field in which consultants and researchers will latch onto fashionable ideas like a cat on a new screen door. A lot of weird hybrid ideas get concocted this way, some of which are useful. I’m not sure whether neuroleadership is among them, but I do know people like David Rock have made a name for themselves in this niche. How else would we know, for example, to beware the dangers of dopamine bursts (no, not a new movie candy). Neuroleadership has infiltrated everywhere. And that’s because, repeat after me, we live in the Age of Cognition (see, if we repeat it enough, it’ll surely become true, another lesson we can learn from the management sciences).
Neurolearning: Here’s a word I thought I’d made up as a joke (after all, can you have learning where there are no neurons?) but no. It got a whopping 26,100 hits on a Google search, including its very own website. Groovy. Despite the repetitive redundancy behind the name, I do think that knowing how cognition works can help learning become more efficient. That should hold us until we can upload lessons via fiber optic cables like Neo I-know-Kung-Fu Anderson.
Monkeying with Mind Control: Speaking of technologies that make you go, “Whoa!”, we’d be remiss if we didn’t mention monkey mind control. Okay, if you have kids in the room, make sure they aren’t reading over your shoulder, cuz this is creepy stuff: “They used a machine to convert the brain activity of one monkey, called the master, into electrical impulses applied to the spinal cord of a sedated animal, called an avatar, which moved in response to the stimulation.” One thing is clear to me: we’ve got to stop our biomedical researchers from watching so many Gorilla Grodd cartoons.
The Mr. Spock-Approved Brain: “Spock, I’ve got a hankering to drive our ship into the sun to impress our hot new ensign.” “I’d advise against it, Captain. It is my considered opinion that you should exercise your logically constructed human cerebral cortex rather than rely on thinking with any other part of your anatomy.” Yes, it’s true. Our cerebral cortex, the stuff that gives us language and consciousness, is neatly organized into eight distinct neural networks. It turns out that God really is an Exquisite Carpenter.
The Brainy AI Debate: Think of all the brain folks, from programmers to philosophers to neurologists, as hunks of your brain quarreling with itself: “You should go to the gym; no, wait, you should work on that editing project; no, maybe a nice nap to get back on track; c’mon, get to the gym already, you schmuck!” So, what do the brain folk quarrel about? Well, one big conflict is this: Can humanity build a machine that mimics human thinking (thus passing the Turing Test) by leveraging engineering lessons gained from decades’ worth of computer design? Or is it better to study and then reverse engineer the crap out of our brains and then build a mechanical replica? Both seem like good ideas for creating intelligent machines (assuming we should build such beasts in the first place). But, up to now, neither one has worked. If they were two Stooges, Moe would knock their heads together.
Okay, enough for now. We’ll come back to the Age of Cognition later to further explore that second prong in greater detail. Should be electrifying.