As I sit here at my desk I am surrounded by five screens. The World Cup match between Argentina and the Netherlands is unfolding silently on my iPad while my laptop powers two screens: one holds Evernote, where I am composing this essay; the other a Chrome window set to look for cheap car rental rates for a trip later this month and to scan for any incoming emails. To my left sits my Wacom Cintiq panel, where I am working on some new drawings. My cellphone rests somewhere around here; I'll find it if it rings. Pandora plays piano solos in the background, though I'm not sure where the sound is coming from. This is my environment, one that I have come to accept as normal but in which I am still not entirely comfortable.
I grew up in an era in which acquiring information and advancing knowledge required significant real-time effort. Whether you labored in a university or a corporate research and development laboratory, you wandered the stacks of the library. You paged through journal articles. You conducted research by interacting with subjects or samples that were often in the same room or lab. You applied critical models or coded data, ran tests, and examined results. Then you looked for patterns and insights that allowed you to draw defensible conclusions or generate new hypotheses. Gathering information and advancing knowledge was hard work, and behind it always lurked the often unspoken assumption that the pursuit of knowledge was, in itself, an honorable goal.
Now more information than I could ever hope to acquire on my own shimmers across these screens that surround me. If I pose any hypothesis in the form of a question and enter it into Google Scholar or the more specialized search engines available online through the university library, links to hundreds, if not hundreds of thousands, of studies will flash onto my screen, providing, if not answers, then seemingly countless rabbit holes into which I can stumble and while away an afternoon, a week, or a career. In short, the pursuit of knowledge has become largely automated, as much programming as reflection. Where once we could dip a tentative toe in a particular pool, today we stand on the banks of rivers at flood stage, deciding whether to leap in. The sheer volume of information seems to demand levels of specialization undreamed of in my undergraduate days. The Renaissance notion of a man who could aim realistically at learning the sum of human knowledge is more fanciful than our wildest science fiction.
It is certainly a "brave new world." But, as always, the dynamic between knowledge and culture can be intense and unexpected. The accumulation of new knowledge, by definition, mandates change. And change is, as often as not, uncomfortable, even frightening. And society, particularly those who benefit most from current norms, reacts poorly to change or to information that somehow falls afoul of the current notion of truth. Historical negative responses to new or unexpected information have run the gamut from burning witches to burning books. It would be nice to think it was all in the past. The streams of information flowing across my screens assert that is not the case.
A variety of news sources reveal that fundamentalists of every faith would have us turn the clock of human knowledge back several thousand years, stuffing, for example, the miraculous reality of evolution into the pleasant, fictional poetry of Genesis, or another of the myriad creation myths from our colorful and blended heritage. As I write this, a militaristic fundamentalist Sunni Islamic group known as ISIS (the Islamic State of Iraq and al-Sham) is, apparently, attempting to recreate a historically and theologically pure Sunni entity in the Middle East, one that would perpetuate Sharia law, including the subservient place of women, supposedly demanded by centuries-old assumptions drawn from interpretations of the Quran. Here in the rural South some Pentecostal believers still handle poisonous snakes because one translation of the Bible says, "They shall take up serpents; and if they drink any deadly thing, it shall not hurt them; they shall lay hands on the sick, and they shall recover." I do not mean to single out Islam and Christianity; they come to mind because they get the most attention in the media flow here on my screens. Choose the extreme sliver of any religious or political organization and you are likely to find an affection for the past and an unwillingness to embrace any new knowledge that conflicts with time-honored myths that have fossilized into truth.
As fundamentalism gains influence in the American political landscape, perhaps most obviously within the Republican Party, the traditional conservative standard-bearer in American politics, whose members are now attacked as "not conservative enough" or as "Republicans in Name Only" ("RINOs"), opportunistic politicians seek to curry favor with this increasingly vocal, well-funded, and anti-intellectual portion of the electorate. One result, at least in public rhetoric and in policy, is a growing distrust of new knowledge and of education in general. This creates real problems for America's intellectual community, those who still believe that the pursuit of knowledge is, in itself, an honorable goal.
You see, in America, financial support for intellectual endeavors has traditionally sprung from three sources: public, tax-supported institutions; private institutions endowed by a wealthy elite; and multinational corporations purchasing research that will result in new products for expanding markets. That model seems to be gaining traction internationally, leading to a series of global conflicts between fundamentalist movements of various stripes, which oppose education and new knowledge for religious or political reasons, and multinational corporate entities or state-run economies, which seek to direct human intellectual activity down paths that advantage their financial or political objectives. Neither side fights fair, allowing for the moment that the notion of a "fair fight" need not always be an oxymoron.
OK. Now shift gears for a moment. Since the beginning of human society we have stared into the heavens wondering if we are alone. In the last century or so we have peered further and further into the nooks and crannies of the universe, discovering galaxies, stars, and planets seemingly without number; there is even good evidence that our universe may be only one of many. The notion that we are alone becomes absurd. A more interesting question then becomes "Why is no one out there talking to us?"
Think about it for a moment. As a species we are remarkably clever. Our radio telescopes listen to the murmur of the universe; our probes are poised to pierce the heliosphere. We are unwinding the mysteries of the genome. We talk boldly of coming to understand the brain, of poking at the edges of immortality. So we should ask again, "Why is no one out there talking to us?" We can assume the existence of entities far more clever than we: civilizations that span planets and galaxies. There are simply too many galaxies for Homo sapiens to mark the apex of sentient existence. It is more likely that the smarter kids in the class have been watching us for a long time. But it is equally likely that we will never be allowed to sit at their table in the lunchroom until we advance beyond our current adolescent fixations on power and politics, until we stop our schoolyard brawling and behave like adults.
And there is only one path to that admirable goal. We need to acknowledge the depth of our ignorance. We need to learn more, we need to study more, we need to become smarter. We need to realize that a truly healthy culture is one that seeks knowledge with the same ferocious intensity that, as a species, we currently seem only able to bring to the conduct of war. Maybe then the smart kids will be willing to talk to us.