I’ve just moved from Maryland to Edinburgh. I’m here to do a one-year master’s, as I have ended up explaining to countless customer service people. Naturally they ask, “A master’s in what?” I say “cognitive science” apologetically, knowing it sounds vague and perhaps technical. And given the way people blink at me in response, I think they agree.
So now I streamline the exchange with a joke. “I’m here studying cognitive science – and hopefully by the end of my year here, I’ll know what cognitive science is!” So far, everyone’s eaten up that line. Which is gratifying – apparently I’m funny in Scotland! But it’s also a bit troubling. If I can’t even begin to define what I’m studying, how can I have good reasons for studying it? What is and why cognitive science?
My path to this field reads like an accident. During my first semester of college, I enrolled in an introductory cognitive science class mostly because it was at a convenient time. I wanted to study literature. But then my first lit class turned out to be a bust – filled with science majors forced into a humanities requirement and silent during discussions on the readings. Though I’m forever grateful to that class for assigning De Profundis by Oscar Wilde, my enthusiasm for studying literature waned.
On the other hand, the cognitive science professors taught us how our visual system converts the two-dimensional images that land on our retinas into the complex 3D world we actually perceive. I loved that. Not because it was any more beautiful than Oscar Wilde berating an ex-lover, but because the details of the visual system concretely illustrate how point of view is written into our worlds. If I stared at the kid sitting in front of me in class, my eyes were not just taking in something exactly as it was. Translation was required from my eyes to my brain to produce this image: dude in gray sweatshirt.
And that doesn’t even scratch the surface of all the other stuff that would have to go on in my head for me to decide if that dude looked familiar, approachable, attractive, etc. But to me, that seemed exactly the goal of cognitive science: unpacking the stuff going on in my and other people’s heads. As someone always fascinated with human thought, a perpetual over-thinker, how could I not fall for that?
Now admittedly, my unofficial description of cognitive science makes it sound like a leech on other disciplines. Psychology exploded in Western culture via Freud, who obsessed over the inner workings of the mind. The field has since worked hard to denounce Freud’s legacy (and I don’t entirely blame them – no one dare tell me I have penis envy). Still, as the study of human behavior, psychological research inevitably asks questions and makes inferences about thought.
Then there’s philosophy, which has probed the nature of perception, reality and systems of thought for, well, a really long time. And what about neuroscience? If our minds are made possible by our brains, why don’t we just start with the brain? Why should cognitive science be its own thing?
There are many possible responses to that question. A gentle response is that cognitive science brings together those other disciplines toward common goals, making it a sort of hub. A firmer response is that cognitive science has a unique underlying belief: that the mind should be thought of as a computer. The idea is that all our thoughts come from processes acting on existing representations — like an innate concept of language — the same way a computer processes stored information.
That’s the notion that wins cognitive science the most attention, because it leads most directly to all the futuristic stuff about robots and artificial intelligence and advances in human prosthetics. Those developments are exciting (if sometimes terrifying). They also lend themselves to the most mind-boggling applications, like technology to aid people with severe nerve damage or to communicate quite literally with our thoughts.
Beyond that, though, I think the whole mind-as-computer analogy has broader use and appeal. It might sound dry or depressing, suggesting that thinking beings are just computers that bleed (some people definitely see it that way). But I take it more as a framework encouraging us to break down all the overwhelming questions about our minds – i.e., WHAT IS GOING ON IN THERE?? – into more approachable bits and pieces, such as, what mental processes are involved in learning how to count?
Maybe that’s how I should explain my degree: “It’s like, how kids learn to count.” Once I dig into research, it will indeed get that specific. But the funny thing is, I’m not even sure I will continue with research after this year. What I am sure about is that it is infinitely useful to me to study how people end up thinking the things they do. And I’ll figure out something useful to do with that.
Meanwhile, hello, Edinburgh! Time to start looking to my right for traffic. And maybe there’s a study to be done on the fact that I think best while nursing a pint.