Last month, I graduated from Skidmore College after four years. I ended my time there by writing a 27,000-word undergraduate thesis (above) as the culmination of the major I designed (Epistemology of the Social Sciences).
I didn’t arrive at Skidmore intending to create my own major, or even intending to study what I ended up majoring in. In fact, I arrived at Skidmore fully intending to study Economics. I had had only a brief exposure to Economics in high school, but what I had seen persuaded me that Economics offered the most powerful and elegant set of tools for understanding the world. It seemed to provide powerful predictive ability about human and corporate behavior.
After two years, my thought had shifted radically. Here’s why.
During my first couple of semesters, I learned that Economics functioned by analyzing markets: how the agents within a market behaved. I also learned that the behavior of agents could be predicted by calculating their monetary incentives. While people may not be rational individually, in aggregate people will be rational – and thus the average person, “Homo Economicus,” would be rational. Macroeconomics introduced a few wrinkles, sure, but that was to be expected when dealing with a complex system.
This was great. It was perfect, in fact – just what I had been looking for: a rational, extensible, and elegant model for understanding people’s behavior.
It also didn’t last.
The first surprise came during my second semester, when I took an Introduction to Sociology course; I also read Berger and Luckmann’s The Social Construction of Reality.
The entire premise of sociology is that (i) social structures exist, and (ii) social structures alter people’s behavior. While the precise fashion in which social structures are either created or affect people remains hotly debated within sociology, sociologists have amassed a reasonable volume of empirical evidence supporting their existence and impact.
The problem is that Economics doesn’t say anything about social structures. It can deal with a bit of organization between agents, but not much more. Now, this in itself isn’t fatal. It could be that economics limits itself to economic behaviors that occur “in the market,” and that social structure has a marginal effect on market interactions.
I also took an introduction to philosophy course. While I’d previously read most of the materials in the course, it sensitized me to Descartes’ problem of knowledge and Hume’s problem of induction.
In my third semester at Skidmore, four of my classes raised additional concerns. My introduction to psychology course raised further problems for my understanding of economics: while economics assumes that at least the average of individual behaviors is rational, psychology has a wealth of experimental evidence suggesting that people are highly irrational. The TED conference has featured some of the applicable research. Dan Pink and Tom Wujec both mention the most important finding: adding or increasing monetary incentives can decrease performance. Another, even more disturbing example was featured in a recently published blog post, which discusses how adding a fine that is less than the implicit social cost of bad behavior can increase that bad behavior! For instance, introducing a $3 fine for late parents at a day care center dramatically increased the number of parents who came late.
Not only does this shake some of the foundations of economics, it provides a whole new level of validation for sociology. The very existence of an implicit social cost suggests the existence of a psychosocial structure!
This (disturbing) information was backed up by another class, Individual in Society. Ostensibly a sociology course, it drew heavily from (social) psychological research to examine how society affected individuals, and vice versa. The result, of course, was yet more information suggesting that economics had it wrong.
Furthermore, another of my classes – Studying Student Worlds – immersed me in qualitative data analysis, specifically participant observation. I ended up writing a paper on student use of a physical space (the campus center). The experience introduced me to the depth possible with qualitative research, further undermining my trust in economics with its hypothetico-deductive and statistical methods.
Additionally, in the one economics course I took – Microeconomic Theory – I chose to write my semester essay on the subprime mortgage market. Fortunately, I had the advantage of researching the market just as the subprime mortgage crisis was beginning to emerge, presaging this most recent financial crisis. The experience did not, shall we say, reaffirm my confidence in the methods employed by economists.
Nor was my experience isolated to economics; in each new discipline, not only did I find contradictory theories and divergent methods, but I found that each discipline contradicted others at multiple levels.
In other words, by the end of my third semester at Skidmore College, I didn’t know where to turn.
When I asked my teachers, they were very understanding and quite helpful. Unfortunately, their answers reduced to “go to graduate school.” Digging a little deeper, that solution would only solve half the problem – and not, to my mind, the important half. Graduate school really involves intensively examining the literature – which means negotiating with each (contradictory) theory and forming your own evaluation of it. The difficulty is that such a project does not deal with contradictions across disciplines; only within them.
Of course, I had to do something – that is, I had to declare a major. Economics, I thought, had appealed to me because I tended to think of the world in similar ways before learning Economics: in other words, it was “intuitive” to me. I could major in Economics, but I would do little more than perfect that way of thinking and learn about some of the conclusions which that way of thinking leads to. I wouldn’t challenge my existing ways of thinking, or get “closer to the truth.”
During my fourth semester, I spent a substantial amount of time thinking about the point of education. I was influenced partly by a class called Citizen Studentship, a pedagogical experiment. The class is predicated on the idea that having a teacher changes the nature of the educational experience, and that this has some drawbacks. In Citizen Studentship, the teacher only showed up if we asked him to – as students, we taught each other. The topic, and the exploration, was the nature of education, and the idea that one needs to take ownership of one’s education.
The other issue which drove my thinking is what I came to call my “integrity problem.” Let’s say I was employed as a Market Analyst, and my manager asked me to produce a report on a market segment. How could I employ the tools in, say, Economics while believing that there is a distinct possibility that the techniques are misleading, that the results are wrong? It would, quite frankly, feel like lying.
A trivial extension: giving any recommendation without knowing the limitations of the underlying knowledge – without understanding where that conclusion applied, and when it would begin to break down – would feel just as dishonest.
Added to the mix was my interest in technology, and how it was changing society. Would the education I received at college even be applicable in 20 years? Given how the world is changing, what skills do I want to develop in college?
These questions and concerns led me down a train of thought prompted by “the knowledge problem.”
The Knowledge Problem
Thirty years ago, the internet didn’t exist. Computers were not in common use – either commercially, or by the common person. The world has changed. Not only are nearly two billion people accessing the internet, but we’re facing a data singularity.
Except we’re not just facing a mass of data: we’re facing an excess of information. Over twenty years ago, the New York Times reported that scientists could not keep up with new research. It’s only gotten worse in the intervening years; there’s just too much information, and no one person can know all of it. The response has been specialization: a scientist focuses their energy on one very small area and learns everything they can about it. It’s a radical change from the days of Francis Bacon and Leonardo da Vinci, when scientists (well, philosophers) seemed to make discoveries in every field of human knowledge. Except these areas of specialization have been narrowing at an ever-increasing rate. Nowhere is this more obvious than in mathematics, where the American Mathematical Society classifies the field into over six thousand different subdivisions.
So specialization is increasing. Isn’t that a good thing? Specialization, as we all know, makes people more productive. Adam Smith, in the first chapter of the first book of the Wealth of Nations, waxes lyrical about the benefits of the division of labor. Doesn’t focus make the world better?
I was concerned with two problems arising from specialization. The first, premature specialization, is the tendency to specialize before understanding the alternatives. The second is a touch more insidious, and relates to the expansion of information that has driven specialization. Briefly: what if there’s duplication of work across specialties?
Allow me to take these in order. Certainly, there are returns to specialization. But there are also consequences; a specialist may be much more effective within their domain, but much less effective outside of their domain. Part of the problem is that specialists tend to know how to do one thing very well. But – as the saying goes – if all you have is a hammer, then everything looks like a nail.
It seems obvious, then, that it’s important to determine when a particular specialist is most likely to be effective. Sure, this may not be a problem during day to day work, but whenever you’re looking to do something new, something where the problems themselves are not well-understood, specialization may hamper one’s effectiveness. I hate to point to the late financial crisis again, but – well, look at the consequences of economists applying their highly specialized set of tools to new areas. Not only did they get it wrong, but they misunderstood the nature of the problem – they were incapable of recognizing that certain things (which were not accommodated by their toolbox) were important.
The second issue is that of duplication of work. This may seem unimportant, but suppose, for example, you’re a PhD student writing your dissertation. You pick a topic, perform the literature review, and write your conclusions – only to find out later that there’s an entire body of research in a separate field that not only came to your conclusions, but also refuted them, 20 years ago.
Time-displaced duplication of work is inefficient. It’s also, quite possibly, less effective. If two different specialties are concerned with the same problem, does it not make sense for them to collaborate? Or, at the least, to be aware of each other’s research?
Unfortunately, highly specialized people use highly specialized terminology (jargon), and different specialties can use different words for precisely the same problem. Even with tools such as Google, search engines are (at least so far) incapable of returning results for things you didn’t search for.
I wanted a different solution – I didn’t want to specialize prematurely, and then find my understanding limited.
Recasting the Problem
The knowledge problem, as defined, is the expansion of information to the extent that no one individual can have a handle on it. But if you can’t know it a priori – that is, before you need it – can you know it after you need it? Or: can you determine what you need to know and then go find it out, as opposed to learning everything first and then looking for applications?
Undoubtedly, it’s a twofold issue (i.e. the first rule of ignorance is that you’re ignorant of what you’re ignorant of, which means you need some knowledge to even begin to search).
I recast the problem as finding and evaluating information.
It used to be that finding was the bigger problem. Thirty years ago, all you could do was go to the library and look things up in the index – a time-consuming activity that precluded processing much information. Fortunately, new technology has emerged to take the edge off the knowledge problem – the most obvious, and arguably the most effective, being Google. It’s essentially a form of filtering, where you filter the sum of human knowledge (well, the index) to display only those results you are interested in.
Though the “finding problem” isn’t solved, it’s been ameliorated. But the evaluation problem remains.
How do you evaluate the validity of the information you find; how true, and how applicable, it is? If – as there always are – there are multiple theories, each of which gives different courses of action, how do you choose which one to adhere to?
It’s possible to make an arbitrary choice, such as the theory that supports the actions you would like to take anyway. But that is a highly biased choice, and one which is unlikely to be true. Indeed, as psychologists have demonstrated, humans experience an array of cognitive errors which make that kind of selection highly misleading.
The modern problem, then, is evaluating new information. Whatever education I got out of college, it needed to address that.
Designing a Major
In the end, I came to the conclusion that no single major – or combination of majors – could teach me what I wanted to learn.
Fortunately, Skidmore provided the opportunity to create a self-determined major, a true realization of the liberal arts tradition. It may, in fact, be unique in this regard; I haven’t seen any other colleges providing the opportunity to design your own major.
I set about designing my major. In doing so, I made two assumptions. First, that knowing certain theories was much less important than being able to (i) find, (ii) understand, and (iii) evaluate those theories. And second, that it was possible to do that – evaluate theories – without first becoming an expert in the subject.
I described my goal in my major proposal:
I am motivated by a desire to understand both the sources of knowledge in the social sciences, and also the ‘quality’ – insofar as that can be determined – of that resulting knowledge. I believe that any study of the social sciences must begin with an investigation into the attributes and limitations of knowledge; in a word, its epistemology.
My goal is to be capable of keying into the limitations of knowledge in all aspects of the social sciences, to locate and criticize unjustified conclusions, and to be able to determine what methods are necessary to justify a given conclusion.
I divided my examination of the social sciences into three categories: background, methods, and epistemology. I took courses on the foundations of each discipline, how they had developed over time; learned the methods and techniques they employed; and studied logic, epistemology, and the philosophy of science.
My goal for my self-determined major was learning how to evaluate information; to understand the limitations of knowledge. In short, I set out to learn “how to think” or – to be meta – “how to think about thinking.”
I believe I succeeded in that goal; though not, perhaps, as well as I would have liked.
While I had originally intended to write a thesis applying techniques I learned to each of the three social sciences I examined, I soon learned that simply wasn’t possible.
The notion of “truth” is very slippery indeed, and quite often misleading. The concept of “evaluation” is equally ambiguous, and can mean many different things. And the promise of “objectivity” is one, I believe, that will never be fulfilled.
Thus, my thesis is not a manual for how to think. Nor is it an example of how to evaluate an individual theory. One can become impressively competent at both by studying logic and cognitive biases.
Rather, I wrote a document describing how knowledge develops in the social sciences. I document how social scientists – specifically economists, psychologists, and sociologists – generate and disseminate knowledge in their respective disciplines.
I chose my thesis topic in part as a reiteration of my point about the dangers of specialization. Part of the problem is that, for specialization to work, one needs to work within a very small area. This necessitates taking dependencies on a number of assumptions; assumptions which, for the specialist to be effective, must remain true.
However, as science develops, it frequently throws new light on old topics. New research can contradict old assumptions. One danger of specialization is being unaware not just of what has changed, but what can change.
It is also valuable to understand how science develops as a way to contextualize new advances within science. Journalists routinely publish (misleading) summaries of new research – sometimes before the research itself has been published!
Finally, it is useful to know the “landscape” of science; to have an idea of how the work in each discipline fits together both internally and as part of science.
I encourage you to read my thesis, and would certainly appreciate any feedback upon it.