Jobs of the future!

Computerworld has an article about what (IT) jobs will be like in 2020. Ignoring the patent difficulty of forecasting a decade into the future, allow me to poke a few holes in their account.

There are two broad themes: first, that technology is changing the business landscape, driving it towards cloud-based and mobile solutions.

Second, that people in all categories lack the skills to react appropriately. I’m not going to dwell too long on the stereotyping – it’s pretty basic stuff, such as claiming that there’s a “gap between college and real-world experience” (surprise), that Gen-Xers have too much entitlement, that mid-career workers lack experience with technology, and so on.

There are two issues with this pretty picture. First, it’s so damn broad that it could apply to anyone at any time; second, it over-emphasizes the impact of technology on business.

I know, I know: it’s a terrible surprise from a rag called “Computerworld.”

Business is about two things: money and relationships. While such a simplification is broadly inaccurate, it’s sufficient when talking about technology. The goals for technology are to lower cost, speed up existing processes, move into new areas, and make it easier to connect.

For all that people claim that Facebook, Twitter, LinkedIn, Salesforce, whateverothercompany is changing the way business works – well, they’re full of shit. Facebook makes maintaining existing social relations easier (no more writing letters/emails to update people), LinkedIn does the same for business networking, Twitter (and its cousin, blogs) make publishing your thoughts easier, and Salesforce (along with other automation technology) makes existing tasks easier.

You could say that the purpose of technology is to eliminate drudgery.

Now, it’s very true that people who make a living off of performing drudgery are challenged by this tendency. And it’s true that people who put up with drudgery to accomplish their real goal (like a salesperson who fills out paperwork so that they can sell) don’t like having processes changed on them.

But it isn’t true that the “skills” are changing all that much. People who sell still need to sell; people who manage still need to manage, etc. The communication channels change somewhat, but neither the goal nor the method undergoes any distinct change.

Tools are just tools: they exist to make you more effective. If new technology makes you less productive – well, it’s not a very good tool then, is it? So why the hell are you adopting it?

Review: Practical Thinking


Edward de Bono published Practical Thinking in 1971 and has revised it multiple times since, most recently in 1992.

It’s a charming little book, largely because – despite making some false statements – his advice is excellent, practical, and should improve thinking for almost anyone who reads (and applies) it.

The most interesting part of the book, to me – given I have just completed a major in epistemology, the study of knowledge (sort of “how to think”) – was the advice about certainty. It’s well-known (now) that the feeling of certainty people sometimes have is bonkers. de Bono breaks down why it’s bonkers, but he also provides ways of avoiding the problem.

I’m not going to re-hash his book, in part because he provides an excellent summary at the back of the book that you can reference (and, really, it’s $4).

But the most important takeaway for managers and other “practical thinkers” is de Bono’s discussion of the tyranny of the YES/NO system. It’s a simple insight: if you keep saying “No” to new ideas, the idea you end up with will be the first idea whose answer isn’t “No.” That is, it will not be the best idea; it will be the first mediocre idea. Abandoning the “YES/NO” system when brainstorming is really rather important.
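To make the insight concrete, here’s a toy sketch (mine, not de Bono’s; the ideas and scores are invented) contrasting the serial veto with actually comparing the options:

```python
# Hypothetical ideas with made-up quality scores (higher = better).
ideas = {"idea A": 3, "idea B": 9, "idea C": 6}
threshold = 2  # the bar an idea must clear to stop hearing "No"

# Serial YES/NO: accept the first idea that isn't rejected.
first_acceptable = next(name for name, score in ideas.items() if score >= threshold)

# Comparative evaluation: look at everything, then pick the best.
best = max(ideas, key=ideas.get)

print(first_acceptable)  # idea A - the first mediocre idea that clears the bar
print(best)              # idea B - the idea you actually wanted
```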

If you’ve studied, oh, logic, Quine, cognitive psychology, and the philosophy of science, all this stuff will be old hat (and some of it wrong). If not, I highly recommend it.

Review: The Science of Fear


Today, I read The Science of Fear (2008) by Daniel Gardner. It’s a remarkably well-done book for what it is – namely, a journalist’s (informed) overview of some of the psychological components of fear, and a large number of examples of how people exploit that tendency to fear.

It’s a nice book because he relies on well-accepted psychological research while going into great depth on examples, which helps readers understand that these principles actually mean something.

On the other hand, for those people who were looking for an explanation of how fear works inside the brain – such as myself – it’s a bit lacking. And the reliance on solid, established psychological principles means I didn’t learn any new psychology. Regardless, I enjoyed the read because it was well-written and interesting.

In lieu of a full review, allow me to walk through some of his points. In the course of the book, Mr. Gardner outlines three ways the brain screws up, leading people to irrational fears.

1. The Availability Heuristic

The availability heuristic is a pretty good rule. It’s a general cognitive bias – pretty robust across all humans – and it works like this: people estimate the probability of an event in proportion to how many instances of it they can recall (how “available” it is).

If you do something a lot – work on computers, go hunting, etc. – then over time you build up a battery of experiences. If someone asked you how probable something was, you could reach back into your experience, get a feel for how many times you’ve seen it, and give a pretty good estimate.

The big advantage is that it’s computationally very fast. If you need to make a split-second decision, you want it to be fast.

It’s also extensible: people don’t differentiate between their own experiences (memories) and other people’s (stories). This works out really well if you talk to people who do the same thing you do – say, a bunch of hunters sitting around a fire swapping stories. That way, you can tap into the knowledge of your entire community: if you haven’t yet experienced something, you don’t know how common it is, and hearing stories of others’ experiences can both remedy your ignorance and give you ideas of how to deal with it.

But therein lies the rub. The media specialize in providing stories – really compelling anecdotes – about things that happen. The brain doesn’t differentiate based on sources, so the availability heuristic can be skewed in the wrong direction. People vastly overestimate the risk of terrorism, kidnappings, and murder, but vastly underestimate the risk of car accidents, drowning, diabetes, etc.
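Here’s a toy sketch of the skew (my own illustration, not Gardner’s; every count and rate below is invented):

```python
# Hypothetical true annual risks vs. how often each shows up in recalled
# stories (skewed toward what media cover heavily). All numbers invented.
true_rate = {"terrorism": 0.000001, "car accident": 0.0001, "drowning": 0.00002}
recalled_stories = {"terrorism": 80, "car accident": 15, "drowning": 5}

total = sum(recalled_stories.values())
for risk, count in recalled_stories.items():
    # Availability heuristic: the estimate tracks recall frequency, not reality.
    perceived = count / total
    print(f"{risk:>12}: perceived {perceived:.0%} vs. true {true_rate[risk]:.4%}")
```

Terrorism dominates recall, so it dominates the estimate – even though its true rate is orders of magnitude lower than the mundane risks.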

2. Confirmation Bias

Confirmation bias is an old favorite of psychologists, simply because it explains so much.

It’s pretty simple, actually. Once you believe something, your brain tends to look for other instances of it – confirming instances. It does not, however, look for instances that falsify your belief. Sometimes your brain will even change its recollection of the facts to conform to your current belief (one example is the “rose-colored glasses” effect: you believe the past was better, so you unconsciously modify your memories of the past to match that belief).

But it also means that once you believe something about, say, terrorism, you’ll focus on the positive (that is, supporting) instances – and ignore the others. A year with no terrorist attacks does not affect your belief about the danger of terrorism, even though a year with an attack would – which is illogical. It’s a binary outcome, so one outcome value should be just as good a predictor as the other. The brain doesn’t think so.
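To see why both outcomes carry information, here’s a minimal Bayesian sketch (again mine, not the book’s; the likelihoods are invented):

```python
# Belief that "terror risk is high", updated on one year's outcome.
# All numbers are invented for illustration.
p_attack_if_high = 0.2   # chance of an attack in a year if risk really is high
p_attack_if_low = 0.01   # chance of an attack in a year if risk is low

def update(prior, attack_observed):
    """Bayes' rule: both 'attack' and 'no attack' should move the belief."""
    like_high = p_attack_if_high if attack_observed else 1 - p_attack_if_high
    like_low = p_attack_if_low if attack_observed else 1 - p_attack_if_low
    return like_high * prior / (like_high * prior + like_low * (1 - prior))

print(update(0.5, True))   # ~0.95: an attack raises the belief
print(update(0.5, False))  # ~0.45: a quiet year should lower it, too
```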

A rather insidious effect of confirmation bias concerns the use of statistics. If you believe something, and you come across a statistic (or a story) you disagree with, you’re going to scrutinize it very closely. If, however, you come across a statistic which supports your belief – then, hey, no need to question the source or the methodology; it’s obviously correct. People apply a different level of scrutiny to information that conforms to their existing beliefs than to information that violates them.

3. The Urge to Conform

Conformity has been studied a great deal, and the results are pretty consistent. When people are in a group and a task is difficult, you see more conformity. That is, the lower any one individual’s confidence in their own result, the more willing people are to accept a group consensus. Funnily enough, though, each individual’s belief in the accuracy and reliability of the group consensus goes way up – even though the confidence behind any individual conclusion is low.

Mr. Gardner makes the important point that conformity actually serves a good purpose. If you’re on the African plains and everyone around you begins to get worried about a lion in the grass – well, even if you can’t see the lion yourself, there’s a pretty good reason to take precautions. More formally, conformity lets every member of a group take advantage of the knowledge of all the other members, rather than relying on their own knowledge all the time.

The problem is that once a belief has taken hold in the general population, it’s bloody hard to get rid of. The combination of conformity – people fall into line – and the confirmation bias means that as a group, people don’t deal with falsifying evidence well at all. Mr. Gardner goes through a hilarious number of examples showing that (i) people say they believe something because of the evidence, (ii) you prove the evidence is wrong, (iii) people still believe it despite accepting that the evidence is wrong.

A Passing Note

In addition to those three psychological features, Mr. Gardner notes a few other issues. Here’s one I found striking.

It has to do with how badly people deal with numbers. People have no innate ability to handle numerical data, though they do have a pretty good ability to deal with proportions. Unfortunately, that facility with proportions can backfire.

Mr. Gardner gives a great example. Take two groups of people, and tell both that they are reviewing how much money to devote to improving airport safety. Tell the first group that implementing the precautions will save 150 lives; tell the second that it will save 98% of 150 lives. Consistently, people rate saving 98% of 150 lives higher than saving 150 lives – that is, the second group would devote more money to the project than the first group, even though 98% of 150 is only 147 lives, objectively fewer people saved. The proportion feels impressively large; the bare number carries no feeling at all.

And don’t get me started on how bad people are with probability – it doesn’t bear thinking about.

A Brief Conclusion

The Science of Fear rests on good psychology and goes through a large number of examples of how human reason fails us when it comes to knowing what to fear.

The real effect of the book is to make people less afraid. Mr. Gardner systematically goes through most hot-button political issues and shows how the data doesn’t back up the fear-mongering. Not only is he persuasive, but he writes in such a fashion that you’ll pick up an innate skepticism of the media (if you didn’t already have it) and a deeper skepticism of anecdotes (if you have no statistical background).

It’s certainly worth the time just for that.