Recently I went to a planetarium show with my kids. The show was an interesting blend of digital animation and puppet theatre, staged by a local puppet theatre company and animators from Morehead Planetarium and the University of North Carolina. There’s a girl who gets lost in the woods, meets a magical old lady, and returns the sacred fire to a dragon who brings Spring back. Basically it’s the story of the Winter solstice and the changing of the seasons. My kids and I really enjoyed it. The production was high quality, the storytelling was fun, and I think the kids learned something from it.
Then, after the show and the credits, a planetarium staff member addressed the audience and said she was going to show us some basic astronomy: a few constellations and the path of the sun at the solstices and equinoxes. Then she said “Of course, it’s not dragons that change the seasons, it’s the tilt of the earth. Sorry to disappoint you.” What she said was perfectly reasonable, but I would like to question the motives for saying that to a room full of five-year-old kids. A colleague of mine, who is also the father of a five-year-old boy, said “Yeah, and there’s no Easter Bunny either!” I don’t think the planetarium staff member was mean-spirited about what she said, but I have some ideas about where she was coming from in making such a remark, and I think it raises some questions about the intellectual climate of science at this time. Incidentally, just guessing by her age, I think this person knew a lot more about astronomy than she knew about kids, so again, this isn’t personally about her, but about the intellectual environment that we create when we insist on militant scientific positivism in all areas of life.
The question I raise is whether everything needs to be science. Are there kinds of knowing, learning and being in the world that are served better by other enterprises? Does science really have to dominate everything we do? Science is great. It’s not just my job; I really love it, and yes I’d probably be a lot worse off without vaccines and blah blah blah. None of that is at issue. The question is whether that disqualifies the rest of human endeavors. Are other kinds of thinking allowed? Without someone sneaking up behind you and saying “Well, actually the temperature differential between points A and B leads to variation in pressure that…”
Here’s another illustration of the problem. I have read, a few times with my kids, the book A Child’s Introduction to the Night Sky by Michael Driscoll. This is an excellent book about astronomy and I recommend it whether you have kids or not. Children’s books are really great to read: they are packed with information, presented in easy-to-remember ways, and they have all the basic background. I always feel like I’m missing something when I read age-appropriate (I won’t say “adult”) books on technical topics. I especially like this book because in addition to telling kids the typical stuff about the solar system, it tells kids what they can see with a telescope or their own eyes. In other words, it teaches kids how to collect their own data. That’s how I aim to teach science, so I really love seeing it in a children’s book. This book might even be where I got the idea.
Late in the book, the topic of the history of astronomy, the zodiac and astrology comes up. The author makes the claim that priests and fortune tellers were just as interested in the stars as “early astronomers.” He fails to mention that these were the same people. The occupation “scientist” is a fairly recent invention, and so is the distinction between astrologers and astronomers. Even a paragon of empiricism such as Isaac Newton was a far-out mystic by today’s standards (and Wikipedia says he’d be considered a heretic by the standards of his day). The author seems to go to a lot of trouble to make sure kids know that there are scientists and non-scientists. In his defense, he tells the Greek myths the constellations are based on, and explains where the zodiac signs come from and how they are associated with astrology without judging astrology harshly. I believe the author’s motives are totally beneficent. But again I ask: why do we tell this sort of thing to kids? What goal does it serve? Who does it serve?
Perhaps we can re-examine those motives and see if they really check out. I’ll use myself as an example. What motivated me to tell The Truth to people for a long time was that I thought people would be happier if they knew The Truth (i.e. my version of it). I thought that if people could accept science then they would see the wonder of the natural world, have an idea of where they came from, and all the things I was excited about. This sounds weird, but I really wanted to help people. I thought “this is my way of helping people,” this is my role, this is my purpose. Unfortunately the way it came out was cussing out a room full of Christians and telling them that the speaker was lying to the audience (he was, by the way). So, as much as I wanted to help people, it came out simply as rude and inconsiderate. When people wouldn’t listen, I would just shrug and say “Well, if they want to live their lives as morons, I guess I can’t stop them.” Looking back, I see now that this isn’t that different from saying “Well, you’re the one who’s going to Hell.”
So what message do we send when we say things like “Well, actually it’s not dragons”? My concern is that we are telling kids that it’s not okay to have an imagination. Now put yourself in my shoes, trying to teach science to people with no imagination. What I’m thinking is that insistence on science as the One True Way can dull people’s imaginations just as much as a fundamentalist religion. When we fail to see the value of other ways of thinking, we could be tying kids down to only one set of mental habits, limiting their flexibility. I think the scariest thing about hearing people say stuff like this is that it reminds me of myself in middle school and high school, when I refused to see the value of anything other than science.
My ninth-grade English teacher is going to love this: myths have their own value. What is the value of the Santa Claus myth? It teaches kids about giving, but not in a didactic “You better do this” kind of way. It also teaches them that it’s nice to receive gifts. There is someone who will just give you something because that’s his job. That’s just what he does. The Easter Bunny? That teaches kids about the changing seasons, about how life comes from somewhere, and that spring and changing seasons are something to celebrate. Telling these stories also teaches kids the value of storytelling. As kids get older, big brothers tell these stories to little brothers (that’s how it works in my family, anyway) and the cycle starts all over again. Gee, maybe there’s a story about the world being full of cycles? Kids get the idea that you can learn from these stories, and that playing and pretending that they’re real is a great way to learn about the world. They’re also just fun, and there’s plenty of value in that. Not everything has to be science; not everything has to be about money, or values, or even learning. As long as you’re not hurting somebody, fun is a perfectly good reason to do something.
The biggest problem I see with thinking everything needs to be science is that we will fail to see the value of other modes of thinking. Jerry Coyne, Sam Harris and others seem to think that religions and mythology are “failed science.” That seems true only as long as everything is trying to be science. Maybe not everything is. Perhaps the goal of telling stories is not to get at what an empiricist thinks is The Truth. I ask if it’s at all conceivable to you, as a scientist reading this, that myths about natural phenomena are actually about the course of human lives, about how people change, and about valuable lessons in how to get along with people (like how if you keep transforming yourself into animals and raping virgins your wife might get a little peeved). Perhaps there’s value in learning how to live with people, and there’s something called wisdom that is hard to get through studying science. Myths could serve this purpose, but not if we tell the story and then dismiss it by saying “Well, I’m glad we know better now, thanks to modern science! What a bunch of baloney!”
One final question (not bloody likely) is what are we left with if we don’t bother to think in terms of anything other than science? What do we have if we dismiss every story and myth as just plain wrong? Seems to me like we’re left with a bunch of seventh-grade boys. All we have left is “Well, technically…” and “I heard there’s this virus that can eat your brain” and “Nuh-uh” and “Yuh-huh” and…
Don’t we all remember how stupid that was?
- If and When to “Spill the Beans” about Santa Claus (psychologytoday.com)
I’ve just finished reading portions of Rupert Sheldrake’s The Science Delusion. The title is an obvious allusion to Richard Dawkins’ The God Delusion, so you can guess that Sheldrake’s thesis is that scientists have great faith in their craft, elevating it to the level of producing what I call Truth. The problem, Sheldrake points out, is that modern science is based on adhering to a dogmatic assumption that the universe is a machine. He points out that this is a fairly new idea, and worst of all for supposedly empirical science, there is absolutely no evidence for it. It’s a belief. It’s a myth. I’d like to leave aside the readability and scholarship for a proper review (perhaps elsewhere), but here I’ll deal with the real philosophical problem this presents.
Sheldrake points out that the mechanistic worldview, that is, seeing the universe and everything in it as a machine, was a fairly radical idea in the sixteenth through eighteenth centuries, when it was proposed by a minority of scientists and natural philosophers. David Hume dismissed it completely. The universe and its inhabitants were seen as something organic, i.e. something that grows, by most ordinary and learned people. However, the material success of Newton’s Laws and (Sheldrake doesn’t mention!) the Industrial Revolution, continuing into the computer age, have helped convince most people that they are robots inhabiting a giant clock. This is bad for science: dogmatism stifles creativity, and ideas that could be either helpful for science (like Sheldrake’s own theories of morphic resonance) or helpful to the general population (like “alternative medicine”) are dismissed because they don’t fit into the mechanistic, materialistic worldview of science.
As an example, many scientists dismiss acupuncture as incapable of anything but a placebo effect, since its “mechanism of action” is not known; therefore it’s a money-making tool for charlatans and shouldn’t be used to try to heal people. Sheldrake points out that’s not a valid criticism, since the effect on the health of the patient is the same regardless of the mechanism of action, even if it’s just a placebo. Scientists and materialist physicians, on the other hand, will support many drugs whose mechanisms are poorly understood, simply because they are produced by chemistry. As someone who’s seen the inside of pharmaceutical research, I can tell you Sheldrake is dead on: we don’t know much more about methylphenidate than we know about acupuncture. The mechanism of action of many psychiatric drugs is completely unknown, and that doesn’t stop doctors and scientists from having total faith in them.
Although Sheldrake makes his point somewhat clearly, I’m not sure it’s the biggest problem with the mechanistic worldview and dogmatism in science. The problem I see is not within science, but in how the general public is persuaded to see science as Truth. Just witness how scientific graphics are used in TV commercials to sell running shoes: they’re very convincing even when there’s no actual science behind them. Scientists, in other words, do a very good job of convincing people either that science is the only route to Truth, or merely that science is the most pragmatic method of achieving their goals. People come to see science as infallible, and they swallow the idea that the current mechanistic worldview of science is It. The big problem, as I see it, is that people are encouraged to deny their own experiences in favor of the findings of science, which are inextricably linked to the dogmatic assumptions of the mechanistic worldview.
I’ll give you an example. Let’s pretend, just for the sake of discussion, that I suffer from terrible migraines up to three times a month that keep me from going to work or enjoying and taking care of my family. Totally hypothetical (not). Let’s also pretend that I’ve been to lots of doctors, been prescribed all kinds of drugs, vitamins, diets and exercise based on “evidence.” I’m still getting headaches. None of this stuff has helped to my satisfaction. I’ve had improvements, and I’m slowly learning to live with it, but the best most doctors have to offer me is “try this, there was a study done…” Science is slow. It’s way too slow to help me with this problem. I’ve been having these headaches for thirteen years and the science has not improved much in that time. The best a headache specialist could offer me was to take large doses of vitamins that were identified as helping people with mitochondrial disorders, in a study done over forty years ago. The mechanistic worldview, encouraging me to see my body as a set of pumps and electronic circuits mounted on an armature of primitive calcite crystals, tells me to see more doctors until I find the one who’s read the right peer-reviewed study. Why should I deny my own experience in favor of peer review? No thanks. You bet your ass I’m going to try Chinese medicine before I wait for science to catch up to what I need in my life. I do science; I know how slow it is, even for the fast people.
My biggest problem with the book is this: scientists play the game of “Who’s right?” I used to believe that being factually correct was the most important thing in life. Most of the scientists I know also believe this, and they don’t just apply it to their work. They apply it in all realms of their being, particularly because our language and culture are set up for it. People like to be right. Many see life as a competition. Unfortunately, Sheldrake is also playing this game. He spends most of the book promoting his own scientific theories of morphic resonance and other ideas about psychic phenomena. I see this as more of the problem. We don’t need more science or better science. We need to see science for what it is: a way of learning. When we ask for more science, we are reinforcing the attitudes that lead to the problem in the first place. This is particularly evident in how we teach science.
When we teach science, we play the same game by teaching not methods, but findings. Most often those findings are actually models and metaphors, not experiences. For example, right now I’m helping to teach genetics and molecular biology. Most of the course material is not experimental procedure, as it could be, but models of the function of biological molecules. The biggest one is the model of protein synthesis, where DNA is transcribed into RNA, which is translated into polypeptides. This is not anyone’s direct experience. This is a story (you could even call it a myth, given the dogmatism it attracts) that is supported by clever experiments. Nicholas Maxwell points out that we could come up with a huge number of alternative myths that would also be supported by the same experiments, but that’s not how science works. Science seizes upon the first kinda-plausible idea and runs with it until it runs out of steam. The “findings” or “facts” that are found to support this story are wrapped up in it: we never would have done those experiments and found them to support the story if we didn’t have the story in the first place. When we teach science, we don’t teach method, we teach the mechanistic worldview, which is a myth. I often remind my colleagues that most of science is made up. Surprisingly, a lot of them take no issue with that assertion, just as I don’t. The problem comes when we present it as something that’s Right, and don’t present people with the alternative of trusting their own experience. If we were honest about the nature of science, then people would see science as one fun way of learning, rather than The Way of Learning.
Unfortunately we encourage intellectual terrorism (“Who’s right?”) by refusing to be honest with people about the nature of our ideas. Sheldrake points this out, but quickly gets caught up in the same game by proposing alternatives. We don’t need more science, we just need to be honest about what science is. This is Sheldrake’s main point, but he primarily focuses on the danger of it to science, proving that he is, after all, a scientist. I am a lot less skeptical about my overall experience than I used to be. However, I’m still just as skeptical about scientific matters because science is a particular way of doing things and it’s intensely limited. I happen to think the prevailing theories of science are just fine. Swallowing them whole as the key to understanding your own direct experience is not just fine.
My overall point is that I don’t think the abuse of mechanistic metaphors is as big a problem for science as it is for regular people (scientists included). I’m surprised how often people who have a problem with science, e.g. adherents of “alternative” medicine, turn out to be doggedly scientific. In other words, I often encounter people raising gripes against “science,” and their first response is to propound an alternative scientific theory, i.e. to do more science. I’m also surprised how often I hear people explain their personal experiences (mostly bizarre, inexplicable ones) in terms of science: people usually invoke quantum mechanics because it’s the weirdest scientific thing they’ve heard of. It’s almost like they feel they need to defend their own experiences. That’s sad. Personal experience is not a competition, nor is it subject to peer review. This just shows how deeply science-as-truth is ingrained in our culture. This probably has to do with the Puritan origins of our country; to understand that, I’m reading Paul Feyerabend.
- 3 TED Talks the Establishment Would Prefer You To Miss (talesfromthelou.wordpress.com)
- The Science Delusion and Good News for Lumbering Robots (linguaphileapprentice.wordpress.com)
- The debate about Rupert Sheldrake’s talk (ted.com)
- Try not to be dogmatic about this (lackofenvironment.wordpress.com)
- TED’s Censorship of Rupert Sheldrake and Graham Hancock (rockandrollphilosopher.wordpress.com)
I just finished reading a piece by Jerry Coyne published in this month’s issue of Evolution. Coyne lays out the problem of belief in evolution, belief in God, and questions whether there can be compatibility. He’s basically asking how we, as scientists, can get more people to accept evolution. Relying mostly on poll data and sociological assays of religiosity in the United States and elsewhere, he concludes that the problem is that the United States is a more strongly religious nation than most others. He then argues that science and religion are incompatible unless we redefine religion, and hence cautions that acceptance of evolution will have to wait until widespread social change makes religion less important to Americans.
Coyne’s primary argument that science and religion are incompatible is an argument also used by Richard Dawkins, based on the idea that scientists discover Truth (with a capital “T”). Coyne distinguishes between “scientific truth” and “religious truth” and then conveniently shows that religious truths are not supported by science. There are some logical problems there, but I would rather ask the question: is that really what scientists do? Do we discover the Truth? What is the Truth? I don’t know any way to communicate Truth to anybody: what I experience as Truth is based on my subjective experience, and is difficult to communicate in any reliable way. What I think is going on here is that Coyne, Dawkins and many others take science too seriously: science is a way of communicating. Science is a way of using objective criteria to describe nature so that we can talk about the common aspects of our experience.
However, is that Truth? Or is it just what we can learn using science? Science is very effective in doing what it does, but it is also intentionally very limited. Science cannot do a lot of things that people might find very interesting: certain experiments would not be science because there would be no way to keep experimenting with them. For example, there were widespread experiments with telepathy, prayer and other forms of supernatural communication around 1900, but the experiments were hard to conduct and the results were hard to interpret. So what did the scientists do? They did what scientists always do: they backtracked to something that they could work with. That’s the point: science is about experimenting with things in small steps that are fun to play with. Science is incredibly limited, very slow, and usually very crude in its means of experimentation (“Hmm, this week let’s cut out this part of the brain!”). Such a method could hardly come close to finding “The Truth.” Nevertheless, it is still fun, enlightening, and people learn a lot doing it. There’s no greater hell for scientists than feeling that they are not learning. Let’s see science for what it is — a good way of learning and communicating — instead of relying on it for The Truth.
My real question is why this is so important to Jerry Coyne, Richard Dawkins and the late Christopher Hitchens. The title of the essay is telling: “Science, religion and society: the problem of evolution in America” (my emphasis). Non-acceptance of evolution is a problem to be solved. Really? What exactly is the problem to solve? What do we accomplish by having more people accept evolution? What does anybody gain, except learning more science? This is kind of like complaining about not getting a third cookie: we scientists do accept and study evolution and get our own benefits from doing that. Do we really need more scientists? I think what is motivating these authors is that they believe that they are reporting The Truth, and it’s always in the best interests of people to know The Truth. Then I ask how evolution is different from Christianity or Islam: how are atheists any different from the religions they oppose in saying that they themselves have the truth and everyone would be better off agreeing with them? Has evolution become ideology?
The other possible answer is that scientists believe that they are right in another sense. Not that they are ideologically correct, but that they have the right information, the right data, the right facts. This is a syndrome of people believing that being right is the most important thing. I would venture to ask if compassion is not more important than factual correctness. Have you ever been in a conversation with a person who absolutely didn’t care about your feelings in any way, but just wanted to show you how wrong you were about some arbitrarily tiny little matter of fact? If it was me, then I’m sorry.
Religion, Coyne concludes, is a symptom of a sick society, and America is completely sick. Might that mean that Americans need religion more than they need evolution? Is being factually correct really important when people are just hurting, feeling misunderstood, feeling abandoned by a rigid, competitive society? Again, perhaps compassion is more important — and the means of conveying that compassion is inconsequential. If you don’t think Americans are ill, then why are they killing themselves with terrible food? Why are they watching their neighbors kill themselves on TV? Why are so many Americans addicted to pain medication? I agree with Jerry Coyne here: if we live in a society where people are so bad off that they need religion, is making them accept evolution really important? What bothers me is that his only apparent concern for the problems of his fellow human beings is clearing them all up so that they’ll finally accept his version of Truth.
- Jerry Coyne Explains Why Evolution Is True (patheos.com)
- Correlation and causation, science and religion – ScienceBlogs (blog) (scienceblogs.com)
- Science and Christianity – Different Ways of Finding Truth? (sandwalk.blogspot.com)
- New Atheism: A Secular Religion (choiceindying.com)
If you read the theoretical and mathematical literature of evolutionary biology as much as I do, you’ll quickly notice that three-quarters of a century has been spent trying to verify a cryptic statement made by Ronald Fisher and declared the Fundamental Theorem of Natural Selection. The “theorem” states that when mutation and genetic drift are negligible (e.g. in a large population) the rate of increase of mean fitness is equal to the variance in fitness in the population. Most of the time in that case we can use mean fitness as a Lyapunov function to demonstrate asymptotic stability of equilibria.
Unfortunately, most of the work on the FTNS shows that it doesn’t apply in most interesting cases and Fisher’s original derivation had some serious problems. Many have concluded that Fisher’s motivation and conclusion were unclear. Despite that, I’ve recently read that the FTNS is comparable to Newton’s Second Law of Motion. I disagree. I remember using Newton’s Second Law to solve tons of problems, and I have never used the FTNS to solve a problem in evolutionary theory. Never.
On top of all this, we have a substitute: there happens to be an actual theorem whose proof is rigorous, applies to any set of aggregate quantities and includes all the evolutionary details. Just this morning I rederived yet another set of famous evolutionary equations from the Price Equation in less than five minutes. Try it sometime! You can derive everything, from the most basic equations of single-locus selection, to mutation-selection balance, and everything else, from the Price Equation. I use Price’s Equation the way I used the Fundamental Theorem of Calculus and the Fundamental Theorem of Linear Programming: as a universal problem solver. I don’t use Fisher’s theorem that way.
Price’s Equation truly is the Fundamental Theorem of Evolution, and I, for one, am going to do my part to reverse the typical ordering of things. Fisher’s “theorem” (it’s not even a theorem!) is called “Fundamental,” while the selection component of Price’s Equation is called the “secondary theorem.” Dude, that’s bogus: you can derive Fisher’s FTNS from the Price Equation!
To see just how fundamental Price’s Equation is, use the full equation, with the transmission bias component to derive equations for
- Single-locus selection dynamics
- Two-locus-two-allele selection dynamics
- Equilibrium between forward and backward mutation
- The Breeder’s Equation
You’ll see what I mean.
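Even the FTNS itself falls out in two lines. Here is a sketch in standard notation (discrete generations; Δ denotes the per-generation change), not the full treatment with all the caveats:

```latex
% The full Price Equation: \bar{w} is mean fitness, w_i and z_i the
% fitness and trait value of type i, and the expectation term is the
% transmission-bias component.
\bar{w}\,\Delta\bar{z} = \operatorname{Cov}(w_i, z_i) + \operatorname{E}(w_i\,\Delta z_i)

% Take fitness itself as the trait (z_i = w_i) and neglect
% transmission bias (\Delta w_i = 0):
\Delta\bar{w} = \frac{\operatorname{Var}(w_i)}{\bar{w}}
```

With relative fitness (so that mean fitness is 1), the second line is exactly the “rate of increase of mean fitness equals the variance in fitness” statement attributed to Fisher.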
Most people think “programming is for programmers,” and by “programmers” they mean people who earn a living writing “end-user” software: software that people will buy, or that will be used in some big company. However, recently I’ve overheard a lot of talk from people in the business world about what those large companies do, and much of it sounds like it could be done by simple computer programs. The problem is that people don’t learn programming, nor do they learn to think of their problems as amenable to programming. I surmise that for most people, a programming solution doesn’t even enter into their thinking.
At a recent breakfast conversation, my brother told me that at his company most problems result from people not thinking of something unless a notification pops up on their computer screens to ask them about it. Even if they know there’s a problem, they won’t do anything about it if they don’t see it right there in front of their faces. They won’t even get up and walk five feet over to the guy in charge of that problem to ask him. These people and their tasks could be replaced with simple programs. He also told me that the corporation he works for uses none of the operations research or systems design theory that he learned in business school. Everything is just left up to guessing at the best solution and sticking with it for years, regardless of how much it costs or what the alternatives are.
I also sat next to some people in the airport who were using Excel and mentioned SAP (which my brother tells me is basically a money-milking machine; the companies who buy it are the cows). One of these people said her current project was “organizing [inaudible] into categories based on characteristics, including horsepower…they weren’t labeled with horsepower.” She was doing it “by hand.” One of my missions, in my last job and in graduate school, has been to intercede whenever I hear the phrase “by hand.” We have computers. “By hand” should be a thing of the past. This young woman apparently didn’t think of her task algorithmically. Why would she, when it’s unlikely any of her education included the word “algorithm”?
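To make “algorithmically” concrete: suppose, purely hypothetically (the conversation gave away no actual format), her machines were listed as name,horsepower pairs in a CSV file. A few lines of awk do the categorizing; the data and thresholds below are invented for illustration.

```shell
# Invented sample data standing in for the spreadsheet;
# the real format and thresholds are unknown.
cat > machines.csv <<'EOF'
tiller,5
mower,18
tractor,120
combine,300
EOF

# Assign each machine a size category based on horsepower.
awk -F, '{
  if      ($2 < 20)  cat = "small"
  else if ($2 < 150) cat = "medium"
  else               cat = "large"
  print $1 "," cat
}' machines.csv > categorized.csv

cat categorized.csv
```

Four rows take the same effort as forty thousand, which is the whole point.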
These patterns highlight the current failings of commercial computing. Commercial computing has one goal: sell computers. For most of the history of computing that meant hardware, but now it mostly means software. Commercial computing’s current strategy is to sell people software as if it were hardware and then walk away, wagging its finger when the customer comes back complaining that the product doesn’t work. Eric Raymond calls this the “manufacturing delusion.” Programs aren’t truly manufactured because they have zero marginal cost: it costs no more to make a billion copies of a program than it does to make one copy. Commercial computing focuses on monolithic hardware and software, i.e. programs that try to do everything the user might need, funneling everyone’s work through that one program. That doesn’t work.
Academic computing, on the other hand, has the perspective that if something doesn’t work the way you need it to work, you rewire it, combine it with something else, or build onto it so that it will accomplish a specific task. People literally rewired computers up until twenty-five years ago, when it became cheaper to buy a new machine (if anyone can correct me on that date, please let me know). Similarly for software: if the software you have doesn’t do the job you need, you write the software to do the job you need. If you have several programs that decompose the problem, you tie them together into a workflow. If you have a specific problem, even one that you will only solve once, and it might take you one day to program (potentially saving you a week of “by hand”), then you write a program for it. Then if you ever have to do it again, you already have a program. You might also get a new problem that is very similar, so you broaden the scope of the previous program. Recently I wrote a script that inserted copyright notices with the proper licenses into a huge number of files. I had to insert the right notice, either for the GPL or All Rights Reserved, based on the content of those files. On the other hand, if you have a program that generally does what you want, e.g. edits text, and you want it to do something specific, you extend that program to do what you need.
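The copyright script makes a good sketch of this kind of one-day program. The details below are invented (a marker line “PUBLIC” stands in for whatever the real script keyed off), but the shape of the solution is the same: decide per file, then prepend the right notice.

```shell
# Two invented files: one marked for release, one not.
printf 'PUBLIC\nsome code\n' > alpha.txt
printf 'some code\n'         > beta.txt

for f in alpha.txt beta.txt; do
  if grep -q '^PUBLIC$' "$f"; then
    notice='# Licensed under the GNU GPL.'
  else
    notice='# All Rights Reserved.'
  fi
  # Prepend the chosen notice to the file.
  { printf '%s\n' "$notice"; cat "$f"; } > "$f.tmp" && mv "$f.tmp" "$f"
done

head -n 1 alpha.txt beta.txt
```

Swap the `for f in …` list for a `find` invocation and the same dozen lines handle a huge number of files.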
Basically I see a dichotomy: on one side, the thinking that certain people should make money, with problems solved only to the extent that solving them makes those people a lot of money; on the other side, actually solving problems. If you disagree that this dichotomy exists, let me know and I’ll show you academic computing in action.
The solution for all these problems is teaching people to think algorithmically. Algorithmic thinking is inherent in the use of certain software, and therefore that software should be used to teach algorithmic thinking. Teaching people algorithmic thinking using Excel is fine, but Excel is not free software, and thus should not be considered “available” to everyone. Teaching these skills in non-computer classes will get the point across that people will be able to apply these skills in any job. Teaching this to high school students will give them the skills they need to streamline their work: they will be able to do more work, do the work of more people, communicate better and think through problems instead of just sitting there. People will also know when someone else is bullshitting them, trying to sell them something they don’t need. Make no mistake, I’m not saying that teaching programming will get rid of laziness, but it might make laziness a lot harder to tolerate. If you know that you can replace that lazy person with a very small shell script, then where will the lazy people work?
If you teach biology, or any field that is not “computer science,” then I urge you to start teaching your students to handle their problems algorithmically. Teach them programming! I am going to try to create a project to teach this to undergraduate students. I have in mind a Scheme interpreter tied to a graphics engine, or perhaps teaching people using R, since it has graphics included. Arrgh…Scheme is just so much prettier. Teaching them the crucial ideas behind Unix and Emacs will go a long way. Unix thinking is workflow thinking. Unix (which most often these days is actually GNU/Linux) really shines when you take several programs and link them together, each doing its task to accomplish a larger goal. Emacs thinking is extension-oriented thinking. Both are forms of algorithmic thinking.
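To make “Unix thinking is workflow thinking” concrete, here is a tiny example in the pipeline style, using an invented file of tab-separated specimen records (the data and file name are made up for illustration):

```shell
#!/bin/sh
# Hypothetical data: one specimen record per line, species name then a count.
printf 'Pinus taeda\t12\nQuercus alba\t3\nPinus taeda\t5\n' > specimens.txt

# Each small program does one job; the pipe ties them into a workflow:
# cut extracts the species column, sort groups identical names together,
# uniq -c counts each group, and sort -rn orders them by frequency.
cut -f1 specimens.txt | sort | uniq -c | sort -rn
# prints "Pinus taeda" (seen 2 times) before "Quercus alba" (seen 1 time)
```

No single program here knows anything about the larger goal; the workflow is the program.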
If you are a scientist, then stop procrastinating and learn a programming language. To be successful you will have to learn how to program a computer for specific tasks at some point in your career. Every scientist I know spends a huge amount of time engaged in programming. Whenever I visit my grad student friends, their shelves and desks are littered with books on Perl, MySQL, Python, R and Ruby. I suggest learning Scheme, but if you have people around you programming in Python, then go for it. I also suggest learning the basics of how to write shell-scripts: a lot of people use Perl when they should use shell-scripts. Learn to use awk, sed and grep and you will be impressed with what you can do. The chapters of Linux in a Nutshell should be enough to get you going. Classic Shell Scripting is an excellent book on the subject. Use Emacs and you’ll get a taste of just how many things “text editing” can be.
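To give a taste of what awk, sed and grep can do, here is a small sketch (the file name and measurements are invented for the example):

```shell
#!/bin/sh
# Hypothetical data: comma-separated measurements, one row per individual.
printf 'id,mass_g\n1,10.5\n2,12.0\n3,9.5\n' > mass.csv

# grep: keep only the data rows by dropping the header line.
grep -v '^id' mass.csv

# sed: a quick in-stream edit, here renaming the column header.
sed 's/mass_g/mass/' mass.csv

# awk: compute the mean of the second column, skipping the header row.
awk -F, 'NR > 1 { s += $2; n++ } END { print s / n }' mass.csv
# prints 10.6667
```

Three one-liners that would otherwise each be a tedious session in a spreadsheet.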
Every profession today is highly data-oriented. Anybody hoping to gain employment in any profession will benefit from this sort of learning. Whether people go into farming, business, science or anything else, algorithmic thinking will serve them, for several reasons. There are the obvious benefits of getting more work done, but there are also social reasons we should teach people algorithmic thinking. The biggest social reason is that teaching algorithmic thinking removes the divide between “customers” and “programmers.” This is why it bothers me to hear “open source” commentators constantly referring to “enterprise” and “customers” and “consumers.” If a farmer graduates from high school knowing that he can program his way to more efficient land use, then he will not be at the mercy of someone who wants to sell him a box containing the supposed secret. Again, you can teach algorithmic thinking using Excel, but then you already have the divide between Microsoft and the user. With free software that divide just doesn’t exist.
We are sick of education. We as a nation, a society, a world, and individually are making ourselves sick over education. I can’t speak for people in other countries, but as an educator in the United States at several levels I have repeatedly seen people make themselves physically and mentally ill over education. We need to do something about it. We don’t need to do away with school, but we need to seriously re-think what we tell people about school, our values as a society and our valuation of human life.
Another semester has come to an end, which means I’ve seen another set of young people come into my office to beg me to change their grades; I’ve seen more people crying in my office about how they need to pass a class; they’re afraid that if they don’t pass they won’t graduate; they’re afraid that if they don’t pass they are not good people. It’s as if their lack of comprehension of a model of predation or population genetics means they are not good people. Of course I want them to learn, but does that mean I want them to destroy their lives and develop a mood disorder over learning what I’ve chosen to teach? Not only do I see university students — people over the age of twenty, mostly — crying over this stuff, but some have looked on the verge of vomiting over their fear of failing a required class. After one of these meetings the other day I decided that it’s not just the students’ fault; that’s a silly way to assess the situation. No, it doesn’t matter whose fault it is. Instead this is a sign of everybody doing something seriously wrong.
Education Problems are Our Problems
First let’s establish that if there’s a problem with education there is a problem with our entire society. I don’t just mean that getting an education is a fundamental part of our society, or that in America we really value education. I don’t mean that since education has been made particularly important in capitalist society, a problem with education is a problem with the whole society. What I mean is that if there is a problem anywhere it’s a problem with everything. It’s a problem for everyone and everything we do. Richard Stallman often reminds people that software freedom is important not because computers are an important industry (i.e. make somebody a lot of money), but because using computers is a part of our lives now. If you doubt that’s true, think of how often you make scheduling decisions based on the performance of a piece of software: do you ever schedule a meeting at a particular time because your calendar software makes it easier to do so? Do you ever leave the house later because your web browser wouldn’t load the article you wanted to read over breakfast? Do you ever leave the house or office a little later so you can download something onto a mobile device? Think about it and you’ll see that anything you do with a computer affects your whole life, just as your diet affects your whole life.
Education is the same way. If there’s a problem with our education system, the way we teach, and the reason we learn (i.e. our values) then there’s a problem with our whole lives. We can’t ever say “That’s a problem for the schools,” or “That’s the teacher’s problem.” It’s our problem. If your kid is having a problem learning or understanding why he should be learning something, it’s your problem.
Not only are students crying in my office, but there are larger societal signs of the problems created by our you-must-go-to-college-society. Consider that when I graduated from high school, I had a range of friends with varying interest in college. I wanted to go to college because I wanted to be a professor; I can’t do that without a college degree, plus a Ph.D. and (I thought at the time) a Master’s degree. But I only applied to one university. I didn’t apply to a “safe” school plus Harvard, Dartmouth, Yale, Stanford, Washington University, Scripps, Pomona, Miami University, Beaverton College, Reed College… . I had other friends who weren’t going to college at all: they had taken courses in computer networking, database administration and other topics at vocational school, or were headed to community college to take those courses. Nowadays I know people with BAs in computer science who are doing those jobs. If you can do that job with a high school diploma or a BA, doesn’t that seem a little off to people?
College-entrance syndrome begins in kindergarten. My son will be going to kindergarten soon and I am shocked at the stuff I’ve been hearing from people. I should have known things would be different now that every toy has pop psychology jargon written all over it. Parents have told me “Oh, that’s a good school, she went to kindergarten there and she learned all her skills.” Learned what? I don’t know about you, but in kindergarten I learned about primary colors and my feelings, not to talk to strangers, and to be nice to people. An administrator at a charter school recently told me that kindergarten is the new second grade. My brother in Texas just told me that the Dallas public schools are paying teachers to quit, while they’re spending millions of dollars on testing.
Educators automatically tell kids that they ought to go to college. I’m sure even I fell into that habit when I was working in middle and high schools. Think about it and you’ll see how weird it is. It made sense to have that attitude with poor kids in the fifties and sixties, and with GIs coming out of the most violent conflict in human history. In the former case you had people who definitely would benefit from the vocational and educational opportunities of college; in the latter you had trained killers already familiar with the newest technology, knowledge that would be wasted if we didn’t interest them in something before they started killing again. My father-in-law’s family is a great example of a family working hard so their children could benefit and go to college.
The question we need to ask today is “do you need to go to college to work at Hooters?” Do you need to go to college to manage a Hooters? According to College Conspiracy most college graduates end up working at jobs that require no higher education, whereas high school graduates that don’t go to college enter the workforce earlier, gain more skills and advance more rapidly than college students. Also according to the film, most college graduates who do become successful don’t believe that their college education has much to do with starting a successful business. I’m not asking about the economic value of these people or their jobs, instead I’d like to ask: are you a bad person if you choose to work as a waiter?
Also consider the futility of telling kids to go to college “just because” it will do them good in some unspecified way. One of my brothers is a musician, meaning he chooses to live in poverty. He recently told me that high school administrators learned very quickly that they couldn’t motivate him by telling him “Well, if you want to get into a good college…” He’d interrupt them and say “I don’t want to go to college, I want to play my drums.” And he does.
We need drummers. And we need farmers, and waiters, and lots of people who are still perfectly good people without going to college. My grandfather was a brilliant builder, designer and artist with a seventh-grade education. Maybe not everybody needs to go to architecture school. Maybe they need to learn trades like my friends who learned networking and database administration without going to college.
That brings up the really sick part of all of this: why we’re telling kids they have to go to college. Most often we tell them because it’s the only way they’ll “make it.” What does that mean? Well, usually when you press people they’ll tell you it means making a lot of money. Oh? Is that what it takes to be “successful?” Is money what it takes to be happy? Is money what people really need? So people are told to go to school in the long-term to make money.
I shouldn’t need to tell you that I think this not only debases education, it debases the people we tell it to. If the only way to be happy is to have a lot of money — or even a lot of prestige, i.e. becoming a doctor — and the only way to earn a lot of money is to go to school then what good is learning? What if you’re not good at school? What if you don’t need to go to school? What if you don’t want to go to school? Do you really need to be Mark Zuckerberg or Bill Gates to be a good person? Do you even need to try?
The problem in all of this is that school has become a means to an end, rather than an end in itself, at least in America. I don’t think that every kid should love going to school — I hated it — but kids should be taught the value of learning. They shouldn’t be taught that learning is only for school; they shouldn’t be taught that learning is just one thing you do sometimes; they shouldn’t be taught that the job of learning is to make you a lot of money. They certainly shouldn’t be told money is the key to happiness.
One of the first times I realized I was different was in Junior High school. One of my neighbors saw me on the bus; I was reading The Extended Phenotype by Richard Dawkins, a book that in retrospect I totally did not understand but I read it anyway because I knew I was challenging myself, and it was cool. She said “Why are you reading? School’s over.” I can’t remember my response, but I can remember thinking the question was coming from someone who had completely wrong ideas about why we learn, and I felt sad for her.
My son recently received a book as a gift and on the cover was a banner that read “Time to learn!” I wouldn’t ordinarily think this was anything weird: it’s a good message to send that learning is exciting, right? However, we don’t reserve learning for when we’re in school or while we have a book to read. Children love to learn; they don’t even need to be in school, really. They certainly don’t need to be told “Okay, now we’re going to learn.” It’s what they do. My son is learning when he’s playing with Legos, when he’s on the playground, or when he’s reading a book. My other son is learning right now watching a Neil Peart drum solo, but I didn’t say “Okay, Khalil, it’s time to learn something; there will be a quiz afterward,” before I showed it to him.
What do schools tell kids instead? A recent lunchtime conversation with an undergrad, a graduate school drop-out, a grad student/TA (me) and a professor revealed the horrible truth. Had I forgotten it? The undergrad reminded me that throughout high school, students are told that they are in school expressly so that they can go to college. My professor colleague then said “So now people are going to college just so they can get into medical school?”
Today I took a look at some more articles and some more videos on the web introducing myself to the concepts of Open Access. I also talked to a professor at my university who pointed me to some interesting links. The impact factor of Open Access journals is rapidly approaching that of non-OA journals. That is interesting: I definitely want to support Open Access, but as I said yesterday, I simply find some of the arguments lacking.
I was browsing through some videos on Youtube and just as I was thinking “Why are these all from people I’ve never heard of?” I found one from someone I have heard of (and, incidentally, whose papers I read quite a lot of): Andrew Pomiankowski.
Pomiankowski points out several advantages of BMC Journals:
- Fast peer review
- Fast turnaround from submission to publication
- Creating competition for journal publishers, who have been “ripping off” scientists for a while now
Okay, now he’s speaking my language. I definitely think the publishers need competition. And if they genuinely have been ripping off scientists (and I have no reason to think he’s lying, having seen the whole process from submission to publication), then that’s something we need to rebel against.
Another critical argument that didn’t come up right away in my investigations was authors retaining copyright. Open Access journals don’t demand copyright assignment from the authors of scientific works. Instead authors retain the rights to distribute their own works in their own ways. The articles are still peer-reviewed and still perfectly reputable, but you can get them from a colleague’s website while reading his CV, instead of following links that might not work. That is compelling.
What’s holding me back? What’s holding me back from saying “I will only publish in Open Access journals”? The fact that a journal like Evolution or American Naturalist is not Open Access still doesn’t deter me. Those are good journals. I would be more than proud to publish a paper in any one of many good “closed access” journals. As I said yesterday, I also still can’t discount the value of a good paper just because it’s not in an Open Access journal.
However, things are changing. Within ten or twenty years I doubt there will be many “closed access” journals. The journals I just mentioned are already putting open data policies into effect. Open data to open access takes very few steps, although it may involve pissing off, losing, or forcing the hand of the publisher. If Wiley-Blackwell can’t provide the kind of access that scientists want, then they may just go out of business. I think a lot of scientists do want that level of access, so I think Wiley-Blackwell, Elsevier, Springer and company will have to change. Either that or they’ll just keep merging until there will be only one such publisher to put out of business.
I do think the access the web allows people is an unstoppable force. The trick is to take the best part of the freedom of the internet (access) without letting the content fall victim to its worst part: the lack of barriers to entry for authors. Any jerk can say whatever the hell he wants and you’ll still probably read it. You, for example, are reading the blog of a graduate student when you should be reading something over at PLoS. We need to make sure that all the good things about journals and the peer-review process stay in place, while making the journals as easily accessible as this blog.
Just for fun, here’s another biologist that I have heard of, Steve Jones: