I have often become confused, angry or cynical over the past few years when seeing self-professed “open source users” with Macs on their desks, or using R under Windows. I once had a discussion with a Linux user group about which laptop to buy: when many had said my laptop was “under-powered” I pressed them and found out that they meant it would have been slow running Windows. Contributors to help forums and on IRC have often assumed that my machines dual-boot Windows and GNU/Linux: “Can you see the partition when you boot into Windows?” I have also seen the insistence, or mere suggestion, of calling the operating system I’m using “GNU/Linux,” instead of Linux, dismissed as “zealotry,” or “mere semantics.” I became angry because I assumed that everyone in these situations had heard of the values of freedom embodied by the GNU project and had rejected them as unimportant. How could freedom possibly be unimportant? What could be more important to Americans, other than money?
There was another possibility that I only considered for a few seconds at a time, but it’s now becoming clear that this possibility is more feasible: these people have never heard of the GNU Project, or the Four Freedoms, or Richard Stallman. They have never heard of the true benefits of software freedom, the dangers of proprietary software, or the full breadth of freedom that is possible. If they have heard of it, perhaps they dismissed it because they didn’t think it was possible: perhaps software freedom is, to most people, an urban legend. This seems strange to me, since I came to free software by reading about it on Wikipedia and gnu.org, and my interest was primarily motivated by (a) freedom and (b) the possibility of having a Unix-like system to work on. The fact that it was free to download and install merely removed the barriers to exercising those freedoms.
The barrier to my own belief that people have just never heard of freedom is that it seems to me that all systems (in fact, all things) are imperfect. We all know how imperfect Windows is, and I got annoyed as hell using a Mac, so as much as its devotees attest to its perfection, it’s not perfect for everybody. However, people complain the most about the imperfections of Linux[sic]. Perhaps this is because they can: if they complain, someone will eventually do something about it. With Windows and Macintosh, you have to get fifty million corporate employees to complain, whereas with free operating systems, you can be just one guy and raise a huge stink about how the buttons on the top of the windows are arranged all wrong (of course, the other advantage is that somebody can explain to you how that’s your fault). Despite the lowered barriers to complaints, I always had the feeling that people were complaining because they felt GNU/Linux was just not “professional” or “slick,” because it’s not purveyed by a huge corporation. Therefore they complain about all kinds of things that really aren’t important to me.
Nevertheless, you still get people promoting the hell out of Linux[sic]. I could never understand why. Take NixiePixel for example, a YouTube personality who promotes primarily Ubuntu and Linux Mint. I really thank her for doing so, because whether she likes it or not, she’s promoting freedom: better that people have it and not know it than not have it at all. However, she never says why she’s promoting these alternatives. Why is it better to use Ubuntu than Windows, particularly if there aren’t the same games available for it? She even has a new series called OSAlt where she discusses and rates “open source” alternatives to non-free programs. Again the question is why? Is “open source” inherently better for users somehow? I suppose in some ways it is, but how?
This is so puzzling because for me, without freedom, everything comes down to your personal choices. No computer operating system, no anything, is going to work well, or even comfortably, for everybody. Life just doesn’t work that way: nothing “just works.” So why promote one alternative over another? Freedom is the only motivator for using GNU/Linux that stands that test. Freedom leads to a lot of nice by-products, but freedom is the prime mover. Some users may not have a choice of what to use; they may have to use a proprietary system at work, and not have time to learn to use something else at home. Additionally, some users like NixiePixel will be unwilling to embrace a campaign for freedom, because considerations of freedom are intensely personal at the same time as “political,” and the possibility of insulting people is pretty high. There is also a lot of angry, cynical behavior in the open source and free software worlds. That’s bound to happen whenever a community is composed of human beings instead of marketing personnel.
This is why it’s so crucial to let people know about their freedom at every possible opportunity, i.e. every time you mention the system. I know that “GNU/Linux” is a mouthful, but it’s too easy for people to hear about “Linux” and not know there’s anything special about it except that nerds like it. I myself had heard of “Linux” for years before I knew that it was free of charge, much less free-as-in-freedom (FAIF). There’s too much possibility that people will hear of “Linux” and just think it is another operating system. Or, they may get sucked into using non-free software by the “nerd-allure” of it.
Take Android for example: Android is a Linux system, but it only took me a few minutes of using my dad’s Samsung phone to see that Android is not a freedom-respecting system. None of the values of the free software movement were respected in its interface or its operation. There weren’t even the subsidiary values (those by-products I mentioned), like organization, clarity and standards. There was a pretty well-lubricated avenue for spam and advertising, but the only reason I saw for using the Linux kernel was that it’s adaptable to many devices. After playing Angry Birds for a few minutes, it became clear to me why it’s important to call the system I’m using now GNU/Linux: it’s accurate, and it promotes a mission that is in line with my values. I will do my best to inform people, as often as I can, of the possibility of freedom in technology.
For more on these issues, you can read the GNU/Linux FAQ.
My son has been in public school for about six weeks. Every week we get an invitation to buy something from some company, or to support corporate advertising through his school. The first week was NFL Week, where the school tried to win a “grant” from the NFL by submitting pictures of students wearing NFL apparel. Next it was the “Fall Fundraiser,” where a corporation came in and showed my son a video of a bunch of kids having a party, then gave him an envelope containing the tools for us to provide data and money to a corporation selling candy and magazines. Next it’s the book fair, and then picture day. These are all things that we had when I was a kid, except for NFL Week. I don’t suppose there’s anything hugely different, but now I’m seeing it as a parent.
NFL Week was thoroughly transparent: it raised a lot of questions. There’s the typical question of why the NFL doesn’t just give money to a randomly or thoughtfully selected school near each of its teams; there’s the question of why people who wouldn’t otherwise be buying NFL apparel (like my wife and me) should go out and get some in the name of providing money for the school; then there’s the question of why there’s chance involved. We felt uncomfortable turning our son into a billboard (something we avoid in all clothing purchases). Other corporations are doing similar things. We have recently seen high school students at our supermarket telling us to go to the Pepsi-Cola website to vote for their school, so that they can “win a grant” from Pepsico. This raises the same questions, and again it’s thoroughly transparent. Why doesn’t the company just use advertising, rather than a phony contest, to tell people to go to its website? If they want to tell people to go to their website, why do they have to fool children into doing it for them?
We hear all the time about how public schools just don’t have enough money, teachers don’t get paid enough, and so on and so on. That may all be true, but has anyone stopped to think about who has an interest in propagating that story? It may not be true after all, since when I was in contact with private schools, even in the richest of them I would hear talk of fund-raising and budget shortfalls all the time. They were just like public schools in that respect, except that it was obvious that they actually had huge piles of money. I just didn’t get why they had such a scarcity mentality, although I should point out there were some schools that didn’t. Those tended to be the ones who actually had fat kids and teenagers with funny haircuts (you know, like a normal school).
Whether public schools have enough money or not is really irrelevant when we see schools turned into avenues for advertising and commerce. Basically every week our son comes home with a piece of paper saying that there’s something we can buy through the school. Doesn’t that seem weird? Doesn’t it seem like these companies would be making less money if they didn’t have this avenue?
Now consider that these companies also have enough money to influence law-making.
This weekend I’ve made trips to two events that really got me thinking about who we should promote free software to. The first stop was the Durham Farmers’ Market, and the second was a benefit concert for a cooperative preschool in Chapel Hill. I have been thinking for a long time about the “organic food crowd,” particularly because I’m a biology graduate student, and most of my fellow graduate students buy organic food or shop at farmers’ markets. They seem to have values in common with me, yet few of them use free operating systems. A lot of my fellow graduate students know about certain free software, like Firefox, R and Python. However, mostly they use Window$ or Macintosh operating systems.
I really think somebody needs to get the idea of free operating systems to people at the Durham Farmers’ Market, Whole Foods and events like the concert I just attended. Obviously that could be me, and I could just go and talk to the vendors at the farmers’ market. That would be easy. There are a few problems, chief among them the assumptions I’m making. I automatically assume that these people, with whom I seem to have a lot in common, are very different from me. I assume that they are making their decisions from a fashion-inspired reflex. I think I feel this way because I came to my own values my own way, and not because of fashion. However, I know my conclusion is not justified. I don’t actually have good data about the “organic food people,” and probably at least ten percent of them do indeed use free software. Probably more than ninety percent of them at least know about Firefox, even if they don’t know what’s actually good about “open source.” I do know that pretty often I see cars like the one I saw driving back from Chapel Hill: bumper stickers saying “When words fail, music sings,” alongside an Apple sticker.
The other problem is: just what would I say to them? Would I recommend a particular distro? Would I recommend that they read the GNU Philosophy pages? Would I recommend that they learn about the issues on Wikipedia? These were all helpful things for me. However, it’s best to get across the ethical essence of the idea by simply giving people a persuasive argument. That almost always gets people’s attention, but you need to give them at least a first step to get going. Another good first step is to recommend the film Revolution OS, but that’s starting to seem a bit dated. Perhaps it’s time for another documentary, like Patent Absurdity.
The third challenge is to remember that promoting freedom is not a race to get the most users. People in the software press always seem to be concerned about numbers, about “desktop share,” and about “killer apps.” That’s really not the point. The point is to demonstrate that ethical motivation is enough to create a working operating system. In other words, whether the GNU/Linux operating system was created for freedom or for fun, it was not created for money. Often the first thing people tell me when I give them my persuasive argument is “but programmers have to make money!”, as if money were the only reason that anybody ever does anything. The point of free software (and Wikipedia) is to show skeptics that there are people who have different values.
Ultimately, I believe that ethical motivation will prevail, and one way or another, whether they know it or not, people will end up using ethics-promoting software. It doesn’t matter how many Windows users we “convert,” or how many Mac users we tell the truth to about much of the software they’re using. It doesn’t matter that we “conquer the world” or anything like that. What matters is that those of us who care about our freedom do what we can, now, to continue to improve our ability to live our lives without using ethics-compromising software. The more we can do that, the better a demonstration we make to the people who finally decide that they want to make the effort to preserve their freedoms. We will do our best, and others will see it and make their decision.
The University of North Carolina has a long history of supporting software freedom. The University has sponsored ibiblio.org since before I started using the internet, and recently made the very smart move of switching away from the proprietary Blackboard online learning system to Sakai, which is licensed under an Apache-like license. Recently, however, the university has made an unfortunate choice about its email systems. I wrote in my last post about the dichotomy between academic computing and commercial computing, and unfortunately UNC Chapel Hill has chosen commercial computing over academic computing in handling its email. This disappoints me. I contend that the justifications, mostly based on “performance” and “meeting the needs of users,” are hollow. Performance is not the only thing that is important in computer systems. As far as I can tell, the only feature that distinguishes the new system from the existing one is the ability to invade user privacy. Worst of all, the university is sacrificing academic computing ideals, including freedom, and “outsourcing” its email to a commercial interest. That a world-class university like UNC Chapel Hill would trust Microsoft instead of its own talent is really stupid.
Take a look at this list of advantages of the new Microsoft-based email system, offered by the ITS staff at the medical school. Look carefully and notice that the only feature that is really new is the “[a]bility to ‘wipe’ lost/stolen portable devices.” Everything else on that list is available with Cyrus IMAP. In other words, the university prefers a system that allows invasion of privacy. Now, I understand that there is a good security motivation for this feature. However, considering that this is the only new feature of Exchange over Cyrus IMAP, it seems odd that the university is favoring a new system precisely because it allows invasion of privacy. Why is that so important? Clearly the new system does not “meet the needs of users” so much as it meets the needs of administrators.
Another feature that doesn’t make sense to me is “Scalable handheld (smart phone) e-mail solution – works with Blackberry, iPhone, Windows Mobile, Android, etc.” This is a little weird: I don’t need to view a webpage to get my email on my desktop, so why would I need to view a webpage to get my email on a smartphone? It demonstrates the most annoying aspect of all the announcements I’ve gotten about the new email system: confusion between, or failure to distinguish, client and server. Many of the justifications for the new email system are made on the basis of clients, but the change the university is making is a change of server. That’s weird, because the whole point of standardized protocols like IMAP is that clients can be entirely agnostic to the identity of the server. If the server chooses to depend on nonstandard features, that messes things up for clients. Which client I use is my choice, and the server should accommodate it. That’s the “needs of users.” However, I know at least one user who’s having trouble even marking her mail read while connecting her chosen client to the new server.
Features and “performance” are a common justification for using proprietary software. There is a common attitude that “open source is best for making the world a better place, but I need to get my work done and I’ll choose the best tool for the job.” We’ve already ruled out any advantage in “features” of Microsoft Exchange over the Cyrus IMAP daemon. There are other things to consider: the quality of service, and the message that the choice of proprietary software sends to the students of the university. The quality of service I’ve experienced with Microsoft email servers is terrible. Again, the biggest problem is the confusion of client and server. I used to work at a large hospital system that used Exchange, and whenever I called the helpdesk, they would refuse to answer any questions about the server until I told them which client I used. In other words, they wouldn’t simply tell me if the server was down, because I was using Thunderbird to read my mail. Storing mail and reading mail are two different things. Sending mail and fetching mail are two different things. The only people in charge of an email system should be people who (minimally) understand those facts. Microsoft’s sales pitch, on the other hand, is that their “customers” will save money by hiring less-qualified people. In other words: screw service, screw your users, save your own ass some dough. That’s what UNC Chapel Hill is choosing.
The message this sends to UNC students is that the university cares more about money and less about student lives and intellectual freedom. They’re already raising tuition. The university is effectively making itself another corporate entity. They are in the business not of education, but of being in business, just like any other vacuous corporation. That’s insane. Universities should be bastions of intellectual freedom and they should cultivate and harvest the fruits of that intellectual freedom by providing key infrastructure themselves. They should not seek to emulate the corporate world. I understand that they want to save money, but they should do it by hiring fewer, well-qualified people to staff fewer servers running free software.
I often mention that I’ve been using the internet for almost twenty years. I do this for two reasons, neither of which is to brag or claim seniority. One is to emphasize that before most people found out about the world-wide web, there was an established culture on the internet of scientists, engineers and computer personnel. Universities were the backbone of that community. After the concept of the internet was established by the military, universities carried the torch and led the way in technology. When the military needed a new technology to build up their new communications network, where did they go? They went to Berkeley. A university had the expertise they needed.
The other reason I point out how long I’ve used the internet is a sort of nostalgia. The best way to use the internet was always on university machines, running some form of Unix: BSD, System V, SunOS and more recently GNU/Linux. Universities were always the best places to use computers. Why? Because universities were where the talent grew up, developed and was allowed to be creative. Universities existed outside the stultifying, cost-saving world of corporations.
It seems that now that universities are done giving the baby a bath, they are throwing the baby, the bathwater, the tub, the sink and the baby’s mom out of the window. They might as well kick dad in the balls by only teaching their computer science students how to work for Microsoft. Using Microsoft servers and software, and supporting corporate culture (i.e. the culture of Microsoft), doesn’t serve the interests of students, researchers at the university, or society. Universities serve their students by teaching them how to be flexible, creative and constructive members of society. Universities do not help their students by teaching them to be money-hungry, cog-thinking, competitive corporate flunkies. A university can teach all of the above good values with free software based on Unix ideals, and it can even do so in an environment that also includes Microsoft software. However, teaching students in a university computing environment based mainly on Microsoft software does not teach them creativity or flexibility: it teaches them “you don’t have a right to learn until you are chosen as one of the elite; then you can subjugate people just like we’ve subjugated you.”
Anybody who says “We have to be realistic and teach students to use software X because those are the jobs that are out there” is a corporate tool. People who learn properly at a university can learn to use anything that someone hands them: that’s the point of a college education, to be able to learn, not merely to know something.
Furthermore, universities do not serve their researchers by running Microsoft software. Researchers at universities are professionals and they need to be treated that way. Microsoft products are just not professional quality. Even if they were, they limit freedom in such a way that they should not be taken seriously by researchers. This goes for all proprietary software, including Mathematica and Matlab, but researchers have a choice of what to use in their own research. Unfortunately, when it comes to basic services like email, they often have to use what the university provides for them. Universities should provide the best, and Microsoft Exchange is just not the best. If Cyrus IMAP were not the best, they could have chosen Dovecot, Courier, Zimbra or any of the huge number of free software alternatives. If the Cyrus system “… is old, complex, outdated, and does not fully meet the needs of our users,” they can hire dedicated, talented people to make it simple and current so that it meets the needs of users. Instead they choose Microsoft.
Both of these failures to meet the needs of students and researchers mean that the university is failing society as well. People denigrate the “ivory tower” all the time, but there are chunks that fall from that ivory tower that change society and even make people a lot of money. Let me see if I can think of a few examples: the internet, the world-wide web, science, liberalism, Charles Darwin…
What can you do? Complain. If you have a problem with the new email system, let the university know. They do listen. A list of the managers in charge can be found on the ITS web site. Email them directly. Another alternative is to stop using email, though I don’t advise this, because UNC has made email an official form of communication. You could probably rig something where they have to contact you by campus mail (forcing you to use email is discriminatory). The other problem is that email, with all its problems, is I believe the best form of electronic communication: if you want to ‘e’-anything, you should email it. One thing I know I will do is seriously consider the IT infrastructure at the next university I go to. I’m a graduate student, so my time at UNC is limited. I will have things I will miss, and things I certainly won’t.
One more thing: don’t wait until the forced transition if you plan to continue using email. I’m going to transition tomorrow and I’ll let you know how it goes. Thanks for reading.
Most people think “programming is for programmers,” and by “programmers” they mean people who earn a living writing software, i.e. “end-user” software: software that people will buy, or that will be used in some big company. However, recently I’ve overheard a lot of talk from people in the business world about what those large companies do, and much of it sounds like it could be done by simple computer programs. The problem is that people don’t learn programming, nor do they learn to think of their problems as amenable to programming. I surmise that for most people, a programming solution doesn’t even enter into their thinking.
At a recent breakfast conversation, my brother told me that at his company most of the problems that come up result from people not thinking of something unless a notification pops up on their computer screens and asks them. Even if they know there’s a problem, they won’t do anything about it if they don’t see it right there in front of their faces. They won’t even get up and walk five feet over to the guy in charge of that problem to ask him. These people and their tasks could be replaced with simple programs. He also told me that the corporation he works for uses none of the operations research or systems design theory that he learned in business school. Everything is just left up to guessing at the best solution and sticking with it for years, regardless of how much it costs or what the alternatives are.
I also sat next to some people in the airport who were using Excel and mentioned SAP (which my brother tells me is basically a money-milking machine; the companies who buy it are the cows). One of them said her current project was “organizing [inaudible] into categories based on characteristics, including horsepower…they weren’t labeled with horsepower.” She was doing it “by hand.” One of my missions, in my last job and in graduate school, has been to intercede whenever I hear the phrase “by hand.” We have computers; “by hand” should be a thing of the past. This young woman apparently didn’t think of her task algorithmically. Why would she, when it’s unlikely any of her education included the word “algorithm”?
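For concreteness, here is a hedged sketch of how that airport task could be done algorithmically rather than by hand. The data, the thresholds and the category names below are all invented for illustration, and I am assuming the horsepower figures have already been extracted into a tab-separated file:

```shell
#!/bin/sh
# Hypothetical inventory file: name <TAB> horsepower.
printf 'mower\t5\ntractor\t90\ncombine\t300\ntiller\t3\n' > machines.tsv

# Bin each machine into a category by horsepower (thresholds invented).
awk -F'\t' '{
    if      ($2 < 10)  cat = "small"
    else if ($2 < 100) cat = "medium"
    else               cat = "large"
    print $1, cat
}' machines.tsv
# prints one "name category" line per machine, e.g. "mower small"
```

Once a task like this is a script, redoing it on next quarter’s spreadsheet takes seconds, which is exactly the point being made here.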
These patterns highlight the current failings of commercial computing. Commercial computing has one goal: sell computers. For most of the history of computing this meant hardware, but now people mostly see it as software. Commercial computing’s current approach is to sell people software as if it were hardware, then walk away, finger-wagging, when the customer comes back complaining that it doesn’t work. Eric Raymond calls this the “manufacturing delusion.” Programs aren’t truly manufactured, because they have zero marginal cost: it costs only as much to make a billion copies of a program as it does to make one copy. Commercial computing focuses on monolithic hardware and software, i.e. programs that try to do everything the user might need, funneling everyone’s work through that one program. That doesn’t work.
Academic computing, on the other hand, takes the perspective that if something doesn’t work the way you need it to, you rewire it, combine it with something else, or build onto it so that it will accomplish a specific task. People literally rewired computers up until twenty-five years ago, when it became cheaper to buy a new machine (if anyone can correct me on that date, please let me know). Similarly for software: if the software you have doesn’t do the job you need, you write the software that does. If you have several programs that decompose the problem, you tie them together into a workflow. Suppose you have a specific problem, even one you will only solve once, that might take you one day to program but would save you a week of doing it “by hand”: you write a program for it. Then if you ever have to do it again, you already have the program. You might also get a new problem that is very similar, so you broaden the scope of the previous program. Recently I wrote a script that inserted copyright notices with the proper licenses into a huge number of files. I had to insert the right notice, either for the GPL or All Rights Reserved, based on the content of those files. On the other hand, if you have a program that generally does what you want, e.g. edits text, and you want it to do something specific, you extend that program to do what you need.
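The copyright-notice script itself isn’t shown here, so the following is only a minimal sketch of the idea. The marker string "LIBRARY", the file names and the notice texts are all my inventions, not the actual setup: files whose content matches a marker get the GPL notice, and everything else gets All Rights Reserved:

```shell
#!/bin/sh
# Sketch: prepend the right license notice to each file based on its content.
# The directory layout, marker and notice texts are hypothetical.
set -e
mkdir -p demo/src
printf 'LIBRARY code\n'  > demo/src/lib.c
printf 'internal code\n' > demo/src/app.c
printf '/* GPL notice */\n'          > demo/gpl.txt
printf '/* All Rights Reserved */\n' > demo/arr.txt

for f in demo/src/*.c; do
    # Files containing the (invented) marker "LIBRARY" get the GPL notice;
    # everything else gets the All Rights Reserved notice.
    if grep -q 'LIBRARY' "$f"; then
        notice=demo/gpl.txt
    else
        notice=demo/arr.txt
    fi
    cat "$notice" "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
head -1 demo/src/lib.c   # prints: /* GPL notice */
```

The same loop generalizes: change the grep pattern or the notice files and it becomes the “broadened” program described above.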
Basically I see a dichotomy between two mindsets: one starts from the premise that certain people should make money, and solves problems only to the extent that solving them makes those people a lot of money; the other actually solves problems. If you disagree that this dichotomy exists, let me know and I’ll show you academic computing in action.
The solution for all these problems is teaching people to think algorithmically. Algorithmic thinking is inherent in the use of certain software, and therefore that software should be used to teach algorithmic thinking. Teaching people algorithmic thinking using Excel is fine, but Excel is not free software, and thus cannot be assumed to be available to everyone. Teaching these skills in non-computer classes will get the point across that people will be able to apply them in any job. Teaching this to high school students will give them the skills they need to streamline their work: they will be able to do more work, do the work of more people, communicate better and think through problems instead of just sitting there. People will also know when someone else is bullshitting them, trying to sell them something they don’t need. Make no mistake, I’m not saying that teaching programming will get rid of laziness, but it might make laziness a lot harder to tolerate. If you know that you can replace that lazy person with a very small shell script, then where will the lazy people work?
If you teach biology, or any field that is not “computer science,” then I urge you to start teaching your students to handle their problems algorithmically. Teach them programming! I am going to try to create a project to teach this to undergraduate students. I have in mind a Scheme interpreter tied to a graphics engine, or perhaps teaching people using R, since it has graphics included. Arrgh…Scheme is just so much prettier. Teaching them the crucial ideas behind Unix and Emacs will go a long way. Unix thinking is workflow thinking. Unix (which most often these days is actually GNU/Linux) really shines when you take several programs and link them together, each doing its task to accomplish a larger goal. Emacs thinking is extension-oriented thinking. Both are forms of algorithmic thinking.
If you are a scientist, then stop procrastinating and learn a programming language. To be successful you will have to learn how to program a computer for specific tasks at some point in your career. Every scientist I know spends a huge amount of time engaged in programming. Whenever I visit my grad student friends, their shelves and desks are littered with books on Perl, MySQL, Python, R and Ruby. I suggest learning Scheme, but if you have people around you programming in Python, then go for it. I also suggest learning the basics of how to write shell-scripts: a lot of people use Perl when they should use shell-scripts. Learn to use awk, sed and grep and you will be impressed with what you can do. The chapters of Linux in a Nutshell should be enough to get you going. Classic Shell Scripting is an excellent book on the subject. Use Emacs and you’ll get a taste of just how many things “text editing” can be.
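To make the awk, sed and grep claim concrete, here is a small example of the kind of pipeline being recommended. The log file is made up, but the pattern, filtering with grep, extracting a column with awk, counting with sort and uniq, is the classic shell workflow:

```shell
#!/bin/sh
# Made-up data: user <TAB> login result, one attempt per line.
printf 'alice\tok\nbob\tFAIL\nbob\tFAIL\ncarol\tok\nbob\tok\nalice\tFAIL\n' > logins.tsv

# Who fails to log in most often?
grep 'FAIL' logins.tsv |     # keep only failed attempts
  awk -F'\t' '{print $1}' |  # extract the user column
  sort | uniq -c |           # count attempts per user
  sort -rn                   # most failures first: bob (2), then alice (1)
```

Each tool does one small job, and the pipe composes them; this is exactly the workflow thinking the next post’s Unix discussion describes, and no Perl is required.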
Every profession today is highly data-oriented. Anybody hoping to gain employment in any profession will benefit from this sort of learning. Whether people go into farming, business, science or anything else, it will help them succeed, for several reasons. There are the obvious benefits of getting more work done, but there are also social reasons we should teach people algorithmic thinking. The biggest social reason is that teaching algorithmic thinking removes the divide between “customers” and “programmers.” This is why it bothers me to hear “open source” commentators constantly referring to “enterprise,” “customers” and “consumers.” If a farmer graduates from high school knowing that he can program his way to more efficient land use, then he will not be at the mercy of someone who wants to sell him a box containing the supposed secret. Again, you can teach algorithmic thinking using Excel, but then you already have the divide between Microsoft and the user. With free software that divide just doesn’t exist.
We are sick of education. We as a nation, a society, a world, and individually are making ourselves sick over education. I can’t speak for people in other countries, but as an educator in the United States at several levels I have repeatedly seen people make themselves physically and mentally ill over education. We need to do something about it. We don’t need to do away with school, but we need to seriously re-think what we tell people about school, our values as a society and our valuation of human life.
Another semester has come to an end, which means I’ve seen another set of young people come into my office to beg me to change their grades; I’ve seen more people crying in my office about how they need to pass a class; they’re afraid that if they don’t pass they won’t graduate; they’re afraid that if they don’t pass they are not good people. It’s as if their lack of comprehension of a model of predation or population genetics means they are not good people. Of course I want them to learn, but does that mean I want them to destroy their lives and develop a mood disorder over learning what I’ve chosen to learn? Not only do I see university students — people over the age of twenty, mostly — crying over this stuff, but some have looked on the verge of vomiting over their fear of failing a required class. After one of these meetings the other day I decided that blaming the students alone is a silly way to assess the situation. No, it doesn’t matter whose fault it is. Instead this is a sign of everybody doing something seriously wrong.
Education Problems are Our Problems
First let’s establish that if there’s a problem with education there is a problem with our entire society. I don’t just mean that getting an education is a fundamental part of our society, or that in America we really value education. I don’t mean that since education has been made particularly important in capitalist society that a problem with education is a problem with the whole society. What I mean is that if there is a problem anywhere it’s a problem with everything. It’s a problem for everyone and everything we do. Richard Stallman often reminds people that software freedom is important not because computers are an important industry (i.e. make somebody a lot of money), but because using computers is a part of our lives now. If you doubt that’s true, think of how often you make scheduling decisions based on the performance of a piece of software: do you ever schedule a meeting at a particular time because your calendar software makes it easier to do so? Do you ever leave the house later because your web browser wouldn’t load the article you wanted to read over breakfast? Do you ever leave the house or office a little later so you can download something onto a mobile device? Think about it and you’ll see that anything you do with a computer affects your whole life, just as your diet affects your whole life.
Education is the same way. If there’s a problem with our education system, the way we teach, and the reason we learn (i.e. our values) then there’s a problem with our whole lives. We can’t ever say “That’s a problem for the schools,” or “That’s the teacher’s problem.” It’s our problem. If your kid is having a problem learning or understanding why he should be learning something, it’s your problem.
Not only are students crying in my office, but there are larger societal signs of the problems created by our you-must-go-to-college-society. Consider that when I graduated from high school, I had a range of friends with varying interest in college. I wanted to go to college because I wanted to be a professor; I can’t do that without a college degree, plus a Ph.D. and (I thought at the time) a Master’s degree. But I only applied to one university. I didn’t apply to a “safe” school plus Harvard, Dartmouth, Yale, Stanford, Washington University, Scripps, Pomona, Miami University, Beaverton College, Reed College… . I had other friends who weren’t going to college at all: they had taken courses in computer networking, database administration and other topics at vocational school, or were headed to community college to take those courses. Nowadays I know people with BAs in computer science who are doing those jobs. If you can do that job with a high school diploma or a BA, doesn’t that seem a little off to people?
College-entrance-syndrome begins in kindergarten. My son will be going to kindergarten soon and I am shocked at the stuff I’ve been hearing from people. I should have known things would be different now that every toy has pop psychology jargon written all over it. Parents have told me “Oh, that’s a good school, she went to kindergarten there and she learned all her skills.” Learned what? I don’t know about you, but in kindergarten I learned about primary colors, my feelings, don’t talk to strangers, and to be nice to people. An administrator at a charter school recently told me that kindergarten is the new second grade. My brother in Texas just told me that the Dallas public schools are paying teachers to quit, while they’re spending millions of dollars on testing.
Educators automatically tell kids that they ought to go to college. I’m sure even I fell into that habit when I was working in middle and high schools. Think about it and you’ll see how weird it is. It makes sense to have had that attitude with poor kids in the fifties and sixties, and with GIs coming out of the most violent conflict in human history. In the former case you had people who definitely would benefit from the vocational and educational opportunities of college; in the latter you had trained killers already familiar with the newest technology, knowledge that would be wasted if we didn’t interest them in something before they started killing again. My father-in-law’s family is a great example of a family working hard so their children could benefit and go to college.
The question we need to ask today is “do you need to go to college to work at Hooters?” Do you need to go to college to manage a Hooters? According to College Conspiracy most college graduates end up working at jobs that require no higher education, whereas high school graduates who don’t go to college enter the workforce earlier, gain more skills and advance more rapidly than college students. Also according to the film, most college graduates who do become successful don’t believe that their college education has much to do with starting a successful business. I’m not asking about the economic value of these people or their jobs, instead I’d like to ask: are you a bad person if you choose to work as a waiter?
Also consider the futility of telling kids to go to college “just because” it will do them better in some unspecified way. One of my brothers is a musician, meaning he chooses to live in poverty. He recently told me that high school administrators learned very quickly that they couldn’t motivate him by telling him “Well, if you want to get into a good college…” He’d interrupt them and say “I don’t want to go to college, I want to play my drums.” And he does.
We need drummers. And we need farmers, and waiters, and lots of people who are still perfectly good people without going to college. My grandfather was a brilliant builder, designer and artist with a seventh-grade education. Maybe not everybody needs to go to architecture school. Maybe they need to learn trades like my friends who learned networking and database administration without going to college.
That brings up the really sick part of all of this: why we’re telling kids they have to go to college. Most often we tell them because it’s the only way they’ll “make it.” What does that mean? Well, usually when you press people they’ll tell you it means making a lot of money. Oh? Is that what it takes to be “successful?” Is money what it takes to be happy? Is money what people really need? So people are told to go to school in the long-term to make money.
I shouldn’t need to tell you that I think this not only debases education, it debases the people we tell it to. If the only way to be happy is to have a lot of money — or even a lot of prestige, i.e. becoming a doctor — and the only way to earn a lot of money is to go to school then what good is learning? What if you’re not good at school? What if you don’t need to go to school? What if you don’t want to go to school? Do you really need to be Mark Zuckerberg or Bill Gates to be a good person? Do you even need to try?
The problem in all of this is that school has become a means to an end, rather than an end in itself, at least in America. I don’t think that every kid should love going to school — I hated it — but kids should be taught the value of learning. They shouldn’t be taught that learning is only for school; they shouldn’t be taught that learning is just one thing you do sometimes; they shouldn’t be taught that the job of learning is to make you a lot of money. They certainly shouldn’t be told money is the key to happiness.
One of the first times I realized I was different was in junior high school. One of my neighbors saw me on the bus; I was reading The Extended Phenotype by Richard Dawkins, a book that in retrospect I totally did not understand, but I read it anyway because I knew I was challenging myself, and it was cool. She said “Why are you reading? School’s over.” I can’t remember my response, but I can remember thinking the question was coming from someone who had completely wrong ideas about why we learn, and I felt sad for her.
My son recently received a book as a gift and on the cover was a banner that read “Time to learn!” I wouldn’t ordinarily think this was anything weird: it’s a good message to send that learning is exciting, right? However, we don’t reserve learning for when we’re in school or while we have a book to read. Children love to learn; they don’t need to even be in school, really. They certainly don’t need to be told “Okay, now we’re going to learn.” It’s what they do. My son is learning when he’s playing with Legos, when he’s on the playground, or when he’s reading a book. My other son is learning right now watching a Neil Peart drum solo, but I didn’t say “Okay, Khalil, it’s time to learn something; there will be a quiz afterward,” before I showed it to him.
What do schools tell kids instead? A recent lunchtime conversation with an undergrad, a graduate school drop-out, a grad student/TA (me) and a professor revealed the horrible truth. Had I forgotten it? The undergrad reminded me that throughout high school, students are told that they are in school expressly so that they can go to college. My professor colleague then said “So now people are going to college just so they can get into medical school?”