These days one can hardly throw an iPhone 7 (don’t, actually) without hitting a “learn to code” initiative aimed at young people. The Hour of Code, part of Computer Science Education Week, encourages every student in America to explore programming. Code.org teaches basic concepts using Star Wars, Frozen, and Angry Birds. Scratch, a popular site developed by MIT, opts for a dancing cat.
These efforts are impressive, and my own kids have benefited from some of them. Widespread knowledge of programming is necessary in our economy, and schools are probably doing too little to teach it. Yet the swell of enthusiasm threatens to drown out some fundamental questions of ethics and values.
In his blog post “Coding for What?”, English educator Ben Williamson asks how all this technical training relates to a broader sense of digital citizenship.
The reality . . . is that coding in the curriculum, and many other learning to code schemes, have tended to overemphasize either economically valuable skills for the software engineering sector, or high-status academic computer science knowledge and skills. There has been far too little focus on enabling young people to appreciate the social consequences of code and algorithms.
Williamson finds throughout the tech sector a narrow focus on technological sophistication for its own sake. Code, of course, is neither good nor bad; its virtues depend on the ends for which it is used.
Williamson’s post was no doubt sparked by the grim consequences of the ends that we have chosen, intentionally or unwittingly. After the presidential election, we learned how Facebook and other social media sites used algorithms to entice users into a maze of fake news. The ridesharing pioneer Uber had its engineers create a fake app (code-named “Greyball”) to deceive government regulators. And these specific examples are minor compared to looming concerns about privacy, discrimination, and equal access.
If we want better answers to the “for what?” question, we may have to turn to the humanities. In WIRED, Stanford computer science doctoral student Emma Pierson laments the myopia of her field:
I’ve watched brilliant computer scientists display such woeful ignorance of the populations they were studying that I laughed in their faces. I’ve watched military scientists present their lethal innovations with childlike enthusiasm while making no mention of whom the weapons are being used on. There are few things scarier than a scientist who can give an academic talk on how to shoot a human being but can’t reason about whether you should be shooting them at all.
We have ethics training for aspiring doctors and lawyers, Pierson observes. Why not for young people learning to code?
She knows an important reason why not: “Diverse worldviews can produce argument.” In our age of polarization, the bloodless precision of programming has an undeniable appeal. Code isn’t liberal, conservative, Democratic, Republican, Christian, Muslim, or atheist. The engineers of Facebook’s News Feed, writes Farhad Manjoo, “are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth.” When we ask the “for what?” question, we move to a qualitative realm that quickly becomes ambiguous and contradictory. How much easier to pretend that we can engineer that realm out of existence.
In the end, though, the neglect of values, the pretending, is bringing not techno-utopia but an increasingly impoverished public culture. Fake news and fake apps make a travesty of our democratic system of government. Tech workplace cultures that are toxic for women and people of color (Uber again) belie our commitment to equal opportunity. In a world where technical questions are inevitably political and social questions, we have to start asking not only “how to code?” but “coding for what?”