A couple of weeks ago, I was invited to a small workshop at Queen's University with some other journalists. We met researchers there working on issues involving technology, surveillance, privacy and the interface between human and machine. It was a lot of fun.
Coming away from that seminar, though, I was reminded of a book I'd read a few years ago by Andrew Feenberg called Questioning Technology. Feenberg was then a philosophy professor at San Diego State University, but I'm pleased to see that he's since been lured up to the Great White North and now holds a Canada Research Chair in the Philosophy of Technology in the School of Communication at Simon Fraser University in Vancouver.
At Queen's, we got into a discussion (ok, I kinda dragged everyone there into this topic) that touched on what I'll call the perceived neutrality of technology. When people say technology is neutral they're really making the “guns don't kill people, people kill people” argument. Scientists and techno-enthusiasts will often dress that argument up a bit, saying that tools, methodologies, inventions — technology — are valueless and that they acquire values — good, evil, helpful, hurtful — only from use. A knife is just a tool. But use it to cut your veggies, it becomes a good knife. Use it to kill, it is a bad knife.
I'm not so convinced of that view. I think technology, more often than not, has a value system built into it. I think Feenberg sees it the same way:
… the democratic movement [used to] g[i]ve its fullest confidence to the natural processes of technological development, and it was only conservative cultural critics who lamented the price of progress. The Ruskins and the Heideggers deplored the dehumanizing advance of the machine while democrats and socialists cheered on the engineers, heroic conquerors of nature. However, all agreed that technology was an autonomous force separate from society, a kind of second nature impinging on social life from the alien realm of reason in which science too finds its source. For good or ill, technology's essence – rational control, efficiency – ruled modern life.
But this conception of technology is incompatible with the extension of democracy to the technical sphere. Technology is the medium of daily life in modern societies. Every major change reverberates at many levels, economic, political, religious, cultural. Insofar as we continue to see the technical and the social as separate domains, important aspects of these dimensions of our existence will remain beyond our reach as a democratic society… [p. viii] … insofar as democracy challenges the autonomy of technology, the "essentialist" philosophy of technology around which there used to be such general consensus, is challenged as well. … [p. viii]
…technologies are not merely efficient devices or efficiency oriented practices but include their contexts as these are embodied in design and social insertion. The contexts of technology include such things as its relation to vocations, to responsibility, initiative, and authority, to ethics and aesthetics, in sum, to the realm of meaning. [p xiii]
… the notion of the “neutrality” of technology is a standard defensive reaction on the part of professions and organizations confronted by public protest and attempting to protect their autonomy. But in reality technical professions are never autonomous; in defending their traditions, they actually defend the outcomes of earlier controversies rather than a supposedly pure technical rationality… [p 89]
You can read the preface to the book, from which I've quoted several chunks, here.
If you like that, by the way, I'd also recommend We Have Never Been Modern by French philosopher Bruno Latour. I can't say I'm smart enough to understand everything Feenberg and Latour are talking about, but their writing seems to start firing some critical-thinking neurons in my brain, and that, I suppose, is what good philosophy is supposed to do.
I have to agree 100% with this statement. Technologies exist, or are developed, to fill a certain need or requirement of society, for better or for worse. As such, technology is not inherently innocent or neutral.
Take a gun, for instance (I use this example not only because it's so frequently used, but because it's one most people can understand), and the argument you quoted earlier in your post. Guns don't kill people, people kill people. Wrong. Guns kill people when in the hands of people who intend to kill other people. The gun becomes the vessel through which the intent of the person is projected.
Why was a gun designed? In the larger scope of things, what is its purpose? To kill, to maim, to injure, to subdue. A more efficient means of defeating your opponent in combat. In general terms, these actions are negative (evil) actions. Therefore, logic follows that to design and create an object through which such actions are made simpler makes said technology negative (evil). Or, in the context of your article, David, value-inherent.
I will agree, however, that there are neutrally intended technologies. Items such as the axe, which was designed to fell trees but which could also be taken up and used in negative (evil) action. Items such as the clock, which performs a very simple and neutral task in and of itself, but which, if combined with certain psychological factors and intent, can be turned into something nefarious, as difficult as that might be. In the end, neutral intent is not neutrality of technology.