A door meant to be pushed will never be pulled. No more than three indistinguishable buttons or switches will ever share the same space. Internet pathways will always offer clear direction toward the desired outcome. The number of affordances in a given artifact will be comparable to the number of constraints, so that the path of use is plainly outlined in the user's cognitive map. User manuals will become obsolete because the use of any object or system will be intuitive.
Design does not discriminate against any individual on the basis of gender, race, religion, sexual orientation, or disability. This does not mean, however, that all artifacts are created assuming that every demographic of potential users has the same use cases for them. Because sociotechnical contexts surround any object or system, designers think actively about each type of user before finalizing its structure. For example, the classic analysis of gender and cockpit design reminds us of the anthropometric differences not only between men and women but also among users of differing age, ability, and ethnicity. Having learned this lesson, design enables users to make modifications to meet individual physical and psychological needs (Weber 373).
Semantically, too, designers address as diverse an audience as possible. If individuals find their identity unrepresented in a menu of selections, or their intended use unsupported by a digital network, they can alter the code so that their differences become less marginalized. The next user to encounter the system would then be included within the confines of its use. The word “other” would cease to exist.
In the Utopian world of Values Embodied in Information Technologies and Digital Media, computer systems are both moral entities and moral agents. Computers are moral agents because designers and programmers do not create systems with an eye only toward efficiency and usability. Rather than considering only the best way to accomplish a given task, Utopian computer systems objectively weigh the potential outcomes. While this type of control coming from a computer may seem problematic, the morality engine would be the result of a collaborative project, allowing experts and lay people from varied fields to have a say in the values embedded in a system--values that directly shape how the system operates.
Since systems are created with a particular moral bias, users sacrifice some of their agency to assimilate to the system's design. This is not truly a sacrifice, though, as the values of computers are neither totalitarian nor puritanical. Human agency exists across an extremely wide spectrum of socially acceptable behavior. The morality of systems does not stuff users’ ambitions into tiny prescribed boxes; it creates a vast world of available actions, all of which can be broadly defined as, if not “good,” at least “not bad.” Moreover, the moral choices associated with a technology are easily understood and communicated through the technology itself, so as to promote iteration, development, and change. Because technology is co-constituted with those who use it, an easy understanding of the moral choices in its design is necessary.
Built-in morality designed by a large group of humans also guards against the danger of technological determinism, which “blinds us to the forces that shape the direction of technological development and discourages intervention” (Johnson, 204). The morality exhibited by computers is unmistakably human-made, thus “the design of computer systems...come[s] into the sights of moral scrutiny” (Johnson, 204). In this Utopia, computers have morals, but their morals spring from a constantly evolving human source. The helpless feeling of technological determinism is completely absent because everyone knows where the buck stops and how easily the system can be updated for changing times.
Powerful private organizations, governments, rich citizens, and poor citizens experience a completely level playing field in the online world. User traffic and web content are handled exactly the same way no matter the source.
A leading factor in online neutrality is an intellectual property system that relies on public taxes and donations rather than an outmoded one-to-one system in which each song, movie, or other creative work carries an individual price tag. Since piracy is not a concern, ISPs and other gatekeepers have no incentive to throttle access speeds based on “illegal” content transfers. With no illegal uploads or downloads, and with the creators of the digital space giving no special treatment to the rich and powerful, there is no justification for digital access to vary in any way based on content, source, or destination.
The Platonic ideal is both unattainable and incoherent. No perfect whole or ultimate solution exists, and a society that pursues such a solution cannot be a happy or just society. Pursuing a perfect solution ignores the pluralism of values that the many rational members of a society have. Even though we may criticize or disagree with the values of others, we all share some common values as rational human beings. These commonalities make it possible for us to understand each other.
When clashing values (between constituent groups, between individuals, and even within an individual) come to the fore in technological design, designers will not pursue their ideal while ignoring the values held by other members of society. Instead, they will consider all competing values and then make informed choices. In choosing one thing, something else will inevitably be lost.
These trade-offs will not be dictated solely by subjective judgment. Designers will strive to maintain equilibrium in society and avoid creating an intolerable situation for some members of society. Those values that are commonly held by the majority of mankind cannot be traded for others. Furthermore, designers will use sound judgment in making practical decisions between values. They will analyze problems, evaluate what principles apply, and seek guidance from relevant disciplines in order to make rational decisions.
The "right to be forgotten" is implemented correctly and respected by all parties.
Users are given control over their own data and the ability to share it or hide it on their own terms.
Digital locks are applied only enough to give inventors and creators the incentive to work. Digital Rights Management is not so pervasive that it curtails the creativity of the general public. All users will be given easy “access to information, freedom of expression, privacy, encryption research, freedom to tinker, education” (Kerr, 253), along with every incentive to respect the rights of authors, designers, small businesses, and educators. The result will be maximum market efficiency.
If someone encounters a lock and has the knowledge to disarm it, the legal repercussions will be equal or lesser in severity to the harm done to the original owner of the information in question. The mere act of unlocking is not in itself a crime; the crime lies only in reappropriating any information behind the locks in ways that disenfranchise the individual or corporation who put it there.
Control and agency on the net should be equally distributed among those who have the education to wield the data appropriately. Rights and protections should not be given based on money, market standing or political affiliation.
Users are sufficiently educated and motivated to practice good security habits. Additionally, providers and producers design with security in mind instead of cobbling it together out of a fear of bad media exposure.
Information--from medical records to books borrowed from libraries--is easily accessible no matter the part of the country or the brand of IT hardware or software used. Proprietary file types do not exist. If information exists and is both relevant to and the property of a user, that user can access it no matter what.
Encrypted personal online lockers store each individual’s information. Users choose if and when to access any of their information--no one is forced to confront data he or she does not wish to access. Since all of the files use standardized open-source software, this requires no extra effort on the administrators’ side either. After a short period of adjustment and extra clerical work, the system was successfully put into place, and it now operates at no inconvenience to governments, doctors, small business owners, or any other data holder. When personal information is generated about an individual, it manifests in a standard form and waits in a personal locker that the user can access, or to which the user can grant temporary access--to doctors or the like--if and when he or she deems necessary.
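The owner-controlled, time-limited access that such a locker provides can be sketched in a few lines. The following toy model is only an illustration of the access-control logic, not a real implementation: base64 stands in for actual encryption (it provides none), and all names (`PersonalLocker`, "alice", "dr_smith") are invented for the example.

```python
import base64
import time


class PersonalLocker:
    """Toy model of a personal data locker with owner-issued grants.

    NOTE: base64 is a placeholder, not encryption; a real system
    would use an authenticated cipher and proper key management.
    """

    def __init__(self, owner):
        self.owner = owner
        self._records = {}   # record name -> encoded bytes
        self._grants = {}    # (grantee, record name) -> expiry timestamp

    def store(self, name, data):
        self._records[name] = base64.b64encode(data)

    def grant(self, grantee, name, seconds):
        # The owner issues a time-limited grant for one record.
        self._grants[(grantee, name)] = time.time() + seconds

    def read(self, requester, name):
        # The owner always has access; others need an unexpired grant.
        allowed = (requester == self.owner
                   or self._grants.get((requester, name), 0) > time.time())
        if not allowed:
            raise PermissionError(f"{requester} has no access to {name}")
        return base64.b64decode(self._records[name])


locker = PersonalLocker("alice")
locker.store("x-ray", b"scan data")
locker.grant("dr_smith", "x-ray", seconds=3600)
print(locker.read("dr_smith", "x-ray"))  # grantee reads within the window
```

The design point the sketch makes is that access decisions rest with the data's owner, not the data holder: a grant is explicit, scoped to one record, and expires on its own.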
Law and Control
Online and other connected systems are not only designed to embody the sorts of values noted elsewhere but also contain sufficient awareness of the value systems at play to flag value conflicts in protocols, data routing, legal ambiguities, and the like for proper adjudication and negotiation in the offline world. Should emergent biases develop over time as value systems and technologies shift, they are exposed rather than merely propagated.
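The idea of flagging value conflicts rather than silently resolving them can be illustrated with a small sketch. Everything here is a hypothetical, invented representation: system components declare which values they promote or restrict, and a checker surfaces disagreements for offline adjudication instead of deciding the winner itself.

```python
def find_value_conflicts(components):
    """Return (value, component_a, component_b) tuples where two
    components take opposite stances on the same declared value.

    components: dict mapping component name -> {value: stance},
    with stance one of "promote" or "restrict".
    """
    conflicts = []
    names = sorted(components)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            shared = components[a].keys() & components[b].keys()
            for value in sorted(shared):
                if components[a][value] != components[b][value]:
                    # Flag the disagreement; resolution happens offline.
                    conflicts.append((value, a, b))
    return conflicts


# Illustrative declarations: the routing protocol restricts privacy
# (e.g., by logging), while the user-facing policy promotes it.
components = {
    "routing_protocol": {"privacy": "restrict", "speed": "promote"},
    "user_policy": {"privacy": "promote"},
}
print(find_value_conflicts(components))
```

The checker deliberately stops at detection: consistent with the passage above, conflicts are exposed for human negotiation rather than papered over by a built-in tie-breaking rule.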
Attempts to control the behaviors of users via design (architecture), rather than via fiat or some other modality (Lessig), should be carefully considered. While delegating control functions to technological design is attractive in that it circumvents some of the enforcement difficulties inherent in other modalities, it may also prevent a user's conscious reflection on the value of various options for action, and can unduly restrict the user's agency. As such, whenever possible, technologies should be designed to permit a wide range of user actions, and where decisions are structured or "nudged" by the technology, this fact and the reasons for it should be made as transparent as possible.