His name is John Fass. He is a designer and lecturer in London. He has lived almost everywhere: Berlin, Milano, Brussels and London. A big round of applause for John Fass, "Designing Humanity".
Hello. Anyone who came here to see Thomas Fischer? I am extremely impressed that you are here to hear about the ethics of designing interfaces. My name is John. I am a teacher and lecturer in London. I run a course on interface design in London and I'm a lecturer in Kensington. I am going to talk about design, because I am a designer. I will talk extremely fast and use English idioms. This is me, send me a message. I can also do this in German. So, digital interfaces: they are a doorway into the digital world. They are the everyday activity of what I do. A lot of what we have heard at re:publica so far
is important stuff about data politics, digital politics. But it does not reach up to the interface level. It doesn't reach up to the details of designing interactions for the screen. This ethical and political view of interfaces should be more widely understood. I first want to talk about the concepts behind this idea, and how technology has a lot of unforeseen effects, good and bad. I will show how technologies, particularly interfaces, embody certain types of power relations. I will give some suggestions for how interfaces can be more moral, more ethical. This talk is called "Designing Humanity", after a philosopher called Peter-Paul Verbeek, who says that technology increasingly shapes our behavior and actions in the world, in often unexpected ways. Designing technology then means designing humanity. This view takes technology not as a functional instrument,
but as an active mediator in a set of relations. We don't see technologies as objects, but as ways to get through to our lived world. I see digital technologies as indicative of these relations we have with the world around us. Who here wears any kind of activity or fitness tracker? Is anyone wearing one of those now? Weird devices. My wife wore one until she found it too difficult climbing enough stairs to reach her goal. Weird behavior. This is a sleep tracker. I wore one of these on a headband for six months. It tracks what sleep phase you are in. I woke up every night to look at what sleep phase I was in. Ridiculous! I threw it away after that. Technology can make us behave in quite strange ways. No one is immune to this. The Holy Father himself included. Crowd behavior amplifies this effect: people wanted to take selfies with him. Technologies also shape our identity. They allow us to be different types of people and to take on different identities. The question for a designer in the digital interface context is: what types of identities does your interface inhibit or produce? Are you even aware that technologies do this? Technologies of power intersect with technologies of the self, which has a specific resonance for social media.
There has been a lot of talk about social media at this conference. This is a clear example for me, an interesting meme: a breast cancer campaign. You are not allowed to show women's breasts on Facebook, so a man's breast was shown instead. This is an interface that forbids you to show your body except in very specific forms. The power relation is a very obvious dynamic. What happens when technologies no longer merely mediate the experiences of our daily life, like the way we look at a sunset, but constitute life itself, like an Oculus Rift or a VR helmet? Then there might be really different ethical dimensions. If all of life is experienced through this technology, not simply as a passage through technology to other life, then there could be more complicated moral questions. The external world is only external to the extent that it sits on your face in the form of a helmet. If you go over to the la:boratory, everything is VR in there. I would like to see a more critical approach to how those interfaces are designed. I found it quite telling that only Zuckerberg has his head above water. I wonder what it means that there is no one in that audience who wants to take the headset off. All designers act through materials. Furniture makers use wood and steel, jewelry designers use gold or other metals.
But the materials of interface design are different: they are much less materially present. They include menus, a hierarchy of hardware (memory chips, processors) and software (desktop applications, tools, files), and, at code level, JavaScript and the code-level tools an interface designer might use. All of those technologies are holistically aligned towards an end that is functional: getting people to use the system. I say we should harness them also towards moral and political ends. I use these tools every day; designers become familiar with them. In some ways they disappear. The constraints of these materials are also implicit ethical constraints. Again, the question for the interface designer is: how do design constraints turn into behavioral constraints? And are you even aware that that's what is happening? Interface design has developed under the influence of determinism, a techno-centric way of being in the world.
Everything is going to be seamless and smooth. There is no friction. It looks like smooth Californian stuff. This illusion is hard to maintain. This is how Tinder responded when their system went down ten days ago. They said: "We're sorry, the system is broken. All those interactions have disappeared." Their interface doesn't really allow them to show what is happening: "Oh, we're really sorry. We will be back really soon." At the same time, the making of technology has changed from a response to a human necessity ("we need tools for communication") to the essential purpose of human effort. That is kind of all we do: develop technologies, often quite blindly, I'd say. In the context of digital technologies this effort is shaped by market dynamics, which don't really value reflection or critical input. Left on its own, technical determination will never provide self-correction. It can't do that.
If left alone, it must act towards its own ultimate end. The interface designer should assume some of the moral responsibility for that correction to happen; technology is never going to do it on its own. Digital systems are designed to encourage zone behaviors. Their prime motivation is to increase the number of users and monetize the actions of those users. Something is always being introduced into the system. Here you can see that the number of Tinder likes is limited. If you want more likes, you have to pay for them. The system creates its own needs, and the interface is designed to fulfil or deny these needs. It is a fairly closed system. We access the world through technology, the social world and the physical world. Not so many people are texting, that's surprising. This has a fairly significant effect on how we see the world. Our understanding of how the human brain works is profoundly shaped by magnetic resonance imaging technologies. Our ideas of what the unborn child looks like are mediated by ultrasound technologies. This influences the way we think about the world around us. Digital technologies always tend towards invisibility. Contactless payment is a huge thing at the moment. There were announcements: "Contactlessness has arrived!", as if we were all supposed to have been waiting for it. Wouldn't it be better if financial processes were more visible?
So this is a big thing, too: tapping in and out of a public transport system with your watch. This seems a strange thing to me; it conceals a lot of other complex interactions. Using your watch to access public transport has significant implications for identity recognition, data capture, algorithmic profiling, all that other stuff. The ethical and moral thing for an interface designer to do is to make those things apparent. Make them clear. Make clear what happens when you touch your wrist to that reader for a second. This is pet facial recognition, the ultimate endpoint of camera vision technology: it can recognize your pet wherever it goes. What I want to say is: digital life is life. There are not two separate spheres of moral and political action. Digital life is life. Moral and political decisions that govern our life are displaced to digital technologies, many of them developed and maintained by large corporations. We should insist that digital life is inseparable from real life. If we don't, we will always have the argument: "Don't worry, that does not have to be moral, that is only digital." This is another part of the argument that digital life is life: it needs a moral framework around it just like everyday life.
An interface designer should play a part in that argument. Technologies are always being adapted and transformed by their users; they are shaped by people. This is why hacker labs and makerspaces are very important. Technologies aren't closed, we can adapt them to our own uses, which might also be unintended. But the evaluation of the success of any design must not only include functional qualities, like how quick it is to download, but also its moral qualities, seen as a measure of how it encourages people to act and what kinds of acts it results in. That is the distinction I make between ethical and moral: ethical implies actions in the world, moral implies behavior or inner understandings. Technologies are always adapted, people do whatever they want with them. As technologies become more physically closed, with tiny screws you need specialist screwdrivers to unfasten, they also become more conceptually closed, with commercially protected algorithms we are legally not allowed to examine. Our opportunities for adaptation and transformation become more important as they become more limited. One of the questions I ask you: the movement to take things apart is well established, but why are interfaces so rigid, so unchangeable and persistent? Interfaces seem to be the last thing in this argument. In India, people use the missed-call function
as an important signifier. Instead of using up your expensive minutes, you don't answer the call; the missed call itself carries the meaning. This is what I call socio-technical behavior, and it has been exploited by political parties. Our assumption about human activity is that people will answer their phone when it rings; the answer button is designed into the interface. In fact, people do different things with technology. The decision about what to include shapes the possibilities for human action and human agency. This frames the design decision as a moral decision: what does the system allow us to do, and what is possible? One idea is repurposing stuff. Here they took the battery out of their phone and used it as a rechargeable cell. Can we build these possibilities into the interface itself? Not into the physical object, but into the interface itself: this idea that it could be taken apart. This is the crystal meth drone. There are unintended consequences for the things we put in the world. Once the design is packed into something, it is difficult to change it. This is why open technology is so important. How does this manifest itself? Dark patterns. I'm sure some of you know about these: defaults that an interface designer sets up to deceive people.
Many of them are part of our everyday experience. Part of the design process is anticipating how people will use any particular designed object, virtual or physical. Designers then build in prescriptions for how things should be used. Some actions are invited, some are discouraged. The ethical task is to find a balance between those two things. I'm not saying these things should be encouraged or prohibited, but there should also be a moral balance, not only a functional one.
Dark patterns rest on defaults. If you set the default for organ donorship to opt-out, you get a far greater uptake, because people have to actively choose not to be an organ donor. In every country where this is done, uptake is larger. Oxfam is a good organization, one we can support; it distributes aid all around the world. But look, they have set a default: it is not a one-off donation but a monthly donation. It would be pretty easy to overlook that, and suddenly you are making a regular monthly donation. I'm not saying that this is a bad idea, we should probably all do that. But this default is morally questionable, and if Oxfam is doing it, you can bet most other people are doing it, too. Some people have bought an iPad and, at the last moment, an iPad case has been slipped into the order. A classic example of a dark pattern: when you receive your iPad, it comes with the case and you pay for it. This is the UK postal service. How many of you have seen this kind of thing? This one is especially tricky: the first two checkboxes you tick if you don't want to receive mail, the next two you tick if you do. You have to read the text carefully and click all the correct boxes, otherwise you get a lot of junk. Lots of websites do this.
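To make the mechanics of a dark-pattern default concrete, here is a minimal sketch in TypeScript. It is my own illustration with assumed names, not Oxfam's or anyone else's real code: the only difference between the two forms is which value the checkbox starts with, yet the same inattentive user ends up committed to something they never chose.

```ts
// A pre-selected default turns inaction into a commitment.
type DonationChoice = { amount: number; recurring: boolean };

// Dark-pattern form: "make this monthly" starts ticked, so a user who
// skims past the checkbox is silently signed up for a recurring payment.
function submitWithDarkDefault(amount: number, userUntickedMonthly: boolean): DonationChoice {
  return { amount, recurring: !userUntickedMonthly };
}

// Honest form: recurring giving is strictly opt-in; nothing happens
// unless the user actively chose it.
function submitWithHonestDefault(amount: number, userTickedMonthly: boolean): DonationChoice {
  return { amount, recurring: userTickedMonthly };
}

// The same user, who touches nothing, gets two very different outcomes.
console.log(submitWithDarkDefault(10, false));   // { amount: 10, recurring: true }
console.log(submitWithHonestDefault(10, false)); // { amount: 10, recurring: false }
```

The ethical question sits entirely in that one default value, which is exactly why it is so easy to hide.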
Here is a game, Two Dots, quite fun to play. It was Four Dots then. Over time the interface trains you to react in a certain way, to click an interface item, in this case the green button. It goes from left to right to start a new game. Suddenly you are not playing but paying, with no real indication that the interaction has changed. If you play this game a lot, you are going to end up spending money. This is a famous example: Ryanair, better known as the devil. They tried to get you to buy insurance. In order not to buy insurance, you have to scroll through a country list to a spot between Denmark and Finland, to "D" as in "Don't give me insurance". Tricky thing, I don't know who designed that. Very sneaky, very unethical. A lot of stuff is not visible at all. Even if you did scroll down between Denmark and Finland,
they have a lot of sneaky stuff. A few examples here. Most are the same kind of example: hidden geolocation. Here is an emoji input, a famous case from the Google Chrome store. You get wonderful emojis, but hold on: it reports your location every ten minutes. They snuck that into the app without letting anyone know. It turns out the main purpose of this is not to give you emojis but to tell advertisers where in the world you are. Here is a weather app, but hold on: it reports your location at incredible frequency, with no reason to do that. Who of you uses Instagram? Have you turned geolocated images off? You probably have; Instagram doesn't care about that. They are geotagging you whether you have geotagging turned on or not. Very sneaky, very immoral. They shouldn't do it. Finally, a famous case: the torch app, which reports where you are to advertisers around the world. So designers can use technology to persuade, for political advocacy as in the case of Oxfam. They seduce people into buying things or into doing good things in the world, enforcing certain ways of acting; you see this all the time. Lots of platforms force you to give your name. I was looking at Uber yesterday; I would never sign up there. You are not allowed to use Uber until you've given them your full phone number, name and credit card number. That is the first thing they ask you for, before you can create an account. Airbnb has a very complex system of feedback: "Don't worry, only we see your data." I don't know who can be sure of that. Those technologies use invisible power relations. They do this all the time.
One job of the interface designer is to bring them to light, to make you see them. How could things be better? A few clear, easy ways. This is quite a new field, I think, but here is voluntary regulation: codes of ethics that are explicitly for interface designers. The DIA is an Australian design institute with an ethics code for user experience. This kind of thing is known in other fields, so why not in interface design? This is what I would call resistance. When you're being invisibly tracked through your browser or your cookies, you can use many programs to help yourself. There are tools explicitly designed to counter forced actions, stopping apps from taking data from you. My argument is that this is a category of ethical design that takes ethics and morality explicitly as its subjects. So people who design these things interrupt this way of tracking with something that they think is better. I support this kind of thing. This is an intervention directly at the interface. The first one is "Kayak", it's an app and it says: "These are Kayak's privacy settings. Do you want to do this?", and these pop up while you are using the system. The second one, as you can see, is "privacy protection". It will work in any app.
You can protect yourself by faking your ID. With the map one you can fake your location: you can drag that pin anywhere in the world you want to be. This is a good one for Instagram users. You might be in Berlin, but you say you are in Santa Monica. Quite a clever idea, I think. The fourth one: Facebook was discovered to track people who visit the site without being members. It is a way of giving false names to the system, so that it ends up tracking imaginary people and is deceived.
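As a rough idea of how such location-faking tools can work in a browser, here is a minimal sketch in TypeScript. It is my own illustration, not the implementation of any of the apps shown: a script simply replaces the standard geolocation calls so that every app on the page receives a position you chose (the Santa Monica coordinates are just the example).

```ts
// Fake position handed to anything that asks "where is this user?".
const fakePosition = {
  coords: {
    latitude: 34.0195,    // Santa Monica, not where you actually are
    longitude: -118.4912,
    accuracy: 20,
    altitude: null,
    altitudeAccuracy: null,
    heading: null,
    speed: null,
  },
  timestamp: Date.now(),
} as unknown as GeolocationPosition;

// Override the browser geolocation API so callers get the fake answer.
navigator.geolocation.getCurrentPosition = (success) => success(fakePosition);
navigator.geolocation.watchPosition = (success) => {
  success(fakePosition);
  return 0; // a dummy watch id
};
```

Real tools do more than this, of course, but the principle is the same: put your own layer between the interface and the data it tries to collect.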
I'm not the only one thinking about this, and I'm not the best at presenting it. The designer Gabriel White has been doing interesting work. He says that designers should provide feedback on how much time people spend on a site, and on how the system elicits changes in usage patterns over time. So when I start using an app, I might use it once a day or something. As it starts to shape my behavior, my usage of that app will change. I will play a game more often, or whatever. The interface should explicitly show that: how has this system started to shape my behavior? You could also compare the way you use a system to the way other people use it.
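Here is a minimal sketch of what that kind of feedback could look like, my own reading of the suggestion rather than Gabriel White's code: log each time the app is opened and let the interface report how your usage has drifted since the first week.

```ts
// Store one timestamp per app launch and summarise the drift.
const STORAGE_KEY = "appOpenTimestamps"; // illustrative key name
const WEEK_MS = 7 * 24 * 60 * 60 * 1000;

function recordAppOpen(): void {
  const opens: number[] = JSON.parse(localStorage.getItem(STORAGE_KEY) ?? "[]");
  opens.push(Date.now());
  localStorage.setItem(STORAGE_KEY, JSON.stringify(opens));
}

function usageDriftMessage(): string {
  const opens: number[] = JSON.parse(localStorage.getItem(STORAGE_KEY) ?? "[]");
  if (opens.length === 0) return "No usage recorded yet.";
  const firstWeek = opens.filter((t) => t < opens[0] + WEEK_MS).length;
  const lastWeek = opens.filter((t) => t > Date.now() - WEEK_MS).length;
  return `First week: ${firstWeek} opens. Last seven days: ${lastWeek} opens.`;
}

// Called on every launch; the message would be shown inside the interface itself.
recordAppOpen();
console.log(usageDriftMessage());
```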
We should design features so that users are not encouraged into repeated and obsessive actions; Candy Crush is a great example. The ethical thing to do as a designer is to consider the behavioral impact. Have you deliberately designed it for addictive behavior? The speaker after me has done research on this topic too, about how interfaces enforce obedience of a certain kind. So don't miss the talk after this one. I think you are secretly locked in here anyway. So, all of this stuff can be built into interfaces. What I want to end with is a call to designers, like me, though I'm a teacher as well. In my design work, it is head-down stuff. It's difficult to lift my head up and explain what I do and how I do it. My call is that we should do that: consider the implications for humanity and behavior in all the stuff that we design. Thanks for not falling asleep, thanks for listening. I'm happy to talk to you if you want to know more about this. Thanks a lot! (Applause)
John Fass, thank you very much.