[00:00.000 --> 00:11.200] So, hello. Welcome, everybody. I'm announced as Winfried Tilanus, but I should also announce [00:11.200 --> 00:21.480] Emily Tromp. We did this project together, and unfortunately she couldn't make it today [00:21.480 --> 00:27.880] because of problems with public transportation. So she's now at home watching this presentation, [00:27.880 --> 00:36.800] and I hope I can make her proud. I'm Winfried Tilanus. I'm just a privacy expert, and I usually [00:36.800 --> 00:47.880] make totally dull, badly designed presentations. Emily is a designer, really working with design [00:47.880 --> 00:56.760] thinking, and she really likes to reframe things. And she's also the one who is responsible for [00:56.760 --> 01:10.400] this really great presentation. I'm totally happy with it. So, value-driven design. This is [01:10.400 --> 01:18.160] a case study. It has been a really successful privacy-by-design project, but we [01:18.160 --> 01:24.560] did a lot wrong and we made a lot of mistakes. Fortunately, we learned from them and we went on. [01:24.560 --> 01:35.440] So this presentation will be the story of that. But before I really dive deep into it, [01:35.440 --> 01:42.800] I want to ask you a question. When you think about privacy, which concept do you think is most [01:42.800 --> 01:50.720] closely related? Is it about the requirements the technicians have to meet to build the system in a [01:50.720 --> 02:00.360] private way? Is it about the trust of the end users? Or is it about agency, about the ability of [02:00.360 --> 02:10.600] end users to make their own choices? So, what do you think? Let's make a little poll of it. I hope it [02:10.600 --> 02:27.800] works. This is the first time I use Nextcloud for a poll. It goes to a domain totally owned by me, [02:27.800 --> 02:32.000] so you have to trust me. So you already answered the question. [02:57.800 --> 03:12.680] It's also on IPv6, that might be a problem. Are you connected to the dual stack? [03:12.680 --> 03:42.040] Yeah, let's... Wait. Yeah. So, who thinks it's about requirements? [03:42.040 --> 03:57.000] I see one hand. Who thinks it's about trust? Alright, a little bit more than half. Who thinks it's about [03:57.000 --> 04:11.640] agency? Quite close to each other, but requirements are at the bottom of the list, I believe. Alright. This is [04:11.640 --> 04:21.240] what I will be talking about: the design challenge, the thing we had to design; then three perspectives [04:21.240 --> 04:32.640] on privacy; the approach we took in the design project; and what we learned from it. So, the [04:32.640 --> 04:45.040] design challenge was promoting agency with an open-source ecosystem for e-health. In one line. [04:45.040 --> 04:52.680] There are some things that are really important here. First of all, it had to be [04:52.680 --> 05:01.800] about a digital open-source exchange standard. What was envisioned were all different [05:01.800 --> 05:08.360] kinds of e-health apps and all kinds of health professionals, all working together in unpredictable ways [05:08.360 --> 05:19.240] and unpredictable combinations. And they had to cooperate. So we had to find a way to make [05:19.240 --> 05:26.360] a standard that lets them communicate. And the other starting point of the project [05:26.360 --> 05:32.800] was user agency. The idea was: when people have control over their own path in the [05:32.800 --> 05:38.160] health care system, or in the care system, then they make sane choices.
They choose what's [05:38.160 --> 05:48.240] good for them. And that makes the health care system better and better affordable. So agency [05:48.240 --> 05:53.920] was not only just because we like to be ourselves and to have the possibility to be ourselves. [05:54.120 --> 06:01.440] It was really at the heart of the project of the question. Another part of it is, I already [06:01.440 --> 06:13.480] told a bit, it's really about the ecosystem of care. I see these are the wrong way. Ecosystem [06:13.480 --> 06:21.560] of care applications and blending with health applications. So more informal care, more [06:21.640 --> 06:30.000] social work comparing with really hard medical care. And these have different security levels. [06:30.000 --> 06:34.840] And they need different levels of trust. And trusting your neighbor is something else than [06:34.840 --> 06:49.000] trusting your doctor. So we all had to design for this. Well, then we started and we discovered [06:49.080 --> 06:55.400] that there were three perspectives on privacy that were really causing havoc in the project. [06:55.400 --> 07:04.120] Confusion, misunderstandings, everything. So the three perspectives was really a theoretical [07:04.120 --> 07:09.080] perspective. I told I'm the privacy expert. We like to make dual presentations with lots [07:09.080 --> 07:16.120] of text. And I really like to think about what means privacy, how does it relate to agency. [07:16.200 --> 07:25.400] And I came up with talking with peers on a relational view on privacy. And that relational [07:25.400 --> 07:33.000] view is really focusing on the choices we make. I'm standing here in certain clothes, [07:33.000 --> 07:37.920] revealing certain things about myself, not revealing other things about myself. Those [07:38.000 --> 07:47.000] choices about how we feel create how I relate to you and how I appear to you. And also that [07:47.000 --> 07:52.760] creates a relation. And we all make these choices. Just look around and see how we are. [07:52.760 --> 07:59.720] There's no process. We do this all the time. And by these choices we create agency. By these [07:59.720 --> 08:06.200] choices we can determine who we are and who we are in certain contexts. So that's really [08:06.200 --> 08:17.400] the theoretical sort we had about it. And then we had the end user perspective. And we really [08:17.400 --> 08:25.480] went into neighborhood centers, talked with users, gave them an app, asked how do you experience [08:25.480 --> 08:34.600] this. And they were totally somewhere different. What's this? This is complicated. Don't know [08:34.640 --> 08:43.120] how to handle it. And I have to enter this data. Why? And where does it go? And I don't [08:43.120 --> 08:53.200] trust this thing. And then we were sitting at the table with developers from the companies [08:53.200 --> 08:58.280] making the ELs and the care apps. And they were asking, well, tell us what to do. What [08:58.280 --> 09:13.520] are the requirements? So they're all talking different languages. So we had to find a way [09:13.520 --> 09:21.480] to get on, to get this big misunderstanding solved and to get something that got everything [09:21.480 --> 09:29.600] together. And we had to test it with users. It might be a problem because we were creating [09:29.600 --> 09:37.000] an abstract exchange standard and not an app or something like that. So we had to bridge [09:37.000 --> 09:44.040] a gap. And we took a step back. We're thinking, what are the common values? 
What are they [09:44.040 --> 09:48.560] talking about? What are the things that they share and that they all want to have in the [09:48.600 --> 09:56.640] system? And that's where the trust and the agency came in. That's where we noticed [09:56.640 --> 10:04.080] that users really wanted to experience that they could trust the system, and to experience [10:04.080 --> 10:10.520] that they could make choices. And the developers, well, they're not bad guys. They don't want [10:10.520 --> 10:17.240] to violate everybody's privacy. They're working on health apps, so they know that it's important. [10:17.240 --> 10:24.640] So when you talk with them about what's needed in such an ecosystem, they understand very well [10:24.640 --> 10:29.840] that trust and agency matter there too: it's really important to give people choices and to make sure [10:29.840 --> 10:38.120] that you can be trusted. So we translated these common values into design [10:38.120 --> 10:49.200] principles; I'll tell you more about that later. And, of course, I was there too, and I helped to [10:49.200 --> 10:57.640] create them, gave my own input to the design principles. So that's how we came to a set [10:57.640 --> 11:05.920] of design principles that bridged this gap. And then we came to the next problem, [11:05.960 --> 11:19.200] I already mentioned it: how to user-test. Yeah, how to user-test a standard, an abstract [11:19.200 --> 11:28.040] thing that can have totally different faces in all different kinds of apps. And, you know, [11:28.040 --> 11:32.200] when we literally went to neighborhoods and talked about trust and agency, everybody [11:32.200 --> 11:45.120] was like: what are you talking about? So we came up with a privacy game. It was a small [11:45.120 --> 11:53.920] game, a small simulation, and we used it to envision abstract scenarios. We made a [11:53.920 --> 11:59.520] scenario like: you're in this situation and you want to share something; with whom [11:59.560 --> 12:05.760] do you want to share it? Or: you're in this situation, you're talking to this person; [12:05.760 --> 12:11.840] what do you share, what don't you share? Then came the twist of the game. You turned the [12:11.840 --> 12:19.080] page around, and there was a new part of it. For example, somebody else seeing, or somebody [12:19.080 --> 12:27.080] changing, the information that you maybe wanted to share. And based on that, we got into [12:27.360 --> 12:33.440] discussions and talks with the users about how they made their choices, and whether we could validate [12:33.440 --> 12:40.120] the principles we had formulated. So by creating and playing this game, [12:40.120 --> 12:50.320] we really made it possible to see how the abstract scenarios and the abstract [12:50.400 --> 12:58.360] design principles worked out. So what did we learn? What are the things we [12:58.360 --> 13:11.960] got out of this? First of all, trust is an iterative process. Even when you [13:11.960 --> 13:18.440] go to the doctor, it would be totally strange to have a sign at the door of the doctor's office: [13:18.480 --> 13:26.920] please undress in the waiting room. You know, you first want to make contact, want [13:26.920 --> 13:34.600] to see: well, is this a doctor I like? And then slowly you start to trust the doctor, [13:34.600 --> 13:41.360] and then slowly you say: well, now it's appropriate to undress. And the same happens between a [13:41.400 --> 13:50.000] human and a machine, when you interact with an app. There are many [13:50.000 --> 13:57.640] apps where, on the first page you see, you have to create an account and tell everything [13:57.640 --> 14:02.200] about yourself. And you're looking at it: well, why should I trust this app with all [14:02.200 --> 14:07.680] this information? Maybe the app should first show that it's useful and that there's some nice interaction [14:07.720 --> 14:20.160] with you. The next thing is reciprocity of the information, of what happens: I share, you [14:20.160 --> 14:26.040] share. It's not one-way, where one party wants to know everything about you and doesn't give [14:26.040 --> 14:32.120] you anything back, doesn't show what it does with it. Maybe you say: I want to donate information [14:32.120 --> 14:39.120] about myself because it can help somebody else. But then it has to be made visible; it has to be [14:39.120 --> 14:49.640] traceable. So really: I share, you share. People are willing to trust if there's balance.
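As an editorial illustration of that reciprocity principle (a minimal sketch under assumptions; this is not code from the project's standard, and all names in it are hypothetical), an app could keep a disclosure ledger the user can open at any time, so every use of their data stays visible and traceable:

```python
# Hypothetical sketch of "visible and traceable" sharing, not the project's
# actual standard: the app keeps a ledger the user can open at any time,
# listing what left the app, to whom, and why.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class DisclosureEvent:
    item: str        # which piece of data was shared, e.g. "blood_pressure"
    recipient: str   # who received it, e.g. "gp_practice_app"
    purpose: str     # the why, phrased in end-user language
    timestamp: str


@dataclass
class DisclosureLedger:
    events: list[DisclosureEvent] = field(default_factory=list)

    def record(self, item: str, recipient: str, purpose: str) -> None:
        """Append an entry every time data leaves the app."""
        self.events.append(DisclosureEvent(
            item, recipient, purpose,
            datetime.now(timezone.utc).isoformat()))

    def report(self) -> str:
        """A human-readable overview: the 'I share, you share' side of the deal."""
        return "\n".join(
            f"{e.timestamp}  shared '{e.item}' with {e.recipient} ({e.purpose})"
            for e in self.events)


ledger = DisclosureLedger()
ledger.record("blood_pressure", "gp_practice_app",
              "your doctor asked for a weekly overview")
print(ledger.report())
```

The `report` method is the balance the speaker describes: for every piece of data the app takes, it gives visibility back.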
And the same happens between a [13:41.400 --> 13:50.000] human and a machine or when you interact with an app. It's not just, well, there are many [13:50.000 --> 13:57.640] apps where you have to first page, you see, create an account and you have to tell everything [13:57.640 --> 14:02.200] about yourself. And you're looking at it, well, why should I trust this app with all [14:02.200 --> 14:07.680] this information? Maybe the app suite is show that it's useful and there's some nice interaction [14:07.720 --> 14:20.160] with you. So next thing is reciprocity of the information or what happens. I share, you [14:20.160 --> 14:26.040] share. It's not one way around. Everyone wants to know everything about you and doesn't give [14:26.040 --> 14:32.120] you anything back, doesn't show what it does with it. Maybe you say I want to donate information [14:32.120 --> 14:39.120] about myself because I can help somebody else, but it has to be made visible. It has to be [14:39.120 --> 14:49.640] too racing. So really, I share, you share. People are willing to trust if there's balance. [14:49.640 --> 14:58.520] And then the last one, and that one was really important for the designers, we tell them [14:58.520 --> 15:05.960] these principles, they must be backed with your technical solution. The first thing they [15:05.960 --> 15:12.200] said, well, identify in the system. Well, let's use the email address because it's easy [15:12.200 --> 15:17.560] and everybody has it and then you can use it everywhere. But then people lose control [15:17.560 --> 15:23.400] about where the data goes because somebody can link all the email addresses across all [15:23.480 --> 15:31.080] applications. So if you want to give people really agency, really the possibility to make [15:31.080 --> 15:38.200] choices on a technical level also, then you have to say to the developers, well, for each [15:38.200 --> 15:45.520] link between two apps, you use a unique identifier. And if you want to share a piece of information [15:45.520 --> 15:52.360] across several apps, you have to create a star and make a deliberate decision to share [15:52.440 --> 15:59.720] it and really back that up in your technical architecture. So we really could tell there [15:59.720 --> 16:03.960] are the developers, well, this is what you should do. And it was really nice because [16:03.960 --> 16:08.920] we had many meetings with the developers, different developers, they were really, wow, [16:08.920 --> 16:13.480] we don't understand this. And then we came with these principles. Oh, and then I said, [16:13.480 --> 16:18.840] oh, then can do this and this and then the old piece together for them. And then they [16:18.840 --> 16:36.440] really drafted in several days and first version of a standard. So these are the principles [16:36.440 --> 16:47.240] we found. Well, I have another poll, but let's do it manual. I have five minutes left in [16:47.240 --> 16:53.800] my talk and five minutes for questions and I want to lengthen this. I'm a bit wondering, [16:53.800 --> 17:00.840] curious, and let's just shout out, raise hands, whatever. What do you take away from [17:00.840 --> 17:07.960] this presentation? What did you learn? Because I told about what we learned, but I'm wondering [17:07.960 --> 17:15.480] what you did learn. The back and forth of information and how we should build trust. [17:15.480 --> 17:20.440] I immediately identify with that. 
So these are the principles [16:36.440 --> 16:47.240] we found. Well, I have another poll, but let's do it manually. I have five minutes left of [16:47.240 --> 16:53.800] my talk and five minutes for questions, and I want to blend those. I'm a bit curious, [16:53.800 --> 17:00.840] so let's just shout out, raise hands, whatever: what do you take away from [17:00.840 --> 17:07.960] this presentation? What did you learn? Because I told you about what we learned, but I'm wondering [17:07.960 --> 17:15.480] what you learned. The back and forth of information, and how we should build trust. [17:15.480 --> 17:20.440] I immediately identify with that. When you get an app and you have to fill in everything at first, [17:20.440 --> 17:28.200] before you know what it does or what it's going to do with it, then you're way more apprehensive [17:28.200 --> 17:35.960] about doing that. I'll repeat it a bit, because otherwise it's not on the video: so it's the interactive [17:35.960 --> 17:46.280] building of trust. I might be projecting here a little bit, but some of the struggles you [17:46.280 --> 17:52.440] mentioned remind me of when I worked in the aid sector, where many people would try to apply [17:52.440 --> 17:59.400] design principles like human-centered design to develop a solution for certain people. But it's [17:59.400 --> 18:07.160] very difficult to ignore the larger economic context. So when you talk about privacy and [18:07.160 --> 18:10.840] people not being engaged with it, maybe not politically engaged with the topic, [18:10.840 --> 18:15.400] and educating them and figuring out how to mediate that: it reminded me of working in the aid sector. [18:18.440 --> 18:23.080] So you have experience in a similar sector, and you ran into the commercial [18:23.240 --> 18:30.680] barriers of parties who didn't think this way and didn't want to implement it. [18:36.040 --> 18:42.280] It's interesting, because we were at the table with lots of commercial organizations, [18:43.480 --> 18:48.920] and they were willing to cooperate and they were willing to change something. [18:49.000 --> 18:53.480] They had some idealism there too. But it is a problem in the care sector, because [18:53.480 --> 18:57.640] it is a huge economic sector with lots of money, and lots of people try to get a bit of that [18:57.640 --> 19:03.640] money. So I feel your problem. Any other takeaways? [19:03.720 --> 19:08.360] A quite abstract question: how did you come up with user testing as a game? [19:08.360 --> 19:14.840] And was it actually a game, or do you just mean it as a kind of exchange of information? [19:15.560 --> 19:27.240] It was not a game you can win, but it might be played a bit as a party game, [19:27.560 --> 19:32.520] and that might be nice. There are lots of card games like this. [19:42.360 --> 19:48.840] It was a paper game, and we literally went to neighborhood centers [19:49.800 --> 19:55.720] and sat around the table with the people visiting there and played the game with them. [19:57.560 --> 20:05.000] And we interviewed them, even before and after, about how it influenced them. [20:07.080 --> 20:11.800] I work as a UX researcher and, sorry, I was just curious. I find it really interesting how [20:12.680 --> 20:18.680] open-source projects make decisions about which methodology to use or which tool to use. [20:19.240 --> 20:24.920] So I was just curious: how did you make the decision to use the game? Because it's a tool [20:24.920 --> 20:27.960] that I've never heard of before, and I'm very open to learning. [20:29.000 --> 20:34.440] So the question is: how did you make the choice of the game for testing? [20:35.160 --> 20:44.360] Well, we didn't make a choice. We tried lots of things, and we had many sessions in neighborhood [20:44.360 --> 20:53.480] centers that totally failed. And then we had a feeling: well, we need a way to discuss more. [20:54.440 --> 21:00.680] And then we came up with the game, and we noticed: this works. And now, looking back, we can [21:00.680 --> 21:07.240] make a nice story about it.
A linear one: well, we had the game and it worked. But I left out [21:08.840 --> 21:15.960] about a dozen failures there. I can give you more information about the game; [21:16.920 --> 21:21.720] yeah, there's a paper on it, and the game is available too. [21:23.800 --> 21:31.080] Thank you. I never considered including privacy in the design process exactly like this. [21:32.840 --> 21:39.160] My question would be: how do you include the GDPR inside the process? Because it's part of [21:39.160 --> 21:45.560] privacy that you get the protection of the law; saying to the end users, [21:46.040 --> 21:51.400] this is the law and this is how it can help you and protect you, leaves room for trust. [21:52.520 --> 21:59.320] So the question is: how do you include the GDPR up front in the design? [22:00.280 --> 22:10.760] Well, I laughed. I was once introduced at a conference as a person who laughs about the GDPR. [22:11.400 --> 22:20.360] The point is, when you work like this... well, let's move on, we are now here. When you work like this, [22:22.280 --> 22:32.200] you have a far better way of handling the privacy of your users than the GDPR requires. For example, [22:32.200 --> 22:38.440] part of the pattern, the iterative pattern, is that you only ask for information when needed, [22:39.160 --> 22:43.320] and then you make clear why you need it. But that is already [22:45.000 --> 22:51.960] awfully close to the official privacy question you may have to ask according to the GDPR. [22:52.680 --> 22:57.320] Maybe you add a link to some additional information with all the legal stuff in it, [22:58.280 --> 23:03.800] but then you are already there: you have asked a meaningful consent question. [23:04.760 --> 23:09.000] And that's just the logical thing to do when you're designing this way. [23:09.560 --> 23:15.320] So just design for privacy, and then the GDPR should be covered already. [23:22.200 --> 23:24.520] Yeah, good to know you're on the dark side. [23:25.480 --> 23:33.800] Well, can you give a clear example of how you do reciprocity in information exchange? [23:35.800 --> 23:47.320] Yeah, for example, there are two levels. Between humans, you can deliberately [23:47.320 --> 23:53.400] support it. We also already had some design patterns, but those were UI design patterns [23:53.400 --> 24:00.600] that couldn't be forced on all apps. But you can really support [24:02.840 --> 24:09.160] two-way communication before telling somebody something. And you can really support: [24:12.120 --> 24:19.080] well, I ask you this, because when I know it, I can do this or that for you. [24:20.040 --> 24:28.360] So you can support that in your design of the chat, the interaction, the messages, patterns like that. [24:28.360 --> 24:32.760] And you can also do that in human-to-machine interaction: [24:34.760 --> 24:44.120] the app asking, well, if you like, then we can arrange that when you come back [24:44.920 --> 24:50.840] I remember what you did, but to do so, I need an account. [24:56.120 --> 25:05.640] Exactly. Exactly. And don't ask before it is already clear what you can give back for it. [25:07.080 --> 25:13.880] Start, for example, with anonymous accounts. Then people can see what the app is doing. [25:14.840 --> 25:18.920] And at a certain point, you can say: maybe it's interesting if I remember this. [25:19.640 --> 25:29.160] Shall we make an account? So don't ask for the account up front. And then for the account, [25:29.160 --> 25:32.360] ask only what's needed for the account. Don't ask anything more.
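A minimal sketch of that "ask only when needed, and say why" pattern (an assumed flow with made-up names, not the project's code):

```python
# Hypothetical sketch of progressive data collection; not code from the
# project. An empty data dict simply means an anonymous session.
from dataclasses import dataclass, field


@dataclass
class Session:
    data: dict[str, str] = field(default_factory=dict)  # empty = anonymous


def request_when_needed(session: Session, item: str, why: str) -> str:
    """Ask for a piece of data only at the moment it is needed,
    and always name the reason along with the question."""
    if item not in session.data:
        session.data[item] = input(f"We'd like your {item}, because {why}: ")
    return session.data[item]


# Nothing is collected up front: the session starts anonymous, and the
# question arrives only when the app can name what the user gets back.
session = Session()
name = request_when_needed(session, "first name",
                           "we want to greet you when you come back")
print(f"Hello, {name}!")
```

Starting from an empty, anonymous session, each question carries its reason, which is what brings it close to a meaningful consent question.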
[25:36.840 --> 25:37.320] Yeah? [25:37.320 --> 25:40.280] One final question, we're just coming close to time. [25:41.240 --> 25:42.360] What kind of failures? [25:47.160 --> 25:48.680] One final question, you said. [25:54.200 --> 26:03.640] We had, for example, tried to make a taxonomy and architecture that would [26:04.600 --> 26:13.880] support all fine-grained types of privacy patterns. And that became a huge picture with lots of [26:13.880 --> 26:20.760] drawings and lines, and a data model that was really exploding on all sides. And [26:22.200 --> 26:28.520] nobody understood it anymore. And it was bonkers to even think about implementing it. [26:29.480 --> 26:40.280] And we spent some days drawing all the boxes. No, that's not the way to go. To give one example. [26:43.160 --> 26:50.440] So the answer was: constant feedback, and expressing the whys and not just the whats. [26:50.520 --> 26:52.280] Exactly. Yeah. [26:54.280 --> 27:06.360] Yeah, I think we're just over time now. So thank you so much for your time.