Thanks, everybody, for coming.

So yes, I am not from the corporation, I'm from the Mozilla Foundation. Besides being the non-profit behind the Firefox browser, we do a lot of other cool stuff, and for those of you not familiar with it, I want to go over that briefly. The Mozilla Foundation, like the corporation, is mission-driven and believes that the internet should be open, accessible, free, and fair. On the Foundation side, we do a lot of interesting research, much of it into social media, and much of it related to the accountability and transparency of social media and the systemic problems we see on a lot of the web. Some of you might even have participated in some of our research: through RegretsReporter, for instance, data donations from lots of people allowed us to investigate YouTube's recommendation algorithm from the outside, because YouTube is not exactly handing over that information right now.

We also do policy work around legislative files that either have great potential for our mission or might affect it in certain ways; you might have followed some of that work over the years. We do advocacy and campaigning. And every year we publish an Internet Health Report. This year's Internet Health Report is really worth checking out, because it's actually in the form of a podcast, so you can find it and listen to the episodes. MozFest is also taking place online, in hubs, in March, and then physically in a couple of different locations in June. I would definitely encourage you to attend some of those sessions.

Okay, but I'm going to talk to you about something very specific today. I'm curious how many people here have actually heard of the Digital Services Act. Just raise your hand if you've heard of it. Okay, great. It's a big deal, I think it's a big deal. I'm going to give an overview of the Digital Services Act, and particularly the obligations it places on a certain kind of actor. This is an ongoing conversation, so I'm really curious about your thoughts as well. I'm going to talk about why the EU decided to make a Digital Services Act, what is roughly in it, and what the next steps are: where are we in the process?
Because while the Digital Services Act is agreed and in force, sorry, it is not actually being applied to services yet. We have to wait until the end of this year, and for some of it until next year.

So first of all, why create a Digital Services Act? The DSA is really designed to update the e-commerce rules, which are now two decades old. It carries that framework over for the internet stack, and then it builds on top of it, looking at the way the web has changed. As we know, in the last 20 years a handful of large companies have essentially taken over the web. At the same time, many countries are individually trying to address social media and web policy, and the idea is that the EU can do this in a harmonized way: rather than having one law in one country and a different law in another, we make a harmonized playing field, which is easier for providers as well as for end users.

The DSA should also really be seen in conversation with the Digital Markets Act, which is probably interesting to many of you here. These really are two sides of a coin, in a sense; they are both different ways of addressing a problem of fairness. The Digital Markets Act intervenes by saying: there are certain gatekeepers that have used their market power in an unfair way, and we need to update our competition law in certain ways. That is fairness at a macro level, fairness between different services. And then you have the Digital Services Act, which is fairness in a micro sense: fairness between a service provider and the end user. Does that service deal with the end user in a legible way? Is it transparent? Is it accountable for the service it is ultimately giving them? It tries to raise the bar, and to set a high bar, when it comes to fairness towards end users. So: two sides of the coin.

Yes, so what is the Digital Services Act? What does it do? Well, first of all, who does it apply to? As I mentioned, it has really taken the e-commerce directive, so for the internet stack it essentially moves that across, and it sorts the services it applies to into four different categories. The most interesting, the most important obligations fall on a new category that it creates: very large online platforms and very large online search engines, of which there is one, maybe two.
There are also online platforms themselves; the idea of an online platform, as distinct from a hosting service, is that it disseminates user-generated content publicly. But I'm really going to focus on this middle blue block.

So the DSA comes up with this new term, a VLOP. Some people say "vee-lop", I like to say "vlop", it's really up to you. The idea is that a VLOP is a platform serving over 10% of the EU population, that it therefore has a systemic role in our information ecosystem, and that it deserves some corresponding responsibilities. That translates to 45 million monthly active users. You might ask: what is a monthly active user? It is a good question, and platforms are in the process of counting this up right now. Of course, there are different ways this can be counted; the Commission has provided some guidelines, but there is still some ambiguity. Also, very large online search engines don't fit so beautifully into the DSA structure. That category was added by the Council, but the idea is that search engines reaching the same threshold should be accountable in the same kinds of ways.

So the DSA takes an asymmetric structure: the larger you are, the more obligations you have. And there are key exemptions for small and micro enterprises, which matters to, say, a small fediverse actor: there are exemptions if you have only a certain number of employees or only a certain revenue. But for the others, there are real obligations, and there is the possibility of fines. The DSA is a thing to be taken seriously. This gives an outline of the different obligations, which I can go over more specifically for the very largest online platforms.

The DSA does not say what is illegal; what is illegal is defined by EU member states and by EU law. The DSA really tries to take a systemic approach. It is not trying to play whack-a-mole, taking down individual pieces of illegal content, because we've been in this space for a while and we know it doesn't really work that way. I used to work on disinformation. It's not about a single piece of content; it's about the way the service provides narratives and amplifies certain things. It's really structural.
And so the DSA really tries to take that systemic approach, and asks each very large online platform to go and say: what are your systemic risks? Do they relate to a certain kind of online violence that your service might be perpetuating? Do they relate to privacy? The platform then has to address that and actually propose its own ways of mitigating those systemic risks.

The DSA also has some very clear ideas about how platforms should address their systemic risks. For instance, they have to have advertising libraries, where researchers, actually anyone, can go through the archive of what has been published as promotional content. And platforms over the threshold have to offer a recommender system that is not based on user profiling, under the definition of the GDPR. Experts in the GDPR might say you could still have a recommender system that uses some kind of engagement data, but the idea is that you have to offer a really viable alternative.
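[Editor's note: to make that obligation concrete, here is a minimal, hypothetical sketch of what offering a non-profiling alternative could look like. This is not the DSA's or any platform's actual mechanism; the function names, the `Post` fields, and the choice of a reverse-chronological fallback are all illustrative assumptions.]

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    post_id: str
    author: str
    created_at: datetime
    engagement_score: float  # aggregate signal, not tied to the viewing user

def rank_profiled(posts: list[Post], user_interest: dict[str, float]) -> list[Post]:
    """Personalized ranking: consults a per-user interest profile (GDPR 'profiling')."""
    return sorted(
        posts,
        key=lambda p: user_interest.get(p.author, 0.0) * p.engagement_score,
        reverse=True,
    )

def rank_non_profiled(posts: list[Post]) -> list[Post]:
    """One possible non-profiling alternative: reverse-chronological order.

    No per-user data is consulted at all, which is the simplest way to stay
    clearly outside the GDPR's definition of profiling.
    """
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def feed(posts: list[Post], user_interest: dict[str, float],
         profiling_opt_out: bool) -> list[Post]:
    # The DSA requires a genuinely available alternative; modeled here
    # as a single, persistent user setting.
    if profiling_opt_out:
        return rank_non_profiled(posts)
    return rank_profiled(posts, user_interest)
```

[Whether a ranking that uses aggregate engagement data, but no per-user profile, would also qualify is exactly the interpretive question the speaker says GDPR experts will argue about.]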
There are also rules about online interfaces and design: the idea that you shouldn't have deceptive patterns that trick you into making decisions online that you wouldn't normally take.

Then there is data access for vetted researchers, which is critically important. We know this at the Mozilla Foundation, having researched very large online platforms ourselves: it is very difficult to know sometimes what's going on inside a platform. The idea is that eventually there will be infrastructure for the largest platforms to share certain data sets related to their systemic risks with vetted researchers. It will be a very long process to build up this infrastructure and make sure it has really clear safeguards. But it is really important, and for those of you who read the news that Twitter is going to get rid of its free API, this is a really critical problem: how are we going to continue to hold platforms to account if we don't understand what's really happening inside them? There are many caveats on this data access regime: it really is about systemic risks, it really is about compliance, and it is for researchers. But this is a really, really important thing that is going to change over the next couple of years.

Then there are codes of conduct, like the Code of Practice on Disinformation, which is now tied into the DSA. There will also be codes of conduct related to accessibility and other key areas, where platforms and companies can come together and decide for themselves, to an extent, what important measures they can take, backed up by the rest of the regulatory framework. And then crisis response, which we can maybe talk about in the Q&A.

So what are the next steps? Where are we in this legislative process, or rather, now, the implementation process? We have the regulation, but a lot of things are yet to be determined. The draft was released in 2020, the institutions spent a couple of years working on it, and now we are coming to a very important moment: the DSA is in force, but who does it apply to? On February 17th, so mark your calendars, platforms that are likely very large online platforms or very large online search engines are supposed to notify the European Commission. But something kind of funny happened here: it is not written in the DSA exactly how the Commission will find out which platforms are in scope. Platforms are told to publish this user number somewhere on their interface, so the Commission might have to go and check a lot of platforms' interfaces. Platforms have also been encouraged to actually email the Commission; there is an email address they can use. But if some of you have some free time in the next couple of weeks and want to find out whether platforms are publishing their numbers on their interfaces, please do. That would be great.

So yes, we'll see what happens on February 17th. For the largest online platforms, the DSA will actually apply midway through this year, in July. For the rest of the stack, and for the smaller online platforms, it won't apply until 2024. So we have a bit of time. And that time is there because a lot of secondary legislation, codes of conduct, and voluntary standards, a lot of the real details of the DSA, are yet to be specified and spelled out. That process is what's happening now. That's why it's great that you're in this room; maybe you're interested in some of these details, which in some cases are really not details at all. They're very, very important.
One of these things is the designation of the very large online platforms. Another is the data access regime: there will be a delegated act that spells out what the data access regime actually entails. Then auditing requirements, which I may have skipped: the very large online platforms and search engines have to submit to an annual audit. They pay for this audit, but they have to submit to it, and it covers all of their due diligence obligations, a massive kind of audit. The details of what this auditing regime will look like are being elaborated now in a delegated act, so that is really important. We'll have a draft of that delegated act that you can look at and give feedback on; it should appear in the next couple of weeks, towards the end of the month. And then guidelines and various voluntary standards will come out on a lot of different elements of the DSA.

Right now, regulators are getting prepared; they have a massive amount of work to do. For the first time, the European Commission is actually going to become a regulator. This was not the case with the GDPR, right? There it was really the member states doing the regulating. Under the DSA, for the largest online platforms, the Commission will be overseeing them. Everyone else will be overseen by regulators in their member states, so all EU countries are in the process of designating something called a Digital Services Coordinator, which will be responsible for overseeing compliance with the regulation in that member state. A lot of very large online platforms are based in Ireland, which means the eventual Irish Digital Services Coordinator will have quite an important role; probably not as outsized as Ireland's role on data protection issues under the GDPR, but still a very important one.

The European Commission has also recently opened an entirely new institution, the European Centre for Algorithmic Transparency, ECAT. It has offices in Brussels, in Italy, and in Spain, and it will serve as additional technical support to the Commission, helping it understand some of the research challenges in front of it.

And that's where I'm going to close. As usual, I talk fast, so I have a little bit of time, but that's okay. If you have questions: I'm on Mastodon, I'm also on Twitter, but this feels like a Mastodon crowd, so, yeah.

The hand went up really fast, yeah.
Yeah, right there. Thank you very much. We also have a question in Matrix: for the DSA, where does a platform have to publish or communicate its monthly active users?

Yeah, that's a good question, right? They're supposed to put it on their interface; I don't really know exactly what that means. But they're supposed to put it on their interface, and they should also email the European Commission. So if that question is being asked by a very large online platform: you should email the Commission.

Hello, thank you for the presentation. I was waiting to hear more about the law itself. Recently there was a judgment against Facebook, and in the US there has been a lawsuit between LinkedIn and a small company about web scraping. Web scraping right now is the way for us, for open source developers, to access the data on these platforms, because the platforms have a monopoly on it. Yet the regulation could be read harshly: it could be interpreted as forbidding data scraping, which is the position these companies take. I wanted to know whether this new law pushes obligations onto these companies, these VLOPs, to have public APIs, or at least not to block applications that want to make the content free. I'm not sure if that was clear, because I'm not a native English speaker.

No, that was really clear. So, questions about web scraping, and how the DSA will block or help researchers studying the large online platforms. I can't respond on web scraping specifically; it's a really important question. But as to the DSA's ability to encourage and facilitate research, especially through APIs: for the largest online platforms, again, it will require them to share data. It doesn't say exactly how they will do this; it talks about interfaces, and this again is to be elaborated through the delegated act. So your question is really going to be answered over the next year and a half or so. But the platforms specifically will have to share data and allow research. It won't necessarily be with everyone: there are clarifications about what a vetted researcher is. But we already have, in the text, at least in the article on data access, the idea of tiers. For one tier of data, which is quote-unquote "manifestly made public", so if we think about Facebook, something like a public page, that data should be available through an actual interface. It's essentially trying to mandate a kind of CrowdTangle. That isn't everything we necessarily want to see, and it still has to be elaborated itself, but the DSA is trying to encourage this kind of research.
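[Editor's note: purely as an illustration, a CrowdTangle-style interface for "manifestly made public" content might be consumed roughly like the sketch below. The endpoint URL, parameters, and response fields are invented for this example; the actual interface is still to be defined by the delegated act and by each platform's implementation.]

```python
import requests

# Hypothetical endpoint and schema; the real data access interface
# is yet to be specified under the DSA's delegated act.
BASE_URL = "https://platform.example/dsa/public-content/v1"

def fetch_public_posts(query: str, page_token: str | None = None) -> dict:
    """Fetch one page of 'manifestly made public' posts matching a query."""
    params = {"q": query, "page_token": page_token}
    resp = requests.get(f"{BASE_URL}/search", params=params, timeout=30)
    resp.raise_for_status()
    # Assumed response shape: {"posts": [...], "next_page_token": "..."}
    return resp.json()

def iterate_all(query: str):
    """Walk the paginated results, yielding one post dict at a time."""
    token = None
    while True:
        page = fetch_public_posts(query, token)
        yield from page["posts"]
        token = page.get("next_page_token")
        if not token:
            break
```

[Deeper tiers of data, anything not manifestly public, would sit behind the vetted-researcher process rather than an open endpoint like this.]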
Speaking of Mastodon: how does this definition of a very large online platform work when it comes to federated services? If Mastodon grows to, I don't know, hundreds of millions of users, and all the European instances together hit this 10% threshold, but none of them individually does, is there any provision for this in the regulation?

Yeah. So, this should not be taken as legal advice for people who run Mastodon servers; I can only tell you my understanding. What I can say is that the DSA was really not designed to penalize the Fediverse. If anything, the DSA is trying to encourage the Fediverse; it's trying to get at platforms that are largely profit-driven, right? But according to my understanding, in the Fediverse each individual server would be treated as a platform. So if one individual server eventually reached the threshold of 45 million monthly active users, which I think we're very, very far from, then there might be some obligations; though at that point, if you're running a server that big, you have a lot of other problems, I think. Otherwise, yeah, I think we're really going to be looking at small and micro platforms there.
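[Editor's note: a tiny sketch of the speaker's reading, which, as they say, is explicitly not legal advice: the 45 million threshold would be evaluated per individual service, not summed across federated instances. The instance names and user counts below are made up.]

```python
VLOP_THRESHOLD = 45_000_000  # DSA threshold for very large online platforms

# Hypothetical EU monthly active user counts, per fediverse instance.
instances = {
    "mastodon.example": 1_200_000,
    "social.example": 300_000,
    "fosstodon.example": 60_000,
}

# On the speaker's reading, each server is assessed on its own...
vlops = [name for name, mau in instances.items() if mau >= VLOP_THRESHOLD]
print(vlops)  # [] -- no single instance crosses the threshold

# ...and the network's aggregate total is not what gets tested.
print(sum(instances.values()))  # 1,560,000 in this made-up example
```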
As far as I understood, the DSA applies to organizations and companies, right? Are you aware of any initiative to do something similar for the profession of software engineering? Because you could argue that the people who build platforms also share responsibility for the safety of those platforms.

I really like that question. So the question is: the DSA is largely targeting companies, the platform operators, but what about applying it to software engineers as such? I mean, it's challenging, and I'm really speaking in my personal capacity here, but holding an individual responsible for their company, or for the thing their company provides, is difficult. That said, there are some really interesting articles in the DSA where it will be interesting to see how companies and how individuals interpret them, like the one on interface design: I think it is really up to designers to do a great and transparent job on that. So no, I don't think so; it's difficult to have regulation apply to employees, right, that would be tricky. But it's an interesting question.

Any other questions?

So in the end, what it comes down to are the details, right? How do the Mozilla Foundation, and people in general, influence these details that are not yet settled? And what about, in the end, court decisions?

Sorry, I didn't hear the second part. What were the examples you mentioned at the end? ... Right, so the question is: what about the details, which are really yet to come, including court decisions? Yeah, I think a lot of civil society organizations are going to engage, probably in strategic litigation; I think that will be a huge component of the next couple of years. As for the Mozilla Foundation, we're really interested in the implementation phase. The Commission has very much asked for help on certain things: they're consulting with stakeholders, there are public processes, and there are the delegated acts, where they publish drafts and want feedback. So we're really following those opportunities, and I think a lot of civil society actors are as well. And then these codes of conduct, I think, are really interesting, and the voluntary standards, where organizations and civil society actors can come together and add some of the details. The research community has a huge role to play here, I think. So yes, definitely Mozilla, and definitely a lot of other actors, are engaging, and there are so many opportunities to do that. If you're interested in them, you can send me an email, actually.

Is there any interaction between the GDPR and the DSA?

There is, yeah. I mean, the GDPR always applies; it's sort of a bedrock for a lot of these things. But there are a couple of really interesting articles where they overlap.
So for instance, sorry, the question was about the relationship between the GDPR and the DSA, and I'm not going to give you a holistic answer on that at all, you should ask a specialist, but I can tell you about the provisions I'm really interested in where they overlap. One of them is the requirement for a very large online platform to offer a recommender system not based on profiling. This is typically an area of overlap, because profiling is defined under the GDPR. So it will be really important to see how this is actually understood, how platforms implement it, and then how regulators see their responsibility to ensure it's enforced. That's one area, and there are a few other articles with a similar kind of overlap.

We have a question in Matrix: do you fear any backfiring from companies, the way the GDPR created the pop-up hell of consent banners?

Okay, the question is about backfiring. Well, technically it wasn't the GDPR, it was the ePrivacy Directive; it's the combination of the two that's responsible for cookie banners. But it is a really good question. And this is where I think it's really important to be clear about the difference between compliance with the letter of the regulation and meaningful compliance for the actual end user, whose experience you want to improve. It's possible that companies might take the regulation and, in designing for their new obligations, make something really annoying for users, to discourage them from using it. Take the example of the alternative recommender system again: you could hide it behind eight clicks, you could make it switch back the next morning, you could make it somehow really boring. I don't know, but there are ways you could make it unattractive and still comply. So I think it's important that companies decide to comply in a way that's also appealing and fair, because I don't think cookie banners really comply with GDPR privacy in a way that is fair.

We are a bit out of time. Okay, just ask a quick question.
So, related to that, especially in the context of dark patterns: after the GDPR there was an enormous number of dark patterns, because companies were trying to make it impossible for the user to know what they were consenting to. And it was clearly not easy to regulate that, since everyone was doing it. So how will the DSA be different? How can the DSA regulate this in a way that companies don't just play around it?

Yeah, the question is about dark patterns, or deceptive design, especially after the GDPR. So there is a specific article in the DSA, Article 25, that looks at online interfaces. It doesn't cover every kind of pattern, but it looks at online interface design, and it obliges platforms to design their interfaces in a way that is not deceptive and doesn't lead the user down a path they didn't intend. So the authors of the DSA are very much trying, through this one article, to keep that from happening again. However, the article isn't as strong as it could be, and a lot of civil society advocates who were pushing for it to be stronger didn't get everything they wanted. That said, I see it as really, really important, and I think if platforms look at this article and fully comply, we shouldn't have some of these dark pattern issues. But this is one of those details that remains to be seen; there is going to be a lot of pressure on this one provision, and I hope it can withstand that pressure.

Thank you very much. We don't have time for more questions, but feel free to pop them in the Matrix room. Thank you very much.

Thank you.