[00:00.000 --> 00:07.000] So, my name is Steve Ball. [00:07.000 --> 00:09.000] My name is Mikaël Barbero. [00:09.000 --> 00:11.000] I am from Sonatype. [00:11.000 --> 00:14.000] This gentleman is from the Eclipse Foundation. [00:14.000 --> 00:18.000] I had a much longer title for this, but I'm not going to use it. [00:18.000 --> 00:20.000] So, this is a short one. [00:20.000 --> 00:23.000] We're going to talk about security. [00:23.000 --> 00:25.000] I need to speak up. [00:25.000 --> 00:26.000] I need to speak up. [00:26.000 --> 00:27.000] Wow. [00:27.000 --> 00:28.000] Okay. [00:28.000 --> 00:29.000] Can you hear me in the back? [00:29.000 --> 00:30.000] No. [00:30.000 --> 00:31.000] Good. [00:31.000 --> 00:32.000] Can you hear me now? [00:32.000 --> 00:33.000] No. [00:33.000 --> 00:34.000] Okay. [00:34.000 --> 00:36.000] So, we're going to talk about security. [00:36.000 --> 00:37.000] Very quickly. [00:37.000 --> 00:39.000] I want to scare you a little. [00:39.000 --> 00:42.000] I want to tell you a little about what's happening. [00:42.000 --> 00:46.000] And then we're going to tell you about some concrete actions. [00:46.000 --> 00:51.000] Mikaël can tell you some concrete actions about what's happening in the Eclipse Foundation. [00:51.000 --> 00:56.000] Pay attention, because this stuff is coming to you. [00:56.000 --> 01:03.000] So, the first thing is that the way you think about security has got to change. [01:03.000 --> 01:08.000] When you think about security, you probably don't even think about what it means. [01:08.000 --> 01:13.000] You think about authentication, encryption and things like that. [01:13.000 --> 01:15.000] We had a couple of talks about dependencies. [01:15.000 --> 01:17.000] That's beginning to percolate. [01:17.000 --> 01:20.000] You've heard about SBOMs and things like that. [01:20.000 --> 01:24.000] What you've got to understand is that the world that we started with, [01:24.000 --> 01:29.000] with Java 25 years ago, 30 years ago, has changed.
[01:29.000 --> 01:35.000] And it's changed dramatically. [01:35.000 --> 01:40.000] From now on, and probably for the last two or three years, [01:40.000 --> 01:44.000] but now it's become a big thing, [01:44.000 --> 01:47.000] we have a new problem. [01:47.000 --> 01:55.000] And the problem is cybercrime, which has grown beyond all expectation. [01:55.000 --> 02:03.000] Cybercrime brings in $7 trillion. [02:03.000 --> 02:07.000] If cybercrime was a country, [02:07.000 --> 02:14.000] cybercrime would be the third biggest economy. [02:14.000 --> 02:17.000] There's so much money coming in. [02:17.000 --> 02:19.000] But that's not the worst of it. [02:19.000 --> 02:25.000] The worst of it is that all the techniques that the bad guys have used to steal money [02:25.000 --> 02:31.000] are now being weaponized, because it's become apparent [02:31.000 --> 02:35.000] that if you can use these techniques to steal money, [02:35.000 --> 02:40.000] you can use these techniques to influence, to penetrate. [02:40.000 --> 02:43.000] So, can I get into your banks? [02:43.000 --> 02:46.000] Can I get into your chemical manufacturing? [02:46.000 --> 02:50.000] Can I get into your delivery systems? [02:50.000 --> 02:52.000] That's what they're going to do. [02:52.000 --> 02:54.000] That's what they're doing now. [02:54.000 --> 03:00.000] Because if they can get into these systems, they can manipulate them. [03:00.000 --> 03:02.000] They can turn them off. [03:02.000 --> 03:08.000] You've probably heard of one or two of these things happening with the war in Ukraine. [03:08.000 --> 03:10.000] You may have seen these coming through. [03:10.000 --> 03:12.000] But that's just a little bit. [03:12.000 --> 03:15.000] It's happening all the time now. [03:15.000 --> 03:24.000] The new reality is that cybercrime is being used as a weapon to influence your economy. [03:24.000 --> 03:33.000] It's trying to get into your supply chain to do quiet, damaging things.
[03:33.000 --> 03:35.000] So fake news is one of them. [03:35.000 --> 03:37.000] You've seen that. [03:37.000 --> 03:45.000] There are little things like getting into delivery systems and changing addresses [03:45.000 --> 03:48.000] so that things don't go quite right. [03:48.000 --> 03:52.000] Breaking systems, shutting down traffic lights, [03:52.000 --> 03:57.000] or disrupting the delivery systems for a supermarket. [03:57.000 --> 04:04.000] All these little tiny things influence your economy. [04:04.000 --> 04:09.000] So basically every country in the world is beginning to understand this, [04:09.000 --> 04:14.000] and every country in the world, every disaffected group, [04:14.000 --> 04:23.000] is looking at cyber technologies as a way to get into our systems. [04:23.000 --> 04:25.000] This is it. [04:25.000 --> 04:26.000] I cannot stress this enough. [04:26.000 --> 04:30.000] You will hear more and more of this as time goes on, and it is going to affect us all. [04:30.000 --> 04:34.000] Not just in this room, but everybody who's at FOSDEM. [04:34.000 --> 04:38.000] We're all open source people. [04:38.000 --> 04:43.000] So the governments are looking at what's going on and seeing the value, [04:43.000 --> 04:48.000] but they also begin to understand the opposite. [04:48.000 --> 04:53.000] Because if you can attack somebody else's supply chain, somebody else's economy, [04:53.000 --> 04:59.000] you can be a victim too. [04:59.000 --> 05:04.000] Log4j was our wake-up call. [05:04.000 --> 05:08.000] It was the one where everybody went: [05:08.000 --> 05:13.000] there is this vulnerability that impacts everybody. [05:13.000 --> 05:17.000] Everyone in the world is running a business system that's got Log4j in it. [05:17.000 --> 05:20.000] One way or the other, they were impacted. [05:20.000 --> 05:22.000] And it became a government thing.
[05:22.000 --> 05:24.000] You saw it, we all joked about it, [05:24.000 --> 05:31.000] but it became a demonstration of how dangerous these things could be. [05:31.000 --> 05:34.000] And it's still going on. [05:34.000 --> 05:40.000] How many people in this room had to fix a Log4j problem when it came out? [05:40.000 --> 05:42.000] Not good, was it? [05:42.000 --> 05:46.000] Tip of the iceberg, more coming. [05:46.000 --> 05:49.000] And we're not really good at this either. [05:49.000 --> 05:50.000] So I work for Sonatype. [05:50.000 --> 05:52.000] We run Maven Central. [05:53.000 --> 05:54.000] We see what happens. [05:54.000 --> 05:56.000] We see all the downloads. [05:56.000 --> 05:59.000] And even now, this is live, [05:59.000 --> 06:02.000] you can go and see this for yourself, [06:02.000 --> 06:14.000] 28% of the downloads for Log4j are still vulnerable versions. [06:14.000 --> 06:19.000] One third-ish of what people are downloading. [06:19.000 --> 06:21.000] They aren't safe versions. [06:21.000 --> 06:23.000] They're bad versions. [06:23.000 --> 06:28.000] Still, a year later, more than a year later. [06:28.000 --> 06:30.000] And this is just one example. [06:30.000 --> 06:31.000] This happens all the time. [06:31.000 --> 06:38.000] Because we don't have the tools or the knowledge or the awareness. [06:38.000 --> 06:41.000] Okay? [06:41.000 --> 06:42.000] A question at the back: [06:42.000 --> 06:44.000] why don't you block those? [06:44.000 --> 06:45.000] Why don't we block them? [06:45.000 --> 06:47.000] Have you got 20 minutes? [06:47.000 --> 06:51.000] So the simple answer to the reason that we don't block them [06:51.000 --> 06:55.000] is that it's possible that somebody has got a workaround for it [06:55.000 --> 06:57.000] and is protected. [06:57.000 --> 07:00.000] So we'd break them if we did that. [07:00.000 --> 07:05.000] The only time we take things off Maven Central is if it's got malware.
[07:05.000 --> 07:06.000] And we've done that once or twice. [07:06.000 --> 07:08.000] You wouldn't believe it, but Java gets malware too. [07:08.000 --> 07:14.000] But other than that, you may actually not be affected. [07:14.000 --> 07:16.000] So you may be lucky. [07:16.000 --> 07:21.000] I bet the majority of that 28% just have no idea they're doing this. [07:21.000 --> 07:24.000] We know from talking to many large customers [07:24.000 --> 07:26.000] that they don't even know they're doing this. [07:26.000 --> 07:30.000] Couldn't they be invalidated if they aren't fixed by a certain date [07:30.000 --> 07:32.000] and then they're removed from... [07:32.000 --> 07:35.000] We could do that if people want us to do that. [07:35.000 --> 07:37.000] It hasn't been socialized, but it's possible. [07:37.000 --> 07:39.000] I'll give you some more... [07:39.000 --> 07:42.000] Okay, so, scary. [07:42.000 --> 07:45.000] Here's where it gets even more scary. [07:45.000 --> 07:49.000] Because for the last two years, [07:49.000 --> 07:50.000] governments have been going: [07:50.000 --> 07:52.000] we should do something about this problem. [07:52.000 --> 07:55.000] And they've been waking up. [07:55.000 --> 07:58.000] We were in Washington at the end of last year [07:58.000 --> 08:03.000] at one of many conversations where governments [08:03.000 --> 08:05.000] and bigger organizations [08:05.000 --> 08:12.000] were all looking at what rules to apply to fix this problem. [08:12.000 --> 08:17.000] And the problem is that they think the problem is us, [08:17.000 --> 08:20.000] is open source. [08:20.000 --> 08:25.000] Because they keep looking at where the problem is, [08:25.000 --> 08:27.000] and the problem is in all this open source. [08:27.000 --> 08:31.000] We have Java, Node, Rust, Go, you name it. [08:31.000 --> 08:34.000] And it's not just the tech. [08:34.000 --> 08:36.000] So you've heard about some tools already.
[08:36.000 --> 08:39.000] You're going to hear about some more tools. [08:39.000 --> 08:42.000] It's not just about the tools. [08:42.000 --> 08:46.000] They are concerned about our behavior. [08:46.000 --> 08:50.000] That is even more frustrating and worrying to these people [08:50.000 --> 08:54.000] than the tooling. [08:54.000 --> 08:58.000] Now, the good news is that we were in Washington, [08:58.000 --> 09:00.000] and, I don't know, this has been going on for, [09:00.000 --> 09:01.000] as I said, two years. [09:01.000 --> 09:03.000] There's lots of policy conversations. [09:03.000 --> 09:06.000] The Linux Foundation, the OpenSSF, [09:06.000 --> 09:09.000] lots of people are getting together to work out [09:09.000 --> 09:16.000] what the right answer is. [09:16.000 --> 09:22.000] But whatever the community decides, this is happening. [09:22.000 --> 09:24.000] So there's two books here. [09:24.000 --> 09:29.000] So the US put out an executive order on improving the nation's cybersecurity, [09:29.000 --> 09:32.000] for the defense of the nation, or something like that. [09:32.000 --> 09:34.000] That's the US one. [09:34.000 --> 09:37.000] May 2021. [09:37.000 --> 09:39.000] And recently, as you can see, the open source foundations [09:39.000 --> 09:42.000] are not happy with the European one. [09:42.000 --> 09:45.000] Because the European one is about making all of you [09:45.000 --> 09:48.000] open source contributors and project suppliers [09:48.000 --> 09:53.000] and making you comply with a bunch of rules. [09:53.000 --> 09:57.000] And the rules are pretty stiff. [09:57.000 --> 10:00.000] So how many people here have their own little [10:00.000 --> 10:03.000] open source project they share? [10:03.000 --> 10:04.000] OK. [10:04.000 --> 10:07.000] So now, all of you are going to be suppliers. [10:07.000 --> 10:08.000] And so it's going to be: [10:08.000 --> 10:10.000] all of you will have to provide SBOMs. [10:10.000 --> 10:11.000] You've heard about that.
[10:11.000 --> 10:15.000] All of you will have to have automated processes. [10:15.000 --> 10:19.000] You'll all have to have evidence of software integrity. [10:19.000 --> 10:22.000] You'll have to have audit processes, [10:22.000 --> 10:24.000] vulnerability disclosure processes. [10:24.000 --> 10:27.000] This is coming our way. [10:27.000 --> 10:30.000] And so we have to get our act together to make sure [10:30.000 --> 10:33.000] that the way that we resolve this isn't as individuals, [10:33.000 --> 10:35.000] but as a community. [10:35.000 --> 10:37.000] Because you can see there's lots of things [10:37.000 --> 10:38.000] that people want to do. [10:38.000 --> 10:39.000] It's governments. [10:39.000 --> 10:40.000] They want to manage this. [10:40.000 --> 10:42.000] They want to look at what we do. [10:42.000 --> 10:47.000] They want to put processes in the way. [10:47.000 --> 10:52.000] So, as a plug for Maven Central, since we're plugging tools: [10:52.000 --> 10:54.000] you've got Maven Central, [10:54.000 --> 10:56.000] and you can download stuff. [10:56.000 --> 10:59.000] We're adding more pieces in conjunction with people [10:59.000 --> 11:01.000] like the OpenSSF and others. [11:01.000 --> 11:06.000] We're working out: can we help assess the behavior [11:06.000 --> 11:08.000] of an open source project? [11:08.000 --> 11:12.000] And that means: how good are the contributors, [11:12.000 --> 11:15.000] how good are the committers at reviewing code? [11:15.000 --> 11:17.000] What's the release pattern like? [11:17.000 --> 11:20.000] Can we spot unusual behavior? [11:20.000 --> 11:23.000] Because what the bad guys are trying to do [11:24.000 --> 11:27.000] is subvert your projects, your behavior, [11:27.000 --> 11:30.000] and get in there and get your software [11:30.000 --> 11:32.000] to deliver malware or to have bad effects. [11:32.000 --> 11:35.000] So we're looking at ways of trying to codify this.
[11:35.000 --> 11:38.000] We've got some ideas, other people have other ideas, [11:38.000 --> 11:41.000] so we're plugging that into Central. [11:41.000 --> 11:44.000] We have a visualization tool for SBOMs, [11:44.000 --> 11:46.000] i.e. your dependencies. [11:46.000 --> 11:47.000] So you can go and find that. [11:47.000 --> 11:49.000] You can see that for all of these projects. [11:49.000 --> 11:52.000] And again, we're trying to figure out how to score it [11:52.000 --> 11:55.000] so that we can give you the best advice. [11:55.000 --> 11:59.000] Because when you're choosing a dependency, [11:59.000 --> 12:01.000] as we said before, somebody was mentioning [12:01.000 --> 12:03.000] compile levels and things like that, [12:03.000 --> 12:05.000] there's a whole bunch of choices you can make [12:05.000 --> 12:08.000] as to what's the right choice. [12:08.000 --> 12:10.000] But you need to know what your dependencies are [12:10.000 --> 12:12.000] and where the risks are. [12:12.000 --> 12:13.000] We're trying to help you do that. [12:13.000 --> 12:15.000] There's other stuff coming. [12:15.000 --> 12:16.000] We're not the only ones doing this. [12:16.000 --> 12:19.000] Everybody's concerned, and we're all trying [12:19.000 --> 12:22.000] to fix these problems, right? [12:22.000 --> 12:27.000] May 12th was sort of when the clock started for us. [12:27.000 --> 12:29.000] My call to you, and Mikaël's going to talk about [12:29.000 --> 12:30.000] what's happening in the foundation, [12:30.000 --> 12:34.000] but my call to all of you is: start to pay attention, [12:34.000 --> 12:37.000] start looking at the tools that are being proposed [12:37.000 --> 12:39.000] as ways of solving this. [12:39.000 --> 12:43.000] Understand it's not just the tools. [12:43.000 --> 12:44.000] It's the people. [12:44.000 --> 12:46.000] It's the data. [12:46.000 --> 12:49.000] Dependency management is great, [12:49.000 --> 12:52.000] but not all dependency management tools are equal.
[12:52.000 --> 12:54.000] So you have to look at the different tools. [12:54.000 --> 12:56.000] You have to ask yourself how the bad guys are going to behave. [12:56.000 --> 12:59.000] You have to start thinking differently. [12:59.000 --> 13:03.000] If you're an open source project contributor or a committer, [13:03.000 --> 13:06.000] you have to start thinking about this and getting involved. [13:06.000 --> 13:08.000] Start looking at the tools and the standards [13:08.000 --> 13:11.000] that are coming through and seeing how they can help. [13:11.000 --> 13:13.000] Because we need to have the community [13:13.000 --> 13:17.000] doing this and becoming a body of people [13:17.000 --> 13:22.000] who are behaving better in terms of how we develop software, [13:22.000 --> 13:24.000] how we think about security. [13:24.000 --> 13:29.000] Because if we don't do it, it's going to get done for us. [13:29.000 --> 13:30.000] Right, I should stop there. [13:30.000 --> 13:34.000] Hand over to Mikaël, who's going to tell us about [13:34.000 --> 13:36.000] some practical things that we're doing. [13:36.000 --> 13:41.000] Yeah, because it's not all doom and gloom. [13:41.000 --> 13:43.000] Thank you. [13:55.000 --> 13:57.000] Thank you, Mikaël. [13:57.000 --> 13:58.000] Thank you, Steve. [13:58.000 --> 13:59.000] Are you scared? [13:59.000 --> 14:00.000] Yeah? [14:00.000 --> 14:04.000] So, I will talk a little bit about what we do at the Eclipse Foundation. [14:04.000 --> 14:07.000] The good news: we will solve everything. [14:07.000 --> 14:09.000] No, of course not. [14:09.000 --> 14:12.000] Our vision is to be kind of a role model [14:12.000 --> 14:14.000] among open source projects [14:14.000 --> 14:19.000] for how to implement supply chain security best practices. [14:19.000 --> 14:23.000] But, of course, we realise that we cannot just put the burden [14:23.000 --> 14:27.000] of additional security on the shoulders of developers.
[14:27.000 --> 14:30.000] You probably don't think it's important, [14:30.000 --> 14:32.000] or you don't have the time, or you don't have the skills [14:32.000 --> 14:35.000] to actually implement all those best practices. [14:35.000 --> 14:39.000] So, what we want to do is to help our projects [14:39.000 --> 14:41.000] by providing services and tools [14:41.000 --> 14:45.000] and best-practice recommendations. [14:45.000 --> 14:51.000] And we've been able to build the capacity to do that for our projects [14:51.000 --> 14:56.000] thanks to the Open Source Security Foundation [14:56.000 --> 14:59.000] and specifically the Alpha-Omega project, [14:59.000 --> 15:01.000] which provides us funds to build the capacity, [15:01.000 --> 15:04.000] build the team, to actually help our projects. [15:04.000 --> 15:06.000] So, what I would like to show you today [15:06.000 --> 15:09.000] is what we are starting to do for our projects, [15:09.000 --> 15:13.000] what are the tools and the practices that we are implementing, [15:13.000 --> 15:15.000] and also give you some examples [15:15.000 --> 15:20.000] with one of our projects in particular that you may know already. [15:20.000 --> 15:23.000] So, of course, we try to do that with measurement. [15:23.000 --> 15:27.000] We want to measure the current status of security, [15:27.000 --> 15:31.000] analyse that status, and try to improve it iteratively. [15:31.000 --> 15:34.000] So, the very first tool we are using to do that is Scorecard; [15:34.000 --> 15:36.000] it's an OpenSSF project, [15:36.000 --> 15:40.000] and what Scorecard does is run on your GitHub repository [15:40.000 --> 15:44.000] and give you a global score [15:44.000 --> 15:47.000] regarding the security posture of your repository. [15:47.000 --> 15:50.000] So: do you have branch protection? [15:50.000 --> 15:54.000] Do you have a security policy file, and so on?
[15:54.000 --> 15:58.000] So, we run that on our GitHub repositories, [15:58.000 --> 16:01.000] and the nice thing about being a foundation [16:01.000 --> 16:03.000] is that we have a large number of projects, [16:03.000 --> 16:05.000] so we can have a large dataset. [16:05.000 --> 16:08.000] We have about a thousand GitHub repositories, [16:08.000 --> 16:10.000] so we run Scorecard on all of them, [16:10.000 --> 16:12.000] and that's the histogram, [16:12.000 --> 16:15.000] the distribution of the global scores of our projects. [16:15.000 --> 16:17.000] So, you can see we are not too bad, [16:17.000 --> 16:20.000] but we are not too great either. [16:20.000 --> 16:23.000] So, that's what we want to shift to the right, right? [16:23.000 --> 16:27.000] We want more projects with higher scores. [16:27.000 --> 16:31.000] But this score all by itself, in isolation, [16:31.000 --> 16:35.000] does not tell us a lot about what to do. [16:35.000 --> 16:38.000] So, let's dive into some of the findings [16:38.000 --> 16:40.000] we have from this analysis. [16:40.000 --> 16:43.000] We found two issues, basically. [16:43.000 --> 16:46.000] Most of our projects don't have branch protection. [16:46.000 --> 16:51.000] So, who knows what branch protection is on GitHub? [16:51.000 --> 16:53.000] Okay, about half. [16:53.000 --> 16:55.000] So, the most basic branch protection is [16:55.000 --> 16:57.000] "do not force push": [16:57.000 --> 16:59.000] nobody can force push to your project. [16:59.000 --> 17:02.000] And that's very important, because if someone manages to steal credentials, [17:02.000 --> 17:05.000] they can force push to your repo [17:05.000 --> 17:10.000] and add malicious commits to it. [17:10.000 --> 17:13.000] The other issue that Scorecard found [17:13.000 --> 17:15.000] is that most of our GitHub Actions workflows [17:15.000 --> 17:20.000] actually use highly privileged token permissions.
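[Editor's sketch] The branch-protection fix described above (forbidding force pushes) can be applied programmatically through GitHub's REST API ("Update branch protection", `PUT /repos/{owner}/{repo}/branches/{branch}/protection`). This is a hedged illustration, not the Eclipse Foundation's actual automation; the owner, repo, and token values are placeholders you'd supply yourself.

```python
# Sketch: enforce "no force push" branch protection via the GitHub REST API.
# The endpoint and payload fields are GitHub's documented branch-protection API;
# repository names and the token are hypothetical placeholders.
import json
import urllib.request

def protection_payload():
    """Minimal rule set: one review required, no force pushes, no deletions."""
    return {
        "required_status_checks": None,
        "enforce_admins": True,
        "required_pull_request_reviews": {"required_approving_review_count": 1},
        "restrictions": None,
        "allow_force_pushes": False,  # the "most basic protection" from the talk
        "allow_deletions": False,
    }

def protect_branch(owner: str, repo: str, branch: str, token: str) -> int:
    """PUT the protection rules onto one branch; returns the HTTP status."""
    req = urllib.request.Request(
        f"https://api.github.com/repos/{owner}/{repo}/branches/{branch}/protection",
        data=json.dumps(protection_payload()).encode(),
        method="PUT",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Looping `protect_branch` over a list of repositories is how a foundation-scale rollout across a thousand repos could avoid clicking through the GitHub settings UI.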
[17:20.000 --> 17:22.000] So, by default, you may not know this, [17:22.000 --> 17:27.000] but tokens in GitHub Actions have write permissions to the repository. [17:27.000 --> 17:29.000] But actually, most GitHub Actions workflows [17:29.000 --> 17:32.000] don't need write permissions on your repo. [17:32.000 --> 17:34.000] They don't need to be able to push. [17:34.000 --> 17:36.000] So, you can do that. [17:36.000 --> 17:41.000] You can decrease the permissions of your GitHub tokens, [17:41.000 --> 17:44.000] but it's not the default on GitHub. [17:44.000 --> 17:51.000] So, those are the two main things we will focus on to improve our score. [17:51.000 --> 17:54.000] To do that, there are many tools available. [17:54.000 --> 17:57.000] There is one from StepSecurity that is very helpful. [17:57.000 --> 18:00.000] It's a tool that also runs Scorecard on your repository [18:00.000 --> 18:04.000] and provides you the ability to create automatic PRs on your repository [18:04.000 --> 18:06.000] to fix some of those issues. [18:06.000 --> 18:12.000] So, for instance, to lower the permissions on the token, on the bottom right; [18:12.000 --> 18:15.000] the other is about replacing the tag, [18:15.000 --> 18:18.000] the Git tag of the actions you are referencing, with the SHA. [18:18.000 --> 18:20.000] So, for those who know about Docker containers: [18:20.000 --> 18:22.000] it's better to use the SHA rather than the tag, [18:22.000 --> 18:24.000] because tags are not immutable, [18:24.000 --> 18:28.000] and there are plenty of security issues with that. [18:28.000 --> 18:30.000] But what we want to do as well [18:30.000 --> 18:37.000] is to be able to disseminate those best practices [18:37.000 --> 18:39.000] to all of our organizations and projects.
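[Editor's sketch] The two workflow fixes just described, read-only token permissions and SHA-pinned actions, look like this in a GitHub Actions workflow file. This is an illustrative fragment, not a workflow from the talk; the commit SHA shown is a placeholder, not a real `actions/checkout` commit.

```yaml
# .github/workflows/build.yml (illustrative fragment)
name: build

# Fix 1: drop the default read/write GITHUB_TOKEN down to read-only.
permissions:
  contents: read

on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Fix 2: pin the action to a full commit SHA instead of a mutable
      # tag like @v3. The SHA below is a placeholder.
      - uses: actions/checkout@0000000000000000000000000000000000000000 # v3 (placeholder)
      - run: ./mvnw verify
```

Because a tag like `v3` can be moved to point at different (possibly malicious) code, while a commit SHA cannot, pinning by SHA makes the workflow's inputs immutable.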
[18:39.000 --> 18:42.000] In our organization, we have more than 100 [18:42.000 --> 18:44.000] GitHub organizations to manage, [18:44.000 --> 18:47.000] and we have, as I said, more than a thousand repositories. [18:47.000 --> 18:50.000] So, we need some tooling to do that. [18:50.000 --> 18:53.000] I don't know if many of you in the audience [18:53.000 --> 18:56.000] have to manage that many organizations or that many projects, [18:56.000 --> 19:01.000] but going to GitHub to edit and configure your settings, [19:01.000 --> 19:05.000] and especially the security settings of GitHub, [19:05.000 --> 19:06.000] is a pain. [19:06.000 --> 19:10.000] We are developing a tool called Otterdog [19:10.000 --> 19:14.000] that will help us use configuration as code [19:14.000 --> 19:19.000] to deploy the security best practices on GitHub. [19:19.000 --> 19:22.000] We are also following a security framework [19:22.000 --> 19:26.000] to improve the security posture of the supply chain of our projects. [19:26.000 --> 19:30.000] The one we are following is SLSA ("salsa"), [19:30.000 --> 19:33.000] and you can find more on slsa.dev. [19:33.000 --> 19:37.000] It's basically a set of best practices with different requirements, [19:37.000 --> 19:40.000] and the more requirements you comply with, [19:40.000 --> 19:44.000] the higher the SLSA level you comply with as well. [19:44.000 --> 19:49.000] And we have a way for projects to promote their security posture [19:49.000 --> 19:53.000] by displaying their SLSA compliance level. [19:53.000 --> 19:56.000] The most basic stuff, but please do it now: [19:56.000 --> 19:59.000] activate 2FA on your account. [19:59.000 --> 20:02.000] Security starts with the developer; the security of the supply chain [20:02.000 --> 20:04.000] starts with you, [20:04.000 --> 20:08.000] and we will start to enforce that for all of our projects.
[20:08.000 --> 20:11.000] We also generate SBOMs; that's an experiment. [20:11.000 --> 20:16.000] We are starting to use ORT to generate SBOMs for all of our projects, [20:16.000 --> 20:21.000] and we are comparing them with what Sonatype and Maven Central provide, [20:21.000 --> 20:25.000] and also with SBOMs generated by build tools. [20:25.000 --> 20:27.000] It's still an experiment, but we want all of our projects [20:27.000 --> 20:30.000] to be able to generate SBOMs. [20:30.000 --> 20:33.000] And thanks to this funding as well, [20:33.000 --> 20:36.000] we provide security audits to our projects. [20:36.000 --> 20:38.000] So we are funding security audits, [20:38.000 --> 20:43.000] thanks to our partner OSTIF. [20:43.000 --> 20:48.000] We just started three projects this week, [20:48.000 --> 20:50.000] and three more are to come this year. [20:50.000 --> 20:53.000] Of course, I cannot tell you much about them today, [20:53.000 --> 20:56.000] because they're still under audit, [20:56.000 --> 20:59.000] but they will be published by the end of the year. [20:59.000 --> 21:03.000] And finally, I want to talk about one project in particular at Eclipse: [21:03.000 --> 21:07.000] Adoptium and the Temurin distribution of OpenJDK. [21:07.000 --> 21:10.000] Basically, what the project is doing [21:10.000 --> 21:15.000] is trying to build the world's most secure OpenJDK distribution out there. [21:15.000 --> 21:19.000] And to do that, they follow all the best practices I was talking about, [21:19.000 --> 21:24.000] and they are doing a tremendous job in leading the way for all of our projects. [21:24.000 --> 21:27.000] So in particular, they are following two security frameworks [21:27.000 --> 21:29.000] to ensure the security of the supply chain: [21:29.000 --> 21:34.000] SLSA, as I already mentioned for all of our projects, [21:34.000 --> 21:38.000] but they also follow the NIST SSDF framework.
[21:38.000 --> 21:40.000] They are very similar to each other. [21:40.000 --> 21:45.000] One focuses more on the what, and the other on the how. [21:45.000 --> 21:51.000] But the combination of the two makes it very, very secure. [21:51.000 --> 21:53.000] They are today at level three, [21:53.000 --> 21:55.000] at SLSA level two, sorry, [21:55.000 --> 21:59.000] and nearing level three shortly. [21:59.000 --> 22:02.000] And they actually already comply with some of the requirements of level four, [22:02.000 --> 22:05.000] which is the top level of SLSA. [22:05.000 --> 22:09.000] And finally, I would like to mention what makes Temurin and Adoptium [22:09.000 --> 22:12.000] the world's most secure OpenJDK distribution out there: [22:12.000 --> 22:14.000] it's actually reproducible. [22:14.000 --> 22:17.000] If any of you know what a reproducible build is, [22:17.000 --> 22:21.000] it's the top-notch level for ensuring supply chain security. [22:21.000 --> 22:24.000] You can rebuild the binaries on your laptop [22:24.000 --> 22:26.000] and check that they are exactly the same [22:26.000 --> 22:30.000] as the ones distributed by the project's website, [22:30.000 --> 22:33.000] so that you know that the supply chain has not been compromised. [22:33.000 --> 22:36.000] And it's already doing that for JDK 17 and 19, [22:36.000 --> 22:38.000] for Linux and macOS. [22:38.000 --> 22:42.000] They contribute all the patches to achieve that upstream to OpenJDK, [22:42.000 --> 22:46.000] and Windows should be there pretty soon. [22:46.000 --> 22:48.000] That's it for me. [22:49.000 --> 22:52.000] Okay, just one thing. [22:52.000 --> 22:54.000] Give him a clap. [22:58.000 --> 23:02.000] So, I know we're over time, but you don't have to do this. [23:02.000 --> 23:05.000] We're talking about open source projects.
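[Editor's sketch] The reproducible-build check described above, rebuild locally and confirm you got bit-for-bit the same binary as the published one, amounts to comparing cryptographic digests. A minimal sketch, with hypothetical file paths (the real Temurin verification procedure is more involved):

```python
# Sketch of reproducible-build verification: two artifacts are identical
# if and only if they have the same SHA-256 digest.
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def is_reproducible(local_build: str, published: str) -> bool:
    """True when your local rebuild matches the published artifact bit-for-bit."""
    return sha256_of(local_build) == sha256_of(published)
```

If the digests differ, either the build is not yet fully reproducible, or something in the supply chain between the source and the published binary has been altered.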
[23:05.000 --> 23:07.000] You do not have to do any of this, [23:07.000 --> 23:12.000] but what will happen over time is that those projects that do [23:12.000 --> 23:15.000] improve their posture, become more security conscious, [23:15.000 --> 23:17.000] are going to end up being the software projects [23:17.000 --> 23:19.000] that get used more and more, because the governments [23:19.000 --> 23:22.000] are going to force the businesses to make choices, [23:22.000 --> 23:26.000] and the businesses rely on open source dramatically. [23:26.000 --> 23:28.000] But their choices are going to become limited [23:28.000 --> 23:31.000] based on our behavior and our actions. [23:31.000 --> 23:33.000] So we want to get ahead of the game. [23:33.000 --> 23:37.000] So I would encourage you: 2023 is the year of secure supply chains. [23:37.000 --> 23:39.000] Start looking at all this tech, [23:39.000 --> 23:42.000] learn about the standards (some of it's rubbish, [23:42.000 --> 23:45.000] some of it's getting better), look at the tools, [23:45.000 --> 23:47.000] just start to get your head in the game, [23:47.000 --> 23:50.000] start to make choices, start to get involved. [23:50.000 --> 23:52.000] That's it. Thank you. [23:56.000 --> 23:58.000] We're actually out of time. [23:58.000 --> 24:00.000] Oh, we have more time? [24:00.000 --> 24:02.000] If there are questions for anyone or... [24:02.000 --> 24:03.000] Questions? [24:03.000 --> 24:04.000] Comments or... [24:04.000 --> 24:07.000] Anybody scared enough to do something? [24:08.000 --> 24:10.000] I have a question. [24:10.000 --> 24:13.000] So, about the proposed PRs: [24:13.000 --> 24:17.000] is there any possible way that an attacker can actually make [24:17.000 --> 24:20.000] a disguised PR that looks like something close [24:20.000 --> 24:22.000] to a proper security fix, [24:22.000 --> 24:26.000] but in reality it's an actual vulnerability? [24:26.000 --> 24:27.000] Yes.
[24:27.000 --> 24:32.000] So if you're asking, can PRs be... [24:32.000 --> 24:34.000] You're asking if PRs can be faked? [24:34.000 --> 24:35.000] Yeah. [24:36.000 --> 24:39.000] So we could do a whole day, [24:39.000 --> 24:42.000] a whole week, on how the bad guys [24:42.000 --> 24:44.000] will compromise your projects, [24:44.000 --> 24:46.000] and PRs are one of them. [24:46.000 --> 24:48.000] You get somebody who turns up, [24:48.000 --> 24:51.000] and they're very helpful and they like doing the merges for you; [24:51.000 --> 24:55.000] merging is a good place for other code to come in, [24:55.000 --> 24:58.000] because nobody checks it afterwards. [24:58.000 --> 25:01.000] So there are all sorts of places where that can happen, [25:01.000 --> 25:04.000] but honestly, most of the time, [25:04.000 --> 25:06.000] right now, if you want to know the one thing they attack, [25:06.000 --> 25:08.000] it's your build systems, [25:08.000 --> 25:10.000] because almost always [25:10.000 --> 25:14.000] the build is less protected than anything else, [25:14.000 --> 25:16.000] and it's easy to trigger. [25:16.000 --> 25:19.000] So one of the things that we watch for, in terms of the scorecards, [25:19.000 --> 25:25.000] is unusual build behavior. [25:25.000 --> 25:28.000] If a project releases once a month, [25:28.000 --> 25:30.000] and then suddenly it releases five in a row, [25:30.000 --> 25:32.000] there's something wrong. [25:33.000 --> 25:35.000] But honestly, [25:35.000 --> 25:37.000] every way you can think of, [25:37.000 --> 25:40.000] it is happening; people are using it. [25:56.000 --> 25:58.000] Yes, absolutely. [25:58.000 --> 26:01.000] Dependencies can be contaminated.
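[Editor's sketch] The "unusual build behavior" heuristic mentioned above (a project that releases monthly suddenly releasing five in a row) could be approximated by comparing recent release gaps against the project's historical cadence. The function and its thresholds are hypothetical illustrations, not Sonatype's actual detection rules.

```python
# Hypothetical sketch of a release-burst detector: flag a project whose
# recent release gaps are dramatically shorter than its historical cadence.
from datetime import date

def burst_detected(release_dates: list[date], window: int = 5,
                   factor: float = 5.0) -> bool:
    """True if the last `window` releases came `factor`x faster than usual."""
    if len(release_dates) < window + 2:
        return False  # not enough history to judge
    ordered = sorted(release_dates)
    # Gaps in days between consecutive releases.
    gaps = [(b - a).days for a, b in zip(ordered, ordered[1:])]
    # Split into historical gaps vs the gaps among the last `window` releases.
    history, recent = gaps[: -(window - 1)], gaps[-(window - 1):]
    typical = sum(history) / len(history)
    latest = sum(recent) / len(recent)
    return typical > 0 and latest * factor < typical
```

A monthly-release project that suddenly publishes five releases on consecutive days would trip this check; a steady cadence would not.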
[26:01.000 --> 26:03.000] One of the things we talk about: [26:03.000 --> 26:05.000] when you're looking at SBOMs, [26:05.000 --> 26:07.000] you think of log4j, [26:07.000 --> 26:09.000] you're assuming that you can find log4j [26:09.000 --> 26:12.000] because it's listed as a dependency. [26:12.000 --> 26:15.000] Think about all the fat jars that you've ever built [26:15.000 --> 26:18.000] where you've stitched things together. [26:18.000 --> 26:21.000] That information may or may not be in the SBOM, [26:21.000 --> 26:23.000] because it depends on whether the bad guys [26:23.000 --> 26:25.000] who've compromised your projects [26:25.000 --> 26:27.000] make that available. [26:27.000 --> 26:29.000] Does that make sense? [26:29.000 --> 26:31.000] SBOMs are really good, [26:31.000 --> 26:33.000] but you still need good scanning tools, [26:33.000 --> 26:36.000] because the bad guys are trying to hide from the SBOM. [26:36.000 --> 26:40.000] Nothing's new, it's just the game has changed. [26:40.000 --> 26:42.000] Yes? [26:46.000 --> 26:48.000] Is it OK to use gradlew? [26:48.000 --> 26:50.000] I think so. [26:50.000 --> 26:52.000] I'm not sure I... [26:52.000 --> 26:54.000] Do you think it shouldn't be? [26:54.000 --> 26:57.000] Using the Gradle or Maven wrapper, or sbt, whatever. [27:04.000 --> 27:07.000] I have no specific guidance on wrappers, [27:07.000 --> 27:10.000] because I don't know if there is or isn't a problem. [27:10.000 --> 27:13.000] If you think there is a problem, come and tell me afterwards, [27:13.000 --> 27:15.000] but I'm not aware of one. [27:15.000 --> 27:18.000] Just another remark about the proposal [27:18.000 --> 27:21.000] to pull some code from the repository again. [27:21.000 --> 27:25.000] Remember, somebody who deleted all of his code [27:25.000 --> 27:27.000] and broke the internet? [27:27.000 --> 27:29.000] Oh, yes.
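The fat-jar point above, that a shaded jar can carry log4j classes a dependency-list-based SBOM never mentions, is why scanners look inside the archive itself. A minimal sketch of that kind of scan (the function name is hypothetical, and real scanners match many more packages and also recurse into nested jars):

```python
import zipfile

def find_shaded_log4j(jar_path):
    """Look inside a built jar for bundled log4j classes and nested jars."""
    hits = []
    with zipfile.ZipFile(jar_path) as jar:
        for name in jar.namelist():
            # shaded/fat jars carry third-party classes directly in the archive
            if name.startswith("org/apache/logging/log4j/") and name.endswith(".class"):
                hits.append(name)
            elif name.endswith(".jar"):
                # spring-boot-style nested jars would need scanning recursively
                hits.append("nested jar, scan separately: " + name)
    return hits
```

The point is exactly what the speaker says: the SBOM tells you what someone claims is inside; a scan of the artifact tells you what is actually inside.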
[27:29.000 --> 27:31.000] We're talking about situations [27:31.000 --> 27:34.000] where people have removed code from a repository. [27:34.000 --> 27:37.000] Node was the best example recently, [27:37.000 --> 27:40.000] where people just took something out of the repository. [27:40.000 --> 27:44.000] Basically, one of your dependencies disappears, [27:44.000 --> 27:47.000] and of course your build process breaks. [27:47.000 --> 27:49.000] The one that's worse than that [27:49.000 --> 27:52.000] is you may have heard of occasions where people [27:52.000 --> 27:55.000] who are the valid owners of a dependency, [27:55.000 --> 27:58.000] the committer of that thing, have put bad code in. [27:58.000 --> 28:01.000] Not deliberately as in trying to crash the internet, [28:01.000 --> 28:04.000] but for instance, there was one that was trying to do geolocation, [28:04.000 --> 28:07.000] as in: if you're in Russia when you use this, [28:07.000 --> 28:09.000] bad things will happen. [28:09.000 --> 28:11.000] And of course, they got it completely wrong [28:11.000 --> 28:13.000] and a lot of people were hurt. [28:13.000 --> 28:15.000] But that's just an example of the sorts of things [28:15.000 --> 28:17.000] that people try to do, [28:17.000 --> 28:20.000] and it's just going to become more common. [28:20.000 --> 28:23.000] We as a community rely on trust, [28:23.000 --> 28:26.000] and I'm afraid that trust is being diluted, [28:26.000 --> 28:28.000] because we have a lot of people [28:28.000 --> 28:31.000] who are going to exploit our trust, [28:31.000 --> 28:34.000] and so we have to learn to be protected against it. [28:34.000 --> 28:36.000] Sorry, that's the way it is. [28:36.000 --> 28:38.000] Yeah.
[28:38.000 --> 28:40.000] So, about reproducible builds, [28:40.000 --> 28:42.000] I know that with the JDK [28:42.000 --> 28:44.000] you're getting now a reproducible build, [28:44.000 --> 28:47.000] but what about all the Maven artifacts, [28:47.000 --> 28:49.000] like all the jars? [28:49.000 --> 28:54.000] Can we actually get reproducible builds [28:54.000 --> 28:56.000] for the jars? [28:56.000 --> 28:58.000] So the whole point of reproducible builds [28:58.000 --> 29:00.000] is that you can be absolutely certain [29:00.000 --> 29:03.000] that you can produce a binary that looks identical, [29:03.000 --> 29:08.000] apart from specified differences such as dates. [29:08.000 --> 29:10.000] So the reproducible build process says: [29:10.000 --> 29:13.000] if you're actually paranoid, check out the source code, [29:13.000 --> 29:16.000] compile it, and you should get a binary [29:16.000 --> 29:18.000] that you can prove is the same as the one [29:18.000 --> 29:20.000] that you got from a supplier. [29:20.000 --> 29:23.000] Now, conceptually, [29:23.000 --> 29:25.000] you can do that across all binaries, [29:25.000 --> 29:28.000] provided that you've got a way of describing the differences [29:28.000 --> 29:30.000] and they're not too big. [29:30.000 --> 29:32.000] So what the reproducible build process has done, [29:32.000 --> 29:34.000] like with Temurin, is they've worked it down [29:34.000 --> 29:36.000] to just two or three differences. [29:36.000 --> 29:38.000] So it's easy to spot, [29:38.000 --> 29:40.000] because otherwise you could obviously [29:40.000 --> 29:42.000] rebuild a jar file from source code [29:42.000 --> 29:45.000] and you get a binary, and depending on what compiler you used, [29:45.000 --> 29:48.000] it might be slightly different in 50,000 places. [29:48.000 --> 29:50.000] So the idea of the reproducible build [29:50.000 --> 29:54.000] is to get the differences down [29:54.000 --> 29:58.000] to something that you can assess.
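The "same apart from specified differences" check described above can be sketched for jars: since a jar is a zip archive, you can hash each entry's contents while deliberately ignoring metadata like entry timestamps, and then compare an official artifact against your own rebuild entry by entry. This is an illustrative sketch, not a full reproducible-builds verifier (real verification also normalizes entry ordering, permissions, and so on):

```python
import hashlib
import zipfile

def archive_digests(path):
    """Map each archive entry to a hash of its contents, ignoring zip timestamps."""
    with zipfile.ZipFile(path) as z:
        return {info.filename: hashlib.sha256(z.read(info)).hexdigest()
                for info in z.infolist() if not info.is_dir()}

def compare_builds(official_jar, rebuilt_jar):
    """Return the entry names that differ; an empty list means the rebuild matches."""
    a, b = archive_digests(official_jar), archive_digests(rebuilt_jar)
    return sorted(name for name in a.keys() | b.keys() if a.get(name) != b.get(name))
```

For Maven specifically, the reproducible-builds guidance centres on setting the `project.build.outputTimestamp` property in the POM, which pins the timestamps embedded in archives so that even whole-file checksums of two rebuilds can match.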
[29:58.000 --> 30:02.000] And you can see, for people who are paranoid [30:02.000 --> 30:05.000] and want to ensure that they're not actually taking downloads [30:05.000 --> 30:07.000] but are actually rebuilding from source, [30:07.000 --> 30:11.000] reproducible builds are the way a lot of big companies [30:11.000 --> 30:14.000] like to do that, taking the source and building it themselves, [30:14.000 --> 30:16.000] but they need reproducible builds to make that happen, [30:16.000 --> 30:20.000] which is another reason why you'll see more and more of it. [30:20.000 --> 30:22.000] More? Yes? [30:22.000 --> 30:24.000] If you're looking for reproducible builds, [30:24.000 --> 30:26.000] also for maintenance and so on, [30:26.000 --> 30:30.000] look at NixOS, because they have... [30:30.000 --> 30:32.000] So if you're looking for reproducible builds for Java, [30:32.000 --> 30:34.000] it's on NixOS? [30:34.000 --> 30:36.000] Yeah, nixos.org. [30:36.000 --> 30:39.000] Oh, thank you, more resources. [30:39.000 --> 30:42.000] There are pretty good recipes for Maven [30:42.000 --> 30:45.000] to build reproducible jars with Maven. [30:45.000 --> 30:48.000] So check on the reproducible-builds [30:48.000 --> 30:50.000] site, dot org or whatever the extension is, [30:50.000 --> 30:54.000] slash Java, and you will have the process. [30:54.000 --> 30:56.000] Hi, so I'm an open source maintainer, [30:56.000 --> 30:59.000] and you really scared me with your idea [30:59.000 --> 31:02.000] of regulators, governments coming in to put in place rules [31:02.000 --> 31:04.000] and telling me what I should be doing. [31:04.000 --> 31:06.000] Have you got any advice, like, what can we do [31:06.000 --> 31:08.000] as open source people to, you know, [31:08.000 --> 31:10.000] prevent that from happening, [31:10.000 --> 31:13.000] that a stupid regulation comes in that makes us... [31:13.000 --> 31:15.000] Can we prevent a stupid regulation coming in? [31:15.000 --> 31:17.000] I doubt it.
[31:17.000 --> 31:20.000] What we can do is manage it. [31:20.000 --> 31:22.000] What we're trying to do, [31:22.000 --> 31:24.000] and what the Linux Foundation is doing, [31:24.000 --> 31:27.000] and Google and IBM and everyone are trying to do, [31:27.000 --> 31:29.000] is to come up with something that will work, [31:29.000 --> 31:32.000] because we need these protections. [31:32.000 --> 31:34.000] It's not like we don't need them. [31:34.000 --> 31:37.000] We just want it done in a way that doesn't mean [31:37.000 --> 31:40.000] that everybody who's writing open source stops and goes home, [31:40.000 --> 31:44.000] because 90% of a typical application is open source, [31:44.000 --> 31:46.000] which is why we're all scared. [31:46.000 --> 31:49.000] So the basic advice I would give you, [31:49.000 --> 31:52.000] two things: one is get your head in the game [31:52.000 --> 31:55.000] and start looking at these standards [31:55.000 --> 31:57.000] and what's happening; go to OpenSSF, [31:57.000 --> 31:59.000] start reading about what's happening. [31:59.000 --> 32:03.000] Look at the scorecard processes that people are putting together, [32:03.000 --> 32:06.000] because they will help you understand what you've got to do. [32:06.000 --> 32:09.000] So the Linux Foundation is a really good example, [32:09.000 --> 32:13.000] because you're looking at SLSA and Scorecard; [32:13.000 --> 32:16.000] CNCF has a thing called CLOMonitor. [32:16.000 --> 32:19.000] Some of them are really straightforward. [32:19.000 --> 32:23.000] There's simple things like, do you have a SECURITY.md file? [32:23.000 --> 32:27.000] Do you use branch protection, things like that? [32:27.000 --> 32:31.000] So there's that list, and that will get you somewhere.
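The simple end of those checks, does a security policy file exist, is a license present, are dependencies declared in a lockable manifest, can be spot-checked locally in a few lines. This is a hypothetical sketch inspired by Scorecard/CLOMonitor-style criteria, not either tool's actual check list; the real tools check far more (branch protection, signed releases, CI configuration, contributor activity):

```python
from pathlib import Path

# Illustrative criteria only; file lists here are assumptions, not a standard.
CHECKS = {
    "security policy": ["SECURITY.md", ".github/SECURITY.md"],
    "license": ["LICENSE", "LICENSE.md", "COPYING"],
    "dependency manifest": ["pom.xml", "gradle.lockfile", "package-lock.json"],
}

def hygiene_report(repo_root):
    """Report which basic hygiene files are present in a checked-out repo."""
    root = Path(repo_root)
    return {check: any((root / f).is_file() for f in files)
            for check, files in CHECKS.items()}
```

Running the real Scorecard tool against your repository gives you the authoritative version of this; the sketch just shows why these checks are cheap to pass.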
[32:31.000 --> 32:35.000] And if you start to say, I'm doing these things, [32:35.000 --> 32:37.000] and here's the protection I'm doing, [32:37.000 --> 32:39.000] if you're public about your behavior, [32:39.000 --> 32:41.000] it becomes just a little bit easier for us to see [32:41.000 --> 32:43.000] that people are following it. [32:43.000 --> 32:47.000] My expectation, honestly, is that one or more of these standards [32:47.000 --> 32:50.000] will fall out and it will become the bar. [32:50.000 --> 32:53.000] And we've just got to make sure it's not a heavy bar, [32:53.000 --> 32:55.000] but it's something that we can all agree [32:55.000 --> 32:58.000] makes reasonable sense as the next thing to do. [32:58.000 --> 33:01.000] But my final point, my final advice to you is, [33:01.000 --> 33:04.000] you have now got to learn to be suspicious. [33:04.000 --> 33:07.000] So when anybody contributes to your stuff, [33:07.000 --> 33:10.000] think about: is it the right thing? [33:10.000 --> 33:13.000] When your code is designed, when you design code, [33:13.000 --> 33:16.000] think about the unhappy path. [33:16.000 --> 33:18.000] When you use dependencies yourself, [33:18.000 --> 33:21.000] think about: do I trust the people who wrote that? [33:21.000 --> 33:24.000] Go look at the website, go look at the GitHub repo. [33:24.000 --> 33:27.000] How many of you have downloaded something for the first time? [33:27.000 --> 33:30.000] How many of you have gone to the GitHub repo and gone, [33:30.000 --> 33:32.000] oh, this hasn't been updated for seven years, [33:32.000 --> 33:34.000] I'm not going to use that? [33:34.000 --> 33:35.000] We all do that. [33:35.000 --> 33:39.000] So do it more often and just get a bit more thoughtful [33:39.000 --> 33:41.000] about your choices. [33:41.000 --> 33:43.000] I think we're way over time. [33:43.000 --> 33:44.000] No? [33:44.000 --> 33:46.000] Wow, okay, boy. [33:46.000 --> 33:47.000] I've got to talk to you.
[33:47.000 --> 33:50.000] To Nikki's point earlier about the read and write [33:50.000 --> 33:52.000] GitHub token permission. [33:52.000 --> 33:54.000] So GitHub announced last week that essentially [33:54.000 --> 33:56.000] all newly created repositories would by default [33:56.000 --> 33:58.000] come with a read-only token. [33:58.000 --> 34:00.000] But at this current stage, they've left it open, [34:00.000 --> 34:04.000] so any existing repository will stick to write by default. [34:04.000 --> 34:06.000] I wanted to see whether you had an opinion [34:06.000 --> 34:08.000] on whether that was the right move to make, [34:08.000 --> 34:10.000] or whether at some point they should consider [34:10.000 --> 34:13.000] actually making the switch for everything across GitHub. [34:13.000 --> 34:14.000] That's a good question. [34:14.000 --> 34:20.000] Should GitHub switch the token permission for everything? [34:20.000 --> 34:22.000] It's the same problem that we have, [34:22.000 --> 34:24.000] or the same sort of question we have, [34:24.000 --> 34:26.000] that Simon asked about Maven Central. [34:26.000 --> 34:29.000] If you do that overnight, unilaterally, [34:29.000 --> 34:31.000] like if you said, okay, from now on [34:31.000 --> 34:33.000] all APIs are going to be paid for, [34:33.000 --> 34:35.000] where did that happen? [34:35.000 --> 34:37.000] You can see the consequences. [34:37.000 --> 34:40.000] So you have to understand. [34:40.000 --> 34:43.000] So I would say, if you look at what GitHub are doing, [34:43.000 --> 34:45.000] they will obviously have a program, [34:45.000 --> 34:48.000] and they're working through it and trying to make it safe, [34:48.000 --> 34:50.000] because obviously, with GitHub, [34:51.000 --> 34:55.000] their business is all about this. [34:55.000 --> 34:58.000] So they've got a real vested interest in making sure [34:58.000 --> 35:01.000] that they're providing as many security features as possible.
[35:01.000 --> 35:06.000] I think if we agree that there's a situation [35:06.000 --> 35:08.000] where you should have tokens that are read-only, [35:08.000 --> 35:10.000] because you don't need write rights, [35:10.000 --> 35:11.000] then why the hell have we got write? [35:11.000 --> 35:14.000] So they should produce one to make it obvious that it exists. [35:15.000 --> 35:17.000] Okay. [35:19.000 --> 35:21.000] Are we around today? [35:21.000 --> 35:23.000] Are you in this room today? I'm around today. [35:23.000 --> 35:24.000] You too. [35:24.000 --> 35:26.000] Yeah, come and find us. [35:26.000 --> 35:30.000] And also one thing: the session after the one coming up now [35:30.000 --> 35:32.000] will be by people from the Linux Foundation [35:32.000 --> 35:34.000] and from CNCF. [35:34.000 --> 35:36.000] So we can continue this discussion there. [35:36.000 --> 35:39.000] And also it will be about SBOMs and supply chains, etc. [35:39.000 --> 35:40.000] The same thing again. [35:40.000 --> 35:41.000] Good. [35:41.000 --> 35:43.000] I'll leave you with a completely different story. [35:43.000 --> 35:45.000] Thank you very much.
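One concrete footnote on the GITHUB_TOKEN exchange above: regardless of what default a repository inherits, an individual GitHub Actions workflow can pin its own token to least privilege with the `permissions` key. A minimal sketch; the workflow name and build command are placeholders:

```yaml
# .github/workflows/build.yml (placeholder name)
name: build

on: [push, pull_request]

# Override the repository default: grant this workflow's GITHUB_TOKEN
# read-only access to repository contents, and nothing more.
permissions:
  contents: read

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./mvnw -B verify   # placeholder build command
```

This is exactly the "you don't need write rights, so why have we got write?" argument applied per workflow, without waiting for GitHub to flip the default on existing repositories.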