Okay, thank you. My name is Jos van Dijk. Disclaimer: I work for a commercial company called Yubico. You may have heard of YubiKeys; Yubico produces those. But I'm not going to talk about YubiKeys. I'm going to talk about a technology called FIDO. Many of you will have seen the previous presentation, which was about one application of FIDO.

The goal of this presentation is to explain what FIDO is. I'll give a quick introduction to FIDO and then look at what you can use it for. Many people will already have seen the obvious use case: authentication, primarily authentication on the web. But I'm going to talk about things FIDO can be used for that do not involve a web browser. These uses are less well known, so I think it's interesting to look at them, and I'll give examples of open source software that you can use today that actually uses FIDO to do things that don't involve a browser.

So let me first explain FIDO. FIDO is actually a set of specifications. One is by the World Wide Web Consortium; that is primarily about using FIDO in a web browser. The other one is about using security keys, the tokens like this one that you typically carry on your key ring. In FIDO this is called a roaming authenticator, and the idea is that you protect your private keys on a piece of hardware that has protection against extracting key material. That protocol is called CTAP, and it comes from a different organization, the FIDO Alliance. It is specifically about talking to authenticators like this one.

How does that work? I'm simplifying, because there are a lot of details I don't want to get into; that would take too much time. Let's first look at the Web Authentication part, so using a web browser. We have some relying party, which will typically be a web server. Authentication works like many authentication protocols: you use a challenge-response mechanism, where you use asymmetric cryptography to sign a challenge, and then the verifier, the relying party, checks the signature. If it checks out, you're authenticated.

The idea is that of these two protocols, WebAuthn is used in the web setting. For example, the web server sends a challenge to a browser, and the browser uses the WebAuthn API, which is simply a JavaScript API, to initiate either the registration of a public key or authentication using that public key. That's the web part. On the back end, your web browser communicates with a security key, the roaming authenticator.
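To make the challenge-response idea concrete, here is a minimal sketch in Python using the `cryptography` package, with a plain ECDSA P-256 key pair standing in for the credential. Real WebAuthn signatures cover more than the bare challenge (the authenticator data and the client data), but the verification principle is the same.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

# Registration: the authenticator generates a key pair and the relying
# party stores only the public key (the private key never leaves the key).
private_key = ec.generate_private_key(ec.SECP256R1())   # lives on the authenticator
public_key = private_key.public_key()                   # stored by the relying party

# Authentication: the relying party sends a fresh random challenge ...
challenge = os.urandom(32)

# ... the authenticator signs it with the private key ...
signature = private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

# ... and the relying party verifies the signature with the stored public key.
try:
    public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
    print("authenticated")
except InvalidSignature:
    print("rejected")
```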
The browser just relays that challenge to the key, asking it to sign, and the response is passed back to the relying party, which verifies it.

So what is all the fuss about passkeys and FIDO and anti-phishing? Well, that is the merit of FIDO2: it has phishing protection. That is because, along with the challenge, the web browser helps secure things by injecting the origin of the site you are authenticating with, and this is covered by the signature. So if you end up at a phishing site, the signature will not verify, because it will contain a different identifier than the genuine site's. That is why FIDO protects you against phishing.

But actually I'm not going to talk about that use case. I'm going to talk about the right part of this picture, where we use CTAP to communicate with an authenticator. There are all kinds of authenticators. Yes, I work for a company that produces them, but it is an open standard, so anyone can build a security key. I'm using "security key" and "roaming authenticator" interchangeably here. These are all security keys by different vendors: my employer is there, of course, but there is also Feitian, for example, Google, Nitrokey, SoloKeys. SoloKeys is interesting because it is also open source hardware: anyone can build a Solo key, and the firmware is open source, everything. Nitrokey actually uses the same firmware base. Anything I talk about in this talk will work with any of these security keys.

So how does the protocol work? I'm focusing on CTAP, the back end, so talking to an authenticator. The idea is that first you have to register. Registration just means registering your public key with the verifier, the relying party, whoever that is; later you can use that public key for authentication. So there are two steps: registration and authentication. In the registration step, without going into all the details, you register your public key with a relying party, and this includes something called the relying party ID. In WebAuthn this is the identifier of the web server, but in other applications it can be anything. The point is that it is included in any signature that you generate. You fix this relying party ID when you register, and later, during authentication, the relying party ID is included in the signature, so as a relying party you can verify that the credential is being used for your application. You cannot use the same public key for some other application with a different relying party ID. I'm not going into more detail than that.
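Here is a small illustration of the checks behind that binding; the sample data is hypothetical, but the structure follows WebAuthn: the authenticator data starts with the SHA-256 hash of the relying party ID, and the client data the browser produces (and the authenticator's signature covers) records the origin the browser actually talked to, so a phishing origin fails verification.

```python
import hashlib
import json

def rp_id_hash_matches(authenticator_data: bytes, expected_rp_id: str) -> bool:
    # The first 32 bytes of the authenticator data are SHA-256(rp_id).
    return authenticator_data[:32] == hashlib.sha256(expected_rp_id.encode()).digest()

def origin_matches(client_data_json: bytes, expected_origin: str) -> bool:
    # The browser records the origin it actually talked to in the client data,
    # and that client data is covered by the authenticator's signature.
    return json.loads(client_data_json)["origin"] == expected_origin

# Hypothetical data standing in for what a browser and authenticator produce:
auth_data = hashlib.sha256(b"example.com").digest() + b"\x01" + b"\x00" * 4
genuine = json.dumps({"type": "webauthn.get", "challenge": "...",
                      "origin": "https://example.com"}).encode()
phished = json.dumps({"type": "webauthn.get", "challenge": "...",
                      "origin": "https://examp1e.com"}).encode()

print(rp_id_hash_matches(auth_data, "example.com"))    # True
print(origin_matches(genuine, "https://example.com"))  # True
print(origin_matches(phished, "https://example.com"))  # False: phishing origin rejected
```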
Now, you might think: well, I can do all of this with PGP, or with smart cards. So what is different about security keys if you're not using them in a web browser? Actually, many of the things I'm talking about will also work with PGP or other technologies, although there are some specific features that are not always included. One of them is attestation. Attestation means that you can prove that some signature was generated by a security key. Of course, if you know that the public key was generated on a security key, then that is obviously the case. But if you're dealing with someone who claims to have a security key and you're not sure, you can actually verify it through this process called attestation. You can prove that someone used, let's say, a Google Titan key to generate the signature. That is what is called attestation.

And there is a service hosted by the FIDO Alliance, the organization that produces these specifications, where they host metadata. If you have a security key, it will have an identifier that is not unique to that particular YubiKey but unique to the make and model. So any Titan key, or any Feitian key, or any YubiKey of a particular make and model will have the same identifier, and the specs say that at least 100,000 keys need to use that same identifier if they are the same make and model. So we can be sure that, let's say, a signature was generated by a Titan key. Attestation together with the metadata really adds something to this process.

Here's an example of the metadata. Someone built a nice web view of it, so you can look up things like, of course, who the vendor of this YubiKey or security key is, but also whether it uses protected hardware and whether it is certified to a certain security level. All of these things you can actually use. I'm not going to do any demos; I've included all my demo slides for you to try yourself, since we don't have time here. This is a way to extract the metadata. It's a bit technical, but if you want to try it, please do.

Then, about open source software: Yubico publishes a FIDO library, and it's actually used by a lot of open source projects. It is open source, although it's produced by Yubico. If you look on GitHub, for example, at all the projects that use this library, there are lots of interesting ones. That means you can use a security key, any security key by any vendor, with that software. What I'll do in the rest of the talk is give you some examples.
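The make-and-model identifier described above is called the AAGUID in the FIDO specifications, and the metadata service publishes one entry per AAGUID. Below is a rough sketch of looking up such an entry. The blob URL and payload layout are assumptions based on the v3 metadata service, so check the FIDO Alliance documentation before relying on them, and note that a real relying party must also verify the JWT's signature chain, which this sketch skips.

```python
import base64
import json
import urllib.request

MDS_URL = "https://mds3.fidoalliance.org/"   # assumed MDS v3 blob location

def b64url_decode(segment: str) -> bytes:
    # JWT segments use unpadded base64url encoding.
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

# The metadata BLOB is a JWT: header.payload.signature.
with urllib.request.urlopen(MDS_URL) as resp:
    jwt_blob = resp.read().decode()

# NOTE: a real verifier must validate the JWT's certificate chain and
# signature before trusting the metadata; skipped here for brevity.
payload = json.loads(b64url_decode(jwt_blob.split(".")[1]))

aaguid = "00000000-0000-0000-0000-000000000000"  # replace with your key's AAGUID
for entry in payload.get("entries", []):
    if entry.get("aaguid") == aaguid:
        stmt = entry["metadataStatement"]
        print(stmt.get("description"))     # vendor / product description
        print(entry.get("statusReports"))  # certification status, if any
        break
```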
It's interesting that, although FIDO was primarily intended for authentication, you can actually do other things with it: you can do encryption, you can do signing, and you can actually store things on the key. I'll give an example of each of these features.

Let's start with a very simple example, a pluggable authentication module. That's another open source library that is included in many Linux distributions, and the idea is that you can...

[There is a gap in the recording here, roughly from 10:38 to 16:18.]

...open source software, you can try it out yourself. But let's give a practical example. Say you want to encrypt your hard disk. You're on a Linux system, you're using LUKS, and typically this is done with a password. Instead of a password, you can also use a security key, a FIDO key: rather than deriving the encryption key from your password, it is derived from an extra secret that is generated on your security key. This means that if you want to decrypt your hard disk, you just need to insert your security key. It is a "something you have" factor, which gives you extra confidence that only you can decrypt that disk. Worth looking at.

Then there's another extension, called large blobs, which is for storing things on your security key. It doesn't have a lot of space, but this is typically used for storing certificates. Say you're using SSH, and you use SSH with SSH certificates. These are small files, and it's feasible to store them on your security key. This extension is not implemented very often at the moment; I think there are a couple of vendors that actually implement it. But it means that if you move to a different system and want to log in there, you can take your security key and extract both the public key and the certificate (the private key, of course, stays on your security key) and then log into the remote server from there. So everything is contained in the same security key. Here's an example of how you do this with the tools; try it yourself if you have a key that supports it.
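Returning to the disk-encryption example for a moment: the "extra secret generated on your security key" comes from the CTAP2 hmac-secret extension, and the sketch below is a simplified model of that idea, not the wire protocol. The authenticator holds a per-credential secret that never leaves the hardware and returns an HMAC over a salt that the host stores next to the encrypted volume; presenting the same salt to the same key reproduces the same unlock secret.

```python
import hashlib
import hmac
import os

# On the authenticator: a per-credential secret that, in the real protocol,
# never leaves the hardware.
cred_random = os.urandom(32)

def authenticator_hmac_secret(salt: bytes) -> bytes:
    # Conceptually what the hmac-secret extension returns for a given salt.
    return hmac.new(cred_random, salt, hashlib.sha256).digest()

# Enrollment: the host picks a random salt, stores it (for example in the
# LUKS header) and uses the HMAC output as the volume unlock secret.
salt = os.urandom(32)
unlock_secret = authenticator_hmac_secret(salt)

# Unlocking later: present the same salt to the same key and you get the
# same secret back; without the key (cred_random) the salt is useless.
assert authenticator_hmac_secret(salt) == unlock_secret
print("unlock secret reproduced:", unlock_secret.hex()[:16], "...")
```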
Finally, the last example is about attestation. If you generate an SSH key that is backed by a security key, you can actually ask the security key to provide an attestation. There's an extra parameter to ssh-keygen that will extract what is called the attestation data, which you can then look up in the FIDO metadata service. This way you can prove that a signature was generated by some security key, you can look up exactly which one, and you can be certain that this was done with secure hardware. Okay, you can try that out as well.

So, getting to a conclusion. I'm not saying you have to stop using PGP or anything, but this is an alternative for doing things with secure hardware. And now that all the big vendors, Apple, Google and Microsoft, have jumped on the FIDO bandwagon, we will see a lot more support for FIDO in the future, and that means a lot more people will own a FIDO security key. For example, as I already mentioned this morning, if you have an Apple iOS device you can protect your Apple ID with a FIDO security key; it's built into iOS 16.3, I think. So it is a viable alternative, and one that is actually also a lot cheaper than many of the other hardware keys you can buy, like a smart card. For 20 or 30 bucks you can buy a FIDO security key, whereas smart cards, especially if you consider all the middleware and everything else you need to get them working, can be a lot more expensive.

So, just a list of resources if you want to look things up. Of course, you can also contact me; I'll be here all day, and I'm happy to take questions if there are any.

Hi, I have a question about the attestation process and open implementations of FIDO2. If you have an open implementation, is it possible to get that certified with the FIDO Alliance? My understanding is that this would be needed to enable people to compile their own binaries and put them on their own open keys. Is that possible or not?

Yes. So as a relying party, there is a certification process you can use to test whether you are compatible with the FIDO standards. There are different certification programs. The heaviest ones are for the actual security keys; there you have to do a lot of work to get certified. But there is also a self-certification toolkit, I think, that you can use to check whether you are compatible with the FIDO standards, so there are a lot of tests you can run against your own relying party software.

[Follow-up question, partly inaudible.]
Yes, exactly. Any other questions?

What is publicly known about hardware token failure rates, at least for popular models, and how many identical devices would you personally enroll for the most important things?

The question is two-fold, I think. The first part is about failure rates. I actually don't know about failure rates, but of course there are different ways to do user verification. I skipped over it a bit, but to use a security key you need to touch it, and sometimes also prove that it is really you using it, which is typically done with a PIN or with a biometric. I've never seen a security key fail, except when it's using biometrics, because every biometric has a false acceptance rate and a false rejection rate. I don't know the numbers, and it differs between vendors; Apple, for example, doesn't say anything about their false acceptance rate, so I guess you just have to trust them, and that is the case with many vendors. Then I missed the second part of your question.

Oh yes, as already mentioned: if you lose your security key you have a problem, so you want to have multiple public keys registered. How many depends on the relying party. Some relying parties initially only allowed you to register one, which is pretty useless, because if you lose that one you're out of business. Usually you can register several; I think on an Apple iOS device you can now register six, but that really depends on the relying party. Some allow an unlimited number, so you can register as many as you like, but again, that depends on the relying party.

Okay, we've got time for the next question. I see a hand here.

I guess we now have a single point of failure with the USB socket, especially if we are travelling. So are there any plans to have an implementation of the Bluetooth FIDO standards, so that we can still work if USB is broken, maybe using the phone or whatever?

Could we have a little quiet, please? I have trouble hearing the question.
Are there any plans for Bluetooth support in the libraries? Especially when I'm travelling, I often have problems with USB sockets, so I would have a single point of failure there.

So you're saying that if you have problems with your USB sockets, then you cannot use the key, so is there an alternative?

Yes, and I would prefer to have some Bluetooth fallback. The FIDO standard also specifies FIDO over Bluetooth, but I see it is only implemented on Windows and Android, not in the Linux libraries.

Ah, Bluetooth, yes. So Bluetooth support is not in the FIDO2 library; it only has USB and NFC. No Bluetooth, but there may be an addition. In the next version of CTAP there will be a way to do this, and maybe you've already seen it in Google Chrome: you can generate a QR code so that you can use your phone. If you have enrolled your phone as an authenticator, you can do it over Bluetooth Low Energy, but the Bluetooth channel is only used to check proximity; it's not used to actually transfer anything. For example, if you register a key, your public key is not sent over Bluetooth. But that will be in the next version of CTAP.

Okay, thank you for the talk. We are out of time, so a round of applause, please. Thank you.