[00:00.000 --> 02:17.360] Okay, then we continue with music, with the next talk, which is "Become a rock star using free and open source software". Please welcome Lorenzo.

Thank you. Thank you very much. It's going to be a hard job following up a guitar pedal, a synthesizer and an amplifier: I didn't bring anything to the table, so you'll just have to endure me talking about it. I hope that's fine anyway. First of all, this is my first talk in Open Media, and I already feel like a fraud because, clickbait alert, I'm not a rock star at all. I mean, this is an email that I got from Spotify last month: I got two listeners last month, and one of those was me. So I think on average you need a bit more than that to actually be called a rock star, but it doesn't really matter. I've had a lot of fun in the past few years just playing with music and open source, and I wanted to use all of this for something rather than just showing it around at home. For a living, of course, I don't do any music at all: I'm just a hobbyist musician, and that's what I'm going to be talking about today. In the real world I am a WebRTC developer, for instance involved in some of the things that Dan mentioned in his previous talk. I love hard rock, metal, orchestral music, symphonic music, and when they work together as well, and this is something that I've been trying to do in my own music too. Here you can find some links if you want to get in touch: for instance, you can reach me on Mastodon, and there are a couple of links to my music as well, and so on and so forth.

This is just a very basic and completely out-of-order table of contents. I'll talk about a few different things, and I will not follow this order, because I will mostly follow the order in which I learned how to do things with music and Linux in the first place: how I tried to dip my toes into a few different things, and how I eventually learned to do some more complex things in the process. And of course it will be, if you'll pardon the term, a very dumb presentation, because I will only scratch the surface. I'll try to introduce several different concepts, really just to tickle your interest enough: maybe you have your own guitar gathering dust at home, you don't know how to get started with, for instance, using your laptop to do some music, and maybe this presentation will tickle your interest and you'll start doing something.
[02:17.360 --> 04:42.680] And besides, there is a very good chance that I'll say something dumb as well, or maybe something incorrect. If I do, please bear with me: it's really a high-level presentation, something that's meant as an introduction, not something that goes into much detail.

When I first started learning about all of this, I was really surprised by how mature Linux and its audio ecosystem were for actually doing music production on those machines, because the assumption in the world around you is always that Linux is not good enough for real-time music, that you have to use Windows or macOS or whatever. I disagree with that, because especially when I started with JACK, back then, I found a very interesting ecosystem for doing things. In particular, I really loved the port-based approach, which means you're not stuck with monolithic applications: you have different applications that you can connect arbitrarily, any way you want, possibly connect the same source to multiple applications, and implement very complex workflows, all in real time and at very low latency, which was really amazing. The fact that you can have all these different applications talking to each other also means that you often have a lot of different applications implementing more or less the same requirements: you will have different synthesizers, or different ways of implementing effects for your guitars, and so on. Often one doesn't really need to be a substitute for another: maybe for one genre it's better to use some applications, and for some others it's better to use others. It's really up to your preference and how you like to work with music, and some tools may make more sense than others in that respect.

It's probably not even necessary to make this distinction right now, because we just had a very interesting presentation by Ernst, who explained a bit what MIDI signals are. But when we talk about music production, especially on Linux and JACK, you basically have to know that there are audio signals, that is, sound that has already been produced or recorded in some way, and MIDI signals, which just carry information that is then used to produce sounds. These two can of course go through very different workflows, and different applications can handle just one, or both, or maybe none at all. What's really important, though, is again that you can have different applications that you connect arbitrarily in your own way: that was at the very basis of how JACK was conceived in the first place.
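To make that "boxes with ports" idea concrete, here is a minimal sketch of what such a box looks like in code, assuming the third-party JACK-Client Python package (pip install JACK-Client); PipeWire's JACK compatibility layer runs it unchanged. It registers one audio input and one audio output and just copies samples through, so it shows up in the patchbay as a box you can wire to anything:

```python
import jack

# This process becomes one "box" in the graph: one audio input port,
# one audio output port, and a callback invoked for every audio cycle.
client = jack.Client("passthrough")
inport = client.inports.register("in_1")
outport = client.outports.register("out_1")

@client.set_process_callback
def process(frames):
    # Copy whatever arrives on the input straight to the output.
    outport.get_buffer()[:] = inport.get_buffer()

with client:  # activating the client makes its ports connectable
    print("passthrough running; wire it up in your patchbay, Enter quits")
    input()
```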
[04:42.680 --> 06:34.240] And JACK, a few years ago, was really the way to do very low latency audio on Linux systems. The downside was that it was a bit complex to set up and manage. Luckily, and we've seen a presentation by Wim this morning, PipeWire has made this so much simpler. I was a bit skeptical at the beginning; I only jumped on the bandwagon a couple of weeks ago. Basically, PipeWire comes with an implementation of JACK that hides all the complexity: the applications think they're still using JACK, but in practice you're using PipeWire instead. That means you can also use applications that were not specifically conceived for JACK, and have them all work together while you work on music production of some sort.

All of these small boxes that you see over there are basically different processes, and you can see that some of them have inputs, some of them have outputs, either one or both, these sorts of things. This is what allows you to arbitrarily connect different applications to each other and, again, create more complex workflows than what you can get out of a single application. I'll show a couple of practical examples in a minute.

So let's assume that you have that guitar gathering dust at home and now you want to make some noise: you want to connect it to your laptop and do something. What you don't do, of course, is just plug it into the microphone slot, because that will cause problems. What you need instead is some sort of audio interface: something like an external sound card that has inputs that do accept your instrument. These interfaces often connect over USB, so they are very easy to plug in; your operating system will very likely recognize them out of the box, and they will be available as a system capture, so as one of those boxes that we saw over there, something that you can connect to something else.
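As a quick sanity check that an interface was recognized, the same JACK-Client package can also list the physical ports from code; a small sketch. One detail worth a comment, reflected below: from the graph's point of view, capture ports are outputs (they feed audio into the graph) and playback ports are inputs.

```python
import jack

client = jack.Client("io-check")

print("capture ports (what you can record from):")
for port in client.get_ports(is_audio=True, is_physical=True, is_output=True):
    print(" ", port.name)   # e.g. system:capture_1, system:capture_2

print("playback ports (where you can send sound):")
for port in client.get_ports(is_audio=True, is_physical=True, is_input=True):
    print(" ", port.name)   # e.g. system:playback_1, system:playback_2

client.close()
```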
[06:34.240 --> 09:31.640] The one that I have at home (and spoiler alert: it didn't come with the cat, the cat came with something else) is a Focusrite Scarlett Solo, which I bought because it's quite inexpensive. It's very common among hobbyists, because it already provides decent enough quality for recording at home; it's very flexible and I really like it a lot. The one that I bought comes with two separate inputs, and it has a USB interface, so it's recognized as an external USB sound card by the operating system. Mine in particular (I think later versions changed this a bit) comes with one input that is XLR, the typical cable that you use for microphones, for instance, and another one that is the typical guitar jack slot. And since those are two different inputs, when you connect it, the box that you see, which may have this name or an entirely different name depending on what you're using, shows two different channels. Depending on where you actually plug what you want to plug, it will come out of one of those two different channels, which opens the door to a lot of different things you can do. For instance, I could plug my guitar directly into this external sound card: in this case I'm plugging it into the jack slot, which is capture number two, which means that I can then use this capture number two to do something. The most basic thing that I can do is just connect it to the system playback, so that I hear what I'm playing completely unprocessed: I don't hear any effects, it's just the raw sound of the guitar, but it's something that I can do. Of course, I can do something more interesting, and we'll show an example later. Or maybe you have a very good amplifier at home and a very good microphone: you put the microphone in front of the amplifier, you connect it to the first slot, and what you get on your laptop is an already distorted sound of your guitar, for instance, out of your amplifier. Or maybe you use them both at the same time, which is what I often do for classical and acoustic guitars: I attach the pickup of the guitar, whether it's integrated or added, and I put a microphone in front of the guitar, just so that I capture different frequencies, different sounds. Together they give me a fuller sound than what they would give me individually. Again, these are just very simple examples that show you that, where before you couldn't do anything, with an external sound card like this you now have ways to take your instrument, make it part of a workflow on your own laptop, and do something cool.

And one cool thing that you could do, for instance, is just launch Guitarix. Guitarix is a very complex and effective guitar amplifier simulator, basically. It's very configurable, so you can create your own configuration: you can choose the different bits that you want, and how you want your amplifier to look.
[09:31.640 --> 12:00.240] I'm really stupid in that sense, so I never really tried to do it myself: I work a lot with presets shared by the community. But if you are savvy enough, you can just do things on your own to really shape your own sound, so that the guitar sounds exactly how you want it to sound. When you launch Guitarix, it basically spawns two different boxes as far as JACK is concerned (or JACK slash PipeWire; again, when I say JACK, you can assume I'm also implying PipeWire usage as a consequence): one for the amplifier and one for effects. Since the jack input on my Focusrite Scarlett was capture number two, I connect that to the amplifier, I connect the amplifier to the effects, and I connect the effects to whatever I want: the playback, something that records it. It's really that simple. You have already created a workflow out of that signal that you managed to capture thanks to the external audio interface.

Another application that I love a lot as a guitar player is Rakarrack (I'm not sure if I'm pronouncing it correctly or not), which is not a guitar amplifier simulator as Guitarix is: it's basically a pedalboard simulator instead. It has a lot of different effects that you can use and combine, and it also comes with a lot of different presets; I particularly love the clean sounds that you can get out of Rakarrack. And again, it's a similar approach: you connect whatever capture you have your guitar on to Rakarrack, and then the output of that, so the processed sound of the guitar, is something that you can end up using. You can do something more complex or, in some cases, even dumber, by using both Guitarix and Rakarrack at the same time, putting them one after the other. This is a very simple example, and it probably doesn't make sense to have the effects box in between there, but a similar approach is what I used myself in one of my songs, because I liked an effect in Guitarix, but I also needed a sustainer effect from Rakarrack as well. So I basically just chained them in my workflow: I plug my guitar into the sound card, let Guitarix distort it, and then have Rakarrack do some more things with the sound before I actually use it for something. And again, these are just very simple examples that are meant to show you how easy it is to create a workflow using different applications out of the sounds that you have access to, and do some interesting and really cool things.
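Incidentally, none of this wiring has to be done by hand in a GUI: the chain just described can be scripted too. A sketch assuming the JACK-Client Python package, where the client and port names (gx_head_amp, gx_head_fx, rakarrack) are the ones my versions expose as assumptions; they may differ on yours, so check a patchbay first:

```python
import jack

client = jack.Client("guitar-chain")
client.activate()

# Guitar plugged into input 2 of the interface -> Guitarix amplifier box.
client.connect("system:capture_2", "gx_head_amp:in_0")
# Guitarix amplifier -> Guitarix effects box.
client.connect("gx_head_amp:out_0", "gx_head_fx:in_0")
# Guitarix effects -> Rakarrack, e.g. for an extra sustainer effect.
client.connect("gx_head_fx:out_0", "rakarrack:in_1")
# Rakarrack's stereo output -> the system playback ports, so we hear it.
client.connect("rakarrack:out_1", "system:playback_1")
client.connect("rakarrack:out_2", "system:playback_2")

client.close()
```

Running it once is the equivalent of dragging those five cables in a patchbay like QjackCtl.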
[12:00.240 --> 14:39.560] So let's assume that we now managed to get a decent sound, and we want to record something, because we want to write a song or whatever. If you want to do something very simple, so just record the sounds and then use them somewhere else, you can use any tool that is actually capable of consuming these sources: Audacity or GStreamer come to mind, but there are so many more. If you want to do something more complex, maybe write a song, no matter how complex it is, you'll want to work within some sort of project instead: an application that is capable of handling multiple tracks at the same time, and that can maybe add different filters to all these tracks, because you need a compressor on one, or maybe reverb on some tracks, or you need equalization, or something like this. That is the kind of application you use a digital audio workstation for, DAW for short. These kinds of applications are specifically conceived to do exactly that: possibly record things in real time or use existing assets, edit and produce all these different audio files in different ways, and they often support MIDI as well. Most importantly, most of them have a modular nature that allows you to use existing modules that are part of the open source ecosystem to add different effects to any of those tracks, either as a whole (for instance a filter that applies to multiple tracks at the same time) or just to one of them, and so on and so forth. Whether you want equalization, compression, whatever is part of the usual audio editing process in a regular music studio, it's something that a digital audio workstation can provide for you. If you've ever heard of Pro Tools and things like that, this is exactly what a digital audio workstation can do for you.

The one that I personally use is called Ardour, which is a very powerful tool. I personally use it because it was the first one I stumbled upon: I fell in love with it at the time and I just kept on learning it. But again, there are more that you can use out there: there's Qtractor, and there's Reaper, which is not open source but is also widely used in the open source community. One thing that you'll notice when you start using an application like this is that the boxes in that graph that I showed, for instance, on JACK or PipeWire, are going to explode, because a digital audio workstation is going to handle a lot of tracks, and those tracks are going to be connected to a lot of things, so you'll see a huge amount of connections out there.
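To get a feel for that explosion, you can dump the whole connection graph from code once a DAW session is open; another small sketch assuming the JACK-Client package:

```python
import jack

client = jack.Client("graph-dump")

# For every output port in the graph, print what it is connected to;
# with an Ardour session open, expect this list to get very long.
for port in client.get_ports(is_output=True):
    for peer in client.get_all_connections(port):
        print(f"{port.name} -> {peer.name}")

client.close()
```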
[14:39.560 --> 17:04.480] Luckily for you, you don't really have to create those connections on your own, because otherwise you would go crazy: often it's the digital audio workstation that does this for you, and there are easier ways to change those connections, if you need to, from the user interfaces of those applications. Most importantly, this shows that no matter how monolithic an application like this can look, it's still able to communicate with all those external applications that we mentioned. So you can still have, for instance, an Ardour session open and a Guitarix session open: you connect your guitar to Guitarix, and then you connect Guitarix to Ardour in order for it to record it. Or maybe you use Guitarix as a plugin, so that you just record the clean sound and then have it processed in different ways any time you need it, these sorts of things.

So let's assume that we now have bass and guitars; I'm assuming that bass and guitars can be processed pretty much the same way. I'm sure there are bass players who will disagree with me, but the concept is the same. Let's say that you now need drums, and let's assume that you're like me: I'm a WebRTC developer, I have no friends, so I don't have any drummer friends either. So I have to create a virtual drummer instead, something that plays drums for me while I am at home doing nothing. And this is the very first good example of a virtual instrument: I need to write the drum parts somehow, and then I need to sequence them somehow. The drum parts will be the instructions, so what I want to be played, and then something will actually translate them to a kick sound, a snare sound, these sorts of things. Of course, you can just write the MIDI manually, or use something like Melrose, as we've seen. What I found out is that for drums it's much easier to work on a pattern basis instead, mostly because of the rhythmic nature of the instrument and the fact that you often have repetitions, maybe with some variations, and you can just play around with those instead. Personally, I like Hydrogen a lot in that sense, because it allows you to create multiple patterns: it has all the different parts of a drum kit, and you can say, within the context of a measure, play this, this and this at this point, a kick here and here and here. You create different patterns, you specify the sequence of those patterns, or even have some patterns playing in parallel if you use some just to create variations, these sorts of things.
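To make the idea of "drum parts as plain instructions" concrete, here is a hedged sketch that writes one measure of a basic rock beat as a MIDI file, assuming the mido Python package (pip install mido); 36, 38 and 42 are the General MIDI note numbers for kick, snare and closed hi-hat, which most drum samplers follow too:

```python
import mido

KICK, SNARE, HIHAT = 36, 38, 42      # General MIDI drum note numbers
TICKS = 480                          # ticks per quarter note

track = mido.MidiTrack()
for beat in range(4):
    # Hi-hat on every beat, kick on beats 1 and 3, snare on 2 and 4.
    hits = [HIHAT, KICK if beat % 2 == 0 else SNARE]
    delta = TICKS // 2 if beat > 0 else 0  # ticks since the previous events
    for note in hits:
        track.append(mido.Message("note_on", channel=9,   # GM drum channel
                                  note=note, velocity=100, time=delta))
        delta = 0
    delta = TICKS // 2                     # release half a beat later
    for note in hits:
        track.append(mido.Message("note_off", channel=9, note=note, time=delta))
        delta = 0

mid = mido.MidiFile(ticks_per_beat=TICKS)
mid.tracks.append(track)
mid.save("one_measure.mid")          # import this into your sequencer of choice
```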
[17:04.480 --> 19:46.120] Hydrogen then basically plays drums for you out of what you just wrote. And while Hydrogen comes with its own sounds, which is really cool, I personally just use Hydrogen to write the parts and then use DrumGizmo to actually render them, mostly because DrumGizmo is probably the most advanced drum renderer out there: it's based on a lot of drum kits that were captured and recorded with professional drummers and turned into samples, which DrumGizmo can then replay. I'm not even going to try to explain how DrumGizmo works, because it's very complex, but suffice it to say that it has so many channels that it's the best way to actually get drum sounds out there, and also to work with them within the mixing process. Which means that, from a JACK perspective, I just use Hydrogen to generate the drum parts, I connect the MIDI output of Hydrogen to DrumGizmo, and then whatever DrumGizmo generates I use within the context of my own application. That's basically how it works.

And now that we've dipped our toes into the MIDI world, what can we do with other instruments? Maybe we want a keyboard in the background, a pad or a piano solo, or even a full orchestra behind our music. Again, this is what MIDI is for, because we don't have access to a whole orchestra: I don't have 30,000 euros to pay a lot of players. So what I do is just sketch and write the notes for all the different instruments that are involved, and then those notes, that information, will be translated to actual sounds: some will be played by something that simulates strings, others by something that simulates a trumpet, these sorts of things. And of course, these notes can come from different places: they can come from a hardware keyboard, as we saw in the previous presentation with Melrose, or they can come from something that you wrote, for instance using Melrose or other approaches. I am not a keyboard player, but I did buy this small tiny thing over here, which again is just plug and play: you plug it in, and it becomes a MIDI input that you can use for different things. And once you have that, there are a lot of different ways of rendering MIDI into sounds. SoundFonts are historically the oldest and easiest way to do that, and QSynth, thanks to FluidSynth, is one of the most popular ways to actually play them. You download a SoundFont from somewhere, which contains all the sounds associated with the different instruments, and then, for instance, you just connect your keyboard to FluidSynth using its MIDI port, and that eventually gets translated to actual sounds that you can then use for something else.
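And since MIDI is just messages, you can also poke a SoundFont synth from code instead of a keyboard. A sketch assuming mido with its default python-rtmidi backend (needed for virtual ports), where the port name "toy-keyboard" is arbitrary: it exposes a virtual MIDI output that you then wire to FluidSynth or QSynth in the patchbay, and plays a C major chord:

```python
import time
import mido

# A virtual MIDI output port; connect it to FluidSynth/QSynth manually.
port = mido.open_output("toy-keyboard", virtual=True)
chord = [60, 64, 67]   # C4, E4, G4

for note in chord:
    port.send(mido.Message("note_on", note=note, velocity=80))
time.sleep(2)          # let the chord ring for two seconds
for note in chord:
    port.send(mido.Message("note_off", note=note))

port.close()
```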
[19:46.120 --> 22:35.000] If you don't want to use SoundFonts, maybe you want to use a synthesizer instead, and there are plenty of those as well. I use Yoshimi a lot, but there's also ZynAddSubFX (so, a very complex name), and they basically share a lot of code, because Yoshimi was actually a fork of it. There is Surge, which is another excellent synthesizer. In that case there is no sound bank they start from: they actually generate the sound depending on what you want to do. And again, I'm dumb, so I never aimed at creating my own synthesized sounds, but if you know more about that, it's something you can do. And it works pretty much the same way: you can connect your keyboard to Yoshimi, and you can connect Yoshimi's sound to something else as well. In this example I also wanted to highlight the fact that, again, you can connect the same source to multiple things: here I'm connecting my keyboard to both Yoshimi and the SoundFont application that I showed before, so that whatever I play sounds like a synthesizer and a piano at the same time.

Oh, okay, I'll just fly over the last slides. SFZ files are also a very interesting way to do the same thing, and you can use Windows VSTs over there as well, via LinVst for instance. If you want to write music, you can use LilyPond, Melrose, or MuseScore, which I personally like a lot; if you don't know music notation, you can use piano rolls. And if you then want to publish this, there are a lot of places where you can publish your music. But however you publish it, what I really encourage you to do is engage with the community in order to exchange information. For instance, I use LinuxMusicians a lot, and the main admin of LinuxMusicians is here as well; he asked me to tell you that there are a lot of stickers you can get from him if you want. It's an excellent place to get in touch with other musicians working in the open space, so that you can exchange ideas: you publish a piece, they give you advice. It's an excellent place to learn. You may want to add video, like I did with my Viking metal cover of "All I Want for Christmas Is You" by Mariah Carey, which is a fun watch if you want to see it. I also talked a bit about WebRTC being used by and for musicians in a previous presentation, which you may want to look at. And that's basically it: these are, again, my contacts if you want to have a look.

Unfortunately, I didn't speak fast enough, so...

[Applause]

Oh, thank you.

[Inaudible question]

Yeah. Basically, I personally use... oh, sorry, yeah.
[22:35.000 --> 26:02.280] The question was: does PipeWire also have a way of showing those different boxes that you can connect in order to create a workflow? I personally still use QjackCtl, which was a front end specifically conceived for JACK, mostly because PipeWire implements JACK as well, and so QjackCtl allows me to create the same connections from the JACK perspective. But there are some applications that are specifically conceived for PipeWire as well, and it really works the same way: you have these different boxes and you can connect them any way you want.

[Inaudible question]

No, no, you can script it if you want; you definitely don't have to use a GUI. I always use a GUI because otherwise you go crazy. Any other questions? Sorry?

[Inaudible question]

Yeah, the one that I use, since I work a lot with metal, the one that I prefer, is called the Muldjord kit. I personally love that one a lot, but there are a lot of excellent kits out there. Thank you. Which one? Sorry?

[Inaudible question]

Yabridge? No, for Windows VSTs the ones that I use the most are some free VSTs by Spitfire Audio, the Spitfire Audio LABS ones: they have a lot of different free VSTs with very interesting and experimental sounds, and those are the ones I like the most. There are different ways to run them: LinVst is the one that I found to be the easiest, but you can use dssi-vst, for instance, which is another way to do that. There are different approaches; personally, the one that worked consistently for me was LinVst, and that's just why I use it. Sorry?

If I install PipeWire, can I get rid of PulseAudio?

Yeah, because PipeWire is basically a replacement for both PulseAudio and JACK, even though it's compatible with both of them. So you can just...

[Inaudible remark]

Yeah, exactly, while you record your guitar, which is something that you couldn't do before. And besides, PipeWire also does an excellent thing: with plain JACK, I couldn't attach two different audio interfaces to my computer, I had to choose one, or use zita-ajbridge to add another. With PipeWire, instead, I can plug in as many as I want and they all appear, which is great.

I think I have to... Thank you. Do you have time for a question afterwards? Yeah, absolutely, I'll come outside in a second, sure. Nice to meet you, Lorenzo. Nice to meet you too. I'll just come outside in a second, because I think I have to wait for him. Sorry.