[00:00.000 --> 00:20.280] So let's start with our next talk, sorry, we're starting a bit late because the projector was not working, so we have a brand new TV now, so the next talk is about PipeWire [00:55.280 --> 01:24.280] Yeah, thanks. So I'll start with what it is, and then we'll talk about what's going on [01:25.280 --> 01:36.280] So a bit about what it became since last time and what it currently is [01:37.280 --> 01:49.280] So its functionality is basically to hold objects in a graph and pass data between them [01:54.280 --> 02:01.280] There is for example a session manager called WirePlumber that orchestrates the graph and all of the objects in it [02:01.280 --> 02:12.280] Then we have a couple of services, like for example there is a PulseAudio service that connects PulseAudio clients to PipeWire [02:13.280 --> 02:20.280] We also have a replacement library to run JACK clients. The purpose of all this is to share multimedia [02:20.280 --> 02:29.280] So we have video sharing, but also all the audio functionality [02:33.280 --> 02:42.280] So at its core it's there to share data very quickly, zero copy. It was originally made to do screen sharing in Wayland [02:42.280 --> 02:51.280] So you need to pass video from the GPU to clients without it being touched by the CPU [02:54.280 --> 03:01.280] So sharing memory using file descriptors, stuff like that, zero copy [03:03.280 --> 03:08.280] It turned out in the end that this was also going to work for audio [03:08.280 --> 03:14.280] So passing audio around, and then I started working on that [03:14.280 --> 03:24.280] The audio part is very much like a JACK audio server, very simple in how it passes the data around [03:24.280 --> 03:30.280] But on top of that you can build all the existing audio solutions [03:30.280 --> 03:41.280] So this is the current situation of what we have for audio and video [03:41.280 --> 03:48.280] So you have the Bluetooth
stack with the Bluetooth backend that talks to BlueZ [03:48.280 --> 03:58.280] There is also the camera now, there will be talks about that, which originally interfaces with the kernel's Video4Linux [03:59.280 --> 04:03.280] There is still ALSA to do all the audio stuff [04:03.280 --> 04:10.280] And then, together with udev, we have the session manager and PipeWire there [04:10.280 --> 04:17.280] And then the applications that go through a replacement PulseAudio server [04:17.280 --> 04:24.280] Or native clients, not so many, or JACK clients, they go through a really good translation layer [04:28.280 --> 04:38.280] So originally it was implemented for Wayland screen sharing [04:38.280 --> 04:45.280] Mutter, which is the compositor, would expose something inside the PipeWire graph [04:45.280 --> 04:51.280] It's called a node, which is something inside the graph that produces a video feed [04:51.280 --> 04:56.280] And then clients can connect and consume that data [04:56.280 --> 05:01.280] This is all out of process, so different processes connect to the PipeWire daemon [05:01.280 --> 05:06.280] They grab a piece of the graph and consume or produce [05:06.280 --> 05:11.280] So you can produce data, the compositor produces, clients consume [05:11.280 --> 05:15.280] So you can branch out, you can mix streams [05:15.280 --> 05:19.280] And PipeWire makes sure that the data flows around [05:21.280 --> 05:32.280] So for audio support, it's a pro-audio model like JACK [05:32.280 --> 05:37.280] So everything in the graph is floating point, the audio that is [05:37.280 --> 05:43.280] All streams are split into separate channels [05:43.280 --> 05:49.280] There is one buffer size for the whole graph, just like JACK [05:49.280 --> 05:52.280] But it is dynamic, it can change [05:52.280 --> 05:58.280] And there is automatic format conversion to or from sources and sinks [05:58.280 --> 06:01.280] With all the fancy stuff [06:01.280 --> 06:09.280] And it has a couple
of extra things, like handling the clocks, and the formats, stuff like that [06:09.280 --> 06:17.280] So with that thin layer, you can run JACK clients, an almost perfect translation [06:17.280 --> 06:27.280] And with a little server, you can run PulseAudio clients as well [06:27.280 --> 06:34.280] So it copies basically stuff that was already there, partially from JACK [06:34.280 --> 06:40.280] But also from PulseAudio, the timer model is also used [06:40.280 --> 06:47.280] It uses a copy of all the PulseAudio stuff for managing the cards and the mixers [06:47.280 --> 06:51.280] And when you plug in the headphones it switches things and remembers volumes [06:51.280 --> 06:54.280] And controls the volume dials and all of that [06:54.280 --> 07:04.280] That was just copied directly because that piece has evolved a lot over the years [07:04.280 --> 07:17.280] So you can run all the JACK clients like that, it's pretty cool [07:17.280 --> 07:22.280] And they will show up, and PulseAudio clients will also just show up there [07:22.280 --> 07:28.280] And you can run them together like that [07:28.280 --> 07:34.280] PulseAudio clients also run as normal [07:34.280 --> 07:39.280] So it originally started for video, the screen sharing [07:39.280 --> 07:43.280] Then a little detour to audio [07:43.280 --> 07:46.280] But we're actually back now to video [07:46.280 --> 07:51.280] And now we move towards the other side [07:51.280 --> 07:57.280] So now we're basically working on the camera capturing stuff [07:58.280 --> 08:05.280] Currently, browsers, for WebRTC, go directly to Video4Linux [08:05.280 --> 08:07.280] Using ioctls and stuff like that [08:07.280 --> 08:10.280] So it's very difficult to put anything in between there [08:10.280 --> 08:14.280] This all needs to be rewritten, this is in the process [08:14.280 --> 08:19.280] But also newer cameras, they need more stuff than what you normally get [08:19.280 --> 08:24.280] There is a new library called the
camera library, libcamera, that orchestrates all of this [08:25.280 --> 08:27.280] You'll hear more about that in the camera talks [08:27.280 --> 08:31.280] But you can't just go directly to Video4Linux anymore [08:31.280 --> 08:33.280] So there is a layer needed [08:33.280 --> 08:36.280] So we need to rewrite the applications anyway [08:36.280 --> 08:39.280] So then it's probably better to rewrite them to use PipeWire [08:39.280 --> 08:45.280] So that you can route the video more flexibly [08:46.280 --> 08:52.280] So basically in PipeWire we provide cameras [08:52.280 --> 08:54.280] And clients can get them from there [09:00.280 --> 09:04.280] This also means that you can have multiple apps going to the same camera [09:04.280 --> 09:20.280] So the status currently: it's been in Fedora for a while now, for two years [09:20.280 --> 09:22.280] Almost two years [09:22.280 --> 09:26.280] It is API and ABI stable [09:26.280 --> 09:29.280] As far as we can make this work [09:29.280 --> 09:34.280] Yeah, it was made the default [09:34.280 --> 09:38.280] Against all expectations [09:38.280 --> 09:44.280] I was a bit afraid that it wasn't going to work [09:44.280 --> 09:46.280] But it works better than expected [09:46.280 --> 09:48.280] So it became the default [09:48.280 --> 09:52.280] So for the moment, JACK and PulseAudio [09:52.280 --> 09:54.280] Feature parity is the target [09:54.280 --> 09:56.280] Everything should work as it worked before [09:56.280 --> 09:59.280] You might have noticed a bit [09:59.280 --> 10:05.280] There's a couple of things that are not implemented from JACK [10:05.280 --> 10:07.280] There are alternatives [10:07.280 --> 10:14.280] Like netjack, this is to connect multiple JACK servers on your network [10:14.280 --> 10:17.280] It's a very specific implementation [10:17.280 --> 10:21.280] A little finicky, maybe a bit [10:22.280 --> 10:25.280] There are other alternatives [10:25.280 --> 10:29.280] Yes, we have an RTP-based solution [10:29.280 --> 10:32.280] They are more
compatible with actual hardware [10:34.280 --> 10:40.280] And there's a whole bunch of stuff that maybe nobody uses [10:45.280 --> 10:50.280] So right now most of the distributions are switching as well [10:50.280 --> 10:52.280] Or have switched [10:52.280 --> 10:55.280] I think WM is getting there [10:55.280 --> 11:01.280] Ubuntu switched for 22.10 [11:01.280 --> 11:05.280] They have no more PulseAudio [11:05.280 --> 11:08.280] You have to note though, no more PulseAudio [11:08.280 --> 11:12.280] The only thing that changed there is the server part of PulseAudio [11:12.280 --> 11:15.280] The client part of PulseAudio is still in use [11:15.280 --> 11:17.280] But it talks to a different server [11:17.280 --> 11:21.280] Which is a re-implementation of the server part [11:24.280 --> 11:27.280] So now, what are we doing? [11:27.280 --> 11:28.280] Bug fixing [11:38.280 --> 11:40.280] So bug fixing [11:47.280 --> 11:50.280] Because it mixes only on the right [11:50.280 --> 11:54.280] If you're on a laptop speaker, you only get it in the right channel [11:57.280 --> 12:01.280] No, the right channel is his lapel mic [12:04.280 --> 12:05.280] Okay [12:11.280 --> 12:13.280] It's not true, but okay [12:17.280 --> 12:18.280] It's easier [12:18.280 --> 12:22.280] So the camera, that is what we work on [12:22.280 --> 12:24.280] To get it all integrated [12:24.280 --> 12:27.280] There is a talk about this on Sunday [12:29.280 --> 12:32.280] So there's a whole bunch of things that need to be done [12:32.280 --> 12:36.280] Like for example, for the camera, there's also a portal [12:36.280 --> 12:40.280] So we try to hide everything now behind a D-Bus service [12:40.280 --> 12:41.280] Called the portal [12:41.280 --> 12:43.280] That will actually manage the permissions [12:43.280 --> 12:46.280] Like: can this application access the camera, yes or no [12:46.280 --> 12:52.280] It's sort of like on mobile platforms [12:52.280 --> 12:55.280] Where you actually give access to an application [12:55.280
--> 12:57.280] To do this and this [12:57.280 --> 12:59.280] So that goes through the portal [12:59.280 --> 13:01.280] We try to hide everything behind that [13:01.280 --> 13:05.280] But it requires lots of applications to be updated [13:05.280 --> 13:07.280] So for example, OBS Studio [13:07.280 --> 13:10.280] Is a bit of a test case there [13:10.280 --> 13:13.280] For the camera and screen integration [13:13.280 --> 13:17.280] So there are merge requests ready [13:17.280 --> 13:19.280] To be merged [13:19.280 --> 13:21.280] But it's ongoing [13:21.280 --> 13:23.280] Also Firefox and Chrome [13:23.280 --> 13:26.280] For their WebRTC implementation [13:26.280 --> 13:29.280] Have patches that are, I heard, merged [13:29.280 --> 13:31.280] Two days ago [13:31.280 --> 13:32.280] Some of them [13:32.280 --> 13:35.280] Some of them, there are some more pending [13:35.280 --> 13:38.280] But it's ongoing [13:38.280 --> 13:41.280] So the end result is that if you do a call [13:41.280 --> 13:44.280] If you do a call in one of these browsers [13:44.280 --> 13:45.280] In a future version [13:45.280 --> 13:48.280] It will go through PipeWire to access the camera [13:48.280 --> 13:50.280] And then it's, yeah [13:50.280 --> 13:52.280] You can do lots of fun stuff then [13:52.280 --> 13:55.280] You can add filters to cameras [13:55.280 --> 13:56.280] Or you can make it a... [13:56.280 --> 13:58.280] Like for example, in OBS Studio [13:58.280 --> 14:00.280] You can do a virtual camera [14:00.280 --> 14:02.280] And then you can have that camera [14:02.280 --> 14:05.280] Used as an input for the browser [14:05.280 --> 14:07.280] You can do all kinds of fun things [14:07.280 --> 14:09.280] With the video [14:09.280 --> 14:11.280] Once the pieces are in place [14:14.280 --> 14:18.280] Yeah, this is all I had to say [14:18.280 --> 14:20.280] Do you have any questions?
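The graph model described in the talk — nodes that produce and consume while the daemon just moves buffers between them, with zero copies — can be sketched as a toy model. Every name and class below is invented for illustration and is not PipeWire's API; the point is only that the very same buffer object travels through the graph, so nothing gets copied:

```python
# Toy sketch of the graph idea (names invented, not PipeWire's API):
# nodes exchange the *same* buffer object, so no data is copied as it
# flows from a producer through the graph to the consumers.

class Node:
    def __init__(self, name, process):
        self.name = name
        self.process = process      # transforms or observes a buffer
        self.outputs = []           # downstream links

def link(src, dst):
    """Connect a source node's output to a destination node."""
    src.outputs.append(dst)

def run(node, buf):
    """Push one buffer depth-first through the graph."""
    out = node.process(buf)
    for nxt in node.outputs:
        run(nxt, out)

seen = []
frames = bytearray(8)                   # stands in for shared memory (memfd/dmabuf)
producer = Node("screen", lambda b: b)  # produces the video buffer
consumer = Node("app", lambda b: (seen.append(id(b)), b)[1])
link(producer, consumer)
run(producer, frames)
assert seen == [id(frames)]             # the very same buffer arrived: zero copies
```

Linking a second consumer to the producer illustrates the branching the talk mentions: both downstream nodes still receive the identical buffer object.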
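The audio model described above (everything in the graph is floating point, channels split out separately, automatic format conversion at the edges) can be sketched in plain Python. This is an illustrative stand-in, not PipeWire's converter:

```python
# Hedged sketch: deinterleave signed 16-bit device samples into one
# float32 plane per channel, scaled to the [-1.0, 1.0] range a
# floating-point processing graph would use.
import array

def s16_interleaved_to_f32_planar(samples, channels):
    """Split interleaved s16 samples into per-channel float32 arrays."""
    planes = [array.array("f") for _ in range(channels)]
    for i, sample in enumerate(samples):
        planes[i % channels].append(sample / 32768.0)
    return planes

# Two stereo frames: (L=0, R=16384) and (L=-32768, R=32767)
left, right = s16_interleaved_to_f32_planar([0, 16384, -32768, 32767], 2)
```

The reverse conversion (float planes back to interleaved integers for the device) would mirror this, which is roughly what "automatic format conversion to or from sources and sinks" covers.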
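The portal mechanism just described can be modeled as a tiny sketch: a broker sits between the application and the device and consults a per-app permission table before handing anything out. The class, method, and app names here are all invented for illustration; the real portal is a D-Bus service and returns PipeWire access rather than a string:

```python
# Toy model of the portal idea (invented names, not the real D-Bus API):
# applications never open the camera directly; a broker decides first.

class CameraPortal:
    """Toy permission broker standing in for the camera portal."""

    def __init__(self, trusted):
        self.trusted = trusted      # stand-in for asking the user via a dialog
        self.decisions = {}         # remembered per-app answers

    def access_camera(self, app_id):
        if app_id not in self.decisions:
            self.decisions[app_id] = app_id in self.trusted
        if not self.decisions[app_id]:
            raise PermissionError(f"{app_id} may not use the camera")
        return "camera-node"        # stands in for the real PipeWire node

portal = CameraPortal(trusted={"org.example.GoodApp"})
node = portal.access_camera("org.example.GoodApp")
```

Remembering the decision per app id is the key design point: the user is asked once, and every later access by the same application is granted or denied without another dialog.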
[14:20.280 --> 14:22.280] Yes, yeah [14:22.280 --> 14:26.280] So, JACK also has network functionality [14:26.280 --> 14:29.280] That allows you to use the audio system over the network [14:29.280 --> 14:31.280] There was an experimental patch [14:31.280 --> 14:34.280] That allows you to apply compression to that [14:34.280 --> 14:37.280] Do you have anything like that? [14:37.280 --> 14:39.280] Repeat the question [14:39.280 --> 14:42.280] Yes, so JACK had a network transport [14:42.280 --> 14:45.280] Can you repeat the question for the stream? [14:45.280 --> 14:47.280] Yes, so the question was [14:47.280 --> 14:49.280] JACK had a network transport [14:49.280 --> 14:52.280] And it also allowed you compression [14:52.280 --> 14:54.280] I think it was Opus [14:54.280 --> 14:57.280] We don't have an alternative for that [14:57.280 --> 14:59.280] Is there any plan for that? [14:59.280 --> 15:01.280] Or is there any... [15:01.280 --> 15:03.280] No concrete plans [15:03.280 --> 15:06.280] But it sounds like a useful feature [15:06.280 --> 15:08.280] Yeah [15:08.280 --> 15:10.280] Like for example now [15:10.280 --> 15:12.280] The network thing that we try to use [15:12.280 --> 15:14.280] Is AES67, which is RTP-based [15:14.280 --> 15:17.280] It's uncompressed, but with very small packets [15:20.280 --> 15:22.280] Okay, I'm just curious [15:22.280 --> 15:24.280] What kind of products actually use [15:24.280 --> 15:26.280] That stuff... [15:26.280 --> 15:30.280] [inaudible] [15:34.280 --> 15:36.280] Following up on that [15:36.280 --> 15:38.280] There is a Roc plug-in [15:38.280 --> 15:40.280] Apart from that, would that be any help? [15:40.280 --> 15:42.280] Yeah, so the suggestion is [15:42.280 --> 15:44.280] That there is a Roc plug-in [15:44.280 --> 15:46.280] Which is...
[15:46.280 --> 15:49.280] The Roc toolkit is built on RTP [15:49.280 --> 15:52.280] And also provides network transport [15:52.280 --> 15:55.280] I don't know, does it do compression? [15:55.280 --> 15:57.280] I don't know [15:59.280 --> 16:01.280] But it is... [16:01.280 --> 16:03.280] It's a bit more generic streaming [16:03.280 --> 16:05.280] I don't know if it's suited [16:05.280 --> 16:09.280] For doing extra low latency communication [16:09.280 --> 16:11.280] It's more like generic streaming [16:13.280 --> 16:15.280] Another question [16:15.280 --> 16:17.280] Are there any plans [16:17.280 --> 16:20.280] For introducing signal processing [16:20.280 --> 16:23.280] Echo cancellation, noise reduction [16:23.280 --> 16:25.280] Yeah, so the question is [16:25.280 --> 16:27.280] Are there plans to add [16:27.280 --> 16:29.280] Echo cancellation, noise reduction [16:29.280 --> 16:31.280] Other plug-ins? [16:31.280 --> 16:33.280] Echo cancellation we have [16:33.280 --> 16:35.280] We have a module where you can load [16:35.280 --> 16:37.280] Echo cancellation using [16:37.280 --> 16:39.280] The WebRTC [16:39.280 --> 16:41.280] Echo canceller, in a loadable module [16:42.280 --> 16:44.280] So signal processing [16:44.280 --> 16:46.280] There is also a whole bunch of things [16:46.280 --> 16:48.280] We can of course load [16:48.280 --> 16:50.280] JACK tools [16:50.280 --> 16:52.280] To do all sorts of things [16:52.280 --> 16:54.280] We also have a native module [16:54.280 --> 16:57.280] That can load lots of LV2 filters [16:57.280 --> 16:59.280] And also LADSPA [16:59.280 --> 17:01.280] Filters, filters [17:01.280 --> 17:04.280] That you can use to construct filter chains [17:04.280 --> 17:06.280] There's also EasyEffects [17:06.280 --> 17:10.280] Which is an app [17:10.280 --> 17:13.280] To manage the filters for you [17:13.280 --> 17:15.280] So there are quite a few options [17:17.280 --> 17:19.280] We have an online question
[17:19.280 --> 17:22.280] Is there a way to visually show PipeWire graphs? [17:22.280 --> 17:24.280] Yes [17:27.280 --> 17:30.280] Yeah, so for visualizing [17:30.280 --> 17:32.280] The PipeWire graphs [17:32.280 --> 17:34.280] You can of course use the JACK tools [17:34.280 --> 17:36.280] There's also a native [17:36.280 --> 17:38.280] A native tool to do this [17:38.280 --> 17:40.280] To look at the graph [17:40.280 --> 17:42.280] The telephony, the Bluetooth [17:42.280 --> 17:44.280] And then it will show your graph [17:45.280 --> 17:47.280] Show how your signals are routed [17:47.280 --> 17:49.280] And you can change it [17:50.280 --> 17:53.280] If you use a native tool [17:53.280 --> 17:55.280] You can also route the video [17:55.280 --> 17:57.280] Like the cameras [18:02.280 --> 18:05.280] So you said you had AES output [18:05.280 --> 18:07.280] Are you referring to AES67 here? [18:07.280 --> 18:09.280] And if you are [18:09.280 --> 18:11.280] How are you dealing with the clocking elements of AES67? [18:11.280 --> 18:13.280] Because you need a PTP grandmaster [18:13.280 --> 18:15.280] Are you just ignoring it? [18:15.280 --> 18:18.280] No, so the question is [18:18.280 --> 18:20.280] For AES67 [18:20.280 --> 18:23.280] You need a synchronized clock [18:23.280 --> 18:25.280] Synchronized with PTP [18:29.280 --> 18:31.280] So you have to set that up [18:31.280 --> 18:32.280] On the machine [18:32.280 --> 18:34.280] But what happens if you don't? [18:34.280 --> 18:37.280] Do people just send garbage time stamps [18:37.280 --> 18:39.280] By default? [18:39.280 --> 18:41.280] Because this is a big problem for receivers [18:41.280 --> 18:43.280] To look at the time stamps and go [18:43.280 --> 18:45.280] I need to do something with it [18:45.280 --> 18:47.280] The question is if you don't have a PTP clock [18:47.280 --> 18:49.280] Can it still work?
[18:49.280 --> 18:51.280] Yes, it can still work [18:51.280 --> 18:53.280] But with reduced synchronization [18:53.280 --> 18:55.280] Of course [18:55.280 --> 18:57.280] What should a third party receiver do? [18:57.280 --> 18:59.280] Because if they have two different clocks [18:59.280 --> 19:01.280] What does it do with the time stamps? [19:01.280 --> 19:03.280] Just like that, I guess [19:03.280 --> 19:05.280] So the time stamps in the receiver [19:05.280 --> 19:07.280] The question is what does the receiver do [19:07.280 --> 19:09.280] To synchronize itself to the stream [19:09.280 --> 19:11.280] Basically follow the rate [19:11.280 --> 19:13.280] At which you receive data [19:13.280 --> 19:15.280] And how much remains to consume [19:15.280 --> 19:17.280] And then try to adjust [19:17.280 --> 19:21.280] The consumption rate by resampling [19:21.280 --> 19:23.280] Next question [19:23.280 --> 19:26.280] You talked a lot about portals for video [19:26.280 --> 19:28.280] Is there also something similar for audio? [19:28.280 --> 19:30.280] Like for example, I want to capture [19:30.280 --> 19:33.280] An application using video and audio [19:34.280 --> 19:37.280] The question is, the portal for video [19:37.280 --> 19:38.280] We talked a lot about it [19:38.280 --> 19:40.280] Is there also a portal for audio? [19:40.280 --> 19:43.280] The answer is we are thinking about it [19:43.280 --> 19:46.280] But we don't have anything concrete [19:51.280 --> 19:55.280] Is there anything for networking video? [19:57.280 --> 19:59.280] The question is, is there anything [19:59.280 --> 20:01.280] For networking video? [20:01.280 --> 20:03.280] No, absolutely not [20:03.280 --> 20:05.280] Last question?
[20:05.280 --> 20:07.280] Not sure [20:10.280 --> 20:18.280] There are a lot of things you can do, just try it and it should work [20:18.280 --> 20:22.280] There are a lot of things you can do [20:22.280 --> 20:24.280] Let's end it there [20:24.280 --> 20:26.280] (applause) [20:40.280 --> 20:42.280] Thank you
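The filter-chain mechanism mentioned in the Q&A (filters you can compose into chains, with EasyEffects managing them) can be sketched as composing simple per-buffer filters in order. The two filters below are trivial stand-ins invented for this sketch; a real chain would load LADSPA/LV2 plugins instead:

```python
# Sketch of the filter-chain idea: per-buffer filters composed in order.
# gain() and one_pole_lowpass() are illustrative stand-ins, not real plugins.

def gain(factor):
    """Scale every sample by a constant factor."""
    return lambda samples: [s * factor for s in samples]

def one_pole_lowpass(alpha):
    """Smooth each sample toward the previous output (0 < alpha <= 1)."""
    def f(samples):
        out, prev = [], 0.0
        for s in samples:
            prev = prev + alpha * (s - prev)   # move toward the input
            out.append(prev)
        return out
    return f

def chain(*filters):
    """Compose filters left to right into one processing function."""
    def process(samples):
        for f in filters:
            samples = f(samples)
        return samples
    return process

# alpha=1.0 makes the low-pass an identity, so only the gain has an effect
process = chain(gain(0.5), one_pole_lowpass(1.0))
assert process([1.0, -1.0]) == [0.5, -0.5]
```

The same shape scales to any number of stages: each filter only sees the previous filter's output buffer, which is what makes the stages independently loadable.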
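The receiver behaviour described in the AES67 answer — follow the rate at which data arrives and gently adjust the consumption rate by resampling — can be illustrated with a minimal controller. The function, its constant, and the buffer-fill model are assumptions made up for this sketch, not PipeWire's actual algorithm:

```python
# Hedged sketch of "follow the receive rate": derive a playback-rate
# multiplier from how full the receive buffer is relative to a target.
# strength=0.01 is an arbitrary illustrative smoothing constant.

def rate_correction(fill, target, strength=0.01):
    """Above target -> play slightly faster to drain the buffer;
    below target -> play slightly slower so it can refill."""
    error = (fill - target) / target
    return 1.0 + strength * error

nominal = rate_correction(100, 100)   # on target: nominal playback rate
drain = rate_correction(200, 100)     # buffer twice as full: speed up a bit
```

Feeding this multiplier into a resampler each cycle nudges the local consumption clock toward the sender's clock without any shared PTP reference, at the cost of looser synchronization.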