[00:00.000 --> 00:13.200] So, our next talk is about 4K HDR video with AV1, please welcome Vibhuti.
[00:13.200 --> 00:14.200] Thanks.
[00:14.200 --> 00:23.440] Hi everybody, so this is my first FOSDEM talk. The main idea for this talk is that
[00:23.440 --> 00:29.320] I just want to give a brief introduction to 4K HDR with AV1 and to understand
[00:29.320 --> 00:34.540] how it actually works; that's the main idea for the talk. To give a brief
[00:34.540 --> 00:39.240] introduction about myself: I am currently a PhD student, in my second year, working
[00:39.240 --> 00:44.040] on video coding, trying to learn how video works, and I have also been involved in open source
[00:44.040 --> 00:51.400] multimedia since 2018, starting with VideoLAN and AOMedia. So what are we going to
[00:51.400 --> 00:52.400] do today?
[00:52.400 --> 00:56.280] Okay, this picture was captured when we were trying to play multiple streams at the same
[00:56.280 --> 01:04.280] time, 4K AV1 HDR streams; this is a snapshot of that. The main idea is that we
[01:04.280 --> 01:09.680] want to talk about the main technical challenges when we want to play back AV1 streams which
[01:09.680 --> 01:15.880] are in HDR and to retain the same colors as such; that's the main idea of the talk.
[01:15.880 --> 01:23.880] So before I get into that, I just want to give an introduction to HDR. We have all heard
[01:23.900 --> 01:29.280] about it, and there are multiple aspects to HDR right now, so
[01:29.280 --> 01:33.160] let's break it down into multiple parts. The first aspect is that it
[01:33.160 --> 01:40.160] has brighter pixels. Here is a snapshot of an image which is tone mapped down to a BT.709
[01:40.160 --> 01:44.440] display, so this is in SDR even though the picture was in HDR. What happens here is
[01:44.440 --> 01:49.440] that, as you can see (I don't know if you can see the cursor), somewhere here it's
[01:49.480 --> 01:55.160] super dark at like 1 nit, and somewhere here it is super bright at 1000 nits. In theory
[01:55.160 --> 02:00.400] the picture goes up to like 4000 nits, but our display is only capable of showing 1000 nits, so
[02:00.400 --> 02:06.000] it's capped at 1000. In comparison, normal standard dynamic range images
[02:06.000 --> 02:12.320] are usually 100 to 200 nits, and modern flat panels go to like 500 nits or something
[02:12.320 --> 02:18.640] like that, but in theory HDR can go up to 10,000 nits, though most displays can't do that.
[02:18.640 --> 02:25.040] The second aspect of HDR is that it has more bits. This is a representation
[02:25.040 --> 02:32.040] of the same brightness range in 8-bit SDR and in HDR, and you can see that there is a lot
[02:33.400 --> 02:39.160] of quantization happening in SDR. If you want to show or encode something
[02:39.160 --> 02:43.560] like 100 or 200 nits you can get away with something like 8 bits, but when you want to
[02:43.640 --> 02:50.640] go to something like 1000 nits, 8 bits is not enough; we need
[02:52.360 --> 02:58.800] more than 8 bits. So that's the other aspect of HDR: we need more bits. What
[02:58.800 --> 03:04.640] happened is that we needed a mechanism to combine both the nits and the bits, and that's how the
[03:04.640 --> 03:10.520] transfer function was born. What happens is that our human eye is quite sensitive to
[03:10.520 --> 03:17.520] dark and mid-gray areas, and we are okay with bright spots. Keeping this in mind,
[03:19.280 --> 03:23.040] different standards bodies and organizations tried to create something called a transfer function,
[03:23.040 --> 03:29.160] where they try to spend a larger share of the bits on the dark and mid-gray areas and
[03:29.160 --> 03:36.160] have a coarser quantization for the bright areas. That's how the transfer
[03:37.160 --> 03:43.160] function was born, and in HDR one of the common ones is the Perceptual Quantizer, PQ, which
[03:43.800 --> 03:47.800] is based upon human perception of banding; I'm not getting into the details. The main
[03:47.800 --> 03:52.080] idea of an HDR transfer function is this, and there are multiple transfer functions, but the
[03:52.080 --> 03:59.080] core idea is to spend more bits on the darker and mid-gray areas.
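The shape of that trade-off is easiest to see in the PQ curve itself. Below is a minimal Python sketch of the SMPTE ST 2084 (PQ) inverse EOTF, mapping absolute luminance in nits to a normalized signal value; the constants are the published ST 2084 values, but this is an illustration, not code from the talk.

    # Minimal sketch of the SMPTE ST 2084 (PQ) inverse EOTF:
    # absolute luminance in cd/m^2 (nits) -> normalized [0, 1] signal value.
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 32             # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 128            # 18.8515625
    c3 = 2392 / 128            # 18.6875

    def pq_inverse_eotf(nits: float) -> float:
        """Map display luminance (0..10000 nits) to a normalized PQ signal."""
        y = (max(nits, 0.0) / 10000.0) ** m1   # normalize to the 10,000-nit PQ ceiling
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    # Most of the signal range is spent below a few hundred nits:
    for nits in (0.1, 1, 10, 100, 1000, 10000):
        print(f"{nits:8.1f} nits -> signal {pq_inverse_eotf(nits):.3f}")

Running this shows that roughly half of the code value range sits below about 100 nits, which is exactly the "more bits in the dark and mid-gray areas" idea described above.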
[03:59.440 --> 04:05.720] Then comes the color. Color is complex; I don't know what color is, so, to keep it simple
[04:05.760 --> 04:11.760] and short: display technology, since the standardization of SDR back in the 90s or early
[04:11.760 --> 04:18.520] 2000s, has evolved quite a lot, and now display technology is capable of displaying
[04:18.520 --> 04:23.960] more brightness and more colors and things like that. So if you see this diagram here,
[04:23.960 --> 04:30.520] to keep it simple, what happens is that this shaded region is the standard dynamic
[04:30.560 --> 04:37.560] range, so that is SDR, and in the wide color gamut domain (not HDR as such)
[04:37.560 --> 04:42.800] it is expanded to something like this, so it
[04:42.800 --> 04:48.680] can have wider colors. To give a quick background: you remember the picture which
[04:48.680 --> 04:53.800] we were talking about earlier? In this picture, when we
[04:53.800 --> 04:58.280] actually checked the color distribution, how it is actually spread out
[04:58.360 --> 05:04.200] (this was manually measured with the help of a color library), you can see that the
[05:04.200 --> 05:09.080] trees and the green areas in the picture are actually beyond 709. In this plot
[05:09.080 --> 05:16.080] the red triangle is the standard dynamic range gamut, BT.709, and the green one is
[05:16.080 --> 05:22.160] BT.2020; you can see that the greens are in the wider region. The main idea, to
[05:22.160 --> 05:27.640] keep it simple, is that the reds and greens get a wider range and the blues do not change much;
[05:28.520 --> 05:32.600] that is why most of the HDR pictures and videos which you see have vibrant colors
[05:32.600 --> 05:39.000] with reds and greens. That is one of the main reasons for that.
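The talk does not name the color library used for that measurement, but the underlying check is simple: test whether a measured CIE 1931 xy chromaticity falls inside the BT.709 triangle. A self-contained sketch with the standard BT.709 and BT.2020 primaries and a hypothetical "leaf green" sample:

    # Sketch of the kind of check a color library performs: is a measured
    # CIE 1931 xy chromaticity inside the BT.709 gamut triangle?
    # The primaries below are the standard BT.709 and BT.2020 chromaticities.
    BT709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
    BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

    def inside_triangle(p, tri):
        """True if point p lies inside the triangle tri (three xy tuples)."""
        def cross(o, a, b):
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
        signs = [cross(tri[i], tri[(i + 1) % 3], p) for i in range(3)]
        return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

    # A saturated green like the foliage in the example frame: inside BT.2020
    # but outside BT.709, so it only survives in a wide-gamut encode.
    leaf_green = (0.21, 0.72)   # hypothetical measured chromaticity
    print("in BT.709: ", inside_triangle(leaf_green, BT709))    # False
    print("in BT.2020:", inside_triangle(leaf_green, BT2020))   # True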
[05:39.000 --> 05:43.720] Now comes the question: where do you find these HDR sequences? Again, these are tone mapped sequences just for representation,
[05:43.720 --> 05:49.800] so in practice, in reality, this may not be how they look. There are a bunch of sequences:
[05:49.800 --> 05:54.120] Netflix has released some of its sequences as open content, which are publicly available,
[05:54.120 --> 05:59.240] and some other broadcasters have also published some, which is pretty good. Lately,
[05:59.240 --> 06:05.000] maybe early last year, the Academy Software Foundation, partnered with American
[06:05.000 --> 06:10.120] cinematographers, also produced properly cinema-graded material with different color spaces,
[06:10.120 --> 06:13.960] which is also available; I have not tested that, I only found it like last week.
[06:15.720 --> 06:22.120] Right, now let us come back to AV1. I have given a brief idea about HDR; now, talking about AV1,
[06:22.120 --> 06:27.480] JB has already given a brief idea about the current storyline of AV1 decoding, but to have a quick
[06:27.480 --> 06:31.720] refresher for people who do not know: AV1 is a current video coding standard developed
[06:31.720 --> 06:37.160] by the Alliance for Open Media around 2018, and it claims to be around 30 to 50% more efficient
[06:37.160 --> 06:43.720] than its predecessors, and the reference implementation is around 200K lines (an old number which I wrote down).
[06:44.360 --> 06:50.440] For video decoding and playback there is dav1d, which is quite popular; VideoLAN
[06:50.440 --> 06:56.440] released that, and the major browsers and operating systems support it, including Apple,
[06:56.440 --> 07:00.680] though I do not know how to use AV1 on Apple; I do not know if anyone was actually able to use
[07:01.720 --> 07:07.960] AV1 on Apple even though Apple ships it. So that is the storyline, and that sounds good.
[07:07.960 --> 07:13.080] So what is the actual problem when it comes to HDR and AV1? We have seen that it can play back
[07:13.080 --> 07:19.640] and things like that. The problem is that playing back HDR videos with
[07:19.640 --> 07:26.600] correct metadata, and signaling them correctly to a display, is not actually easy. If you
[07:26.600 --> 07:31.000] break it down into three: first there is macOS. We know that the display support is there,
[07:31.000 --> 07:35.480] the latest macOS has support, the OS supports signaling that, and that is good,
[07:35.480 --> 07:41.640] but the problem is that video output drivers and native playback on macOS are a bit problematic:
[07:41.640 --> 07:47.320] most players try to do tone mapping, and the others that do have support
[07:47.320 --> 07:52.120] are very limited, so you cannot actually visualize the streams as you want. On
[07:52.120 --> 07:59.480] Linux I believe we cannot do it yet, because the protocols are still work in progress. On Windows,
[07:59.480 --> 08:04.040] there is display and OS-level support, and players do support it with the help of
[08:04.040 --> 08:09.640] the graphics drivers, so we can drive HDR playback on Windows and it works fine. But the problem I
[08:09.640 --> 08:14.600] noticed mainly was the display transition: when you play an HDR window, Windows tries
[08:14.680 --> 08:20.840] to flip from SDR to HDR, and that takes a few seconds and sometimes shows a black screen. That is a
[08:20.840 --> 08:24.680] problem which I noticed when you
[08:24.680 --> 08:33.080] try to do this on Windows. So what we did is take a different approach. This is
[08:33.080 --> 08:39.400] probably known to most of the people in this room; it is the most
[08:39.400 --> 08:44.200] common approach used in the broadcast and standardization industry: we are just using
[08:44.200 --> 08:48.840] playback/capture cards to play the video streams. We are using cards
[08:48.840 --> 08:54.200] from Blackmagic, something called the DeckLink 8K Pro, so it can play the streams
[08:54.200 --> 09:00.520] and it can send the signal out over SDI. To play the streams we are using
[09:00.520 --> 09:05.720] FFmpeg and GStreamer for driving the playback, so this can work on any operating system as long as there is
[09:05.720 --> 09:13.480] SDK support for it. So that's good: we found a solution for playback.
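As a rough sketch of what such a playback invocation can look like (the talk does not give the exact command): FFmpeg built with --enable-decklink exposes a "decklink" output device, and v210 is a common 10-bit 4:2:2 codec for SDI. The file name here is hypothetical, and device names, formats and options will differ per setup.

    # Sketch: drive a Blackmagic card from Python via FFmpeg's decklink output
    # device. Assumes an FFmpeg build configured with --enable-decklink.
    import subprocess

    cmd = [
        "ffmpeg",
        "-i", "hdr_clip_av1.mkv",        # hypothetical AV1 HDR source file
        "-vcodec", "v210",               # 10-bit 4:2:2 raw video for SDI output
        "-f", "decklink",
        "DeckLink 8K Pro",               # list devices with: ffmpeg -sinks decklink
    ]
    subprocess.run(cmd, check=True)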
[09:13.480 --> 09:19.560] Now comes the question of how you display that. The first thing is that we need to display this
[09:19.560 --> 09:25.720] HDR signal with little to no changes. With the playback card we can send the signal, so that's good;
[09:26.920 --> 09:32.040] then we need to make sure that the TV is not doing any sort of funky things during
[09:32.040 --> 09:37.240] playback, because most TVs sometimes tend to do some sort of tone mapping, or play
[09:37.240 --> 09:41.640] with the brightness based upon the ambient light. So we need to make sure that
[09:41.640 --> 09:46.200] there is no tone mapping, that the TV is not playing with the picture, and that it is calibrated
[09:46.200 --> 09:50.440] as per the standard. The other thing is that it should have a minimum of 1000 nits of brightness.
[09:51.160 --> 09:57.080] Those are the main requirements we have for HDR testing. So we went to a
[09:57.080 --> 10:03.160] reference monitor: we use a Sony reference monitor, a 32-inch
[10:03.160 --> 10:09.080] OLED panel that is strictly calibrated as per the standard, and it ticks all the boxes which we want;
[10:09.720 --> 10:14.280] we can even force the signal metadata during playback. So that's good, even though it's an
[10:14.280 --> 10:20.840] expensive guy. The main idea with the reference monitor is that once we establish
[10:20.840 --> 10:25.640] HDR playback on it, we know that this is the source of truth, the ground truth which we have. Once we do
[10:25.640 --> 10:31.320] that, we can show that this is how it should actually look, and then we can extend to normal consumer displays
[10:31.320 --> 10:41.160] which claim to have HDR support. So that's good. Now, how do we say that the signal
[10:41.160 --> 10:47.080] and the video which we play are actually HDR? How do we say that this is the real thing?
[10:47.080 --> 10:52.520] We need to confirm that. The first thing is that we need to confirm the brightness part,
[10:52.520 --> 10:57.320] and for confirming the brightness part we are using a chart the European broadcasters
[10:57.320 --> 11:03.720] (the EBU) have released, called the EOTF chart, which goes from 0 to 1000 nits; you won't be able
[11:03.720 --> 11:08.680] to see that here anyway. The idea is that if you play back this chart as your stream,
[11:08.680 --> 11:15.560] you can see the peak brightness which is available on your display. So that's the first
[11:15.560 --> 11:21.480] confirmation for the streams.
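For reference, the 10-bit code values that the gray steps of such an EOTF chart correspond to can be computed from the same ST 2084 curve sketched earlier. Full-range quantization is assumed here; broadcast charts are usually narrow (limited) range, which shifts these values.

    # Nits -> 10-bit full-range PQ code values (same ST 2084 constants as above).
    m1, m2 = 2610 / 16384, 2523 / 32
    c1, c2, c3 = 3424 / 4096, 2413 / 128, 2392 / 128

    def pq_signal(nits: float) -> float:      # ST 2084 inverse EOTF, as above
        y = (nits / 10000.0) ** m1
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    for nits in (1, 100, 203, 1000, 4000, 10000):
        print(f"{nits:6d} nits -> 10-bit code {round(pq_signal(nits) * 1023):4d}")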
[11:21.480 --> 11:26.440] The second part which we need to test is the viewing area over which you can actually sustain that brightness. If you are playing on a consumer TV, some TVs say
[11:26.440 --> 11:33.080] they have something like 4500 nits; in practice that
[11:33.080 --> 11:38.600] 4500 nits might be available only on like 2% of the area of your whole screen, for a few seconds,
[11:38.600 --> 11:44.360] and it goes back to something like 2000 or 1500 nits the rest of the time. So we need to be sure
[11:44.360 --> 11:49.960] what the actual area is over which we are able to show that. For this,
[11:49.960 --> 11:54.760] the EBU has again released a bunch of test patterns with which you can confirm the maximum
[11:54.760 --> 12:01.880] viewing area. Next comes confirming whether the signal which you are sending to the screen is
[12:01.880 --> 12:08.200] okay or not. For that we are just using a cross-converter monitor which checks the
[12:08.200 --> 12:14.040] existence of the signal: you pass the signal through it and it just says whether the signal
[12:14.040 --> 12:18.200] which you are sending is okay or not. This is just to make sure that the cables you are using
[12:18.200 --> 12:23.000] are correct, because sometimes cables can mess with you. So that's one thing which you
[12:23.000 --> 12:28.440] can try. Then comes the color. This is the most important part, because color can play a lot of tricks
[12:28.440 --> 12:33.720] on you and you don't know what the ground truth is, right? For that, if you
[12:33.720 --> 12:38.520] have the privilege of a reference monitor, the reference monitor has a setting called a
[12:38.520 --> 12:44.520] gamut marker: it essentially draws something like zebra lines
[12:44.520 --> 12:50.840] on top of the screen wherever the picture is beyond 709. That's the main idea. But we may not have a reference
[12:50.920 --> 12:56.840] monitor all the time, right? So we need some other mechanism to validate this, and
[12:56.840 --> 13:02.760] we tried to use something called a spectroradiometer. This is a device which is used to measure
[13:02.760 --> 13:07.560] the radiance from the screen. With this device we can measure the color volume,
[13:07.560 --> 13:13.080] that is, the color space and also the brightness. You just point this device at the
[13:13.080 --> 13:17.960] screen on which we are playing the stream, knowing that a given area should be in HDR; once we
[13:17.960 --> 13:22.440] point it at the screen and measure the color, we know whether it is in the HDR space,
[13:22.440 --> 13:27.960] whether the pixel is actually beyond 709 or not, what the actual brightness is that we are seeing,
[13:27.960 --> 13:34.280] and things like that. Depending on the brightness the measurement varies; varies in the sense that
[13:34.280 --> 13:42.440] the time it takes to capture varies. The next important part is making sure the
[13:42.440 --> 13:49.160] whole pipeline which you are using is 10-bit. This is a very important thing which might be
[13:49.160 --> 13:54.600] a bit tricky to check in real life. To give some background and
[13:56.680 --> 14:04.040] make this easier: the main idea here is that we made a custom-coded 10-bit image having
[14:04.040 --> 14:12.200] 1024 levels of brightness. Once you send that, if the whole pipeline is 10-bit
[14:12.200 --> 14:17.640] you won't be seeing banding, you will have a smooth gradient, and if it is not 10-bit
[14:19.000 --> 14:23.000] you will see some steps, like this; it's not visible here, but essentially you will see some
[14:23.000 --> 14:29.080] steps. In practice, when you send a signal to consumer displays,
[14:29.080 --> 14:33.240] even if you get the color volume and brightness and things like that right, it can
[14:33.240 --> 14:39.160] still be in 8 bits; what I'm talking about is 8-bit PQ, and it's a real thing, I didn't
[14:39.160 --> 14:45.800] know that before starting this. So with this test you can be sure.
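A minimal sketch of how such a test frame can be generated (assumed details, not the speaker's actual tool): a 10-bit grayscale ramp containing all 1024 levels, written as raw yuv420p10le so FFmpeg can push it down the same playback pipeline. If any stage truncates to 8 bits, the smooth ramp turns into visible steps of about four code values.

    # Generate a 10-bit grayscale ramp (all 1024 levels) as raw yuv420p10le.
    import numpy as np

    W, H = 2048, 1080                    # two pixels per code value across the width
    y = np.tile((np.arange(W) * 1024 // W).astype("<u2"), (H, 1))   # 0..1023 ramp
    u = np.full((H // 2, W // 2), 512, dtype="<u2")                 # neutral chroma
    v = np.full((H // 2, W // 2), 512, dtype="<u2")

    with open("ramp_10bit.yuv", "wb") as f:
        for plane in (y, u, v):          # planar Y, U, V layout
            f.write(plane.tobytes())

    # Play it back through the same chain, e.g.:
    # ffmpeg -f rawvideo -pix_fmt yuv420p10le -s 2048x1080 -i ramp_10bit.yuv ...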
[14:45.800 --> 14:51.720] The reason for saying this is that if you have some noise in your signal, then due to the TV's debanding and dithering algorithms and
[14:51.720 --> 14:57.960] all the other things the TV is doing that I don't know about, the bands may not be visible, and
[14:57.960 --> 15:05.080] it can look as smooth as real 10-bit even though the video might be 8-bit. So those are the main
[15:05.080 --> 15:10.440] things which we focused on: to test HDR we need to do at least one check for each of
[15:10.440 --> 15:14.040] these things: brightness, signal, color and bit depth.
[15:16.840 --> 15:21.880] The next question that comes up is: can we extend this to consumer displays? Yes, we can,
[15:21.880 --> 15:27.160] because we have now established the ground truth and it works. We can use an SDI-to-HDMI converter,
[15:27.160 --> 15:31.880] and then we can play it on normal consumer displays. But then comes
[15:31.880 --> 15:37.720] the question of whether the consumer TVs can actually signal the metadata, and whether the
[15:37.720 --> 15:43.160] playback card, or the FFmpeg drivers with which you play, can actually transport the metadata.
[15:43.160 --> 15:50.120] Modern consumer TVs can force the HDR metadata, so that's okay, and if you don't have that,
[15:50.120 --> 15:54.600] there is a device which we found recently, though it has been in the industry for many years,
[15:54.600 --> 16:01.240] called Dr. HDMI. With this guy, you plug in the HDMI and it will insert the HDR metadata however you want,
[16:01.880 --> 16:09.480] configured via a web server; it's a magic device, I would say. Fun fact: it can even do
[16:09.480 --> 16:15.640] Dolby Vision at 8K 60fps; it can even fake Dolby Vision metadata to the TV so that the TV will be
[16:15.640 --> 16:22.440] happy, and this device costs less than 200 euros. So it actually works for consumer TVs and old
[16:22.440 --> 16:29.240] TVs which just have HDR; this guy is magic, and most of the time the people who do HDR demos have
[16:29.240 --> 16:36.760] this in the background at NAB or IBC. So that's one part.
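As a sketch of the signaling side (assumed flags and file names, not the speaker's pipeline): in an FFmpeg encode, the three color fields below are what tell a downstream player that an AV1 stream is HDR10-style PQ. Without BT.2020 primaries, the SMPTE 2084 transfer and the BT.2020 matrix, a player has no way to know the stream is HDR, which is exactly the signaling problem discussed here.

    # Tag an AV1 encode with HDR10-style color signaling via FFmpeg.
    import subprocess

    cmd = [
        "ffmpeg",
        "-i", "master_10bit.mov",            # hypothetical 10-bit master
        "-c:v", "libaom-av1",
        "-pix_fmt", "yuv420p10le",           # keep the pipeline in 10 bit
        "-color_primaries", "bt2020",
        "-color_trc", "smpte2084",           # PQ
        "-colorspace", "bt2020nc",
        "out_hdr_av1.mkv",
    ]
    subprocess.run(cmd, check=True)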
[16:36.760 --> 16:43.240] Now comes the question: is this okay for HDR? With HDR, the viewing environment matters a lot,
[16:43.240 --> 16:48.120] and based upon the viewing environment you have a different perception of colors. The main things
[16:48.120 --> 16:51.640] here are that you need to be sure what display panel technology you are using;
[16:51.640 --> 16:57.880] like I mentioned, the previous tests give you some sort of confidence
[16:57.880 --> 17:03.080] about that. Then comes the ambient lighting condition under which you view it: whether your room is getting dark,
[17:03.080 --> 17:08.440] and so on; perception varies with your ambient lighting, so you need to be sure what the
[17:08.440 --> 17:13.960] lighting in your room is when you're watching HDR videos. Again, the video material plays an
[17:13.960 --> 17:18.840] important role, and lastly there is the perception of color: different people have different tolerances for color.
[17:18.840 --> 17:25.320] So that's one thing. And the last important fact which I need to mention is that, based upon the viewing
[17:25.320 --> 17:30.040] environment, certain individuals can experience fatigue and dizziness when watching HDR; that's a
[17:30.040 --> 17:35.640] real thing, because of the super bright and super vivid colors. You need to be careful when
[17:35.640 --> 17:42.440] you are watching HDR videos for a long time. The ITU has laid out a methodology for doing this
[17:42.440 --> 17:47.240] in a scientific testing and subjective testing environment; it's not strictly written for HDR,
[17:47.240 --> 17:52.520] but if you follow it, it works. The big picture: just keep it under 30 minutes when you watch
[17:52.520 --> 17:59.480] HDR videos continuously. This is a snapshot of the scientific testing environment
[17:59.480 --> 18:04.280] we set up, and when we had a person come to view, this is how it happened.
[18:06.200 --> 18:12.360] So yeah, the main things I was talking about here are that HDR signaling, HDR metadata and
[18:12.360 --> 18:17.960] so on are distinct things, and that the two main aspects of HDR are the wide range of brightness,
[18:17.960 --> 18:22.920] due to a different quantization scheme, and the wide color gamut, which was an entirely different concept;
[18:22.920 --> 18:27.800] the current HDR video standards have clubbed them together, and that is what
[18:27.800 --> 18:33.880] enhances the colors. The last thing is that setting up a whole subjective
[18:33.880 --> 18:39.240] testing environment, or a scientific testing environment, is non-trivial and
[18:39.240 --> 18:43.640] accompanied by high cost, even though this was standardized like 10 years back. The whole
[18:44.600 --> 18:50.360] viewing environment plays an important role. Currently we are doing a set of
[18:50.360 --> 18:56.600] subjective tests for HDR viewing and things like that. I have put up some references if you would like
[18:56.600 --> 19:03.800] to read more, and that's it, thanks. I would be happy to hear your questions, and I'm still learning
[19:03.800 --> 19:06.920] these things, so I could be wrong; please correct me if there is anything wrong.
[19:14.520 --> 19:15.000] Yeah, thank you.
[19:17.000 --> 19:19.400] Do you have the slides on the website as well?
[19:20.840 --> 19:25.880] All the slides, yeah, and I have also put in some additional resources on how you can do these things,
[19:25.880 --> 19:32.600] some more information which I skipped in this talk. Yes? I only have a little background in
[19:32.600 --> 19:37.560] video codecs; could you explain simply what the pipeline is you were mentioning, what does it do?
[19:38.520 --> 19:44.040] Which pipeline? Okay, this one, right?
[19:46.360 --> 19:51.240] So, what I'm trying to convey here is that
[19:52.040 --> 19:57.400] the HDR videos, like I mentioned earlier, are coded in 10 bits or more. So in the whole
[19:57.400 --> 20:04.280] pipeline which you are using, what it boils down to is that the TVs or some devices can
[20:04.280 --> 20:09.640] decimate two bits and make it 8-bit when you play it, so you may be seeing HDR in
[20:09.640 --> 20:14.680] 8 bits, and when you have it in 8 bits, given the whole nature of HDR, you can't
[20:14.680 --> 20:20.520] represent 1000 nits of brightness. HDR has quite a wide range of brightness, and when
[20:20.520 --> 20:26.840] you have fewer bits you can't actually view that. So that's the thing which I mentioned here.
[20:27.080 --> 20:29.080] Yeah.
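The size of that loss is easy to put numbers on. A self-contained sketch (same ST 2084 constants as earlier, full-range quantization assumed) comparing the luminance jump between adjacent code values at 8 and 10 bits:

    # How coarse is 8-bit PQ? Luminance step between adjacent codes near 100 nits.
    m1, m2 = 2610 / 16384, 2523 / 32
    c1, c2, c3 = 3424 / 4096, 2413 / 128, 2392 / 128

    def pq_signal(nits):                       # ST 2084 inverse EOTF
        y = (nits / 10000.0) ** m1
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    def pq_nits(signal):                       # ST 2084 forward EOTF
        p = signal ** (1 / m2)
        return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

    for bits in (8, 10):
        levels = 2 ** bits - 1
        code = round(pq_signal(100.0) * levels)        # code closest to 100 nits
        step = pq_nits((code + 1) / levels) - pq_nits(code / levels)
        print(f"{bits}-bit: ~{step:.1f} nits per code step near 100 nits")

Near 100 nits this comes out to roughly 4 nits per 8-bit step versus under 1 nit per 10-bit step, which is why 8-bit PQ bands so visibly.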
[20:30.600 --> 20:37.240] Yes, because HDR is 10 bits right now, but there are plans to make it 12 bits in the future, yes?
[20:37.240 --> 20:42.600] So how are you going to extend your calibration and scientific testing equipment to cope with
[20:42.600 --> 20:49.240] the extra two bits? I don't know, yeah. Sorry, so he was asking that in the future HDR might
[20:49.240 --> 20:54.920] become 12 bits, so how am I going to extend this whole setup for 12 bits? I think that's like
[20:55.000 --> 21:09.800] six years away, right? Yeah, so probably the capture, yeah. Okay, so I think the capture
[21:09.800 --> 21:17.880] card can do 12 bits, and I would probably do something similar; I need to dig into how
[21:17.880 --> 21:21.080] we can actually differentiate between 10 bits and 12 bits, I actually don't know.
[21:21.560 --> 21:28.760] Yeah, 10 bits, yeah. It would be a bit tricky, but we'll find some way for sure.
[21:33.560 --> 21:37.560] Yeah. Yes?
[21:40.360 --> 21:44.040] All right, so he was suggesting we could use something like a waveform monitor to monitor the
[21:44.040 --> 21:49.720] signal. Yeah, so that is part of something like this, and a more advanced device can have that;
[21:49.800 --> 21:55.800] this thing is a cheap device, even though it's one grand, and it can show some
[21:55.800 --> 22:01.320] waveforms. So yeah, he was suggesting we can use more advanced tools to show waveforms, yeah.
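For readers without scope hardware, a toy software equivalent of a luma waveform can be sketched in a few lines (matplotlib assumed; real waveform monitors do this per scanline in dedicated hardware):

    # Toy luma waveform: for each image column, plot the distribution of
    # luma code values, the way a hardware waveform monitor displays levels.
    import numpy as np
    import matplotlib.pyplot as plt

    def luma_waveform(frame: np.ndarray) -> None:
        """frame: (H, W) luma plane, any integer bit depth."""
        h, w = frame.shape
        cols = np.repeat(np.arange(w), h)      # x: column index of every pixel
        vals = frame.T.reshape(-1)             # y: that pixel's luma code value
        plt.hist2d(cols, vals, bins=(min(w, 512), 256))
        plt.xlabel("image column")
        plt.ylabel("luma code value")
        plt.title("toy luma waveform")
        plt.show()

    # On a clean 10-bit ramp this draws a straight diagonal line; banding or
    # clipping in the pipeline shows up as gaps and flat runs.
    ramp = np.tile((np.arange(512) * 1024 // 512).astype(np.uint16), (288, 1))
    luma_waveform(ramp)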
[22:12.200 --> 22:16.840] That's a tricky thing. If you are repeating this for other setups,
[22:16.840 --> 22:20.680] we just need to make sure that for some of these things, test patterns
[22:20.680 --> 22:25.800] are used to confirm the peak brightness and the maximum viewing area; then we
[22:25.800 --> 22:31.640] can use the spectroradiometer to make sure that the signal is in HDR, to make sure that's okay.
[22:32.360 --> 22:36.680] And I think this is repeatable in other places even if you don't have a reference monitor, because
[22:36.680 --> 22:40.040] we tried this on a normal consumer device with the same setup and it worked.
[22:40.120 --> 22:54.120] Yes, yes, yes, I would say so. Sorry, sorry, sorry, again: he was asking why AV1
[22:54.120 --> 23:01.080] is better suited for HDR. I don't know if AV1 is best suited for HDR; it is one of the good
[23:01.080 --> 23:07.800] codecs which can do better compression, so natively it can handle a lot of content with better
[23:07.800 --> 23:12.760] quantization, a bit better compared to its predecessor
[23:12.760 --> 23:20.280] codecs, and due to that nature it's okay for HDR content. But there is still a lot of work to do
[23:20.280 --> 23:26.760] in AV1 to be better suited to HDR, because other codecs have better handling for HDR;
[23:26.760 --> 23:32.280] most of the codecs right now, even though they support HDR, don't treat the HDR signal or
[23:32.280 --> 23:37.320] HDR videos differently, so it's still a research and development process currently happening
[23:37.320 --> 23:43.640] in codec development to treat HDR differently, and AV1 is slowly trying to do that. Codecs
[23:43.640 --> 23:49.560] in the sense that, for example, HEVC has extensions which can handle HDR in a different way with the
[23:49.560 --> 23:54.840] help of different quantization for colors and things like that; AV1 is slowly adding that, and the current
[23:54.840 --> 23:59.880] AV1 master has that feature, I believe. So it's slowly in a research and development
[23:59.880 --> 24:14.440] process. All right, yes? So the question he was asking is whether
[24:14.440 --> 24:19.640] the power consumption is higher in HDR or not. I actually didn't measure it, but I know that
[24:19.640 --> 24:23.080] when I play HDR and keep my hand in front of the screen, I can feel the heat,
[24:23.080 --> 24:31.640] so the OLED is really burning; even at this distance I can feel the heat on my hand, so
[24:31.640 --> 24:52.520] it's really bad, I would say. Yes? So he was asking why it is that HDR content is often darker. To answer
[24:52.520 --> 24:59.880] that question: I was expecting this, and I have something. I was doing a subjective test on this very thing,
[24:59.880 --> 25:04.840] a dark video, with a bunch of people; I didn't include it here because it requires more background
[25:04.840 --> 25:12.200] and explanation. So this was a dark image, in HDR. In the actual setup,
[25:12.200 --> 25:17.080] in a proper environment, you could see it clearly, but here it is completely washed out in the blacks.
[25:17.960 --> 25:25.960] Why is it like that? I don't know exactly, but in HDR you can spend more bits,
[25:25.960 --> 25:34.360] right? So people tend to focus on the importance of dark videos and things like that. I think
[25:34.360 --> 25:39.880] it's just that you can exploit the wide range of luminance and brightness and the quantization scheme
[25:39.880 --> 25:47.720] of HDR when it is dark. What I am trying to show here is a bit complex to explain:
[25:47.720 --> 25:55.320] the perception of this is very hard, and if you have a different lighting condition,
[25:55.320 --> 26:01.960] the blacks and darks will look entirely different. So I think it's just the nature of the content;
[26:01.960 --> 26:07.720] they just want to utilize HDR like that. I am not exactly sure how to answer that, I don't know.
[26:08.600 --> 26:12.360] Just like this: what I am trying to show here is that even though you had different
[26:13.320 --> 26:18.680] videos with different brightness and all, this is a dark video, and when people tried to view
[26:18.680 --> 26:23.480] it in this environment, with the worst compression and the best compression,
[26:24.040 --> 26:27.480] which produce different kinds of noise, people just thought they were the same,
[26:28.360 --> 26:33.560] even in the proper environment. So that's a whole different storyline, and it's a bit hard to answer
[26:33.560 --> 26:38.360] why it is like that and how we can do that; probably another time I could explain this
[26:38.360 --> 26:49.480] in a clearer way, but I don't know. But yeah. Thank you, Vibhuti. Thanks.