So, welcome for the second time. Thanks for staying this long with me, last talk today, named Google Home, but better. Starting really good. Just a second. So, even though it's a short talk, really quick, a little agenda for today, what you can expect. A short section, really brief, about me, why am I talking about this, why should you listen to me talking about Flutter. The hardware used in this project, of course that's one of the interesting parts, but no really big surprises there, it's just what you would all expect. Then we get to the software, part one, the embedded Flutter part, and part two, the implementation. And I think for most of you this will be the most interesting part of this talk.

So, first about me. Hi there, I'm Moritz. A few years ago, when I was 15 or 16, I started out with embedded development. Back then it was all hobbies. I started out with an 8051 derivative, I think it was an Infineon XC878, and I started developing in C. Back then I mainly wanted to build everything around music: hi-fi, loudspeakers, equalizers, digital sound processors, and so on. Following through college, I kept working in embedded, which is why we created Snapp Embedded; that's what we're doing there. I'm also co-organizing the Flutter Munich meetup, so if you ever want to come over or speak in Munich about Flutter, just feel free to hit me up.

So, I left embedded, and now I'm back at embedded. Why? This is maybe a really short clip showcasing why I'm back at embedded user interfaces, because this is still the stuff we get today in new projects. Sometimes you get a new, state-of-the-art coffee machine with a touchscreen, and you use the touchscreen and you're like, oh no, God, why did you build this? So, yeah, I don't want to build any more of those things. I want to build UIs like the one today's talk is about. I hope this looks a little better than the things you saw before. That's the user interface of the Google Home replica I built, which I originally wanted to present here live, but sadly it would have been hard to set it up in five minutes and get it here on the table, so we'll rather stick with the presentation. Also, it would have been unfair for all the people online. But nevertheless, I have pictures of everything, and we're going through that now.

So, the hardware. As I said, not much more than you would imagine: a Raspberry Pi 4, still the model 4B, with 4 GB of RAM, and that's enough. 2 GB of RAM with a desktop environment and Flutter, yeah, I wouldn't recommend that on a Raspberry Pi. Of course, a Raspberry Pi 5 would work too; it would just be more expensive and run just as well. A little extra we have in here, which deals with the "but better" part: with a Google Home, or smart home devices in general, you can't add whatever hardware you want. Here we can, and since we will not be adding a voice command service on this device, I thought about what would be cooler. Voice commands are already out there, so what would be the most interesting thing to see? For a lot of people, I guess, that's interacting with custom hardware. Therefore, we integrated an air sensor: the Pimoroni SCD41 breakout measures CO2, temperature, and humidity, connects to the Raspberry Pi over I2C, and, which is also very handy, it comes with a ready-made Python library that's known to work with the Raspberry Pi. The touchscreen is just some Waveshare 11 inch IPS panel, capacitive touch, USB, HDMI, really nothing too special.
Those touchscreens have just gotten really good in the last years. At least with Raspberry Pi OS it just works out of the box, it's fine, nothing to worry about anymore. Then for the last part: with smart home, what most people think about is turning light bulbs and plugs on and off, and for smart home projects, or whenever you want to do projects on your own, devices that come in really, really handy are those Shelly bulbs and Shelly plugs, because they come with a built-in web server and a REST API. You connect them through your Wi-Fi, they come with an app, super easy, and then you have a REST API where you can turn them on and off and change the colors. It couldn't get much easier (a rough Dart sketch of such a call follows at the end of this section). So, all together, without the whole bunch of cables, it then looks like this.

So, now that we have the hardware part together, comes the next, interesting part, the embedded Flutter part. And as the talk earlier already pointed out, there's not just one Flutter to run on embedded devices. If you Google it, if you want to start out with it, you will find a few repositories, all dealing with Flutter and embedded devices. We just saw one, in fact, in the last talk; it was using flutter-pi. So what's with that? Why are there different options? Is this not Flutter? Well, it is Flutter, but to understand this we have to look at the Linux embedder that Flutter uses. The embedder connects the Flutter engine with the target platform, and the main difference with those custom embedders is what I have, let's see if this works, here on the right side. Fancy, I wasn't prepared for that. So, the main thing you can see here is that something's missing. Flutter for Linux heavily depends on GTK, in fact GTK 3, which is becoming a pain for Flutter itself. What all of those libraries have in common: we don't really need those GTK parts that Flutter uses anyway on embedded hardware. We don't have tabs, we normally don't have windows, we don't need all of that stuff, so they just get rid of it, which sadly isn't that easy in the, let's call it vanilla, Flutter Linux embedder. But they get rid of it, so you can use Flutter on custom hardware without GTK, and that means you can use Flutter, for example, with Wayland, with a custom embedder, as the talk before already pointed out, which is not possible right now with, let's call it, vanilla Flutter for Linux, especially if you want to go in a really industrial direction. But we're getting there. Also, a big part that's missing right now is tutorials and tooling; there's still not much out there, just Google it, yeah, there's not much out there. But I'm sure we will get through this within this year, or at least maybe the next year, and then Flutter will also definitely become accessible to startups and smaller, medium-sized companies; there will be tools and software-as-a-service offerings around that, and Flutter will get more mature. We don't know it, but I guess Flutter will get more mature in the embedded world in the next one to two years. But if we want to do a project right now, where we just want to try out how Flutter on embedded devices works, at least for this project, since we use a Raspberry Pi with Raspberry Pi OS, we can just use Flutter as it is, we can build for Linux there, and it will work just fine.
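To make the Shelly part mentioned earlier a bit more concrete, here is a minimal Dart sketch of such a REST call. The IP address and the Gen1-style endpoints (/relay/0, /status) are assumptions for illustration, not the exact code from the talk; it only needs the http package.

```dart
import 'package:http/http.dart' as http;

// Assumption: a Shelly plug reachable at this address on the local Wi-Fi.
const shellyPlugIp = '192.168.1.42';

/// Turns the plug relay on or off via the device's built-in REST API.
Future<void> setPlugState(bool on) async {
  final uri = Uri.http(shellyPlugIp, '/relay/0', {'turn': on ? 'on' : 'off'});
  final response = await http.get(uri);
  if (response.statusCode != 200) {
    throw Exception('Shelly request failed: ${response.statusCode}');
  }
}

/// Reads the current device status as a JSON string.
Future<String> getPlugStatus() async {
  final response = await http.get(Uri.http(shellyPlugIp, '/status'));
  return response.body;
}
```

Setting the bulb color works the same way, just with a different endpoint and query parameters for the color channels.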
The newest Raspberry Pi OS changed to, I think it changed to using Wayland. I haven't tried it yet, but apparently it works all right. Flutter needs to do something about GTK 3 anyway, so maybe in the future it will be possible with just the normal Flutter to build something suitable for Wayland and direct rendering as well. For right now, if you're doing a hobby project, if you just want to try something out with a Raspberry Pi, just go with Flutter as it is, it's fine. If you want to use direct rendering, if you want to go with Wayland, if you want to get something to production grade, then you have to look at flutter-pi, Toyota's ivi-homescreen, or the one from Sony, whereas the Toyota project really is amazing and is moving forward at a really fast pace.

So, enough of this generic talk about Flutter, what about the implementation for this project? I want to go through it in a few steps. The first part we need for this project to work is connecting the Raspberry Pi to the touchscreen. What do we do for that? We use the Raspberry Pi Imager, install Raspberry Pi OS, and it just works out of the box, thanks to a lot of the people who are also here. That's really, really easy. Then we need to get Flutter running. For that we wrote a tool. As I just said, with Snapp Embedded we're doing open source projects around this. We built a tool, there's a repo with the link, called snapp_cli, which allows you to set up, from your host machine, a Raspberry Pi that's connected to the same network as you. It will connect over SSH, install Flutter and all the stuff you need, and set the Pi up as a custom debug device, so that you can just run and debug the code out of VS Code on Linux, Mac, or Windows, and the code will compile and everything will run in real time, with hot reload working, with the Dart tools on your Raspberry Pi. If you just want to develop on the Raspberry Pi, that's already really easy and straightforward. Even the Dart DevTools work, all of that is already there. Just, yeah, no cross compilation, we don't want to go in that direction yet.

The next part is rather uninteresting. Here you can see a little bit of Dart; that code won't run, I cut out everything that looked ugly. It's just basically a GET request. You connect the bulb and the plugs with your Flutter or Dart application and run this function to get the bulb status, set the bulb status, or set the bulb color. The more interesting part, I guess, and what I wanted to point out, which will also explain how you would integrate a voice assistant with a Flutter application on the Raspberry Pi, is how we connect this sensor sitting on the I2C bus with our Flutter application. We have different approaches we could use here. We could do a Dart implementation of everything, directly on the I2C bus: go through the data sheets of the sensor and implement all the commands ourselves. We could run an MQTT broker on the Raspberry Pi, connect the sensor to the broker, and subscribe the Flutter application to it, because MQTT is one of those plugins that work with most of the custom embedders, so that really works out of the box. So that would be a possible route to take.
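As a rough illustration of that MQTT route, the Flutter app could subscribe to sensor readings roughly like this, assuming a broker running locally on the Pi and the mqtt_client package; the client id, topic name, and payload format are made up for this sketch.

```dart
import 'package:mqtt_client/mqtt_client.dart';
import 'package:mqtt_client/mqtt_server_client.dart';

// Sketch: a Python script would read the SCD41 and publish readings to the
// local broker; the Flutter app only subscribes. Topic name is hypothetical.
Future<void> listenToAirSensor() async {
  final client = MqttServerClient('localhost', 'flutter_home_ui');
  await client.connect();

  client.subscribe('sensors/scd41', MqttQos.atMostOnce);
  client.updates!.listen((events) {
    final message = events.first.payload as MqttPublishMessage;
    final payload =
        MqttPublishPayload.bytesToStringAsString(message.payload.message);
    // payload could be, for example, a JSON string with co2, temperature, humidity.
    print('Sensor update: $payload');
  });
}
```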
We could, of course, and here I use Python, but we could use a Python backend, expose another REST API on the device, and talk to it locally; I think in a lot of embedded projects it's done that way. Or we use D-Bus. We have D-Bus running on Raspberry Pi OS, we have D-Bus running on most Linux systems, and we can just hop on the session bus for this purpose. The plugins are also already there. And for this example, this is what we did, because for connecting a Flutter application with whatever other process is running on the machine, you can just use D-Bus. We can just use the Python example library that was already shipped with the sensor, of course; I mean, we don't want to do work twice. So we can connect whatever we want right now with packages and plugins that are already available. Resources, thank you very much. Two minutes.
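To round this off, here is a minimal sketch of the D-Bus route described above, on the Flutter side using the dbus package. The bus name, object path, interface, and method name are all hypothetical; the idea is that the small Python service reads the SCD41 and exposes the latest values on the session bus.

```dart
import 'package:dbus/dbus.dart';

// Sketch only: service name, path, interface, and method are made up.
Future<void> readAirQuality() async {
  final client = DBusClient.session();
  final sensor = DBusRemoteObject(
    client,
    name: 'com.example.AirSensor',
    path: DBusObjectPath('/com/example/AirSensor'),
  );

  // Call a method that is assumed to return three doubles:
  // CO2 (ppm), temperature (°C), relative humidity (%).
  final reply = await sensor.callMethod(
    'com.example.AirSensor',
    'GetReadings',
    [],
    replySignature: DBusSignature('ddd'),
  );
  final values = reply.returnValues;
  print('CO2: ${(values[0] as DBusDouble).value} ppm, '
      'T: ${(values[1] as DBusDouble).value} °C, '
      'RH: ${(values[2] as DBusDouble).value} %');

  await client.close();
}
```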