This talk is going to be presented by two amazing researchers, Iraklis and Christos.

Good morning, and thank you for the introduction. Thank you for being here and attending our presentation. I'm Iraklis Varlamis, and here with me is Christos Chronis. We come from Harokopio University, but we are also representing the Open Technology Alliance, the Greek Open Technology Alliance for open source software.

Today we will present FossBot, a project that started a couple of years ago in the context of Google Summer of Code as an open-design, open-source software robot. Gradually it evolved into what you see here: a more concrete robot, let's say, a more concrete design that is specifically targeted at educators across all educational levels. Together with me I have Christos, and I will give the floor to Christos Chronis, who is the lead developer of the FossBot team and also the designer of this robot. So Christos, the floor is yours.

Thank you. Hello, I'm Christos. First, I would like to introduce the team behind FossBot. Let's start with the supervisor, Professor Iraklis Varlamis, who is the coordinator of the project; then me, as the designer of the robot, including the hardware and the software; Eleftheria and Vanai, front-end and back-end developers and past Google Summer of Code contributors to the project; Dimitris and Giorgos, back-end developers and assemblers of the physical robot; and finally Manousos, a back-end developer and also the developer of the FossBot simulator.

So this is FossBot; we also have a physical version of the robot here on the table. Let's start with an overview of the hardware and design aspects of FossBot.

On the front side of the robot we have a multi-color RGB LED on the top left, a photoresistor on the top right, and an ultrasonic sensor in the middle. The entire body of the robot is 3D printable on any FDM printer with a bed size of 220 by 220 millimeters. The design is customizable and easy to assemble. At the bottom of the robot there are three IR optical sensors, suitable for line-following or detection applications. The robot moves on two DC motors and a free roller at the back. The motors have odometers to measure the movement of the robot, and there is also a three-axis accelerometer and gyroscope sensor inside. At the back of the robot there are a speaker, an on/off switch, and a charging port, and the robot is powered by three lithium-ion rechargeable batteries.
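To make the hardware tour concrete, here is a minimal sketch of a sensor read-out loop in Python. The `fossbot` module and every method on it are hypothetical placeholders for illustration; the talk does not spell out the library's actual API.

```python
import time

# Hypothetical robot library; the real FossBot API may differ.
from fossbot import Robot

robot = Robot()

# Poll the sensors described above once per second and print them.
while True:
    distance_cm = robot.get_distance()        # ultrasonic sensor (front, middle)
    light_level = robot.get_light_sensor()    # photoresistor (front, top right)
    left, mid, right = robot.get_floor_sensors()  # three IR sensors underneath
    ax, ay, az = robot.get_acceleration()     # three-axis accelerometer
    print(f"dist={distance_cm:.1f}cm light={light_level} "
          f"line=({left}, {mid}, {right}) accel=({ax}, {ay}, {az})")
    time.sleep(1.0)
```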
On top of the robot there are some unique features. There is a pulling component at the back of the robot, which gives you the opportunity to attach a small object, say using a rope, in order to measure how the extra weight affects the movement of the robot. We also have a large detachable cover, the white circular piece on top. That component is LEGO-brick compatible and also has a hole on the outside. The hole goes through to the bottom of the robot, allowing a marker to be attached for programmatic drawing.

Now let's summarize the key features from the hardware and design aspects. The robot is 3D printable, which offers reproducibility and customization, and it is designed to be used with common electronics. In the current version the brain of the robot is a Raspberry Pi, but we have already seen variations of the robot with an Arduino or a micro:bit inside. It is open source and open design, of course, and it comes at a low cost of around 90 to 120 euros. In this slide you can see our lab at Harokopio University and some pictures from, let's say, the assembly line of the physical robot. Next we have some pictures from different assembly phases of the robot.

Now we'll speak about the software. We have created a custom platform built into the robot, with three modes: the kindergarten mode, the elementary school mode, and the high school or advanced mode. The kindergarten mode features a friendly UI with card blocks based on Google's Blockly. Additionally, it is expandable, with the option to add new cards that execute Python or Blockly scripts. For elementary school we have a custom user interface, once again based on a more advanced version of Google's Blockly, with custom code blocks for all the sensors of the robot. The experience is similar to Scratch, which makes interaction with the robot easy. Finally, we have the high school or advanced mode. This mode is based on Jupyter notebooks and the native Python language. It is under construction but will be available soon.

Now let's take a look at the platform UI. This is the platform's home screen. The home screen allows the creation of multiple projects. We also have the ability to import, export, or share a project between users. Using the little cog in the top right, we can modify the behavior of some blocks, such as the default distance for a one-step movement. The icon with the three ABC cubes in the lower right gives access to the kindergarten mode.
Now we see the kindergarten mode. The kindergarten mode uses a simplified version of Blockly with card-based blocks for basic actions. In the bottom-right corner you can see an example of how this mode can be used in a classroom setting. In this example we are trying to teach students numbers using a gridded carpet with numbers on it.

In the elementary school mode we have, as already said, a fully custom version of Blockly. On the left side there are different categories of blocks, including mathematics, programming, movement, and sensing. On the right there are some control buttons and a terminal window for printing real-time measurements and messages.

Now, for the advanced mode. The advanced mode of the robot is based on Jupyter. The user can directly program the robot using the native Python language, combined with our custom robot library, and the code can be combined with text and images. The whole page, including the results of the program execution, can then be exported as an experiment report for the class.

Now for some action. This is a line-following program written using Blockly. It's a very common task for students when we teach them robotics. We have some videos to show you. On the left you see a video of the robot following a line and stopping when it detects an obstacle in front of it. On the right there is a video of the robot running the same code, but this time following a line in a loop. When a colleague puts her hand in front of the robot, it stops and waits until no obstacle is detected.

In addition to the physical robot, we also have a simulated environment. We have developed a library and a custom simulation environment for our robot using CoppeliaSim. This was a crucial step for us, because it eliminates the need for a physical robot, which means that experimentation with the robot comes at no cost. Our software platform works seamlessly in both the physical and the simulated environment, allowing a project to run identically in either setting. On the left you can see a video of a simple example of our platform combined with the simulator. Here are more examples from our virtual environment. These examples were created for different teaching scenarios, and there is also a video in the top right that demonstrates a line-following project inside the simulator.
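Under the hood, a Python script can drive a CoppeliaSim scene through the simulator's ZeroMQ remote API. Here is a minimal, self-contained sketch of that pattern; the scene object paths are assumptions for illustration, not the names used in the actual FossBot scene.

```python
# pip install coppeliasim-zmqremoteapi-client
from coppeliasim_zmqremoteapi_client import RemoteAPIClient

client = RemoteAPIClient()      # connects to a locally running CoppeliaSim
sim = client.require('sim')     # older clients use client.getObject('sim')

# Object paths below are assumptions about the scene, not FossBot's actual names.
left_motor = sim.getObject('/fossbot/leftMotor')
right_motor = sim.getObject('/fossbot/rightMotor')
sonar = sim.getObject('/fossbot/ultrasonicSensor')

sim.startSimulation()
try:
    while sim.getSimulationTime() < 10.0:
        # Drive forward, but stop while an obstacle is within 10 cm.
        detected, distance, *_ = sim.readProximitySensor(sonar)
        speed = 0.0 if (detected and distance < 0.1) else 2.0  # rad/s
        sim.setJointTargetVelocity(left_motor, speed)
        sim.setJointTargetVelocity(right_motor, speed)
finally:
    sim.stopSimulation()
```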
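And since the advanced mode exposes plain Python, here is a rough sketch of what the line-following-with-obstacle-stop behavior from the videos above could look like in code. The `fossbot` module and its methods remain hypothetical placeholders, as in the earlier sketch.

```python
from fossbot import Robot  # hypothetical library, as before

robot = Robot()

OBSTACLE_CM = 10   # stop if something is closer than this
SPEED = 50         # motor speed, in arbitrary library units

while True:
    # Wait while an obstacle is in front of the robot.
    if robot.get_distance() < OBSTACLE_CM:
        robot.stop()
        continue

    # Read the three floor-facing IR sensors (True = dark line detected).
    left, middle, right = robot.get_floor_sensors()

    if middle:
        robot.move_forward(SPEED)   # centered on the line: go straight
    elif left:
        robot.rotate_left(SPEED)    # line drifting left: turn left
    elif right:
        robot.rotate_right(SPEED)   # line drifting right: turn right
    else:
        robot.rotate_left(SPEED)    # line lost: sweep until found again
```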
We are trying to constantly improve the robot, and we have received strong interest from our university students. Students are already creating educational material and developing new features, such as real-time graphs, and we hope to integrate these new features into the main platform soon.

Now let's dive deeper into the workings of our platform. Firstly, we have created a custom library to simplify the process of controlling the electronic parts of the robot; that library is based on my 2019 Google Summer of Code contribution. The platform was built using Flask, Socket.IO, Python, and Blockly, and can be deployed to the robot via Docker containers for easy distribution and maintenance. After powering up, the robot tries to connect to a known Wi-Fi network, and if that is not possible, it creates its own wireless access point. The platform can then be accessed through the user's preferred browser, such as Chromium, and all the necessary tools come pre-installed inside the robot. This allows a hassle-free experience, as the user never needs to install anything on their computer.

In this slide we present a brief overview of how users can access the multiple levels of the robot. The top level is designed for less experienced users and is where the platform resides, inside multiple Docker containers. Access to this layer is through a web browser. The second and third levels are designed for advanced users with knowledge of the Python language and Bash, and can be accessed through SSH.

Before concluding, I would like to present the future prospects for the robot and its potential use in higher education. At Harokopio University we have already started to examine the potential of using FossBot as a machine-learning robotics platform, by combining the custom high-level library, the simulated environment, and the physical robot with advanced algorithms. With this combination, we believe FossBot has the potential to be used in various ways, for example for reinforcement learning. Additionally, if advanced sensors such as cameras or LiDAR are attached to the robot, it can be used as a self-driving platform or a computer-vision platform.
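As an illustration of the reinforcement-learning idea, here is a sketch of what a Gymnasium-style environment around the robot (or its simulator) could look like. The `fossbot` calls remain hypothetical placeholders, and the toy task definition is invented for illustration, not part of the project.

```python
import gymnasium as gym
import numpy as np

from fossbot import Robot  # hypothetical library, as in the earlier sketches

class FossBotLineEnv(gym.Env):
    """Toy line-following task: reward the agent for staying on the line."""

    def __init__(self):
        self.robot = Robot()
        # Observation: the three floor-facing IR sensors as 0/1 values.
        self.observation_space = gym.spaces.MultiBinary(3)
        # Actions: 0 = forward, 1 = rotate left, 2 = rotate right.
        self.action_space = gym.spaces.Discrete(3)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.robot.stop()
        return np.array(self.robot.get_floor_sensors(), dtype=np.int8), {}

    def step(self, action):
        if action == 0:
            self.robot.move_forward_distance(1)   # one "step" forward
        elif action == 1:
            self.robot.rotate_left_degrees(15)
        else:
            self.robot.rotate_right_degrees(15)
        obs = np.array(self.robot.get_floor_sensors(), dtype=np.int8)
        reward = 1.0 if obs[1] else -0.1  # middle sensor on the line is good
        return obs, reward, False, False, {}
```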
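Going back to the platform internals for a moment, here is a minimal sketch of the Flask plus Socket.IO pattern described above: the browser emits an event, and the server pushes fresh sensor readings back over the socket. This illustrates the pattern only, not FossBot's actual server code, and the `fossbot` library is again a hypothetical placeholder.

```python
from flask import Flask
from flask_socketio import SocketIO

from fossbot import Robot  # hypothetical library, as before

app = Flask(__name__)
socketio = SocketIO(app)
robot = Robot()

@socketio.on('read_sensors')
def read_sensors():
    # Pushed to the terminal window in the browser UI.
    socketio.emit('sensor_data', {
        'distance': robot.get_distance(),
        'light': robot.get_light_sensor(),
    })

if __name__ == '__main__':
    # Bind to all interfaces so the platform is reachable over the robot's
    # Wi-Fi connection or its own access point.
    socketio.run(app, host='0.0.0.0', port=8080)
```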
So that brings us to the end of our presentation. I hope you enjoyed it.

Before closing, I want to add a couple of things to this excellent presentation by Christos. First of all, I have to say that technology wouldn't be successful without content, first of all, and without people. So, with the help of the Open Technologies Alliance, we have also managed to form a great group of educators from primary and secondary education who are currently developing educational activities and educational content for teachers in Greece. They are running seminars on FossBot, educating almost 1,000 teachers around Greece on programming and on using FossBot in their teaching activities. The benefit is that we have the virtual simulation environment for FossBot, so teachers can start working in the simulation environment, and then everything they have created there they can apply directly to FossBot: they can print FossBot, assemble it, and use it in the actual teaching process.

Another thing I would like to add, on the higher-education part of this work, is that we are currently working with some colleagues at the university to develop a short-term curriculum, let's say a one-year curriculum, with basic IT courses such as data management, IoT programming, basic Python programming, machine learning, and AI, as Christos said, in order to develop content that in most of its activities will use FossBot as the main demonstration platform. This is another effort we are working on at the moment, and we hope it will soon bring results.

And I would like to thank you once again for your attendance.