HTE-ComSoc Thesis Award 2020

Papers under review

Marcell Rausch: Drone Localization in Ad-hoc Indoor Environment  (MSc)

reviewers identified: none
received reviews: none

In this thesis, I propose a system that uses single-point LIDAR sensors to scan a multicopter's environment and build a 3D map of its surroundings using a SLAM algorithm. The selected sensor is the VL53L1X time-of-flight LIDAR sensor developed by STMicroelectronics. Due to the recent COVID-19 pandemic, I had no access to the laboratory equipment. As a workaround, I used the Gazebo simulator and simulated LIDAR measurements as close to real VL53L1X measurements as possible.

Various camera- and ranging-sensor-based solutions exist for SLAM applications. Ranging-sensor-based setups mostly use heavy, sensitive, but highly accurate planar scanners. A solution for hobbyists and students, the Crazyflie Multi-ranger deck, uses five of the same VL53L1X sensors for SLAM applications. Another product, the Skydio 2, is an autonomous drone capable of active tracking and obstacle avoidance even in dense forests, using six fisheye cameras to scan its environment; it is the most reliable drone for autonomous tracking currently available.

I used the PX4 repository, which relies on Gazebo to simulate an Iris drone, and placed two LIDAR sensors on it, one evenly scanning the top hemisphere and one the bottom. LIDAR and IMU measurements are recorded into a Rosbag file, which is then filtered to simulate a number of VL53L1X sensors with predefined orientations and settings. These sensor parameters are determined by measurements made with an actual VL53L1X sensor. The filtering of the Rosbag file is done offline, which allows rapid testing of sensor settings and layouts and enables comparison of SLAM performance.
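The offline filtering step can be illustrated with a simplified sketch (plain Python rather than the actual Rosbag API; the ray representation, function names, and tolerance are my own assumptions for illustration): from a dense simulated scan, keep only the rays closest to a set of predefined sensor orientations, and drop samples that arrive faster than the VL53L1X sampling period.

```python
def filter_scan(rays, sensor_angles_deg, tolerance_deg=0.5):
    """Keep only rays whose bearing lies within `tolerance_deg` of one of
    the simulated VL53L1X orientations (illustrative stand-in for the
    offline Rosbag filtering described above)."""
    kept = []
    for angle_deg, distance_m in rays:
        for s in sensor_angles_deg:
            # Compare bearings modulo 360 degrees.
            diff = abs((angle_deg - s + 180.0) % 360.0 - 180.0)
            if diff <= tolerance_deg:
                kept.append((s, distance_m))
                break
    return kept

def downsample_by_period(stamped, period_s=0.033):
    """Drop measurements arriving faster than the sensor's 33 ms
    sampling period (one simulated sensor's timeline)."""
    out, last_t = [], None
    for t, value in stamped:
        if last_t is None or t - last_t >= period_s:
            out.append((t, value))
            last_t = t
    return out

# A dense simulated scan: one ray per degree, constant 2 m range.
dense = [(a, 2.0) for a in range(360)]
eight = [i * 360.0 / 8 for i in range(8)]  # 8 sensors, evenly spaced
print(len(filter_scan(dense, eight)))      # 8 rays survive the filter
print(len(downsample_by_period([(i * 0.01, 0) for i in range(10)])))  # 3
```

Because the filtering runs on recorded data, a new sensor count or layout only requires rerunning this step, not a new simulation.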

For the 2D SLAM evaluation, I used 13 sensors evenly distributed in a circle. Each sensor setting is introduced one at a time so that its effect can be assessed independently. I found that a 3x3 resolution and a 33 ms sampling rate in long preset mode is the optimal VL53L1X configuration for SLAM applications, and that a minimum of 8 sensors is required for Cartographer to track the drone and map its surroundings reliably.
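The evenly spaced circular layouts can be reproduced in a few lines (the unit-vector representation is my own illustration, not taken from the thesis):

```python
import math

def ring_layout(n):
    """Orientations (degrees) and unit direction vectors for n
    single-point LIDARs spread evenly around a circle."""
    angles = [i * 360.0 / n for i in range(n)]
    vectors = [(math.cos(math.radians(a)), math.sin(math.radians(a)))
               for a in angles]
    return angles, vectors

angles_13, _ = ring_layout(13)  # layout used for the 2D evaluation
angles_8, _ = ring_layout(8)    # minimum found for reliable tracking
print(round(angles_13[1], 2))   # 27.69 degrees between neighbours
print(angles_8[1])              # 45.0 degrees between neighbours
```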

For the 3D SLAM evaluation, I used 43 simulated LIDARs evenly distributed to scan the drone's environment in 3D. With all sensor parameters enabled, Cartographer SLAM proved unreliable and was unable to track the path of the drone. Accuracy can be increased by adding more sensors, but placing 43 sensors on a real quadcopter is already infeasible; I therefore concluded that 3D SLAM is not possible with the VL53L1X sensor and Cartographer SLAM.

In conclusion, the proposed LIDAR system with 8 simulated VL53L1X sensors can localize a quadcopter and map its environment in 2D, with two limitations: the largest distance in the room being mapped cannot exceed the maximum ranging distance of the sensor, and the flight speed must stay below a maximum of 5 km/h.
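The speed limit can be put into perspective with a quick back-of-the-envelope calculation (the per-sample displacement figure is my own arithmetic, not a number from the thesis): at the 33 ms sampling period found optimal above, a drone at 5 km/h moves under 5 cm between consecutive measurements.

```python
MAX_SPEED_KMH = 5.0      # flight speed limit from the evaluation
SAMPLE_PERIOD_S = 0.033  # VL53L1X sampling period (long preset)

speed_ms = MAX_SPEED_KMH / 3.6              # ~1.39 m/s
displacement = speed_ms * SAMPLE_PERIOD_S   # metres travelled per sample
print(f"{displacement * 100:.1f} cm per sample")  # 4.6 cm per sample
```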

 

Dávid Kóbor: Photonic approaches to beam steering of phased array antenna  (MSc)

reviewers identified: none
received reviews: none

The commercial roll-out of fifth-generation mobile networks is currently under way. To satisfy user demands for high data speeds, widespread coverage and low latency, completely new solutions are being commissioned in every layer of the network. Moreover, service providers are interested in transparent, highly reconfigurable, low-cost and scalable solutions.

I would like to draw attention to three specific areas of development in this field, all related to the physical layer. Firstly, in order to provide sufficient bandwidth, more and more radio access units (RAUs) are being directly connected to higher-level aggregation points of the network with optical fibre. Secondly, RAUs are being equipped with large antenna systems and use MIMO technology to ensure highly reliable wireless access. These sophisticated antennas also allow beamforming (up to several beams from the same system) and electronic steering of the beams, which could increase transmission distance and speed in a number of scenarios. Thirdly, and closely related to the previous point, higher frequencies up to the mm-wave range are being utilised. A large effort has been dedicated to developing electrical-optical systems capable of addressing all these issues at once. The motivation for doing so is the opportunity to reduce overall complexity while lowering both device and operational costs.

In my thesis I evaluate a photonic-assisted beam-steering optical-electrical system by measurements. I review the recent literature on this topic and choose the system to be constructed. I select and characterise the required devices, focusing on the optical ones. After elaborating on the underlying theory, I conduct measurements to assess the reliability and applicability of such systems and highlight the main factors contributing to their errors. Finally, I run simulations to determine how the error from the optical layer affects the behaviour of the antenna system.

 

Csongor Bartha: Application of Extended Berkeley Packet Filters in Cloud Environment  (MSc)

reviewers identified: none
received reviews: none

The wide-scale adoption of container-based virtualization technologies is supported by the Kubernetes container management system, which provides the tools required to implement reliable and scalable services. When operating services of great complexity, it is essential to monitor the containers and all the computing resources that make up a Kubernetes cluster, as monitoring greatly influences the quality of those services.

In my thesis, I present how the Berkeley Packet Filter, and especially its enhanced version (the extended Berkeley Packet Filter, eBPF), can be used for detailed monitoring of cloud systems. eBPF is a Linux kernel mechanism that allows packet-filtering and other small data-collection programs, written in user space, to be executed inside the kernel by attaching them to different kinds of probes. I present eBPF in detail, including its programming possibilities and its applications, with special focus on networking and monitoring. I also illustrate the more important currently available use cases related to eBPF.

I present the design of an eBPF-based Kubernetes monitoring system along with its components, and the detailed steps of its implementation. I examine how this system can be used for collecting metrics and monitoring two kinds of resources: TCP network traffic and cache memory. I conclude my thesis by evaluating the results of the measurements with the help of some of the most popular monitoring tools, and also cover how this system could be extended in the future.
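The metric-export side of such a system can be sketched without eBPF itself (the metric name, label set, and sample values below are hypothetical; the in-kernel counters would come from the probes described above): aggregated per-pod TCP byte counters are rendered in the Prometheus exposition format that popular monitoring tools consume.

```python
def render_prometheus(metric_name, help_text, samples):
    """Render counter samples as Prometheus exposition text.
    `samples` maps a label set (a tuple of key-value pairs) to a value."""
    lines = [f"# HELP {metric_name} {help_text}",
             f"# TYPE {metric_name} counter"]
    for labels, value in sorted(samples.items()):
        label_str = ",".join(f'{k}="{v}"' for k, v in labels)
        lines.append(f"{metric_name}{{{label_str}}} {value}")
    return "\n".join(lines)

# Hypothetical per-pod TCP byte counters, as eBPF probes might report them.
tcp_bytes = {
    (("pod", "frontend-abc"), ("direction", "tx")): 10240,
    (("pod", "frontend-abc"), ("direction", "rx")): 20480,
}
print(render_prometheus("tcp_bytes_total",
                        "TCP bytes observed per pod.", tcp_bytes))
```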

 

Attila Bálint Gróf: Deep Learning based Object Analytics on UAV Imagery  (MSc)

reviewers identified: none
received reviews: none

In the past few years, deep learning solutions have conquered most fields of computer vision. Today, state-of-the-art neural networks can perform tasks that previously required human input, such as style transfer or vehicle counting.

There is an increasing number of deep learning papers published each year and object detection is among the most researched fields. Object detection neural networks have outperformed other solutions based on traditional image processing in most baseline scenarios. The UAV (Unmanned Aerial Vehicle) industry has been very active since the 19th century, and in recent years small and compact drones have become more capable and accessible.

In this work, a fully functional vehicle counting system is presented that is based on deep learning algorithms and uses a DJI Spark drone for vision. An SSD (Single Shot Multibox Detector) object detection neural network is trained with transfer learning on three different datasets: first on the COCO database, then on the VEDAI (Vehicle Detection in Aerial Imagery) database. An aerial vehicle database is also created in this work, containing more than 250 hand-labelled images (some containing more than 50 cars and 5 buses). The SSD network is trained on this custom dataset and reaches a global loss of only ~0.2. This trained model is later used to serve inference requests.

Various inference environments are analysed: CPUs, GPUs, an Android SoC and FPGAs are benchmarked and evaluated for inference performance. The best result comes from the Azure FPGA, which reaches 0.15 seconds per image when sending 10 images in one batch.
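The effect of batching on per-image latency can be reproduced with a small harness (the dummy model below simply sleeps to mimic a fixed per-request overhead plus a per-image cost; in the thesis the workload is the SSD network on each platform):

```python
import time

def benchmark(infer_fn, images, batch_size):
    """Average seconds per image when sending `batch_size`
    images per inference request."""
    start = time.perf_counter()
    for i in range(0, len(images), batch_size):
        infer_fn(images[i:i + batch_size])
    elapsed = time.perf_counter() - start
    return elapsed / len(images)

# Dummy stand-in model: fixed per-request overhead plus per-image cost.
def dummy_infer(batch):
    time.sleep(0.02 + 0.005 * len(batch))

images = list(range(20))
for bs in (1, 10):
    spi = benchmark(dummy_infer, images, bs)
    print(f"batch={bs:2d}: {spi:.3f} s/image")
```

Batching amortises the per-request overhead over more images, which is why the batched FPGA result above is so much faster per image.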

The vehicle counting system operates by the DJI Spark drone sending its live video feed to an Android application, which runs a MobileNet v2 SSD to process the image frames. The average inference time is measured at ~280 ms. To further improve detection results and remove ghost objects from the final detections, a new convolutional deep neural network is introduced. This new model is capable of removing ghost vehicles from a video based on the initial detection results. Finally, all results and implementations are analysed and evaluated.
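In the thesis the ghost-removal step is a convolutional network; as a simpler illustration of the underlying temporal-consistency idea (the voting threshold and the simplified detection format are my own assumptions, not the thesis's method), a detection can be kept only if it reappears across enough frames:

```python
from collections import Counter

def remove_ghosts(frame_detections, min_hits=3):
    """Keep only detections that appear in at least `min_hits` frames.
    Detections are simplified to hashable IDs (e.g. quantized positions);
    spurious 'ghost' detections rarely persist across frames."""
    counts = Counter(d for frame in frame_detections for d in set(frame))
    return {d for d, c in counts.items() if c >= min_hits}

# Four consecutive frames: the cars are stable, "ghost_x" flickers once.
frames = [
    {"car_a", "car_b"},
    {"car_a", "car_b", "ghost_x"},
    {"car_a", "car_b"},
    {"car_a"},
]
print(sorted(remove_ghosts(frames)))  # ['car_a', 'car_b']
```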

 

Instructions

Scoring instructions (doc)

Review deadline 30 August, Sunday.


If you need further information, please contact me at simon@tmit.bme.hu.

Csaba Simon

Past international events

SDL 2017 - 18th International System Design Languages Forum Model-driven dependability engineering
October 9-11, Budapest
http://www.sdl2017.hte.hu/

 

ONDM 2017 - 21st International Conference on Optical Network Design and Modeling
May 15-17, Budapest
http://www.ondm2017.hte.hu

 

SPECOM 2016 - 18th International Conference on Speech and Computer
August 23-27, Budapest
http://www.specom2016.hte.hu/

 

EUSIPCO 2016 - 24th European Signal Processing Conference
29 August - 2 September, Budapest
http://www.eusipco2016.org

 

IEEE HPSR 2015 - 2015 IEEE 16th International Conference on High Performance Switching and Routing
July 1-4, Budapest
http://www.ieee-hpsr.org

 

21st European Wireless Conference
May 20-22, 2015 Budapest
http://ew2015.european-wireless.org


IEEE PerCom - IEEE International Conference on Pervasive Computing and Communications

March 24-28, 2014, Budapest, Hungary
http://www.percom.org/2014
IEEE PerCom 2014 conference


IEEE ICC2013 - IEEE International Conference on Communications
June 9-13, 2013, Budapest, Hungary
http://www.ieee-icc.org/2013