Adverse driving in equatorial and Nordic weather.
Background
Autonomous vehicles require several sensors to "understand" what is around the vehicle. Common sensors include visible-light cameras, thermal cameras, Light Detection and Ranging (LiDAR) and 4D imaging radar, as well as meta-information such as the vehicle's location, movement, vibration and the current time. Information about road conditions from Cooperative Intelligent Transport Systems (C-ITS) is beginning to appear through the deployment of roadside units. Weather is a critical factor in autonomous driving, since adverse, or harsh, weather degrades sensor performance. By harsh we mean snow, ice, fog and rain as well as sand, haze and heat effects; simply put, cold and polar as well as hot and equatorial conditions.
Figure 1 – Autonomous driving vision in the future (image ZF, Germany).
Given that this is an application for collaboration between Sweden and Singapore, and that our domain is autonomous driving with a focus on harsh weather, we draw attention to the difference between the two climates.
Sweden's conditions | Singapore's conditions
Nordic | Equatorial
Cold, Nordic conditions | Hot, hazy conditions
Often rural | Often urban
Low sun | High sun
Lightly used streets | Crowded streets
20 people per km² | 8,400 people per km²
Table 1 – Conditions in Sweden compared to Singapore with respect to autonomous driving.
Sweden's contribution through RISE
To obtain the best detection of objects such as other vehicles, pedestrians and cyclists, the data should be of the best quality. From sensors to machine-learning algorithms, using the data sources above, we need to ascertain the quality at each step and assign a value from 1 (poor) to 9 (excellent). This score, or metric, we call the Data Readiness Level (DRL).
The tasks will be to examine the data sources we have: cameras, both RGB and thermal, as well as LiDAR. From these data we have constructed the pipelines used by autonomous vehicles in order to determine data quality. A simple example: if an object cannot be detected (classically, a bounding box), we look for flaws in the data, such as noise or missing values. The object-detection algorithms will be provided by RISE, based mostly on open-source software, and algorithms paired with high-quality data will then be implemented.
Figure 2 – Data Readiness Levels (DRL).
The Data Readiness Level is akin to Technology Readiness Levels, but for systemizing data quality. It formulates data quality as a metric (1-9) which we use to assess data for various driving situations. Level 1 implies that more data needs to be gathered, or that the data has errors, corruption, missing values and so on. Level 9 indicates the data is complete, fits all intended situations to "observe" and can be used without significant issues. The situations are often described by Operational Design Domain (ODD) documents.
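To make the metric concrete, the following is a minimal sketch in Python of how one batch of sensor frames could be mapped onto a 1-9 DRL score. The check fields, weights and thresholds are illustrative assumptions, not the project's actual rubric.

from dataclasses import dataclass

@dataclass
class FrameStats:
    total_frames: int        # frames expected in the batch
    missing_frames: int      # frames that failed to record
    corrupted_frames: int    # frames that cannot be decoded
    unlabeled_frames: int    # frames without ground-truth boxes
    mean_snr_db: float       # average signal-to-noise ratio of the sensor

def data_readiness_level(stats: FrameStats) -> int:
    """Map simple completeness/quality checks onto a 1-9 DRL score."""
    completeness = 1.0 - (stats.missing_frames + stats.corrupted_frames) / stats.total_frames
    label_cover = 1.0 - stats.unlabeled_frames / stats.total_frames
    noise_ok = min(stats.mean_snr_db / 30.0, 1.0)   # 30 dB treated as "clean" (assumption)
    # Weighted aggregate in [0, 1], scaled onto the 1-9 DRL range.
    score = 0.5 * completeness + 0.3 * label_cover + 0.2 * noise_ok
    return max(1, min(9, round(1 + 8 * score)))

# A mostly complete batch with some unlabeled frames scores near the top of the scale (about 8).
print(data_readiness_level(FrameStats(1000, 20, 5, 150, 22.0)))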
Figure 3 – Data Readiness Levels (DRLs) in practice. RGB, Thermal Camera and LiDAR data merged and evaluated. Fused data from the Finnish Geospatial Institute (FGI) which we use to assess the data quality.
There is a significant difference between "real" and "curated" data: public datasets are cleaned, long distances are removed, and some locations are obscured. This is adequate for researchers and practitioners who want to import data and try ML algorithms, but the real world is different. RISE has performed data-quality assessments on two ROADVIEW datasets and on public data, KITTI and mmDetection3D.
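As an illustration of the kind of check run on raw versus curated data, the sketch below counts invalid returns in a KITTI-style LiDAR scan (float32 records of x, y, z and intensity, following the public KITTI layout); the file path is hypothetical.

import numpy as np

def lidar_scan_health(path: str) -> dict:
    """Basic completeness statistics for one KITTI-style LiDAR scan."""
    pts = np.fromfile(path, dtype=np.float32).reshape(-1, 4)
    dist = np.linalg.norm(pts[:, :3], axis=1)
    return {
        "points": len(pts),
        "nan_points": int(np.isnan(pts).any(axis=1).sum()),
        "zero_returns": int((dist < 1e-3).sum()),   # returns collapsed to the sensor origin
        "max_range_m": float(dist.max()),
    }

# print(lidar_scan_health("kitti/velodyne/000000.bin"))  # path is illustrative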
Figure 4 – Singapore's streets in the nuScenes AD dataset.
We would like to include the competence built up in Singapore for autonomous driving. We will discuss both proprietary and open datasets; the nuScenes dataset features Singapore, as shown in the image to the right. Note that FellowBot's digital-twin competence can generate most weather conditions using Unreal Engine 5 in many city scenarios.
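The Singapore scenes in nuScenes can be pulled out programmatically with the nuscenes-devkit; the sketch below filters them by location and by a weather keyword in the scene description. The dataset path and version are illustrative.

from nuscenes.nuscenes import NuScenes

nusc = NuScenes(version="v1.0-mini", dataroot="/data/nuscenes", verbose=False)

for scene in nusc.scene:
    # Each scene links to a log record that carries the recording location.
    location = nusc.get("log", scene["log_token"])["location"]
    if location.startswith("singapore") and "rain" in scene["description"].lower():
        print(scene["name"], location, scene["description"])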
Sweden's contribution through FellowBot AB
FellowBot is a 2017 spin-off from the Royal Institute of Technology (KTH) and was nominated as one of the top 50 start-ups by Nvidia in 2018. FellowBot shares the Vision Zero goal and has claimed around 5% of the digital car-driving market in Sweden. This year we began collecting non-personal driving-behaviour data from many students in cooperation with driving schools. The FellowBot team is one of the first to integrate the full process, including weather generation, AI traffic navigation and co-synchronization between Unreal Engine and SUMO (traffic-simulation software). The outcome of this co-simulation lays the ground for decision support when testing both intelligent connected/autonomous vehicles and mobility systems as a whole.
Figure 5 – Co-simulation of SUMO for traffic planning with Unreal Engine for 3D visualization.
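For context, a minimal sketch of the SUMO side of such a co-simulation is given below: TraCI, SUMO's Python control interface, steps the traffic simulation and forwards vehicle poses to a rendering process. The send_to_unreal function and the scenario file name are placeholder assumptions, not FellowBot's actual Unreal Engine bridge.

import traci

def send_to_unreal(vehicle_id: str, x: float, y: float, angle: float) -> None:
    # Placeholder: a real bridge would push this over a socket or shared memory.
    print(f"{vehicle_id}: ({x:.1f}, {y:.1f}) heading {angle:.0f} deg")

traci.start(["sumo", "-c", "scenario.sumocfg"])   # scenario file is illustrative
for _ in range(1000):
    traci.simulationStep()                         # advance SUMO by one step
    for vid in traci.vehicle.getIDList():
        x, y = traci.vehicle.getPosition(vid)
        send_to_unreal(vid, x, y, traci.vehicle.getAngle(vid))
traci.close()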
During this project, FellowBot will test road and intersection generation in the OpenDRIVE format (a minimal sketch follows Figure 6). This integrates the design of future cities and infrastructure with mobility systems. Autonomous driving in urban mixed traffic is identified as much more challenging than motorway driving, considering vulnerable road users and heterogeneous traffic situations.
Figure 6 – Scenes and maps generated by FellowBot's software.
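As a minimal sketch of what OpenDRIVE generation involves, the snippet below emits a single straight road as an .xodr file using only the Python standard library; a real generator would add lanes, junctions and elevation profiles. Element and attribute names follow the OpenDRIVE schema, while the values here are illustrative.

import xml.etree.ElementTree as ET

root = ET.Element("OpenDRIVE")
ET.SubElement(root, "header", revMajor="1", revMinor="6", name="demo")
# One road, 100 m long, not part of a junction.
road = ET.SubElement(root, "road", name="straight", length="100.0", id="1", junction="-1")
plan = ET.SubElement(road, "planView")
geom = ET.SubElement(plan, "geometry", s="0.0", x="0.0", y="0.0", hdg="0.0", length="100.0")
ET.SubElement(geom, "line")                       # straight-line geometry segment
ET.ElementTree(root).write("demo.xodr", xml_declaration=True, encoding="utf-8")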
Singaporeâs contribution through National University of Singapore (NUS)
The Sweden-Singapore collaboration project described here fits well with the following NUS projects, because they complement RISE's and FellowBot's competences:
RoboDepth is a comprehensive evaluation benchmark designed to probe the robustness of monocular depth-estimation algorithms. It includes 18 common corruption types, spanning weather and lighting conditions, sensor failure and movement, and noise introduced during data processing.
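For illustration, the sketch below applies two corruption families of the kind RoboDepth covers, a fog-like wash-out and additive sensor noise, to an RGB frame with NumPy. The parameters are illustrative and are not RoboDepth's calibrated severity levels.

import numpy as np

def fog(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Blend the image towards a grey 'airlight' to mimic haze or fog."""
    airlight = np.full_like(img, 200.0)
    return (1.0 - strength) * img + strength * airlight

def sensor_noise(img: np.ndarray, sigma: float = 10.0) -> np.ndarray:
    """Add zero-mean Gaussian noise such as a failing sensor might produce."""
    return np.clip(img + np.random.normal(0.0, sigma, img.shape), 0, 255)

rgb = np.random.randint(0, 256, (375, 1242, 3)).astype(np.float32)  # stand-in frame
corrupted = sensor_noise(fog(rgb))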
Robo3D is an evaluation suite aimed at robust and reliable 3D perception in autonomous driving. NUS probes the robustness of 3D detectors and segmentors under out-of-distribution (OoD) scenarios, against corruptions that occur in real-world environments. Specifically, they consider natural corruptions in the following cases (a minimal sketch follows this list):
- Adverse weather conditions, such as fog, wet ground, and snow
- External disturbances that cause motion blur or result in missing LiDAR beams
- Internal sensor failures, including crosstalk, incomplete echoes and cross-sensor scenarios.
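A minimal sketch of the "LiDAR beam missing" corruption mentioned above: elevation angles are binned into rings and every other ring is dropped. Binning by elevation is a simplification we assume here; real sensors encode the beam index directly.

import numpy as np

def drop_beams(points: np.ndarray, n_beams: int = 64, keep_every: int = 2) -> np.ndarray:
    """points: (N, 3) xyz array; returns the cloud with roughly half the beams removed."""
    elevation = np.arctan2(points[:, 2], np.linalg.norm(points[:, :2], axis=1))
    rings = np.digitize(elevation, np.linspace(elevation.min(), elevation.max(), n_beams))
    return points[rings % keep_every == 0]

cloud = np.random.randn(100_000, 3).astype(np.float32)  # stand-in scan
sparse = drop_beams(cloud)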
Essentially, NUS's efforts and expertise in these areas are well matched with RISE's. RISE researches the systemization of quality detractors for autonomous driving in Nordic adverse weather, whilst NUS focuses on LiDAR data quality. A very recent video (2024-05-15) gives an overview of an adverse-driving challenge that NUS competed in in Japan; see this link.
Recommendations
"In ROADVIEW we also need to collaborate outside of Europe, particularly Asian (friendly) countries are good: Japan, South Korea, Taiwan, Singapore. Software and something to show to the project officer would be good, indeed in your visualisation tool."
Associate Prof. Eren Aksoy, Halmstad University, page.
"We have carefully reviewed the Sweden-Singapore collaboration project you mentioned and find it both relevant and intriguing. The prospect of combining our efforts and expertise in these areas is exciting."
Associate Prof. Ooi Wei Tsang, National University of Singapore, page.
Project Plan
# | Date | Format | Topic | Involved partners
1 | 2024-07-01 | Physical | Project start. Stockholm sync on scenario setup, locations, maps & weather. | RISE & FellowBot
2 | 2024-07-15 | Online | Sync meeting | RISE, FellowBot & NUS
3 | 2024-09-01 | Face-to-face | Introductions; personnel; groups' directions; existing project intros; demos from RISE, FellowBot and NUS | RISE, FellowBot (online) and NUS
4 | 2024-09-02 | Face-to-face | Software review; code review of DRL; code review of Robo*; wrap-up | RISE, FellowBot (online) and NUS
5 | 2024-10-01 | Online | Meeting follow-up; work going forward; digital environments; future application | RISE, FellowBot and NUS
6 | 2024-10-20 | Documentation | Project end. Project document delivered. | RISE, FellowBot and NUS
In conclusion
RISE AB, FellowBot AB and the National University of Singapore have been in discussions around autonomous driving, with a focus on adverse weather conditions, for several months. Given that European and Asian weather conditions differ, we understand there is mutual interest in collaborating, not only on the weather conditions but also on a detailed study of source code and AD data. A joint interest in systemizing LiDAR data quality is our clear focus.
Finally, in terms of joint work going forward, we note a recently opened programme entitled "International individual mobility for cutting-edge technology", where Singapore is one of the countries that Vinnova has highlighted for further staff and project exchanges.