Autonomous vehicles rely on a robust understanding of their surroundings to make safe and efficient decisions in complex urban environments. One of the key challenges in this domain is the ability to perceive dynamic scenes and anticipate the behavior of other road users in real time.
Emerging approaches aim to improve perception systems in automated driving, particularly in urban contexts. Building on recent advances in sensor fusion and environmental understanding, this work investigates how aerial perspectives and dynamic data integration can support safer and more adaptive driving behavior. The goal is to enhance perception systems by reducing reliance on static prior knowledge, which can quickly become outdated in ever-changing cityscapes.
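The text does not specify how aerial and onboard views would be combined, but one common mechanism for fusing two sources of environmental evidence is independent log-odds fusion of occupancy grids. The sketch below is purely illustrative: the grid values, the assumption that both views are registered to the same grid, and the independence assumption are all placeholders, not details from this work.

```python
import numpy as np

def fuse_log_odds(onboard: np.ndarray, aerial: np.ndarray) -> np.ndarray:
    """Fuse two occupancy-probability grids by adding their log-odds.

    Both inputs hold occupancy probabilities in (0, 1) on the same grid.
    An aerial view can supply evidence for cells occluded from the
    vehicle's own sensors, reducing reliance on static prior maps.
    """
    def log_odds(p: np.ndarray) -> np.ndarray:
        return np.log(p / (1.0 - p))

    fused = log_odds(onboard) + log_odds(aerial)
    return 1.0 / (1.0 + np.exp(-fused))  # sigmoid: back to probability

# Cell 0: onboard sensors are uninformative (p = 0.5), the aerial view
# marks the cell as likely occupied (p = 0.9) -> fused result is 0.9,
# since a 0.5 prior contributes zero log-odds.
# Cell 1: both views agree the cell is likely free (p = 0.2 each).
onboard = np.array([[0.5, 0.2]])
aerial = np.array([[0.9, 0.2]])
fused = fuse_log_odds(onboard, aerial)
```

Note that two agreeing low-probability observations reinforce each other: the fused value for cell 1 drops below either input, which is exactly the behavior that makes live fused evidence preferable to a single static prior.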
Another focus lies in the tighter integration of perception tasks to minimize abstraction losses and promote a more coherent flow of information from raw sensor inputs to high-level predictions. This involves leveraging real-world driving data, applying advanced training methods, and evaluating system performance under realistic conditions using research vehicles.
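One way to read "tighter integration of perception tasks" is a shared-backbone, multi-head architecture, where several tasks consume one learned representation instead of passing hand-crafted intermediate abstractions between separate pipelines. The following is a minimal conceptual sketch of that pattern; the dimensions, the two heads, and the use of a plain linear-plus-tanh backbone are assumptions for illustration, not the architecture of this work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared backbone weights: one feature map feeds every task head, so no
# information is lost to a hand-designed intermediate representation.
W_backbone = rng.normal(size=(8, 16))   # raw sensor dim 8 -> feature dim 16
W_detect = rng.normal(size=(16, 3))     # head 1: e.g. object-class scores
W_predict = rng.normal(size=(16, 2))    # head 2: e.g. 2-D motion offsets

def forward(x: np.ndarray):
    """Map raw sensor vectors to outputs of two task heads."""
    features = np.tanh(x @ W_backbone)  # shared representation
    return features @ W_detect, features @ W_predict

x = rng.normal(size=(4, 8))             # batch of 4 raw sensor vectors
det, pred = forward(x)                  # shapes (4, 3) and (4, 2)
```

Because both heads backpropagate into the same features during training, high-level prediction errors can directly shape how raw inputs are encoded, which is the coherence-of-information-flow argument made above.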
This work aims to advance the development of perception architectures that are more resilient and efficient, and that move closer to human-level foresight in complex driving scenarios.
