Workshop on Active Perception for Autonomous Systems in Unknown Environments
• Date and Time: Wednesday 7 December 2022, 12:50 pm – 3:00 pm (UTC/GMT+10, Gold Coast, Australia)
• Room: G42_1.04, Griffith University Gold Coast
• Organizers:
Dr. Hyondong Oh, Ulsan National Institute of Science and Technology (UNIST), South Korea
Dr. Cunjia Liu, Loughborough University, UK
• Workshop Schedule (subject to change)
| Time | Session |
| --- | --- |
| 12:50 – 12:55 | Welcome Address |
| 12:55 – 13:20 | Presentation 1: Source term estimation using information-theoretic and deep learning-based strategies for mobile sensors (Dr. Hyondong Oh, UNIST, South Korea) |
| 13:20 – 13:45 | Presentation 2: Autonomous robotics for characterising gas releases in complex environments (Dr. Cunjia Liu, Loughborough University, UK) |
| 13:45 – 14:10 | Presentation 3: Robust Robot Perception: From Semantics-Enabled Localisation to All-Weather SLAM (Dr. Sen Wang, Imperial College London, UK) |
| 14:10 – 14:35 | Presentation 4: Guidance with active observability enhancement (Prof. Hyo-Sang Shin, Cranfield University, UK) |
| 14:35 – 15:00 | Presentation 5: Trajectory optimization for multi-target tracking using joint probabilistic data association filter (Prof. Shaoming He, Beijing Institute of Technology, China) |
This workshop is organized and sponsored by the UNIST Center for Autonomous Unmanned Monitoring Systems and supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (2020R1A6A1A03040570).
• Speaker biographies
Dr. Cunjia Liu
Cunjia Liu is currently a leading researcher in unmanned vehicles at Loughborough University. He received his PhD in autonomous vehicle control from the Department of Aeronautical and Automotive Engineering (AAE) at Loughborough University in 2011. After completing his PhD, he worked briefly in industry as a control system engineer before returning to Loughborough University as a Research Associate, where he became an academic in 2013. His research develops autonomous control methods and robotic systems for a range of application domains, spanning perception, remote sensing, data assimilation and intelligent decision-making for precision agriculture and environmental monitoring. He has published more than 100 research papers in journals and conferences, secured about £1.7m in research funding as PI, and holds a further £3m funding profile as Co-I. He was awarded the Charles Sharpe Beecher Prize by the Institution of Mechanical Engineers in 2013 for his contributions to autonomous helicopter control.
Dr. Sen Wang
Dr. Sen Wang is a Senior Lecturer (Associate Professor) in Robotics and Autonomous Systems at the Department of Electrical and Electronic Engineering and I-X, Imperial College London. Previously, he was an Associate Professor at the Edinburgh Centre for Robotics and the UK National Robotarium, Heriot-Watt University, and a post-doctoral researcher at the University of Oxford. His research focuses on robot perception and long-term autonomy, especially robot vision, Simultaneous Localisation and Mapping (SLAM) and robot learning. He has been PI/Co-I on projects worth over £25M funded by UKRI, EPSRC, EU Horizon and others. He was awarded the 2022 AI Most Influential Scholar Award Honourable Mention in Robotics. He serves as an Associate Editor of IEEE Transactions on Automation Science and Engineering and IEEE Robotics and Automation Letters, and for the IEEE ICRA and IROS conferences.
Prof. Hyo-Sang Shin
Hyo-Sang Shin is Professor of Guidance, Control and Navigation Systems in the Centre for Autonomous and Cyberphysical Systems at Cranfield University. He received his BSc in Aerospace Engineering from Pusan National University in 2004, an MSc in Aerospace Engineering (flight dynamics, guidance and control) from KAIST in 2006, and a PhD in cooperative missile guidance from Cranfield University in 2011. He has published over 200 journal and conference papers and has given many invited lectures, talks and keynotes at universities and in industry, mainly on flight control and guidance, sensor, data and information fusion, cooperative control and aircraft ISTAR applications. He has also been involved in many research programmes and has coordinated several European projects. Professor Shin has significant experience and a strong track record in research and development projects on flight control, guidance, and sensor/data/information fusion. He is a member of various technical committees (the IFAC Technical Committee on Aerospace Control and the IEEE Technical Committee on Aerial Robotics and Unmanned Aerial Systems) and of programme and editorial boards. He was programme co-chair of the 21st IFAC Symposium on Aerospace Control and programme chair of the 2019 IEEE Conference on Research, Education and Development of Unmanned Aircraft Systems (RED-UAS). He is also an Associate Editor of several international journals, including IEEE Transactions on Aerospace and Electronic Systems and the Journal of Intelligent and Robotic Systems. Professor Shin is Head of the Autonomous and Intelligent Group and leads related research areas at Cranfield University. His current research activities include, but are not limited to, data-centric guidance and control, decision making for multi-agent systems, information-driven sensing and fusion, multiple target tracking, and improvement of multiple vehicle cooperation.
Prof. Shaoming He
Shaoming He received the B.Sc. and M.Sc. degrees in aerospace engineering from the Beijing Institute of Technology, Beijing, China, in 2013 and 2016, respectively, and the Ph.D. degree in aerospace engineering from Cranfield University, Cranfield, U.K., in 2019. He is currently a Professor with the School of Aerospace Engineering, Beijing Institute of Technology, and also Recognized Teaching Staff with the School of Aerospace, Transport and Manufacturing, Cranfield University. His research interests include aerospace guidance, multi-target tracking, and trajectory optimization. Dr. He received the Lord Kings Norton Medal from Cranfield University as its most outstanding doctoral student in 2020.