*** Please accept our apologies if you receive multiple copies of this CFP ***
============================================
                 CALL FOR PAPERS
============================================

ICRA 2023 Workshop on Multidisciplinary Approaches to Co-Creating Trustworthy Autonomous Systems (MACTAS)

https://mactasworkshop.github.io/2023/#cfp

**********************************************************

The workshop on Multidisciplinary Approaches to Co-Creating Trustworthy Autonomous Systems (MACTAS) will bring together academics and industry practitioners from a wide range of disciplines and backgrounds, including robotics, engineering, AI, computer science, social science, humanities, design, and law. Defining autonomous systems as systems involving software applications, machines, and people that are able to take actions with little or no human supervision, the workshop will explore different definitions of trustworthy autonomous systems (TAS) and individual aspects of trust from a multidisciplinary perspective.

We are interested in the many factors that contribute to the trustworthiness of autonomous systems, which include but are not limited to: robustness and resilience in dynamic and uncertain environments; assurance of the design and operation of autonomous systems through verification and validation processes; the confidence the systems inspire as they evolve their functionality; their explainability, accountability, and understandability to a diverse set of users; defences against attacks on the systems, their users, and the environments in which they are deployed; governance and regulation of their design and operation; and the consideration of human values and ethics in their development and use.

Trust is a multi-dimensional issue and is conceptualised differently across disciplines. Hence, we invite novel contributions (short papers of 2-4 pages and regular papers of 6 pages) as well as already published journal/conference papers covering a wide set of topics that will be attractive to both technical and non-technical audiences:

- Methodologies to certify autonomous systems
- Public perception of autonomous systems
- Explainable and interpretable AI solutions for real-world applications
- Safety and security of autonomous systems
- Trustworthy and resilient human-machine teaming
- Regulation
- Notions of trust in autonomous systems
- Responsible Research and Innovation for trustworthy AI and autonomous systems
- Transparency of AI systems
- System of humans

Submission Types
- Short Paper (2-4 pages excluding references)
- Regular Paper (6 pages excluding references)
- Published papers (to be presented at the workshop)

Please use the standard ICRA template when submitting a novel contribution. All accepted papers will be presented in a spotlight talk as well as a poster. Additionally, all accepted contributions will also be invited to submit an extended version to our planned Special Issue on TAS (TBD).

Best Paper / Poster Awards
- Best TAS Paper Award
- Best TAS Poster Award

Our international Programme Committee will review all submissions using EasyChair and will also select a paper for the Best TAS Paper Award (£250).
The workshop participants will vote for a paper to receive the Best TAS Poster Award (£250).

Important dates:
- Paper submission: 31 March, 2023 (AOE)
- Acceptance notification: 26 March, 2023
- Camera-ready: 10 May, 2023
- Workshop date (TBD): 29-30 May, 2023

Organising Committee:
Lars Kunze, University of Oxford, UK
Sinem Getir Yaman, University of York, UK
Mohammad Naiseh, University of Southampton, UK
Ayse Kucukyilmaz, University of Nottingham, UK
Baris Serhan, University of Manchester, UK
Zhengxin Yu, Lancaster University, UK