********************************
Call for Papers
“Uncertainty Quantification for Computer Vision” Workshop & Challenge at ICCV 2023 (2nd Edition)
https://uncv2023.github.io/
********************************

Submission Deadline: July 18th, 2023 (AoE)
Two types of papers are welcome:
- Regular Papers (novel contributions not published previously)
- Extended Abstracts (novel contributions, or papers that have already been accepted for publication)

In the last decade, substantial progress has been made in the performance of computer vision systems, in large part thanks to deep learning. These advancements prompted sharp community growth and a rise in industrial investment. However, most current models lack the ability to reason about the confidence of their predictions; integrating uncertainty quantification into vision systems will help recognize failure scenarios and enable robust applications.

The ICCV 2023 workshop on Uncertainty Quantification for Computer Vision will 
consider recent advances in methodology and applications of uncertainty 
quantification in computer vision. Prospective authors are invited to submit 
papers on relevant algorithms and applications including, but not limited to:

  *   Applications of uncertainty quantification
  *   Failure prediction (e.g., OOD detection)
  *   Robustness in CV
  *   Safety critical applications in CV
  *   Domain-shift in CV
  *   Probabilistic deep models
  *   Deep probabilistic models
  *   Deep ensemble uncertainty
  *   Connections between NNs and GPs
  *   Incorporating explicit prior knowledge in deep learning
  *   Computational aspects and real-time probabilistic inference
  *   Output ambiguity, multi-modality and diversity

All papers will be peer-reviewed; accepted Regular Papers will be presented at the workshop and included in the ICCV Workshop Proceedings.

Challenge
The workshop features the MUAD Uncertainty Estimation for Semantic Segmentation Challenge, which evaluates the uncertainty estimation performance of semantic segmentation models. Participants will download the training and validation sets (RGB images with corresponding ground-truth maps) and the test set (RGB images only), then design and train their models. They should submit confidence maps that carry enough information for a decision-maker to identify out-of-distribution objects in the test-set images. Some test-set images feature different levels of adverse weather (rain, fog, snow), which challenges the robustness of the models.
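For orientation, a per-pixel confidence map can be derived from a trained model's softmax output, for instance via the maximum softmax probability (MSP). This is a minimal sketch under that assumption; the challenge does not prescribe a particular uncertainty score (predictive entropy, deep ensembles, and others are equally valid):

```python
import numpy as np

def confidence_map(logits):
    """Per-pixel confidence from segmentation logits of shape (C, H, W).

    Uses the maximum softmax probability (MSP) as the score; low
    values flag pixels that may belong to out-of-distribution objects.
    MSP is just one illustrative choice of uncertainty score.
    """
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=0, keepdims=True)
    p = np.exp(z)
    p /= p.sum(axis=0, keepdims=True)
    return p.max(axis=0)  # shape (H, W), values in [1/C, 1]

# Toy example: 3 classes on a 2x2 "image".
rng = np.random.default_rng(0)
conf = confidence_map(rng.normal(size=(3, 2, 2)))
```

In practice, the map would be computed from the model's output logits for each test image and thresholded (or ranked) to flag likely OOD regions.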
The MUAD challenge link: https://codalab.lisn.upsaclay.fr/competitions/8007
More information about the MUAD dataset and its download link are available at the MUAD website: https://muad-dataset.github.io/

Submission Instructions
At the time of submission, authors must indicate the desired paper track:

  *   Regular papers will be peer-reviewed following the same policy as the main conference and will be published in the proceedings (call for papers with guidelines and template: https://iccv2023.thecvf.com/submission.guidelines-361600-2-20-16.php; max 8 pages, with additional pages allowed for references only). These are meant to present novel contributions not published previously (submitted papers must not have been published, accepted, or be under review elsewhere).
  *   Extended abstracts are meant for preliminary works and for short versions of papers that have already been accepted, or are under review, preferably within the last year at major conferences or journals. These papers will undergo a separate reviewing process to assess their suitability for the workshop and will *not appear* in the workshop proceedings. Template and guidelines (max 4 content pages, additional pages allowed for references): https://github.com/uncv2023/uncv2023.github.io/blob/main/assets/uncv2023_workshopICCV_author-kit.zip
Submission site: https://openreview.net/group?id=thecvf.com/ICCV/2023/Workshop/UnCV

Important Dates (All times are end of day AOE)
Submission deadline: July 18th, 2023
Notification of acceptance: August 5th, 2023
Camera-ready deadline: August 10th, 2023

Organizing Committee

  *   Andrea Pilzer <https://andrea-pilzer.github.io/about/>, NVIDIA, Italy
  *   Elisa Ricci <http://elisaricci.eu/>, University of Trento, Italy
  *   Gianni Franchi <https://www.ensta-paris.fr/fr/gianni-franchi>, ENSTA Paris, France
  *   Andrei Bursuc <https://abursuc.github.io/>, valeo.ai <http://valeo.ai/>, France
  *   Arno Solin <http://arno.solin.fi/>, Aalto University, Finland
  *   Martin Trapp <https://trappmartin.github.io/>, Aalto University, Finland
  *   Rui Li <https://ruili-pml.github.io/>, Aalto University, Finland
  *   Angela Yao <https://www.comp.nus.edu.sg/~ayao/>, National University of Singapore, Singapore
  *   Wenlong Chen <https://chenw20.github.io/wenlongchen.github.io/>, Imperial College London, UK
  *   Ivor Simpson <https://profiles.sussex.ac.uk/p504012-ivor-simpson>, University of Sussex, UK
  *   Neill D. F. Campbell <https://ndfcampbell.org/>, University of Bath, UK
