[UAI] [CfP] Now running: IEEE FG 2020 Chalearn LAP Challenge: Identity-preserved Human Detection (IPHD)

2019-11-20 Thread Albert Clapés i Sintes
*** Please accept our apologies if you receive multiple copies of this CfP
***

This is a message regarding a change of status of our competition
*IEEE FG 2020 Chalearn Looking at People Challenge on Identity-preserved
Human Detection (IPHD)* at http://chalearnlap.cvc.uab.es/challenge
/34/description.

*The competition is now up and running!*

*CONTEST DESCRIPTION*

For the competition, we ask participants to perform human detection in
depth and/or thermal images. Human detection in images/video is a
challenging computer vision problem with applications in human-computer
interaction, patient monitoring, surveillance, and autonomous driving, to
mention just a few. In some applications, however, preserving people's
privacy is a major concern for both users and the companies/institutions
involved; most notably, the unintended revelation of subjects' identities
is perhaps the greatest peril. While video data from RGB cameras are
massively available to train powerful detection models, the nature of
these data may also allow unpermitted third parties to access them and try
to identify the observed subjects. We argue that moving away from visual
sensors that capture identity information in the first place is the safest
bet. However, the scarcity of such privacy-safe data hinders the training
of large deep-learning models, which in turn limits the popularity of
these sensors.

For this competition, we offer a freshly recorded multimodal image dataset
consisting of over 100K spatiotemporally aligned depth-thermal images of
different people recorded in public and private spaces: a street, a
university (cloister, hallways, and rooms), a research center, libraries,
and private houses. In particular, we used a RealSense D435 for depth and
a FLIR Lepton v3 for thermal. Given the noisy nature of such a commercial
depth camera and the low thermal image resolution, the subjects are hardly
identifiable. The dataset contains a mix of close-range in-the-wild
pedestrian scenes and indoor ones with people performing scripted
scenarios, thus covering a larger space of poses, clothing, illumination,
background clutter, and occlusions. The scripted scenarios include basic
actions such as sitting on the sofa, lying on the floor, interacting with
kitchen appliances, cooking, eating, working on the computer, and talking
on the phone. The camera is not necessarily static; it is sometimes held
by a person. The data were originally collected as videos of different
durations (from seconds to hours), skipping frames where no movement was
observed. The ordering of frames has been removed to make this an image
dataset (the only video-level information provided will be the video ID).

There are *three tracks* associated with this contest:

1. *Depth track*. Given the provided depth frames (and bounding box
ground-truth annotations), the participants will be asked to develop a
depth-based human detection method. Depth cameras are cost-effective
devices that provide geometric information about the scene at a resolution
and frame acquisition speed comparable to RGB cameras. The downside is
their noisiness at large distances. The method developed by the
participants will need to output, for each frame, a list of bounding boxes
(with confidence scores), one for each person in the frame. The
performance on depth image-based human detection will be evaluated.

2. *Thermal track*. Given the provided thermal frames (and bounding box
ground-truth annotations), the participants will be asked to develop a
thermal-based human detection method. Thermal cameras provide temperature
readings from the scene. They are less noisy than depth cameras, but at a
comparable price they offer a much lower image resolution. The method
developed by the participants will need to output, for each frame, a list
of bounding boxes (with confidence scores), one for each person in the
frame. The performance on thermal image-based human detection will be
evaluated.

3. *Depth-Thermal Fusion track*. Given the provided aligned depth-thermal
frames (and bounding box ground-truth annotations), the participants will
be asked to develop a multimodal (depth and thermal) human detection
method. Both modalities have been temporally and spatially aligned;
participants can therefore try to exploit their potential complementarity
with a proper fusion strategy. The participants will need to output, for
each frame, a list of bounding boxes (with confidence scores), one for
each person in the frame. The performance on multimodal human detection
will be evaluated.
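The exact evaluation scripts are provided on the CodaLab pages; still, bounding-box detection tasks of this kind are conventionally scored by matching predicted boxes to ground truth via intersection-over-union (IoU) before computing a precision-based metric. A minimal IoU sketch (the function name and the (x1, y1, x2, y2) box format are illustrative assumptions, not the organizers' specification):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    # Coordinates of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A prediction is then typically counted as a true positive when its IoU with an unmatched ground-truth box exceeds a threshold (often 0.5); consult the per-track evaluation scripts on CodaLab for the exact protocol used here.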

The competition will be run on the CodaLab platform. The participants will
register through the platform, where they will be able to access the
different tracks (with the corresponding data, evaluation scripts,
leaderboards, etc.).

The CodaLab competition page can be found at:
 http://chalearnlap.cvc.uab.es/challenge/34/description.


*ASSOCIATED EVENTS*
The participants will be invited to submit their papers to the associated
event:
*IEEE FG 2020 Workshop on Privacy-aware Comp

[UAI] [Percom 2020] Work-in-Progress (WiP) Call for Papers

2019-11-20 Thread Jonathan Liono
The Percom 2020 Work-in-Progress (WiP) Call for Papers



The Work-in-Progress (WiP) session provides an opportunity to present and 
discuss new challenges and visions, showcase early research results, and 
explore novel research directions. The specific aim of the WiP session is to 
provide a forum for timely presentation, discussion, and feedback on novel, 
controversial, and thought-provoking ideas. All selected papers will be 
presented as posters in a special conference session and published in the 
PerCom Workshops proceedings, which will be published by IEEE and included in 
the IEEE Xplore digital library. Contributions are solicited in all areas of 
pervasive computing research and applications. Papers are expected to report on 
early or ongoing research on any aspect of pervasive computing and 
communications. Preliminary experimental results are appreciated.



Submission Guidelines



Authors are requested to submit original, unpublished manuscripts in standard 
IEEE proceedings format (two-column format with a 4-page limit including 
figures, tables, and references) as PDF files that include contact information 
for all authors. All manuscripts must be registered and submitted through 
the EDAS submission site.



New this year! The WiP Chairs and the WiP Program Committee will also select up 
to three finalists for the best WiP contribution. Each finalist will give an 
additional 10-minute presentation as part of the main conference, in addition 
to their poster presentation. The committee will select the best WiP 
contribution from among these finalists, and the award will be presented at 
the conference.



Important Dates

Submission deadline: November 24, 2019

Notification of acceptance: December 23, 2019

Camera ready deadline: January 31, 2020



For further information about the Work In Progress portion of the program, 
please see http://www.percom.org/call-for-wip or contact the WiP Co-Chairs 
Petteri Nurmi (ptnu...@cs.helsinki.fi) and James Xi Zheng 
(james.zh...@mq.edu.au).


___
uai mailing list
uai@ENGR.ORST.EDU
https://secure.engr.oregonstate.edu/mailman/listinfo/uai


[UAI] [journals] Final Reminder (due 15 Dec): Special Issue on "Robotics in Extreme Environments"

2019-11-20 Thread Linda Wang

Dear Colleagues,

Robotics is currently running a Special Issue entitled "Robotics in
Extreme Environments". Prof. Rustam Stolkin, Royal Society Industry
Fellow for Nuclear Robotics at the University of Birmingham, is serving
as Guest Editor for this issue.

We are pleased to invite you to submit your papers to this Special Issue
of Robotics, "Robotics in Extreme Environments".

Extreme environments can be defined as those that are so hazardous that
it would be undesirable or impossible to send a human worker into the
environment. Such applications are of special importance to the robotics
research community because they demand the use of robots and often
cannot be addressed at all without major new advances in robotics. In
contrast, while research on, e.g., household helper robots is certainly
interesting, such jobs can still, at present, be done by human workers
if needed...

For further reading, please follow the link to the Special Issue Website
at: https://www.mdpi.com/journal/robotics/special_issues/REE.

The submission deadline is 15 December 2019. You may send your
manuscript now or up until the deadline. Submitted papers should not be
under consideration for publication elsewhere. We also encourage authors
to send a short abstract or tentative title to the Editorial
Office in advance (robot...@mdpi.com).

Robotics is an online open-access journal dedicated to both the
foundations of artificial intelligence, biomechanics, and mechatronics,
and to real-world applications of robotic perception, cognition, and
action. The journal is presently covered by the following indexing
services: Scopus (CiteScore 1.53), Emerging Sources Citation Index (ESCI
- Web of Science), Directory of Open Access Journals (DOAJ), INSPEC
(IET), and the DBLP Computer Science Bibliography.

For further details on the submission process, please see the
instructions for authors at the journal website
(http://www.mdpi.com/journal/robotics/instructions).

We look forward to hearing from you.

Best regards,
Linda Wang
Managing Editor

To submit to the journal click here:
http://susy.mdpi.com/user/manuscripts/upload?journal=robotics

MDPI - Multidisciplinary Digital Publishing Institute
www.mdpi.com

St. Alban-Anlage 66
4052 Basel
Switzerland

Tel. +41 61 683 77 34
Fax +41 61 302 89 18
