[Apologies for cross-posting]

                        CALL FOR PARTICIPATION

            The 2nd International Competition on Knowledge
         Engineering for Planning and Scheduling ICKEPS 2007

              Hosted at the International Conference on
             Automated Planning and Scheduling ICAPS 2007
                    Providence, Rhode Island, USA
                        September 22--26, 2007
                     icaps07.icaps-conference.org


The International Competition on Knowledge Engineering for Planning
and Scheduling (ICKEPS) is a biennial event, hosted at the
International Conference on Automated Planning and Scheduling
(ICAPS). The objectives of the competition are to promote tools for
knowledge acquisition and domain modeling, to accelerate knowledge
engineering research in AI, and to encourage the development of
software platforms that promise more rapid, accessible, and effective
ways to construct reliable and efficient systems.  Registration is
free for ICAPS 2007 participants.

Knowledge Engineering (KE) for AI Planning has been defined as "the
process that deals with the acquisition, validation and maintenance of
planning domain models, and the selection and optimization of
appropriate planning machinery to work on them. Hence, KE processes
support the planning process: they comprise all of the off-line,
knowledge-based aspects of planning that are to do with the
application being built, and any on-line processes that cause changes
in the planner's domain model". We expect the competition to encourage
the development of tools across the whole KE area including domain
modeling, heuristic acquisition, planner-domain matching, and domain
knowledge validation.

The 2nd ICKEPS is hosted at ICAPS 2007. It builds on the previous
competition, held in 2005. Participants must submit papers of at
most 5 pages by April 18, 2007.  Participants are also encouraged to submit
full-length conference papers about their tools to the ICAPS Systems
Track.  Full details of the competition are available on the ICKEPS
website: 

    http://andorfer.cs.uni-dortmund.de/~edelkamp/ickeps/
 
IMPORTANT DATES:

    Submission deadline:           April 18, 2007
    Simulations available:         April 18, 2007
    Notification/Feedback:         June 11, 2007
    Camera-ready copy due date:    July 27, 2007
    Simulations close:             August 24, 2007
    Competition day:               TBD
    Competition results:           TBD

AREA OF SCOPE

The competition is open to authors of a tool, an integrated tools
environment, or a tools platform (below we simply refer to the entry
as 'the tool'), where the tool supports knowledge engineering for AI
P&S in at least one of the following categories:
  - knowledge formulation (the acquisition and encoding of domain
  structure and/or control heuristics)
  - planner configuration (fusing application knowledge with a P&S
  engine)
  - validation of the domain model (e.g. using visualization,
  analysis, re-formulation) or validation of the P&S system as a whole
  (e.g. using plan/schedule visualization)
  - knowledge refinement and maintenance (e.g. through automated
  learning/training, or a mixed-initiative P&S process)

Stand-alone planners/schedulers are NOT eligible, but can be a part of
a tool.  The competition entries must be distributable without any
associated fees.  The competition is otherwise open to all
participants. The authors must follow the competition structure. In
particular, at least one of the authors must present the system during
the conference.

COMPETITION STRUCTURE

Phase 1: Pre-conference

Authors must submit a short paper describing the tool (no more than 5
pages including screenshots) to the competition chairs by the
submission deadline (April 18, 2007).  Papers must be in US letter
size and use the AAAI style template.

Authors are encouraged to make the tool available for download from
the web prior to the competition in order for judges to evaluate tools
independently prior to the conference.  All tools will ultimately be
posted on the ICKEPS web page.

The short papers will be 'lightly' reviewed in order to
  (a) provide feedback to the contributors for updating their papers
      and identify matters to explain and address during the
      conference
  (b) contribute to the overall evaluation of the submissions

If the tool is accepted for the competition then the authors must
submit a camera-ready copy of the paper to the competition chairs by
the camera-ready copy deadline (July 27, 2007), and at least one
author must register for the competition via the standard ICAPS
registration.

Phase 2: Evaluation through Simulation

In addition to the pre-conference paper submission, the competition
will make available planning and scheduling simulations that
competitors will use to evaluate their tools.  These simulations will
be available via a web service.  Competitors will read a short text
description of each competition domain (including a description of
the simulation API), retrieve problem instances, submit plans for
each instance, and receive feedback describing the quality of each plan.
Each participant's interaction with simulators will be logged, and the
logs will be used as part of the tool evaluation.  Simulations will be
made available no later than April 18, 2007; competitors may evaluate
their systems until August 24, 2007.
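
The concrete simulation API will be documented with each domain; the
sketch below (in Python, with hypothetical endpoint URLs, paths, and
JSON field names, none of which are the official interface) merely
illustrates the kind of retrieve/solve/submit loop a competitor's
tool would drive against such a web service:

    # Illustrative only: the base URL, paths, and JSON shapes below are
    # hypothetical; the real simulation API is described with each domain.
    import json
    import urllib.request

    BASE = "http://simulations.example.org/ickeps"  # hypothetical host

    def get_json(url):
        """GET a URL and decode its JSON body."""
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)

    def submit_plan(instance_id, plan):
        """POST a plan (a list of action strings) for one problem
        instance and return the simulator's quality feedback."""
        body = json.dumps({"instance": instance_id, "plan": plan}).encode()
        req = urllib.request.Request(
            BASE + "/submit", data=body,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    def solve(instance):
        """Placeholder for the competitor's own tool/planner."""
        return ["noop"]        # a real entry produces a valid plan here

    # Retrieve the instances for one domain, plan each, submit, and
    # record the simulator's quality feedback.
    for instance in get_json(BASE + "/domains/example/instances"):
        feedback = submit_plan(instance["id"], solve(instance))
        print(instance["id"], feedback["quality"])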

Phase 3: At the conference

At least one author per tool must come to the conference and
demonstrate the system in person.  This requirement entails giving a
short talk during the competition workshop and being ready to
demonstrate the system and answer questions during the demonstration
session.  The competitors must bring their tools installed on their
own laptops.  Competitors should tailor their presentations to discuss
how their tools helped them solve the simulated domains, but should
also discuss other facets of their tools not exercised by the
simulations.

The competition will take place during a one-day workshop, where
competitors will present their systems in short talks (15 minutes
each) followed by a system demonstration session.  The workshop and
demonstrations will be open to everyone attending the conference.

The judges will decide the results using the reviews of the short
papers, quantitative results from the competitors' interaction with
the simulations, and the system presentations and demonstrations. The
proposed criteria to be used by the judges are presented below.  The
final rules will be posted on the competition web page.  There will be
a presentation of the results and winners' prizes either during the
conference reception or during/before the conference dinner.  Final
scores for each entry will also be posted on the competition web page
after the competition.
 

EVALUATION CRITERIA USED IN THE COMPETITION:

  - Support potential: what potential does the tool have for
  supporting the processes within the scope of the competition?  Will
  the tool save time and resources?
  - Scope: how much of the competition's defined scope does the tool
  cover?
  - Usability: can the tool be easily used, accessed and/or
  configured?  Could non-experts in planning use it?
  - Interoperability: can the tool be integrated with other P&S
  technology?  Are its interfaces well defined?  Can the software be
  easily used with other P&S software, or easily combined with
  third-party planners?
  - Innovation: what is the quality of the scientific and technical
  innovations that underlie the software?
  - Wider comparison: how does the tool compare with KE software in
  other areas of AI?  For example, could the software be subsumed by
  some other existing KBS KE tool?
  - Build quality: does the software appear robust?  Has the software
  been well tested?
  - Relevance: to what degree does the tool address problems peculiar
  to KE for P&S?  Is the software relevant or applicable to real-world
  applications?
  - Domain simulation applicability: how well did the competitors
  address the simulation domains using their tools?  On how many of
  the simulated domains was the tool tried?  How long did it take the
  competitors to generate valid plans for the domains?  How many
  problem instances were solved?  What was the quality of the plans
  generated?  (An illustrative log-summary sketch follows this list.)
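
As one illustration of how such statistics could be extracted from
the interaction logs, the sketch below summarizes solved instances
and mean plan quality per domain.  The one-JSON-record-per-line log
format and its field names are invented purely for illustration; the
actual log format and final scoring rules will be posted on the
competition web page.

    # Illustrative only: summarize a competitor's simulator-interaction
    # log, assuming one hypothetical JSON record per plan submission.
    import json
    from collections import defaultdict

    def summarize(log_lines):
        """Per-domain count of solved instances and mean plan quality."""
        stats = defaultdict(lambda: {"solved": 0, "qualities": []})
        for line in log_lines:
            rec = json.loads(line)  # e.g. {"domain": ..., "valid": ...}
            if rec["valid"]:        # only valid plans count as solved
                entry = stats[rec["domain"]]
                entry["solved"] += 1
                entry["qualities"].append(rec["quality"])
        out = {}
        for dom, e in stats.items():
            mean_q = sum(e["qualities"]) / len(e["qualities"])
            out[dom] = {"solved": e["solved"], "mean_quality": mean_q}
        return out

    log = ['{"domain": "example", "valid": true, "quality": 0.8}',
           '{"domain": "example", "valid": false, "quality": 0.0}']
    print(summarize(log))  # {'example': {'solved': 1, 'mean_quality': 0.8}}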

ICKEPS CHAIRS:

  Stefan Edelkamp, University of Dortmund, Germany
  stefan.edelkamp at cs.uni-dortmund.de

  Jeremy Frank, NASA Ames Research Center, USA
  frank at email.arc.nasa.gov
