
Workshop on Metrics & Test Methods for Human-Robot Teaming

Date:  23 March, 2020

Location:  2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK

Technical contacts:

  • jeremy.marvel [at] nist.gov (Jeremy Marvel)(301) 975-4592
  • shelly.bagchi [at] nist.gov (Shelly Bagchi)(301) 975-2455
  • megan.zimmerman [at] nist.gov (Megan Zimmerman)(301) 975-0452
  • murat.aksu [at] nist.gov (Murat Aksu)(301) 975-3735
  • brian.antonishek [at] nist.gov (Brian Antonishek)(301) 975-6033

Despite major advances in robot interfaces and user-centric robot design, achieving effective HRI remains a challenge for the field of robotics.  A key barrier to effective human-robot teaming across a multitude of domains is the scarcity of consistent test methods and metrics for assessing HRI effectiveness.  The need for validated metrology is driven by the demand for repeatable and consistent evaluations of HRI methodologies.

This full-day workshop at the 2020 ACM/IEEE HRI Conference will address the issues surrounding the development of test methods and metrics for evaluating HRI performance across the multitude of application domains, including industrial, social, medical, field and service robotics.  This workshop is driven by the need for establishing consistent standards for evaluating HRI in real-world applications, and how the interfaces, technologies, and underlying theories impact the effective collaboration of human-robot teams.  Specific goals include the following:

  • to develop and encourage the use of consistent test methods and metrics in evaluating HRI technologies, producing quality data sets of pragmatic applications, and validating human subject studies for HRI;
  • to establish benchmarks and baselines along a spectrum of key performance indicators for assessing and comparing novel HRI systems and applications;
  • to support a discussion about best practices in metrology and what features should be measured as the underlying theory of HRI advances;
  • to encourage the creation and sharing of high-quality, consistently-formatted datasets for HRI research; and
  • to promote the development of reproducible, metrics-oriented studies that seek to understand and model the human element of HRI teams.

Schedule and Format

The workshop schedule is shown below, with a more detailed schedule posted on the workshop website.  The structure follows that of last year's workshop: the first half of the day focuses on the technical aspects of metrology for effective, real-world HRI, and features a keynote speaker and technical presentations of peer-reviewed, contributed papers.  The second half of the day will focus on international efforts that explore repeatability, reproducibility, and traceability, and the effects of demographics, culture, and study design on the results of HRI research.

Discussion Topics

Presentations by contributing authors will focus on the documentation of the test methods, metrics, and data sets used in their respective studies.  Keynote and invited speakers will be selected from a targeted list of HRI researchers across a broad spectrum of application domains.  Poster session participants will be selected from contributors reporting late-breaking evaluations and their preliminary results.

Discussions are intended to highlight the various approaches, requirements, and opportunities of the research community toward assessing HRI performance, enabling advances in HRI research, and establishing trust in HRI technologies.  Specific topics of discussion will include:

  • reproducible and repeatable studies with quantifiable test methods and metrics;
  • human-robot collaboration and teaming test methods;
  • human data set content, transferability, and traceability;
  • HRI metrics (e.g., situation and cultural awareness);
  • human-machine interface metrics; and
  • industry-specific metrology requirements.

Peer-reviewed submissions by contributing authors will automatically be submitted to a special issue of the Transactions on Human-Robot Interaction, scheduled for publication in March of 2021.  A workshop report documenting the presentations, discussions, and resulting take-aways and action items will be produced and made publicly available.  An additional summary paper will be written for submission to the proceedings of the 2021 ACM/IEEE HRI conference.

Finally, this workshop is the second in a series of workshops leading toward formalized HRI performance standards.  The IEEE Robotics and Automation Society (RAS) will host and support this standardization effort.  The early workshops are intended to build community and consensus, and to establish a culture of repeatable, reproducible, metrology-based research in HRI.  A third workshop is planned for the 2021 ACM/IEEE International Conference on Human-Robot Interaction, and will specifically address the action items identified in this year's workshop.

Important Dates

14 February, 2020 : Submission deadline for extended abstracts

28 February, 2020 : Notification of acceptance for presentations

23 March, 2020 : Full-day workshop

Submissions

A link to the submission website will be provided soon.

Organizers

  • jeremy.marvel [at] nist.gov (Dr. Jeremy A. Marvel), National Institute of Standards and Technology (NIST), USA
  • shelly.bagchi [at] nist.gov (Shelly Bagchi), National Institute of Standards and Technology (NIST), USA
  • megan.zimmerman [at] nist.gov (Megan Zimmerman), National Institute of Standards and Technology (NIST), USA
  • murat.aksu [at] nist.gov (Murat Aksu), National Institute of Standards and Technology (NIST), USA
  • brian.antonishek [at] nist.gov (Brian Antonishek), National Institute of Standards and Technology (NIST), USA
  • yue6 [at] g.clemson.edu (Dr. Yue Wang), Clemson University
  • ross [at] semio.ai (Dr. Ross Mead), Semio
  • terry.fong [at] nasa.gov (Dr. Terry Fong), National Aeronautics and Space Administration (NASA), USA
  • hbenamor [at] asu.edu (Dr. Heni Ben Amor), Arizona State University

Sponsors

  • Arizona State University
  • Clemson University
  • NASA
  • Semio

Agenda

Coming soon.

Created December 6, 2019