Following the success of the 2019 Conversational Telephone Speech (CTS) Speaker Recognition Challenge, which received 1347 submissions from 67 academic and industrial organizations, the US National Institute of Standards and Technology (NIST) is organizing the 2020 CTS Challenge, the next iteration of an ongoing series of speaker recognition evaluations that NIST has conducted since 1996.
Like the 2019 CTS Challenge, the 2020 evaluation will feature a leaderboard-style challenge offering an open/unconstrained training condition, but this time using CTS recordings extracted from multiple data sources containing multilingual speech. Unlike the 2019 CTS Challenge, however, no development set will be released.
The objectives of the evaluation series are (1) for NIST to effectively measure system-calibrated performance of the current state of technology, (2) to provide a common test bed that enables the research community to explore promising new ideas in speaker recognition, and (3) to support the community in their development of advanced technology incorporating these ideas. The evaluations are intended to be of interest to all researchers working on the general problem of text-independent speaker recognition. To this end, the evaluations are designed to focus on core technology issues and to be simple and accessible to those wishing to participate.
Participation in the 2020 CTS Challenge is open to all who find the evaluation of interest and are able to comply with the evaluation rules set forth in the evaluation plan. This page will be updated once the evaluation plan becomes available.
2020 NIST CTS Challenge Evaluation Plan
Please visit: https://sre.nist.gov
Results on the Test set will be published here periodically (see Disclaimer below).
Rank | Team | Set | Timestamp (YYYYMMDD-HHMMSS) | EER [%] | Min Cost (MIN_C) | Actual Cost (ACT_C) |
---|---|---|---|---|---|---|
1 | THUEE | Test | 20210921-005400 | 2.53 | 0.061 | 0.068 |
2 | I4U | Test | 20210528-015904 | 2.91 | 0.066 | 0.07 |
3 | STC | Test | 20210826-020705 | 2.81 | 0.067 | 0.081 |
4 | TEAM-CDPE-28 | Test | 20210616-024452 | 3.2 | 0.081 | 0.087 |
4 | JHU-MIT | Test | 20210413-171216 | 3.19 | 0.083 | 0.087 |
6 | Veridas | Test | 20210729-055711 | 3.09 | 0.09 | 0.092 |
7 | ROXANNE | Test | 20211021-131243 | 2.84 | 0.09 | 0.094 |
8 | TEAM-MGQG-50 | Test | 20210930-003746 | 2.33 | 0.089 | 0.095 |
9 | AAP | Test | 20210621-103112 | 2.92 | 0.1 | 0.101 |
10 | TEAM-QSYF-27 | Test | 20210625-040137 | 3.31 | 0.114 | 0.116 |
11 | BGU | Test | 20210630-030728 | 3.45 | 0.127 | 0.134 |
12 | LIA | Test | 20210405-115008 | 3.23 | 0.108 | 0.135 |
13 | ABC | Test | 20210916-101147 | 3.23 | 0.119 | 0.14 |
14 | XMUSPEECH | Test | 20210702-015042 | 3.78 | 0.142 | 0.146 |
15 | I2R | Test | 20210203-223806 | 3.29 | 0.131 | 0.161 |
16 | TEAM-ZBSM-52 | Test | 20210729-075707 | 3.88 | 0.154 | 0.164 |
17 | BiometricVox | Test | 20201031-061334 | 3.95 | 0.144 | 0.203 |
18 | Elektronika | Test | 20210605-105329 | 5.1 | 0.213 | 0.22 |
19 | NSYSU_CHT | Test | 20210702-035631 | 4.07 | 0.177 | 0.227 |
20 | TEAM-QTUY-05 | Test | 20210218-041325 | 5.92 | 0.24 | 0.25 |
21 | TEAM-IZYX-36 | Test | 20211026-013456 | 3.93 | 0.181 | 0.251 |
22 | dBLab | Test | 20210210-044741 | 6.9 | 0.293 | 0.333 |
23 | TEAM-FKFK-39 | Test | 20210216-103739 | 5.94 | 0.244 | 0.342 |
24 | TJU- | Test | 20210601-023932 | 5.04 | 0.218 | 0.43 |
25 | TEAM-FYYA-59 | Test | 20211004-133818 | 9.17 | 0.455 | 0.608 |
26 | TIT_SER | Test | 20211025-012155 | 15.55 | 0.539 | 0.863 |
27 | LEAP | Test | 20210820-082901 | 4.93 | 0.207 | 0.934 |
28 | GRD | Test | 20201021-022551 | 20.31 | 0.733 | 0.951 |
29 | HIT | Test | 20211021-103833 | 12.6 | 0.606 | 1 |
29 | katon | Test | 20210812-032522 | 49.32 | 1 | 1 |
29 | TEAM-QLSB-38 | Test | 20210506-204611 | 50 | 1 | 1 |
32 | TEAM-MQKX-64 | Test | 20210902-175153 | 48.95 | 1 | 1.033 |
33 | TEAM-BPZI-72 | Test | 20211027-103100 | 27.55 | 0.993 | 1.079 |
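For reference, EER is the equal error rate, the operating point at which the miss and false-alarm rates are equal, while MIN_C and ACT_C are normalized detection costs: MIN_C is minimized over all decision thresholds, and ACT_C is evaluated at the system's own fixed threshold, so ACT_C ≥ MIN_C and the gap reflects calibration quality. The sketch below shows one way to compute EER and the minimum normalized cost from target and nontarget trial scores; the cost parameters (`p_target`, `c_miss`, `c_fa`) are illustrative placeholders, not the official values, which are defined in the evaluation plan.

```python
import numpy as np

def eer_and_min_dcf(target_scores, nontarget_scores, p_target=0.01,
                    c_miss=1.0, c_fa=1.0):
    """Compute the equal error rate (EER) and the minimum normalized
    detection cost by sweeping the decision threshold over all scores.

    Note: p_target, c_miss, and c_fa are illustrative placeholders;
    the official parameters are specified in the evaluation plan.
    """
    scores = np.concatenate([target_scores, nontarget_scores])
    labels = np.concatenate([np.ones(len(target_scores)),
                             np.zeros(len(nontarget_scores))])
    labels = labels[np.argsort(scores)]
    n_tgt, n_non = len(target_scores), len(nontarget_scores)
    # P_miss(t): fraction of targets scoring below threshold t;
    # P_fa(t): fraction of nontargets scoring at or above t.
    p_miss = np.concatenate([[0.0], np.cumsum(labels) / n_tgt])
    p_fa = np.concatenate([[1.0], 1.0 - np.cumsum(1 - labels) / n_non])
    # EER: operating point where the two error rates cross.
    i = np.argmin(np.abs(p_miss - p_fa))
    eer = (p_miss[i] + p_fa[i]) / 2.0
    # Normalized DCF: detection cost divided by the cost of the best
    # trivial system (always accept or always reject).
    dcf = c_miss * p_target * p_miss + c_fa * (1 - p_target) * p_fa
    norm = min(c_miss * p_target, c_fa * (1 - p_target))
    return eer, dcf.min() / norm
```

ACT_C would be computed the same way but with `p_miss` and `p_fa` taken at the single fixed threshold the system committed to, rather than at the cost-minimizing one.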
Participants may publish the leaderboard results unaltered, but they must not make advertising claims about their standing or ranking in the evaluation or about winning the evaluation, nor claim NIST or U.S. Government endorsement of their system(s) or commercial product(s). See the evaluation plan for more details on the participation rules for the NIST CTS Challenge.
For more information about the challenge, please send questions to sre_poc [at] nist [dot] gov. For CTS Challenge discussion, please visit our Google Group.