Abstract: This paper presents our recent developments in the automatic processing of sign language corpora using the Hamburg Sign Language Notation System (HamNoSys). We designed an automated tool that converts HamNoSys annotations into numerical labels for a defined set of initial features describing body and hand positions. The proposed numerical multilabels greatly simplify the structure of HamNoSys annotations without significant loss of gloss meaning. These multilabels can potentially be used to feed machine learning models, which would accelerate the development of vision-based sign language recognition. In addition, the tool can assist experts during the annotation process by helping to identify semantic errors. The code and sample annotations are publicly available at https://github.com/hearai/parse-hamnosys.