Bidirectional Encoder Representations from Transformers (BERT) is a general-purpose model that can be leveraged for nearly every text-based machine learning task! Learn how we used the almighty BERT for named entity recognition.
Interspeech 2019, held in Graz, Austria, saw experts from around the world gather to discuss some of the most recent advances in technologies at the crossroads of speech and language. At Lab41 we were excited to co-host, together with SRI International, one of 10 special sessions and challenges: the VOiCES from a Distance Challenge.
Generative Adversarial Networks (GANs) have become increasingly popular in machine learning due to their ability to mimic any distribution of data. By pitting two neural networks against each other, they are able to learn ever more subtle differences between real and synthetic data, which in turn drives the generation of ever more life-like examples, otherwise known as deep fakes.
Sharing some highlights from the first day of the 57th annual meeting of the Association for Computational Linguistics (ACL), held July 28-August 2, 2019.
Lab41 discusses next steps for its VOiCES project.
Lab41 discusses VOiCES, the first open-source speech dataset recorded in real environments with far-field microphones, capturing reverberant acoustics and background noise.