Bidirectional Encoder Representations from Transformers (BERT) is a general-purpose model that can be leveraged for nearly every text-based machine learning task! Learn how we used the almighty BERT for named entity recognition.
Interspeech 2019, held in Graz, Austria, saw experts from around the world gather to discuss some of the most recent advances in technologies at the crossroads of speech and language. At Lab41 we were excited to co-host, together with SRI International, one of 10 special sessions and challenges: the VOiCES from a Distance Challenge.
Cyphercat, a research project out of IQT Labs, helps determine if training data is safe. Cyphercat measures the privacy risks that arise from sharing access to models trained on private data and enables safe and informed model sharing.
Generative Adversarial Networks (GANs) have become increasingly popular in machine learning due to their ability to approximate complex data distributions. By pitting two neural networks against each other, they learn ever more subtle differences between real and synthetic data, which in turn drives the generation of ever more lifelike examples, popularly known as deep fakes.
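The adversarial loop described above can be sketched in a few lines of PyTorch. This is a minimal toy illustration, not code from the post: the data distribution (a 1-D Gaussian), the network sizes, and the hyperparameters are all arbitrary assumptions chosen to keep the example small.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator maps 8-D noise to a 1-D sample; discriminator scores
# how "real" a 1-D sample looks. (Architectures are illustrative.)
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(200):
    real = 4 + 1.25 * torch.randn(32, 1)   # "real" data: samples from N(4, 1.25^2)
    fake = G(torch.randn(32, 8))           # synthetic samples from the generator

    # Discriminator step: label real samples 1, generated samples 0.
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    d_loss.backward()
    opt_d.step()

    # Generator step: try to make the discriminator label fakes as real.
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(32, 1))
    g_loss.backward()
    opt_g.step()
```

The two optimizers pull in opposite directions: the discriminator's loss rewards telling the distributions apart, while the generator's loss rewards fooling it, so over training the generator's samples drift toward the real distribution.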
Sharing some highlights from the first day of the 57th annual meeting of the Association for Computational Linguistics (ACL), held July 28-August 2, 2019.
Lab41 discusses next steps on their VOiCES project.