Hear My Words: Transformer-Based Models for NLP and Speech Recognition

Posted February 11, 2020 | Technology

Transformer-based models have significant implications across the entire field of natural language processing (NLP), from speech recognition to natural language generation (NLG), natural language understanding (NLU), machine translation, and text analysis. Consequently, tools for developing transformer-based models have become popular among researchers and developers building NLP applications.
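At the core of every transformer-based model is the self-attention mechanism, which lets each token in a sequence weigh every other token when computing its representation. The sketch below is a minimal, illustrative NumPy implementation of scaled dot-product attention; the function and variable names are ours, not any particular library's API:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Illustrative scaled dot-product attention.

    Q, K, V are (seq_len, d) arrays of query, key, and value vectors.
    Returns a (seq_len, d) array where each row is an attention-weighted
    mix of the value vectors.
    """
    d = Q.shape[-1]
    # Pairwise token similarities, scaled to keep softmax gradients stable
    scores = Q @ K.T / np.sqrt(d)
    # Row-wise softmax (subtracting the max for numerical stability)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy example: 4 tokens, 8-dimensional embeddings, self-attention (Q = K = V)
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, Q, Q)
print(out.shape)  # (4, 8)
```

Production transformer models stack many such attention layers (with multiple heads, learned projections, and feed-forward sublayers), but this single operation is the building block that the tools mentioned above implement at scale.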

About The Author
Curt Hall
Curt Hall is a Cutter Expert and a member of Arthur D. Little’s AMP open consulting network. He has extensive experience as an IT analyst covering technology and application development trends, markets, software, and services. Mr. Hall’s expertise includes artificial intelligence (AI), machine learning (ML), intelligent process automation (IPA), natural language processing (NLP) and conversational computing, blockchain for business, and customer…