Has AI solved its language problem?
Artificial neural networks, a type of AI that can figure out relationships within datasets without being explicitly programmed to do so, have been around for a while. But it was Google's development of the transformer neural network in 2017 that sparked the recent revolution in AI and language. Transformer models mimic certain processes of the human brain and power natural language processing. They are essentially the bedrock of AI language models like BART and the GPT family, which includes, of course, ChatGPT, the behemoth that dominates the conversation today.