Is Hardware the Key to Advancing Natural Language Processing?

MIT SpAtten

If you think learning a new language is hard, imagine the challenges hardware and software engineers face when using CPUs and GPUs to process extensive language data. Natural language processing (NLP) attempts to bridge this gap between language and computing. Researchers at MIT have created SpAtten, an algorithm-architecture co-design that reduces attention computation and memory access in NLP systems.
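The core idea behind attention pruning, dropping tokens that contribute little to the attention result so later layers do less work, can be sketched in a few lines. The snippet below is a minimal NumPy illustration, not SpAtten's actual implementation; the function name, the keep_ratio parameter, and the toy dimensions are all assumptions for demonstration.

```python
import numpy as np

def prune_tokens(attention_probs, keep_ratio=0.5):
    """Illustrative token pruning in the spirit of attention pruning.

    attention_probs: array of shape (heads, queries, keys), softmax-normalized.
    A token's importance is the total attention it receives, summed over
    heads and query positions; low-importance tokens are dropped.
    """
    importance = attention_probs.sum(axis=(0, 1))        # (keys,)
    keep = max(1, int(len(importance) * keep_ratio))
    top_k = np.argsort(importance)[-keep:]               # most-attended tokens
    return np.sort(top_k)                                # preserve sequence order

# Toy example: 4 heads, sequence of 8 tokens
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 8, 8))
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
print(prune_tokens(probs, keep_ratio=0.5))  # indices of surviving tokens
```

Because the surviving tokens form a shorter sequence, every subsequent attention layer pays a smaller quadratic cost, which is where the computation and memory savings come from.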


Google trained a trillion-parameter AI language model

Parameters are the key to machine learning algorithms: they're the part of the model learned from historical training data. In the language domain, the correlation between parameter count and model sophistication has held up remarkably well. For example, OpenAI's GPT-3, one of the largest language models ever trained at 175 billion parameters, can make primitive analogies, generate recipes, and even complete basic code.
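To make that scale concrete, here's a back-of-the-envelope parameter count for a GPT-style decoder-only transformer. The formulas are the standard rough ones (ignoring biases, layer norms, and positional embeddings), and the configuration values are GPT-3's published ones.

```python
def transformer_param_count(vocab, d_model, n_layers, d_ff):
    """Rough parameter count for a GPT-style decoder-only transformer.

    Exact counts vary with implementation details (biases, layer norms,
    tied embeddings), but this captures the dominant terms.
    """
    embed = vocab * d_model            # token embedding matrix
    attn = 4 * d_model * d_model       # Q, K, V, and output projections
    ffn = 2 * d_model * d_ff           # two feed-forward projections
    return embed + n_layers * (attn + ffn)

# GPT-3-like configuration: 96 layers, d_model 12288, d_ff = 4 * d_model
print(f"{transformer_param_count(50257, 12288, 96, 4 * 12288):,}")
# ~175 billion, matching the figure above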


Facebook’s AI matches people in need with those willing to assist

Facebook says it has deployed a feature in its Community Help hub to make it easier for users to assist each other during the pandemic. As of this week, AI will detect when a public post on News Feed is about needing or offering help and will surface a suggestion to share it on Community Help.
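Facebook hasn't published details of the detection model, but the underlying task, deciding whether a post is a request for help or an offer of help, is a standard text-classification problem. Here's a toy scikit-learn sketch of that task; the example posts and labels are invented purely for illustration and this is not Facebook's model.

```python
# Toy request-vs-offer classifier; the four posts and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "Does anyone have a spare mask they could drop off?",
    "I can pick up groceries for anyone who is isolating.",
    "Looking for someone to help me move a couch this weekend.",
    "Happy to deliver meals to elderly neighbors in the area.",
]
labels = ["need", "offer", "need", "offer"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(posts, labels)

print(clf.predict(["Can someone lend me a ladder?"]))  # likely 'need'
```

In a deployed system, a prediction like this would trigger the suggestion to cross-post to Community Help rather than act automatically.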
