Natural language processing

MIT researchers make language models scalable self-learners

Socrates once said: “It is not the size of a thing, but the quality that truly matters. For it is in the nature of substance, not its volume, that true value is found.” Does size always matter for large language models (LLMs)? In a technological landscape bedazzled by LLMs taking center stage, a team of MIT Computer Science and Artificial…


Jacob Andreas and Mingda Li honored with Junior Bose Award for Excellence in Teaching

Each year, MIT’s School of Engineering gives the Junior Bose Award to a junior faculty member who has made outstanding contributions as an educator. The award is given to a faculty member who is up for promotion from assistant professor to associate professor without tenure. The 2023 Junior Bose Award has been given to two outstanding educators: Jacob Andreas,…


3 Questions: Jacob Andreas on large language models

Words, data, and algorithms combine, An article about LLMs, so divine. A glimpse into a linguistic world, Where language machines are unfurled. It was a natural inclination to task a large language model (LLM) like ChatGPT with creating a poem that delves into the topic of large language models, and subsequently utilize said poem as an introductory piece for this…

