LLMs will transform medicine, media and more

[Illustration: a toolbox filled with regular tools and speech bubbles]

2024-08-13

The AI that is hogging so much of the world’s attention now—and sucking up huge amounts of computing power and electricity—is based on a technique called deep learning. In deep learning, linear algebra (specifically, matrix multiplications) and statistics are used to extract, and thus learn, patterns from large datasets during the training process. Large language models (LLMs) like Google’s Gemini or OpenAI’s GPT have been trained on troves of text, images and video and have developed many abilities, including “emergent” ones they were not explicitly trained for (with promising implications, but also worrying ones). More specialised, domain-specific versions of such models now exist for images, music, robotics, genomics, medicine, climate, weather, software-coding and more.
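To make the phrase “matrix multiplications” concrete, here is a minimal sketch in Python with NumPy. The layer sizes and random weights are illustrative assumptions, not those of any real model: a single layer of a deep network is just a matrix multiply followed by a simple nonlinearity, and training adjusts the entries of those matrices until the network’s outputs match the patterns in the data.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy dimensions: 4 input features, 8 hidden units, 2 outputs.
    # In a real model these matrices are learned from data; here they
    # are random placeholders.
    W1 = rng.normal(size=(4, 8))
    W2 = rng.normal(size=(8, 2))

    def forward(x):
        # One hidden layer: a matrix multiplication, then a ReLU
        # nonlinearity, then another matrix multiplication.
        h = np.maximum(0, x @ W1)
        return h @ W2

    x = rng.normal(size=(4,))   # a single 4-feature input
    print(forward(x))           # its 2-dimensional output

Stack dozens of such layers, make the matrices large, and feed in billions of examples, and the same basic operation scales up into the models described above.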


