GMBStaff
10 Nov 23
Amazon is investing millions of dollars in training a large language model to compete with models from Microsoft-backed OpenAI and from Alphabet. The model, codenamed Olympus, has two trillion parameters, making it one of the largest models currently in training. The team behind Olympus is led by Rohit Prasad, former head of Alexa, who brought together researchers from Alexa AI and the Amazon science team. Amazon's goal is to develop in-house models that enhance its offerings on Amazon Web Services (AWS) and attract enterprise clients that require high-performing AI.

Olympus is not Amazon's first effort: the company has trained smaller models such as Titan and has invested in the AI startup Anthropic, in which Google has separately committed to investing $2 billion.

Generative AI services have surged in popularity, with companies worldwide developing their own language and generative models. Examples include Alibaba's Tongyi 2.0 and Tongyi Wanxiang, Baidu's Ernie Bot, OpenAI's DALL·E 3, Meta Platforms' AudioCraft, Google's Bard, and Getty Images' generative AI model. Amazon's move to develop its own models reflects its commitment to expanding its AI capabilities and staying competitive in the industry.