At the Mobile World Congress in Barcelona, Spain, Amazon's VP of Artificial General Intelligence, Vishal Sharma, emphasized the company's widespread adoption of AI, stating that "there's scarcely a part of the company that is unaffected by AI." Sharma, a former AI entrepreneur, pointed to Amazon's deployment of AI in many forms, including its own foundation models on AWS, robotics in its warehouses, and the Alexa consumer product.
Sharma's comments came during an interview with TechCrunch's Mike Butcher at the 4YFN startup conference, where he described the extent of Amazon's AI integration. The company operates over 750,000 robots in its warehouses, handling tasks from picking items to operating autonomously. Alexa, meanwhile, is reportedly the most widely deployed home AI product in existence. "There's no part of Amazon that's untouched by generative AI," Sharma said.
Last December, Amazon Web Services (AWS) announced Nova, a new family of four text-generating models, which are evaluated against public benchmarks. Sharma explained that these models cater to diverse use cases, from video generation to the quick, predictable responses required by a product like Alexa. However, he pushed back on the idea that open-source models might reduce compute needs, arguing that many scenarios will demand ever more intelligence.
Amazon has also launched Bedrock, a service within AWS that allows companies and startups to mix and match various foundation models, including China's DeepSeek. Sharma emphasized that Amazon is committed to providing choice and to adopting trends and technologies that benefit customers. When asked about potential pressure from open-source models emerging from China, Sharma remained unfazed, stating that Amazon is "relaxed" about deploying these models on AWS.
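To make "mix and match" concrete: Bedrock exposes a unified Converse-style interface, so switching between foundation models is largely a matter of changing the model ID in an otherwise identical request. The sketch below assumes this request shape, and the specific model ID strings are illustrative assumptions rather than verified catalog entries.

```python
# Sketch (assumed request shape): Bedrock's Converse API uses one payload
# format for every model, so "mixing and matching" foundation models mostly
# means swapping the modelId string. Model IDs below are illustrative
# assumptions, not verified Bedrock catalog identifiers.

def build_converse_request(model_id: str, prompt: str) -> dict:
    """Build a Bedrock Converse-style request payload for any model."""
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": prompt}]},
        ],
    }

# The same payload shape can target an Amazon Nova model or DeepSeek:
nova_request = build_converse_request("amazon.nova-lite-v1:0", "Summarize this memo.")
deepseek_request = build_converse_request("us.deepseek.r1-v1:0", "Summarize this memo.")

# With AWS credentials configured, the actual call would look like:
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.converse(**nova_request)
```

Only the `modelId` differs between the two requests; the surrounding application code stays the same, which is the practical meaning of the "choice" Sharma describes.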
The conversation also touched on the recent controversy surrounding Trump and Zelensky, and the subsequent cooling of relations between the US Administration and many European nations. Sharma acknowledged that this issue was outside his area of expertise but hinted that some companies might adjust their GenAI strategies in response to geopolitical tensions, citing the role of incentives in driving technical innovation.
On the subject of compute resources, Sharma commented on Elon Musk's xAI, which recently released its flagship AI model, Grok 3, trained in an enormous data center in Memphis containing around 200,000 GPUs. Sharma expressed his personal opinion that compute will remain a crucial part of the conversation for a long time to come. Amazon, for its part, is building a massive AI compute cluster on its Trainium 2 chips in partnership with Anthropic, a company in which Amazon has invested $8 billion.
When asked if Amazon was caught off guard by the emergence of OpenAI's ChatGPT in late 2022, Sharma disagreed, highlighting Amazon's 25-year history of working on AI. He pointed to the roughly 20 different AI models powering Alexa, with billions of parameters devoted to language processing. Sharma emphasized that Amazon has been exploring AI for a considerable time and is well-positioned to adapt to emerging trends and technologies.
In conclusion, Amazon's extensive AI deployment, commitment to providing choice, and significant investments in compute resources demonstrate the company's ambition to remain at the forefront of the AI landscape. As geopolitical tensions continue to evolve, it will be interesting to observe how companies like Amazon adapt their GenAI strategies to respond to emerging incentives and challenges.