On Thursday, Amazon's cloud computing division unveiled a set of tools to help other businesses build chatbots and AI-powered image-generation services of their own. Beyond integrating AI chatbots into their own consumer offerings, such as their search engines, Google and Microsoft are also eyeing a sizable market for the underlying technology: selling it to other businesses through their on-demand cloud operations. Amazon Web Services, the largest cloud computing company in the world, entered that battle on Thursday with a number of generative AI capabilities of its own, though it is taking a different direction.
AWS will provide a service called Bedrock that enables companies to use their own data to build bespoke foundation models, the core artificial intelligence (AI) systems that perform tasks such as answering questions in human-sounding language or generating images in response to a prompt. OpenAI, the company that developed ChatGPT, provides a similar service that lets users customize its models to produce a unique chatbot.
Customers of the Bedrock service will be able to work with Amazon Titan, a set of the company's own proprietary foundation models, but the service will also offer a menu of models from other businesses. Alongside Amazon's own models, the first third-party options will come from AI21 Labs, Anthropic, and Stability AI. AWS customers can test-drive these technologies through the Bedrock service without having to manage the underlying data center infrastructure.
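To make that more concrete, here is a minimal sketch of what invoking a hosted foundation model through the AWS SDK for Python might look like. The `bedrock-runtime` client name, the Titan model identifier, the request fields, and the region are assumptions for illustration, not details confirmed by the announcement.

```python
# Sketch: calling a hosted foundation model through the AWS SDK for Python.
# Model ID, request body fields, and region are illustrative assumptions.
import json

import boto3

# The runtime client handles inference calls; AWS manages the serving infrastructure.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Titan-style text models take an "inputText" prompt plus optional generation settings.
request_body = {
    "inputText": "Summarize the benefits of managed foundation models in two sentences.",
    "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.5},
}

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed identifier; any model from the menu could go here
    contentType="application/json",
    accept="application/json",
    body=json.dumps(request_body),
)

# The response body is JSON whose exact shape depends on the chosen model family.
payload = json.loads(response["body"].read())
print(payload["results"][0]["outputText"])
```

Because the service exposes third-party models through the same interface, switching from Titan to an AI21, Anthropic, or Stability AI model would, in principle, mean changing only the model identifier and the request and response shape expected by that family.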
Amazon Races Against Microsoft
The underlying servers will use a combination of processors from Nvidia Corp., the largest supplier of chips for artificial intelligence work but one whose processors have been in short supply this year, and Amazon's own bespoke AI chips. In 2010, the company unveiled the first Linux version tailored for the cloud, and later followed it with Amazon Linux 2. With AL2023, customers can expect a predictable two-year major release cycle with long-term support, regular and flexible updates, and an improved security posture thanks to features such as SELinux, kernel live patching, OpenSSL 3.0, updated cryptographic policies, deterministic upgrades through versioned repositories, kernel hardening, and more.
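As a rough illustration of the deterministic, versioned-repository idea, the sketch below pins a host to a specific release snapshot and then updates against that locked version. The release label and the use of `/etc/dnf/vars/releasever` are assumptions drawn from how dnf release pinning generally works, not details given in this article.

```python
# Sketch: pinning a host to one versioned repository snapshot before updating.
# The release label below is a placeholder following the assumed 2023.x.yyyymmdd pattern.
import subprocess
from pathlib import Path

PINNED_RELEASE = "2023.0.20230322"  # hypothetical snapshot label


def pin_release(release: str) -> None:
    """Persist the release lock so later dnf runs resolve against the same snapshot."""
    Path("/etc/dnf/vars/releasever").write_text(release + "\n")


def update_against_pin(release: str) -> None:
    """Run an upgrade explicitly scoped to the pinned repository version."""
    subprocess.run(
        ["dnf", "upgrade", "--releasever", release, "-y"],
        check=True,
    )


if __name__ == "__main__":
    pin_release(PINNED_RELEASE)
    update_against_pin(PINNED_RELEASE)
```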
There are a number of changes between Amazon Linux 2 and AL2023. One of the most significant is that AL2023 has a predictable two-year major version cycle with long-term support, whereas Amazon Linux 2 promises long-term support through June 30, 2024. The latest edition of Amazon Linux, AL2023, is genuinely intriguing: it brings a lot of positives, such as the new Fedora foundation, updated packages, better performance, and improved security. For current Amazon Linux 2 customers, however, it is not a simple upgrade to recommend, because of the numerous breaking changes. Additionally, because of the limited package availability, some workloads may still be better served by other well-known AMIs.
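For teams evaluating a move despite those caveats, a common first step is simply locating the current AL2023 image. The sketch below queries AWS Systems Manager's public parameters for the latest AL2023 AMI ID; the parameter path shown is an assumption based on the naming convention of the existing Amazon Linux parameters and should be checked against current documentation.

```python
# Sketch: resolving the latest AL2023 AMI ID from SSM public parameters.
# The parameter path is an assumed name; verify it before relying on it.
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")

# Public parameters let you resolve "latest" without hard-coding an AMI ID per region.
AL2023_PARAM = "/aws/service/ami-amazon-linux-latest/al2023-ami-kernel-default-x86_64"

response = ssm.get_parameter(Name=AL2023_PARAM)
ami_id = response["Parameter"]["Value"]
print(f"Latest AL2023 AMI in us-east-1: {ami_id}")
```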
With Nvidia Corp. processors, the leading chips for AI work, in short supply this year, Amazon is pairing them with its own custom AI chips in those underlying servers.
Dave Brown, vice president of Elastic Compute Cloud at Amazon Web Services, said of the company's bespoke chips, "We're ready to land thousands, if not hundreds of thousands, or even millions of such chips, as we'll require them." In his view, that serves as an escape route for some of the supply-chain worries that people may be experiencing.