Monday 24 April 2023

Amazon launches generative AI play in AWS Bedrock

 

Amazon says AWS Bedrock will provide access to multiple foundation AI models for enterprise-scale AI applications.

Amazon is the latest hyperscaler to take on the world of foundation AI, including generative and large language models. It has launched a new platform called AWS Bedrock that includes access to in-house tools such as the Titan family of foundation models, and pre-trained models from start-ups like AI21 Labs, Anthropic and Stability AI. The company says the focus is on providing a range of models for use in “enterprise-scale” AI tools. One expert said Amazon has “a long way to go” to catch up with other players in the field.

Opening AWS up as a marketplace for multiple AI models mirrors Google's move to offer third-party models in Google Cloud alongside its own PaLM, including models from Midjourney and AI21 Labs. Microsoft has gone “all in” with OpenAI through its Azure cloud, offering GPT-4, ChatGPT and other models for customers.

Amazon says it will allow companies to train chatbots and AI tools on their own proprietary data without having to invest in costly data centres and expensive AI chips. AWS will use a combination of its own custom AI chips and those from Nvidia. “We’re able to land hundreds of thousands of these chips, as we need them,” explained Dave Brown, VP of Elastic Compute Cloud at AWS.

The launch of Bedrock has been in the works for the past few months, with AWS signing partnership agreements with Stability AI and other start-ups, as well as investing more in generative AI apps and its underlying technology. Hugging Face has also worked to bring its library of text-generating models onto AWS and Amazon has launched an AI accelerator for startups.

AWS is the largest hyperscaler in the world but is facing increasing competition from Google Cloud, Microsoft Azure and others, largely off the back of their AI offerings. Both companies have invested heavily in generative AI tools, including chatbots such as ChatGPT and Google Bard.

Amazon has not yet unveiled pricing for its AI offerings and full details remain unclear, but users will be able to tap into the various foundation models via an API. The service is aimed at “enterprise-scale” applications rather than individual tools.
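For illustration, here is a minimal sketch of what calling a Bedrock-hosted model through the AWS SDK for Python (boto3) could look like. The client name, model identifier and request schema shown are assumptions based on later public documentation rather than details Amazon had confirmed at launch.

```python
# Hypothetical sketch of invoking a Bedrock-hosted foundation model via boto3.
# The "bedrock-runtime" client, the model ID and the request body shape are
# assumptions; check the AWS documentation for the shapes that actually ship.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed Titan text model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "inputText": "Write a two-sentence product description for a reusable water bottle.",
        "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.5},
    }),
)

result = json.loads(response["body"].read())
print(result)  # the generated text sits inside the returned JSON payload
```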

Multiple AI models

AI21 Labs’ Jurassic-2 family of foundation models is particularly suited to generating multilingual text, while Anthropic’s Claude is geared towards text-processing and conversational tools. Stability AI brings text-to-image tools to Bedrock, including Stable Diffusion, which can be used for images, art, logos and graphic design. The most recent version of Stable Diffusion has improved text accuracy and clarity. Using Bedrock, developers will be able to create tools that combine models.
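As a rough illustration of combining models, the sketch below feeds the output of a text model into a text-to-image model. Both model IDs, the response fields and the payload shapes are assumptions, not published Bedrock details.

```python
# Hypothetical sketch of chaining two Bedrock models: a text model drafts an image
# prompt, then Stable Diffusion renders it. Model IDs and schemas are assumptions.
import base64
import json

import boto3

bedrock = boto3.client("bedrock-runtime")

# Step 1: ask a text model for an image prompt (assumed Titan text model ID).
text_resp = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",
    body=json.dumps({"inputText": "Describe a minimalist logo for a cycling café in one sentence."}),
)
image_prompt = json.loads(text_resp["body"].read())["results"][0]["outputText"]

# Step 2: hand that prompt to Stable Diffusion (assumed model ID and schema).
img_resp = bedrock.invoke_model(
    modelId="stability.stable-diffusion-xl-v1",
    body=json.dumps({"text_prompts": [{"text": image_prompt}], "steps": 30}),
)
image_b64 = json.loads(img_resp["body"].read())["artifacts"][0]["base64"]
with open("logo.png", "wb") as f:
    f.write(base64.b64decode(image_b64))
```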

Amazon’s own Titan family includes text-generation and embedding models. The former handles text generation such as writing a blog post or a sales pitch, while the embedding models translate text into numerical representations that capture its semantic meaning.
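For the embedding side, a small sketch of how those numerical representations could be compared for semantic similarity. The Titan embedding model ID and the “embedding” response field are assumptions; the cosine-similarity step is standard.

```python
# Hypothetical sketch: turn two texts into Titan embeddings and compare them.
# The model ID and the "embedding" response field are assumptions.
import json
import math

import boto3

bedrock = boto3.client("bedrock-runtime")

def embed(text: str) -> list[float]:
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(resp["body"].read())["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Texts with similar meaning should score closer to 1.0 than unrelated texts.
print(cosine(embed("refund policy for damaged goods"), embed("returning a broken item")))
print(cosine(embed("refund policy for damaged goods"), embed("quarterly cloud revenue")))
```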

Any of the models can then be further trained on labelled datasets stored in S3, Amazon’s cloud storage tool. As few as 20 well-labelled examples are required to adapt a model to the proprietary information, and none of that data will be used to train the underlying models, according to Amazon.
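As an illustration of the kind of small labelled dataset Amazon describes, the sketch below writes 20 prompt/completion pairs to S3. The JSON Lines layout, the field names and the bucket name are assumptions, since Amazon had not published a customisation format at the time.

```python
# Hypothetical sketch: upload a tiny labelled dataset to S3 for model customisation.
# The JSONL layout and the "prompt"/"completion" field names are assumptions.
import json

import boto3

examples = [
    {"prompt": f"Customer question {i}: where is my order?",
     "completion": "Orders can be tracked from the 'My orders' page."}
    for i in range(20)  # Amazon says as few as ~20 labelled examples can be enough
]

body = "\n".join(json.dumps(example) for example in examples)

s3 = boto3.client("s3")
s3.put_object(
    Bucket="my-training-data-bucket",  # hypothetical bucket name
    Key="bedrock/fine-tune/examples.jsonl",
    Body=body.encode("utf-8"),
)
```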

“At Amazon, we believe AI and ML are among the most transformational technologies of our time, capable of tackling some of humanity’s most challenging problems. That is why, for the last 25 years, Amazon has invested heavily in the development of AI and ML, infusing these capabilities into every business unit,” the company said in a statement.

In the same statement, Amazon highlighted the use of its own chips to bring down the cost of running generative AI workloads, explaining that ultra-large models require massive compute power to run in production, and that AWS Inferentia chips can make inference more efficient and reduce cost at enterprise scale.

AWS Bedrock has ‘a lot of catching up to do’

The company is also opening up its answer to Microsoft’s GitHub Copilot, a tool widely used by developers to help write code. Amazon is making CodeWhisperer available for free for individual developers. It is an AI-powered coding companion that can offer code suggestions based on previously written code or comments. There are no usage limits for the free version, but a paid tier, for professional use, also includes enterprise security and admin capabilities.
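To show the kind of workflow described here, below is a purely hypothetical example of comment-driven code suggestion: a developer types the comment and an assistant proposes the function that follows. The suggestion is illustrative only, not output from the actual CodeWhisperer tool.

```python
# Hypothetical illustration of comment-driven code suggestion
# (illustrative only, not actual CodeWhisperer output).

# Developer types the comment below; the assistant suggests the function body.
# function to check whether a string is a valid IPv4 address

def is_valid_ipv4(address: str) -> bool:
    parts = address.split(".")
    if len(parts) != 4:
        return False
    for part in parts:
        if not part.isdigit():
            return False
        if part != str(int(part)):  # reject leading zeros such as "01"
            return False
        if not 0 <= int(part) <= 255:
            return False
    return True
```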

Daniel Stodolsky, former Google Cloud VP and current SVP of Cloud at SambaNova said the old cloud argument of bringing compute to your data doesn’t stack up in the new world of generative AI. “Whereas other cloud services such as predictive analytics rely on huge volumes of real-time data, Amazon says the process of customising its pre-trained LLM can be completed with as few as 20 labelled data examples,” he said.

“The trend for generative AI will be towards open best-of-breed approaches rather than vendor lock-in and closed models. It’s much better to own a large language model that’s built and fine-tuned for your use-case rather than relying on an off-the-shelf model with minimal customisation.

“The other consideration is getting value from generative AI quickly. Amazon’s Bedrock service is only in limited preview right now and anyone looking at AWS Service Terms will find that Service Level Agreements don’t apply – in other words, it’s not production ready and won’t be for some time. Generative AI is a race that’s already well underway, and Amazon clearly has a lot of catching up to do with other production-ready platforms.”

Courtesy: https://techmonitor.ai/



