Amazon Web Services (AWS) has expanded its Amazon Bedrock service to simplify the development and deployment of generative artificial intelligence (AI) applications.
Bedrock is a platform for building AI applications that use large language models (LLMs).
The updated service adds several highly sought-after features to address customer needs for customization, model selection, and content moderation. As a service platform, it grants users access to a range of LLMs from providers like AI21 Labs, Amazon, Anthropic, Cohere, Meta, Mistral AI, and Stability AI. These models are offered as managed services, so customers do not have to provision or operate the underlying inference infrastructure.
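To make "managed service" concrete: the hosted models are reachable through ordinary AWS SDK calls. A minimal sketch, assuming the AWS SDK for Python (boto3), an account with Bedrock enabled, and the us-east-1 region:

```python
import boto3

# The "bedrock" control-plane client is separate from the
# "bedrock-runtime" client used for inference. Region is an assumption.
bedrock = boto3.client("bedrock", region_name="us-east-1")

# Enumerate the hosted foundation models across providers
# (AI21 Labs, Amazon, Anthropic, Cohere, Meta, Mistral AI, Stability AI).
for model in bedrock.list_foundation_models()["modelSummaries"]:
    print(model["providerName"], model["modelId"])
```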
With the new Bedrock, organizations can import their own customized AI models, bringing those models under Bedrock's management, security, and deployment tooling. A new model evaluation feature assists in assessing and comparing the models on offer, speeding up model selection based on accuracy, performance, and other metrics relevant to the intended application. Bedrock also now offers improved content filtering and moderation tools, allowing users to block harmful content, align AI output with company standards, and ensure the safe and responsible use of AI.
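Of these additions, the moderation tooling (what AWS calls Guardrails for Amazon Bedrock) is the most visible at the API level: a pre-configured guardrail can be attached to an inference request so that inputs and outputs are screened against its policies. A rough sketch, assuming boto3 and placeholder guardrail and model identifiers:

```python
import json

import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# The guardrail must already exist (created via the console or the
# CreateGuardrail API); the identifier and version here are placeholders.
response = runtime.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    guardrailIdentifier="gr0example1234",  # placeholder guardrail ID
    guardrailVersion="1",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [
            {"role": "user", "content": "Draft a reply to this customer complaint."}
        ],
    }),
)

# If the guardrail intervenes, the response body carries the configured
# blocked-content message instead of the model's raw output.
print(json.loads(response["body"].read()))
```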
Bedrock also adds several new models, including Amazon Titan Text Embeddings V2, Amazon Titan Image Generator, Meta's Llama 3, and Cohere's Command R and Command R+.
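As an example of calling one of the newcomers, the sketch below requests an embedding from Titan Text Embeddings V2, which lets callers choose a smaller output vector size; the exact request fields follow the documented Titan schema but are worth verifying against the current model reference.

```python
import json

import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Titan Text Embeddings V2: plain text in, vector out.
response = runtime.invoke_model(
    modelId="amazon.titan-embed-text-v2:0",
    body=json.dumps({
        "inputText": "Amazon Bedrock adds custom model import and new models.",
        "dimensions": 256,  # V2 supports smaller vectors to cut storage cost
        "normalize": True,
    }),
)

embedding = json.loads(response["body"].read())["embedding"]
print(len(embedding))  # -> 256
```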
The focus of these new features is to simplify and accelerate the development of generative AI applications tailored to various industries and use cases. This includes the ability to better customize models, select those best suited to specific tasks, and enhance responsible AI practices.
“With today’s announcements, we continue to innovate rapidly for our customers by doubling-down on our commitment to provide them with the most comprehensive set of capabilities and choice of industry-leading models, further democratizing generative AI innovation at scale,” concludes Dr. Swami Sivasubramanian, vice president of AI and Data at AWS.