ARTIFICIAL intelligence (AI), accessible to all: that was the theme of a recent AWS summit led by its Chief Technologist for Asia Pacific, Oliver Klein, where discussions revolved around how Amazon Web Services (AWS) is positioning its AI and machine learning (ML) offerings in the region.
An example of this technology is Bedrock, a fully managed service that offers foundation models (FMs) from Amazon and top AI startups via an API.
“We announced our Amazon Bedrock service. What is the service doing? Well, it basically allows you to create your own generative AI applications. Fine tune them to your business, uh, to your industry, uh, with so-called foundation models or FMs. So, these foundation models are basically generative AI models that have been trained on very, very large data sets,” Klein said in response to a question from the tech media at the online summit.
Bedrock lets users easily discover the right model, get started quickly, customize FMs privately with their own data, and integrate them into their applications using familiar AWS tools. This includes integration with Amazon SageMaker ML features such as Experiments for model testing and Pipelines for managing FMs at scale. All of this can be done without the need to manage any infrastructure.
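To make the API-based access concrete, here is a minimal sketch of calling a Bedrock model through the AWS SDK for Python (boto3). The model ID and the request-body schema are assumptions for illustration (each model family defines its own schema), and actually running the call requires AWS credentials and Bedrock model access:

```python
import json

# Hypothetical model ID for illustration; the exact ID and request schema
# depend on which foundation model you have access to in Bedrock.
MODEL_ID = "amazon.titan-text-express-v1"

def build_request(prompt, max_tokens=256):
    """Build a JSON request body for a Titan-style text model.

    The field names here are assumptions for illustration; check the
    Bedrock documentation for the schema of your chosen model.
    """
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": max_tokens},
    })

def invoke(prompt):
    """Invoke the model via the Bedrock runtime API.

    Not executed here: requires AWS credentials and Bedrock access.
    """
    import boto3  # AWS SDK for Python
    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(modelId=MODEL_ID, body=build_request(prompt))
    # The response body is a stream containing the model's JSON output.
    return json.loads(response["body"].read())
```

The point of the sketch is the shape of the workflow Klein describes: the developer supplies a prompt over a plain API call, and AWS manages the model and the infrastructure behind it.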

ChatGPT and Bard have already democratized artificial intelligence by making their large language models (LLMs) free for public use. These LLMs are functionally impressive, able to handle many tasks and capable of generating responses on a seemingly endless range of topics based on instructions in a prompt. However, they are also limited by the very architecture they are built on.
It was in 2015 when I first listened to AWS Chief Technology Officer Werner Vogels talk about AI and ML within the scope of the AWS platform at a re:Invent conference in the US. Little did I know that AWS had already dedicated over two decades to enhancing customer experiences and optimizing internal operations using AI and ML technologies.
In 2017, it offered SageMaker services in its Marketplace, initially to developers and data engineers. These provided access to services such as large databases, machine learning, and data science tools for the creation of intelligent APIs and similar AI solutions, which in turn accelerated the development of more accessible tools like large language models (LLMs).
But AWS has consistently pushed the boundaries of what’s possible.
AWS is approaching generative AI by investing and innovating across three essential layers of the generative AI stack: infrastructure, ML tools, and purpose-built AI services. The aim is to move this technology from the realm of research into everyday use, making it accessible to customers and developers with varying levels of expertise. From its inception, AWS has prioritized accessibility, striving to make ML and AI available to customers of all sizes and across industries, and with generative AI it is adopting the same approach.
In the second layer of the stack, machine learning tools, AWS is simplifying generative AI app development with the preview launch of Amazon Bedrock. The managed service provides easy access to pre-trained FMs via an API, and will also include Amazon Titan FMs, a family of industry-leading models developed by AWS.
Klein presented several uses of AI in the realm of creatives and the arts, showing sketches and whole photos of waterfalls and scenery, but insisted that the creations were based on how the machines interpreted the instructions: not creativity as such, but output dependent on the data the model has access to. This gives rise to the top layer of the generative AI stack, where Amazon CodeWhisperer resides.
CodeWhisperer is trained on billions of lines of code and can generate code suggestions, ranging from snippets to full functions, in real time based on comments and existing code. It allows users to bypass time-consuming coding tasks and even work with unfamiliar APIs. The technology is now generally available, supports 10 more programming languages, and is free for individual developers.
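The comment-to-code workflow can be illustrated with a hypothetical prompt: the developer types a descriptive comment, and CodeWhisperer proposes an implementation inline. The suggestion below is hand-written for illustration, showing the kind of completion the tool generates rather than actual CodeWhisperer output:

```python
# The developer writes only the comment on the next line; a tool like
# CodeWhisperer then suggests the function body in real time.

# parse a CSV line into a list of trimmed fields
def parse_csv_line(line):
    # Split on commas and strip surrounding whitespace from each field.
    return [field.strip() for field in line.split(",")]
```

The developer reviews and accepts or edits the suggestion; the tool saves typing, but the human remains responsible for correctness.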
Earlier, in June this year, AWS unveiled the AWS Generative AI Innovation Center, an initiative aimed at helping customers worldwide build and deploy generative AI solutions successfully. AWS is investing $100 million in the program, which will connect AWS AI and ML experts with customers globally to help them envision, design, and launch new generative AI products, services, and processes.