The big reveals from Adam Selipsky at AWS re:Invent 2023

GenAI, business chatbots, new storage and chips and more.

Adam Selipsky (Amazon Web Services)

Credit: Amazon Web Services

Amazon Web Services (AWS) has announced the preview launch of its new generative artificial intelligence (genAI) chatbot product during this year's re:Invent. 

In his keynote, CEO Adam Selipsky claimed the chatbot, Amazon Q, can answer questions, generate content and take action, streamlining tasks and accelerating decision-making. 

The service can also personalise interactions with individual users based on existing identities, roles and permissions, but does not use business customers’ content to train its underlying models.

During the keynote, Selipsky placed particular emphasis on Q, going so far as to claim that “this is going to be transformative”.

Essentially, Q can be used across numerous facets of a business. On the development side, for example, Amazon played up Q’s code transformation functionality, Amazon Q Code Transformation, claiming it was able to upgrade 1,000 production applications and their dependencies from Java 8 to Java 17 in two days.

Other functionality includes providing employees with title-specific information and ensuring that need-to-know information cannot be accessed by those without the correct permissions.

Aside from Amazon Q, Selipsky also revealed Amazon S3 Express One Zone, a high-performance, single-Availability Zone S3 storage class designed for consistent, single-digit millisecond data access in latency-sensitive applications.

“Amazon S3 Express One Zone is the lowest latency cloud object storage available, with data access speed up to 10 times faster and request costs up to 50 per cent lower than Amazon S3 Standard, from any AWS Availability Zone within an AWS Region,” AWS claimed.
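For a sense of how developers would use the new storage class, a minimal boto3 sketch is below. It assumes an S3 Express One Zone directory bucket has already been created and that the familiar put/get object calls apply to it unchanged; the bucket name and object key are placeholders, not details from the announcement.

```python
import boto3

# Standard S3 client; the sketch assumes S3 Express One Zone buckets
# are addressed with the same object API calls as regular S3 buckets.
s3 = boto3.client("s3")

# Hypothetical directory bucket assumed to already exist in a
# single Availability Zone (the name is a placeholder).
bucket = "example-express-bucket"

# Write a small object to the low-latency bucket.
s3.put_object(Bucket=bucket, Key="telemetry/latest.json", Body=b'{"ok": true}')

# Read it back; the announced benefit is consistent single-digit
# millisecond access for calls like this one.
response = s3.get_object(Bucket=bucket, Key="telemetry/latest.json")
print(response["Body"].read())
```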

The latest iterations of its Graviton and Trainium chip families, Graviton4 and Trainium2, were also announced. According to claims from AWS, Graviton4 has 30 per cent better compute performance, 50 per cent more cores and 75 per cent more memory bandwidth than Graviton3.

Meanwhile, Trainium2, which is purpose-built for genAI and machine learning (ML) training, offers up to four times faster training than first-generation Trainium chips, can be deployed in EC2 UltraClusters of up to 100,000 chips, and delivers up to twice the energy efficiency.

Graviton4 is available in memory-optimised Amazon EC2 R8g instances, which are in preview now, with general availability expected in 2024. Trainium2 is also expected to arrive next year.

AWS’ ongoing relationship with NVIDIA was also expanded with a bevy of announcements. One was that the pair are working together to build the first “cloud AI supercomputer”, Project Ceiba, utilising NVIDIA’s Grace Hopper Superchip and AWS UltraCluster scalability.

The chipmaker also said that its AI-training-as-a-service offering, NVIDIA DGX Cloud, will be the first to feature the GH200 NVL32, providing developers with “the largest shared memory in a single instance”.

Additionally, there will be new Amazon EC2 instances powered by NVIDIA GH200, H200, L40S and L4 graphics processing units (GPUs) for genAI, HPC, design and simulation workloads.

For Amazon Bedrock, Selipsky announced a range of new and updated capabilities: a preview of fine-tuning to improve foundation model accuracy, general availability of Knowledge Bases for retrieval augmented generation (RAG), a preview of continued pre-training for the Amazon Titan Text Lite and Express models, general availability of Agents for Amazon Bedrock, which orchestrate multistep tasks to accelerate genAI development, and a preview of Guardrails for promoting safe interactions between genAI applications and users.
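For context on how Bedrock’s foundation models are consumed by applications, a minimal boto3 sketch is below. It assumes access to a Titan text model has been enabled in the account; the model identifier, request payload and response parsing follow Titan conventions as an illustration and are not details from the keynote.

```python
import json
import boto3

# Bedrock runtime client for model invocation (separate from the
# "bedrock" control-plane client used to manage model access).
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Illustrative model ID and request payload; the exact body format
# depends on the model family being invoked.
body = json.dumps({
    "inputText": "Summarise the main announcements from re:Invent 2023 in one sentence.",
    "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.2},
})

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",
    contentType="application/json",
    accept="application/json",
    body=body,
)

# Titan-style responses return generated text under "results";
# other model families structure their output differently.
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```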

Meanwhile, four integrations were announced that connect and analyse data without the need to build and manage complex extract, transform and load (ETL) data pipelines.

Amazon Aurora PostgreSQL, Amazon DynamoDB, and Amazon RDS for MySQL zero-ETL integrations with Amazon Redshift were revealed, as well as Amazon DynamoDB zero-ETL integration with Amazon OpenSearch Service.

The first three integrations enable users to analyse data from multiple sources without building and maintaining custom data pipelines, while the latter allows near real-time full-text and vector search on operational data.
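The practical effect of zero-ETL is that, once data is replicated, analysis becomes an ordinary query against Redshift rather than pipeline code. A rough sketch using the Redshift Data API is below, assuming a hypothetical serverless workgroup and a table replicated from an Aurora source; the workgroup, database and table names are placeholders.

```python
import time
import boto3

# Redshift Data API client: statements run asynchronously over HTTPS,
# so no JDBC connection management is needed.
client = boto3.client("redshift-data")

# Placeholder names for a serverless workgroup and a table that the
# zero-ETL integration is assumed to have replicated from Aurora.
resp = client.execute_statement(
    WorkgroupName="example-workgroup",
    Database="dev",
    Sql="SELECT order_status, COUNT(*) FROM aurora_orders GROUP BY order_status;",
)

# Poll until the asynchronous statement finishes.
while True:
    status = client.describe_statement(Id=resp["Id"])["Status"]
    if status in ("FINISHED", "FAILED", "ABORTED"):
        break
    time.sleep(1)

if status == "FINISHED":
    rows = client.get_statement_result(Id=resp["Id"])["Records"]
    print(rows)
```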

A preview of genAI capability for Amazon DataZone was also announced, with the company claiming that it will add business context to data catalogues and also “dramatically decrease the amount of time needed to provide context for organisational data”.

Rounding out Selipsky’s announcements was that Project Kuiper, AWS’ low Earth orbit (LEO) satellite broadband network, will provide enterprise-ready private cloud connectivity, in addition to the previously announced public connectivity.

Sasha Karen travelled to re:Invent 2023 as a guest of AWS.

