At its AWS re:Invent conference today, Amazon Web Services (AWS) significantly extended its data management and serverless computing ambitions in a way that should ultimately help IT organizations develop more advanced applications that make use of artificial intelligence (AI).
In terms of providing the most immediate disruption, AWS revealed today that it will make instances of the open source Postgres relational database available within its Aurora database service, alongside its existing MySQL-compatible offering and several NoSQL databases. AWS CEO Andy Jassy told conference attendees the Postgres support is significant because Postgres provides a layer of compatibility capable of supporting existing Oracle database applications.
“Postgres is semantically close to Oracle,” says Jassy. “We’ll provide that capability at 1/10 the cost.”
In general, Jassy says database vendors such as Oracle represent the old guard of enterprise IT that is both overly expensive and too inflexible to respond to modern application requirements.
AWS today also unveiled Amazon Athena, a serverless query service that makes it possible to run SQL queries directly against data stored in Amazon Simple Storage Service (S3).
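In practice, that means pointing a query at objects already sitting in S3 without provisioning any servers. The sketch below shows how a query might be submitted through the AWS SDK for Python (boto3); the database, table, and bucket names are hypothetical placeholders.

```python
# Sketch: submitting a SQL query against data in S3 through Amazon Athena via boto3.
# The database, table, and output bucket names below are hypothetical placeholders.

def build_query(table: str, limit: int) -> str:
    """Compose a simple SELECT that Athena evaluates over objects in S3."""
    return f"SELECT * FROM {table} LIMIT {limit}"

def run_query(sql: str) -> str:
    """Start an asynchronous Athena query; results land in the S3 path below."""
    import boto3  # imported here so build_query stays usable without AWS access
    athena = boto3.client("athena")
    response = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "weblogs_db"},  # hypothetical database
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},  # hypothetical bucket
    )
    return response["QueryExecutionId"]

# Example: run_query(build_query("access_logs", 10)) would queue the query,
# with results written back to S3 as they complete.
```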
In terms of raw compute power, AWS is adding seven additional cloud services, including an F1 instance that provides access to Field Programmable Gate Arrays (FPGAs) as well as Amazon EC2 Elastic GPUs that can be employed to accelerate graphics applications. AWS also announced Amazon Lightsail, a virtual private server (VPS) environment that comes bundled with storage and networking.
At the core of those offerings is AWS’s Elastic Network Adapter (ENA), a custom network interface capable of providing up to 20 Gbps of total network bandwidth.
AWS also announced that it is making available a new version of its Snowball appliance for capturing data on premises that has twice the capacity of the previous generation and is capable of running applications locally on its own compute engine. Jassy says AWS envisions IT organizations being able to employ Snowball Edge to analyze data locally before physically transferring that data into AWS or asynchronously transferring it directly into the AWS S3 storage service.
The primary method AWS will use to enable those applications to run locally is AWS Greengrass, a software platform that makes use of AWS Lambda serverless computing to create a hybrid cloud computing environment.
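The idea is that the same Lambda-style functions that run in the cloud can run at the edge, filtering data before any of it leaves the premises. A minimal sketch of such a handler is below; the event shape (a batch of sensor readings) and the alert threshold are hypothetical examples, not a documented Greengrass payload.

```python
# Sketch: a Lambda-style handler of the kind Greengrass could run on local hardware.
# The event shape and threshold below are hypothetical, for illustration only.

THRESHOLD = 75.0  # hypothetical alert threshold, in degrees Celsius

def handler(event, context=None):
    """Filter a batch of local sensor readings down to the anomalous ones,
    so only the interesting records need to be forwarded to the AWS cloud."""
    readings = event.get("readings", [])
    anomalies = [r for r in readings if r.get("temperature", 0.0) > THRESHOLD]
    return {"forwarded": len(anomalies), "anomalies": anomalies}
```

Run locally, a batch of ordinary readings produces nothing to forward, which is exactly the bandwidth-saving behavior Jassy describes for Snowball Edge.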
AWS also revealed that it has developed Snowmobile, a data transfer platform housed in a shipping container that is hauled to a local data center on the back of a truck. Customers will be able to transfer exabytes of data to systems residing in the container, which AWS will then drive to one of its data centers in order to load that data into the AWS cloud.
Finally, AWS announced three AI services: Amazon Lex for building conversational interfaces, Amazon Polly for turning text into speech, and Amazon Rekognition for image analysis. Collectively, they make it possible to build natural language applications that can recognize images as well as nuances such as the intent behind any given set of queries. In all three cases, AWS is clearly planning to leverage the massive amounts of data running in its cloud to drive a new generation of AI applications.
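Taking Rekognition as one concrete case, an application hands the service an image in S3 and gets back labels with confidence scores. The sketch below shows what that might look like in boto3; the bucket and key names are hypothetical, and the confidence cutoff is an illustrative choice.

```python
# Sketch: labeling an image in S3 with Amazon Rekognition, then keeping only
# high-confidence labels. Bucket/key names and the cutoff are hypothetical.

def confident_labels(response: dict, min_confidence: float = 90.0) -> list:
    """Extract label names from a Rekognition detect_labels response,
    dropping anything below the given confidence threshold."""
    return [
        label["Name"]
        for label in response.get("Labels", [])
        if label.get("Confidence", 0.0) >= min_confidence
    ]

def detect_labels(bucket: str, key: str) -> list:
    """Run label detection on an image in S3 (requires AWS credentials)."""
    import boto3  # imported here so confident_labels stays usable offline
    rekognition = boto3.client("rekognition")
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MaxLabels=10,
    )
    return confident_labels(response)

# Example: detect_labels("example-photos", "storefront.jpg") would return only
# the labels Rekognition is at least 90% confident about.
```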
There’s no doubt that AWS in the last couple of years has significantly expanded its enterprise IT ambitions as the amount of data being moved into public clouds continues to increase exponentially. The question now is to what degree AWS will be able to leverage a significant head start over rivals that are clearly trying to play catch-up.