This week the AWS Summit was in Amsterdam, and of course Qstars was there too! We attended lots of talks, strolled past the booths and had plenty of nice conversations. The main topics were AI (as at any IT conference nowadays) and serverless.
# Artificial Intelligence
AWS offers plenty of AI services, as many of the talks demonstrated. We think these are worth mentioning to start your AI adventures:
- Amazon Q was introduced a few months ago and was heavily promoted during the conference. It is a generative AI-powered assistant that can help you write your infrastructure code as well as answer questions about all those AWS services. You can integrate it with editors like VS Code (using the AWS Toolkit for Visual Studio Code).
- Bedrock lets you run a wide range of foundation models, including Mistral AI, Cohere and Llama 2, and use them from within your own applications. For example, you can summarize documents, generate images or build a virtual assistant (see the sketch after this list).
- SageMaker provides a great way to build, train and deploy machine learning models, and to get more insights from your data. Your data analysts will love it. To get you started there is SageMaker Studio, a web-based IDE for machine learning.
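To give a feel for the Bedrock item above, here is a minimal sketch of summarizing a document with a Bedrock-hosted model via boto3. The model ID and request/response payload follow the Llama 2 chat format on Bedrock; other model families (Mistral, Cohere, ...) use different payloads, so treat the details as an assumption and check the Bedrock documentation for the model you enabled.

```python
# Hedged sketch: summarizing a document with a Bedrock-hosted model via boto3.
# The model ID and request/response shape follow the Llama 2 format on Bedrock;
# other model families use different payloads.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="eu-west-1")

def summarize(text: str) -> str:
    body = json.dumps({
        "prompt": f"Summarize the following document in three sentences:\n\n{text}",
        "max_gen_len": 256,
        "temperature": 0.2,
    })
    response = bedrock.invoke_model(
        modelId="meta.llama2-13b-chat-v1",  # assumed model ID; use any model you have enabled
        body=body,
        contentType="application/json",
        accept="application/json",
    )
    result = json.loads(response["body"].read())
    return result["generation"]

if __name__ == "__main__":
    print(summarize("Lambda lets you run code without managing servers..."))
```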
# Serverless
AWS provides serverless compute through Lambda. Serverless means you don’t have to worry about the underlying infrastructure: you just focus on your application code, and AWS takes care of the rest. It scales automatically and you pay per request. Many languages are supported, including Java, Python and Node.js.
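To make this concrete, a Python Lambda function is nothing more than a handler that receives the triggering event and a runtime context. A minimal sketch:

```python
# Minimal Python Lambda handler: AWS invokes this function per request,
# passing the triggering event and a runtime context object.
import json

def lambda_handler(event, context):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```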
Lambda really shines in distributed applications, where you build a workflow from existing cloud resources and Lambda is the place to insert your custom code. A simple example workflow: a user uploads documents to an S3 bucket. Whenever a document is uploaded, a Lambda function is triggered, which extracts the document title and puts it in a DynamoDB table; Bedrock is also called to add a summary. To scale up, you may decide to add queues to shape the flow. A sketch of such a function follows below.
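The sketch below shows one way this Lambda function could look. The bucket and table names, the title heuristic (first line of the file) and the Bedrock model ID are illustrative assumptions, not part of a real deployment.

```python
# Hedged sketch of the workflow described above: an S3 upload event triggers
# this Lambda, which stores the document "title" in DynamoDB together with a
# summary generated by Bedrock. Table name, title heuristic and model ID are
# illustrative assumptions.
import json
import boto3

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
bedrock = boto3.client("bedrock-runtime")

TABLE_NAME = "Documents"  # assumed DynamoDB table with partition key "DocumentKey"

def lambda_handler(event, context):
    table = dynamodb.Table(TABLE_NAME)
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Fetch the uploaded document and take its first line as the title.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        title = body.splitlines()[0] if body else key

        # Ask a Bedrock-hosted model for a short summary (payload is model-specific).
        response = bedrock.invoke_model(
            modelId="meta.llama2-13b-chat-v1",
            body=json.dumps({"prompt": f"Summarize in two sentences:\n\n{body}",
                             "max_gen_len": 128}),
            contentType="application/json",
            accept="application/json",
        )
        summary = json.loads(response["body"].read())["generation"]

        table.put_item(Item={"DocumentKey": key, "Title": title, "Summary": summary})
```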
There are multiple tools available to build such workflows; we think these are the most interesting ones:
- Workflow Studio is a low-code IDE to build a distributed application with AWS Step Functions.
- Application Composer allows you to generate infrastructure-as-code using the Serverless Application Model.
# Takeaways
## Master at least one cloud provider
Cloud has become mature; every company should at least explore how it could benefit from the cloud. Whether you’re into infrastructure, software development or data analytics, you should know how to use at least one cloud. Initially it doesn’t matter which cloud provider you choose; once you know what a single provider can do, you can better select the most appropriate one for your needs. Also learn how to deploy your cloud resources with an Infrastructure-as-Code tool, for example Terraform.
## Become familiar with AI
Artificial Intelligence (AI) is evolving fast and already offers lots of interesting features. Whether you include AI functions within the apps you develop or use generative AI to assist with coding your infrastructure, embrace the possibilities. Coding has become so much easier. And if you don’t trust publicly available services, run an LLM on your own laptop (e.g. with Ollama) or in your own datacenter or private cloud.
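Running a local model is surprisingly little work. As a minimal sketch, assuming the Ollama daemon is running on its default port and a model such as llama2 has been pulled, you can query it over its HTTP API:

```python
# Hedged sketch: querying a local LLM served by Ollama (default port 11434).
# Assumes `ollama pull llama2` has been run and the Ollama daemon is running.
import json
import urllib.request

def ask_local_llm(prompt: str, model: str = "llama2") -> str:
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]

if __name__ == "__main__":
    print(ask_local_llm("Explain AWS Lambda in one sentence."))
```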
## Forget server virtualization, embrace the real cloud
AWS provides EC2 (Elastic Compute Cloud) so you can run your applications on virtual machines. But this is basically just outsourcing your server hosting. Explore container hosting (like EKS) and serverless for your ever-growing compute needs.