DEPLOY AWS LAMBDA FUNCTION TO KUBERNETES USING KNATIVE AND TRIGGERMESH KLR


Saurabh Verma

May 23 · 4 min read

Knative is a platform that provides tools for deploying, running, and managing serverless, cloud-native applications on Kubernetes. Knative brings together the scalability of Kubernetes and the ease of serverless development, giving developers more control over resources while letting them focus on the application rather than the infrastructure. In this blog, we will explore how to deploy an AWS Lambda-based application on Kubernetes using Knative and TriggerMesh KLR (Knative Lambda Runtime).

Background

Knative consists of two core components:

Serving: Reduces the developer's effort in maintaining Kubernetes stateless services by providing automatic scaling, networking, and rollouts.

Eventing: Exposes event routing as configuration, allowing easy routing between on-cluster and off-cluster components.

This walkthrough is focused on Knative Serving. I'll be working on Ubuntu 20.04 and will use Minikube to set up a Kubernetes cluster on the local machine. I'll install Knative Serving's custom resources and core components into the cluster, use Kourier as the networking layer (to let in external network traffic), and enable its integration with Knative. Finally, I will use TriggerMesh KLR to deploy a Lambda function on Kubernetes.

Prerequisites:

  • Install Docker engine.
  • Add your user (who has root privileges) to the docker group using the command below (essential to start Minikube), then log out and back in for the change to take effect:
sudo usermod -aG docker $USER

Installing components

  • Configure Minikube and start the cluster:
minikube config set memory 4000
minikube config set cpus 4
minikube start

In a separate terminal, run the command below to start Minikube tunneling, which assigns external IPs to load balancers:

nohup minikube tunnel &
  • Install Knative Serving using YAML files:
kubectl apply -f https://github.com/knative/serving/releases/download/v0.22.0/serving-crds.yaml
kubectl apply -f https://github.com/knative/serving/releases/download/v0.22.0/serving-core.yaml

Make sure all the pods are running:

kubectl get pods --all-namespaces
  • Install Kourier networking layer:
kubectl apply -f https://github.com/knative/net-kourier/releases/download/v0.22.0/kourier.yaml

Configure Knative Serving to use Kourier by default:

kubectl patch configmap/config-network --namespace knative-serving --type merge --patch '{"data":{"ingress.class":"kourier.ingress.networking.knative.dev"}}'

Note the external IP of the Kourier service; it will be used to configure a nip.io domain for your services on local DNS:

kubectl --namespace kourier-system get service kourier

Configure DNS:

kubectl patch configmap -n knative-serving config-domain -p "{\"data\": {\"*Cluster IP*.nip.io\": \"\"}}"
  • Install Knative CLI:
mkdir knative-cli
cd knative-cli/
git clone https://github.com/knative/client.git
cd client/
hack/build.sh -f
export PATH=$PATH:$PWD
kn version
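The config-domain patch above is sensitive to shell quoting (curly quotes or missed escapes will break it). As a minimal sketch, the payload can be generated programmatically instead; the cluster IP here is the example value used later in this walkthrough:

```python
import json

# Example EXTERNAL-IP reported by the kourier service in this walkthrough
cluster_ip = "10.98.36.128"

# Build the exact JSON payload expected by `kubectl patch configmap config-domain`
patch = json.dumps({"data": {cluster_ip + ".nip.io": ""}})
print(patch)
```

The printed string can then be passed as-is to the -p flag of kubectl patch.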

Create a sample Python Lambda function

  • Create a directory and save your function code there. The following is a sample hello-world Python function:
mkdir python_dir
cd python_dir
cat > handler.py <<EOF
import json
import datetime

def endpoint(event, context):
    current_time = datetime.datetime.now().time()
    body = {
        "message": "Hello, the current time is " + str(current_time)
    }
    response = {
        "statusCode": 200,
        "body": json.dumps(body)
    }
    return response
EOF
  • Create a Dockerfile (in the same directory). I am using the TriggerMesh Knative Lambda Runtime image for Python 3.7, part of KLR (Tekton Tasks that can be used to run an AWS Lambda function in a Kubernetes cluster with Knative installed). The _HANDLER variable needs a value in the format name-of-python-file.name-of-lambda-function.
cat > Dockerfile <<EOF
FROM gcr.io/triggermesh/knative-lambda-python37
ENV _HANDLER handler.endpoint
COPY . .
RUN if [ -f requirements.txt ]; then pip3.7 install -r requirements.txt ;fi
ENTRYPOINT ["/opt/aws-custom-runtime"]
EOF
  • Build and push the docker image:
docker build -t {Docker id}/python-klr-image .
docker push {Docker id}/python-klr-image
  • I am using a Docker network to test the function locally:
docker run -d --rm --name python-klr-container <python-klr-image>
# Set up docker network for the container
docker network create my-net
docker network connect my-net python-klr-container
# Install jq using sudo apt-get install jq
curl $(docker inspect python-klr-container | jq .[].NetworkSettings.Networks.bridge.IPAddress -r):8080
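Before building and running the container, the handler itself can be sanity-checked with plain Python. This is a minimal sketch: the empty dict and None stand in for the real Lambda event and context objects.

```python
import json
import datetime

# Same handler as in handler.py above
def endpoint(event, context):
    current_time = datetime.datetime.now().time()
    body = {
        "message": "Hello, the current time is " + str(current_time)
    }
    response = {
        "statusCode": 200,
        "body": json.dumps(body)
    }
    return response

# Invoke locally with a stubbed event and context
resp = endpoint({}, None)
assert resp["statusCode"] == 200
print(json.loads(resp["body"])["message"])
```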

Deploy the sample Lambda function

  • Using Knative cli:
kn service create python-klr-service --image <python-klr-image>

Output:

Creating service 'python-klr-service' in namespace 'default':

  0.045s The Route is still working to reflect the latest desired specification.
  0.089s ...
  0.101s Configuration "python-klr-service" is waiting for a Revision to become ready.
  8.829s ...
  8.873s Ingress has not yet been reconciled.
  8.941s Waiting for load balancer to be ready
  9.114s Ready to serve.

Service 'python-klr-service' created to latest revision 'python-klr-service-00001' is available at URL:
http://python-klr-service.default.10.98.36.128.nip.io
  • Using YAML file:
kubectl apply -f - <<EOF
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: python-klr-service
spec:
  template:
    spec:
      containers:
        - image: <python-klr-image>
EOF
  • Monitor deployments:
kubectl get deployments
NAME                                  READY   UP-TO-DATE   AVAILABLE   AGE
python-klr-service-00001-deployment   1/1     1            1           67s

The service is now up and running; let's test it out.

  • Send a curl request to the URL:
curl http://python-klr-service.default.10.98.36.128.nip.io
{"statusCode": 200, "body": "{\"message\": \"Hello, the current time is 18:20:24.440140\"}"}

This concludes our walkthrough. You can play around with Knative further and explore and implement more of its features. You can also try Knative Eventing and work out how to deploy serverless functions with it.

References:

  • Installing Knative Serving using YAML files (knative.dev)
  • AWS Lambda – Serverless Compute (aws.amazon.com)
  • triggermesh/knative-lambda-runtime (github.com)
  • Deploying AWS Lambda-Compatible Functions in Amazon EKS using TriggerMesh KLR (aws.amazon.com)
  • Create your first Knative app (opensource.com)
