How To Deploy the Elastic Stack with the Elastic Cloud On Kubernetes (ECK)
The ELK or Elastic Stack is a complete solution for searching, visualizing and analysing logs generated by different sources, all in one specialised application.
ELK stands for Elasticsearch, Logstash and Kibana. Elasticsearch is a search and analytics engine. Logstash is a server-side data processing pipeline that receives data from many different sources and sends it to Elasticsearch. Kibana is a web user interface that lets users visualize and analyse the data stored in Elasticsearch.
In this guide we will learn how to easily deploy Elasticsearch and Kibana on Kubernetes with Elastic Cloud on Kubernetes (ECK).
We will be going through:
• Deploy ECK in your Kubernetes Cluster
• Deploy an Elasticsearch cluster
• Deploy a Kibana instance
DEPLOY ECK IN YOUR KUBERNETES CLUSTER
Elastic Cloud on Kubernetes extends the basic Kubernetes orchestration capabilities to support the setup and management of Elasticsearch and Kibana on Kubernetes.
By deploying ECK in your Kubernetes cluster, several critical operations, such as securing the stack with TLS certificates, are handled for you.
Elastic gives us an easy way to install it, just run:
kubectl apply -f https://download.elastic.co/downloads/eck/1.1.1/all-in-one.yaml
And it’s done!
DEPLOY ELASTICSEARCH ON KUBERNETES
The yaml below sets up an Elasticsearch cluster with three nodes and 10Gi of persistent storage per node. It also exposes the cluster with a LoadBalancer Service. If you don’t want it to be exposed externally, just comment out this block:
http:
  service:
    spec:
      # expose this cluster Service with a LoadBalancer
      type: LoadBalancer
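If you don’t have the original file at hand, a minimal elasticsearch.yaml consistent with this tutorial might look like the following. Treat it as a sketch of the ECK v1 Elasticsearch CRD, assuming the cluster name elasticsearch and version 7.7.0 seen in the output later in this guide:

```yaml
apiVersion: elasticsearch.k8s.elastic.co/v1
kind: Elasticsearch
metadata:
  # the name "elasticsearch" determines the names of the generated
  # resources, e.g. the elasticsearch-es-http Service used below
  name: elasticsearch
spec:
  version: 7.7.0
  nodeSets:
  - name: default
    count: 3  # three Elasticsearch nodes
    volumeClaimTemplates:
    - metadata:
        name: elasticsearch-data
      spec:
        accessModes:
        - ReadWriteOnce
        resources:
          requests:
            storage: 10Gi  # persistent storage per node
  http:
    service:
      spec:
        # expose this cluster Service with a LoadBalancer
        type: LoadBalancer
```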
To deploy it, download the file and run the following in the directory where it is located:
kubectl apply -f elasticsearch.yaml
You can check whether your deployment succeeded by running:
kubectl get elasticsearch
You should see the deployment as Ready and its health as green.
The password for the default user “elastic” is automatically created as a Kubernetes Secret. To decode it, run:
echo $(kubectl get secret elasticsearch-es-elastic-user -o go-template='{{.data.elastic | base64decode}}')
Now, to find the external IP for the cluster, run:
kubectl get service elasticsearch-es-http
To test it you must first download the automatically generated SSL certificate. You can do it by running:
kubectl get secret elasticsearch-es-http-certs-public -o 'go-template={{index .data "ca.crt"}}' | base64 --decode >> ca.crt
Then use it to perform a GET request at https://<external-ip>:9200, for example with curl (replace <PASSWORD> with the password decoded earlier):
curl --cacert ca.crt -u elastic:<PASSWORD> https://<external-ip>:9200
You should see something like:
{
"name": "elasticsearch-es-default-1",
"cluster_name": "elasticsearch",
"cluster_uuid": "rXV3e46FQ4ybz4BXgierTY",
"version": {
"number": "7.7.0",
"build_flavor": "default",
"build_type": "docker",
"build_hash": "46afced28e652377862183f60681a1e9edaefa1",
"build_date": "2020-05-23T02:03:37.602180Z",
"build_snapshot": false,
"lucene_version": "8.5.1",
"minimum_wire_compatibility_version": "6.8.0",
"minimum_index_compatibility_version": "6.0.0-beta1"
},
"tagline": "You Know, for Search"
}
Now you have a fresh Elasticsearch cluster running in your Kubernetes cluster! Woohoo!
DEPLOY KIBANA ON KUBERNETES
Now that our Elasticsearch cluster is running we can deploy our Kibana instance!
Kibana 7.7.0 comes with a new Alerting feature. To enable it, we must provide an encryption key through a Kubernetes Secret. We can do that by running the following command with a random 32-character key:
kubectl create secret generic kibana-saved-objects-encrypted-key --from-literal=xpack.encryptedSavedObjects.encryptionKey=<RANDOM GENERATED 32 CHARACTER KEY>
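If you need a way to generate such a key, one convenient option (assuming openssl is available; any random 32-character string works) is:

```shell
# 16 random bytes hex-encoded -> a 32-character key
KEY=$(openssl rand -hex 16)
echo "${#KEY}"  # prints 32
echo "$KEY"
```

You can then pass $KEY directly to the kubectl create secret command above.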
Now we can deploy our Kibana instance that will automatically connect to our Elasticsearch cluster!
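If you don’t have the original manifest, a minimal kibana.yaml consistent with this tutorial might look like the following. It is a sketch of the ECK v1 Kibana CRD, assuming the Elasticsearch cluster is named elasticsearch as above; the secureSettings entry loads the Secret created earlier so its key becomes a Kibana keystore setting:

```yaml
apiVersion: kibana.k8s.elastic.co/v1
kind: Kibana
metadata:
  name: kibana
spec:
  version: 7.7.0
  count: 1
  # connect automatically to the Elasticsearch cluster named "elasticsearch"
  elasticsearchRef:
    name: elasticsearch
  # load xpack.encryptedSavedObjects.encryptionKey from the Secret
  # created in the previous step
  secureSettings:
  - secretName: kibana-saved-objects-encrypted-key
```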
To deploy it, run the following command in the directory containing your kibana.yaml file:
kubectl apply -f kibana.yaml
To find the external IP of the instance run:
kubectl get service kibana-kb-http
and reach it over HTTPS. The username and password are the same as for Elasticsearch.
And that's it! Enjoy your fresh Kibana instance running in your Kubernetes cluster!
Contribute
Writing takes time and effort. I love writing and sharing knowledge, but I also have bills to pay. If you like my work, please, consider donating through Buy Me a Coffee: https://www.buymeacoffee.com/RaphaelDeLio
Or by sending me BitCoin: 1HjG7pmghg3Z8RATH4aiUWr156BGafJ6Zw
Follow Me on Social Media
Stay connected and dive deeper into the world of Elasticsearch with me! Follow my journey across all major social platforms for exclusive content, tips, and discussions.
Twitter | LinkedIn | YouTube | Instagram