How To Upgrade Elasticsearch, Kibana, Logstash and Filebeat On Kubernetes With ECK
The time has come. If you followed my previous stories on how to Deploy Elasticsearch and Kibana On Kubernetes and how to Deploy Logstash and Filebeat On Kubernetes, you have probably deployed version 7.7.0 of the stack.
In June 2020, version 7.8.0 was released. Kibana has a new user interface, Elasticsearch comes with new features, and so on. You might want to upgrade, but you may be concerned that doing so could wipe out the data you have already ingested into Elasticsearch, or that the stack will be down for a long time while you restore backups.
The good news is that, thanks to ECK, you can upgrade the stack with zero or little downtime, without losing any data or needing to restore backups. Let’s do it!
We will be going through:
• Backup
• Updating Elasticsearch
• Updating Kibana
• Updating Logstash
• Updating Filebeat
Backup
First of all, let’s be cautious. We don’t expect any data to be lost, but it can happen, and of course I won’t be held accountable for any data you might lose during this process. So be careful and take snapshots.
If you still don’t know how to back up Elasticsearch, take a look at one of my previous stories: How To Backup Elasticsearch On Kubernetes With Google Cloud Storage and Kibana or How To Backup Elasticsearch On Kubernetes With Amazon S3 and Kibana.
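If you already have a snapshot repository registered (as described in those stories), taking a one-off snapshot before the upgrade is a single request. This is just a sketch: my_backup_repo and the snapshot name are placeholders for whatever you configured.
PUT https://<your-es-address>:<port>/_snapshot/my_backup_repo/pre-upgrade-snapshot?wait_for_completion=true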
Don’t forget to back up Kibana by exporting all of its Saved Objects as well.
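If you prefer to script this instead of using the Kibana UI, the Saved Objects export API can dump everything to an NDJSON file. The sketch below assumes basic authentication with the elastic user; adjust the address, credentials, and object types to your setup:
curl -X POST "https://<your-kibana-address>:<port>/api/saved_objects/_export" \
  -H "kbn-xsrf: true" \
  -H "Content-Type: application/json" \
  -u elastic:<password> \
  -d '{"type": ["index-pattern", "search", "visualization", "dashboard"]}' \
  -o kibana-saved-objects.ndjson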
Updating Elasticsearch
Let’s start with the heart of the stack, Elasticsearch. First, update the manifest file, changing the version field to the version we want to upgrade to. In the example below I updated the manifest file to 7.8.0 (beware that this manifest doesn’t contain the settings for backup plugins; you should use the manifest you are already running in your Kubernetes cluster):
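Here is a minimal sketch of what elasticsearch.yaml might look like after the change. The resource name, node set, and count are assumptions based on the earlier stories; keep your own node sets, storage, and plugin settings and only bump the version:
apiVersion: elasticsearch.k8s.elastic.co/v1
kind: Elasticsearch
metadata:
  name: elasticsearch
spec:
  version: 7.8.0            # bumped from 7.7.0
  nodeSets:
  - name: default
    count: 3                # keep your own node count
    config:
      node.store.allow_mmap: false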
Now that it is updated we can apply it by running:
kubectl apply -f elasticsearch.yaml
ECK will take over and manage the upgrade, rolling each node to the new version without losing any data. You can check your Elasticsearch cluster by requesting:
GET https://<your-es-address>:<port>/
And you should see something like:
{
  "name" : "elasticsearch-es-default",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "<your-es-uuid>",
  "version" : {
    "number" : "7.8.0",
    "build_flavor" : "default",
    "build_type" : "docker",
    "build_hash" : "<hash>",
    "build_date" : "2020-07-16T19:35:50.134439Z",
    "build_snapshot" : false,
    "lucene_version" : "8.5.1",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}
Your Elasticsearch Cluster is now updated! Woohoo!
Updating Kibana
Now it’s time to upgrade the eyes of our stack, the tool that allows us to see the formless in such a beautiful way. Kibana it is.
Just as we did with Elasticsearch, let’s update our manifest by bumping the version field. Again, we change the version to 7.8.0:
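A minimal sketch of what kibana.yaml might look like after the change (the resource name and elasticsearchRef are assumptions matching the Elasticsearch example above; keep your own count and config):
apiVersion: kibana.k8s.elastic.co/v1
kind: Kibana
metadata:
  name: kibana
spec:
  version: 7.8.0            # bumped from 7.7.0
  count: 1
  elasticsearchRef:
    name: elasticsearch     # must match the Elasticsearch resource name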
Reapplying the deployment by running the snippet below should allow ECK to take over and upgrade Kibana:
kubectl apply -f kibana.yaml
If everything goes well, you can celebrate and enjoy your Kibana instance running the newest version. 🔥
Updating Logstash
Well, Logstash doesn’t really store any information, right? It works as a bridge, receiving raw logs and transforming them into useful information before storing it in Elasticsearch. We can upgrade it by bumping the image version in its manifest and reapplying it.
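Here is a minimal sketch of the relevant part of logstash.yaml. The Deployment name, labels, and port are assumptions based on my earlier story; only the image tag actually changes:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: logstash
spec:
  replicas: 1
  selector:
    matchLabels:
      app: logstash
  template:
    metadata:
      labels:
        app: logstash
    spec:
      containers:
      - name: logstash
        image: docker.elastic.co/logstash/logstash:7.8.0   # bumped from 7.7.0
        ports:
        - containerPort: 5044                              # Beats input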
Cool! Now let’s reapply it by running:
kubectl apply -f logstash.yaml
And that’s it. Filtering logs like a boss 😎
Updating Filebeat
None of this matters, though, if we don’t ship our logs to Logstash, right? So, let’s update our Filebeat instance!
Again, update the manifest, this time by bumping the Filebeat image version:
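A minimal sketch of the relevant part of filebeat-daemonset.yaml. The DaemonSet name, namespace, and labels are assumptions from my earlier story; again, only the image tag changes, and your volumes and config stay as they are:
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: filebeat
  namespace: kube-system
spec:
  selector:
    matchLabels:
      app: filebeat
  template:
    metadata:
      labels:
        app: filebeat
    spec:
      containers:
      - name: filebeat
        image: docker.elastic.co/beats/filebeat:7.8.0   # bumped from 7.7.0
        args: ["-c", "/etc/filebeat.yml", "-e"]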
And reapply it by running:
kubectl apply -f filebeat-daemonset.yaml
Congratulations! Your whole stack is updated and running the newest version! You had zero or little downtime and all your data was preserved. Besides that, you were cautious and created backups of your data just in case... Good job 😎🔥
Medium is changing (Did you know?)
The Medium Partner Program (the monetization program) changed significantly on August 1st, 2023. Before that date, Medium paid writers based on the time readers spent on their stories.
Unfortunately, this has changed. Medium now pays writers based on interaction with their stories: the amount they earn depends on claps, follows, highlights, and replies. I really hate to ask, but if you enjoyed this story and other stories on Medium, don’t forget to interact with them. It’s an easy way to keep supporting the authors you like and keep them on the platform.
Contribute
Writing takes time and effort. I love writing and sharing knowledge, but I also have bills to pay. If you like my work, please consider donating through Buy Me a Coffee: https://www.buymeacoffee.com/RaphaelDeLio
Or by sending me BitCoin: 1HjG7pmghg3Z8RATH4aiUWr156BGafJ6Zw