How To Export Google Cloud SQL Logs To Elasticsearch On Kubernetes Through Logstash

Raphael De Lio
4 min read · May 27, 2020



Exporting Google Cloud SQL logs to your Elasticsearch cluster isn’t as simple as installing Filebeat and shipping log files from your machine or your Kubernetes cluster, because Cloud SQL writes its logs to Cloud Logging rather than to files you can tail. It doesn’t mean you can’t do it with just a few clicks, though!

In this guide you will learn how to easily export Google Cloud SQL Logs to Elasticsearch using Logstash on Kubernetes!

If you don’t have an Elasticsearch cluster or a Logstash instance running on Kubernetes yet, you should take a look at these stories:

You will also need a Cloud SQL instance in Google Cloud, but I believe you have already figured that out 😉

How are we going to do it?

We will basically create a Pub/Sub topic to which our Cloud SQL logs will be published as messages, and configure Logstash to subscribe to it.

We will be going through:

  • Creating a Pub/Sub topic and subscription
  • Creating a Log Router
  • Granting all the necessary permissions
  • Installing the Google Pub/Sub plugin in Logstash
  • Configuring Logstash

Create the Pub/Sub

Open the Pub/Sub management page and click on Create Topic. Give it a name of your choice and leave the option Google-managed key enabled.

Once the topic is created, click on it to access its page, scroll down, and create a new subscription by clicking on Create Subscription and then selecting Create Simple Subscription. Give it a name of your choice; you don’t need to change any other settings.
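If you prefer the command line, roughly the same result can be achieved with gcloud. The names cloudsql-logs and cloudsql-logs-sub below are only examples, so use whatever names you picked in the console:

# Example names; pick your own. Requires the gcloud CLI authenticated against your project.
gcloud pubsub topics create cloudsql-logs
gcloud pubsub subscriptions create cloudsql-logs-sub --topic=cloudsql-logs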

Create a Log Router

Now that we have our Pub/Sub topic, we can create our Log Router. Open the Log Router page, click on Create Sink, and choose Cloud SQL Database in the filters. Give it a name of your choice, select Pub/Sub as the Sink Service, and select the topic that you created in the previous step as the Sink Destination.
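The sink can also be created from the command line. A rough sketch, assuming the example topic name from above and your own project ID in place of the placeholder:

# Routes Cloud SQL logs to the Pub/Sub topic. Replace the project and topic names with your own.
gcloud logging sinks create cloudsql-logs-sink \
  pubsub.googleapis.com/projects/<YOUR_PROJECT_ID>/topics/cloudsql-logs \
  --log-filter='resource.type="cloudsql_database"'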

Once the sink is created, a Writer Identity will be configured for it. The Writer Identity will look like serviceAccount:p534543556-535689@gcp-sa-logging.iam.gserviceaccount.com

For the sink to be able to publish the logs to the topic, you will need to grant this service account the respective permission. You can do this by going to the Pub/Sub page and selecting the topic that you previously created; a sidebar will open.

Click on the permissions tab, and then:

  • Add Member: paste your Writer Identity email in there, such as p534543556-535689@gcp-sa-logging.iam.gserviceaccount.com
  • Select the Pub/Sub Publisher role
  • Save it

Be aware that Google Cloud might take up to 24h to grant the permissions. 🤷‍♂️
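If you would rather grant the permission from the command line, you can look up the sink’s writer identity and bind it to the topic with gcloud. A sketch assuming the example names used above; replace the member with your own Writer Identity:

# Shows the sink's writerIdentity (the service account that publishes the logs).
gcloud logging sinks describe cloudsql-logs-sink

# Grants that identity permission to publish to the topic.
gcloud pubsub topics add-iam-policy-binding cloudsql-logs \
  --member='serviceAccount:p534543556-535689@gcp-sa-logging.iam.gserviceaccount.com' \
  --role='roles/pubsub.publisher'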

Configuring Logstash

Create a service account to allow Logstash to subscribe to the Pub/Sub Topic

Before we configure it, we need to create a Service Account with the permissions to subscribe to our topic. This Service Account will be used by Logstash to collect the logs of our Cloud SQL database.

To do it, open the Service Account page and then:

  • Click on Create Service Account
  • Give it a name, e.g. “logstash”
  • Give it an ID, e.g. “logstash”
  • Give it a description.

Then, in the Grant users access to this service account step, create a key for the service account and download it as JSON.
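Note that this service account also needs permission to pull messages from the subscription (for instance the Pub/Sub Subscriber role on it). A rough CLI sketch of the whole step, assuming the example subscription name from above; the placeholders are yours to fill in:

# Create the service account used by Logstash.
gcloud iam service-accounts create logstash --display-name="logstash"

# Allow it to pull messages from the subscription.
gcloud pubsub subscriptions add-iam-policy-binding cloudsql-logs-sub \
  --member='serviceAccount:logstash@<YOUR_PROJECT_ID>.iam.gserviceaccount.com' \
  --role='roles/pubsub.subscriber'

# Create and download the JSON key.
gcloud iam service-accounts keys create logstash-sa.json \
  --iam-account=logstash@<YOUR_PROJECT_ID>.iam.gserviceaccount.com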

Create a Kubernetes secret holding your Service Account Key

Go to the folder where you downloaded your JSON key and rename it to logstash-sa.json

Now open a terminal at this folder and run the following command to create the secret:

kubectl create secret generic logstash-sa --from-file=logstash-sa.json=logstash-sa.json
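You can quickly confirm that the secret exists and contains the key file:

# Lists the secret's data keys (you should see logstash-sa.json).
kubectl describe secret logstash-sa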

Configure Google Pub/Sub plugin in Logstash

If you followed our story teaching how to Deploy Logstash and Filebeat On Kubernetes With ECK, all you need to do is add this snippet to your logstash.yaml under spec.containers:

command:
  - sh
  - -c
  - |
    bin/logstash-plugin install \
      logstash-input-exec \
      logstash-input-google_pubsub && bin/logstash

We are basically telling Kubernetes to override the default command provided by the container image. In our case, we are telling Logstash to install the google_pubsub input plugin (along with the exec input plugin) and then start the instance.

The final logstash.yaml will look more or less like the sketch below.
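A minimal sketch, assuming the logstash-configmap and logstash-sa names used in this guide and a stock Logstash 7.x image; adjust the image, mounts, and other settings to match the deployment from the previous story:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: logstash
spec:
  replicas: 1
  selector:
    matchLabels:
      app: logstash
  template:
    metadata:
      labels:
        app: logstash
    spec:
      containers:
        - name: logstash
          image: docker.elastic.co/logstash/logstash:7.7.0
          # Override the default command: install the plugins, then start Logstash.
          command:
            - sh
            - -c
            - |
              bin/logstash-plugin install \
                logstash-input-exec \
                logstash-input-google_pubsub && bin/logstash
          volumeMounts:
            # Pipeline definition from the ConfigMap.
            - name: logstash-pipeline
              mountPath: /usr/share/logstash/pipeline
            # Service account key referenced by json_key_file in the pipeline.
            - name: logstash-sa
              mountPath: /etc/logstash/keys
              readOnly: true
      volumes:
        - name: logstash-pipeline
          configMap:
            name: logstash-configmap
            items:
              - key: logstash.conf
                path: logstash.conf
        - name: logstash-sa
          secret:
            secretName: logstash-sa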

Configuring Logstash Pipeline

To our logstash.conf we will add the following snippet into the input block:

google_pubsub {
  type => "pubsub"
  project_id => "<YOUR PROJECT'S NAME>"
  topic => "<YOUR PUB/SUB TOPIC>"
  subscription => "<YOUR PUB/SUB SUBSCRIPTION>"
  json_key_file => "/etc/logstash/keys/logstash-sa.json"
  codec => "json"
}

Our logstash-configmap.yaml will look like this:
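A minimal sketch; the Elasticsearch output block is an assumption, so point it at your own cluster (the host, credentials, and index name are placeholders):

apiVersion: v1
kind: ConfigMap
metadata:
  name: logstash-configmap
data:
  logstash.conf: |
    input {
      google_pubsub {
        type => "pubsub"
        project_id => "<YOUR PROJECT'S NAME>"
        topic => "<YOUR PUB/SUB TOPIC>"
        subscription => "<YOUR PUB/SUB SUBSCRIPTION>"
        json_key_file => "/etc/logstash/keys/logstash-sa.json"
        codec => "json"
      }
    }
    output {
      elasticsearch {
        hosts => ["<YOUR ELASTICSEARCH HOST>:9200"]
        user => "<YOUR ELASTIC USER>"
        password => "<YOUR ELASTIC PASSWORD>"
        index => "cloudsql-logs-%{+YYYY.MM.dd}"
      }
    }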

Recreate Logstash Pod

Now we are all set to apply our new configs.

Reapply our logstash-configmap.yaml by running:

kubectl apply -f logstash-configmap.yaml

And redeploy Logstash in your Kubernetes cluster. If it was already running, delete it first:

kubectl delete -f logstash.yaml

And then apply it again:

kubectl apply -f logstash.yaml

And that’s it! Now you should have your Cloud SQL database logs being exported to your Elasticsearch cluster!
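To double-check that the logs are actually flowing, you can watch the Logstash output and list the matching indices; a sketch assuming the Deployment and index names used in the examples above:

# Watch Logstash start up and pull messages from Pub/Sub.
kubectl logs -f deployment/logstash

# List the indices created by the pipeline (add credentials/TLS flags as your cluster requires).
curl "http://<YOUR ELASTICSEARCH HOST>:9200/_cat/indices/cloudsql-logs-*?v"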

Contribute

Writing takes time and effort. I love writing and sharing knowledge, but I also have bills to pay. If you like my work, please, consider donating through Buy Me a Coffee: https://www.buymeacoffee.com/RaphaelDeLio

Or by sending me BitCoin: 1HjG7pmghg3Z8RATH4aiUWr156BGafJ6Zw

Follow Me on Social Media

Stay connected and dive deeper into the world of Elasticsearch with me! Follow my journey across all major social platforms for exclusive content, tips, and discussions.

Twitter | LinkedIn | YouTube | Instagram
