Collecting Kubernetes events to Elasticsearch

We want to collect Kubernetes events and store them in Elasticsearch, where they are easy to query and analyze for alerting. We use the kubernetes-event-exporter component from GitHub.

The main.tf file includes the Kubernetes Deployment object:

resource "kubernetes_deployment" "event_export" {
  metadata {
    name      = "event-export"
    namespace = "kube-system"
    labels = {
      app = "event-export"
    }
  }

  spec {
    replicas = 1

    selector {
      match_labels = {
        app = "event-export"
      }
    }

    template {
      metadata {
        labels = {
          app     = "event-export"
          version = "v1"
        }
      }

      spec {
        service_account_name            = kubernetes_service_account.event_export.metadata[0].name
        automount_service_account_token = true

        container {
          image = "opsgenie/kubernetes-event-exporter:0.9"
          args = [
            "-conf=/data/config.yaml",
          ]
          name = "event-export"

          resources {
            requests {
              memory = "50Mi"
            }
            limits {
              memory = "100Mi"
            }
          }

          volume_mount {
            name       = "cfg"
            mount_path = "/data"
          }

          image_pull_policy = "IfNotPresent"
        }

        volume {
          name = "cfg"
          config_map {
            name = kubernetes_config_map.event_export_config.metadata[0].name
          }
        }
      }
    }
  }

  depends_on = [
    kubernetes_service_account.event_export
  ]
}
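
These resources assume an already configured Terraform Kubernetes provider. A minimal sketch, assuming credentials come from a local kubeconfig (the config_path value is a placeholder; on EKS you would normally wire the provider to the cluster's endpoint and token instead):

terraform {
  required_providers {
    kubernetes = {
      source = "hashicorp/kubernetes"
    }
  }
}

provider "kubernetes" {
  # Assumption: a local kubeconfig already points at the target cluster.
  config_path = "~/.kube/config"
}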

The configmap.tf file includes the Kubernetes ConfigMap object that stores the event-exporter's configuration, which is mounted as a volume into the pod container of the Deployment above:

resource "kubernetes_config_map" "event_export_config" {
  metadata {
    name      = "event-export-config"
    namespace = "kube-system"
  }

  data = {
    "config.yaml" = <<-EOF
      logLevel: error
      logFormat: json
      route:
        routes:
          - match:
            - type: "Warning"
              receiver: "dump"
          - match:
            - type: "Error"
              receiver: "dump"
      receivers:
        - name: "dump"
          file:
            path: "/dev/stderr"
    EOF
  }
}

In this configuration we decided to collect only the Warning and Error events. Because this component's support for AWS Elasticsearch is not great, we write the events directly to the console (stderr) instead, and they are then shipped to Elasticsearch by the Fluentd service.
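
For reference, kubernetes-event-exporter also provides an elasticsearch receiver that writes events straight to an Elasticsearch endpoint. If direct output suited your cluster, the receivers section could look roughly like the sketch below; the field names follow the kubernetes-event-exporter README, and the host and index values are placeholders, so verify them against the version you deploy:

receivers:
  - name: "dump"
    elasticsearch:
      # Placeholder endpoint and index naming; adjust for your cluster.
      hosts:
        - https://elasticsearch.example.com:9200
      index: kube-events
      indexFormat: "kube-events-{2006-01-02}"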

The Role.tf file includes the Kubernetes RBAC configuration used by the Deployment:

resource "kubernetes_cluster_role_binding" "event_export" {
  metadata {
    name = "event-export"
    labels = {
      "app.kubernetes.io/name"       = "event-export"
      "app.kubernetes.io/managed-by" = "terraform"
    }
  }

  role_ref {
    api_group = "rbac.authorization.k8s.io"
    kind      = "ClusterRole"
    name      = "view"
  }

  subject {
    kind      = "ServiceAccount"
    name      = kubernetes_service_account.event_export.metadata[0].name
    namespace = kubernetes_service_account.event_export.metadata[0].namespace
  }

  depends_on = [
    kubernetes_service_account.event_export
  ]
}

resource "kubernetes_service_account" "event_export" {
  automount_service_account_token = true

  metadata {
    name      = "event-export"
    namespace = "kube-system"
    labels = {
      "app.kubernetes.io/name"       = "event-export"
      "app.kubernetes.io/managed-by" = "terraform"
    }
  }
}
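
The binding above reuses the built-in view ClusterRole, which grants read access to far more than events. If you prefer a role scoped to events only, a minimal sketch could look like the following, with role_ref then pointed at it instead of view (the resource name and labels here are illustrative):

resource "kubernetes_cluster_role" "event_export" {
  metadata {
    name = "event-export"
    labels = {
      "app.kubernetes.io/name"       = "event-export"
      "app.kubernetes.io/managed-by" = "terraform"
    }
  }

  # Read-only access to events in both the core and events.k8s.io API groups.
  rule {
    api_groups = ["", "events.k8s.io"]
    resources  = ["events"]
    verbs      = ["get", "list", "watch"]
  }
}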

Finally, search in Kibana. The search query is: kubernetes.labels.app:"event-export" AND (log:"Warning" OR log:"Error"). The parentheses matter: without them, AND binds more tightly than OR and the query would also match Error logs from other apps.