6 votes

Serilog logs collected by Fluent Bit and shipped to Elasticsearch on Kubernetes are not parsed as JSON.

I am running the EFK stack on Kubernetes (Minikube). I have an ASP.NET Core application that uses Serilog to write JSON to the console. The logs are shipped to Elasticsearch, but they arrive as unparsed strings in the "log" field, and that is the problem.

Here is the console output:

{
    "@timestamp": "2019-03-22T22:08:24.6499272+01:00",
    "level": "Fatal",
    "messageTemplate": "Text: {Message}",
    "message": "Text: \"aaaa\"",
    "exception": {
        "Depth": 0,
        "ClassName": "",
        "Message": "Boom!",
        "Source": null,
        "StackTraceString": null,
        "RemoteStackTraceString": "",
        "RemoteStackIndex": -1,
        "HResult": -2146232832,
        "HelpURL": null
    },
    "fields": {
        "Message": "aaaa",
        "SourceContext": "frontend.values.web.Controllers.HomeController",
        "ActionId": "0a0967e8-be30-4658-8663-2a1fd7d9eb53",
        "ActionName": "frontend.values.web.Controllers.HomeController.WriteTrace (frontend.values.web)",
        "RequestId": "0HLLF1A02IS16:00000005",
        "RequestPath": "/Home/WriteTrace",
        "CorrelationId": null,
        "ConnectionId": "0HLLF1A02IS16",
        "ExceptionDetail": {
            "HResult": -2146232832,
            "Message": "Boom!",
            "Source": null,
            "Type": "System.ApplicationException"
        }
    }
}
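
In Elasticsearch, that whole line ends up as a single string inside the "log" field. Roughly (an illustrative sketch only, with values shortened; the surrounding fields are the ones normally added by the docker parser, the kubernetes filter, and the es output), an indexed document looks like this:

{
    "@timestamp": "2019-03-22T21:08:24.000Z",
    "log": "{\"@timestamp\":\"2019-03-22T22:08:24.6499272+01:00\",\"level\":\"Fatal\",\"message\":\"Text: \\\"aaaa\\\"\", ... }\n",
    "stream": "stdout",
    "kubernetes": {
        "pod_name": "...",
        "namespace_name": "...",
        "container_name": "..."
    }
}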

Here is Program.cs, which contains the relevant part of the Serilog configuration (ExceptionAsObjectJsonFormatter inherits from ElasticsearchJsonFormatter):

.UseSerilog((ctx, config) =>
{
    var shouldFormatElastic = ctx.Configuration.GetValue<bool>("LOG_ELASTICFORMAT", false);
    config
        .ReadFrom.Configuration(ctx.Configuration) // Read from appsettings and env, cmdline
        .Enrich.FromLogContext()
        .Enrich.WithExceptionDetails();

    var logFormatter = new ExceptionAsObjectJsonFormatter(renderMessage: true);
    var logMessageTemplate = "[{Timestamp:HH:mm:ss} {Level:u3}] {Message:lj}{NewLine}{Exception}";

    if (shouldFormatElastic)
        config.WriteTo.Console(logFormatter, standardErrorFromLevel: LogEventLevel.Error);
    else
        config.WriteTo.Console(standardErrorFromLevel: LogEventLevel.Error, outputTemplate: logMessageTemplate);

})
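
For completeness, ReadFrom.Configuration(ctx.Configuration) reads its settings from a "Serilog" section of the configuration (appsettings.json, environment variables, command line); a minimal sketch of such a section (simplified, not my full configuration) looks like this:

{
    "Serilog": {
        "MinimumLevel": {
            "Default": "Information",
            "Override": {
                "Microsoft": "Warning"
            }
        }
    }
}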

Using these NuGet packages:

  • Serilog.AspNetCore
  • Serilog.Exceptions
  • Serilog.Formatting.Elasticsearch
  • Serilog.Settings.Configuration
  • Serilog.Sinks.Console

Here is what it looks like in Kibana: [screenshot]

And here is the configmap for fluent-bit:

fluent-bit-filter.conf:
[FILTER]
    Name                kubernetes
    Match               kube.*
    Kube_URL            https://kubernetes.default.svc:443
    Kube_CA_File        /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
    Kube_Token_File     /var/run/secrets/kubernetes.io/serviceaccount/token
    Merge_Log           On
    K8S-Logging.Parser  On
    K8S-Logging.Exclude On

fluent-bit-input.conf:
[INPUT]
    Name             tail
    Path             /var/log/containers/*.log
    Parser           docker
    Tag              kube.*
    Refresh_Interval 5
    Mem_Buf_Limit    5MB
    Skip_Long_Lines  On

fluent-bit-output.conf:
[OUTPUT]
    Name  es
    Match *
    Host  elasticsearch
    Port  9200
    Logstash_Format On
    Retry_Limit False
    Type  flb_type
    Time_Key @timestamp
    Replace_Dots On
    Logstash_Prefix kubernetes_cluster

fluent-bit-service.conf:
[SERVICE]
    Flush        1 
    Daemon       Off
    Log_Level    info
    Parsers_File parsers.conf

fluent-bit.conf:
@INCLUDE fluent-bit-service.conf
@INCLUDE fluent-bit-input.conf
@INCLUDE fluent-bit-filter.conf
@INCLUDE fluent-bit-output.conf

parsers.conf:

But I have also tried https://raw.githubusercontent.com/fluent/fluent-bit-kubernetes-logging/master/output/elasticsearch/fluent-bit-configmap.yaml with my modifications.

I installed fluent-bit with Helm:

helm install stable/fluent-bit --name=fluent-bit --namespace=logging --set backend.type=es --set backend.es.host=elasticsearch --set on_minikube=true
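
The same chart settings can also be written as a values file (a sketch equivalent to the --set flags above) and passed with -f:

backend:
  type: es
  es:
    host: elasticsearch
on_minikube: true

helm install stable/fluent-bit --name=fluent-bit --namespace=logging -f values.yaml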

I am also getting a lot of errors like the following:

log:{"took":0,"errors":true,"items":[{"index":{"_index":"kubernetes_cluster-2019.03.22","_type":"flb_type","_id":"YWCOp2kB4wEngjaDvxNB","status":400,"error":{"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"json_parse_exception","reason":"Duplicate field '@timestamp' at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@432f75a7; line: 1, column: 1248]"}}}}]}

and

log:[2019/03/22 22:38:57] [error] [out_es] could not pack/validate JSON response stream:stderr

as I can see in Kibana.

5 votes

imbageek Points 49

The problem was a bad fluent-bit configmap. This one works:

apiVersion: v1
kind: ConfigMap
metadata:
  name: fluent-bit-config
  namespace: logging
  labels:
    k8s-app: fluent-bit
data:
  # Configuration files: server, input, filters and output
  # ======================================================
  fluent-bit.conf: |
    [SERVICE]
        Flush         1
        Log_Level     info
        Daemon        off
        Parsers_File  parsers.conf
        HTTP_Server   On
        HTTP_Listen   0.0.0.0
        HTTP_Port     2020        
    @INCLUDE input-kubernetes.conf
    @INCLUDE filter-kubernetes.conf
    @INCLUDE output-elasticsearch.conf
  input-kubernetes.conf: |
    [INPUT]
        Name              tail
        Tag               kube.*
        Path              /var/log/containers/*.log
        Parser            docker
        DB                /var/log/flb_kube.db
        Mem_Buf_Limit     5MB
        Skip_Long_Lines   On
        Refresh_Interval  10
  filter-kubernetes.conf: |
    [FILTER]
        Name                kubernetes
        Match               kube.*
        Kube_URL            https://kubernetes.default.svc:443
        # These two may fix the duplicate field exception
        Merge_Log           On
        Merge_JSON_Key      k8s
        K8S-Logging.Parser  On
        K8S-Logging.exclude True
  output-elasticsearch.conf: |
    [OUTPUT]
        Name            es
        Match           *
        Host            ${FLUENT_ELASTICSEARCH_HOST}
        Port            ${FLUENT_ELASTICSEARCH_PORT}
        Logstash_Format On
        # This fixes errors where kubernetes.apps.name must be an object
        Replace_Dots    On 
        Retry_Limit     False
        Type            flb_type
        # This may fix the duplicate field exception
        Time_Key        @timestamp_es
        # The Index Prefix:
        Logstash_Prefix logstash_07
  parsers.conf: |
    [PARSER]
        Name   apache
        Format regex
        Regex  ^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   apache2
        Format regex
        Regex  ^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^ ]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   apache_error
        Format regex
        Regex  ^\[[^ ]* (?<time>[^\]]*)\] \[(?<level>[^\]]*)\](?: \[pid (?<pid>[^\]]*)\])?( \[client (?<client>[^\]]*)\])? (?<message>.*)$
    [PARSER]
        Name   nginx
        Format regex
        Regex ^(?<remote>[^ ]*) (?<host>[^ ]*) (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   json
        Format json
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name        docker
        Format      json
        #Time_Key    time
        Time_Key    @timestamp
        Time_Format %Y-%m-%dT%H:%M:%S.%L
        Time_Keep   Off # on
        # See: https://fluentbit.io/documentation/0.14/parser/decoder.html
        # Command      |  Decoder | Field | Optional Action
        # =============|==================|=================
        # Decode_Field_As   escaped    log
        # Decode_Field_As   escaped    log    do_next
        # Decode_Field_As   json       log     
    [PARSER]
        Name        syslog
        Format      regex
        Regex       ^\<(?<pri>[0-9]+)\>(?<time>[^ ]* {1,2}[^ ]* [^ ]*) (?<host>[^ ]*) (?<ident>[a-zA-Z0-9_\/\.\-]*)(?:\[(?<pid>[0-9]+)\])?(?:[^\:]*\:)? *(?<message>.*)$
        Time_Key    time
        Time_Format %b %d %H:%M:%S
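
The two settings that matter for the "Duplicate field '@timestamp'" error are Merge_JSON_Key k8s in the filter and Time_Key @timestamp_es in the output. Serilog already writes an @timestamp field into each log line; with Merge_JSON_Key the kubernetes filter nests the parsed JSON under a k8s key instead of merging it into the root of the record, and with Time_Key @timestamp_es the es output stores its own Logstash-style timestamp under a different name, so the two no longer collide. Roughly (an illustrative sketch, not an actual indexed document), a record then looks like this:

{
    "@timestamp_es": "2019-03-22T21:08:24.000Z",
    "stream": "stdout",
    "log": "{ ...the original Serilog line... }",
    "k8s": {
        "@timestamp": "2019-03-22T22:08:24.6499272+01:00",
        "level": "Fatal",
        "message": "Text: \"aaaa\"",
        "fields": { ... }
    },
    "kubernetes": {
        "pod_name": "...",
        "namespace_name": "..."
    }
}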
