How does OpenShift output JSON logs


  1. Red Hat OpenShift 4.8 environment cluster building
  2. How does openshift output json logs
  3. openshift Vertical Pod Autoscaler practice
  4. openshift Certified Helm Charts Practice
  5. openshift creates a serverless application
  6. openshift gitops practice
  7. openshift Tekton pipeline practice

1. JSON Logging

Version 4.8 of the Red Hat® OpenShift® Container Platform reintroduces JSON logging support in the logging solution.
Customers can now control precisely which container logs are treated as JSON. They can tag common JSON schemas so that log management systems (the Elasticsearch instance managed by Red Hat or third-party systems) know exactly what to do with those logs.

When a parse: json field is added to a pipeline of the ClusterLogForwarder custom resource, the parsed JSON is copied into a new top-level field prefixed with structured.
When such pipelines store log data in the integrated Elasticsearch cluster, a new Elasticsearch index is created for each new pipeline.
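As a rough illustration of this behavior, here is a minimal Python sketch. The function forward_record is hypothetical (the real collector is not implemented this way); it only shows how a parse: json pipeline adds the structured field and how the Elasticsearch index prefix is derived from structuredTypeKey or structuredTypeName:

```python
import json

def forward_record(record, structured_type_key=None, structured_type_name=None):
    """Hypothetical sketch of a `parse: json` pipeline (illustration only)."""
    out = dict(record)
    try:
        parsed = json.loads(record.get("message", ""))
    except ValueError:
        parsed = None
    if isinstance(parsed, dict):
        # The parsed JSON becomes a new top-level `structured` field.
        out["structured"] = parsed
    # Resolve structuredTypeKey, e.g. "kubernetes.namespace_name"
    # -> record["kubernetes"]["namespace_name"]; fall back to
    # structuredTypeName when the key is absent from the record.
    key_value = None
    if structured_type_key:
        node = out
        for part in structured_type_key.split("."):
            node = node.get(part) if isinstance(node, dict) else None
        key_value = node
    structured_type = key_value or structured_type_name
    # Each distinct structured type gets its own index; the real index
    # names carry additional suffixes, so only the prefix is shown here.
    prefix = f"app-{structured_type}" if structured_type else "app"
    return out, prefix

record = {
    "message": '{"level": "info", "msg": "order ready"}',
    "kubernetes": {"namespace_name": "dev-coffeeshop"},
}
out, prefix = forward_record(record, structured_type_key="kubernetes.namespace_name")
print(out["structured"]["level"])  # info
print(prefix)                      # app-dev-coffeeshop
```

The resulting prefix is what the Kibana index pattern app-dev-coffeeshop-* used later in this exercise matches.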

The cluster's local Elasticsearch server takes a significant performance hit with every new index created.

Goals:

  • Demonstrate structured JSON logging
  • Examine a ClusterLogForwarder resource customized for the dev-coffeeshop project
  • Display structured logs in Kibana
  • Explain how to send structured logs to other log collectors

2. Add Logging for dev-coffeeshop Namespace

For this exercise, you will use the following ClusterLogForwarder configuration, which is already deployed in your cluster.
To view the ClusterLogForwarder configuration:

1. In the OpenShift Container Platform web console, select Project: All Projects.
2. Use the perspective switcher to switch to the Administrator perspective and click Search.
3. In the Resources drop-down list, select ClusterLogForwarder.
4. Click the CLF instance and then click YAML on the page that appears.

Examine this configuration to see what is new for creating JSON logs:

apiVersion: logging.openshift.io/v1
kind: ClusterLogForwarder
metadata:
  name: instance
  namespace: openshift-logging
spec:
  inputs:
  - application:
      namespaces:
      - dev-coffeeshop
    name: dev-coffeeshop-input-example
  outputDefaults:
    elasticsearch:
      structuredTypeKey: kubernetes.namespace_name
      # OR
      structuredTypeName: dev-coffeeshop-index-name
  pipelines:
  - inputRefs:
    - dev-coffeeshop-input-example
    name: pipeline-dev-coffeeshop
    outputRefs:
    - default
    parse: json
  - inputRefs:
    - infrastructure
    - application
    - audit
    outputRefs:
    - default
  • A custom input that captures logs from the dev-coffeeshop namespace.
  • The default output is the cluster's own Elasticsearch, not a remote server.
  • structuredTypeKey builds the structured log index in Elasticsearch from the namespace name.
  • structuredTypeName is a user-defined string that names the index; structuredTypeKey takes precedence over it if present.
  • A pipeline that enables JSON parsing.
  • A pipeline for all the common inputs, without JSON parsing.

Every new JSON log pipeline for Elasticsearch creates a new index in Elasticsearch. From the command line, you can get a quick list of Elasticsearch indices, where $es_pod is the name of one of the Elasticsearch pods in the openshift-logging namespace: oc exec $es_pod -c elasticsearch -- indices

Later, we will see structured logs available for the cafe application in the dev-coffeeshop namespace.

You will write a query in Kibana on the structured.message field to get only the log messages from it.

3. Check structured logs in Kibana

In this exercise, you access the Kibana web console via the Kibana route in the openshift-logging namespace and query structured logs.

3.1 Open Kibana Route

1. Find the Kibana URL as a route from the Administrator perspective of the OpenShift Container Platform web console, making sure Project: openshift-logging is selected.
2. In the navigation menu, navigate to Networking → Routes, and look for the Location of the kibana route.
3. Click the location to open it.

3.2 Build index mode

  1. In the Kibana web console, click Management.
  2. Click Index Patterns.
  3. Click Create index pattern.
  4. In the Index pattern field, enter app-dev-coffeeshop-*, then click Next Step.
  5. In the Time Filter field, select @timestamp.
  6. Click Create index pattern.

3.3 Querying Elasticsearch with Kibana

  • Click Discover.
  • If app-dev-coffeeshop-* is not already selected, use the drop-down list on the left to select it.
  • On the first log entry found, click the right arrow ▶ to view the log details.
  • At the bottom of the list, find the structured entries, including structured.hostName, structured.message, and structured.level.

Each log input type with a different JSON format creates a new index in the Elasticsearch database to handle the different structured.* data. You can save resources by organizing logs around standard JSON formats such as Apache's, so that applications sharing a schema also share an index.
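To make the resource cost concrete, here is a small Python sketch. The function structured_indices is purely illustrative (not collector code, and it uses a flat "namespace" key rather than the real kubernetes.namespace_name); it counts how many index prefixes a set of records would produce, depending on whether you key on the namespace or share one schema name:

```python
def structured_indices(records, structured_type_key=None, structured_type_name=None):
    """Illustration only: one Elasticsearch index per distinct structured type."""
    types = set()
    for rec in records:
        value = rec.get(structured_type_key) if structured_type_key else None
        types.add(value or structured_type_name)
    return {f"app-{t}" for t in types if t}

records = [{"namespace": "dev-coffeeshop"}, {"namespace": "prod-coffeeshop"}]

# Keying on the namespace: one index per namespace (two indices here).
per_namespace = structured_indices(records, structured_type_key="namespace")

# Sharing one schema name, e.g. "apache": a single index for both apps.
shared_schema = structured_indices(records, structured_type_name="apache")

print(len(per_namespace), len(shared_schema))  # 2 1
```

The fewer distinct structured types you create, the fewer indices the local Elasticsearch has to maintain.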

4. Clean up log query to facilitate troubleshooting

Applications are constantly emitting messages, so you need a way to clear the clutter and see only the messages and query results that make sense for your situation. Structured JSON logging makes this easier.

First, look at a messy message where the data is not obvious and cannot be searched:

  • Under Available fields, find *t* message and click add below it.
  • Observe that the log output is easier to read, but the messages are still vague; this is typical when looking at raw message output.

Collated information:

  • Under Selected fields, find *t* message and click remove.
  • Under Available fields, find and click add for *t* structured.message, *t* structured.origin, and *t* structured.type.
  • Note that only the key data you need to make a decision is shown.

It would be nice if there were some sort of orderID to correlate the order details of origin and type with the order statuses of FINISHED and COLLECTED. What code changes would you ask your developers to make in order to achieve this goal?
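One possible answer, sketched in Python under the assumption that the coffee-shop services can write one JSON object per line to stdout. The helper log_event and its field names such as orderId are hypothetical, not part of the actual application:

```python
import io
import json

def log_event(order_id, origin, event_type, status, stream):
    """Hypothetical helper: emit one JSON object per line so that a
    `parse: json` pipeline indexes each field under structured.*."""
    entry = {
        "orderId": order_id,  # shared key that lets Kibana correlate events
        "origin": origin,
        "type": event_type,
        "status": status,
    }
    stream.write(json.dumps(entry) + "\n")

# Two services logging the same order with a shared orderId:
buf = io.StringIO()
log_event("o-1001", "barista", "order-status", "FINISHED", buf)
log_event("o-1001", "counter", "order-status", "COLLECTED", buf)
events = [json.loads(line) for line in buf.getvalue().splitlines()]
print(events[0]["orderId"] == events[1]["orderId"])  # True
```

In Kibana you could then filter on structured.orderId to see every event for a single order, regardless of which service emitted it.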

Later in this course, you will add the prod-coffeeshop namespace to this logging pipeline and update the query.
Now, let's take a look at another important feature of OpenShift Container Platform 4.8: Certified Helm Charts.

Origin blog.csdn.net/xixihahalelehehe/article/details/123335781