Logging

Open source log management

The Elastic Stack (sometimes known as the ELK Stack) is the most popular open source logging platform. And for good reason:

Explore logging with Elastic. Give it a try.

Getting started is easy

With built-in support for common data sources and default dashboards, the Elastic Stack is all about the out-of-the-box experience. Ship logs with Filebeat and Winlogbeat, index them into Elasticsearch, and visualize it all in Kibana within minutes. Skip ahead to the next sections to get started. (And if you can't find the module you need, build your own or tap into the knowledge of the community. Open source makes it possible.)
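To give a flavor of how little configuration the ship-index-visualize flow needs, here is a minimal, hypothetical filebeat.yml sketch; the localhost hosts and the system module are illustrative assumptions, not values from this page:

filebeat.modules:
  - module: system            # assumption: ship syslog/auth logs via the system module
output.elasticsearch:
  hosts: ["localhost:9200"]   # assumption: a local Elasticsearch
setup.kibana:
  host: "localhost:5601"      # assumption: a local Kibana, so `filebeat setup` can load dashboards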

You can follow everything in real time

With Elasticsearch at the core of the Elastic Stack, you get fast response times even at large scale. Ask a question and get an answer in the blink of an eye. Don't waste time waiting ... for your dashboards ... to load.

(Animation: side-by-side search comparison in which "Other" is still scanning with zero matches while Elastic has already returned hits for the same query.)

It scales with you. Send a few files to the queue ... or a few billion.

The user experience you get on a single laptop is the same one you'll get on hundreds of machines holding a petabyte of data. Save yourself the headache of re-architecting your system.

And don't worry about valuable data sitting unused. Process and index what matters to you.

Try it out now

Grab a fresh installation and start shipping and visualizing logs.
System logs (self-managed)

In Elasticsearch install directory:

Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
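The exact commands behind each step are not reproduced in this text. As a rough sketch, assuming .tar.gz installs of Elasticsearch, Kibana, and Filebeat 6.x with security enabled (and assuming the system module is what feeds the dashboards described below):

# In the Elasticsearch install directory: start Elasticsearch
bin/elasticsearch

# Once it is up (separate window, same directory): generate passwords for the built-in users;
# note the values it prints for the elastic and kibana users
bin/elasticsearch-setup-passwords auto

# In the Kibana install directory (after editing config/kibana.yml as shown above): start Kibana
bin/kibana

# In the Filebeat install directory (after editing filebeat.yml as shown above):
./filebeat modules enable system   # assumption: the module behind these dashboards
./filebeat setup                   # loads the index template, index pattern, and dashboards
./filebeat -e                      # start shipping logs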
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing audit event types, accounts, and commands.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

Apache logs (self-managed)

In Elasticsearch install directory:

Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
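The module-specific commands are not reproduced here either; assuming Filebeat 6.x and its apache2 module (renamed apache in later versions), the sketch would be:

./filebeat modules enable apache2   # assumption: the module behind the Apache access/error dashboards
./filebeat setup
./filebeat -e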
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring common URLs, response codes, and user agent stats.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

MySQL logs (self-managed)

In Elasticsearch install directory:

Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
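As with the other modules, a plausible sketch of the missing commands, assuming Filebeat 6.x and its mysql module:

./filebeat modules enable mysql   # assumption: the module behind the MySQL overview dashboard
./filebeat setup
./filebeat -e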
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Open dashboard:
"[Filebeat MySQL] Overview"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring database queries, error messages, and events over time.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

Docker logs (self-managed)

In Elasticsearch install directory:

Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
In Filebeat install directory inside a Docker container:

Modify filebeat.yml to send logs enhanced with Docker metadata to Elastic

filebeat.prospectors:
- type: log
  paths:
    - '/var/lib/docker/containers/*/*.log'
  json.message_key: log
  json.keys_under_root: true
processors:
- add_docker_metadata: ~
output.elasticsearch:
  hosts: ["<elasticsearch_url>:9200"]
  username: "elastic"
  password: "<es_pw>"
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Go to Discover to search logs for your application or service running in Docker
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default directory where Docker puts logs from your applications (/var/lib/docker/containers/*/*.log), and enhanced them with Docker container metadata. You can now look at logs from Docker in one central place in Kibana.

Didn't work for you?

Filebeat Docker metadata processor can be tuned further for your use case. See the documentation for more information.
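A hedged sketch of what such tuning can look like; the option names below are taken from the Filebeat add_docker_metadata documentation as I recall it, and the socket path is just the Docker default, so verify both against your Filebeat version:

processors:
- add_docker_metadata:
    host: "unix:///var/run/docker.sock"   # where to query container metadata
    match_source: true                    # match containers by the log file's source path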

Kubernetes logs (self-managed)

In Elasticsearch install directory:

Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
From your machine or wherever you run kubectl:
  • Download filebeat-kubernetes.yml
  • Edit filebeat-kubernetes.yml to point to your Elasticsearch instance with credentials
env:
  - name: ELASTICSEARCH_USERNAME
    value: elastic
  - name: ELASTICSEARCH_PASSWORD
    value: changeme
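Deploying the edited manifest is then a kubectl step; a minimal sketch, assuming the stock filebeat-kubernetes.yml manifest (which, at the time, created a DaemonSet named filebeat in the kube-system namespace):

kubectl create -f filebeat-kubernetes.yml
kubectl --namespace=kube-system get ds/filebeat   # verify the DaemonSet is running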
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Go to Discover to search your logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring the logs from your apps and services running in Kubernetes.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

Windows event logs (self-managed)

Download the Elasticsearch MSI installer, the Kibana .zip file, and the Winlogbeat .zip file.

Run through the Elasticsearch MSI installer (leave X-Pack checked).

In Elasticsearch install directory:

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

Extract the contents of the Kibana .zip file, and in that directory:

Modify kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
Extract the contents of the Winlogbeat .zip file, and in that directory:

Modify winlogbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
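A sketch of the Winlogbeat commands themselves, assuming the standard .zip distribution and an elevated PowerShell prompt in the Winlogbeat directory:

.\install-service-winlogbeat.ps1   # install Winlogbeat as a Windows service (script ships in the .zip)
.\winlogbeat.exe setup             # load the index template and Kibana dashboards
Start-Service winlogbeat           # start shipping Windows event logs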
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Open dashboard:
"Winlogbeat Dashboard"
What just happened?

Winlogbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring Windows event log information.

Didn't work for you?

Winlogbeat module assumes default settings for Windows event logging. See the documentation for supported versions and configuration options.

App logs (self-managed)

In Elasticsearch install directory:

Once Elasticsearch starts, in Elasticsearch install directory (separate window):

Note the password for elastic user as <es_pw>

Note the password for kibana user as <kibana_pw>

In Kibana install directory:

Modify config/kibana.yml to set credentials for Elasticsearch

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
Open browser @
http://<kibana_url>:5601 (login: elastic/<es_pw>)
Go to Discover to search your app logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default path (/var/log). You can change the paths in the filebeat.yml config file. You can now look at logs in one central place in Kibana.

Didn’t work for you?

See the documentation for how to configure Filebeat to look at other files and directories.
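A minimal sketch of pointing Filebeat at additional files, using the 6.x-era filebeat.prospectors syntax that appears elsewhere on this page (newer versions call it filebeat.inputs); the /var/log/myapp path is a hypothetical example:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
    - /var/log/myapp/*.log   # example: your application's own log directory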

System logs (Elastic Cloud)

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
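Since the walkthrough has you note a Cloud ID, an alternative to spelling out the Elasticsearch endpoint is to let Filebeat resolve it from that ID; a sketch using the cloud.id and cloud.auth settings available in Beats 6.x and later, with the placeholders noted above:

cloud.id: "<cloud.id>"
cloud.auth: "elastic:<password>"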
				
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"[Filebeat System] Syslog dashboard"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing audit event types, accounts, and commands.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

Apache logs (Elastic Cloud)

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
            
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"[Filebeat Apache2] Access and error logs"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring common URLs, response codes, and user agent stats.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

MySQL logs (Elastic Cloud)

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
                
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"[Filebeat MySQL] Overview"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring database queries, error messages, and events over time.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for supported versions and configuration options.

Docker logs (Elastic Cloud)

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to send logs enhanced with Docker metadata to Elastic

filebeat.prospectors:
- type: log
  paths:
    - '/var/lib/docker/containers/*/*.log'
  json.message_key: log
  json.keys_under_root: true
processors:
- add_docker_metadata: ~
output.elasticsearch:
  hosts: ["<elasticsearch_url>:9200"]
  username: "elastic"
  password: "<password>"
                
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Go to Discover to search logs for your application or service running in Docker
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default directory where Docker puts logs from your applications (/var/lib/docker/containers/*/*.log), and enhanced them with Docker container metadata. You can now look at logs from Docker in one central place in Kibana.

Didn't work for you?

Filebeat Docker metadata processor can be tuned further for your use case. See the documentation for more information.

Kubernetes logs (Elastic Cloud)

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
                

From your machine or wherever you run kubectl:

env:
  - name: ELASTICSEARCH_USERNAME
    value: elastic
  - name: ELASTICSEARCH_PASSWORD
    value: changeme
                
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Go to Discover to search your logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring the logs from your apps and services running in Kubernetes.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

Windows event logs (Elastic Cloud)

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download the Winlogbeat .zip file.

Extract the contents of the Winlogbeat .zip file, and in that directory:

Modify winlogbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
                
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Open dashboard:
"Winlogbeat Dashboard"
What just happened?

Winlogbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start exploring Windows event log information.

Didn't work for you?

Winlogbeat module assumes default settings for Windows event logging. See the documentation for supported versions and configuration options.

App logs (Elastic Cloud)

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
                
Open browser @
http://<kibana_url>:5601 (login: elastic/<password>)
Go to Discover to search your app logs
What just happened?

Filebeat created an index pattern in Kibana with defined fields for logs residing in the default path (/var/log). You can change the paths in the filebeat.yml config file. You can now look at logs in one central place in Kibana.

Didn't work for you?

See the documentation for how to configure Filebeat to look at other files and directories.

Add machine learning to automate anomaly detection

You shouldn't have to personally attend to every single log message or transaction, only the important or noteworthy ones.

Elastic's machine learning features extend the Elastic Stack to automatically model the behavior of your Elasticsearch data and notify you of problems in real time.

See for yourself

At telecom giant Sprint, system administrators used to comb through logs, run shell scripts, and grep for what they already knew. Today they use Elastic to fix performance problems in no time, increase customer satisfaction, simplify B2B relationships, and optimize retail systems.

They're not the only company using Elastic for logging. You'll find more customer stories here.

There's life beyond logs

Got metrics? Proxy or firewall logs? Documents with mountains of text? Centralize all of that data in the Elastic Stack for richer analysis, lower operating costs, and a simplified architecture.

Metrics

Gain insight into the numbers: CPU, memory, and more.

Learn more

Site Search

Optimize the search experience on your website.

Learn more

Security analytics

Interactive investigation, fast and at scale.

Learn more

APM

Gain insight into your application performance.

Learn more

App Search

Search across documents, geo data, and more.

Learn more