Security Analytics

Threats don't follow templates. Neither should you. The Elastic Stack gives you the edge you need to keep pace with the attack vectors of today and tomorrow. Here's how.


First, You've Got to Be Really Fast

Attacks aren't a matter of if, but when. So, ask yourself, how long do you want the adversary in your system?

Elastic is designed for speed, and indexes your data as it’s ingested. This shrinks your time-to-information to seconds, and makes running ad hoc queries and real-time visualization easy.

Don’t Throw Data Out, Throw It All In

The key to detecting a threat can come from anywhere. So having a complete picture of what's going on across your systems in real time matters.

Elasticsearch eats petabyte-scale data for breakfast — from firewalls, web proxies, detection systems, any source you like really — so don't hold back.


Keep Data Online Longer for Investigation

When did they get in? Where did they go? What did they do? What else is compromised?

To answer these questions, seven days does not a proper historical look back make. Average threats can incubate for 100 days before they're resolved. Elastic makes searching through long-term historical data like this not only possible, but practical, easy, and fast.

Build Something New, Enhance Your SIEM

Start from a blank slate and create a home-grown security solution like Slack did, or augment an existing SIEM investment like USAA did. Elastic is, well, flexible. And if you don't see what you need, build it or leverage the community. You're not locked in. That's open source for the win.

Try It Out

Start small. Go big. Your choice. Grab a fresh installation and discover what you can uncover.
In the Elasticsearch install directory, start Elasticsearch. Once Elasticsearch starts, set up passwords from the same directory in a separate window.

Note the password for the elastic user as <es_pw>

Note the password for the kibana user as <kibana_pw>
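The steps above correspond to starting Elasticsearch and generating passwords for the built-in users. A sketch, assuming a default archive (tar.gz/zip) installation; script names can vary by version and platform:

```shell
# From the Elasticsearch install directory: start Elasticsearch in the foreground.
bin/elasticsearch

# In a separate window, once Elasticsearch is up, auto-generate passwords for
# the built-in users; record the elastic and kibana passwords it prints.
bin/elasticsearch-setup-passwords auto
```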

In the Kibana install directory, modify config/kibana.yml to set the credentials Kibana uses to connect to Elasticsearch, then start Kibana:

elasticsearch.username: "kibana"
elasticsearch.password: "<kibana_pw>"
In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
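The Filebeat side of this walkthrough also needs the system module enabled and the shipper started. A sketch using the standard Filebeat CLI, run from the Filebeat install directory (binary name and path vary by platform):

```shell
# Enable the system module (system logs and authentication events).
./filebeat modules enable system

# Load the index template, field definitions, and Kibana dashboards.
./filebeat setup

# Run Filebeat in the foreground, logging to stderr.
./filebeat -e
```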
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Open dashboard:
"[Filebeat System] SSH login attempts" or "[Filebeat System] Sudo commands"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system log messages and reporting on SSH login attempts and other authentication events.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

Set up Elasticsearch and Kibana exactly as in the first walkthrough above: start Elasticsearch, note the elastic (<es_pw>) and kibana (<kibana_pw>) passwords, and set the Kibana credentials in config/kibana.yml.
In Auditbeat install directory:

Modify auditbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
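Auditbeat likewise needs its dashboards loaded and the shipper started. A sketch using the standard Auditbeat CLI, run from the Auditbeat install directory (reading the kernel audit framework typically requires root):

```shell
# Load the index template, field definitions, and Kibana dashboards.
./auditbeat setup

# Run Auditbeat in the foreground, logging to stderr.
sudo ./auditbeat -e
```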
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
What just happened?

Auditbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system audit information.

Didn't work for you?

Auditbeat module assumes default operating system configuration. See the documentation for more details.

Set up Elasticsearch and Kibana exactly as in the first walkthrough above: start Elasticsearch, note the elastic (<es_pw>) and kibana (<kibana_pw>) passwords, and set the Kibana credentials in config/kibana.yml.
In Logstash install directory:

Modify logstash.yml to set NetFlow module details

modules:
- name: netflow
  var.input.udp.port: <netflow_port>
  var.elasticsearch.username: "elastic"
  var.elasticsearch.password: "<es_pw>"
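With the module block in logstash.yml, Logstash still has to be started. A sketch (--setup is the standard Logstash flag that loads the module's index template and Kibana dashboards on first run):

```shell
# From the Logstash install directory: start Logstash with the NetFlow module
# configured in logstash.yml; --setup is only needed on the first run.
bin/logstash --setup
```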
			

Configure NetFlow to export flow events to Logstash via UDP on default port 2055.

Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Open dashboard:
"Netflow: Overview"
What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing NetFlow events.

Didn't work for you?

The Logstash NetFlow module makes a set of assumptions around the default NetFlow configuration; however, you can override the defaults. See the documentation for more details.

Set up Elasticsearch and Kibana exactly as in the first walkthrough above: start Elasticsearch, note the elastic (<es_pw>) and kibana (<kibana_pw>) passwords, and set the Kibana credentials in config/kibana.yml.
In Packetbeat install directory:

Modify packetbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<es_pw>"
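Packetbeat then needs its dashboards loaded and capture started. A sketch using the standard Packetbeat CLI, run from the Packetbeat install directory (packet capture typically requires root):

```shell
# Load the index template, field definitions, and Kibana dashboards.
./packetbeat setup

# Run Packetbeat in the foreground, logging to stderr.
sudo ./packetbeat -e
```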
Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Open dashboard:
"[Packetbeat] DNS Tunneling"
What just happened?

Packetbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing details of your DNS traffic.

Didn't work for you?

Packetbeat makes a set of assumptions around defaults, such as default network ports. See the documentation for more details on how to further configure your deployment.

Set up Elasticsearch and Kibana exactly as in the first walkthrough above: start Elasticsearch, note the elastic (<es_pw>) and kibana (<kibana_pw>) passwords, and set the Kibana credentials in config/kibana.yml.
In Logstash install directory:

Modify logstash.yml to set ArcSight module details

modules:
- name: arcsight
  var.inputs: smartconnector
  var.elasticsearch.username: "elastic"
  var.elasticsearch.password: "<es_pw>"
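As with the NetFlow module, Logstash has to be started after logstash.yml is edited. A sketch (--setup loads the ArcSight index template and dashboards on first run):

```shell
# From the Logstash install directory: start Logstash with the ArcSight module
# configured in logstash.yml; --setup is only needed on the first run.
bin/logstash --setup
```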
			

Configure Smart Connectors to send CEF events to Logstash via TCP on default port 5000.

Open browser @
http://localhost:5601 (login: elastic/<es_pw>)
Open dashboard:
"[ArcSight] Network Overview Dashboard"
What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing ArcSight events.

Didn't work for you?

The Logstash ArcSight module makes a set of assumptions around the default configuration of the ArcSight solution; however, you can override the defaults. See the documentation for more details.

  • Register, if you do not already have an account
  • Log into the Elastic Cloud console
To create a cluster, in Elastic Cloud console:
  • Select Create Cluster, leave size slider at 4 GB RAM, and click Create
  • Note the Cloud ID as <cloud.id>
  • Note the cluster Password as <password>
  • In Overview >> Endpoints section note Kibana URL as <kibana_url>
  • Wait until cluster plan completes

Download and unpack Filebeat

In Filebeat install directory:

Modify filebeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
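On Elastic Cloud, Filebeat can also be pointed at the cluster via the Cloud ID noted earlier rather than an explicit host list. A sketch of the relevant filebeat.yml fragment (cloud.id and cloud.auth are the standard Beats cloud settings; the same fragment applies to Auditbeat and Packetbeat):

```yaml
cloud.id: "<cloud.id>"
cloud.auth: "elastic:<password>"
```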
				
Open browser @
<kibana_url> (login: elastic/<password>)
Open dashboard:
"[Filebeat System] SSH login attempts" or "[Filebeat System] Sudo commands"
What just happened?

Filebeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system log messages and reporting on SSH login attempts and other authentication events.

Didn't work for you?

Filebeat module assumes default log locations, unmodified file formats, and supported versions of the products generating the logs. See the documentation for more details.

Create an Elastic Cloud cluster exactly as in the first cloud walkthrough above, noting the Cloud ID (<cloud.id>), the cluster password (<password>), and the Kibana URL (<kibana_url>).

Download and unpack Auditbeat

In Auditbeat install directory:

Modify auditbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
				
Open browser @
<kibana_url> (login: elastic/<password>)
Open dashboard:
"[Auditbeat File] File Integrity"
What just happened?

Auditbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing the latest system audit information.

Didn't work for you?

The Auditbeat module assumes a default operating system configuration. See the documentation for more details.

Create an Elastic Cloud cluster exactly as in the first cloud walkthrough above, noting the Cloud ID (<cloud.id>), the cluster password (<password>), and the Kibana URL (<kibana_url>).

Download and unpack Logstash

In Logstash install directory:

Run bin/logstash-plugin install x-pack to install the X-Pack plugin, then modify logstash.yml to set NetFlow module details

modules:
- name: netflow
  var.input.udp.port: <netflow_port>
  var.elasticsearch.username: "elastic"
  var.elasticsearch.password: "<password>"
				

Configure NetFlow to export flow events to Logstash via UDP on default port 2055.

Open browser @
<kibana_url> (login: elastic/<password>)
Open dashboard:
"Netflow: Overview"
What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing NetFlow events.

Didn't work for you?

The Logstash NetFlow module makes a set of assumptions around the default NetFlow configuration; however, you can override the defaults. See the documentation for more details.

Create an Elastic Cloud cluster exactly as in the first cloud walkthrough above, noting the Cloud ID (<cloud.id>), the cluster password (<password>), and the Kibana URL (<kibana_url>).

Download and unpack Packetbeat

In Packetbeat install directory:

Modify packetbeat.yml to set credentials for Elasticsearch output

output.elasticsearch:
  username: "elastic"
  password: "<password>"
				
Open browser @
<kibana_url> (login: elastic/<password>)
Open dashboard:
"[Packetbeat] DNS Tunneling"
What just happened?

Packetbeat created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing details of your DNS traffic.

Didn't work for you?

Packetbeat makes a set of assumptions around defaults, such as default network ports. See the documentation for more details on how to further configure your deployment.

Create an Elastic Cloud cluster exactly as in the first cloud walkthrough above, noting the Cloud ID (<cloud.id>), the cluster password (<password>), and the Kibana URL (<kibana_url>).

Download and unpack Logstash

In Logstash install directory:

Modify logstash.yml to set ArcSight module details

modules:
- name: arcsight
  var.inputs: smartconnector
  var.elasticsearch.username: "elastic"
  var.elasticsearch.password: "<password>"
				

Configure Smart Connectors to send CEF events to Logstash via TCP on default port 5000.

Open browser @
<kibana_url> (login: elastic/<password>)
Open dashboard:
"[ArcSight] Network Overview Dashboard"
What just happened?

Logstash created an index pattern in Kibana with defined fields, searches, visualizations, and dashboards. In a matter of minutes you can start viewing ArcSight events.

Didn't work for you?

The Logstash ArcSight module makes a set of assumptions around the default configuration of the ArcSight solution; however, you can override the defaults. See the documentation for more details.

Automate Anomaly Detection, Explore Curious Connections

How do you keep up with billions of signatures? Or surface meaningful connections across millions of IP addresses? Add machine learning and graph analytics to your Elastic equation to quickly detect cyber threats — the ones you expected and the ones you didn't — in all the noise.

You'll Be in Good Company

USAA started with a few Elasticsearch nodes in its security lab and now has a full production deployment augmenting its ArcSight SIEM. Previously, USAA first responders waited minutes (or even hours) for log management appliance queries to produce output for threat-hunting analysis. With Elastic, they don't anymore.

They're not the only ones managing security events with Elastic. Explore more customer examples.

Security Analytics is More Than Just Security Events

Have metrics? Infrastructure logs? Documents with tons of text? Centralize it all into the Elastic Stack with your security events to enrich your analyses, minimize your risk, and simplify your architecture.

Logging

Fast, scalable logging that won't quit.

Learn More

Metrics

Do the numbers: CPU, memory, and more.

Learn More

Site Search

Easily create a great search experience for your site.

Learn More

APM

Get insight into your application performance.

Learn More

App Search

Search across documents, geo data, and more.

Learn More