Asma Zinneera Jabir

Reputation: 841

Kibana showing only a limited amount of data - ELK Stack

I parsed an Apache access log file using Logstash, and it parsed all the entries successfully, as seen in the command prompt window. But when I open Kibana, it shows only 8 of them. Why doesn't it show all of the parsed logs?

Updated:

I started all over again, reinstalling elasticsearch-4.2.0, logstash-2.0.0, and Kibana 4. The name of my log file is http_access_2015-03-06_log; it is being parsed and shows up in elasticsearch-kopf, but none of the logs appear in Kibana.

kopf: [screenshot]
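
As a quick cross-check, assuming Logstash is writing to its default logstash-YYYY.MM.dd indices, the document counts that kopf reports can be compared against Elasticsearch directly:

curl "http://localhost:9200/_cat/indices?v"
curl "http://localhost:9200/logstash-*/_count?pretty"

If the counts are correct here, the gap is on the Kibana side rather than in the parsing.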

Command prompt output (updated): [screenshot]

.conf file (updated):

input {
  file {
    path => "G:/MIT/level_03/Project/logstash-2.0.0/bin/tmp/*_log"
    #sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}

filter {
  #grok {
  #  match => ["path", "G:/logstash-1.5.0/bin/tmp/(?<project>[^/_logs]+)/"]
  #}
  if [path] =~ "access" {
    mutate { replace => { type => "apache_access" } }
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    date {
      match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  } else if [path] =~ "error" {
    mutate { replace => { type => "apache_error" } }
  } else {
    mutate { replace => { type => "random_logs" } }
  }
}

output {
  elasticsearch {
    # action => "index"
    hosts => "localhost"
    # index => "test"
  }
  stdout { codec => rubydebug }
}

These are some of the logs that were parsed but are not shown in Kibana:

127.0.0.1 - - [06/Mar/2015:10:26:31 +0530] "GET /carbon/ HTTP/1.1" 302 - "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.111 Safari/537.36"
127.0.0.1 - - [06/Mar/2015:10:26:31 +0530] "GET /carbon/admin/index.jsp HTTP/1.1" 302 - "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.111 Safari/537.36"
127.0.0.1 - - [06/Mar/2015:10:26:32 +0530] "GET /carbon/admin/login.jsp HTTP/1.1" 200 3398 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.111 Safari/537.36"
127.0.0.1 - - [06/Mar/2015:10:26:32 +0530] "GET /carbon/admin/css/global.css HTTP/1.1" 200 5309 "https://localhost:9443/carbon/admin/login.jsp" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.111 Safari/537.36"
127.0.0.1 - - [06/Mar/2015:10:26:32 +0530] "GET /carbon/styles/css/main.css HTTP/1.1" 200 1361 "https://localhost:9443/carbon/admin/login.jsp" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.111 Safari/537.36"
127.0.0.1 - - [06/Mar/2015:10:26:32 +0530] "GET /carbon/dialog/css/dialog.css HTTP/1.1" 200 556 "https://localhost:9443/carbon/admin/login.jsp" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.111 Safari/537.36"

Kibana Discover tab (updated): [screenshot]

kibana.yml

# Kibana is served by a back end server. This controls which port to use.
# server.port: 5601

# The host to bind the server to.
# server.host: "0.0.0.0"

# The Elasticsearch instance to use for all your queries.
# elasticsearch.url: "http://localhost:9200"

# preserve_elasticsearch_host true will send the hostname specified in `elasticsearch`. If you set it to false,
# then the host you use to connect to *this* Kibana instance will be sent.
# elasticsearch.preserveHost: true

# Kibana uses an index in Elasticsearch to store saved searches, visualizations
# and dashboards. It will create a new index if it doesn't already exist.
# kibana.index: ".kibana"

# The default application to load.
# kibana.defaultAppId: "discover"

# If your Elasticsearch is protected with basic auth, this is the user credentials
# used by the Kibana server to perform maintenance on the kibana_index at startup. Your Kibana
# users will still need to authenticate with Elasticsearch (which is proxied through
# the Kibana server)
# elasticsearch.username: user
# elasticsearch.password: pass

# SSL for outgoing requests from the Kibana Server to the browser (PEM formatted)
# server.ssl.cert: /path/to/your/server.crt
# server.ssl.key: /path/to/your/server.key

# Optional setting to validate that your Elasticsearch backend uses the same key files (PEM formatted)
# elasticsearch.ssl.cert: /path/to/your/client.crt
# elasticsearch.ssl.key: /path/to/your/client.key

# If you need to provide a CA certificate for your Elasticsearch instance, put
# the path of the pem file here.
# elasticsearch.ssl.ca: /path/to/your/CA.pem

# Set to false to have a complete disregard for the validity of the SSL
# certificate.
# elasticsearch.ssl.verify: true

# Time in milliseconds to wait for elasticsearch to respond to pings, defaults to
# request_timeout setting
# elasticsearch.pingTimeout: 1500

# Time in milliseconds to wait for responses from the back end or elasticsearch.
# This must be > 0
# elasticsearch.requestTimeout: 300000

# Time in milliseconds for Elasticsearch to wait for responses from shards.
# Set to 0 to disable.
# elasticsearch.shardTimeout: 0

# Time in milliseconds to wait for Elasticsearch at Kibana startup before retrying
# elasticsearch.startupTimeout: 5000

# Set the path to where you would like the process id file to be created.
# pid.file: /var/run/kibana.pid

# If you would like to send the log output to a file you can set the path below.
# logging.dest: stdout

# Set this to true to suppress all logging output.
# logging.silent: false

# Set this to true to suppress all logging output except for error messages.
# logging.quiet: false

# Set this to true to log all events, including system usage information and all requests.
# logging.verbose: false

P.S. The Kibana Discover tab shows all the data for "_all".

Upvotes: 0

Views: 2037

Answers (1)

Alcanzar

Reputation: 17155

Several things here...

  • You only have a grok{} filter in your access-log code path.
  • You are getting _grokparsefailure tags, so your grok { match => ["path", "G:/logstash-1.5.0/bin/tmp/(?<project>[^/_logs]+)/"] } block isn't matching.
  • You are probably running into Logstash's sincedb: file read positions are persisted between runs, so after the first run you only see records that are newly appended to the files. Locate and remove your .sincedb, or point sincedb_path at something like /dev/null, as in the sketch below.
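
A minimal sketch of the input block with the sincedb disabled, reusing the path from the question's config; since the G:/ path suggests Windows, the /dev/null equivalent there is NUL:

input {
  file {
    path => "G:/MIT/level_03/Project/logstash-2.0.0/bin/tmp/*_log"
    start_position => "beginning"
    # Discard read positions so every run re-reads the files from the top.
    # Use "/dev/null" on Linux/macOS; "NUL" is the Windows equivalent.
    sincedb_path => "NUL"
  }
}

With the sincedb disabled, start_position => "beginning" takes effect on every run rather than only the first.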

Upvotes: 0
