Geolocation mapping on Kibana

Tested with ELK 7.2

The method below was deprecated in ELK 7.6. I will post an alternative approach at some point in the future.

Recently, I attempted to get geolocation mapping working on a Kibana visualization/dashboard. It was incredibly hard because of how little documentation there is on getting this working (I guess if your sole job is maintaining an ELK stack, it’s known territory).

Anyhow, I needed to map locations of physical assets given a CSV file that contains UK postcodes.

  1. The first step was to find GPS coordinates for UK postcodes, and thankfully the UK Office for National Statistics publish a list of those details. (Look for “National Statistics Postcode Lookup (NSPL)” if you’re interested in finding the same data).
  2. In this example, my CSV file has the following headers: carMake, carModel, postcode, lat, lon. (I enriched the CSV file with the latitude and longitude prior to processing it in Logstash.)
  3. Define a field in your Elasticsearch index with the “geo_point” data type.
    1. You cannot define/cast your data into this data type from Logstash.
    2. You cannot change a field’s data type once it’s defined, so you can only do this while creating the index. This also means you have to define all the other fields you’re expecting at the same time!
    3. Create the index from the “Dev Tools” console in Kibana. See the example below, which has been tested on Elasticsearch 7.2. Note that Elasticsearch index names must be lowercase.

PUT indexcars

PUT indexcars/_mapping
{
  "properties": {
    "location":   { "type": "geo_point" },
    "lat":        { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
    "lon":        { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
    "@timestamp": { "type": "date" },
    "@version":   { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
    "carMake":    { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
    "carModel":   { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
    "postcode":   { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } },
    "tags":       { "type": "text", "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } } }
  }
}
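
A quick sanity check from the same Dev Tools console: fetch the mapping back and confirm that “location” is reported as “geo_point” before ingesting any data.

GET indexcars/_mapping

GET indexcars/_mapping/field/location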

  4. Use the following Logstash configuration file to process your CSV file:

input {
  file {
    path => "/path/to/*.csv"
    start_position => "beginning"
    # Don't persist the read position, so the file is re-read on every run (handy while testing)
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    autodetect_column_names => true
    skip_empty_columns => true
  }

  mutate {
    # Combine lat and lon into the "lat,lon" string format accepted by the geo_point field
    add_field => { "location" => "%{lat},%{lon}" }
  }
} # end of filter

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "indexcars"
  }

  stdout {}
}
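
Before pointing Logstash at a large file, it can be worth indexing a single hand-written document from the Dev Tools console to confirm that the “lat,lon” string the mutate filter builds is accepted by the geo_point field. The values below are made up purely for illustration; a malformed “location” value would be rejected with a mapping error at this point.

POST indexcars/_doc
{
  "carMake": "Ford",
  "carModel": "Fiesta",
  "postcode": "SW1A 1AA",
  "lat": "51.5010",
  "lon": "-0.1419",
  "location": "51.5010,-0.1419"
}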

  5. Once you have some data ingested, go to Kibana and create a map visualisation (e.g. a Coordinate Map) on an index pattern for “indexcars”. You will find the new “location” field with the correct “geo_point” data type.
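
If the map comes up empty, it is worth checking that Elasticsearch can bucket the points at all. The Coordinate Map visualisation is essentially driven by a geohash_grid aggregation, so a query like the one below (again from the Dev Tools console) should return non-empty buckets once data has been ingested:

GET indexcars/_search
{
  "size": 0,
  "aggs": {
    "points_by_cell": {
      "geohash_grid": {
        "field": "location",
        "precision": 3
      }
    }
  }
}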