Parse JSON in a list in Logstash


I have JSON of the form:

[
    {
        "foo": "bar"
    }
]

I am trying to parse it using the json filter in Logstash, but it doesn't seem to work. I found that you can't parse list JSON using the json filter in Logstash. Can someone please tell me a workaround for this?
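For reference, a minimal sketch of applying the json filter directly to such a list (the field name message is only an assumption about where the raw list ends up):

filter {
    json {
        # assumption: the whole JSON list sits in the message field
        source => "message"
    }
}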

Update

My logs:

ip - - 0.000 0.000 [24/may/2015:06:51:13 +0000] *"post /c.gif http/1.1"* 200 4 * user_id=userid&package_name=somepackagename&model=titanium+s202&country_code=in&android_id=androidid&et=1432450271859&etz=gmt%2b05%3a30&events=%5b%7b%22ev%22%3a%22com.olx.southasia%22%2c%22ec%22%3a%22appupdate%22%2c%22ea%22%3a%22app_activated%22%2c%22etz%22%3a%22gmt%2b05%3a30%22%2c%22et%22%3a%221432386324909%22%2c%22el%22%3a%22packagename%22%7d%5d * "-" "-" "-" 

URL-decoded version of the above log:

ip - - 0.000 0.000 [24/may/2015:06:51:13  0000] *"post /c.gif http/1.1"* 200 4 * user_id=userid&package_name=somepackagename&model=titanium s202&country_code=in&android_id=androidid&et=1432450271859&etz=gmt+05:30&events=[{"ev":"com.olx.southasia","ec":"appupdate","ea":"app_activated","etz":"gmt+05:30","et":"1432386324909","el":"packagename"}] * "-" "-" "-" 

Please find below my config file for the above logs.

filter {
    urldecode {
        field => "message"
    }
    grok {
        match => ["message", '%{IP:clientip}%{GREEDYDATA} \[%{GREEDYDATA:timestamp}\] \*"%{WORD:method}%{GREEDYDATA}']
    }
    kv {
        field_split => "&? "
    }
    json {
        source => "events"
    }
    geoip {
        source => "clientip"
    }
}

I need to parse events, i.e. events=[{"ev":"com.olx.southasia","ec":"appupdate","ea":"app_activated","etz":"gmt+05:30","et":"1432386324909","el":"packagename"}]

I assume you have your JSON in a file. You are right, you cannot use the json filter directly on it. You'll have to use the multiline codec and apply the json filter afterwards.

The following config works for your given input. However, you might have to change it in order to separate the events properly. That depends on your needs and the JSON format of your file.

Logstash config:

input {
    file {
        codec => multiline {
            pattern => "^\]"    # you might have to change this to separate your events
            negate => true
            what => previous
        }
        path => ["/absolute/path/to/your/json/file"]
        start_position => "beginning"
        sincedb_path => "/dev/null"    # for testing
    }
}

filter {
    mutate {
        gsub => [ "message", "\[", "" ]
        gsub => [ "message", "\n", "" ]
    }
    json {
        source => "message"
    }
}

Update

After your update I guess I've found the problem. Apparently you get a _jsonparsefailure because of the square brackets. As a workaround you can manually remove them: add the following mutate filter after the kv filter and before the json filter:

mutate {
    gsub => [ "events", "\]", "" ]
    gsub => [ "events", "\[", "" ]
}
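Putting it together with the config from your question, the filter section would then look roughly like this sketch (the mutate block is the only addition; everything else is unchanged):

filter {
    urldecode {
        field => "message"
    }
    grok {
        match => ["message", '%{IP:clientip}%{GREEDYDATA} \[%{GREEDYDATA:timestamp}\] \*"%{WORD:method}%{GREEDYDATA}']
    }
    kv {
        field_split => "&? "
    }
    # new: strip the square brackets so the json filter can parse the events field
    mutate {
        gsub => [ "events", "\]", "" ]
        gsub => [ "events", "\[", "" ]
    }
    json {
        source => "events"
    }
    geoip {
        source => "clientip"
    }
}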

Update 2

Alright, assuming your input looks like this:

[{"foo":"bar"},{"foo":"bar1"}] 

Here are five options:

Option A) Ugly gsub

An ugly workaround would be another gsub:

gsub => [ "event", "\},\{", "," ]

But this would remove the inner relations, so I guess you don't want to do that.
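For illustration with the sample input above, that gsub would collapse the two objects into one:

[{"foo":"bar","foo":"bar1"}]

i.e. a single object with a duplicate foo key, so you can no longer tell which values belonged to which object.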

Option B) split

A better approach might be to use the split filter:

split {
    field => "event"
    terminator => ","
}
mutate {
    gsub => [ "event", "\]", "" ]
    gsub => [ "event", "\[", "" ]
}
json {
    source => "event"
}

This will generate multiple events. (The first with foo = bar and the second with foo = bar1.)
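Roughly, the two resulting events would then carry something like this (field names as in the config above; the json filter writes to the event root since no target is set):

"event" => "{\"foo\":\"bar\"}",  "foo" => "bar"
"event" => "{\"foo\":\"bar1\"}", "foo" => "bar1"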

Option C) mutate split

You might want to have all the values in one Logstash event. You can use the mutate split filter to generate an array and then parse the JSON if an entry exists. Unfortunately, you will have to set a conditional for each entry because Logstash doesn't support loops in its config.

mutate {
    gsub => [ "event", "\]", "" ]
    gsub => [ "event", "\[", "" ]
    split => [ "event", "," ]
}

json {
    source => "event[0]"
    target => "result[0]"
}

if [event][1] {
    json {
        source => "event[1]"
        target => "result[1]"
    }
    if [event][2] {
        json {
            source => "event[2]"
            target => "result[2]"
        }
    }
    # you would have to specify more conditionals if you expect more dictionaries
}

Option D) Ruby

According to your comment I tried to find a Ruby way. The following works (after the kv filter):

mutate {
    gsub => [ "event", "\]", "" ]
    gsub => [ "event", "\[", "" ]
}

ruby {
    init => "require 'json'"
    code => "
        e = event['event'].split(',')
        ary = Array.new
        e.each do |x|
            hash = JSON.parse(x)
            hash.each do |key, value|
                ary.push( { key => value } )
            end
        end
        event['result'] = ary
    "
}
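For the two-object sample input this yields something along these lines (note that the split on "," only works cleanly while each object has a single key; a multi-key object would be torn apart before JSON.parse sees it):

"result" => [
    [0] {
        "foo" => "bar"
    },
    [1] {
        "foo" => "bar1"
    }
]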

Option E) Ruby

Use this approach after the kv filter (without adding a mutate filter):

ruby {
    init => "require 'json'"
    code => "
        event['result'] = JSON.parse(event['event'])
    "
}

It will parse events like event=[{"name":"alex","address":"newyork"},{"name":"david","address":"newjersey"}]

into:

"result" => [
    [0] {
           "name" => "alex",
        "address" => "newyork"
    },
    [1] {
           "name" => "david",
        "address" => "newjersey"
    }
]

This relies on the fact that the kv filter does not support whitespaces in the values. I hope you don't have any in your real inputs, do you?

