I need to read static JSON from a file for each event; that JSON data is then used in the following ruby filter. My current workaround is to use the translate filter to read JSON-encoded JSON from the file and then the json filter to decode the inner string into an object.
Current workaround:
echo '{"languages":"{\"1\":\"Afar\",\"2\":\"Afrikaans\",\"3\":\"Albanian\"}"}' > /opt/languages.json
Logstash filter:
if [languages][id] {
  # add a constant key so translate always matches the dictionary's single entry
  # (the ruby filter requires code or path, so mutate is used here)
  mutate {
    add_field => {
      "[@metadata][static][languages]" => "languages"
    }
  }
  translate {
    dictionary_path => "/opt/languages.json"
    field => "[@metadata][static][languages]"
    destination => "[@metadata][languagesJson]"
  }
  json {
    source => "[@metadata][languagesJson]"
    target => "[@metadata][languages]"
  }
  ruby {
    path => "/etc/logstash/scripts/handle-languages.rb"
  }
}
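The translate + json pair amounts to a two-pass decode of the "JSON-encoded JSON": translate returns the inner JSON string, and the json filter parses it. A minimal Ruby sketch of what those two steps do with the sample file contents:

```ruby
require "json"

# First pass: parse the file's outer JSON (what translate reads).
outer = JSON.parse('{"languages":"{\"1\":\"Afar\",\"2\":\"Afrikaans\",\"3\":\"Albanian\"}"}')

# Second pass: parse the inner JSON string (what the json filter does).
inner = JSON.parse(outer["languages"])

puts inner["1"]  # => Afar
```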
I would like it to be just:
echo '{"1":"Afar","2":"Afrikaans","3":"Albanian"}' > /opt/languages.json
if [languages][id] {
  json {
    path => "/opt/languages.json"
    target => "[@metadata][languages]"
  }
  ruby {
    path => "/etc/logstash/scripts/handle-languages.rb"
  }
}
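Until the json filter supports reading from a file, another workaround is to drop translate and json entirely and let the scripted ruby filter parse the file itself, once, caching the result across events (e.g. in the script's register hook). A minimal sketch of the caching idea in plain Ruby; the Tempfile here stands in for /opt/languages.json:

```ruby
require "json"
require "tempfile"

$languages = nil

# Parse the dictionary file on the first call only; later calls
# reuse the cached Hash instead of re-reading the file.
def languages(path)
  $languages ||= JSON.parse(File.read(path))
end

# Throwaway file standing in for /opt/languages.json.
file = Tempfile.new(["languages", ".json"])
file.write('{"1":"Afar","2":"Afrikaans","3":"Albanian"}')
file.close

dict = languages(file.path)
puts dict["1"]  # => Afar
```

In an actual handle-languages.rb, the parse would live in register(params) and the filter(event) function would look ids up in the cached Hash.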