
Below is my Filebeat configuration file, located at /etc/filebeat/filebeat.yml. It throws the following error:

Failed to publish events: temporary bulk send failure

filebeat.prospectors:
  - paths:
     - /var/log/nginx/virus123.log
   input_type: log
   fields:
      type:virus123
   json.keys_under_root: true


   - paths:
     - /var/log/nginx/virus1234.log
     input_type: log
     fields:
       type:virus1234
     json.keys_under_root: true

setup.template.name: "filebeat-%{[beat.version]}"
setup.template.pattern: "filebeat-%{[beat.version]}-*"
setup.template.overwrite: true

processors:
 - drop_fields:
     fields: ["beat","source"]


output.elasticsearch:
  index: index: "filebeat-%{[beat.version]}-%{[fields.type]:other}-%{+yyyy.MM.dd}"

  hosts: ["http://127.0.0.1:9200"]
  • What version of Filebeat are you using? Commented Jun 16, 2018 at 14:25
  • Also, the bulk send failure is normally caused by an error on the Elasticsearch side. Knowing which error Elasticsearch returns could be helpful here. Commented Jun 16, 2018 at 14:30
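
As the second comment notes, the underlying Elasticsearch error is what really pins this down. A minimal sketch of how to surface it (assuming Filebeat 6.x-style logging options; these settings are not part of the original question's config) is to enable debug logging for the elasticsearch output in filebeat.yml and watch the Filebeat log while events are published:

# hypothetical debug settings added for troubleshooting only
logging.level: debug
logging.selectors: ["elasticsearch"]

With these set, the Filebeat log should show the responses Elasticsearch returns for bulk requests, which typically name the index or mapping problem behind "temporary bulk send failure".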

1 Answer


I think I found your problem, although I'm not sure it's the only one.

index: index: "filebeat-%{[beat.version]}-%{[fields.type]:other}-%{+yyyy.MM.dd}"

should be:

index: "filebeat-%{[beat.version]}-%{[fields.type]:other}-%{+yyyy.MM.dd}"

I have seen a similar problem where a wrong index setting caused the same error you are seeing.
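
Put together, a corrected output section would look something like the sketch below. It only rearranges what is already in the question (same host and index pattern); the one change is collapsing the duplicated key into a single index entry:

output.elasticsearch:
  hosts: ["http://127.0.0.1:9200"]
  # a single "index" key; the duplicated "index: index:" was the typo
  index: "filebeat-%{[beat.version]}-%{[fields.type]:other}-%{+yyyy.MM.dd}"

Note that when a custom index is used, Filebeat also expects setup.template.name and setup.template.pattern to be set, which the question's config already does.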


1 Comment

That was just a typo in my post; it's not a working solution. I went with Logagent instead, which is super awesome and lightweight compared to Logstash.
