
I have a single-node ELK setup on 10.x.x.1, where I have installed Logstash, Elasticsearch and Kibana.

I have my application running on another server, 10.x.x.2, and I want its logs to be forwarded to Elasticsearch.

My log file is /var/log/myapp/myapp.log on 10.x.x.2.

On 10.x.x.1 I provided this input in /etc/logstash/conf.d:

input {
  file {
    path => "/var/log/myapp/myapp.log"
    type => "syslog"
  }
}

output {
  elasticsearch {
    hosts => ["10.x.x.1:9200"]
    index => "versa"
  }
}

My questions are as below:

  1. Do I need to install Logstash on 10.x.x.2?
  2. How can I grep only for the lines containing "Error"?
  3. Every day my app produces a log of about 10MB. Can I add one more node to my Elasticsearch cluster so that the disk space won't fill up?
  4. I don't want to keep my logs in Elasticsearch permanently. Is there any way I can set an expiry time for the logs I am sending, i.e. delete the logs after 7 days?
  • You need LS on 10.x.x.2 and not on 10.x.x.1. Commented Aug 17, 2016 at 7:39
  • You should avoid asking more than one question in your post Commented Aug 17, 2016 at 8:58

2 Answers


I can answer 1 and 2.

  • You need to install at least one of Logstash (not recommended for this) or Filebeat or Packetbeat on 10.x.x.2. Filebeat and Packetbeat are both good and free from Elastic.co. Packetbeat captures application logs from network traffic, not from log files, so for your case, with a file log, just use Filebeat.
  • You need to edit the Filebeat configuration file (filebeat.yml) to ship its logs to 10.x.x.1:

filebeat:
  prospectors:
    -
      paths:
        - /var/log/myapp/myapp.log

And

output:
  logstash:
    hosts: ["10.x.x.1:5044"]

  • On 10.x.x.1, where you have installed Logstash (and the rest of the ELK stack), you need to create some configuration files for Logstash:

    • Add an input file named 02-beats-input.conf into /etc/logstash/conf.d/:

    input {
      beats {
        port => 5044
        ssl => false
      }
    }

    • Add a filter file named 03-myapp-filter.conf into /etc/logstash/conf.d/. You should find a filter pattern that matches your log format.
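For question 2, one hedged option is to drop everything except the error lines at the Logstash filter stage. This is only a sketch: it assumes each error line in your app's log contains the literal string "Error" in the event's message field. In 03-myapp-filter.conf:

```
filter {
  # Keep only events whose message contains "Error"; drop everything else.
  if "Error" not in [message] {
    drop { }
  }
}
```

Note that many people prefer to index all log lines and filter for "Error" at query time in Kibana instead, so the rest of the logs remain searchable.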

3 Comments

Packetbeat has nothing to do with the current use case. You should remove its mention to avoid confusion.
@baudsp, I don't know whether this myapp uses a Packetbeat-supported protocol (dns, http, memcache, mysql, ...) or not. I updated my answer.
Since the OP explicitly asked about a log file, I don't see how a tool "capturing the network traffic between your application servers, decoding the application layer protocols" (source) would be of use to him.

For 2:

Kibana acts as a web interface to Elasticsearch. Once it is started, by default it is available on port 5601. You can then use the Discover interface to search for terms like "Error"; it will return the first 500 documents containing the term.
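As a concrete example, in the Kibana Discover search bar you can use Lucene query syntax to restrict the search to one field. The query below assumes the log line ends up in the default message field (which is where Filebeat/Logstash put the raw line unless you parse it into other fields):

```
message:Error
```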

For 3:

Adding another Elasticsearch node will allow you to spread your data between nodes. But a single node can easily handle a few gigabytes without problems, so at 10MB per day you are nowhere near needing a second node for disk space.

For 4:

You can't set an expiry date on the data, at least not automatically; you would have to search for all the logs expiring today and delete them yourself.
A better solution is to have one index per day (with index => "versa-%{+YYYY.MM.dd}") and delete any index older than 7 days (easily done with Elasticsearch Curator and a cron job).
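As a hedged sketch of the Curator approach (this uses Curator 4 action-file syntax; the versa- prefix matches the daily index pattern above, and the file names are just examples), a cleanup action file could look like:

```
actions:
  1:
    action: delete_indices
    description: Delete versa- indices older than 7 days
    options:
      ignore_empty_list: True
    filters:
    - filtertype: pattern
      kind: prefix
      value: versa-
    - filtertype: age
      source: name
      direction: older
      timestring: '%Y.%m.%d'
      unit: days
      unit_count: 7
```

You would then run it once a day from cron, e.g. curator --config config.yml delete_versa.yml, where config.yml points Curator at your Elasticsearch host.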

