Ubuntu Logstash Server with Kibana3 Front End Autoinstall

I have been using Graylog2 and VMware Log Insight for some time now and finally wanted to try out Logstash. The first thing I wanted to do was create an automated script to handle most of the install and configuration needed to get everything running. I figured that as I went through this I would share it with everyone and keep building on the script based on feedback. I created a Graylog2 script (located here) that has proven to be a great help to that community, and I figured I might be able to do the same for the Logstash community; even if it didn't work out, I would learn a great deal about Logstash in the meantime. There is a great community around Logstash, so getting support should be very easy. I am also just starting to learn Logstash, so this should be a lot of fun, which also means there will be a good amount of change around this post.

First off, I will keep this script updated and available on GitHub, located here. That will be the only location where I keep it up to date.

I would recommend installing onto a clean install of Ubuntu 12.04. However, if you decide to install on an existing server, I am not responsible for anything that may get broken.

So here is how we get started and get everything up and running. Open a terminal session on the server you will be installing to and run the following commands.

sudo apt-get update
sudo apt-get -y install git
cd ~
git clone https://github.com/mrlesmithjr/Logstash_Kibana3
chmod +x ./Logstash_Kibana3/install_logstash_kibana_ubuntu.sh
sudo ./Logstash_Kibana3/install_logstash_kibana_ubuntu.sh

You will be prompted during the script to enter your domain name and your ESXi host naming convention. These answers are used to configure a Logstash filtering rule for your ESXi hosts. If you do not monitor any ESXi hosts, just enter placeholder values; the script only uses them to build that filtering rule.
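To give a feel for what that rule does, here is a minimal sketch of a Logstash filter that tags events from hosts matching a naming convention. The esxi prefix and the field name are assumptions for illustration only; the rule the installer actually writes may differ.

filter {
  # Assumes ESXi hosts are named esxi01, esxi02, etc.; adjust the pattern to your convention
  if [host] =~ /^esxi/ {
    mutate { add_tag => [ "esxi" ] }
  }
}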

Once complete, open your browser of choice and connect to http://logstashservername/kibana or http://ipaddress/kibana.

You will see the following screen once connected. Seeing as we are setting up Logstash with Kibana, go ahead and select the link on the left.

[Screenshot: Kibana3 welcome screen]

Now here is a screenshot of some actual ESXi logging. Notice the tag called esxi, which is created by the filtering rule the installer set up based on the naming convention we passed in.

[Screenshot: ESXi events tagged esxi in Kibana3]

Here is another screenshot of logging graphs built up by adding different search criteria.

[Screenshot: Kibana3 graphs with additional search criteria]

So what this script has done is install Apache2, Elasticsearch, Logstash, and Kibana3. Logstash has been configured to listen on UDP/514 (default syslog) and on TCP/514 for ESX(i) logging.
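For reference, those listeners boil down to something along these lines. This is a minimal sketch; the config the installer actually writes may use the dedicated syslog input or additional options.

input {
  udp {
    type => "syslog"
    port => 514
  }
  tcp {
    type => "syslog"
    port => 514
  }
}

To get data flowing, point your hosts' syslog at the server. For example, on a Linux client you could add a line like *.* @logstashservername:514 to its rsyslog configuration (logstashservername is a placeholder for your server's name or IP), or fire a quick test message with netcat:

echo "<13>test: hello logstash" | nc -u -w1 logstashservername 514

The test message may pick up a _grokparsefailure tag if it is not a perfectly formed syslog line, but it is enough to prove events are arriving.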

If you want to purge and expire old logs, have a look here. Jordan Sissel (the creator of Logstash) has provided a Python script to do this.

Here is how you set up the script. Open a terminal on your Logstash server and execute the following.

cd ~
sudo apt-get install python-pip
sudo apt-get install git
git clone https://github.com/logstash/expire-logs
cd expire-logs
sudo pip install -r requirements.txt

Now that you have this set up, read the examples on the GitHub page for the different scenarios it covers.
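Under the hood the script removes whole daily indices (Logstash writes one Elasticsearch index per day, named logstash-YYYY.MM.DD) through the Elasticsearch HTTP API. If you just want to see what that amounts to, you can list your indices and delete an old one manually with curl; the index name below is only an illustration:

curl 'http://localhost:9200/_aliases?pretty'
curl -XDELETE 'http://localhost:9200/logstash-2013.11.01'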

After you purge your logs using the above method, you will need to restart Elasticsearch.

sudo service elasticsearch restart
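To confirm Elasticsearch came back up cleanly, you can check cluster health with curl (localhost assumes you run this on the Logstash server itself):

curl 'http://localhost:9200/_cluster/health?pretty'

A status of green or yellow means the remaining indices loaded fine; yellow is normal on a single-node install, since replica shards have nowhere else to be assigned.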

That should be it.

Enjoy!

All comments and feedback are very much welcomed and encouraged.

26 thoughts on “Ubuntu Logstash Server with Kibana3 Front End Autoinstall”

    • @n00blet Once you start sending data to it that error will go away. It is because there is no syslog data in Elasticsearch yet. :)

  1. Having little Linux under my belt, but interested in “learning to fish” rather than buying a fish sandwich at McDonald's, I'm interested in digging a little deeper. This post is of great interest but is written for sysadmins with some miles under their belt. I have the new Ubuntu VM patched up and ready. I have run your slick script and Logstash is installed. How do I:

    1) Configure Apache (or is it already configured?)
    2) “Start sending data to it” so the index error goes away?

    Also, do I need to provide access via SSH or anything to my ESXi hosts for Logstash to be able to pull log info?

    Thanks a ton!

    • @Fleetside58
      Have you been successful in getting any data into Logstash yet? If not, I can add some additional information to the post to help out those who may not necessarily know where to set everything up.

  2. This looks great man, thank you very much for your contribution. I am actually trying out your Graylog2 install script as I write this. I'll give this one a try next. This will potentially save me quite a bit of time…

  3. Hi, well the CentOS script didn't work; I tried it with a fully updated copy of CentOS 6.5. From what I can see it fails to install git, which causes a problem with the ES install, and further down there seems to be a problem with the version of Passenger: it is trying to get a higher version which seemingly works differently. You may be able to solve this by specifying the version of Passenger in the gem install steps. I set up a second machine with Ubuntu 12.10 Server and it worked perfectly. The only problem that I can see is that it uses the older version (0.12) of GL2. I am hoping to adapt this to work with the latest release of 0.2.

    • @cultavix Very cool. The CentOS script was actually being maintained by another person who originally got it working, so if you want to you can post an issue against the CentOS script on GitHub and we can get it resolved. The Ubuntu/Debian scripts I maintain on my own. There is a preview script that works for 0.2 and it should be working just fine.

      • (Reading database ... 61859 files and directories currently installed.)
        Unpacking elasticsearch (from elasticsearch-0.90.7.deb) ...
        dpkg-deb (subprocess): short read on buffer copy for failed to write to pipe in copy
        dpkg-deb: error: subprocess paste returned error exit status 2
        dpkg: error processing elasticsearch-0.90.7.deb (--install):
        short read on buffer copy for backend dpkg-deb during './usr/share/elasticsearch/lib/lucene-core-4.5.1.jar'
        Errors were encountered while processing:
        elasticsearch-0.90.7.deb

        I am getting this error, kindly help.

  4. I wonder about this output:
    output {
      gelf {}
    }

    line 39 of /etc/init.d/logstash
    status_of_proc -p $pid_file "" "$name"
    should be
    status_of_proc -p "$pid_file" "$name"

    nice script :)

  5. Great package install! Reviving my Linux skills. This definitely saved me a lot of time. I see on your blog about the index issue that there is none due to no data. Do you have a sample file I can use to test with? Any info on getting it going is also appreciated. Thank you and keep up the great blog!

  6. Way easier to use than the new Graylog2 0.20 and installed much faster! Thanks!

    BTW, you have a small typo in your expire directions: phython-pip instead of python-pip.

    Also, I don't really see any information about requirements.txt. That file doesn't exist by default.

    • @MACscr Yeah it really is :) Thanks for catching the typo too! I just updated it to the latest as well. I have been spending a lot of time on Graylog2 and neglected to get back to Logstash, so thanks! And it looks like they have renamed expire-logs to curator, so I need to update that too now.

  7. Hi mrlesmithjr, it works great.
    Can you add GeoIP support to this script?
    Also, why is nothing in elasticsearch.yml changed, and why does everything still have a # comment?

  8. Hi mrlesmithjr,
    when I run the pip command I see this error:
    Could not open requirements file: [Errno 2] No such file or directory: 'requirements.txt'

    • @morteza I need to look into the Logstash installer and get it updated to the latest version once I get a chance.

  9. This is just great! Got it up and running in no time, and I have pretty limited Linux skills! Got no budget for VMware Log Insight, but this might be an even better alternative! Big thanks!

    • @LB Glad it worked for you. I need to get around to updating this script for the latest version of Logstash, but they changed the install method and I have not had the time to get to it yet. So stay tuned and enjoy!
