Securing eXtplorer

March 11th, 2011

As mentioned in my previous blog post, Cool Web-Based Software- eXtplorer, eXtplorer is an excellent online file manager with a great set of features. While I was examining eXtplorer’s potential for use in an Offsite Linux Server Backup service, I did find several drawbacks to the software. In particular, two features need to be disabled before eXtplorer can safely be used in a shared environment.

The first feature in eXtplorer that needs to be disabled is the “New File/Directory” button. This button allows the creation of symlinks, which can point outside the specified user’s directory. As you can imagine, this would allow a user to view any file that the webserver owner is able to view. In most circumstances, the user that the webserver process runs as can view well over 90% of the files on a server (some of which are sensitive). We can easily disable this feature by editing a few files.

To disable this feature, edit the file include/mkitem.php. In that file, you should see code similar to the following:
/**
* Allows to create dirs, files and symlinks on a server
*
*/
class ext_Mkitem extends ext_Action {

New File/Directory disabled, on eXtplorer

To properly disable this feature, you can either comment out the action function, or simply rename it. When a user then attempts to create a new file or directory, the result is a screen like the one pictured above: completely blank, with no way to upload new files or create a symlink to another directory.
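As a sketch of the rename approach (the method name execAction below follows eXtplorer’s action-class convention, but check your version’s include/mkitem.php for the real name; the stub base class here only exists to make the example self-contained):

```php
<?php
// Minimal stand-in for eXtplorer's ext_Action base class (for illustration only):
class ext_Action {}

class ext_Mkitem extends ext_Action {
    // Renamed from the original action method (e.g. execAction) so that
    // eXtplorer's dispatcher can no longer find it -- the "New File/Directory"
    // dialog then renders blank instead of creating files, dirs, or symlinks.
    function execAction_DISABLED($dir, $mkname, $mktype) {
        // The original create-file/dir/symlink logic stays here, unreachable
        // under the new name.
    }
}

// The dispatcher-side effect: the action method it expects is gone.
var_dump(method_exists('ext_Mkitem', 'execAction'));          // bool(false)
var_dump(method_exists('ext_Mkitem', 'execAction_DISABLED')); // bool(true)
```

Commenting out the function body accomplishes the same thing; renaming just makes it easier to restore later.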

The other weakness in eXtplorer’s design is the inclusion of an “About” page, which contains (among other items) a way to view the output of the phpinfo() function. This function reveals sensitive information about the webserver (the version of PHP running, the OS, the PHP configuration, etc.). Fortunately, disabling this feature is rather easy.

To disable the “About” page, edit the file admin.extplorer.php, and look for the line that is similar to:

case 'get_about':
    require_once( _EXT_PATH . "/include/system_info.php" );
    system_info();
    break;

And change this to:


case 'get_about':
    echo "This feature has been disabled by your administrator.";
    // require_once( _EXT_PATH . "/include/system_info.php" );
    // system_info();
    break;

eXtplorer's About page, properly disabled

What we are effectively doing here is commenting out the lines that load the script revealing the phpinfo() information, and instead echoing a line that tells the user the feature is disabled. The About dialog now simply displays that message, as the caption above shows.

New File/Directory (Disabled)

Once we have made these modifications, we need to tell our users that these two features no longer work. We can do this a few ways- my favorite is to edit the images for the buttons on the menu (located in the images directory). The image for the New File/Directory button is named “_filenew.png”, and the image for the About button is named “_help.png”. By the time you are done, you should have a navigation bar that shows both features as disabled.

Finally, we need to edit the language file and change the “New File/Directory” and “About” descriptions, so that when users hover their mouse over an option, the tooltip tells them the feature has been disabled. To do this, edit the file languages/english.php (or whichever language is the default), and edit the variables “aboutlink” and “newlink”. You can set these variables to whatever message you choose; my personal favorite is to append the text “Disabled” after the feature name, to let the user know that the feature is disabled on purpose.
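As a sketch of the edit (the $GLOBALS['messages'] array name is an assumption based on how eXtplorer’s language files are typically structured; match whatever array your version of languages/english.php already uses):

```php
<?php
// languages/english.php (sketch -- the array name below is an assumption;
// keep whatever structure the file already has and change only the values).
$GLOBALS['messages'] = array();

// Original values were along the lines of "About..." and "New File/Directory".
// Appending "(Disabled)" tells users the features were turned off on purpose.
$GLOBALS['messages']['aboutlink'] = 'About... (Disabled)';
$GLOBALS['messages']['newlink']   = 'New File/Directory (Disabled)';

echo $GLOBALS['messages']['newlink'], "\n"; // prints "New File/Directory (Disabled)"
```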

Once you have performed these modifications, eXtplorer will happily and securely run in a multi-user environment. If you have any further questions about eXtplorer, feel free to contact me by posting a reply, or by using the contact form on this website.

Cool Web-based Software- eXtplorer

February 22nd, 2011

Your server has fast Internet access, and hundreds of gigabytes of free storage. Why not use that storage and high-speed connection to store files, and share them with other users?

If you need to share files with other users on the Internet, or easily access files remotely, a good web-based tool for the job is eXtplorer. Featuring modern JavaScript-based menus and a sleek PHP backend, eXtplorer is quickly becoming my favorite way to access files remotely. Sure, I could use SFTP to store files on a server, but what happens when you need to access files on your server from someone else’s computer? You could always download a standalone SFTP client, but that’s a hassle- and eXtplorer is easily accessed from any web browser.

This weekend, I was doing some research into a way to offer offsite Linux server backup solutions to my clients. While doing my research, I stumbled upon eXtplorer. At first, I was skeptical; I’ve looked at easily a dozen Open Source file managers. After using eXtplorer over the course of several weeks, I found that the software was sleek, functional, and easy to use. Needless to say, I was impressed.

eXtplorer's list of files in the current directory

As you can see, eXtplorer features a clean user interface, with friendly graphics. It’s intuitive to use, and has very few disadvantages.

eXtplorer's Sleek Right Click Menu

I was also impressed by eXtplorer’s sleek right-click menu, which works in Opera, Firefox, and Internet Explorer. The menu is JavaScript-based, and seemed to be bug-free.

eXtplorer's "View File Contents" feature can even display images.

If you need to, you can even view the contents of a file, via the “View File Contents” feature. This feature supports images, as well as text documents.

One disadvantage of eXtplorer is that it requires a few modifications to be used securely in a multi-user environment. In a default configuration, eXtplorer allows the creation of symlinks, which can point outside the location of a user’s profile. This means that a user can view files outside his or her path, which would normally be considered an information disclosure vulnerability. In addition, an “About…” dialog allows a user to view the output of PHP’s phpinfo() function, which tells a potentially malicious user a lot of information! eXtplorer also supports connecting to remote FTP sites, which could also present a security vulnerability. However, all of these features are easy enough to disable.

eXtplorer with symlinks disabled

As you can see, both the “New File” feature and the “About…” feature have been disabled. This is necessary to prevent the creation of symlinks, and the disclosure of system settings via the “About…” page.

In summary, eXtplorer is definitely worth looking into if you need a way to share or host files over the Internet. If you don’t want to use FTP, and are in search of a web-based file manager, eXtplorer is easily the best choice among Open Source web-based file managers. Its few security issues are easily fixed, and readily documented. In my next blog post, I specify the exact changes necessary to disable symlinks, disable the About page, and disable remote FTP access.

Benchmarking Nginx and Apache Performance

February 15th, 2011

Last week, I covered basic Apache and Nginx performance details, and my general observations. Without any further delay, I will now cover the technical details of the benchmark, and the exact benchmark numbers.

The test server used in this benchmark was running Debian GNU/Linux 5.0.8. The version of Apache used was 2.2.9, and the version of Nginx was 0.6.32. The server is equipped with a RAID-0 array benchmarking at 118 MB/sec, and 1.5 GB of RAM. The processor used in the benchmark was a dual-core Pentium 4, at 3.0 GHz.

The main benchmarking software used was Siege, version 2.66. Testing was done over a gigabit LAN connection. PmWiki was the PHP application used as the benchmark page, and MySQL support was disabled (to strictly test PHP web application performance). In addition, all non-essential Apache modules (mod_ssl, mod_security, etc.) were disabled.

The benchmark was performed a total of 3 times for each number of concurrent users. When the server’s CPU temperature reached abnormal levels, testing was temporarily paused, then resumed once temperatures returned to normal operating levels.

The raw benchmark data is available here: Nginx/Apache Benchmark Data.

The benchmark data points to one conclusion: Nginx handles high traffic conditions better than Apache, and continues to handle high server loads as those loads increase. Apache’s failed transactions increase exponentially as the number of concurrent users grows, whereas Nginx’s failed transactions increase linearly. In addition, Apache failed to respond several times during the benchmark, and required a manual restart of the Apache service. Nginx recovered from high traffic loads gracefully, and did not require a restart of the Nginx service after benchmarking.

One reason that Nginx handles high server loads better than Apache is that it delegates PHP support to php-fpm. There are rumors of Nginx supporting PHP natively in future builds (perhaps similarly to the way that Apache does- via a module). If those rumors are true, Nginx may behave similarly to Apache in regards to PHP performance. Currently, php-fpm runs PHP as a dedicated service of persistent worker processes, as opposed to Apache’s way of invoking PHP at every request.
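For reference, handing PHP requests off to the php-fpm service is done with a fastcgi_pass directive in the Nginx site configuration. This is only a sketch; the address, port, and parameter lines are common defaults and will vary by setup:

```
# Nginx server block (sketch): hand .php requests to the php-fpm service
location ~ \.php$ {
    include fastcgi_params;
    # php-fpm listens on its own address/port (or socket) as a persistent service
    fastcgi_pass 127.0.0.1:9000;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
```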

In any event, this benchmark shouldn’t be taken as a blanket recommendation of Nginx over Apache. Each web server software solution has its advantages and disadvantages, and those should be carefully considered before selecting a web server. However, if performance under high amounts of traffic is what you need, look no further than Nginx!

Apache vs. Nginx Web Server Performance

February 7th, 2011


Running a web server isn’t necessarily as easy as it used to be. With more and more high-speed users requesting resources from our web servers, web server performance is becoming an important focus for web entrepreneurs. With resources such as social networks, unexpected high server loads can be generated by a single user sharing a link to your web server. DoS attacks can also play a large role in bringing your server down with a high volume of requests, often without warning. If you’ve never had a high-volume traffic spike before, it is only a matter of time before you do.

With today’s dynamic websites, web server performance plays a large role in handling an unexpected traffic spike. Although dynamic pages don’t seem to take longer to load than static pages under light traffic, heavy server loads (more than 20 concurrent users) result in exponentially longer delays in a web server’s response time.

Nginx (pronounced ENGINE-X) is a relatively new web server that is highly praised for its performance under high server loads. Although competing web server software such as Apache offers more features, Nginx is designed primarily with performance in mind. Nginx currently has a 6.62% market share among the busiest sites on the Internet, as referenced by Netcraft’s December 2010 Web Server Survey. Nginx’s market share might seem small, but among the highest-traffic sites, changes in server software don’t happen overnight.

http://news.netcraft.com/archives/2010/12/01/december-2010-web-server-survey.html

With Nginx’s rising market share, we have to wonder what the performance advantage of switching to Nginx over Apache is. The only real way to show Nginx’s performance advantage is to compare it with a popular web server software solution, such as Apache. According to Netcraft, Apache currently holds the highest market share of web servers.

PmWiki was used to benchmark Apache and Nginx performance

When I recently benchmarked Apache and Nginx performance, I noticed several trends. Rather than post the hits/second, CPU load, and page render times now, I’ve decided instead to post the trends that I noticed while performing this benchmark. Next week, I will post the exact benchmark scores of each server application, as well as my conclusion. Here are my observations on how Nginx and Apache responded to a high number of concurrent users:

  • Apache and Nginx begin to show noticeable performance loss at around 25-35 concurrent users, averaging around 420 transactions per minute. These are far from normal traffic loads. Bear these numbers in mind when comparing Apache and Nginx performance: under the conditions of this test, traffic was not typical.
  • Once failed transactions began to occur (Apache’s failure point was around 55-60 concurrent users, Nginx’s at 180), Apache’s failure rate grew exponentially with additional concurrent users, whereas Nginx’s grew linearly. When Apache was tested with 200 concurrent users, the failure rate was 24%. When Apache was then tested with 220 users, the failure rate grew to 48%. When Nginx was tested with 200 users, the failure rate was 5%. Nginx’s failure rate finally reached 20% at 280 concurrent users.
  • At a high number of concurrent users (200), transactions per second were 7 for Nginx and 4 for Apache. At 280 concurrent users, Apache dropped to a mere 1.66 transactions per second, meaning Apache took exponentially longer to complete a transaction successfully. Nginx was still at 7.18 transactions per second with 280 concurrent users.

Next week, I will post the exact benchmark numbers, test conditions, and my conclusion on the test.

Top 10 Technology Tips for Web Entrepreneurs- Tips 6-10

August 16th, 2010

This post is a continuation of my previous blog post, Top 10 Technology Tips for Web Entrepreneurs.  In this last section, I will cover mainly tips useful for web entrepreneurs doing project management work.

6.) If you outsource technology staff, hire competent workers, and retain them.

The most expensive part (in terms of both time and money) of outsourcing a piece of your technology infrastructure is the process of finding and selecting a talented professional. Once you have selected the professional who will be working on your technology infrastructure (be it your server, or your website), make sure that you retain that individual. If you select a new coder every time your web application needs a bug fix or a feature added, for instance, the underlying code will become so cobbled together that it will take increasing amounts of time for a coder to understand how it works. Even the most talented coders all write code a little differently. These differences (without a dedicated code cleanup project) tend to add up over time, and can result in unexpected bugs. If budget is an issue, you can save quite a bit of money by working with the coder directly (outside of a freelance bidding website), although you should only do this with coders that you trust.

7.) Never pre-pay, or release funds on a project early, unless you absolutely trust the freelancer.

I’m surprised how often I’ve seen this situation come up- a client will pre-pay for some coding work, and never hear from the coder again. Another situation that I’ve seen happen frequently is that a coder will get 50% of a website or software application completed, get paid 50% of the project budget, and then never complete the software application! This is simple- before you pay a freelancer, make sure that the project is 100% completed to your specifications. Don’t forget the documentation, either!

8.) Don’t rush a deadline, or deploy a software application too early.

We’ve all been tempted to rush a deadline on a project. Let’s face it- sometimes projects get delayed for reasons outside of our control. Sometimes, we’d like to go ahead and deploy a software application or website early, and “fix the bugs later”. This causes two main issues: first, your clients and users will see these bugs (and might go to your competition in disgust, or at the very least have a negative experience with that application); second, “later” may never come. As you take on other projects down the road, you may forget to fix the bugs or issues that were present in the first place! Simply put, it’s best for your image and brand name to wait until applications or features are 100% ready before deployment. Your clients will thank you!

9.) Don’t spend 90% of a software application’s budget on the user interface; concentrate on core software features first.

It’s surprising how many times I see beautiful user interfaces that don’t actually accomplish anything. Your users and clients won’t care if your graphics came from 1990, as long as the interface is intuitive, and serves a purpose. Concentrate on function over form, for the initial application. After you have developed core features (that are genuinely useful to your clients and users), then you can work on the user interface.

10.) Never underestimate the potential for SEO to increase your business dramatically.

Most of us are aware of the amount of business and traffic that search engines can bring your website. What not everyone realizes is how much business you can gain from a well-organized SEO campaign. I would estimate that my Linux consultant business gains approximately 2-3 clients per month from SEO. This may not seem like much to some people, but bear in mind that most (if not all) of my clients are “repeat customers”. This is from my limited SEO work, which I would estimate at 1 hour/month invested. Not too bad, if you ask me. The best part about SEO is that it isn’t industry specific (with regard to results), and practically every industry can greatly benefit from a well-targeted SEO campaign.

I hope that these technology tips may have helped someone prevent a costly mistake, and that these blog posts have been a valuable resource for any web entrepreneurs looking for some additional tips and guidance.  If you have any further questions about outsourcing,  server maintenance, or choosing a web host, feel free to contact me.

Top 10 Technology Tips for Web Entrepreneurs

August 9th, 2010

As a freelance Linux consultant, I’ve worked with many clients who have both succeeded and struggled with their online business ideas. In doing so, I’ve noticed several trends that clients have when they struggle with their business ideas (related to technology, anyways). I’ve written this series of blog posts so that others may learn from these mistakes, and avoid making them.  Here are 5 of the top 10 technology tips for web entrepreneurs:

1.) Choose your domain carefully, and make sure that it’s easy for others to remember and type.

Your domain is the most crucial part of your online marketplace- make sure that others can easily remember it and type it into their browser’s address bar. Picking a domain with excessive repeating characters (like waatches.com) is a recipe for disaster. Instead, if your first choice of domain is already taken, think of a creative way around the problem that others will remember (like discountwatches.com).

2.) Once you have your domain, change your business email address to match that domain.

Too many times, I’ve seen business cards with a Gmail, Hotmail, or Yahoo email address. This is not only unprofessional, but it makes clients doubt your dedication to your business. After all, email setup is cheap. If you must have a Gmail account, at least forward all mail from your domain to your Gmail account, and then set up Gmail to send mail using your domain.

3.) Take your time, and select a good web hosting provider.

As I mentioned in a related article, Choosing a Web Host, it is absolutely critical that your website is hosted on a reliable provider. Servers can be upgraded over time, but changing hosting providers is a long and expensive process (in downtime, lost sales due to poor network performance, and the costs of switching everything over to a new server). Sometimes the difference between a reputable and stable provider, and a poor one is a tiny difference in money. Make a wise decision the first time, and choose a good web hosting provider.

4.) Always have excellent documentation on your server, and the software that runs on it.

One of the most expensive and frustrating challenges can be if the coder designing software for your website doesn’t leave any documentation. This is not only limited to coders, but can also include Linux consultants, as well. Simply put, ask for documentation before you pay your coder or administrator. Don’t be rude about it, but instead just politely ask for documentation for the project that was completed. A professional freelancer will understand completely, and have no problem leaving you documentation. The documentation doesn’t have to be too detailed (for instance, a step-by-step explanation of all commands entered on your server would be excessive unless the freelancer was compensated extra), but your documentation should convey how the software solution operates, and where any configuration files are located.

5.) Invest in your technology infrastructure.

One of the biggest issues that I’ve seen is clients who treat their technology infrastructure as a one time expense. Instead, you should think of your technology infrastructure as a business investment. Common repeated expenses include server maintenance, as well as software updates and upgrades. Neglecting the maintenance of either your server, or the web applications running on it, is a certain recipe for disaster. For instance, a typical server maintenance program from a talented Linux consultant can cost as little as $40 per month, but it can help prevent much more costly issues (as well as prevent costly downtime).

Next week I will post the remaining tips that I have for web entrepreneurs, in the conclusion of this two-part series.

Cool Web-based Software- DocMGR

June 15th, 2010

DocMGR is a free, powerful document management system for Linux servers

One of the biggest challenges when working with teams or organizations is information collaboration.  Although tools like email and file servers exist, they do not solve the problem of easily allowing outside contractors or remote employees access to important files, and file servers also do not have the ability to save multiple revisions of a file.  These common problems are easily solved with the use of a document management system.  A good document management system will also have a way to send files to external clients or consultants- people that might not have access to your network.

DocMGR is a popular document management software solution for Linux servers.  DocMGR has been in development since 2005, and the most recent versions include advanced features such as PDF exporting, and a built-in document editor.  DocMGR requires a few software packages to be installed on your Linux server (in addition to PostgreSQL and Apache), such as OpenOffice and ImageMagick.  Typically, DocMGR takes about an hour to an hour and a half to install.

Once installed, log into the DocMGR interface using the default username admin.  When logged in, you are welcomed by a customizable home page.

DocMGR's home screen is completely customizable.

From here, you can view your files in thumbnail format (handy for images), or in a list format, which allows you to easily export documents as a PDF.

DocMGR offers list as well as thumbnail views.

One of the nice features of any document management system is the ability to have both private and shared files.  DocMGR’s ACL is easy to edit for folders, giving you the flexibility in controlling exactly who is able to view and change your files.  Subscriptions can be setup for shared folders, so that you are notified when the contents of a folder are changed.

DocMGR shared documents, and thumbnail view.

DocMGR’s built-in editor is quite sophisticated, and supports spell checking as well as the insertion of multimedia files.  The editor requires OpenOffice to be installed on your server, and it also allows you to easily edit any document stored within DocMGR (without requiring programs such as Word™ on your computer).

DocMGR contains a built-in document editor.

DocMGR contains a built-in email client (for sending only), which allows you to send any file in DocMGR via email.  DocMGR also contains an address book feature, which allows easier organization of frequently used contacts.

DocMGR can email files within DocMGR directly.

I am particularly impressed with the ability not only to email files, but also to send download links for large files.  This way, you can send someone a very large attachment (that might otherwise fill up their mailbox), which they can download via a special link (that expires within 24 or 48 hours).  I particularly liked the idea that if I send someone a link to download a file, that link is automatically removed after a specified time.  Any time that I send someone a link manually, I usually forget to delete the original file from my server the next day.  This feature helps to keep your server tidy, and secure.

DocMGR offers the ability to send links to large files directly.

DocMGR’s way of selecting users is my only real complaint.  If you look at the screenshot below, users are selected using the search field at the top right corner.  Normally, you would expect a drop-down list, or another way to select individual users; the search field is not very user friendly here.  Despite this one fault, the rest of the user management is easy and fast once you get used to selecting users via the search field.

DocMGR user management is quick, once you get used to the interface.

While we’re on the topic of searching, this is perhaps one of DocMGR’s biggest strengths- the search times for documents are extremely fast.  In addition, DocMGR can even search within the contents of files for what you want- in case you forget the filename.

If you have the need for a document management system, I would highly encourage you to take a look at DocMGR.  With its many features and fast performance, it’s a wonder that this document management system is available for free.  If you don’t want to tackle the installation of DocMGR (and its many dependencies) yourself, feel free to contact me, and I will install it on your server for you.

The Importance of a Linux Server Security Audit

May 4th, 2010

A security audit is probably one of the least requested services that I perform, and for a good reason.  Truth be told, most of my clients don’t think about security when it comes to their Linux server.  After all, Linux is an extremely stable and secure Operating System.  Assuming that some sort of basic Linux server maintenance is being performed, the server should be safe from most types of root compromises.  However, server maintenance usually won’t protect your server from the more popular web application attacks.

A good security audit will test your server for:

  • XSS vulnerabilities
  • Operating System vulnerabilities
  • Weak user names and passwords
  • SQL Injection vulnerabilities
  • Server application vulnerabilities
  • Insecure configurations
  • Information disclosure vulnerabilities

Using advanced scanning tools, you can test for all of these potential vulnerabilities on your server.  Tools such as nmap allow for advanced port scanning, and test the ability of an attacker to detect possibly sensitive information about your server.  Tools such as Nikto scan a server for web application vulnerabilities, and reveal information disclosure vulnerabilities.

If you hire someone to run a security audit on your server, ask questions beforehand, such as which scanning suites will be used, and ask for references.  Any professional should have quite a few references, and should be able to identify the scanners that will be used against your Linux server.  In addition, ask them whether after-hours scanning is available, so that your business is not adversely affected by these scans.

If you have any further questions about security audits for your Linux servers, please feel free to contact me.

Gone Fishin'

March 29th, 2010

Instead of making a post about Linux solutions (or Linux itself) today, I thought I’d share the fruits of my Friday with you.  Friday morning and afternoon were rather uneventful, so I decided to get out of the office early for an emergency fishing trip.  I call this an emergency trip, since the rivers have recently thawed, and I had renewed my Montana fishing license the day before.  Since “all work and no play makes Chris a dull boy”, I was determined to make the most of an otherwise dull day.

Luckily, the Yellowstone and Stillwater rivers are a quick 20 minute drive from my office, so I was able to spend about four hours fishing.  With the wind chill at 38 degrees, I was comfortable with a cap and hoodie.  The wind was blowing at gusts of 20 MPH, which made casting at times difficult.  While we’re on the topic of casting, I’d like to mention that I was using minnow lures, and an assortment of flies tied to a floater.  Not exactly the fanciest way to catch trout, but it works!

Two Montana Rainbow Trout

What you’re looking at are two of the four fish that I caught on Friday.  I never thought to take a picture of the third (a 14 inch trout), and the fourth trout of the day was let go (less than 12 inches).  The trout on the left is about 15 inches, while the trout on the right is 17 inches.  The average size for Rainbow Trout in Montana is right about 13-16 inches, for what that is worth.

Since I never went to culinary school, and my wife is the chef in the house, I will probably cook these beauties using possibly the easiest trout recipe ever:

Easy Trout Recipe

Ingredients:

  • trout (duh), cleaned and left whole
  • lemons
  • black pepper
  • cayenne pepper
  • olive oil

Directions:

  1. Coat the trout lightly in oil, while sprinkling black and cayenne pepper on the outside of the fish.
  2. Cut lemons into wedges, and insert one wedge into the trout’s cavity.
  3. Wrap the trout in aluminum foil, and cook at 375 degrees in the oven for 20 minutes total, flipping the trout in the oven after the first 10 minutes.
  4. Once cooked, remove the trout from the foil, and gently scrape the fish off of the bones.  Enjoy!

Hopefully my next blog post will be about something Linux related- but until then, good luck fishing!

Cool Web-based Software- Nagios

March 22nd, 2010

Is your Linux server down?

This one question has the power to keep us all up at night. Linux servers host your websites, handle your email, and manage your network. Your Linux server is the heart and soul of an online presence, since your databases and web applications all run on top of your server. Put simply, if your server is down, so is your business. Downtime means lost sales, and lost customers (present and future revenue). If you can’t afford downtime, you need a good server monitoring program. One such software solution is called Nagios, and it’s quite powerful.

Although Nagios isn’t the easiest web-based software solution to install (most of the server configuration is done by editing configuration files), it is extremely easy to use once configured. Nagios presents you with a web-based status screen, which allows you to quickly view the status of all of the servers that you are monitoring. Nagios isn’t just for Linux servers, either; Windows servers can be monitored as well. Once you are logged into Nagios, you can view the detailed status of all monitored servers by clicking on the “Service Detail” link.

Nagios host status details

From here, you can view detailed information about when Nagios last checked the status of a service running on your server, and view the results of that last check. Pretty boring stuff so far, since nothing is broken. Let’s break the POP3 service on our server, and see how Nagios reacts. Within one minute, Nagios has flagged the POP3 service as being in a “critical” state. By default, Nagios requires four consecutive failed connection attempts before an alert is issued. This is important, since sometimes a request is dropped by a router between Nagios and the destination server. The Internet is a crazy place, and sometimes traffic isn’t delivered to its destination in time. Therefore, Nagios will wait for four consecutive failures before it issues an alert.
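That four-failure threshold is set per service in Nagios’s object configuration. A sketch of what the service definition might look like (the host name, check command, and contact group here are hypothetical; directive names follow the standard Nagios object configuration format):

```
# Nagios service definition (sketch): alert only after 4 consecutive failures
define service {
    host_name             mail-server
    service_description   POP3
    check_command         check_pop
    max_check_attempts    4    ; failures required before a hard critical state
    check_interval        1    ; minutes between normal checks
    retry_interval        1    ; minutes between rechecks after a failure
    contact_groups        admins
}
```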

Nagios host details- critical alert

Once Nagios has failed to connect to the server four consecutive times, the server is then placed into an alert status. From here, depending on your Nagios configuration, an email can be sent, a text message sent, or even a sound played through speakers connected to your Linux server.

Nagios email alert

Once we have successfully fixed the issue with the POP3 daemon on our Linux server, Nagios will remove the critical warning on the server, and place the server into an “OK” state. Once the monitored server’s status is changed to “OK”, emails and SMS text messages are once again sent, to inform everyone that the monitored server is fixed.

Nagios SMS Text Message Alert

In addition, the host status on Nagios is now displayed as “OK” on the service status page.

Nagios Host Status OK

In addition to monitoring, Nagios also supports the ability to create logs and uptime graphs which display host uptime and service stability in an easy to read format. Host state breakdown reports allow you to easily view and export server and service uptime reports.

Nagios Host State Breakdown Report

In addition, you can also schedule downtime with Nagios, so that alerts aren’t issued when a service or server is taken down for routine maintenance. The downtime window is completely customizable, and Nagios gives a summary of all planned downtime when you click on “Downtime” in the navigation bar.

Nagios Scheduled Downtime

In the end, Nagios has the potential to save you both time and money. With Nagios, you won’t have to worry about whether or not your server is working- Nagios will let you know as soon as your server is unavailable.