
Checklist Linux Web Server Issues

Is your Linux web server having (performance) issues? Follow my simple checklist to find the cause so you can start fixing it.

Check disk space

Check the disk space using the command: df -h

High disk usage? Follow these instructions.
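
To see which top-level directories consume the most space, something like this helps (assuming GNU du and sort):

$ sudo du -h --max-depth=1 / 2>/dev/null | sort -hr | head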

Check disk inodes

Check the disk inodes using the command: df -i

High inode usage is usually caused by a very large number of small files: every file consumes one inode, no matter how small it is.

High inodes? Read the article No Space Left on Device Error on Linux When There is Space Available.
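
To find out where the inodes are going, you can count files per top-level directory. A rough sketch using GNU find (-xdev stays on one filesystem):

$ for d in /*; do echo "$(sudo find "$d" -xdev -type f 2>/dev/null | wc -l) $d"; done | sort -rn | head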

Check load with top

Check the load with top or htop. Is a single process, such as mysql, consuming 100% CPU?
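
To list the biggest CPU consumers non-interactively, you can also run:

$ ps aux --sort=-%cpu | head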

Check mysql for slow queries

Log in to mysql using the CLI:

$ mysql -u user -p

Run the statement below to get an overview of running queries and how long they have been running (the Time column is in seconds).

SHOW PROCESSLIST;

Get the full queries using:

SHOW FULL PROCESSLIST;
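
You can also filter the process list to long-running queries only, for example via the information_schema.PROCESSLIST table:

SELECT ID, USER, DB, TIME, INFO
FROM information_schema.PROCESSLIST
WHERE COMMAND != 'Sleep' AND TIME > 1
ORDER BY TIME DESC;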

Analyze slow queries. First check how many rows the table has using the COUNT example below.

SELECT COUNT(*) FROM table;

If the table has a huge number of rows, it might be a cache/temp table. Google whether that table can safely be emptied, for example using TRUNCATE. Make sure to back up your database before running anything!
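
For example, a quick backup with mysqldump (replace user and database with your own):

$ mysqldump -u user -p database > backup-$(date +%F).sql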

In the past I had a WordPress database with a table of 20 million rows. Below is an example I used to empty two temporary tables.

TRUNCATE TABLE wp_icl_string_pages;
TRUNCATE TABLE wp_icl_string_urls;

Make sure to fix the underlying issue, otherwise it will keep recurring. If it's a plugin:

  • Update to the latest version
  • Find support if needed
  • Consider a temporary cronjob to truncate tables (see the sketch after this list)
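
A sketch of such a cronjob (crontab -e), truncating a table every night at 03:00; the credentials file and table name are examples:

0 3 * * * mysql --defaults-extra-file=/root/.my.cnf database -e "TRUNCATE TABLE wp_icl_string_pages;"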

Check whether similar websites on the same web server suffer from the same issue.

Optimize the query using indexes.
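
For example, use EXPLAIN to check whether a slow query hits an index, and add one if it does not (table and column names are illustrative):

EXPLAIN SELECT * FROM wp_posts WHERE post_status = 'draft';
ALTER TABLE wp_posts ADD INDEX idx_post_status (post_status);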

Check disk i/o with iotop

Check the disk i/o using iotop.

There could be a process indexing your disk (updatedb / mlocate). It might be worth disabling this if you do not use the locate command often.
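
On Debian/Ubuntu the indexing runs from a daily cronjob, so disabling it can be as simple as this (the path may differ per distribution):

$ sudo chmod -x /etc/cron.daily/mlocate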

If a single file takes a long time to read, make sure its folder does not contain too many files. You could optimize your web application to split uploads across multiple directories, for example 1000 files per directory (see the sketch below).
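
A rough sketch of sharding existing uploads into per-1000 buckets, assuming plain numeric file names like 12345.jpg:

for f in uploads/*.jpg; do
  bucket=$(( $(basename "$f" .jpg) / 1000 ))   # 12345 -> bucket 12
  mkdir -p "uploads/$bucket"
  mv "$f" "uploads/$bucket/"
done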

Check request times from apache2

Install mod_status, which lets you analyze running requests on the page /server-status. Using this you will find the heaviest pages (the SS column in the overview shows how many seconds a request has been running).
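
On Debian/Ubuntu, enabling it looks roughly like this (the shipped status.conf already restricts /server-status to local requests; set ExtendedStatus On to get per-request timings):

$ sudo a2enmod status
$ sudo systemctl reload apache2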

Make sure your web applications are not constantly serving images dynamically (through PHP or similar). Resized images should be generated once and then served as static files.

As with disk i/o above: if a static file takes a long time to serve, check that its folder does not contain too many files and split uploads across multiple directories if needed.

If your web application is simply slow to respond, optimize your code with caching. For high-traffic websites, consider a reverse proxy (see the sketch below).
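
A minimal nginx reverse-proxy sketch with caching, assuming Apache was moved to port 8080 (cache size and TTL need tuning for your site):

proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=mycache:10m max_size=1g;

server {
    listen 80;
    location / {
        proxy_pass http://127.0.0.1:8080;   # backend apache2
        proxy_cache mycache;
        proxy_cache_valid 200 5m;           # cache successful responses for 5 minutes
    }
}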


Hope this helps!

Thanks for reading!
