A week ago I started working on a small project with the goal of showing CPU, memory, and swap usage in a web interface. I chose Perl for the job, using rrdtool, cron jobs, and mod_perl for the CGI.
It works quite simply: a daemon collects data every 15 seconds and stores it in the RRDs. Moreover, I have a simple cron job that checks whether the gatherer (collector) is currently running and restarts it if not. CPU usage is currently gathered from /proc/loadavg, which isn't strictly CPU load, but rather says something about how many processes are currently running on, or waiting for, the CPU. Memory is gathered from /proc/meminfo and is rather accurate, down to 1 kB.
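A minimal sketch of what that collector could look like, assuming the rrdtool Perl bindings (`RRDs`) are installed; the `.rrd` file paths are placeholders, not the actual ones used in the project:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Extract the 1-minute load average from a /proc/loadavg line.
sub parse_loadavg {
    my ($line) = @_;
    my ($one_min) = split ' ', $line;
    return $one_min;
}

# Extract MemFree (in kB) from /proc/meminfo text.
sub parse_memfree {
    my ($meminfo) = @_;
    return $meminfo =~ /^MemFree:\s+(\d+)\s+kB/m ? $1 : undef;
}

# One collection pass: read both /proc files and feed the
# samples to rrdtool via the RRDs bindings.
sub collect_once {
    require RRDs;    # rrdtool's Perl bindings
    open my $la, '<', '/proc/loadavg' or die "loadavg: $!";
    my $load = parse_loadavg(scalar <$la>);
    close $la;
    open my $mi, '<', '/proc/meminfo' or die "meminfo: $!";
    my $free = parse_memfree(do { local $/; <$mi> });
    close $mi;
    RRDs::update('/var/lib/stats/load.rrd', "N:$load");
    RRDs::update('/var/lib/stats/mem.rrd',  "N:$free");
}

# The daemon itself is then just:
# while (1) { collect_once(); sleep 15; }
```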
Swap seemed bothersome, so I run /usr/bin/free and parse its output; the problem with parsing /proc/swaps directly is that there can be more than one mounted swap device.
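Something along these lines, assuming `free` prints its usual summary table with a `Swap:` row (column widths vary between versions, so the regex only relies on whitespace):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Pull total/used/free swap (in kB) out of `free` output.
# free already sums over all swap devices, which /proc/swaps
# would force us to do ourselves.
sub parse_swap {
    my ($free_output) = @_;
    for my $line (split /\n/, $free_output) {
        return ($1, $2, $3)
            if $line =~ /^Swap:\s+(\d+)\s+(\d+)\s+(\d+)/;
    }
    return;
}

# In the collector:
# my ($total, $used, $free) = parse_swap(scalar `/usr/bin/free`);
```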
I've decided to generate graphs only when someone visits the site. This should put an all-around lower load on the system, though it may mean longer page load times in the browser; as long as the delay isn't noticeable, it's the preferred way.
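One way to keep that on-demand approach cheap is to only redraw when the cached image has gone stale. A sketch, again assuming the `RRDs` bindings; the PNG path, RRD path, and data-source name are all hypothetical:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Regenerate a graph only when the cached PNG is older than
# $max_age seconds, so repeated page views within one sample
# interval don't redraw an identical image.
sub graph_is_stale {
    my ($png, $max_age) = @_;
    return 1 unless -e $png;
    return (time() - (stat $png)[9]) > $max_age;
}

# Hypothetical graphing step using rrdtool's RRDs bindings.
sub draw_load_graph {
    my ($png) = @_;
    require RRDs;
    RRDs::graph($png,
        '--start', '-1h',
        'DEF:load=/var/lib/stats/load.rrd:load:AVERAGE',
        'LINE1:load#0000ff:1 min load');
    my $err = RRDs::error;
    die "rrdtool: $err" if $err;
}

# In the CGI handler:
# draw_load_graph('/tmp/load.png') if graph_is_stale('/tmp/load.png', 15);
```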
The images generated by the CGI script are temporarily stored in /tmp, then read back and written to the client by the script itself, so there's actually no direct access to the files.
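The serving step can be sketched roughly like this, with the header written by hand rather than through a CGI module; the /tmp path in the usage comment is a placeholder:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Read a generated PNG back from /tmp and write it to the client
# through the given handle, so the browser never gets a direct
# filesystem path to the image.
sub send_png {
    my ($path, $out) = @_;
    open my $fh, '<:raw', $path or return 0;
    my $data = do { local $/; <$fh> };   # slurp the whole file
    close $fh;
    print {$out} "Content-Type: image/png\r\n";
    print {$out} 'Content-Length: ' . length($data) . "\r\n\r\n";
    print {$out} $data;
    return 1;
}

# In the CGI script:
# send_png('/tmp/load.png', \*STDOUT);
```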
The code can be found in a git repository at the usual location; moreover, there's a live version of the project running on the server, showing its actual stats.
Comments and bugs are welcome.