I've recently been evaluating different website benchmarking tools, so I thought I would take a moment to highlight two of them I've used: ab and JMeter.
ab is the Apache Benchmark tool, and it comes bundled with Apache. It's a pretty simple CLI tool used to test the throughput of a website. It has a bunch of different options you can pass to it, but the most important are -c (number of concurrent connections) and -n (total number of requests). Its man page is well written, so I'll let you explore the other options on your own.
So, here's a sample of testing alextheward.com using ab:
ab -n 200 -c 20 http://alextheward.com/
(Note that you need to specify the protocol and a path, otherwise ab will complain.)
And the results of that benchmark:
bash-3.2$ ab -n 200 -c 20 http://alextheward.com/
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking alextheward.com (be patient)
Completed 100 requests
Completed 200 requests
Finished 200 requests

Server Software:        Apache/2.2.22
Server Hostname:        alextheward.com
Server Port:            80

Document Path:          /
Document Length:        19858 bytes

Concurrency Level:      20
Time taken for tests:   26.306 seconds
Complete requests:      200
Failed requests:        0
Write errors:           0
Total transferred:      4076928 bytes
HTML transferred:       3986634 bytes
Requests per second:    7.60 [#/sec] (mean)
Time per request:       2630.642 [ms] (mean)
Time per request:       131.532 [ms] (mean, across all concurrent requests)
Transfer rate:          151.35 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:       69 2121 1728.4   1554    9394
Processing:     0  296  372.5    148    2138
Waiting:        0   63  166.3      0     897
Total:        331 2417 1776.0   1846    9684

Percentage of the requests served within a certain time (ms)
  50%   1846
  66%   2623
  75%   3189
  80%   3586
  90%   4958
  95%   6051
  98%   8059
  99%   8518
 100%   9684 (longest request)
Looks like I could be doing a better job serving up web requests! Lowering the concurrency certainly helped the numbers, getting the request time down to under 1000 ms for 90% of requests, so I need to see what's going on with Apache when I'm serving concurrent requests.
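If you want to compare runs at different concurrency levels, the useful figure is usually the "Requests per second" line, and it's easy to pull out with awk. A small sketch below; the sample line is taken from the run above, and in practice you'd pipe the output of an actual ab run (e.g. `ab -n 200 -c 5 http://alextheward.com/`) into the same awk command:

```shell
# Extract the mean requests-per-second figure from ab's report.
# The sample line here is copied from the benchmark output above.
line="Requests per second:    7.60 [#/sec] (mean)"
rps=$(echo "$line" | awk '/Requests per second/ {print $4}')
echo "mean rps: $rps"   # prints: mean rps: 7.60
```

Run that in a loop over a few -c values and you get a quick throughput-vs-concurrency table without wading through the full reports.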
There's another gotcha with ab: it cannot handle SSL requests to a server with a self-signed certificate, and there does not appear to be any way to tell it to ignore SSL errors either.
JMeter is actually really cool, but it does have a bit of a learning curve. I've attached a couple of images which show a basic configuration.
The first thing you have to add is a Thread Group. This is where you tell it how many threads to run, and how many requests each thread will make. After that, you need to add HTTP Request Defaults, so that you can specify the default server and the default request URI. Next, you add an HTTP Request sampler and give it the URI you want to test; you can add as many of these as you want. Finally, you need something to read the results. I've added two: one which shows the individual sample results, and another which shows the average request times over an interval.
After you hit the run button, the results will show up in those listeners.
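Once you've saved a test plan from the GUI, you don't actually need the GUI to run it again; JMeter has a non-GUI mode that's better suited to real load generation. A sketch, assuming you saved your plan as test-plan.jmx (a hypothetical filename, use whatever you saved):

```shell
# Run a saved plan headless: -n for non-GUI mode, -t for the test plan
# file, -l for the results (.jtl) file to write samples to.
jmeter -n -t test-plan.jmx -l results.jtl
```

You can then open results.jtl back in the GUI listeners afterwards to inspect the numbers.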
There's actually a pretty good intro to web benchmarking with JMeter over here: http://jmeter.apache.org/usermanual/build-web-test-plan.html
JMeter has a bunch of other features which are outside the scope of this post, but it's a solid tool for doing all kinds of performance testing.