Monday, November 10, 2014

Performance testing environment

One of the essential prerequisites for performance testing a web application is a suitable performance testing environment. Ideally, it should be a replica of the production environment. However, since production environments tend to be quite expensive, reproducing them for a non-functional testing activity such as performance testing rarely makes for a good business case. Hence, most small and medium-scale IT businesses end up running their performance tests on the same environment they use for their other testing activities.

One thing to note about performance metrics for a website is that results do not scale linearly. Test environments typically have hardware far less capable than production, can handle far fewer users, and will certainly produce different performance results. Results obtained on a test environment are difficult to translate to the hardware capabilities of the production environment. If the test environment is a replica of production, the results can be used directly to predict the performance of the release code. But in the more common case where the test environment differs from production, how do we predict the performance health of a release?

This was a question I put to one of the senior managers from another branch of the project I worked on. He first said he was lucky to have an environment similar to production, so he didn't have to worry about such a situation. But to answer my question, he said that if he were in my position, he would try to extrapolate results by plotting a graph based on numerous tests with incrementally increasing hardware capabilities.

For example, for a production web server with the following hardware specifications:
  • 16 GB RAM
  • 8 cores
  • 160 GB disk space
We would start a performance test with the test web server at 2 GB RAM, 1 core, and 20 GB of disk space (assuming the network bandwidth is comparable), note down the results, then increase the specifications to 4 GB RAM, 2 cores, and 40 GB of disk space and note down the results again. This way we can plot hardware parameters against response times and extrapolate to predict the results in the production environment.
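To make the idea concrete, here is a minimal sketch of how that extrapolation might look, assuming made-up response-time measurements from the three hardware tiers above and a simple inverse-capacity model (response time roughly proportional to 1/cores plus a constant). The numbers, the model, and the choice of Python/NumPy are my own illustration, not anything the senior manager prescribed:

import numpy as np

# Hypothetical average response times measured at each hardware tier.
# RAM, cores, and disk scale together here, so cores stand in for "capacity".
cores   = np.array([1, 2, 4])           # 2GB/1core, 4GB/2cores, 8GB/4cores
resp_ms = np.array([2400, 1300, 760])   # made-up averages in milliseconds

# Fit t = a * (1/cores) + b with an ordinary least-squares line on 1/cores.
a, b = np.polyfit(1.0 / cores, resp_ms, 1)

# Extrapolate to the 8-core production box.
production_cores = 8
predicted_ms = a / production_cores + b
print(f"Predicted production response time: {predicted_ms:.0f} ms")

Take the output with a pinch of salt: real systems rarely scale this cleanly, and the bottleneck can shift (to the database, the network, or disk I/O) as the hardware grows, so a prediction like this is a rough indicator rather than a guarantee.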

Sunday, January 26, 2014

Google Page Speed Module

Hi Guys,

Although the Google Page Speed module has been out for a long time now, I only discovered it recently. Anyway, better late than never. So here it is for those who haven't noticed it yet: the Google Page Speed Module has some nice tools for development, profiling, and testing.

https://developers.google.com/speed/

Check it out; I hope you find it useful.

Sunday, January 19, 2014

Intro

Hello Guys,

These are my first words on my Performance Testing blog. My name is Vipul Kane and I am a Performance Testing engineer. I mostly work with open-source tools such as JMeter and Zabbix, aside from some paid server monitoring tools such as New Relic.

I recently read a post about Wirth's law, which states that 'software is getting slower more rapidly than hardware becomes faster.' The law may very well be true: we have better programming tools and libraries, faster broadband networks, heavier browsers, lots of client-side processing, and ever more data to add to it.

I have been in computer software for around 5 years now and have been doing performance testing for close to 3 years. I thought of writing a blog to share some of my experiences, along with articles I keep reading from other performance folks who have already conquered our day-to-day performance testing problems. Let's keep sharing so that we don't let Mr. Wirth prove his law true.

Cheers,
-Vipul