By Tony Fortunato

Baselining 101: Copying a file to test performance

Why having a clear goal is essential when copying a file to test performance

I’m sure everyone has at one point or another copied a file to test ‘performance’.

I’ve seen many conclusions from these tests that are misleading or downright incorrect, so I thought a video and article would be helpful to review the “why and how” points of this seemingly simple test.

Why is copying a file to test performance potentially misleading?

This is where things can go horribly wrong. If you don’t have a clear reason or goal for copying a file, you will misinterpret the results.

For example, if you want to measure the performance of the network, you need to be certain that your hard drives, computers, and applications can perform faster than the network being tested. Don’t forget that you are only as fast as your slowest component. There are many benchmarking tools out there to measure your hard drive, RAM, and other component performance; tools such as HWiNFO can provide this information.

When possible, I connect the two test devices with a crossover cable, configure two static IP addresses, and record the results as my ‘control’ test. I am then certain that the network components, latency, and distance aren’t factors, and I know the best possible numbers moving forward.
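As a rough illustration of what that ‘control’ measurement can look like, here is a minimal Python sketch that times a single file copy and reports throughput. The source file and destination share are hypothetical placeholders; on a real crossover-cable test the destination would be a path on the second machine.

```python
# Minimal sketch: time one file copy and report throughput in MB/s.
# SOURCE_FILE and DEST_FILE are hypothetical placeholders; point DEST_FILE
# at a share or mapped drive on the second test device.
import os
import shutil
import time

SOURCE_FILE = r"C:\temp\testfile.bin"              # local test file
DEST_FILE = r"\\192.168.1.2\share\testfile.bin"    # far-end destination

def timed_copy(src: str, dst: str) -> float:
    """Copy src to dst and return throughput in MB/s."""
    size_mb = os.path.getsize(src) / (1024 * 1024)
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    return size_mb / elapsed

if __name__ == "__main__":
    rate = timed_copy(SOURCE_FILE, DEST_FILE)
    print(f"Copied at {rate:.1f} MB/s")
```

Keep in mind that a number like this includes local disk read time and any operating system caching, which is exactly why knowing your slowest component matters before you blame the network.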

I’ve seen analysts copy a file to try to reproduce application performance without knowing whether the application’s behavior is anything like a file transfer. Big mistake. Just because you can copy a file quickly doesn’t mean your application will work well. The same point applies to VoIP or video applications. You need to understand your application flow: is it read- or write-intensive, does it use many smaller packets, is it UDP or TCP? These are all examples of factors to consider.

How can copying a file to test performance be misleading?

Every application will have a different level of performance. A good example is Apache: it will probably produce different results than a freeware web server on the same hardware. I don’t want to name names, but I’ve done testing and found some free web servers that can barely handle 100 Mbps. I also see analysts testing from virtual machines, which can negatively affect performance as well.

After a few tests, you will figure out what hardware and software work best for different scenarios.

If network performance is the primary goal, you might want to use a tool that does not use your hard drive or file system, like iperf3.
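For example, a basic iperf3 run between the two test machines can be driven from a short script like the sketch below, which launches the client and pulls the measured throughput out of iperf3’s JSON output. The server address and duration are assumptions; an iperf3 server (`iperf3 -s`) has to be running on the far end first.

```python
# Minimal sketch: run an iperf3 client test and print the measured throughput.
# Assumes iperf3 is installed and "iperf3 -s" is already running on SERVER;
# the address and duration below are placeholders.
import json
import subprocess

SERVER = "192.168.1.2"   # far-end test machine (assumption)
DURATION = 10            # test length in seconds

result = subprocess.run(
    ["iperf3", "-c", SERVER, "-t", str(DURATION), "-J"],  # -J = JSON output
    capture_output=True, text=True, check=True,
)
report = json.loads(result.stdout)

# For a TCP test, the receiver-side summary is under end.sum_received.
bps = report["end"]["sum_received"]["bits_per_second"]
print(f"Measured throughput: {bps / 1e6:.1f} Mbps")
```

Because iperf3 generates its test traffic in memory, the result reflects the network path rather than your disks or file system, which is the whole point of using it for this goal.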

Automation: if you can automate the test using a batch file or macro, you can reproduce it at various times and on different networks easily and consistently. A good example is testing download vs. upload performance to your Microsoft file server from your wired and wireless connections. Are they different, and if so, by how much?

How many tests should you run? I always start with five tests, remove the highest and lowest values, and average the remaining three. Ideally, you would want to test during your network ‘prime time’ (when people are logging in, at the end of the quarter, during server backups, etc.) as well as during off-peak hours (lunch, weekends). This is where having an automated solution (even a batch file) that you can schedule is a huge benefit.
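To tie the automation and the five-run methodology together, a sketch along these lines runs the copy test repeatedly, drops the highest and lowest results, averages the remaining three, and appends the result to a CSV you can compare across prime time and off-hours. The file paths and log name are hypothetical placeholders.

```python
# Minimal sketch: run the copy test five times, drop the best and worst runs,
# average the remaining three, and log the result with a timestamp.
# Paths are hypothetical placeholders; schedule this script (Task Scheduler,
# cron, etc.) to capture prime-time and off-peak numbers.
import csv
import os
import shutil
import time
from datetime import datetime

SOURCE_FILE = r"C:\temp\testfile.bin"
DEST_FILE = r"\\fileserver\share\testfile.bin"
LOG_FILE = "copy_baseline.csv"
RUNS = 5

def timed_copy(src: str, dst: str) -> float:
    """Copy src to dst and return throughput in MB/s."""
    size_mb = os.path.getsize(src) / (1024 * 1024)
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    return size_mb / (time.perf_counter() - start)

# Sort the five results, trim the lowest and highest, average the middle three.
rates = sorted(timed_copy(SOURCE_FILE, DEST_FILE) for _ in range(RUNS))
trimmed = rates[1:-1]
trimmed_avg = sum(trimmed) / len(trimmed)

with open(LOG_FILE, "a", newline="") as f:
    csv.writer(f).writerow([datetime.now().isoformat(), f"{trimmed_avg:.1f}"])
print(f"Trimmed average over {RUNS} runs: {trimmed_avg:.1f} MB/s")
```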

Conclusion

Copying a file, like any performance test, is only effective if you have a clear goal and a consistent methodology. Since there are many tools and methodologies out there, I concentrate on making sure mine is documented properly.

If you find this type of ‘active’ testing helpful, you will eventually build a ‘better’ system around your specific requirements, or look for a product that will take your testing to the next level. Features such as reporting, historical comparisons, different types of tests, a web front end, and alerts are examples of what most people look for.