This is a follow-up to a blog article that I wrote after a recent engagement at a client site where I was involved in planning a migration of performance tests from LoadRunner to Performance Center. The original article had some tips about the use of test data. This article expands on that.
While on site, I spent some time reviewing my client's scripts. During this review, I found three LoadRunner scripts that simulated the upload of documents to a web server. The tester had copied one script three times so that there were separate versions to upload 2MB, 5MB and 10MB files.
These scripts could easily be consolidated into a single script, using variables to determine the proportion of 2MB, 5MB and 10MB files that are uploaded. This would reduce the script maintenance overhead.
As well as this improvement, I recommended that the client parameterise the names (but not the contents) of the uploaded files. The test data for these scripts turned out to be a large collection of Word documents.
There were 1,200 identical 2MB files, 1,200 identical 5MB files and 1,200 identical 10MB files, all copies of each other with different file names: around 20GB of test data in total. By parameterising the file names instead, the disk space used for test data was reduced to 17MB (one physical file of each size).
For example, rather than having the script refer to a different physical file on each iteration, it can upload the same 2MB, 5MB or 10MB file every time and simply parameterise the file name sent to the server, saving around 20GB of disk space.
Whilst this waste of disk space wasn't a problem for my client at the time of my visit, the migration to Performance Center increases the importance of managing disk space carefully. Attaching large amounts of test data to scripts should be avoided where possible, to prevent problems with script/scenario initialisation (compilation) at the start of each test.
In the next article in this series, I'll go on to discuss the importance of parameterisation (even when you think you can "get away without doing it").