This is the third article in a series describing script improvements that I recommended to a client during a recent engagement where I was advising on a LoadRunner to Performance Center migration. As well as advising on their use of test data and the size of test data files, I had a few more general comments to make regarding some of their scripts.
When I visited the client, I noticed a few hard-coded values in their scripts. These should always be parameterised where possible (to be fair, this advice applies to any performance test tool, not just LoadRunner or Performance Center).
Example 1 - Repeatedly downloading the same document
In the first script, the document being downloaded was hard-coded in the LoadRunner code; it should be parameterised.
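As an illustration (I haven't reproduced the client's code; the URL and parameter name here are invented for the example), the change looks something like this:

```
// Hard-coded: every vuser downloads the same document
web_url("AnnualReport2023.pdf",
    "URL=http://example.com/docs/AnnualReport2023.pdf",
    "Resource=1",
    LAST);

// Parameterised: the literal filename is replaced by the
// {DocName} parameter, defined in the script's parameter list
web_url("{DocName}",
    "URL=http://example.com/docs/{DocName}",
    "Resource=1",
    LAST);
```

VuGen substitutes {DocName} at runtime with a value from the parameter's data source, so changing which document is downloaded no longer requires touching the script code.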
If, for example, you are testing multiple users downloading a single document, such as an annual report or price list, it can be tempting to leave the values hard-coded. Even in this case, I'd be tempted to parameterise the document name. It doesn't take long, and parameterisation means you can quickly change the document when you need to, potentially future-proofing your script or allowing it to be redeveloped to test another download, or multiple random downloads.
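To sketch the "multiple random downloads" idea (the filenames below are invented), the {DocName} parameter would be backed by a data file such as:

```
DocName
AnnualReport2023.pdf
PriceList_Q1.pdf
ProductCatalogue.pdf
```

In the parameter's properties in VuGen, setting "Select next row" to Random makes each iteration pick a different document, turning the single-document script into a random-download test with no code changes.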
Example 2 - Hard-coded Session/User ID
I've edited the details to avoid a potential data breach, but in the script that I checked, the User and Session IDs were left hard-coded. Whilst this may not appear to cause a problem, and the script may well work, the tester is unlikely to know exactly what effect these values have on the application. Even if the application doesn't currently use these IDs to cache content or add a layer of security, there is no guarantee that a future code release won't change this and cause the script to fail. Best practice is to future-proof your scripts by correlating these values, and to check with the development team about their potential impact downstream.
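A correlation along these lines would fix it (the boundaries, URL and parameter names here are invented for the example; in practice you'd take the boundaries from a recorded server response):

```
// Register to capture the session ID from the next server response,
// before the request that returns it
web_reg_save_param("SessionID",
    "LB=sessionId=",
    "RB=&",
    "Ord=1",
    LAST);

web_url("Login",
    "URL=http://example.com/login",
    LAST);

// Use the correlated value instead of a hard-coded ID
web_url("Download",
    "URL=http://example.com/download?sessionId={SessionID}",
    LAST);
```

Each vuser now presents whatever ID the server actually issued it, so the script keeps working even if the application later starts validating or caching against those IDs.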