
I thought I’d follow up my last post on Using Data-Driven Testing Wisely with something specific about the size of the dataset for a data-driven test (DDT).

What’s a good size for a DDT? As with everything in software engineering/testing, the answer is “42.” That, or “It depends.”

In all seriousness, the right size of a dataset for a carefully thought-out scenario does indeed depend. My payroll algorithm in the last post was a simple test set. You may be working on something much more complex relating to finance, rocket science, or environmental controls.
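To make that concrete, here’s a minimal sketch of what a small, deliberately chosen dataset looks like in a data-driven test. The gross_pay function and its numbers are hypothetical stand-ins (the real payroll example lives in the previous post), and I’m using pytest’s parametrize as the vehicle; Test Studio’s data binding works the same way conceptually.

```python
import pytest

# Hypothetical payroll function, for illustration only.
def gross_pay(hours, rate):
    """Straight time up to 40 hours, time-and-a-half after that."""
    if hours <= 40:
        return hours * rate
    return 40 * rate + (hours - 40) * rate * 1.5

# A small, deliberately chosen dataset: a typical case, the 40-hour
# boundary, just over the boundary, and the zero case.
@pytest.mark.parametrize("hours, rate, expected", [
    (30, 10.0, 300.0),   # typical week
    (40, 10.0, 400.0),   # exactly at the overtime boundary
    (41, 10.0, 415.0),   # one hour of overtime
    (0,  10.0, 0.0),     # no hours worked
])
def test_gross_pay(hours, rate, expected):
    assert gross_pay(hours, rate) == expected
```

Four rows, each earning its place in the dataset. That’s the kind of intentionality I’m talking about.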

Every situation’s different, but I can tell you that you need to re-examine how you’ve built your dataset and test script once you’re into the hundreds or thousands of rows of data.

I’m specifically not saying you’ll never need huge numbers of iterations of data. What I am saying is you need to re-evaluate your dataset if it’s that big. If you’ve done your due diligence around planning your data and your dataset is still that large, then fine—at least you’ve carefully thought things out and applied some of the steps I talked about in the previous post.
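If you do find yourself staring at a giant grid of inputs, equivalence partitioning is usually the way back down. A quick hypothetical sketch, reusing the gross_pay example above: the brute-force cross-product of every input value versus a handful of representative rows that exercise the same behavior.

```python
from itertools import product

# Brute force: every whole-hour value crossed with every rate step.
# 168 hour values x 100 rates = 16,800 rows.
all_hours = range(0, 168)
all_rates = [r / 2 for r in range(1, 101)]  # 0.5 through 50.0
brute_force = list(product(all_hours, all_rates))

# Partitioned: one representative per equivalence class, plus the
# boundaries around the overtime cutoff. Five rows covering the
# behavior that actually varies.
partitioned = [
    (0, 10.0),    # no hours worked
    (20, 10.0),   # mid straight-time class
    (40, 10.0),   # exactly at the boundary
    (41, 10.0),   # just into overtime
    (80, 10.0),   # deep in the overtime class
]

print(len(brute_force), "rows vs.", len(partitioned))  # 16800 rows vs. 5
```

The 16,800-row version doesn’t buy you meaningfully more coverage than the 5-row version; it just buys you a slower test run.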

I’d also love to hear back from this blog’s readers:

- What sizes of datasets (number of rows, number of parameters/columns) are you generally dealing with?
- What’s your typical set size?
- What’s the largest dataset you’ve pushed through a DDT, and why?

About the author

Jim Holmes has around 25 years of IT experience. He is co-author of "Windows Developer Power Tools" and Chief Cat Herder of the CodeMash Conference. He's a blogger and evangelist for Telerik’s Test Studio, an awesome set of tools to help teams deliver better software. Find him as @aJimHolmes on Twitter.

