Monday, February 21, 2005

Test-driven development

At last - a Python-related posting!

I've been thinking about trying test-driven development for a while, but have never really had a good project to try it on. Now, though, I've found something which looks worth a go. FWIW, it's a skill planner for ToME. There's one available already, but it uses Curses and so isn't available on Windows. And anyway, it's in Perl, so it clearly needs rewriting :-)

Anyway, off we go. The obvious place to start is with a "Skill" class, which isn't difficult to set up. Or rather, to set up a test for.

import unittest

class SkillTests(unittest.TestCase):
    def test_create(self):
        combat = Skill(0.8)
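
To actually run it, I add the usual unittest boilerplate at the bottom of the file:

if __name__ == '__main__':
    unittest.main()

At this point the test fails, of course, because Skill doesn't exist yet - which is exactly what test-driven development wants: a failing test before any implementation.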

From here on, it's pretty smooth going. The hard bit is adding just enough functionality to make the tests pass, and no more. Also, something the articles I've read don't make clear is that test-driven development doesn't remove the need for good design sense. (It's not that they hide the fact; it's more that it's an "obvious" assumption that I missed.) You still have to think about how you want to use your classes - it's just that you document that use in test cases rather than in specifications (or, more likely, on bits of paper which you then lose...)
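
For instance, "just enough" to make that first test pass really is tiny (this is a sketch of the idea, not necessarily the code I actually keep):

class Skill:
    def __init__(self, multiplier):
        self.multiplier = multiplier   # nothing else yet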

The first real challenge comes when I want to add some non-trivial functionality. By this point, my Skill class has "multiplier", "points" and "value" attributes. The points can be set, and the value is a derived attribute, basically points * multiplier. There is a restriction that the value cannot exceed 50, which I currently implement as a constraint on how the points attribute is set. But more on this later.
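
To make that concrete, here's roughly the shape the class has at this point. The property-based implementation, the MAX_VALUE name, and the decision to raise an error (rather than, say, clamp) are just how I happen to picture it so far, not anything final:

class Skill(object):
    MAX_VALUE = 50   # a skill's value may never exceed this

    def __init__(self, multiplier):
        self.multiplier = multiplier
        self._points = 0

    def _get_points(self):
        return self._points

    def _set_points(self, points):
        # The cap is enforced (for now) as a constraint on setting points
        if points * self.multiplier > self.MAX_VALUE:
            raise ValueError("value would exceed %s" % self.MAX_VALUE)
        self._points = points

    points = property(_get_points, _set_points)

    def _get_value(self):
        # value is derived from points, never stored directly
        return self._points * self.multiplier

    value = property(_get_value)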

Now, however, I want to add the concept of dependent skills, so that points spent on one skill can affect the value of another skill. My class won't handle this at all. So I'm going to have to do some major refactoring, which means that I need some confidence that I won't break anything. Have I got this confidence? Well.... not really. I think that before I start, I'd like to add some more tests to make sure my basic skill class behaves exactly as I want. Whatever that is - there are clearly still some design decisions to make.

That's an interesting insight in itself. I identified the need to tighten up the basic spec before moving on because I didn't feel the necessary confidence that my tests covered everything. The articles never mentioned that one, either :-)

After some thinking, however, I come to two conclusions:

  1. I can't think of any more tests I want to add

  2. I've a sneaking suspicion I'm writing the whole class backwards, and the interface I'll ultimately want is not the one I'm currently designing


The first one makes me feel better about my current set of tests, so that's OK. The second one could be an issue, but I'll park it for now, and trust that I'll be able to fix it when I really need to. It does make me wonder, though. I'll be refactoring my tests at that stage, which, while not wrong as such, feels risky. What if I delete a test which is no longer correct, but replace it with a weaker one which my implementation passes "by accident"? No, I should wait and see. Use the YAGNI principle, and carry on regardless...

As I progress adding tests, I discover a very interesting thing. I know I want to write a test for the case where a skill is at its maximum, and points are added to a subskill (one which adds bonus points to its "parent" when it is increased). This could cause a skill to exceed its maximum, so I need a test here. But I don't know what behaviour I want, so I don't know how to complete the test! That's excellent - the test-driven approach has teased out a fundamental design issue I'd have missed otherwise.
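
Just to show what I mean, the test gets about this far and then stalls. (The skill names and the parent= argument are made up for illustration - I haven't actually decided how subskills will be spelled yet.)

def test_subskill_bonus_at_parent_max(self):
    weaponmastery = Skill(1.0)
    weaponmastery.points = 50                  # parent is already at its maximum
    sword = Skill(0.5, parent=weaponmastery)   # hypothetical subskill spelling
    sword.points = 1                           # the bonus would push the parent past 50
    # ...and here I stop. Should the parent's value clamp at 50? Should
    # setting the subskill's points raise an error? I can't write the
    # assertion until I know what behaviour I actually want.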

I'm enjoying this. But I'll stop blogging now as I actually need to think about my design before I can proceed.

(Maybe I should wait to post until I've finished. These stream-of-consciousness posts help me, but I don't know if they make enough sense to be worth preserving for posterity :-))
