enum->class Refactoring the OO way

This one was pretty much TheoY’s.  I coded it up, which probably means I corrupted his idea.  So, give him credit for the good parts and I’ll take the blame for the bad parts.

 

Pretty much the same approach was taken in C++ by Johny in his comment.

 

      // Each enum value becomes a nested singleton subclass of the abstract base.
      abstract class E
      {
            private E() { }   // private ctor: only nested classes can derive from E

            public abstract void OperationF();

            public class _a : E { public override void OperationF() { } }
            public static readonly _a a = new _a();

            public class _b : E { public override void OperationF() { } }
            public static readonly _b b = new _b();

            public class _c : E { public override void OperationF() { } }
            public static readonly _c c = new _c();
      }

      class Program
      {
            void F(E e)
            {
                  E e2 = E.a;
                  e.OperationF();   // polymorphic dispatch replaces the switch
                  if (e == E.a)     // reference equality works: each value is a singleton
                  {
                        return;
                  }
            }
      }

 

So, by making them into singletons, we lose the ability to switch() on the value.  But that’s actually a good thing – we’re doing one of the “Replace Type Code” refactorings from Fowler.
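
For contrast, here’s a rough sketch of the plain-enum version this replaces (the EPlain and ProgramBefore names and the empty case bodies are made up, not from the sample above).  The switch below is exactly the conditional that the polymorphic OperationF() call absorbs:

      // Hypothetical "before" picture: a plain enum plus a switch on the value.
      enum EPlain { a, b, c }

      class ProgramBefore
      {
            void F(EPlain e)
            {
                  switch (e)
                  {
                        case EPlain.a: /* a's behavior */ break;
                        case EPlain.b: /* b's behavior */ break;
                        case EPlain.c: /* c's behavior */ break;
                  }
            }
      }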

 

Are there any other interesting solutions, or is this pretty much it?

Comments

  • Anonymous
    May 31, 2004
    This latest one is starting to look like the (pre-1.5) Java enum pattern. The OO way certainly works well enough, but it can get bulky in a hurry.

    Consider an enum for the arithmetic ops +, -, *, /. Sometimes I just need to differentiate the additive ops {+, -} from the multiplicative ops {*, /}. That's 7 classes for this enum, then: {+,-,*,/}, {+,-}, {*,/}, {+}, {-}, {*}, {/}! The C version probably just does (x & 0x2) and calls it a day :)
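
    A rough sketch of the C-style flag trick hinted at here (the bit layout and all of the names are made up for illustration):

    // Hypothetical encoding: bit 1 marks the multiplicative ops, so (x & 0x2)
    // separates {*, /} from {+, -}.
    enum ArithOpCode { Add = 0x0, Sub = 0x1, Mul = 0x2, Div = 0x3 }

    static class ArithOps
    {
        public static bool IsMultiplicative(ArithOpCode x)
        {
            return ((int)x & 0x2) != 0;   // true for Mul (0x2) and Div (0x3)
        }
    }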

  • Anonymous
    May 31, 2004
    Link: http://home.online.no/~teyde/blog/AReactiontotheEnum-ClassR.html

    I have thought about this refactoring and I wonder why somebody would want to do it. It was a fun exercise, although I think it has no real-world value. The code is complex, hard to understand, and if you have more enums you want to treat this way, you can't reuse the code because you can't inherit implicit operators. Then I thought about a well-known pattern which is used to extend a class when you either don't have access to the source code, or can't alter the source code because that would break its clients. That pattern is the Visitor (or I think it is a Visitor).

    My Visitor is simple. It's a made-up validator:

    public class EnumValidator
    {
        public bool IsValid;

        public EnumValidator(E e)
        {
            E[] validValues = { E.a, E.b, E.c, E.a | E.b | E.c };
            IsValid = Array.IndexOf(validValues, e) != -1;
        }
    }
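
    A minimal usage sketch (the Check method here is made up; note that the E.a | E.b | E.c expression above presumably assumes the operator-carrying E from the earlier post, since the singleton classes in this post don't define operator |):

    // Hypothetical caller: validation lives outside E, so E itself never changes.
    void Check(E e)
    {
        EnumValidator v = new EnumValidator(e);
        if (!v.IsValid)
        {
            throw new System.ArgumentException("not a valid E value", "e");
        }
    }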

  • Anonymous
    May 31, 2004
    "Why?" is a pretty important question, for sure. I have a couple possible answers, but as an IDE writer I have to assume that people will want my tools for things I can't anticipate, at least not specifically.

    When we analyze refactorings as automatable code transformations, we work with an absurdly trivial case, like the one given here. It lets us focus on what the transformation might look like, but sometimes it's not clear why the refactoring has value until you see it in a real-world case.

    There are a few refactorings in Fowler's catalog that seem to be in the same category:

    http://www.refactoring.com/catalog/replaceConditionalWithPolymorphism.html
    http://www.refactoring.com/catalog/replaceTypeCodeWithStateStrategy.html
    http://www.refactoring.com/catalog/replaceTypeCodeWithSubclasses.html

    Basically, you move from the caller selecting a behavior to using polymorphism. This should be more readable. It also means that the logic of selecting behavior is in only one place (typically a factory). If you tell me I need to add a new case (E.d), I know I can do it without missing any cases - the compiler helps me catch it.

    However, I think that if the enum really represents something that acts like a set of sequential integers, this refactoring has much less value. This approach is really for the situation where the enum is about selecting a behavior.
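
    To make the factory point concrete, a small sketch (the EFactory name and the integer codes are invented; only E.a, E.b, E.c come from the post):

    // The only place that maps an external type code onto an E instance.
    // Adding E.d means adding one case here plus one nested subclass of E,
    // whose abstract OperationF() the compiler forces you to implement.
    static class EFactory
    {
        public static E FromCode(int code)
        {
            switch (code)
            {
                case 0: return E.a;
                case 1: return E.b;
                case 2: return E.c;
                default: throw new System.ArgumentOutOfRangeException("code");
            }
        }
    }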



  • Anonymous
    June 02, 2004
    I agree with Jay -- this wins out most in behavioral selection, and also in groups that you might categorize as inheritance trees.

    Nicholas' post with the seven classes actually has additional merit (in fact, I've represented an arithmetic operator as a reference singleton before). Consider the following:

    Let's make the classes a little easier to name in this example:

    abstract class ArithOp              // {+, -, *, /}
    abstract class AddOp : ArithOp      // {+, -}
    abstract class MultOp : ArithOp     // {*, /}
    class Add : AddOp                   // {+}
    class Sub : AddOp                   // {-}
    class Mult : MultOp                 // {*}
    class Div : MultOp                  // {/}

    Okay, these are sucktastic names, and bodies are omitted for the sake of brevity. Hopefully you can still see the inheritance tree though.

    By having extra types for each one, you can actually refer to these groupings in ways that let your program take advantage of type safety to prevent bugs. For example, say you want to write an infix expression evaluator and want to follow the basic rules of operations: that multiplies and divides have higher precedence than adds and subtracts.

    You, in fact, define different types to refer to these two precedence levels: MultiplicativeExpression and AdditiveExpression, where a MultiplicativeExpression CANNOT contain subexpressions of type AdditiveExpression without the AST type Grouping (which represents parentheses) wrapping around it. This allows your AST to have a type system that represents the rules of your grammar.

    You then define AdditiveExpression to have an Operator property that returns something of AddOp type. Likewise, MultiplicativeExpression has an Operator property that returns MultOp. By separating out these types, you don't run into the possibility that you'll accidentally store a MultOp as the operator for an AdditiveExpression, or vice versa. Using flags or classic enums won't work that way.
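
    A minimal sketch of that type structure (the Expression, MultiplicandExpression, and Literal classes and the field names are invented here; AdditiveExpression, MultiplicativeExpression, Grouping, AddOp, and MultOp come from this comment):

    abstract class Expression { }

    // Things that may appear directly under a multiplicative operator.
    abstract class MultiplicandExpression : Expression { }

    class Literal : MultiplicandExpression { public double Value; }

    class Grouping : MultiplicandExpression   // represents parentheses
    {
        public Expression Inner;
    }

    class MultiplicativeExpression : MultiplicandExpression
    {
        public MultOp Operator;                    // can only hold * or /
        public MultiplicandExpression Left, Right; // an AdditiveExpression must be wrapped in a Grouping first
    }

    class AdditiveExpression : Expression
    {
        public AddOp Operator;                     // can only hold + or -
        public Expression Left, Right;             // any expression may appear under + or -
    }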

    In short, leveraging inheritance allows you to have arbitrary intermediate groupings of your enums.

    One might argue that this introduces a lot more code, which itself increases bug risk -- but my response to that is that this new code is REALLY simple, and it's more along the lines that if you don't get it right, it won't compile. You'll see in future posts that I'm a fan of more code, if it reduces logic complexity. The whole drive for "less code", I think, is rooted in the drive for simplicity. And even though being "extremely OO" creates a ton of types, it reduces the logic for most of the methods, which helps reduce the number of bugs that can only be found at runtime.

    I hope this makes sense; if this is all gibberish to you, my blog is linked on this comment, and there's a contact link from that blog page.
