Blogging newcomer Marc Funaro made a provocative first post over the weekend with his entry How OO Almost Destroyed My Business. It has gotten a lot of comments, some supporting him, and some taking issue with his conclusions. I started to comment but decided it would be better to generate a secondary discussion rather than add onto the already long thread.

Marc says he picked up ColdFusion as a non-programmer, and had good success with it until fairly recently. With the movement toward object-oriented development that is happening in the ColdFusion world, he ran into trouble. He read some books, some blogs, and took a class on Java development. And he ended up overwhelming himself with unnecessary complexity in terms of frameworks, design patterns, and OO architecture. He sums up the result of doing this pretty nicely:

"The bottom line is, when you NEED to use some OO concept, YOU'LL KNOW. *That's* the time to start writing OO-style code, and only then... not everywhere else."

Which is pretty good advice. It's something that any knowledgeable proponent of OO will tell you. I'm sorry that it took him a good amount of time and frustration to reach that conclusion, but I'm glad he finally did.

Where he goes wrong, though, starts right in the title of his entry. OO can't ruin anything, but people making bad decisions absolutely can. And what Marc did was make some bad decisions, because he was new to OO, confused, and, as he says, "downloading one framework after another, piling them all into an application". This is like reading a book on construction, and then going out and trying to build the Taj Mahal when all that was needed was a garage.

Bad decisions don't mean a person is stupid or foolish. Smart people make bad decisions all the time (I'm not conceited enough to call myself a smart person, but I definitely have made some bad decisions). Usually, it's simply a lack of knowledge or experience, or a failure to understand the implications of the choices you're making. But even that is OK, because when someone makes a bad decision, it can still have a positive outcome if it results in learning something. OO does not equate to using a framework, and it does not require the application of every design pattern under the sun. It's simply a way to organize code, manage complexity, and accommodate change. Sometimes, that is best served by using a framework like ColdSpring or Spring. Sometimes, design patterns can offer solutions to encapsulate variations in a system and cope with change. One of the key things anyone using OO must understand is that there are pros and cons to every decision, and multiple solutions to a given problem. The only way to learn how to assess these trade-offs is through experience.

The reality is that a lot of ColdFusion applications don't require a massive OO system to power them. Many of the small- or medium-sized applications don't need an n-tier architecture loaded with abstractions and design patterns. But that doesn't mean that some of the good ideas of OO, like encapsulation, can't be used with big benefits. One doesn't need to turn every query into an array of objects. Just creating well-defined interfaces to expose behavior to the rest of an application will get you a long way. Once something is encapsulated, it's much easier to change it later if you need to. It might be just a few CFCs to wrap up the bulk of the logic and hide the implementation. That might be all that will ever be needed. But if (and, more likely, when) things get more complex and it comes time to start adopting a broader set of OO principles, you'll be in a much better position to do so.
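To make that concrete, here is a minimal sketch of that kind of encapsulation in ColdFusion. The component, method, table, and datasource names are all hypothetical; the point is only that the rest of the application talks to one well-defined method and never sees the query behind it:

```cfml
<!--- ProductService.cfc: a hypothetical example of wrapping logic in a CFC.
      Callers depend only on getActiveProducts(); the query, table, and
      datasource behind it can change later without rippling outward. --->
<cfcomponent output="false">

    <cffunction name="getActiveProducts" access="public" returntype="query" output="false">
        <cfset var products = "">
        <cfquery name="products" datasource="myDSN">
            SELECT id, name, price
            FROM product
            WHERE active = 1
        </cfquery>
        <cfreturn products>
    </cffunction>

</cfcomponent>
```

Calling code just asks for what it needs, with no idea how the data is produced:

```cfml
<cfset productService = createObject("component", "ProductService")>
<cfset products = productService.getActiveProducts()>
```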

However, there is another reality that can't be denied: in the debate between procedural and OO development, OO has won. It won many years ago. ColdFusion is one of the few languages left that supports procedural development to a large degree. If you want to keep being a software developer, or ever want to move to a language like ActionScript, C#, Java, Groovy, or Ruby, you're going to have to know OO. That's just how it is. And as Marc points out, even within the CF world, OO is taking over, and the number of jobs available to people without OO experience is going to keep getting smaller and smaller. Some folks may not like this and may attempt to rebel against the trend, but you can't stop the tide. OO is not going away; in fact, it's only going to become more ubiquitous. So it's probably in your best interest to learn about it. One doesn't have to use it on every project, nor does one have to use it to create a complex, over-engineered mess. But experience is the best teacher, both in terms of learning OO and increasing your demand in the marketplace.

So, with respect to Marc, don't do what he did. Don't try to swallow the entire OO buffet in one bite. If you try, you'll fail. You'll get frustrated. And in that red haze, you'll probably miss the simple benefits of OO. Instead, learn what you can and take time to digest the information. Experiment with it, but don't get carried away. Apply what makes sense to you where you can, in small bits. Remember that the goal is to learn, but it is also to help you do what works for you and build applications that satisfy customers.

I suppose the bottom line is: Don't be afraid of OO. Be afraid of anyone who says that OO is the only way to build an application, and be just as afraid of anyone who blasts OO because they got carried away with it and got burned.


Comments (23)

  • # Posted By Steve Bryant | 5/26/09 1:36 PM


    Your point seems to be to use the right tool for the job, but then you say "in the debate between procedural and OO development, OO has won". This seems to indicate not that there are right tools for the job and that OO has established itself as a good one, but that OO is BETTER than procedural.

    To me, this makes this post just the kind of post to which Marc was referring.

    I do agree with most of your post. I just think the quoted phrase is unfortunate. Perhaps it would have been better to point out that OO is a well established option and one about which it would be wise to learn something (even if you choose not to use it).

  • # Posted By Brian Kotek | 5/26/09 2:25 PM

    Hi, Steve. I do think OO is better than procedural coding in many situations. Generally, the larger the system, or the more adaptive the system needs to be, the more I think OO excels over procedural code.

    It's not that procedural code can't be used on large projects or can't be adaptive. The main difference is that with procedural code, it's all up to the developer to make code organized and adaptive. With OO, there are elements built in that help organize code (packages, classes, methods, etc.) and help it adapt to change (encapsulation, polymorphism, etc.)

    Now, there's no guarantee that using OO will result in organized code that is adaptive to change. But often, OO code that is not organized and is not adaptive to change is not OO at all, but rather the result of big chunks of procedural code being stuffed into classes. That's not always true, but it's true a lot. All of this is my personal opinion of course, but if we're talking about the subjective determination of which one is "better", I'll freely admit that for the projects I work on, I think OO is better.

    "Using the right tool for the job" is actually somewhat vague. I do think that most projects can benefit from some level of OOP. But that's just me and what I'm comfortable with. One can certainly choose to build many of these without OOP, so in those cases, the right tool for the job is whichever approach gets the job done. It's when the projects get larger, have more developers, and require more flexibility that I think OO overshadows procedural development.

    So I suppose I'd break down my opinion:

    Very small projects, quick prototypes, etc.: procedural is probably better
    Small to medium projects: either one will work
    Larger projects: OO is probably better

    However, what personal opinion I (or anyone) may hold on the subject is irrelevant to the larger situation. As far as the software development industry is concerned, OO *has* won. All of the major languages are object-oriented. Any projects or jobs that involve those languages will require knowledge of good OO development practices. Which means that choosing not to learn OOP seems like it will eventually limit that developer to a shrinking market. It may take a while longer but eventually it's going to get pretty hard to keep writing software without knowing OO, even in the CF world.

    Returning to the three opinions I gave above: if you agree with me, it would seem to lead to this conclusion: if OO, used judiciously, can work nearly as well as (or as well as) procedural development on small to medium projects, and is better on large projects, then it would make sense to try and learn OO. Even if you do it slowly and apply it gradually, it's still a pretty good investment.

    Does that help clarify my intention? Or does it make it worse?

  • # Posted By Brian Kotek | 5/26/09 2:43 PM

    Maybe to clarify just a bit more: The real question often isn't whether or not to use OO on a project, because at a core level, OO can be well applied to just about any project. The question is how complex the OO design needs to be. On a smaller project, it may be just fine to encapsulate your business logic in a set of service objects. One might not need a lot of complex patterns, and there may not be a factory, a DAO, or an abstract class in sight. The larger the system, the more complex the object model tends to be to support it. The challenge is to have a design that isn't over-designed or under-designed, but that works well and is easy to expand on when the time comes.

  • # Posted By Matt Woodward | 5/26/09 2:45 PM

    I have a ton I could say but frankly I'm sick of the debate, so I'll just say this: OO has won. If you don't believe that despite the fact that the rest of the world is doing things this way, you need to get over yourself. All you're doing by ignoring it is committing career suicide and displaying a "the world is flat!" kind of arrogance that is downright laughable.

    That being said, there is no "OO" in any "oh yes, by uttering those two letters I know precisely what you mean" sort of way. You don't "do OO" or "not do OO." And yes, you have to (oh the horror) know stuff and learn stuff and maybe even read a book or two to begin to intelligently use OO principles in your development. You don't download a framework or 8 and blindly pound away at the keyboard expecting knowledge to magically appear. You learn things, start small, gradually expand your knowledge and techniques, and understand what you're doing over time. Only by doing this will you learn to apply OO principles intelligently in your development and in a way that fits you. Otherwise you'll get frustrated and start pointing a finger at OO as the problem when in all honesty it's most likely a lack of patience and willingness to open your mind and learn that's the real culprit.

    And of COURSE you don't attempt to go from 0 - 100 mph and try applying all this stuff to a real project with a real deadline if you don't know anything about it. That's a foolhardy approach that is doomed from the start.

    Yeah, I know, no one wants to hear that it takes time, experience, and even failure to learn how to do this stuff right, and frankly (I'm gonna get flamed, but I guess I don't care) I think the people who say "OO is the problem" use this as an excuse not to learn what the rest of the world has been doing for a very long time. If you choose to believe the world is flat then fine, you're entitled to that belief, but that in no way makes it valid or correct.

  • # Posted By Charles | 5/26/09 2:45 PM

    @Steve -- Your site is pretty nice, I like the lens flare.

  • # Posted By Steve Bryant | 5/26/09 3:01 PM

    So, for everyone commenting on Marc's blog asking where they can find examples of people saying that you have to use OO, I should point them here?


    Thanks, my friend Denise at Medulla Studio made that for me several years ago.

  • # Posted By Brian Kotek | 5/26/09 3:07 PM

    Steve, since you didn't preface your reply to anyone in particular, if you're referring to me, no. I'm not saying you have to use OO. But I am saying it's probably pretty wise to learn about it, and using it to some degree isn't a bad idea in most cases.

  • # Posted By Matt Woodward | 5/26/09 3:49 PM

    @Steve--the problem I have is with comments like yours, looking to point people to a blog where people are saying they have to use OO. It just seems that we're trying to vastly oversimplify what on the one hand is a dead issue (OO won), and on the other hand is MUCH more nuanced than "so and so said I have to do OO, and I respect them, so I guess I'll do OO!"

    I'd love it if we could have *intelligent* debates about this, but talking about "doing OO" or "not doing OO" and bifurcating the CFML community into artificial camps doesn't do anyone any good.

    Yes, if you're a developer, you have to have decent OO skills to survive in the job market. That's just a fact that people will have to face if they haven't already. And how do you get decent OO chops? By using it of course. It's not as if you can read a book and say "now I 'know OO' but I'll keep doing things the way I've always been doing them ... at least I'll be ready with the buzzwords on a job interview!"

    Obviously I'm taking things to a ridiculous extreme, but it seems to me that's the level at which this conversation currently stands, and it's just a little silly. If we want to talk about OO principles, the benefits of these principles (because hey, there has to be *something* to all this for the vast majority of the programming world to be using it), how they do or don't apply to CFML, when/where we might want to use them, how you can take certain aspects too far, etc. then that's a meaningful, helpful conversation to have. But splitting everyone into OO/non-OO camps and throwing rocks at each other doesn't help either side.

  • # Posted By Steve Bryant | 5/26/09 4:08 PM


    On one hand, I really agree with your point. On the other hand, you are saying "OO won", which (to me) contributes to that exact bifurcation.

    When you say "OO principles, the benefits of these principles", on the other hand, I think you are onto something much more helpful and less divisive.

    I'm a little confused, frankly, at why you say that I am being divisive. I have yet to argue that OO is better or worse than anything else. I'm trying to understand the positions being stated here and suggest that some nuance in the discussion might be helpful.


    So, you are saying it is a good idea to learn OO, but not necessary to use it on every project?

  • # Posted By Brian Kotek | 5/26/09 4:17 PM

    @Steve - Yes. It's certainly not *necessary* to use OO on anything if you choose not to. But I would say OO *can* be used in a beneficial way on most projects. Again, the real question isn't whether OO is "bad" or "good" on a certain project, since I think it will almost always be "good" if used appropriately. The real question is how far do you take it, and how complex does the design need to be.

  • # Posted By Craig Kaminsky | 5/26/09 4:24 PM

    @Matt: Since you mentioned you might get flamed over your comment, I wanted to offer support for your remarks. I think it's incredibly short-sighted not to learn OO principles in today's web/software world. It's hard, time-consuming, and, at times, frustrating but totally worth it!

    I forced a junior programmer, with whom I worked for 3 years, to start learning OO techniques 18 months ago (learning it, not throwing it into every project straightaway). He recently sought and obtained a new job that brought him higher pay and better work (i.e., 'cooler' projects) because he knew some OO, unlike most of the other CF applicants. I'm not saying such an experience will be the norm, but in a highly competitive market, knowing where the industry is trending (and knowing how to 'do' such things) is important. OO isn't so much where we're trending; rather, it's where we're at and where the immediate future is headed in our industry.

    While OO isn't the tool for all situations, you'd better have some understanding and ability to use it if you want to keep working and advancing in this industry.

    Maybe 'OO won' isn't the best way to articulate it but the fact that every web language out there (CF & PHP included, it's not just the C#'s, Java's, and Ruby's of the world) has evolved further and further along OO lines should tell us all something about how central OO is to our field.

    @Brian: thank you for the thoughtful response to Marc's original post.

    P.S. Does this general OO debate remind anyone else of the CSS-based v. Table-based design arguments from several years back?

  • # Posted By Matt Woodward | 5/26/09 4:25 PM

    @Steve--saying OO won in the programming world at large is a fact. That's my point. How we choose to implement it in CFML is the only thing that's debatable here. I'm making the point that OO won (and won a very long time ago) because saying otherwise is just ignoring reality and is not helpful to any further discussions on the topic. Putting two points of view on equal footing when they're simply not isn't healthy.

    To put it another way, if we can't all at least admit that becoming familiar with OO is important--to our careers if nothing else, though I'd argue it's certainly helpful to a lot more than our resumes--then we're not going to get anywhere with the conversation.

  • # Posted By Peter J. Farrell | 5/26/09 7:39 PM

    I was going to comment, but a blog post was born instead:

  • # Posted By Mike Rankin | 5/27/09 11:35 AM

    @Matt Woodward - Saying "OO won in the programming world at large is a fact" is not true. In fact, it's a damn lie. If you are looking to find the real winner in programming languages, there really is no contest. It's... wait for it... COBOL. A decidedly not-OO language. And yet, despite its inability to describe an interface, it somehow manages to process roughly 70% of the entire world's business data. 15% of NEW application development in 2008 was laid down in COBOL. (stats from Gartner circa 2007)

    If that's not enough, you've got IBM shipping REXX in everything they've made since the '80s. There is even a version that will run on WinCE.

    I'm personally involved in building a language that isn't OO or procedural, but that's a different story.

    When it comes time to throw some real horsepower at a computing problem, OO languages, while certainly contenders in many cases, are far from a foregone conclusion.

    The point is, there are plenty of non-OO languages floating around out there doing serious, important work. To claim that one particular style of language has "won" is pure hubris and just pisses people off.

  • # Posted By Roland Collins | 5/27/09 11:52 AM

    @Matt - "saying OO won in the programming world at large is a fact".

    By whose measure? That's an awfully short-sighted statement. It may hold true in certain areas, but certainly not for the "programming world at large".

    I guess we should tell all those Linux, Unix, and NT kernel developers to get with the times (in fact, we should just get rid of C entirely). Those embedded systems guys - wankers. All those functional programmers - hacks. The SQL gurus - well they're not really programmers, so let's dismiss them entirely.

    The point being that you can't possibly make that statement credibly. Sure - OO has a lot to offer for solving certain types of problems, but it has by no means "won" anything. It's a single tool in a very large toolbox. Sure - it's important to learn and to understand, but it's not the be-all and end-all of programming paradigms, nor is it the correct solution to every problem.

  • # Posted By Brian Kotek | 5/27/09 12:18 PM

    @Mike: Actually, COBOL has been OO since 2002. And even if one believes the 15% number (which I don't, I'd love to know how Gartner knew in 2007 how much COBOL was written in 2008; it also totally contradicts the TIOBE index which lists it at 0.474%), that still means 85% of development was done in something else, and that something else is almost certainly Java, C++, C#, VB.NET, PHP, or Ruby. Which are all OO languages.

    @Mike and Roland: I guess I can't understand how the idea that OO is far and away the dominant programming paradigm in software development can be up for debate. This is not hubris, and it is not short-sighted. It's just true.

    Go look at the TIOBE index. Even if their numbers are off some (and they probably are), that doesn't matter. The top 10 languages account for roughly 97% of the usage. With the exception of C, every one of them is object-oriented.

    If some folks don't want to go the OOP route, that's fine. Everyone is free to make whatever choices they wish. But to try and justify that decision by stating that there's still some doubt about what approach is dominant in the software development world is just denying reality.

  • # Posted By Roland Collins | 5/27/09 12:46 PM


    First of all, any argument invoking TIOBE isn't actually offering any evidence. The index is a complete and utter joke and has been a laughable source for years. Delphi is #11? Really?

    But even *if* we assume it's a valid source, this is where your logic starts to fall apart - "with the exception of C, every one of them is object-oriented". That sentence is factually incorrect. Those languages *support* object-oriented paradigms, but that doesn't mean that they *enforce* them. In fact, most of those languages are intentionally designed to be multi-paradigm. And if you really think that the people who use those languages always write OO code, then you really haven't spent enough time *using* those languages to comment.

    Just because you use "objects" doesn't mean that you're writing OO code. Nor does using a language that supports OO programming force you to write OO code. There's a world of difference.

    No one is saying that OOP isn't useful. Learn it, by all means - it will make you a better programmer. But it is by no means the "best" paradigm, nor has it "won" some imaginary paradigm competition, and if you think it has, then you're the one who really doesn't get it. It's a tool, pure and simple. You use it when you need it, to solve problems that are suited to being solved by OOP principles. Maybe every problem *you* solve is best suited to OOP, but making a blanket statement that most of the programming problems in the world are is decidedly untrue.

  • # Posted By Matt Woodward | 5/27/09 1:27 PM

    @Mike--this is why I should stay out of these ridiculous discussions. People always pull out the same old specious arguments. Of course there's plenty of stuff out there that isn't OO, although COBOL is no longer one of them. Doesn't mean that the vast majority of stuff out there today *isn't* OO and that it's valid to ignore OO. That's the only point I'm trying to make. This isn't about statistics, it isn't about finding some niche where OO never took off, it's about being knowledgeable in the paradigms that are used by the vast majority of the programming world and being employable in today's market.

    Now let's get real. Just make a list of all the languages that are OO and those that aren't, and tell me which ones are most in demand in the job market today. That's my point. It's not about lines of code, maintaining legacy code from the 80s (probably steady work but certainly not something that's in high demand), etc.--if that's all you care about then be my guest and go learn REXX. But I don't think most of us are looking at things that way.

    @Roland--we're talking about web/application development here, so don't take things out of context just to make a point that I wouldn't argue anyway. If you're writing Linux drivers then of course you're going to be using straight C. That doesn't make my assertion less valid for the context of this discussion. And by me saying OO has won, I'm not saying it's the right tool for every job. Did I say that? Reading back ... nope, I sure didn't. So don't put words in my mouth.

    Since those of you jumping down my throat apparently still missed my point, I'll state it again: if you ignore OO it's at your own peril. I never said it's the right tool for every job. I never said if you don't "do OO" (again, not that that statement has any meaning, but since that seems to be the fallback everyone uses ...) that you're a bad programmer. But I can't stand by and listen to people act like ignoring OO and not becoming well-versed in it is a valid choice in today's programming world for web/business application development. Unless you're in one of the non-OO niches that the OO-naysayers love to drag out every time this discussion comes up, ignoring OO and refusing to learn it is quite simply a very bad move for your career.

  • # Posted By Brian Kotek | 5/27/09 1:35 PM

    @Roland: So now we're down to the assertion that even though all of the dominant languages support OO development, most of the people using those languages don't actually use those language features?

    To me, that's like saying that lots of people have a bed, but there's no way to prove that most of them actually sleep in it.

    I guess we'll just have to agree to disagree. It would be interesting to run an experiment where two resumes were put online, one listing "OO Development" as a skill, and the other listing "Procedural Development" as a skill, and see what happens.

  • # Posted By Steve Bryant | 5/27/09 1:52 PM

    Did someone suggest ignoring OO? I guess I missed that.

  • # Posted By Brian Kotek | 5/27/09 2:17 PM

    No, Steve. Everyone seems to agree that understanding it is a good thing. There just seems to be some disagreement on whether one of the reasons why is that it's the dominant paradigm in software, and not learning it will be very limiting from a career standpoint.

  • # Posted By Peter J. Farrell | 5/27/09 2:41 PM

    I don't care which paradigm has "won". I care about my future employment, and my future earning potential for myself, my wife, and any kids I may have some day. The biggest thing for me is that I learned about new things, whether it is OOP, TCP/IP, encryption, caching strategies or javascript. Any and all of those new things have changed how I program... for the better (even though sometimes I wanted to hit my head against the wall and ask... why do I bother?).

    We're in a free country, able to make decisions as we see fit. Personally, I feel that learning object oriented programming has been a benefit to my career. I feel that learning other languages such as Java, Python or Ruby is good for my career. And while I don't program Ruby or Python very well, it's an ongoing process.

    I think that these are the most common arguments against OO that people use. I'll comment after each of them.

    * It's the wrong tool for the job

    Yes, it could be. How would you know if you don't use it? Take two people faced with the same problem. It may end up being the right tool for one and not the right tool for the other. It's impossible to answer this question because it's subjective and not objective in nature. Also, as humans we always want to be right and not wrong. I know I'm wrong half the time anyways, so it's like water off a duck's back.

    * It's slow (i.e. procedural runs faster)

    Yes, it could be. Programming is somewhat a mix of art and science. Just because I give you paint and canvas does not mean you'll be able to paint the Mona Lisa. Take the same two people faced with the same problem again. It depends what you are architecting for and what the requirements of the application are.

    * I don't have time / I can't bill for it because it takes me longer

    This is more of a valid argument to me. However, I want to point out that learning about OOP means you have to spend time actually coding, not just reading a book or taking a class. I like to think of this as practical "art" or you may just think of it as experience. With practice, you get faster and you refine your techniques and hopefully get better.

    The last argument is one of concern for me because the world of software development is a fast paced world. This translates to always learning about something new, because by the time you learn it -- there are 10 more things you need to learn. I know most people say they don't have time to learn lots of new things because they are burned out or just plainly not interested. I attribute this to the always-on lifestyle that the digital world has brought us.

    My advice is to switch off the TV, blogs, email, IM and twitter -- voila -- you have time to learn. I attribute half of what I've learned in the past 4 years to working on Mach-II and learning from Kurt, Matt and others. They learn new stuff, I get the nice tasty morsels, and the same goes in the opposite direction. It's almost a learning collective. Also, learning doesn't occur in a vacuum -- interact with people that have more experience in a given area, or find someone to bounce ideas off of.

    Honestly, this argument of OO versus procedural is just silly. Either you subscribe to expanding your mind through different ideas or just keep doing what you've always done. Nothing we've discussed in this thread indicates that a silver bullet exists and I think that searching for it is what drags people down. So let me be clear. There is no silver bullet. Stop looking for one!

    Personally, I view the arguments against OO as "excuses". Please do *not* take offense at that, but railing against OO makes people feel better if they are having a hard time learning OOP or if they don't want to take the time to learn it. Don't make excuses to justify not learning it in the first place.

    My responses above to the common arguments against OOP should **not** be perceived as flames or bait. No insults have been targeted at anybody in particular. Ultimately, I want to always learn about stuff and understand how things work. It's an important aspect of my life, so I'm sticking to it and I'm glad it followed me into adulthood.

    Feel free to flame me or bait me. I'll just keep trucking along and keep my head down anyways.

  • # Posted By Hal Helms | 6/1/09 10:44 AM

    My concerns about OO in CF aren't because...

    1. I don't know it
    2. I'm unwilling to learn more
    3. I prefer procedural code
    4. I'm trying to excuse my shortcomings

    Rather, my concern is that the ColdFusion world has, in large part, embraced OO while Adobe continues to produce faulty implementations. Worse, OO has become something of a religion in ColdFusion, despite the problems with it as it's implemented.

    I wouldn't be writing this about Java or Ruby or Python. But ColdFusion developers are being urged to adopt something that, while wonderful in many languages, is clearly deficient in ColdFusion. That, coupled with OO as a religion, is a recipe for frustration.

    You can see my own take on Marc's rant at