Maintainable, but for whom?
Jdn is making an excellent point in this post:
Okay, so, TDD-like design, ORM solution, using MVP. Oh, and talk to the users, preferably before you begin coding.
One problem (well, it's really more than one). I know for a fact that I am going to be handing this application off to other people. I will not be maintaining it. I know the people who I will be handing it off to, so I know their skill sets, I know generally how they like to code.
None of them have ever used ORM.
None of them do unit testing. One knows what they are and for whatever reason hates them. The others just don't know.
None of them have ever used MVP/MVC, and I doubt any but one has even heard of it.
All of them are intelligent, so could grasp all the concepts readily, and become proficient with them over time. If they are given time by their bosses, or do the work overtime, or whatever.
There is a 'standard' architecture in place that they have worked with for quite some time. I personally think it blows, and frankly, so do most of them, but it is familiar, and applications can be passed between developers as they use a common style.
There are several things going on in this situation. The two most important are that the current practice of bad code is (luckily) widely recognized as such, and that the people who work there are open-minded and intelligent.
Before I get to the main point, I want to relate something about my current project. If you wish to maintain it, you need to have a good understanding of OR/M, IoC and MVC. Without those, you can't really do much with the application. That said, good use of IoC means that it is mostly transparent, and abusing the language gives you natural syntax like FindAll( Where.User.Name == "Ayende") for the (simple) OR/M, and MVC isn't hard to learn.
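As an aside, that syntax is not magic; it is a few operator overloads away. Here is a rough sketch of the idea only (the Where, UserFields and Criterion classes are invented for illustration, not the actual implementation):

    // A static entry point exposing one "fields" object per entity.
    public static class Where
    {
        public static readonly UserFields User = new UserFields();
    }

    public class UserFields
    {
        public readonly PropertyCriteria Name = new PropertyCriteria("Name");
    }

    // Overloading == turns a natural-looking comparison into a criterion
    // object that the (simple) OR/M can later translate into SQL.
    public class PropertyCriteria
    {
        private readonly string propertyName;

        public PropertyCriteria(string propertyName)
        {
            this.propertyName = propertyName;
        }

        public static Criterion operator ==(PropertyCriteria property, object value)
        {
            return new Criterion(property.propertyName, "=", value);
        }

        public static Criterion operator !=(PropertyCriteria property, object value)
        {
            return new Criterion(property.propertyName, "<>", value);
        }

        // Required by the compiler once == and != are overloaded.
        public override bool Equals(object obj) { return base.Equals(obj); }
        public override int GetHashCode() { return base.GetHashCode(); }
    }

    // A dumb data holder describing a single restriction.
    public class Criterion
    {
        public readonly string Property;
        public readonly string Operator;
        public readonly object Value;

        public Criterion(string property, string op, object value)
        {
            Property = property;
            Operator = op;
            Value = value;
        }
    }

    // Usage: FindAll(Where.User.Name == "Ayende") now hands FindAll a Criterion.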
Back to Jdn's post, let us consider his point for a moment. Building the application using TDD, IoC, OR/M, etc. would create a maintainable application, but it wouldn't be maintainable by someone who doesn't know all that. Building an application using proven bad practices will ensure that anyone can hack at it, but that it has a much higher cost to maintain and extend.
I am okay with that. Because my view is that having the developers learn a better way to build software is much less costly than continuing to produce software that is hard to maintain. In simple terms, if you need to invest a week in your developers, you will get your investment back several times over when they produce better code that is easier to maintain and extend, with fewer bugs.
Doing it the old way seems defeatist to me (although, in Jdn's case, he seems to be leaving his current employer, which is something that I am ignoring in this analysis). It is the old "we have always done it this way" approach. Sure, you can use a mule to plow a field, it works. But a tractor would do a much better job, even though it requires knowing how to drive first.
Comments
Another brilliant post - I agree with the above.
The point you leave out:
"although, in Jdn's case, he seems to be leaving his current employee, which is something that I am ignoring in this analysis"
is the key point.
To understand and maintain MVP/IOC/ORM/whatever requires experience. Or teaching.
The vast majority of code doesn't require understanding any of MVP/IoC/ORM/whatever. Moreover, the vast majority of that code is happily supporting businesses that are worth billions of dollars.
Given that, what is the ethical obligation of someone to support that business, especially when there is a common architecture?
My answer is: Building software that fits the architecture of the business.
In a perfect world, you can build a better architecture, teach people along the way, etc.
But I live in the real world.
I am on a 2 million dollar software product. I would call that the real world. A year ago, 6 out of the 8 developers had never heard of TDD, DDD, O/RM, Continuous Design, or even programmed in C#. They were used to banging out stuff in a previous version using VB6. We decided to do it right from the get-go. We did teach people along the way, first doing 1/2 sessions every day, where we made all the developers draw pieces of the software on the whiteboard and taught them code smells and the like. We made a decision that we all agreed to do everything in pairs. The two of us who were experienced made sure we paired with everyone else to teach TDD and everything else. A year later, all 6 of them can do it, and we have 2,000+ real-world unit tests.
All I ever hear you saying, jdn, is you can't, you can't, and how all of us who have worked our butts off to get here are elitist and not in the "real world." You blame everyone else. But the fact of the matter is that none of us are especially smart; we just keep working hard at it and make it happen, even when it was really tough with management. You are smart too. You are reading this stuff. You can learn it too, as all of us did.
No, I am not leaving it out.
I am excluding your case specifically because it is a unique one, and even then there are ways to teach developers good practices quickly.
When the common architecture is known to be bad? To change that would be my answer.
That is a sad approach, frankly. Because it presumes from the start that you cannot really do much about what is going on around you.
In the time that it takes to finish the common project, there is a lot that can be done in the development environment, especially when working with intelligent people who already know that the current approach sucks.
About four years ago, I worked for a company in which I was one of the developers who maintained an application largely written by a couple of much more savvy consultants. Although far from perfect, the application was written with good practices in mind. At the time, I was a VB6 guy who had started developing in C# not long before. My team knew nothing of TDD, MVP, ORM, etc. I thought MVP stood for "Most Valuable Player". After the consultants moved on, I was in charge of the first set of enhancements and bug fixes for this application. Even though I helped with the original development, I was faced with an application I didn't fully grasp. I cussed it at first, but soon came to realize I had a good application to learn from. It was my first experience with TDD. From there, I got hooked on TDD. From the lead consultant, I learned what design patterns were, what MVC was, the basic concepts of agile development practices and, of course, TDD. It was a lot to digest at first, but I was fortunate enough to have a real-world application to learn from, and a newfound awareness of what I should learn. Those consultants could just as easily have written the typical procedural-style applications our development team was used to seeing, but they chose to do it right. That was a turning point in my life as a developer. Half the battle (probably more) is just making people aware of what TDD is or that things like ReSharper, NHibernate, Rhino Mocks, etc. exist. I was fortunate that somebody decided to leave behind a nugget of gold rather than the usual lump of coal.
I struggle with this kind of question every week. For the last two years I have been working on maintaining and extending a large legacy system. After a major refactoring, I usually show what I have done to my colleagues and try to explain how the code is now more maintainable, testable, and reduced in size, and sometimes I am met with comments like "No one but you will understand that".
Sometimes I find it hard to explain the benefits of NHibernate and Castle Windsor.
Sam: you forget that the project you're on isn't a project which is already 2 years in production and to which you have to add new features worth 2 million. That's a totally different game, but it's the game of software maintenance.
You all miss the point. The point isn't about focusing on technicalities: 'what's O/R mapping, what's an IoC, etc.'... that's reading some manuals and you're done. Knowing these doesn't make the software suddenly maintainable; THAT'S the key mistake you all make.
The key to maintainable software is knowing why a piece of code is written the way it is written, i.e.: why algorithm A is chosen and why alternatives B and C are ignored, and why the feature is kept simple and thus the design kept simple and why the design of a feature didn't take into account some more flexibility for extension.
A software maintainer needs answers to these questions so s/he can make the proper decisions where (!) to make changes. This has nothing, absolutely NOTHING to do with the usage of e.g. o/r mapping or IoC.
For the people who still don't get it: NO, I'm not talking about the question WHY IoC was used, that's not something a software maintainer is interested in.
Saying that the answer is to teach the developers that TDD etc. is better shows a lack of understanding of what it takes to maintain a large piece of software which is already in production and THUS already has a lot of dependencies in place.
Oh, maintainability isn't about fixing some bugs. It's about moving a codebase from version 1.0 to v2.0 in the right timeframe, with the resources provided, and with the knowledge that the original team is likely not working on it anymore and has moved on to other projects and perhaps other employers.
It's sad to see that there's so little understanding about this very important subject. In the next couple of years, software maintenance will become the key point of software engineering simply because there won't be enough software engineers on the planet to maintain all the different applications.
Frans,
Thanks for your well expressed comments and you are absolutely right then it isn't about "technologies" but what you do INSIDE the code that will either make it less or more maintainable.
But IoC and things like that are PRINCIPLES, not technologies; they lead to code that can be extended or continuously evolved without ripping up major pieces. Which O/RM one chooses may be a technology choice, but O/RM in itself can and should be discussed as one of the things one can do to write cleaner code.
I also made this comment in reply to your comment on my blog. I am sorry if we gave you the impression that TDD was everything. It helps on the OUTSIDE, but it is just ONE out of 12 practices that we follow in Extreme Programming. They have to be done together to have any benefit. We also develop the INSIDE of the code, the algorithms and such, in a Continuous Design manner, constantly refactoring the code, looking for simplifications and making it cleaner. When we talk about this and say the code is "extensible", it's not so things can be plugged in; it's so the maintainers can come in and make changes 2 years later without ripping everything apart. If you don't introduce code rot into a system, and you continually work against it, it isn't there in the future. 18 months into our project, 2 new people came in from the outside with hardly any skill sets, and although it's been hard for them, they are able to look at BOTH the unit tests and the well-factored code, as well as pair, to understand the INSIDE as well.
I hope this helps; I think this is an excellent discussion. If there is something I am not explaining well or you think I am not considering, I am happy to talk about it. Thanks.
For me, one interesting issue is whether using a framework or an approach (O/RM, IoC, DI, AOP) can move more code away from having to be maintained.
For example, the maintenance team wouldn't have to maintain NHibernate. Also, it would be safe to assume that the code using NHibernate wouldn't be longer than the equivalent code not using NHibernate, or it would be a no-brainer to go without NHibernate.
If the remaining code (the code outside NHibernate, that is, your application) is smaller than the equivalent code that doesn't use NHibernate, the question appears to become: is the NHibernate-using code so much more complex that this outweighs the value of it being shorter?
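To make that question concrete, here is a rough before/after sketch of loading a customer by id. It assumes a Customer class with Id, Name and Email, an existing NHibernate mapping, and made-up table and column names; the point is only the difference in the amount of code you own and maintain:

    using System.Data.SqlClient;
    using NHibernate;

    public class CustomerLookup
    {
        private readonly ISessionFactory sessionFactory;
        private readonly string connectionString;

        public CustomerLookup(ISessionFactory sessionFactory, string connectionString)
        {
            this.sessionFactory = sessionFactory;
            this.connectionString = connectionString;
        }

        // With NHibernate: the mapping does the column-to-property work.
        public Customer GetWithNHibernate(int customerId)
        {
            using (ISession session = sessionFactory.OpenSession())
            {
                return session.Get<Customer>(customerId);
            }
        }

        // The hand-rolled equivalent, repeated for every query in the application.
        public Customer GetWithAdoNet(int customerId)
        {
            using (SqlConnection connection = new SqlConnection(connectionString))
            using (SqlCommand command = new SqlCommand(
                "SELECT Id, Name, Email FROM Customers WHERE Id = @id", connection))
            {
                command.Parameters.AddWithValue("@id", customerId);
                connection.Open();
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    if (!reader.Read())
                        return null;
                    Customer customer = new Customer();
                    customer.Id = reader.GetInt32(0);
                    customer.Name = reader.GetString(1);
                    customer.Email = reader.GetString(2);
                    return customer;
                }
            }
        }
    }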
That would appear to be the question. But is it really a good question? What if we answer "Yes, it is so much more complex that we shouldn't use it, even though it is shorter"? Is that a good answer?
I don't think it is. In my view, development has always been about learning how to use new tools, frameworks, abstraction levels, languages etc etc that will help us express more using less code. Otherwise we could all just code in assembly language. For every step of the way, I think there was a time when the new step ahead was still so new that it appeared to be more complex than could be motivated by the reduced LOC. Had this argument always been heeded, we would indeed still be programming in assembly.
If the frameworks offer more powerful expressions in the code, the move towards using them will be almost inexorable. To ask developers to stay away from them because the maintainers may not yet have picked up on them is thus an argument that may occasionally be the realistic observation in the real world, but that should more generally be regarded with some suspicion, imho. To accept the argument is to succumb to the specifics of the situation. Thus I would see Jdn's argument as more short-sighted, but in his particular situation perhaps quite correct, whereas Ayende's is more far-sighted but perhaps (unfortunately so) unrealistic in Jdn's situation.
I agree with everyone, but mainly Frans.
Did I mention there was a common windowing system, where this application will sit with (not sure of the real count) at least a dozen other applications? And that I have to make it play nice with that?
If I had the time, the expertise and the clout, I would take the 'far-sighted' approach. But:
I don't have the time. Literally.
I may or may not have the expertise.
I definitely don't have the clout. Let me speak one word: committee.
Sam, do you have any idea how lucky you are to be able to have done things the way you were able to? I understand that you in large part created your own luck, but you still have to be in the right situation in the right circumstances to make the call.
You can't simply re-write an architecture used by dozens of people because you want to. If I knew I was going to be there for a longer period of time, I might go ahead and make the argument. If I was a full-time employee, I'd most likely just go ahead and do what I wanted to, and let them fire me. I'm a joy to work with that way.
But, when given a prototype of something to work on for a bit before passing it back.....I'm still going to ORM it if I can, because NHibernate is approved, but anything much else is really out of scope.
I should have left the 'But I live in the real world' comment out previously, but this is what I was getting at.
I appreciate your situation, jdn, and have been in it many times. It is extremely frustrating and I don't want to minimize it at all. It's very real, and when I have been in it, painful.
All you can do is go forward and do the things you are doing. I, for one, am impressed by your deep desire to learn (something I like to think we all share) and by all that you have been bringing to these conversations. It is so valuable for all of us to think outside our little worlds and understand each other's.
Doing TDD/Unit Tests in a legacy environment is very difficult. It requires far more skill, and not just of the technical variety. It is not always the best return on value to create unit tests for legacy code. Many people have found FitNesse and integration tests in general to be a better return on value for legacy code. For all new work TDD is still the way to go, and yes it is more difficult than a green field project.
What would I like you to focus on in the previous paragraph? Not the solution to the problem. The implied wisdom in the approach to solve the problem. "Many people have found" XXX helpful. That there is a community out there that can help you. You are not the first person to experience what you are going through. As much as you may think you want to be you are lucky that your situation is not special. Other people have been though it and found solutions.
Where does this leave you?
First, you will have to accept that there is an answer. Second, that you might not like the answer; it will require you to leave your comfort zone. Third, you will need to experiment with the answers you find to make them work for you. Fourth, and most important, you will need patience with yourself and others. This is a path, not a destination.
Now I want to be clear that I and many others in the community will support you in your endeavors to grow in this area. You can email me directly if you want. I have helped many people and been helped by many people. We want you to find your way and be successful.
Lots of good points brought up in this discussion. Unfortunately, too often many software managers fall into the trap of thinking that developers are "plug and play" in a project and assume they can be added/removed as needed.
When you have this attitude toward development talent, adopting things like MVP, ORM, and TDD becomes very difficult, as it is exceedingly difficult to find developers who are familiar with these approaches, let alone those who have experience with them. Adding new developers and expecting them to "plug right in" is more difficult when using an "off the grid" approach like TDD or ORM.
Unless management makes it a priority to keep existing developers happy and to provide sufficient ramp-up time for new developers, it will always be difficult to adopt new approaches that have not yet achieved widespread adoption.
I have been in varied environments:
Working on enterprise apps that talk to old mainframe apps at Dell
Web 2.0 stuff in start-up land
ISV products including inheriting the nasty code
greenfield products where I did everything "my way"
coaching software teams.
It's definitely easier when starting from scratch, but I've also had to inherit other code and make incremental changes. I've done maintenance, but I still insisted that the code HAD to get better. I found it materially beneficial to read Michael Feathers' book, "Working Effectively with Legacy Code"
http://www.amazon.com/Working-Effectively-Legacy-Robert-Martin/dp/0131177052/ref=pd_bbs_sr_1/105-9499862-9761233?ie=UTF8&s=books&qid=1182110426&sr=8-1
Karthik,
I would assert that any place that treats their employees in such a fashion is not a place that I would like to work for or with.
When I was in the army, the ultimate place for plug & play mentality, there was a significant emphasis on making soldiers happy, and a true understanding of what it was to have a good soldier serving with you. Those are rare, and people fight over them.
To suggest that you can replace one person with another, even given they have the same training is ludicrous.
I'm glad to see the wide and varied discussion.
So let me go ahead and make a comment that will probably appall (at least) Ayende.
"Those are rare, and people fight over them."
In a perverse way, I can see, from the perspective of a business, why having good/great developers, who bring in advanced programming techniques, can be a business risk.
Especially in America (though I would bet this is global more and more), you have to view all employees as being replaceable, because the good/great ones will always have better opportunities (even if they are not actively looking), and turnover for whatever reason is the norm not the exception.
Suppose you are a business with an established software 'inventory', and suppose it isn't the greatest in the world. But it gets the job done, more or less.
Suppose an Ayende-level developer comes in and wants to change things. We already know he is a risk because he says things like:
"not a place that I would like to work for or with."
He's brilliant, but he's definitely a risk to split.
On the other hand, there are people who know how to work with, just to use an example, Infragistics controls. Or typed datasets. Or drag and drop. All things that make me cry and weep.
Mediocre code that can be maintained by a wider pool of developers is in a certain respect more valuable to a business than having great code that can only be maintained by a significantly smaller subset of developers.
I can see why the previous sentence may appall people. But I still, for the life of me, can't see any argument that tells me it is false.
What I'm trying to do with my own business is to do things my way, things that (I hope) bring considerable business value, and give me competitive advantage.
At the same time, I am going to be engaging clients that have existing systems. I can take the Ayende/Bellware attitude (since Scott has said the same thing in a different manner) that 'if they don't do things my way, I won't do business with them.'
And believe me, I have to fight with the people I partner with that there are certain things that are sort of baseline items that I insist on.
But at the same time, I'm offering services for clients. I can't disrupt their business because I don't think their code is pretty enough.
What I can do better, going forward, is learn to make the incremental changes that gets them on their way to prettier code. My attitude is not "well, I can't do anything so I won't even try."
But at the end of the day, I have to do what is best for the client. If that means typed datasets (picking on them, but include anything you personally cringe over), then I can partial class and override to make them better, but typed datasets it will be.
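For what it's worth, the partial class approach I mean looks roughly like this against a designer-generated typed dataset (OrdersDataSet, OrdersRow and the column names are placeholders, not a real schema):

    // OrdersDataSet.Extensions.cs: lives alongside the designer-generated file,
    // so regenerating the dataset does not wipe out the hand-written logic.
    public partial class OrdersDataSet
    {
        public partial class OrdersRow
        {
            // Business logic added to the generated row type without touching
            // the generated code itself.
            public decimal TotalWithTax(decimal taxRate)
            {
                return Subtotal * (1 + taxRate);
            }

            public bool IsOverdue(System.DateTime asOf)
            {
                return !Shipped && DueDate < asOf;
            }
        }
    }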
jdn,
Well put! I think with software, as with anything else, there is a reality that more businesses choose to imitate rather than innovate. Imitators will choose the common and widely chosen path simply because there is less unknown risk involved. Innovators will stay at the cutting edge and take risks that the imitators won't. The reality is that innovators will always be outnumbered by imitators.
The same applies in software. Ayende and others may look at an imitator software company and decide to say "not a place that I would like to work for or with" and that is perfectly acceptable. If you understand the limitations of such a view and can still succeed then more power to you!
"Because it presumes from the start that you cannot really do much about what is going on around you."
This has become such a common-place attitude in companies these days, it can almost be assumed to be the personality of the project walking in the door. People need to change their mindsets about what can be accomplished and learned in relatively short periods of time within the lifecycle of a project in order to improve that project. Certainly folks like Oren, Frans, CodeBetter and many, many others are so vocal about how to address these issues through principles and practices (typically with an XP focus) and of course technologies (MonoRail framework, Resharper and TD.Net add-ins, NSpec) that slowly but surely the voices are being heard to alleviate these attitudes.
"In the time that it takes to finish the common project, there is a lot that can be done in the development environment, especially when working with intelligent people who already know that the current approach sucks."
If you want to be better at your job, you surround yourself with people who are better at it than you are. Having smart, driven and passionate people around you who are focused on change, and who know how to implement those changes gradually and effectively, is key to improving the processes. I recently went to a client that had no CI, no TDD practitioners, no pair programming, but they have willing and smart developers and want to make the changes because they have realized over the years, through the experiences of others, that certain things have to be put in place and start to happen if they want to be more successful. Companies that are blind and deaf to those changes or just plain unwilling to change are the companies that I avoid.
"Building the application using TDD, IoC, OR/M, etc would create a maintainable application, but it wouldn't be maintainable by someone who doesn't know all that. Building an application application using proven bad practices will ensure that anyone can hack at it, but that it has much higher cost to maintain and extend."
And that, I would guess, is the blueprint of the majority of software life cycles these days. Change is on the way, however. I've been in jdn's situation, and I avoid it these days because I can right now. That might not always be the case. It's a shitty situation from my perspective and I admire anyone who is still in it for the fight that they must endure. Discussions like these always help in keeping these things at the forefront of the community's efforts to alleviate and save our fellow developers.
@jdn
I can identify with your environment, and I have been in the same boat. As a consultant (now), I'm working with just a normal team.
Jdn, I detect two big assumptions that might be the basis for this position. I'd like to address those specifically.
Assumption #1: Mediocre code can be maintained easily by average developers.
I have found that mediocre code is costly to maintain, but it can suffice for a period of time. That period of time ends when the system has to change significantly or integrate in a different way than originally planned. If the system never has to change, or only changes marginally, then mediocre code may suffice. The business must understand they are paying for this constraint, however. Often, the system must change in a significant way, and this is where mediocre code breaks down. The end result is often a system rewrite, and we have somehow convinced business executives that it has to be this way.
What is the price of maintaining mediocre code and the subsequent rewrite 2 years later? I'd venture to guess it's more than the business executive was expecting.
Assumption #2: Great code can only be maintained by the top XX% of the industry.
I have found this to be absolutely false. Even for a developer who has never written object-oriented code, this code is easy to read because it contains objects that represent the real world. All the developers understand the real world.
Great code isn't complicated. On the contrary, great code is much simpler than mediocre code. Mediocre code that finds its way into the codebase is ruthlessly refactored into great code. Great code is so lean that it is maintainable by anyone, especially average developers. With great code, there is no concept duplication... every class has a single responsibility... concerns are separated into different classes. When behavior needs to change, there is just one place to go, not several.
Great code is so easy to follow, anyone can maintain it.
If it's hard to maintain and hard to follow, ____it isn't great code____.
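To put the 'one place to go' point in concrete terms, here is a small, contrived sketch; the class names are invented for the example, not from any particular codebase:

    // Each class has a single reason to change.
    public class Order
    {
        public decimal Subtotal;
        public string CustomerType;
    }

    // All discount policy lives here and nowhere else; when the business
    // changes the rule, this is the only class that changes.
    public class DiscountPolicy
    {
        public decimal DiscountFor(Order order)
        {
            if (order.CustomerType == "Preferred")
                return order.Subtotal * 0.10m;
            return 0m;
        }
    }

    // Persistence is a separate concern in a separate class.
    public class OrderRepository
    {
        public void Save(Order order)
        {
            // talks to the database; no business rules here
        }
    }

    // Coordination only: no business rules, no data access details.
    public class OrderService
    {
        private readonly DiscountPolicy policy = new DiscountPolicy();
        private readonly OrderRepository repository = new OrderRepository();

        public void Checkout(Order order)
        {
            order.Subtotal -= policy.DiscountFor(order);
            repository.Save(order);
        }
    }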
@Jeffrey Palermo
to your assumption #1:
This is a common comment, and I agree with it.
Having said that, I don't know how to say this often enough.
And I mean this sincerely.
There is a huge amount of code that is developed that never needs to be re-written. Not in any significant way.
Once you write it to do the job, it doesn't change.
Or it changes in a minor way. To the extent it is drag and drop, it is supportable by just about anyone who has used VS.
Assumption #2:
Any beginner knows how to drag and drop, because that is what VS IDE supports. Not any beginner knows how to do MVP.
There is no argument otherwise. Do you understand why separation of concerns is hard? It simply is.
It's highly teachable. It has to be, since I was taught it. But it isn't easy.
@jdn
"There is a huge amount of code that is developed that never needs to be re-written. Not in any significant way."
But like you said, it can change, and in poorly written code, even minor changes can cause gigantic problems. When you say "Not in any significant way", this is misleading. The business rules that need to be applied or changed may not be significant, but the coding behind it may be difficult, and poor design is often a major contributing factor to the difficulties. So just because it doesn't have major revision type changes, this doesn't mean the changes, due to poor design, are not major. Poorly written code is also non-extensible, at least easily or in the standard ways you would expect.
On assumption 2, Jeffrey is correct. Well written, self-documenting and test-covered code is much, much easier than trying to figure out poorly modeled drag-and-drop applications. I've been in both situations from a maintenance POV and wouldn't work on a drag-and-drop built application, just because it's too hard to follow and understand what is going on.
"Any beginner knows how to drag and drop, because that is what VS IDE supports. Not any beginner knows how to do MVP."
Drag and drop has a bigger learning curve, for me, than learning common and simple architecture patterns such as MVP, and it's always been that way. I still, to this day, cannot drag and drop very easily, and can code an application much faster without using it. Controls are difficult for me to use, which may be the reason I steer away from UI in the first place. This is just the way I've been brought up by my peers. So while your statement may have some truth, keep in mind that not all advanced developers can drag and drop, so I would venture to guess that not all beginners can either, so they might as well be led down the path of better code.
@Raymond
How can I say that I agree with everything you say, but still disagree?
Maybe it is just the things I've worked on or been contracted to work on.
But I can tell you, there are applications that I've created or upgraded that were never to be extended. They were/are designed to satisfy one or more specific business requirements where you needed to do.....X.
And 'X' was very specific. Just do it, and you are done.
And I would say, emphatically, it is precisely because you are advanced that you can't handle drag and drop as well as a less advanced developer.
This is one of those things where I bet you, or anyone else, that I could demonstrate it in a demo. I swear I am not making this stuff up. What I have been calling 'mediocre code' is vastly easier to maintain.
It really is. I think Frans and Karthik know what I mean, and I'm not just saying that because they agree with me.
I'm not advocating mediocre code. Not at all. I think it is better for the community to advance better programming techniques, and to spread them in bits and pieces, if that is all you can do, or in huge chunks if it fits.
But, again, I swear I'm not making this stuff up. Sometimes, the only thing you can do is try to advance clients in pieces towards a better way, and to try to do more than that is actively harmful.
I can relate to many of these comments - I joined a small company in 2005 as the second developer. The original developer had basically written the app themselves, organically, with no documentation or comments.
Did I mention the business layer was written in VB6? With a classic ASP front end? (it was written in 2004 - not 1997)
This developer really didn't have a clue about design patterns, object oriented design, normalized database design, and leveraging functions and views with stored procedures. Business logic was in javascript, asp, VB6, and t-SQL - a total nightmare. I don't consider myself a slow learner, but trying to make sense of this code took a while, because I would literally have to step through it. And tables in the database with Fk-Pk relationships used different names for the related columns, so that was fun.
Did I also mention the developer was an arrogant jackass and really didn't help me out because he thought I was a threat?
I suppose he was right. I extolled the virtues of C# to the managers/owners. I didn't know everything, and I still don't (that is why I read blogs like this) but I had a solid Java background and CS degree. And I got my chance to build a new sub-app using objects & Nhibernate, struggling, making mistakes, then figuring out what I did wrong. Meanwhile, the original developer complained about having to maintain HIS OWN code and not having enough time to learn .NET (he was told he could take a couple hours a day to read - did I mention he was a jackass?)
To make a long story short, the original developer left because he felt I pushed him out - not really, I just refused to do things the old way, and proved I knew what I was doing. I started changing things gradually, and now I am managing 3 developers and the upcoming re-write of the legacy VB code into C#/Nhibby (no business logic in SQL!!!).
I do have another big hurdle left, because I am the only one that really understands NHibernate/IoC, so what I am attempting to do is lay down the foundation and test examples that others can build upon without knowing the gory details of NHibernate/IoC.
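In case it helps anyone in a similar spot, the kind of foundation I mean is a small base class that keeps the NHibernate plumbing in one place, so everyone else codes against plain repositories. This is only a sketch with names I made up (and it assumes a mapped Customer class), not the exact code:

    using System.Collections.Generic;
    using NHibernate;

    // The one place in the codebase that knows about ISession; maintainers
    // work with FindById/FindAll/Save and never touch NHibernate directly.
    public abstract class RepositoryBase<T> where T : class
    {
        private readonly ISessionFactory sessionFactory;

        protected RepositoryBase(ISessionFactory sessionFactory)
        {
            this.sessionFactory = sessionFactory;
        }

        public T FindById(object id)
        {
            using (ISession session = sessionFactory.OpenSession())
            {
                return session.Get<T>(id);
            }
        }

        public IList<T> FindAll()
        {
            using (ISession session = sessionFactory.OpenSession())
            {
                return session.CreateCriteria(typeof(T)).List<T>();
            }
        }

        public void Save(T entity)
        {
            using (ISession session = sessionFactory.OpenSession())
            using (ITransaction transaction = session.BeginTransaction())
            {
                session.SaveOrUpdate(entity);
                transaction.Commit();
            }
        }
    }

    // A concrete repository stays trivial.
    public class CustomerRepository : RepositoryBase<Customer>
    {
        public CustomerRepository(ISessionFactory sessionFactory)
            : base(sessionFactory)
        {
        }
    }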
Clean, well documented code with test cases is easy to read, understand and maintain, and that is my goal. And I want to thank Ayende and everyone else who contributes here for helping make me a better developer with their insights.
I've never seen this, even in a project that, once 'delivered', never has changing business requirements, because you still have to change the code during the development cycle.
If you have to fix a bug, then you have to change the code; if the user provides feedback during acceptance testing that they want something to behave differently, then you have to change the code; as the code grows you gain new insights into the overall architecture, you want to be able to change the code to accommodate them. I've never seen anyone write a class, and never have to change a single line of it after they finish typing.
From the point you hit save on version 0.1 of a file, it is effectively in maintenance and subject to change, and many of these techniques will help you to adapt and grow as you need to change.
The benefits don't just come at the end of the project, they come from early on and just keep on giving.
Conversely, every project that dumbed down its code because the 'developers were not smart enough to get it', because the company aspired to mediocrity, has ultimately failed, because the architecture rapidly turned into a big ball of mud. I have seen too many projects which would 'never need to change' suddenly require extension when new requirements arrive (usually long after the guys who wrote the dirty version have upped and gone - I am convinced some people have never stayed with software they write long enough to see the pain), and that extension is expensive and painful just because of the choices that were made.
Technical debt kills not just projects, but whole development teams, because their inability to continue to deliver to changing requirements means that the users lose faith in them. I have seen this time and again, and that's why I have embraced Agile, because right now it seems to have the best toolkit to stop this happening.
I agree with Jeff here: anyone can write complex code, writing simple code requires discipline and effort.
I can agree with many of the sentiments here. Prior to Microsoft, I've worked in organizations such as JDN mentioned. What I've seen time and time again is that the only benefit of "mediocre" coding is faster time to market. However that code is often extremely buggy and an order of magnitude more difficult to maintain in the long run. At my last company they "cranked code" in VB6 on a code base that was 20 years old (it had been converted). The code became increasingly more unmaintainable such that the only option left was a complete rewrite.
In this kind of situation, I don't blame the developers. I blame the management. Management is the one who allows the situation to continue. Contrary to popular opinion, many developers (even the non alpha-geeks) actually do want to learn something new. They simply need management to step in and give them the green light.
My $.02
Glenn,
Agreed, but "the only benefit of "mediocre" coding is faster time to market".
I am willing to challenge that as well.
Mediocre code only has a faster time to market when mediocre coding techniques are the rule, as set by the standards committee.
This is common in consulting - especially in clients that have a history of mediocre code, short term decisions, undervaluation of technology, and numerous other anti-patterns.
The developer who wants to start doing TDD, but in so doing takes on a ton of career risk (be it demanding a different consulting gig, trying to convince a client to change their approach, etc.), needs to be very convinced that all these practices are not only correct, but easy, or at least feasible, to sell.
Easy to sell may happen at agile conferences, but at certain clients - the kind that are, say, just now upgrading from VB5 apps written 12 years ago - not so much.
The path of least resistance sometimes becomes the reason why mediocre code rules the day. I hate it when it happens, but it does happen in many companies where consultants have to meet the demands of account execs who over-promise and under-deliver in order to nail a commission (standard disclaimer: not a problem at my current company).
Mediocre code is faster to get into testing maybe, but probably takes longer to clear testing. You could always ship faster by eliminating those pesky testers.
Time to market
Anecdotal evidence: I was VP of Engineering at a start-up company where time-to-market is crucial. Some questioned whether disciplined practices could work at a start-up, or whether they would slow down delivery with all the technical analysis, clean OO design, ruthless refactoring, full test coverage, no-bug mentality, etc. Just for the record, the team was 3 people, and I mandated a "no bug" policy. I'm off my rocker, you might think, and they did too... at the beginning.
I trained the team on disciplined coding. I did not allow any shortcuts. We wrote unit tests for each bit of code as it was written. We had separation of layers in the application. We actually did use NHibernate with developers who hadn't even heard of it before.
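To give a feel for what a unit test for each bit of code meant in practice, a typical test read something like this (NUnit syntax; the calculator class is illustrative, not from that codebase):

    using NUnit.Framework;

    public class PriceCalculator
    {
        public decimal TotalWithTax(decimal subtotal, decimal taxRate)
        {
            return subtotal * (1 + taxRate);
        }
    }

    [TestFixture]
    public class PriceCalculatorTests
    {
        [Test]
        public void Adds_tax_to_the_subtotal()
        {
            PriceCalculator calculator = new PriceCalculator();
            Assert.AreEqual(110m, calculator.TotalWithTax(100m, 0.10m));
        }
    }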
End result: We had a few bugs, but not enough to warrant a bug database. The bugs that did pop up only lived for a couple of hours before being squashed. Eventually, bugs became a real rarity because the quality of the application was so high.
90,000 lines of code, complete application, 8 months. Full test coverage. Roughly 2,000 unit tests and many full system tests with FIT. No bugs outstanding. 3 people.
To this day, the code is evolving as changes are being made. Most of the codebase is in maintenance mode, and it's a breeze to modify.
Time-to-market: lightning fast. Insisting on top-quality work sped up the team.
@Ayende
So what I meant to say was 'perceived' faster time to market, i.e. it's a fallacy. Meaning that you may be able to get something into the hands of the customer more quickly. However, what you get them is riddled with bugs, impossible to maintain, and extremely delicate, and furthermore it becomes dependent on the original developer who wrote it (all the other devs say 'I am not going near that thing'). Ultimately, the amount of time you spend trying to clean up the code will be several times what it cost to write in the first place. If it was just built right in the first place.... And by right I don't mean using the waterfall method either.
I still go back to that crappy little program in the shipping department that I talked about in my original post.
That crappy little program was surely mediocre code as I mean it. It has to be, it was written in Access. Or Visual Basic. Or both.
Whatever, it was written in 1994 or something like that. And I, as a non-programmer, upgraded it to Access 95. Or something, it was a long time ago.
If the company were still around, that crappy little program would still be around. Maybe upgraded to .NET because....well because software that's been around forever seems to need to be upgraded at some point. I'm not sure why this is.
I don't remember if it was riddled with bugs, but it did what the business needed. The shipping department lived off of it.
There is a lot of software out there like that. From what I've seen, so take that as you will.
I think there is a crucial lesson in there. Some where.
"That crappy little program was surely mediocre code as I mean it. It has to be, it was written in Access. Or Visual Basic. Or both."
I disagree with that statement. I have worked in environments where developers wrote fabulous code in VB6 / Access even with the constraints the language imposed. True, you did not have TDD and true OOP, but that does not prevent you from writing good, refactored and reusable code. I've written plenty of bad code in my time, but I can't make excuses for it....
Sorry, I should have put in an emoticon or something, since I was just being a smart-ass with that line.
@jdn
If you are looking for a scenario where mediocre code suited the business need, you can find one, but I don't think that is the rule.
The existence of a situation where a business was happy with mediocre code does not constitute a rule that all businesses are served well with mediocre code.
My experience has taught me that more business needs are served with great code than with mediocre code.
Do you always have to be disciplined in development to make a tool the business will love? No. Can you always be undisciplined and satisfy the business all the time? No.
I contend that businesses function despite mediocre systems, not because of them.
My base position is that great code always serves the business better than mediocre code.
By the way, if the app is an access database, it can still be great code.
@Jeff
"The existence of a situation where a business was happy with mediocre code does not constitute a rule that all businesses are served well with mediocre code."
Agreed.
"more business needs are served with great code than with mediocre code."
This is either a tautology or a statement that is either true or false.
Assuming it isn't a tautology, I don't know how one would go about proving it one way or the other. It certainly sounds true.
Having said that, I've done a number of conversion jobs of software that had been around for quite some time, where the old code certainly didn't qualify as anything but mediocre. And provided significant business value.
And yet it was being converted for many reasons (though one reason that is almost always used is "The code is old"....so what?).
I have no way of proving that this is true, but I think this is more the norm than the exception. I think it has to be because compared to many disciplines, software programming is in its infancy.
I just looked it up, so things like 'Separation of Concerns' were talked about in the 70s, but when were Design Patterns first codified by GOF, 1994?
Not that those are (necessarily) necessary or sufficient conditions for what produces great code
So, almost by definition, most code has to be, in some respects, mediocre.
And I haven't the slightest idea if any of that is correct. It's somewhat an academic point anyways.
I can completely relate to jdn, but my current situation is a worst-case scenario for any developer. Jdn, who is leaving his company, is conflicted over which approach to go with regarding the maintenance of the software after he leaves.
I, however, work at a fairly well-known Fortune 500 company for which technology itself is not the core business, yet which relies heavily on technology to drive its business (what company doesn't?). Because it is not core to the business, it is viewed and treated as a financial burden rather than something to invest in and develop, and that goes especially for the people involved in software development in any capacity. All developers, regardless of skill set, are seen as interchangeable and plug-n-play. It is a frightening and frustrating culture for any developer (essentially we are all seen as morts).
Also, it is typically expected that the developers who create an application will not be the same ones who maintain it. In fact, the ones who will maintain it are faceless, outsourced, and offshored elsewhere, and in my experience are more ingrained in the typical hackable style of software. I was told by a "software architect" way high up the chain (the "non-coding" architect variant) that it would not be cost effective to use the more "advanced" techniques like MVP, because it would take too long for the maintaining developers to get up to speed, thus costing the company money. I attempted to argue the value of going the other route, but I feel like a lone person fighting an entire culture and an army of non-innovators.
Unfortunately, wider mainstream adoption is an important influencing factor, and the typical attitude is "until Microsoft starts doing it, we won't do it". A lot of businesses only look at the short-term, measurable, and tangible costs of software and rarely at the long term (hence why they later hire consultants to fix the problems when the software becomes unmaintainable and unchangeable: a vicious cycle).
So whadda ya do? (Unlike Ayende I'm not at that "uber-developer" level that would provide me a wider range of opportunities).
From the trenches,
I faced similar issues in the past; the client wanted a complex app to be built "simply" so "anyone" could maintain it.
There is only so much that I will agree to, so the project uses WebForms and not MonoRail, but it is built to be maintainable, so it uses all those advanced techniques.
I would start by ignoring the non coding architect, frankly.
That depends on the way you work (the team, policies in place), but most often, you can get a lot done this way.
For a long-term approach, I would try to get everyone on my team to work in this fashion: "let us try it for a week, see how it is done".
If this is really one of those places where change is not going to happen, I would look elsewhere.
I want to take pride in what I do.
Thank you Ayende for the advice. I appreciate everything you have done for the developer community as a whole (especially Rhino Mocks!)