Re: IoC and Average Programmers
This is a reply to Eli's IoC and Average Programmers.
Just to clarify:
- Coding to Mort's level will get you bad code, period. I deal with big applications, which means that I have zero use for demo-ware features. Trying to force an application to use them "because the programmer can use the designer to do all the work" is stupid. It creates more work, it adds more complexity, and it results in hard-to-maintain code.
- Not investing in developers is stupid, period. Right now We! are hiring. I get to interview a lot of people, and the bar to get hired is not knowledge. I can teach knowledge, and I can mentor beginners. It is fully expected that you will have a learning curve. Trying to avoid that by mandating stupid code means that you will not be able to keep good people, and those that you do keep will not like what they are doing...
That said, there is such a thing as Too Much Magic. But it isn't at Mort's level.
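To make that concrete: the IoC that an "average" developer is actually asked to consume usually amounts to plain constructor injection. The sketch below is a minimal example (in Java, with hypothetical OrderProcessor / OrderRepository names, wired by hand rather than by a container) of what that looks like; there is nothing magical about it.

```java
// ConstructorInjectionSketch.java -- a minimal sketch, with hypothetical names,
// of plain constructor injection: the level of IoC an average developer deals with.

interface OrderRepository {
    void save(String orderId);
}

class SqlOrderRepository implements OrderRepository {
    @Override
    public void save(String orderId) {
        System.out.println("saving order " + orderId + " to the database");
    }
}

class OrderProcessor {
    private final OrderRepository repository;

    // The dependency is declared once, in the constructor;
    // a container (or a test) supplies the implementation.
    OrderProcessor(OrderRepository repository) {
        this.repository = repository;
    }

    void process(String orderId) {
        // business logic would go here
        repository.save(orderId);
    }
}

public class ConstructorInjectionSketch {
    public static void main(String[] args) {
        // Wiring done by hand here; an IoC container would do this step for you.
        OrderProcessor processor = new OrderProcessor(new SqlOrderRepository());
        processor.process("42");
    }
}
```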
Update: this looks relevant - Technical Debt
Comments
My pingback doesn't seem to be working, so http://www.shawnhinsey.com/b/2007/03/06/ayendes-ioc-and-average-programmers/
I totally agree with your two points.
I guess that when you are interviewing many developers, your average drops steeply. Remember that good developers are probably not looking for a job.
A bad developer will keep on looking for a job until a company hires him (or he changes his career), while a good developer will find a job after a few interviews. So it might seem that there are many bad developers out there.
I consider myself an average programmer, and well, that was the average I was talking about.
Eli,
You are either not average, or you work with HAL.