Useless Java
This article (The 'Anti-Java' Professor and the Jobless Programmers) has really annoyed me. Not because of its content, but because the professor being interviewed is so narrowly focused that he is completely out of touch.
Let us take this quote:
“Furthermore, Java is mainly used in Web applications that are mostly fairly trivial,” Dewar says, with his characteristic candor. “If all we do is train students to be able to do simple Web programming in Java, they won't get jobs, since those are the jobs that can be easily outsourced. What we need are software engineers who understand how to build complex systems.”
You know what, I might agree with this: trivial web programming is something that doesn't require highly skilled workers. But the context in which he says it makes all the difference in the world. Here is his next statement:
“By the way Java has almost no presence in such systems. At least as of a few months ago, there was not a single line of safety-critical Java flying in commercial or military aircraft. I recently talked to a customer who had a medium-sized application in Java, which was to be adapted to be part of a safety-critical avionics system. They immediately decided that this meant it would have to be recoded in a suitable language, probably either C or Ada.”
So, if it is not aircraft code, it is "simple Web programming"? Excuse me?!
Leaving aside the part about Java's license not being suitable for life-critical systems, it annoys me that this is the only criterion he thinks is worth mentioning. His bank is running Java, as is the stock exchange, and most likely the payroll system at his university as well. Relegating Java to "simple Web programming" is a huge mistake. Java's sweet spot is building complex systems, where a lot of the design decisions that make it hard to do the simple stuff pay off tremendously.
I can think of quite a few mission-critical systems (people's lives in the balance) that are written in C# (for the purposes of this post, equivalent to Java). That is not to say that aircraft systems shouldn't be written in Ada or C; I have never written one, so I won't comment. But to say that this is the only field of any complexity shows a remarkable amount of ignorance and bigotry.
There was another part that caught my eye: this professor's interviewing technique:
Dewar says that if he were interviewing applicants for a development job, he would quickly eliminate the under-trained by asking the following questions:
1.) You begin to suspect that a problem you are having is due to the compiler generating incorrect code. How would you track this down? How would you prepare a bug report for the compiler vendor? How would you work around the problem?
2.) You begin to suspect that a problem you are having is due to a hardware problem, where the processor is not conforming to its specification. How would you track this down? How would you prepare a bug report for the chip manufacturer, and how would you work around the problem?
“I am afraid I would be met by blank stares from most recent CS graduates, many of whom have never seen assembly or machine language!” he says.
I can tell quite a lot about what he is doing just from the questions he asks, but let me try answering them.
1) Well, I would start by creating an isolated test case and seeing if I can still reproduce the odd behavior. I would remove all code that is even halfway smart, trying to reduce the amount of work the compiler has to do. I would get the language spec and pore over it, making sure that I am not relying on unspecified behavior or misunderstanding the spec. Hopefully I have a decompiler as well, since that would give independent corroboration of whether there is an issue. Then I would take the output, disassemble it, and compare it to what is supposed to happen. I would take that information and submit it to the compiler vendor. Working around the issue... by now I have some understanding of what triggers it, so I would change the code accordingly: simplify it, remove any smart tricks, and so on. I'll also leave a big comment explaining why this was done.
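As a hedged sketch of that first answer (the class name and the suspect expression below are hypothetical, purely for illustration): strip the program down to a single self-contained class that exercises only the suspect computation, print what you got versus what the spec says you should get, and then disassemble the output independently (e.g. with `javap -c`) to see what the compiler actually emitted.

```java
// Hypothetical minimal repro, reduced from a larger program: one class, no
// dependencies, hard-coded inputs, and the suspect expression isolated in
// its own method so the generated code is easy to find in a disassembly.
public class CompilerBugRepro {

    // The computation we suspect is being miscompiled. Per the language
    // spec, (x << 3) + x must equal x * 9 for small ints.
    public static int suspect(int x) {
        return (x << 3) + x;
    }

    public static void main(String[] args) {
        int got = suspect(7);
        int expected = 7 * 9;
        // A mismatch here, on a repro this small, is what you send to the
        // vendor, together with the output of: javap -c CompilerBugRepro
        System.out.println("got=" + got + " expected=" + expected);
    }
}
```

The point is not this particular expression; it is that the repro is small enough that you, the vendor, and the disassembly can all agree on what it should do.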
2) Try it on a different machine with the same processor, then on another machine with a different processor. That rules out problems with a specific machine and problems with my code, respectively. Reduce the problem to the smallest thing that can repro it, go over the spec, and verify my understanding. Eventually I should have a small program that demonstrates the issue. This can be incredibly hard to do if you run into something like memory barriers not working in some scenarios on SMP machines, for example. Working around it... depends on the context...
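A hedged sketch of what that small demonstration program might look like (everything here is hypothetical): run the reduced operation in a tight loop and count divergences from the specified result, so the very same harness can be carried unchanged to a different machine or a different processor.

```java
// Hypothetical stress harness for a suspected hardware fault: repeat the
// reduced operation many times and count results that diverge from the
// specified one. On conforming hardware the failure count stays at zero;
// running the identical harness on other machines isolates the culprit.
public class HardwareRepro {

    // The operation suspected of misbehaving, reduced to its smallest form.
    public static long reduced(long a, long b) {
        return a * b;
    }

    public static void main(String[] args) {
        long expected = 6L * 7L;
        int failures = 0;
        for (int i = 0; i < 1_000_000; i++) {
            if (reduced(6L, 7L) != expected) {
                failures++;
            }
        }
        System.out.println("failures=" + failures);
    }
}
```

A nonzero count on one specific box, and zero everywhere else, is the kind of evidence a chip manufacturer can actually act on.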
Comments
The way I understood the article is:
If a language (i.e. Java) has an extensive framework library, it is meant to be used by idiots.
Real programmers use IL or assembly code to program mission critical applications.
I'm personally aware of people mistaking Java for Javascript. There are many people who really don't see the difference between what a web application does server-side and what is done in the browser. Maybe "the professor" is one of them.
There are tons of non-simple mission critical applications written in Java. Google AdWords is a good example. Much more than a simple web application, and probably responsible for billions of dollars in revenue.
I was annoyed by that too. I partially agreed with him that it seems some colleges are dumbing down the curriculum but that's about where my agreement ended.
His argument would seem more compelling if he didn't make profit from promoting Ada -- http://www.adacore.com/home/company/exec_team/. My opinion is that it's important to know the concepts he mentions (and not just knowing how to use libraries) but outside of school/learning use whatever language is best suited for the task at hand.
For these kinds of professors, "mission critical" almost always means hard real-time systems involving risks to human lives. Money is a more abstract and secondary thing for them. I had an AI professor at university who always divided the software universe into these kinds of systems and "standard business data processing software," which he utterly disrespected for lacking any kind of scientific background. Because that's what these kinds of professors are: scientists, not engineers. The engineers they hire always work on the border between computer engineering and computer science. They find no pleasure in day-to-day software development, as it is too much craftsmanship, more art than working from proven mathematical models.
haha, totally agree with you.
The biggest problem with guys like him, and my university by the way, is that they are totally disconnected from the business side of our profession. My university taught software engineering in a manner geared toward the scientific side of the profession. But when you start looking for a job, maybe 2% to 5% of the offers are in scientific fields (in my area it is even lower than that). The rest are all about applications for management, where higher-level languages provide productivity, which is an important factor to consider.
Anyway, for me a strong understanding of C/C++ and low-level concepts is crucial, but I really see going from a high-level language down to a low-level language as a better way to learn. For now they tend to make you learn the other way around.
Real programmers use only 2 keys: 1 and 0.
I guess it would be a piece of cake for him to build a fully functional trivial web application (internet banking?) in assembler. Yeah, throw in a browser as well, just for fun. He would probably do it before deadline and under budget.
I recently wrote a post regarding something similar written by Jeff Atwood a few days ago:
http://il.dotmad.net/archive/2008/07/29/there-are-some-things-a-good-developer-is-not-required-to-know.aspx
Wow, I really don't understand why so many people are so upset by this. Go read the original paper; I'll wait.
http://www.stsc.hill.af.mil/CrossTalk/2008/01/0801DewarSchonberg.html
The point is not that you can't use Java (or C#) to build critical systems, but that you won't learn how if you only learn Java (or C#). You have to learn how the computer works first. This has almost nothing to do with computer language choice!
You are missing the point. The point is that calling Java a language suitable for "simple web programming" is a gross ignorance on his part.
I'm amazed you could summon the energy to read the entire article! I read the first paragraph and some of the idiot's remarks and had enough.
Mike: I even disagree with that point. At universities, a language is used as a TOOL to teach the material, like OS design, systems engineering, etc. If a student only stares at the code written and not at what is taught, that student won't get it, regardless of whether the material is taught using another language.
Btw, I hear more and more that .NET is losing momentum to Java in the enterprise world (in other words: Java is coming back).
But Java is suitable for simple web programming. (It's also suitable for over-engineering business applications.) It can be used for mission-critical applications, if used by skilled developers. This isn't about a particular computer language; it's about what isn't being taught in college.
Mike,
That is not all it is suitable for, which is what he thinks it is.
He does not think that Java is only suitable for simple web programming. Read the article and his paper again.
Page 1: The reason: students’ reliance on Java’s libraries of pre-written code means they aren’t developing the deep programming skills necessary to make them invaluable.
Page 3: Realize that “copying” code has value “It’s interesting when you think that the message that we give to students is: ‘You must do this all on your own, you mustn’t borrow anything from anyone else.’ And then we put them in a real industry situation and the message suddenly turns to, ‘Reuse code as much as you can.’ Real life programmers get good at using chunks of other people’s code."
My personal path was to understand hardware and machine code first, then high-level language. Worked well for me, but I'll never be able to truly evaluate trying a different path.
I'm just here to let you know that the programming.reddit.com thread for the linked "Anti-Java" article is at 400+ comments now:
http://www.reddit.com/r/programming/comments/6tzd1/the_antijava_professor_and_the_jobless_programmers/
If you're reading the comments here, you may find the reddit comments similarly enlightening.
I read the same article yesterday. Here's my take on it and it's remarkably similar to yours:
http://agilology.blogspot.com/2008/07/teaching-java-in-school-is-just-as.html
Some people think they are heroes for having some professional experience with C and/or assembly. They "don't like" Java (or C#), but sometimes they can't describe what the JIT compiler does, or understand general managed environment behavior. It's ignorance. Java is not guilty of their unawareness of computer architecture.
Advice on how to correctly answer both of those interview questions comes straight from the book The Pragmatic Programmer: "'Select' isn't broken" is an allusion to the fact that "it is rare to find a bug in the OS or the compiler, or even a third-party product or library. The bug is most likely in the application."
Joshka,
I have found several compiler bugs, and I wasn't particularly trying.
I have worked on military aerospace applications using Ada, Fortran (yes, it's still around :-/ ), and C, as well as VB6 and .NET for prototyping interfaces, and I know that some of the other projects used Java interfaces for interaction with these systems (console, not aircraft). Now I am in the public sector coding purely in C#, and to be honest, I don't think that one is better than the other. I agree wholeheartedly that this professor is nothing more than a pretentious pr!ck who is very sure of his own importance.
What he says is in part true; however, he's taking it completely out of context.
Ada is used primarily in military applications because it is a Mil-Spec language! It was designed for just that. You would have no hope of writing a business application in Ada!
Java and .NET were not built for military applications, so why even try to adapt them to it if there are tools designed for just that?
I'm sure he'd be much less assured of the "simplicity" of Java or .NET if his bank balance was nulled after a C programmer forgot to clean up some stray memory pointers!
I've run into many hard-core C programmers who belittled anyone who did not speak ASM, and I have had the pleasure of putting them in their place after showing them we could develop something as reliable (if not more so) in less time, using fewer resources.
A good story about someone who thought just like this can be found on the daily WTF http://thedailywtf.com/Articles/That-Wouldve-Been-an-Option-Too.aspx
Just my two cents.
Mil-Spec sometimes doesn't mean much at all. A lot of that stuff is actually seriously old technology, some of it still buggy. Some people just assume that if it conforms to some Mil-Spec it must be good. In reality, a lot of the time they are getting the lowest bidder to do the job, and the result is just as horrible as any other system out there. It all depends on who is working on it. A spec by itself is pretty worthless...
Regarding the interview questions, it's all basic troubleshooting technique: reduce the problem to a smaller test case, process of elimination.
He's seriously missing the boat here, consistently mixing up complex applications and reliable applications.
And he has probably never understood the difference between accidental and intentional complexity. Reducing the former (by using a managed language like Java) can enable you to increase the latter. Every program is complex when you use C... :-)
The "professor" is clearly still living in the '70's
Let's not forget that by US law, life-saving, critical applications CANNOT be developed in any language that has non-deterministic memory management (a.k.a. GC). I won't look it up, but it's a pretty well-known fact that you won't be seeing respirators and MRIs written in managed languages. ;)
Oren and all the other enraged fellas, relax.
The article is a classic piece of sensational journalism that needs to grab your attention in the first two or three paragraphs lest you start yawning with boredom. The professor elucidated his position on Java with respect to teaching computer programming at universities. He may have a sane and sound point.
-- Frans Bouma
Hmm, Frans, I take your words at face value, i.e. I hear it may not be your position. Next time you hear such blasphemy pronounced, refer your interlocutor to the TIOBE Index. Have him/her compare the graphs of Java and C#, tracking the languages over the last six years.
As the nefarious professor intimates, Java developers seem to be doomed. :-)
Java
http://www.tiobe.com/index.php/paperinfo/tpci/Java.html
C#
http://www.tiobe.com/index.php/paperinfo/tpci/C_.html
Gabor,
AFAIK, such apps are forbidden from allocating memory after startup.
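That restriction pushes you toward the classic preallocation style. A minimal, hypothetical sketch of the idea (the class and method names are my own invention): allocate a fixed pool of buffers at startup, so the steady state only recycles existing objects and never touches the allocator or triggers the GC.

```java
import java.util.ArrayDeque;

// Hypothetical pre-allocation pattern: every buffer the system will ever use
// is created in the constructor, at startup. After that, acquire/release only
// move existing objects around -- no allocation, hence no GC pressure.
public class BufferPool {
    private final ArrayDeque<byte[]> free = new ArrayDeque<>();

    public BufferPool(int count, int size) {
        for (int i = 0; i < count; i++) {
            free.push(new byte[size]); // all allocation happens here
        }
    }

    public byte[] acquire() {
        return free.pop(); // steady state: reuse, never allocate
    }

    public void release(byte[] buf) {
        free.push(buf);
    }
}
```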
the professor is an old fool clinging to old ways.
just ignore him, like you'd ignore any other old fool ranting on about the '40s to a cashier
Having seen "mission critical" applications, I can assure the nutty professor that many are indeed written in Java and C#
I suspect he is, like most professors, coming from an extremely isolated and limited viewpoint where he has little real contact with the industry as a whole, and only sees a very small area under his interest.
The assumption that "mission critical" means, for example, military-type applications and code is foolish... medical systems are mission critical (patient records being lost or wrongly updated), social care systems are mission critical (vulnerable people not being monitored or taken into care), traffic systems are mission critical (badly coordinated traffic flow systems lead to accidents and potential deaths).
I am more concerned that he thinks people will routinely find compiler bugs and hardware bugs. I suspect that is largely due to using languages that are much lower level and more experimental... if a lot of these systems were switched to a higher-level, mature language like Java or C#, perhaps his scenarios would matter less.
... and, I once interviewed a guy who wrote some of the code to provide a military air force with targeting and mission data for their aircraft... the system used a big database to generate XML files, which were then copied to portable hard drives that were plugged into the aircraft directly. The onboard system that dropped the bomb may well have been in Ada or C++... the dodgy code that told it where to drop the bombs most certainly wasn't.
This is funny because I work in the travel industry and we have been replacing applications and systems that were written in C, C++, COBOL, etc. with mostly Java applications.
I found this interesting article: http://www.cotsjournalonline.com/home/article.php?id=100261
My Google session turned up things like safety-critical Java, where the GC is simplified and runs on a normal-priority thread with a 'deadline monotonic assigned priority', whatever that means.
and I also found the certification approved by the FAA for avionics software: http://en.wikipedia.org/wiki/DO-178B
I did not find the exact requirements on the runtime, but 'does not allocate after start-up' definitely makes sense too.
Interesting topic what we touched on indeed.
Gabor
Java technology is even used in the mission to Mars; a critical part of the Mars Rover is written in Java. See the link below, where James Gosling discusses the role of Java in the mission to Mars:
http://www.sun.com/aboutsun/media/features/mars.html
Christ, it's really annoying to read through these comments; you typify exactly the person the professor is complaining about. Not once did he say that Java is irrelevant, simply that it's only 5% of what any reasonably competent computing professional should know. I'm sure your Java skills are l33t and you're amazing, but if that's all you know, you'll be dead in the water pretty shortly. This whole thread comes across as shrill bitching that someone didn't tell you you're a genius.
Clayton,
You are not a regular reader, it appears.
The last time I touched Java was in 1996.
My main objection was to: "Java is mainly used in Web applications that are mostly fairly trivial"
You might consider actually reading the post, rather than responding out of habit.