It's mislabelled: the author says "professors," plural, and yet generalizes about all of them based on his experience with a handful.
Anyway, Patio, for some reason you've made me want to refute every single one, heh
1. Java is a good first teaching language
This is curious. The whole entry is supposed to be about Computer Science professors on the whole, and yet, when you look at the first sentence explaining this:
I don’t know how many computer science programs start teaching programming using Java
You don't? So how in the heck is this a misconception that applies to Computer Science professors as a whole, rather than the "misconception" of a single professor?
First: Java is no better and no worse than any other programming language for teaching. They are all programming languages. He continues:
but there are more than a few, and that’s too many. When you’re going over variables, loops and conditionals, the object-oriented overhead of a language like java is unnecessary and confusing. Inquisitive students can’t just memorize things (i.e. public static void main (String args[])) without demanding to know what it means and why it’s there. "
I don't even know how to respond using words small enough for him to understand, so I'll be blunt: this is bull-shakespeare. The fact is, when you actually start programming, you're going to have to deal with code that is both unrelated to what you're doing at the moment and possibly messy, so developing the ability to analyze what is relevant to your task now rather than later is useful. Secondly, basing the selection of a teaching language on the syntactic requirements of the language is stupid. He's basically saying "they should use BASIC instead". Lastly, this "misconception" is misconceived. Good computer science courses don't select a single language at all. They usually explain concepts using Scheme, and then apply what was learned to a more powerful/standard language, such as C++, C#, or Java, since teaching programming using a single language is stupid.
This looks like one of those zealots who has a big hard-on for Python and thinks it will solve all the world's problems as soon as everybody tries to learn it. They also have a big problem with anything that is compiled, which I imagine brings him to his second point:
2. Machine language is “basic”
In this particular section, we see that the author does not possess a basic understanding of metaphor.
Comp Sci people seem to be terribly confused about what ‘basic’ means. When one learns to drive a car, starting the car, making a right turn, a left turn, parking, etc. is basic. Building a parallel gas-electric hybrid engine is not basic.
He is right here; however, he fails to mention that machine language, and the assembly that is used to generate it, describes actions in exactly that way: as a series of logical steps. It takes more steps and more work on the part of the programmer to create the equivalent of higher-level constructs, but it is by no means difficult. It is the fact that it's so basic that makes it harder to use.
Driving a car is more basic than building one because the latter requires significantly more expert knowledge than the former. In the same way, using a simple scripting language requires less depth of understanding that writing in machine language; therefore, computer science education should start with higher level languages and proceed to lower level ones, not vice versa.
His mention of "using a simple scripting language" implies that he believes scripting languages to be superior to assembly; however, he fails to mention the benefits of using assembly over a scripting language, such as the ability to hand-tune performance improvements, and even to make script code faster in some instances when you understand how the interpreter operates on it. Additionally, understanding assembly language increases one's ability to solve computer problems as a whole: debugging issues with computers is a lot easier when you understand registers, return addresses, and stack frames, all three of which are completely hidden from you in a "sophisticated" scripting language.
The author's love of scripting languages means that he likes to open Notepad and simply write code without thinking about design. This probably introduces his next point:
3. You should write code on paper before you write it on a computer
You should. He rambles on, however...
Writing code by hand is stupid. It is entirely inconsistent with the interactive and iterative design process that comes naturally to hackers and painters alike.
HAHAHAHA
No, of course not. Writing code out by hand doesn't possibly make it easier to design an application. Of course, it helps if your application does something useful and/or is large enough. His statements to the contrary merely prove he spends his time with page-long script files doing basic, useless tasks.
Professional software developers make extensive use of API documentation, reference guides, forum discussions, etc. to make troubleshoot problems and make their code more efficient and effective.
This is wrong. Amateur software developers do that. Professional software developers do this thing called "debugging". You don't need any reference material to see that you're trying to access past the end of the string. Additionally, writing out code on paper can help cement the idea in your head and possibly reveal problems. Of course you can do that as you type it out, but as you re-hash the problem in your mind you're constantly thinking of possible flaws in your solution, rather than coming up with a solution, typing it out, and debugging for 2 hours only to discover that you were doing something horribly wrong.
Writing code by hand tests your ability to write trivially simple software without making errors. Real programmers must be capable of making complex software and detecting their errors with a variety of automated tools.
Again: no. Real programmers must be capable of not making errors in the first place, and writing the code out by hand first facilitates this. And when there is an error, there will be fewer of them, and the algorithm in question will be solidified in one's brain by writing out the code. (The action of writing by hand actually stimulates the memory more than typing does, believe it or not: rather than a mindless jumble of nonsense that goes straight from the brain to the fingertips, as typing does, one needs to invoke the cerebellum and muscle memory going back to early grade school for printing out letters and symbols. It is this "primitive" memory recall that helps "open up" long-term memory.) It makes one think about each line of code, each operation, separately, because you are writing it out separately; it helps detect errors in logic. (Syntax errors can be caught later, if they exist.)
Teaching or testing coding using pencil and paper is inconsistent with both the natural mode of human action and the practical realities of software development.
Just because program code is executed on the computer doesn't make the computer the best way to learn how to write it. Kind of like how implying he has any sort of practical experience with software development doesn't mean this author knows what he's talking about.
4. Lectures are an effective method of teaching programming
Programming is like algebra. You can’t learn how to write code by watching someone write code on a blackboard or listening to elaborate explanations from professors. You can’t learn math from watching someone do math. You learn to do things by doing them.
Lectures in programming classes are no more or less effective a teaching tool than in other classes. For example, English courses have lectures about various grammatical scenarios, which are applied later when you write essays. Math classes explain rules regarding mathematical operations and functions in lectures; these are applied later as well. In the same way, computer science concepts can be explained via lectures and that material used later when actually writing some code down. The main point here is that a lecture is pointless if you cannot review it later, by either taking notes or making an audio recording. It's the all-important memory concept known as "repetition", which may be why I consider this particular author a complete douchebag, since he insists on repeatedly proving it.
5. Algorithm design is learned by reading existing algorithms
Um. They do.
Designing algorithms is about finding innovative solutions to difficult problems.
Yes. Very good. Here, have a gold sticker.
Algorithm design courses are about studying existing solutions to rather simple problems.
And Math Proofs are about studying previous math proofs. What's your point?
Learning how a particular problem can be solves provides approximately zero insight into how to solve problems you’ve never encountered before.
Ah, I see. This makes perfect sense. See, an algorithm design course is supposed to find computer science problems that have no solution and set the students to solving them! Makes perfect sense! OH WAIT! NO IT DOES NOT! Lastly, it DOES provide insight, since many of the general algorithmic designs can be applied to a vast number of problems: the "divide and conquer" strategy is used for the QuickSort algorithm and the binary search algorithm, as well as in the creation of optimized palettes via the octree method, and far more. One can deduce, therefore, that there is a chance that problems not yet encountered have solutions that can involve a divide and conquer strategy. Additionally, one can learn that every recursive definition has an iterative equivalent.
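Both claims above can be shown in one small example: binary search, a textbook divide and conquer algorithm, written recursively and then as its iterative equivalent. This is a sketch in C; the function names are mine.

```c
#include <stddef.h>

/* Divide and conquer in miniature: binary search over a sorted array,
   first recursively, then as the equivalent iterative loop.
   Returns the index of key, or -1 if absent. */

int bsearch_rec(const int *a, int lo, int hi, int key) {
    if (lo > hi) return -1;                /* empty range: not found     */
    int mid = lo + (hi - lo) / 2;          /* split the problem in half  */
    if (a[mid] == key) return mid;
    if (a[mid] < key)  return bsearch_rec(a, mid + 1, hi, key);
    return bsearch_rec(a, lo, mid - 1, key);
}

int bsearch_iter(const int *a, int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {                     /* same halving, no recursion */
        int mid = lo + (hi - lo) / 2;
        if (a[mid] == key) return mid;
        if (a[mid] < key)  lo = mid + 1;
        else               hi = mid - 1;
    }
    return -1;
}
```

Study this once and you have a template for any problem where the search space can be halved, which is precisely the insight the author claims such courses can't provide.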
6. You can just ‘pick up’ prolog in a week for a course
You should, if you have any experience with programming at all.
There’s this crazy belief among Comp Sci. faculty that all languages are basically the same,
They are, you dumbass.
so after learning the principles behind languages you can use whatever.
Pretty much.
This is *censored*.
Is not.
This is like claiming that since someone studied Spannish grammar in grade school, they can speak Spanish fluently, in any of Spanish, Mexican or Columbian accents.
No it isn't. It's more like writing a 12-point essay about misconceptions, riddling it with your own ineptitude, and then claiming it actually has more truth in it than Bill Clinton has celibacy.
The leap between structured and object-oriented programming is huge,
Only if you have short legs. And it isn't: an object is just a structure in memory. C is not object-oriented, but you can work with objects in the form of structures.
and it pales in comparison to the leap between object-oriented languages and declarative languages.
Much as you paled when you read your test scores.
At least this is all I can gather, from your repeated spewings.
7. Exams measure understanding of programming
No, and I doubt the teachers claim this. But they are a required part of the course, so get over it.
8. GUI’s are not an important aspect of learning to code
They aren't. I'm not even going to bother to quote and refute the actual body of this point, because it's as blunt as the author himself.
9. Programming Requires Calculus
Requires? No. Helps? Yes. This once again proves that this particular programmer (the author) hasn't actually done as much software development as his claims would seem to suggest.
10. Linux will rapidly overtake Windows among consumers
This I have to agree with. But at the same time it has absolutely nothing to do with software development or computer science courses, so I'm apt to consider that perhaps the author has run out of relevant points.
11. LaTeX will overtake WYSIWYG text editors because LaTeX gives you more control
I love how the author refutes this claim, but earlier on pretty much implied that scripting languages will overtake compiled languages because they are more "basic", which at the same time implies that he hasn't written an interpreter before. They are not basic.
12. You can buy gates at RadioShack
And with this, the author once again proves that he is really just complaining about the teacher he had, and not about CS teachers in general.