I taught C to second year undergraduates back in the 1990s.  By this time, 
they had done some COBOL (those were the days), Pascal and assembler.  Two 
things they had great difficulty with were pointers and multithreading.  
Multithreading is not such an issue because you can avoid it while they are 
learning the basics, but you can't avoid pointers for long.  Go is, if 
anything, even worse: pretty much as soon as you introduce functions you 
have to get involved in pointers, because without them you can't write a 
function that modifies its caller's data.  Other languages such as Java 
manage to hide pointers sufficiently well that you don't have to get to 
grips with them until you have the rest of the language under your belt.  
Actually, I'm guessing that there are lots of successful Java programmers 
out there who have never really understood those things that Java calls 
references.
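
To make that concrete, here is a minimal sketch of what I mean (the 
function names are my own, purely illustrative).  Go passes arguments by 
value, so a function can only change the caller's variable if it is handed 
a pointer to it:

    package main

    import "fmt"

    // setByValue receives a copy of n; the caller's variable is untouched.
    func setByValue(n int) {
        n = 42
    }

    // setByPointer receives the address of an int and can write through it.
    func setByPointer(n *int) {
        *n = 42
    }

    func main() {
        x := 0
        setByValue(x)
        fmt.Println(x) // prints 0: only the copy changed

        setByPointer(&x)
        fmt.Println(x) // prints 42: the function wrote through the pointer
    }

So the very first time a beginner wants a function to change something, 
they meet &, *, and the value/pointer distinction all at once.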

To be honest, I'm not entirely sure why pointers are such a big issue for 
students who had done a bit of assembler, but trust me, for some they are.  
I think one reason may be that pointers belong to a set of related 
concepts, and you have to understand them all before any one of them makes 
sense.

Multithreading is a different matter. I just found that many people have 
great difficulty visualising several threads working at the same time.
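
A tiny sketch of the sort of thing I mean (again, the names are just 
illustrative): two goroutines printing at the same time, where the 
interleaving of the output can differ on every run.  That nondeterminism 
is exactly what people struggle to picture.

    package main

    import (
        "fmt"
        "sync"
    )

    func main() {
        var wg sync.WaitGroup
        for _, name := range []string{"A", "B"} {
            wg.Add(1)
            go func(name string) {
                defer wg.Done()
                for i := 0; i < 3; i++ {
                    fmt.Printf("goroutine %s: step %d\n", name, i)
                }
            }(name)
        }
        wg.Wait() // how A's and B's lines interleave can change run to run
    }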

Apart from those specific issues, people new to programming have great 
difficulty with the idea of (a) variables and (b) expressions.  These very 
fundamental and crucial ideas are actually quite hard to grasp.  To a 
newcomer, it all just looks very arbitrary.  I have a variable called 
"total", but it could just as  easily be called "wibble".  Why is it called 
"total" then?  What are the rules?  This variable contains a string.  
What's a string?  Oh, it's a bit of text.  Why would I want to store a bit 
of text?  Because the instructor understands these concepts very well, they 
often don't appreciate how difficult some people find them at first, and 
they take them at too fast a pace.

Changing tack completely, I've also encountered a different problem.  Most 
people are only interested in computers when they do useful things - useful 
to them.  So, the instructor shows them a program that takes two numbers, 
adds them together and displays the result.  They can do that themselves 
with a calculator.  Take a list of ten numbers that are out of order and sort 
them.  Why not just write the list in the correct order in the first 
place?  What's the point of all this?  Many people are just not willing to 
go through many hours of artificial examples on the assumption that they 
might one day learn something useful.  You have to convince them before you 
begin that they are going to learn something useful (to them) and keep them 
on board throughout.

If you think about it, *why not just write the list in the correct order in 
the first place?* is actually a fundamentally important question, involving 
stuff that we are good at and computers are bad at.  We can just look at a 
small list of numbers and write it down in sorted order.  Why can't the 
computer do that?  The answer is probably so obvious to you that it 
shouldn't need explaining, something along the lines of "it's not the same 
problem when you have a million numbers", but then why would I be 
interested in sorting a million numbers into order?

Sorry if this is a bit vague.  I've been thinking about this stuff ever 
since I gave up teaching many years ago.  I know what some of the problems 
are, but I'm really not certain of the answers.
