My favorite introductory C/C++/Basic/Pascal books are those by Donald Alcock, a master of progressive revelation and visual facilitation. If you don't know these books, follow the link below for a preview.
http://www.cambridge.org/us/academic/subjects/computer-science/programming-languages-and-applied-logic/illustrating-c-2nd-edition

The other key thing for new students, beyond building a visual model of concepts, is mutable examples. The Go Playground makes this very easy. It allows the "false" introduction of simple statements that work in the general case, then, later, example input that causes problems, and then, with that motivation, refinement of the code to solve those problems (list size = 0, a negative surd in a cubic equation, whatever...). A sketch of this progression appears at the end of this thread. If you know TeX and Donald Knuth's TeXbook, this is like the "Dangerous Bend" signs that he uses. A very valuable didactic approach.

From: <golang-nuts@googlegroups.com> on behalf of Egon <egonel...@gmail.com>
Date: Wednesday, July 20, 2016 at 3:31 AM
To: golang-nuts <golang-nuts@googlegroups.com>
Subject: [go-nuts] Re: Go is for everyone

On Wednesday, 20 July 2016 12:40:01 UTC+3, Simon Ritchie wrote:

> I taught C to second year undergraduates back in the 1990s. By this time, they had done some COBOL (those were the days), Pascal and assembler. Two things they had great difficulty with were pointers and multithreading. Multithreading is not such an issue, because you can avoid it while they are learning the basics, but you couldn't avoid pointers for long.
>
> Go is, if anything, even worse, because pretty much as soon as you introduce functions you have to get involved in pointers, because without pointers you can't write a function that sets data. Other languages such as Java manage to hide pointers sufficiently well that you don't have to get to grips with them until you have the rest of the language under your belt. Actually, I'm guessing that there are lots of successful Java programmers out there who have never really understood those things that Java calls references.
>
> To be honest, I'm not entirely sure why pointers are such a big issue for students who had done a bit of assembler, but trust me, for some they are. I think one reason may be that they are one of a set of related concepts, and you have to understand them all before any one of them makes sense.

I've explained pointers in terms of arrays...

var memory [1 << 10]byte

// defining a pointer
p := 10

// assigning through a pointer
memory[p] = 123 // *p = 123

// dereference
fmt.Println(memory[p]) // *p

// indexing an array starting at p
fmt.Println(memory[p+8])

// double dereference
fmt.Println(memory[memory[p]]) // **p

Then show how you can store bigger values, structs or strings... etc.

> Multithreading is a different matter. I just found that many people have great difficulty visualising several threads working at the same time.

For multithreading there is "The Little Book of Semaphores" by Allen B. Downey (http://greenteapress.com/wp/semaphores/); it has many exercises about how things can go wrong, which is one of the fundamentals of concurrency. (A minimal sketch of one such failure appears below.)

> Apart from those specific issues, people new to programming have great difficulty with the idea of (a) variables and (b) expressions. These very fundamental and crucial ideas are actually quite hard to grasp. To a newcomer, it all just looks very arbitrary. I have a variable called "total", but it could just as easily be called "wibble". Why is it called "total", then? What are the rules? This variable contains a string. What's a string? Oh, it's a bit of text. Why would I want to store a bit of text? Because instructors understand these concepts very well, they often don't appreciate how difficult some people find them at first, and they take them at too fast a pace.
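As a concrete illustration of "how things can go wrong", here is a minimal sketch (not from the thread) of the classic lost update: many goroutines increment a shared counter with no synchronization.

package main

import (
	"fmt"
	"sync"
)

func main() {
	counter := 0
	var wg sync.WaitGroup
	for i := 0; i < 1000; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			counter++ // unsynchronized read-modify-write: a data race
		}()
	}
	wg.Wait()
	// The final value is unpredictable and can be less than 1000,
	// because concurrent increments overwrite each other.
	fmt.Println(counter)
}

Running it with the race detector (go run -race) flags the conflicting accesses; guarding the increment with a sync.Mutex, or using sync/atomic, makes the count deterministic.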
> Changing tack completely, I've also encountered a different problem. Most people are only interested in computers when they do useful things - useful to them. So, the instructor shows them a program that takes two numbers, adds them together and displays the result. They can do that themselves with a calculator. Take a list of ten numbers that are out of order and sort them. Why not just write the list in the correct order in the first place? What's the point of all this? Many people are just not willing to go through many hours of artificial examples on the assumption that they might one day learn something useful. You have to convince them before you begin that they are going to learn something useful (to them) and keep them on board throughout.
>
> If you think about it, "why not just write the list in the correct order in the first place?" is actually a fundamentally important question, involving stuff that we are good at and computers are bad at. We can just look at a small list of numbers and write it down in sorted order. Why can't the computer do that? The answer is probably so obvious to you that it shouldn't need explaining, something along the lines of "it's not the same problem when you have a million numbers", but then why would I be interested in sorting a million numbers?
>
> Sorry if this is a bit vague. I've been thinking about this stuff ever since I gave up teaching many years ago. I know what some of the problems are, but I'm really not certain of the answers.
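To make the Playground progression mentioned at the top concrete, here is a minimal sketch; the averaging task and the function names are illustrative, not from the thread. First a version that works in the general case, then the input that breaks it, then the refinement that the failure motivates.

package main

import (
	"errors"
	"fmt"
)

// average works for every non-empty list...
func average(xs []float64) float64 {
	sum := 0.0
	for _, x := range xs {
		sum += x
	}
	return sum / float64(len(xs)) // division by zero when len(xs) == 0
}

// averageChecked is the refinement the failing input motivates:
// it makes the "list size = 0" case explicit instead of returning NaN.
func averageChecked(xs []float64) (float64, error) {
	if len(xs) == 0 {
		return 0, errors.New("average of an empty list is undefined")
	}
	return average(xs), nil
}

func main() {
	fmt.Println(average([]float64{1, 2, 3})) // 2: works in the general case
	fmt.Println(average(nil))                // NaN: the problematic input
	fmt.Println(averageChecked(nil))         // the refined version reports an error instead
}

Each stage fits in a single Playground snippet, so students can paste it in, run it, break it, and fix it themselves.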