Test predicts ability to learn programming

mrbadger
Posts: 14226
Joined: Fri, 28. Oct 05, 17:27

Post by mrbadger » Wed, 10. Jan 18, 19:34

Why would you want, say, your hugely expensive architect or geologist to stop doing the thing you're paying them to be a specialist in and start writing scripts?

That would be a very expensive waste of time, time that you would be paying for.

They might enjoy it, but you're not paying them to do that.

Any decent craftsman would accumulate tools over time, but if they need a new tool, and that tool is software based, you hire a coder.

Or buy something to do the job.

Anything else is just wasting time and money.
If an injury has to be done to a man it should be so severe that his vengeance need not be feared. ... Niccolò Machiavelli

pjknibbs
Posts: 41359
Joined: Wed, 6. Nov 02, 20:31

Post by pjknibbs » Wed, 10. Jan 18, 22:01

Morkonan wrote: And, if he wasted his time writing html, no matter how good it would be, it'd be wasted time. (He still misses it, though. He's often said he'd probably be happier writing code instead of being not-writing-code in upper management.)
I had a manager once who was hired as the IT manager of the business, but I think within three months of arriving he just left all that stuff to me while he got involved with a major coding project that was going on. Nobody ever called him out on it either, I have no idea why; it seemed like a total waste of resources having someone on that sort of pay scale doing regular programming work.

I also agree with mrbadger's point. You pay people to do what they're good at or what they trained to do. Yes, they might be capable of writing a script or maybe extracting data with a simple SQL statement or two, but it should no more be *required* that they do that than it should be *required* that someone is capable of replacing a spark plug if they own a car.

kohlrak
Posts: 136
Joined: Thu, 28. Dec 17, 11:47

Post by kohlrak » Wed, 10. Jan 18, 22:20

korio wrote:IMO everyone in "modern" days should have a "minimum" knowledge of programming.


Although I work in "IT", I'm a programmer (at least, that's what I studied), and it's always the same: someone asks me to write some "easy" program, I tell them how much time it will take, and then the "but why so much time, it's an easy thing" starts.

Making a little program or script to do something you need is easy; making anything for users to use...

Also, we had a little joke when I was learning programming: "if you learn VB, it's like brainwashing yourself and forgetting everything you've learned so far about programming", and up to a point it's true, hahah.
That's something else I've noticed, too. There are programs that people think are easy but are hard, and then programs that are easy but people think are hard.
mrbadger wrote:Ah Grep.

Last year I had a student who seemed to think grep was some sort of wizardry. It took them five weeks to understand and complete this task:

Create empty file cli/fstab.txt without using a text editor

Concatenate /etc/fstab to cli/fstab.txt

Display this file in the terminal

Search for the word ’reference’ in fstab.txt

Redirect the output you see to the file cli/out.txt and print this to the terminal.
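Put together, the steps reduce to something like this (one possible solution, assuming a POSIX shell and that the cli directory already exists):

    touch cli/fstab.txt                                 # create the empty file, no editor
    cat /etc/fstab >> cli/fstab.txt                     # concatenate /etc/fstab onto it
    cat cli/fstab.txt                                   # display it in the terminal
    grep 'reference' cli/fstab.txt                      # search for the word 'reference'
    grep 'reference' cli/fstab.txt | tee cli/out.txt    # write the output to cli/out.txt and print it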

Not hard, right? Certainly not hard for a final-year computer scientist, but the student in question thought I was asking them to do something far beyond undergraduate skill level...

All the commands to use, and the way to use them, were listed in the worksheet, which was designed to introduce students to lots of CLI tools.

They needed it explained to them so many times I started to think they were coming to the class stoned.
When you see that, from what I've seen, it's people who are afraid to sit down with it, especially with "information overload." I'm not a professional educator, yet I've seen it. You get that deer-in-the-headlights look from them when you say a bunch of things and ask them to apply it. Sometimes they just need a walkthrough case for that extra mental security. It doesn't help at all that people are afraid of breaking their computers. It's not always an irrational fear, but that fear certainly makes things much, much worse.
Morkonan wrote:
kohlrak wrote:...What I want to know is why a particular professional can't generate a simple script on their own without paying a programmer to do it for them. Why are we still hiring people to make static web pages?
It's not their job.
But it's an absolute waste. It's one thing if you have programmers working under you, but most companies don't have programmers working for them.
A friend of mine is a coder, by degree and background. But, as he's moved up the ranks into upper-management, he's not a coder any more. He loves writing code, but it's just not what he's paid to do. So, he's gotten a bit rusty, here and there. Not sloppy, just a little rusty in certain things he'd rather not be rusty in. So, he'll write some code in his free time, just to sharpen up his skills. He may even weigh in, from time to time, on something being written and show the boys in the cubes how to do it.

Sometimes, he'll tell me about seeing crappy code or something just not done the way he'd have done it. He made his chops, been there, done that, been in the trenches and had the bad habits beaten out of him, so he knows what the entry and middle level mistakes are.

"Does it work? Good. I'm not going to sift through this rat's nest unless the problems are egregious, I have more important crap to do." - My friend.

He could. If he needs to know something about a language or just learn a new one, he'll handle it in short order, then be able to tell the new guy what he's doing wrong... IF he bothered at that level. But, it's just not his job anymore. And, if he wasted his time writing html, no matter how good it would be, it'd be wasted time. (He still misses it, though. He's often said he'd probably be happier writing code instead of being not-writing-code in upper management.)
Do you get the moral of your own story? If you have a programmer working under you, you don't need to mess with code. The problem with your story is that this isn't even close to the majority of companies.
My perspective here is just as a user with a desired goal to reach in using a tool. If I don't "need" to create my tool, I don't need to know how to do it. I need to understand how my tool works and a few basic things about it, but that's as far as I need to go in order to get the job done.

Most professions are like that.

But, in "learning" a profession, one often starts at the basics, even if those basics aren't ever used anymore. So, I certainly understand the benefits a coder would gain if they learned what their tools were actually based upon. Learning assembly, for instance. I can see the benefit, there. But, will they need a working knowledge of that in twenty years? Probably not. Is it likely they'll ever have to actually apply that knowledge to writing code in twenty years? Probably not.
If knowledge of assembly is not influencing the algorithms you write, you're probably writing inefficient code. Compilers have gotten better over the past ten years, but they're still far from perfect. You should know what the computer is effective at.

Otherwise, yes, you're right. As much as I love assembly, I'm still usually writing in C++, my first programming language, unless there's something particularly useful in the assembly of the target machine (you'd be surprised how many specialized instructions never get used) or C++'s hand-holding is particularly painful for the task (type checking, anyone?).
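To give one concrete (and entirely hypothetical) sketch of what "knowing what the computer is effective at" means: both C++ functions below sum the same matrix, but the first walks memory sequentially, cache line by cache line, while the second strides across it. On a large matrix the first is typically several times faster, and you only know why if you know roughly what the hardware is doing.

    #include <cstddef>
    #include <vector>

    // Sum a dense n x n matrix stored in row-major order.
    double sum_row_major(const std::vector<double>& m, std::size_t n) {
        double s = 0.0;
        for (std::size_t i = 0; i < n; ++i)       // walk rows...
            for (std::size_t j = 0; j < n; ++j)   // ...then columns: sequential memory access
                s += m[i * n + j];
        return s;
    }

    // Same arithmetic, but strided access: every step jumps n doubles,
    // defeating the cache and the prefetcher.
    double sum_column_major(const std::vector<double>& m, std::size_t n) {
        double s = 0.0;
        for (std::size_t j = 0; j < n; ++j)
            for (std::size_t i = 0; i < n; ++i)
                s += m[i * n + j];
        return s;
    }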
Here's a question: How long do programmers actually stay in the position of "programmer" before moving into a specialized area or even into a different field, possibly not even related?

Sure, there are fifty year-old programmers cranking out stuff, but what's the percentage of beginning software engineers/programmers/coders that actually stay in that job description for their entire working life? From those I have known, it seems as if that's just a low-level port-of-entry onto a ladder of career opportunities in software/IT.
From the people I talk to, usually tech support is the point of entry, and only very unique people end up programming. I've heard stories like your friend's as well. But that's the thing: programming really is entry-level stuff, and this is reflected by the market. You can outsource most jobs to, say, India, for next to nothing. Why pay specialists with degrees when it really is basic? So, if programmers are frequently promoted, that means they're not as special and unique as we like to think we are. It also means there should be no problem with other people learning this entry-level skill, eradicating the need altogether outside of companies that specialize in certain types of more complicated software.
mrbadger wrote:Why would you want, say, your hugely expensive architect or geologist to stop doing the thing you're paying them to be a specialist in and start writing scripts?

That would be a very expensive waste of time, time that you would be paying for.

They might enjoy it, but you're not paying them to do that.

Any decent craftsman would accumulate tools over time, but if they need a new tool, and that tool is software based, you hire a coder.

Or buy something to do the job.

Anything else is just wasting time and money.
It's one thing if you're asking an architect to write CAD software or a geologist to write the drivers for some common piece of equipment. But, come on, why are they hiring programmers to write a script that takes in simple data and spits it out in an easier-to-read format? Maybe it's raw data and you want to parse it into HTML or XML and have it flag certain results in red or yellow to warn the reader of particular dangers. There's no reason you should have to hire a coder for something that mundane, and it shouldn't take an architect or geologist a whole day's work to write it, either. Hiring a programmer for such a mundane task is just wasting time and money.
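As a sketch of how mundane that kind of script is (the "name,value" input format and the thresholds here are made up purely for illustration), a C++ version could be as small as this:

    #include <iostream>
    #include <sstream>
    #include <string>

    // Read "name,value" lines from stdin and emit an HTML table,
    // flagging dangerous readings in red and marginal ones in yellow.
    int main() {
        std::cout << "<table>\n";
        std::string line;
        while (std::getline(std::cin, line)) {
            std::istringstream row(line);
            std::string name, value;
            if (!std::getline(row, name, ',') || !std::getline(row, value)) continue;
            double v = 0.0;
            try { v = std::stod(value); } catch (...) { continue; }  // skip malformed lines
            const char* colour = v > 90.0 ? "red" : v > 75.0 ? "yellow" : "black";
            std::cout << "<tr><td>" << name << "</td><td style=\"color:" << colour
                      << "\">" << value << "</td></tr>\n";
        }
        std::cout << "</table>\n";
    }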
pjknibbs wrote:
Morkonan wrote: And, if he wasted his time writing html, no matter how good it would be, it'd be wasted time. (He still misses it, though. He's often said he'd probably be happier writing code instead of being not-writing-code in upper management.)
I had a manager once who was hired as the IT manager of the business, but I think within three months of arriving he just left all that stuff to me while he got involved with a major coding project that was going on. Nobody ever called him out on it either, I have no idea why; it seemed like a total waste of resources having someone on that sort of pay scale doing regular programming work.

I also agree with mrbadger's point. You pay people to do what they're good at or what they trained to do. Yes, they might be capable of writing a script or maybe extracting data with a simple SQL statement or two, but it should no more be *required* that they do that than it should be *required* that someone is capable of replacing a spark plug if they own a car.
You do expect people to check their oil, fill their gas tank, fill their tires, and top up their washer fluid, right? It's also pretty useful to know how to replace an air filter or a battery, or should that not be required, either? Ever see someone out in the middle of nowhere with jumper cables and no working idea of how to use them, when faced with a dead battery and a willing volunteer?

EDIT: It's time to bring this back, since the topic has come up again.

Why basic knowledge of tools is important.

mrbadger
Posts: 14226
Joined: Fri, 28. Oct 05, 17:27

Post by mrbadger » Thu, 11. Jan 18, 18:34

In Computer Science/Computing degrees, we always start with the computer equivalent of 'this is a door, this is a door handle, and this is how you hold the steering wheel', type of stuff.

Real basic, covering all the simple tools, like the terminal, Git, tar, grep and the like.

Plus, thanks to my writing the exercise and hammering at the relevant module leader till he finally agreed to use it, this now includes simple assembly language comprehension. So it's no longer true for us that the opportunity for students to learn all the basics of their trade isn't present.

But we still face the problem that students view a completed module as something that they can now forget. Modularised degrees simply do not work when the assessment is bound to each module and not to the course year itself.

How many times do I hear 'but I haven't done that since the first year, I don't remember it'.

There is something wrong with any approach that creates that mindset.
If an injury has to be done to a man it should be so severe that his vengeance need not be feared. ... Niccolò Machiavelli

Morkonan
Posts: 10113
Joined: Sun, 25. Sep 11, 04:33

Post by Morkonan » Thu, 11. Jan 18, 19:16

mrbadger wrote:...How many times do I hear 'but I haven't done that since the first year, I don't remember it'.

There is something wrong with any approach that creates that mindset.
^--- This.

There's a sort of evolution, isn't there, in "teaching a profession."

How many students have heard, either from their instructors or graduates, "You'll never use this, but you'll be taught it, anyway?"

Back in the day, when the most advanced, affordable, calculator a student could have didn't have a log function button on it, we used paper log tables... We'd look them up, by hand.

I learned all about "Freudian Psychotherapy" even though the only people practicing it today are whackjobs simultaneously staring through crystals to see people's auras while standing in an energy vortex.

Is there a "Hard Science" component to "Computer Science?"

The hard sciences pride themselves on accumulating knowledge and adjusting accordingly, restructuring teaching as needed to compensate. Some subjects that are much like "science" have fundamental rules that are, indeed, rules. Mathematics comes to mind, here. Even so, some things can be changed due to new discoveries, but in these days, those are pretty advanced components of the subject, itself. Soft sciences, as implied, aren't as rigorous or are still developing. In my experience, they deal a great deal more with the history of discovery, the hints and glimpses of basic principles, but without the actual principles, themselves, beyond some basic ones. "People generally don't like pain" - A basic assumption by psychologists that is generally true, most of the time, with a normal person, depending upon how you define "normal", etc, etc...

What changes instruction in Computer Science? What is the active operator? What is considered "new knowledge?"

You are all more learned in this subject than I. I'm simply some "static" in the fundamental discussion. That's a given. :)

But, beyond "this is an instruction set to carry out a mechanical operation," what is fundamental to Computer Science? Mathematical rules would seem to be a norm to this unlearned observer.

It's kind of crazy in a dynamic, interesting, hare-brained way, to be honest. The imagery of people bringing in cardboard boxes of all sorts of stuff that is applicable to "learn to computer" and then dumping it on the floor so they can sift through it in order to construct a method of teaching "computer" is rambling around in my brain right now. Is it a "science" because of maths? Quantum physics? Materials sciences, chemistry, logic and philosophical principles?

So, when the subject of "people should learn assembly" is brought up, this neophyte has to ask "When will that change, how will that change, and will it change so much that the knowledge will be a dead language?"

Assembly isn't even the bottom of this rabbit hole. There are lower levels to be reached. Do "students" or even "programmers" need to go even further before their education can be considered complete? And, when a new chip/processor comes out... then what? That extremely low-level, very specific, knowledge, but not the basic principles it probably makes use of, is now useless.

An undergrad student or practitioner of this mysterious art might be very well served if they also studied Philosophy and Economics as well as their normal Mathematics instruction.

mrbadger
Posts: 14226
Joined: Fri, 28. Oct 05, 17:27

Post by mrbadger » Thu, 11. Jan 18, 21:44

I'm not quite sure what's going on with pure Computer Science.

I can only speak definitively for my university, but the subject doesn't really seem to have realised yet that it's not preparing people for academia.

That was true when I was an undergrad in the early 2000s, and it seems to still be true. They won't let go of this 'we are training academics' mindset, even if they say out loud that they have.

It worked for me, having gone through the pure CS route. I was totally incapable of functioning in industry (I know, I tried), but I'm fine in university. But that was where I always wanted to be anyway.

I'm not in our CS degree team, though I teach on the CS degree; I'm in the Computing degree team, which is very much focused on 'this is what you need to function in the workplace'. I prefer that.

I like CS, I wouldn't give up my CS module, but I don't want to be on their team. I'm often called on to advise, but I won't join them officially, because I'd have to give up my place on the Computing degree team.

Maybe it's not as technical, but I've always thought that an interested intelligent person with a degree as a starting point can teach themselves more or less anything, or do a Masters later.

Besides, not as technical doesn't mean easy. Not even slightly. Not with me designing the assessments it doesn't. Not if they want to get a decent grade.

And when I do make it easier to get a passing grade, that comes at a heavy price.

I never worked in the software industry proper, but I was a registered nurse for many years prior to university. I understand the requirements of a professional career.

The skillset may be different, but a lot is the same. There is a need to stay up to date, to know as much as you can, to function independently.
If an injury has to be done to a man it should be so severe that his vengeance need not be feared. ... Niccolò Machiavelli

Observe
Posts: 5079
Joined: Fri, 30. Dec 05, 17:47

Post by Observe » Thu, 11. Jan 18, 22:29

mrbadger wrote:I was totally incapable of functioning in industry (I know, I tried), but I'm fine in university. But that was where I always wanted to be anyway.

I'm not in our CS degree team, though I teach on the CS degree; I'm in the Computing degree team, which is very much focused on 'this is what you need to function in the workplace'. I prefer that.
Aren't those two statements a bit at odds with each other?

First you say you don't want to be in industry (and that you prefer academia) and then you say that your teaching focus is on 'this is what you need to function in the workplace'. Shouldn't we be drawing more people from industry, who are able to function in the workplace, teaching those who will end up there?

kohlrak
Posts: 136
Joined: Thu, 28. Dec 17, 11:47

Post by kohlrak » Thu, 11. Jan 18, 23:21

mrbadger wrote:In Computer Science/Computing degrees, we always start with the computer equivalent of 'this is a door, this is a door handle, and this is how you hold the steering wheel', type of stuff.

Real basic, covering all the simple tools, like the terminal, Git, tar, grep and the like.

Plus, thanks to my writing the exercise and hammering at the relevant module leader till he finally agreed to use it, this now includes simple assembly language comprehension. So it's no longer true for us that the opportunity for students to learn all the basics of their trade isn't present.

But we still face the problem that students view a completed module as something that they can now forget. Modularised degrees simply do not work when the assessment is bound to each module and not to the course year itself.

How many times do I hear 'but I haven't done that since the first year, I don't remember it'.

There is something wrong with any approach that creates that mindset.
Absolutely. But at least at that point, it's on the student. I hate it when teachers hide information from their students, or even misrepresent it. That drives me insane. All that does is complicate matters, in the hope of making them less complicated. So, instead of solving the problem, you're pushing it further down the road, to where it'll be harder to correct. And making it someone else's problem doesn't make you any less responsible. My personal favorite, though, is when someone teaches that assembly and "hex" are the same thing. Even better, when someone who knows assembly insists on teaching Motorola just because it's a royal pain in the rear.
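For the record, the difference is simple: assembly is the human-readable mnemonic, and "hex" is the machine encoding an assembler produces from it. On x86, for example:

    mov eax, 1          ; what you write (assembly)
    ; B8 01 00 00 00    ; what the assembler emits (machine code, shown as hex bytes)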

What you might be able to do to mitigate it (I assume you're already doing this anyway) is to bring it up as a fundamental that will be used. Maybe bring up fraction multiplication or division or something and say, "See? Don't forget these skills like you forgot those, or you'll be in a world of hurt on the job."
Morkonan wrote:
mrbadger wrote:...How many times do I hear 'but I haven't done that since the first year, I don't remember it'.

There is something wrong with any approach that creates that mindset.
^--- This.

There's a sort of evolution, isn't there, in "teaching a profession."

How many students have heard, either from their instructors or graduates, "You'll never use this, but you'll be taught it, anyway?"

Back in the day, when the most advanced, affordable, calculator a student could have didn't have a log function button on it, we used paper log tables... We'd look them up, by hand.

I learned all about "Freudian Psychotherapy" even though the only people practicing it today are whackjobs simultaneously staring through crystals to see people's auras while standing in an energy vortex.

Is there a "Hard Science" component to "Computer Science?"

The hard sciences pride themselves on accumulating knowledge and adjusting accordingly, restructuring teaching as needed to compensate. Some subjects that are much like "science" have fundamental rules that are, indeed, rules. Mathematics comes to mind, here. Even so, some things can be changed due to new discoveries, but in these days, those are pretty advanced components of the subject, itself. Soft sciences, as implied, aren't as rigorous or are still developing. In my experience, they deal a great deal more with the history of discovery, the hints and glimpses of basic principles, but without the actual principles, themselves, beyond some basic ones. "People generally don't like pain" - A basic assumption by psychologists that is generally true, most of the time, with a normal person, depending upon how you define "normal", etc, etc...
Computer science isn't all that fluid, either. Everyone comes up with ideas they think are new just because they have a new name, but they really aren't. Almost all languages look like either assembly, BASIC, or C, yet it's the languages that are "new"? Even the new C++ features coming out in the past few years aren't really all that special. Lambdas? A lambda is just a function within a function, something people wondered about for a long time: why couldn't we do that, when you can in certain other languages?
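For example (a trivial C++ sketch, names my own): the lambda below is, under the hood, just an unnamed function object defined inside another function.

    #include <algorithm>
    #include <vector>

    void sort_descending(std::vector<int>& v) {
        // The lambda is effectively a local, unnamed comparison function.
        std::sort(v.begin(), v.end(),
                  [](int a, int b) { return a > b; });
    }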
What changes instruction in Computer Science? What is the active operator? What is considered "new knowledge?"
A person with an idea, a promise, and the belief that something they're doing is actually different.
You are all more learned in this subject than I. I'm simply some "static" in the fundamental discussion. That's a given. :)

But, beyond "this is an instruction set to carry out a mechanical operation," what is fundamental to Computer Science? Mathematical rules would seem to be a norm to this unlearned observer.
A re-representation of math is more accurate, to the degree that it can have functional properties. This allows numbers to be given meanings that ultimately create the mechanical operations of which you speak. It really is on, off, and how much on.
It's kind of crazy in a dynamic, interesting, hare-brained way, to be honest. The imagery of people bringing in cardboard boxes of all sorts of stuff that is applicable to "learn to computer" and then dumping it on the floor so they can sift through it in order to construct a method of teaching "computer" is rambling around in my brain right now. Is it a "science" because of maths? Quantum physics? Materials sciences, chemistry, logic and philosophical principles?
Math is more of an art than a science, even though we use it in science. I see computers as no different, aside from the fact that we use science to make computers. And if you understand the connection between the three, you start to view modern scientific theories (especially quantum physics, computer modeled events, etc) in a very, very new light, and probably not a positive one when you also realize how wildly inaccurate it is. Add psychology to the mix and you'll be a skeptic of human "knowledge."
So, when the subject of "people should learn assembly" is brought up, this neophyte has to ask "When will that change, how will that change, and will it change so much that the knowledge will be a dead language?"
Assembly is the lowest level that you can access. Assembly varies from processor family to processor family. It's reliably present, even if it's sometimes hard to get to (the tools nanny you too much and try to keep you from it). Everything ultimately gets translated to assembly before it becomes the executable code. It is a common language of sorts before the final step. As such, the more you know of a processor's assembly, the more you know about what you should and shouldn't do on it.
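To make that "everything gets translated to assembly" point concrete, here is a trivial C++ function and roughly what a typical x86-64 compiler turns it into at -O2 (the exact output varies by compiler and flags):

    // C++ source:
    int add_five(int x) { return x + 5; }

    // Roughly the assembly a compiler emits for it:
    //     lea eax, [rdi + 5]   ; eax = first argument + 5
    //     ret                  ; return value travels back in eax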
Assembly isn't even the bottom of this rabbit hole. There are lower levels to be reached. Do "students" or even "programmers" need to go even further before their education can be considered complete? And, when a new chip/processor comes out... then what? That extremely low-level, very specific, knowledge, but not the basic principles it probably makes use of, is now useless.
It's not inconceivable to teach from the atomic level up. I've tried to do this a couple of times, but your average Joe gets too bored. Which I understand: Arduinos are cool, but most people don't want to use them beyond picking up a "shield."
An undergrad student or practitioner of this mysterious art might be very well served if they also studied Philosophy and Economics as well as their normal Mathematics instruction.
Philosophy, yes, but I've heard the suggestion already to learn programming before philosophy (Stefan Molyneux). They're close enough that they are useful to one another, just not within the fields themselves. Economics, yeah, I would agree on that too, but that's another one of those "which should come first?" questions. I would argue that economics is actually more useful than philosophy in this matter, since philosophy isn't overly useful in making programs outside of the logic bit.
Observe wrote:
mrbadger wrote:I was totally incapable of functioning in industry (I know, I tried), but I'm fine in university. But that was where I always wanted to be anyway.

I'm not in our CS degree team, though I teach on the CS degree; I'm in the Computing degree team, which is very much focused on 'this is what you need to function in the workplace'. I prefer that.
Aren't those two statements a bit at odds with each other?

First you say you don't want to be in industry (and that you prefer academia) and then you say that your teaching focus is on 'this is what you need to function in the workplace'. Shouldn't we be drawing more people from industry, who are able to function in the workplace, teaching those who will end up there?
I think he means he hated actually programming professionally (given his academic nature, it seems logical), so he likes to focus on teaching power users at best, rather than programmers.

Observe
Posts: 5079
Joined: Fri, 30. Dec 05, 17:47

Post by Observe » Thu, 11. Jan 18, 23:51

kohlrak wrote:I think he means he hated actually programming professionally (given his academic nature, it seems logical), so he likes to focus on teaching power users at best, rather than programmers.
Yes, I understand that. I have the utmost respect for mrbadger and meant no offense whatsoever. I was merely trying to make the point that perhaps part of the problem with new graduates is that they have been instructed by those who do well in academia, but who may not be the best to train people for the workplace.

kohlrak
Posts: 136
Joined: Thu, 28. Dec 17, 11:47

Post by kohlrak » Thu, 11. Jan 18, 23:59

Observe wrote:
kohlrak wrote:I think he means he hated actually programming professionally (given his academic nature, it seems logical), so he likes to focus on teaching power users at best, rather than programmers.
Yes, I understand that. I have the utmost respect for mrbadger and meant no offense whatsoever. I was merely trying to make the point that perhaps part of the problem with new graduates is that they have been instructed by those who do well in academia, but who may not be the best to train people for the workplace.
Depends which problem you're referring to. The complaint of most potential employers is that the coders they hire can't pass even the simplest of academic tests. On the other hand, when employers are not complaining, professional programming is like professional painting: there's a practical side to an art that you get paid for, but the power of the art comes from the desire to do it, and money isn't enough. I'm betting his problem ultimately was that he wanted to "do things right," but the industry wasn't too keen on that, which deserves a topic all of its own.

EDIT: Or he was expecting the programming job to involve regular programs or GUI programs, and ended up doing web pages.

mrbadger
Posts: 14226
Joined: Fri, 28. Oct 05, 17:27

Post by mrbadger » Fri, 12. Jan 18, 09:36

Observe wrote:
kohlrak wrote:I think he means he hated actually programming professionally (given his academic nature, it seems logical), so he likes to focus on teaching power users at best, rather than programmers.
Yes, I understand that. I have the utmost respect for mrbadger and meant no offense whatsoever. I was merely trying to make the point that perhaps part of the problem with new graduates is that they have been instructed by those who do well in academia, but who may not be the best to train people for the workplace.
The sort of jobs I was offered were always ones that would pay a lot but take up all of my time and be no fun as far as I could tell.

Microsoft wanted me on their Software Quality Assurance team, based in Ireland. That would have meant becoming an 'expert' on all of their major code-bases, and being part of the team that made sure everyone was keeping to standards.

Lots of money, but no free time.

Or there was Google, where I would have worked on Google Earth. However, to do that I needed to move to Switzerland, without my son. Again, the money was good, but not enough of a reason.

Or Toshiba. I was never quite sure what Toshiba were offering, except that the pay was huge and it was a research post with, as far as I could tell, even less free time.

Moving to the US to work for Microsoft was offered as well, but that would also have meant leaving my son, so I never even considered that.

On the other hand I have always wanted to be an Academic. Even being a nurse was just a step towards that. An odd step I admit, but a step still.

The pay is a fraction of what I could earn, but I love to teach. I always have.

I have always been of the opinion that it's more important that you enjoy your job than do it just for the money.

This is also a thing I try to explain to my students. Personal happiness and work satisfaction are vastly more important than any wage slip.

In my case that's why I was so eager to return to work after my brain injury, in spite of how difficult that was.
If an injury has to be done to a man it should be so severe that his vengeance need not be feared. ... Niccolò Machiavelli

Morkonan
Posts: 10113
Joined: Sun, 25. Sep 11, 04:33

Post by Morkonan » Fri, 12. Jan 18, 22:37

kohlrak wrote:... I would argue that economics is actually more useful than philosophy in this matter, since philosophy isn't overly useful in making programs outside of the logic bit...
All of your reply is noteworthy, I just want to avoid my penchant for bloviating into the available voids. :)

"To instruct a machine on how to accomplish a task." - Is that it? That's "programming" in a nutshell, right? During that, one may also instruct the machine to accomplish related tasks and how to accomplish them. Along the way, there will also be several "do this" commands, with the mechanisms for doing that already present in the machine's design.

Then, stuff changes, and new operators are added to the tools (languages) used to "do programming." Occasionally, something radical causes a... "paradigm shift" and new low-level opportunities are made available which flow up the chain to "programming." That's a one-way influence, isn't it? Or, does innovation in programming language, presumably in more efficient operators, influence hardware as well? I can see hardware being optimized, of course, for language, but how "innovative" is that? Enough for a shared dynamic, each influencing the other?

Babbage moved automatons and windmills from simple, fairly single-purpose tasks to a multitask capability. A machine that, within physical constraints, would do what you told it to do. That's more than the remarkable checkers-playing clockworks or variable-speed transmissions did.

Philosophy, from what little I know of it, will often use a rigorous set of principles in order to explore a concept or present an argument. The "rigor" presented in such things is what is important. If one strays, all is lost, since it naturally devolves into babble. Unless one creates a suitable foundation, already proven in its stability, the house collapses. Well, unless one invents a new, equally firm, foundation from which to make an argument or explore a concept.

Create a thing from a set of rules.

Tinkertoys. Legos. Building blocks. I don't mean just "objects" but constraints. You can't tell a machine to do something it isn't designed to do. You can force it to accomplish a task it wasn't designed for, but you can't force the "how" by which it accomplishes that task if there is nothing in the machine that can support that "how." I can't tell a television to put a balloon on a string. But, I can certainly instruct a television on how to display a balloon on a string. I have hardware constraints that must be obeyed.

At the lowest level, it's all electrons jumping around. One day, it may be photons. Until then, innovation at the basic level revolves around more betterer ways to get the electrons to "do." Sometimes, there are new ways to get the electrons to "do." There could even be ways to organize that "do" that are innovative and new. Parallel vs serial, etc.

It's all about rigorous rulesets at varying levels. Sometimes, people write new rulesets and tell the machine how to interpret those. The new rulesets make it easier, more betterer, to do certain things using that set of rules.

At the most basic level, considering this, perhaps teaching new computer programmers how to achieve things within a very narrowly defined set of rules is what should be done?

Teach them how to win at chess. Teach them how to win at Go. Then, give them a variety of "games" and tell them to "win." Then... tell them to make their own game, with their own rules, and make it winnable.

Electrical engineers are modern-day wizards. They must learn practical physics. They have to know "how", but they also have to know the "why" that lies behind the "how." If they don't, they end up causing lots of problems, like buildings burning down and equipment exploding... It's important.

How much does a programmer need to know about the "why?" At what point does the pursuit of that knowledge cause the student to begin deviating deeper into the engineering aspects rather than the "playing the game by the rules and winning?"

A "programmer" should, because of their ease with using rulesets to accomplish tasks, be able to pick up a new language and use it proficiently to accomplish tasks. IMO, they should be exposed to as many toolsets as possible. They should be instructed how, in general, these toolsets work and, to a certain extent, "why" they work. For instance, they should have a series of instruction concerning how languages, the tools, are constructed and how they work to move electrons around. (Assembly introduction and concepts relating to even lower-level functions as well as how interpretations of languages are created.)

But, I don't think that, for a programmer, they must "know" Assembly. What's important is their ability to acquire a new tool (rule-set/language) when it is needed or available and to use that tool to accomplish a desired task with an acceptable level of proficiency. Their practical, workable, knowledge may not need to extend beyond the level of the compiler. They know what it does and, to some extent, how and why it does it. But, they don't need an operational level knowledge below that in order to serve their purpose.

An engineer, though, would need that level of knowledge. They build the stuffs that make the "dos." :)
mrbadger wrote:I'm not quite sure what's going on with pure Computer Science.

I can only speak definitively for my university, but the subject doesn't really seem to have realised yet that it's not preparing people for academia.

That was true when I was an undergrad in the early 2000s, and it seems to still be true. They won't let go of this 'we are training academics' mindset, even if they say out loud that they have.
...
Experimental vs practical. Engineers vs architects.

Every time someone comes up with a good, new, tool, the user environment erupts. An MRI revolutionizes medicine, a new plow revolutionizes farming.

Are the students tool builders or tool users? What is it that is being produced versus what it is that one desires to produce?

Students want a degree and knowledge that they can use to pursue their desired goal. That's true in every institution of learning. If the student wants to create new tools, they need an engineering focus. If they want to learn how to use tools and how to learn to use new ones, they need a practical focus. Both? Well, then they can get a dual degree in both. :)

Computer "Science" to me indicates something more tool-creation based than tool-use based inasmuch as the desired goal of such things are different. It helps if a race-car driver knows many of the details of how their automobile works, but that doesn't help them achieve the goal of winning automobile races. They have specialized mechanics to ensure they have the best tool possible for that.

mrbadger
Posts: 14226
Joined: Fri, 28. Oct 05, 17:27

Post by mrbadger » Fri, 12. Jan 18, 23:21

My argument on that has been that we have post graduate courses for specialists.

Undergraduate courses are for generalists, and should be aimed at things people need for the workplace and their careers.
If an injury has to be done to a man it should be so severe that his vengeance need not be feared. ... Niccolò Machiavelli

Morkonan
Posts: 10113
Joined: Sun, 25. Sep 11, 04:33

Post by Morkonan » Fri, 12. Jan 18, 23:52

mrbadger wrote:My argument on that has been that we have post graduate courses for specialists.

Undergraduate courses are for generalists, and should be aimed at things people need for the workplace and their careers.
A good solution, probably. Nobody wants a half-trained tool maker. :)

PS - Geometry!

Geometry is a fascinating subject. I love proofs! Proofs, everywhere! This is this, and this is this... so this must be this, always. Oh, but only if it's just two dimensions... :) If it's three, you'll have to sign up for the advanced courses. :D

kohlrak
Posts: 136
Joined: Thu, 28. Dec 17, 11:47

Post by kohlrak » Sat, 13. Jan 18, 05:56

mrbadger wrote:
Observe wrote:
kohlrak wrote:I think he means he hated actually programming professionally (given his academic nature, it seems logical), so he likes to focus on teaching power users at best, rather than programmers.
Yes, I understand that. I have the utmost respect for mrbadger and meant no offense whatsoever. I was merely trying to make the point that perhaps part of the problem with new graduates is that they have been instructed by those who do well in academia, but who may not be the best to train people for the workplace.
The sort of jobs I was offered were always ones that would pay a lot but take up all of my time and be no fun as far as I could tell.

Microsoft wanted me on their Software Quality Assurance team, based in Ireland. That would have meant becoming an 'expert' on all of their major code-bases, and being part of the team that made sure everyone was keeping to standards.

Lots of money, but no free time.

Or there was Google, where I would have worked on Google Earth. However, to do that I needed to move to Switzerland, without my son. Again, the money was good, but not enough of a reason.

Or Toshiba. I was never quite sure what Toshiba were offering, except that the pay was huge and it was a research post with, as far as I could tell, even less free time.

Moving to the US to work for Microsoft was offered as well, but that would also have meant leaving my son, so I never even considered that.

On the other hand I have always wanted to be an Academic. Even being a nurse was just a step towards that. An odd step I admit, but a step still.

The pay is a fraction of what I could earn, but I love to teach. I always have.

I have always been of the opinion that it's more important that you enjoy your job than do it just for the money.

This is also a thing I try to explain to my students. Personal happiness and work satisfaction are vastly more important than any wage slip.

In my case that's why I was so eager to return to work after my brain injury, in spite of how difficult that was.
See, therein lies another issue. The pay matters, but that doesn't mean you should be blind to other factors just because the pay is high enough. You want to have enough to be comfortable; then ask yourself if you'll need more in the future. If not, why take the extra money? It's also a much higher risk. Are you going to take all that money into the afterlife? Want to give it to charity, fine, but you've got to ask yourself certain questions at the end of the day. The polar opposite of this is where sometimes you have to be like me and take jobs you're not satisfied with, simply because the jobs that would please you just don't pay well enough for you to even pay the bills, let alone live comfortably.
Morkonan wrote:
kohlrak wrote:... I would argue that economics is actually more useful than philosophy in this matter, since philosophy isn't overly useful in making programs outside of the logic bit...
All of your reply is noteworthy, I just want to avoid my penchant for bloviating into the available voids. :)

"To instruct a machine on how to accomplish a task." - Is that it? That's "programming" in a nutshell, right? During that, one may also instruct the machine to accomplish related tasks and how to accomplish them. Along the way, there will also be several "do this" commands, with the mechanisms for doing that already present in the machine's design.
A very simplified assessment. For someone who doesn't do it, that's fine. However, in the thick of things, it's way too simple, and I feel too many people have that mentality when approaching it as part of the trade. It's like being a competitive weightlifter with the attitude "I pick things up and set them back down." No, you've got to challenge yourself, and deeper into it you find that sometimes you want to go lighter than you need to, sometimes heavier, and there's a massive strategy behind it.
Then, stuff changes, and new operators are added to the tools (languages) used to "do programming." Occasionally, something radical causes a... "paradigm shift" and new low-level opportunities are made available which flow up the chain to "programming." That's a one-way influence, isn't it? Or, does innovation in programming language, presumably in more efficient operators, influence hardware as well? I can see hardware being optimized, of course, for language, but how "innovative" is that? Enough for a shared dynamic, each influencing the other?
Let me explain the issue you're describing with some programming concepts to give you a realistic perspective, rather than a purely theoretical one. See, we have this concept called "abstraction." We do this with everything.

You can "abstract down" (which is how we normally learn about the objects in our life), where you take something, say, a basket ball, and say "that's a ball." Then you find out that it bounces, that it has a texture to allow friction to help you spin the ball (either when shooting or on your finger), that it's made of rubber, that it has those grooves on it which can affect it's flight path, and so forth. You have a similar thing with baseballs, except they have different properties. Then you have beach balls, volley balls, tennis balls, marbles, and so forth. All balls can roll, so if you need to roll something, you can grab any one of those balls. That's an abstraction, you can use any ball to fulfill the task of rolling. If you need something to relieve muscle knots, you need a ball or roller that can roll, but is hard enough to put pressure on a small point. A tennis ball may or may not work for this (usually does), but you can definitely rely on a baseball or basket ball to do this. These are abstractions: you're taking something complex, but simplifying it. For most objects, we abstract down and slowly discover how it's different from other objects that we give the same name to. The less complex name is the abstraction. We even do this with people. "Man" or "woman" often comes first, followed by "white, black, indian, asian, etc." Rather than having to store the entirety of the person in your brain, you can recognize properties of them that separate them from other humans, and these properties are often rather simple, and thus aren't as much of a problem to remember, which is why we can remember if someone's skinny or fat, but we can't remember what their face looks like.

You can also "abstract up," like with legos or building blocks, where, if you wanted to build a Pirate Bayamon, you would build a gun, out of those blocks, then you build 3 more like it. You build wings, and you build a skull. You build a body. Each of those then get mixed together to make the bigger bayamon. We make qualities, then combine the qualities. This is less common, but it is precisely what we do with programming, except it doesn't need nearly as much planning as building blocks, because, the first step of buildng the gun for the bayamon, you could potentially use the gun before you put it on the wings.

Now, back to the ball example for the problem you're describing: while any ball can roll, some balls are better suited to the task. Heavier balls roll further. Basketballs take up a lot of space. This represents the concept of "optimization" in programming. What's easiest for the programmer is just getting it to work, but that comes at the cost of a bloated mess. If it's optimized instead, it costs the programmer and/or company a lot, and thus as a customer it will cost you more, but at least when it's running it runs much, much better.

So, in order to get the best of both worlds, sometimes the basketball gets redesigned. You were relying on a regulation basketball, but end up with a "little kiddy basketball" that's half the size, and it doesn't quite work anymore, because you were relying on the mass to help make the shots go in, and the smaller ball just doesn't have enough mass for inertia to carry it as far, so you start missing shots. This is called deprecation. Or, to switch back to building blocks: imagine they started replacing all the yellow blocks with 4 bumps, so if you want yellow blocks, you need to use ones with 6 bumps, and if you want a block with 4 bumps, you have to use a blue one instead. As far as theory is concerned, you should be able to rely on your abstractions, but in practice this isn't the case, as some libraries become "deprecated" or "unsupported" (not the same thing, but not mutually exclusive either). Maybe what you wanted all along was a yellow block with 4 bumps, you've been doing it the other way for some time, and now you can finally do what you wanted to do in the first place. Those times are awesome, but they don't happen all that often.

So, the problem is: hardware changes (the building blocks), drivers change (the communication method between the button that fires the gun and the gun itself), APIs change (the button that fires the gun), and programs change (where the button is placed in the cockpit). And, yes, it's even more complicated than that. Some changes deliver their benefits without changing any of the levels "above" them, and those changes are awesome. Unfortunately, in reality it usually requires some kind of change, or an extra "abstraction layer" (aka "wrapper"), just to make things work like they did before, sometimes without any of the benefits. The extra wrapper complication means more code, more work, and more room for bugs, so oftentimes what we think are improvements actually aren't. Sometimes, though, that's necessary, because the gun design doesn't exist solely to be fired from your Bayamon, but from other Bayamons, and from ships that are not Bayamons.

Programming languages are just one layer. The good news is, the higher layers don't directly affect the lower layers.
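The ball example, as a hypothetical C++ sketch (the class and function names are mine): code written against the abstraction works with any concrete ball, which is exactly what deprecation later breaks.

    // The abstraction: every ball can roll.
    struct Ball {
        virtual ~Ball() = default;
        virtual void roll() = 0;
    };

    // Concrete balls differ in their other properties, but all satisfy Ball.
    struct Basketball : Ball { void roll() override { /* heavy, rolls far */ } };
    struct TennisBall : Ball { void roll() override { /* light, rolls less */ } };

    // Code written against the abstraction works with any ball...
    void relieve_muscle_knot(Ball& b) { b.roll(); }

    int main() {
        Basketball b;
        relieve_muscle_knot(b);  // ...until a "kiddy basketball" quietly changes
    }                            // the mass callers relied on: deprecation.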
Babbage moved automatons and windmills from simple, fairly single-purpose tasks to a multitask capability. A machine that, within physical constraints, would do what you told it to do. That's more than the remarkable checkers-playing clockworks or variable-speed transmissions did.

Philosophy, from what little I know of it, will often use a rigorous set of principles in order to explore a concept or present an argument. The "rigor" presented in such things is what is important. If one strays, all is lost, since it naturally devolves into babble. Unless one creates a suitable foundation, already proven in its stability, the house collapses. Well, unless one invents a new, equally firm, foundation from which to make an argument or explore a concept.

Create a thing from a set of rules.

Tinkertoys. Legos. Building blocks. I don't mean just "objects" but constraints. You can't tell a machine to do something it isn't designed to do. You can force it to accomplish a task it wasn't designed for, but you can't force the "how" by which it accomplishes that task if there is nothing in the machine that can support that "how." I can't tell a television to put a balloon on a string. But, I can certainly instruct a television on how to display a balloon on a string. I have hardware constraints that must be obeyed.

At the lowest level, it's all electrons jumping around. One day, it may be photons. Until then, innovation at the basic level revolves around more betterer ways to get the electrons to "do." Sometimes, there are new ways to get the electrons to "do." There could even be ways to organize that "do" that are innovative and new. Parallel vs serial, etc.

It's all about rigorous rulesets at varying levels. Sometimes, people write new rulesets and tell the machine how to interpret those. The new rulesets make it easier, more betterer, to do certain things using that set of rules.

At the most basic level, considering this, perhaps teaching new computer programmers how to achieve things within a very narrowly defined set of rules is what should be done?
Assembly and BASIC are popular for this reason: they're lower-level tools with simpler parts, upon which the newer, higher-level languages are built. This means everything that exists can be translated into them, so from familiarity with the simpler tools that are "harder to do things with," you should be able to explain pretty much any higher-level programming language: if you can explain mov, add, sub, jmp, cmp, jX, and how labels work, the higher languages are just buildings on those fundamentals.

People are afraid of this, though, because these fundamental concepts can be "hard to teach." Instead, a lot of teachers avoid the fundamental concepts, and avoid certain features of the "easier tools" because those features rely on the fundamentals they don't wish to explain. This is a huge problem in foreign language learning, too (especially Japanese and English). The "if they need it, they'll figure it out on their own" mentality might be true, but you are wastefully making them learn, the hard way, what they inevitably need to learn. Maybe they don't need to learn assembly, but they will need to learn pointers, and pointers are way easier to teach in assembly. I understand, you need pointers to do assembly, but you will always need them. By avoiding the topic you're just putting pointers off; they'll learn them one way or another, even if it's the hard way, and without a complete grasp of the topic (thus leading to incorrect assumptions that then become bugs), because you were lazy.
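A rough illustration of why pointers are more naked in assembly (32-bit x86, NASM-style syntax, assumed for the example): the dereference is a visible, separate step rather than a piece of punctuation.

    ; C++: *p = 42;
    mov eax, [p]            ; load the pointer itself (an address) into a register
    mov dword [eax], 42     ; store 42 at that address; the dereference is explicit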
Teach them how to win at chess. Teach them how to win at Go. Then, give them a variety of "games" and tell them to "win." Then... tell them to make their own game, with their own rules, and make it winnable.

Electrical engineers are modern-day wizards. They must learn practical physics. They have to know "how", but they also have to know the "why" that lies behind the "how." If they don't, they end up causing lots of problems, like buildings burning down and equipment exploding... It's important.
This happens with programs, too. Most of the time, if you have the right abstracted parts, you can get things right, or "good enough." My girlfriend's father is an electrician, and you should hear some of his horror stories. You'd be surprised how much incompetence exists in electrical engineering as well. You'll find, with enough research, that the incompetence levels you see in your own field exist in pretty much every field. For better or worse, the results are, by pure luck I imagine, rarely the worst-case scenario. The curse of being a professional in something dangerous is knowing that your peers are incompetent, even after normalizing for the Dunning-Kruger effect.
How much does a programmer need to know about the "why?" At what point does the pursuit of that knowledge cause the student to begin deviating deeper into the engineering aspects rather than the "playing the game by the rules and winning?"

A "programmer" should, because of their ease with using rulesets to accomplish tasks, be able to pick up a new language and use it proficiently to accomplish tasks. IMO, they should be exposed to as many toolsets as possible. They should be instructed how, in general, these toolsets work and, to a certain extent, "why" they work. For instance, they should have a series of instruction concerning how languages, the tools, are constructed and how they work to move electrons around. (Assembly introduction and concepts relating to even lower-level functions as well as how interpretations of languages are created.)

But, I don't think that, for a programmer, they must "know" Assembly. What's important is their ability to acquire a new tool (rule-set/language) when it is needed or available and to use that tool to accomplish a desired task with an acceptable level of proficiency. Their practical, workable, knowledge may not need to extend beyond the level of the compiler. They know what it does and, to some extent, how and why it does it. But, they don't need an operational level knowledge below that in order to serve their purpose.
But what if the level of proficiency, in order to be acceptable, must be operational-level? To that end, the latest people asking for my help get the following explanation from me. I start with electrons moving in a wire, then move to diodes (and how they work), then to transistors, then to a basic logic gate. From logic gates we can build a simple adder (simple addition), a lamp display, and switches to enter information in binary notation. The inputs are the switches; the output is the lamps. Once you understand an adder, you know that other circuits exist, and you are free to look them up if you wish, but you don't have to. As long as you know the adder, you understand enough to move up a level.

Next, assume we add another set of switches that selects the operation, and another set that says which array of switches becomes the inputs. Then you get solid-state inputs, which can be fed from an output, so the output of a previous operation can become the input of the next one. At this point, you have a very obnoxious-to-use calculator. From there, we can remove all the switches and fill the solid-state memory from switches and buttons; from there, we can easily build a handheld calculator, or start talking about RAM. Once you have RAM, you can start talking about ROM. Once you have ROM and RAM, we can run a program. Once we can run a program, we can start talking about adding other hardware. Once we have all that other hardware (speakers, screen, buttons), those things can have processors, RAM, and ROM of their own (sound card, video card, keyboard; and yes, keyboards do have their own microcontrollers).

Then you need a driver as an abstraction layer, so programmers have a common interface to different machines and hardware. Then, as everyone develops their own ideas, techniques, etc., software forever expands and adds more abstraction layers (the OS provides common APIs, which use common drivers, which use common hardware, and so on). If you want to know more, you now have the basics necessary to follow up with your own research. Whereas without those basics, you can't even ask "how does an operating system work?", because the answer might not be at the level you need it to be for you to actually understand it (either too simplified to answer the question that actually has you looking into the topic, or too complex to wrap your head around).
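The gate-to-adder step he describes, as a minimal C++ sketch (treating bools as one-bit signals and the bitwise operators as gates; full_adder is my own name for it):

    #include <cstdio>

    // A full adder built from logic gates: XOR gives the sum bit, AND/OR the carry.
    void full_adder(bool a, bool b, bool carry_in, bool& sum, bool& carry_out) {
        sum = a ^ b ^ carry_in;
        carry_out = (a & b) | (carry_in & (a ^ b));
    }

    int main() {
        bool sum, carry;
        full_adder(true, true, false, sum, carry);    // 1 + 1 = 10 in binary
        std::printf("sum=%d carry=%d\n", sum, carry); // prints sum=0 carry=1
    }

Chain one of these per bit, carry_out feeding the next carry_in, and you have the multi-bit adder the lamps-and-switches demo builds up to.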
An engineer, though, would need that level of knowledge. They build the stuffs that make the "dos." :)
Everyone does. Trust me. Imagine someone with website-building knowledge only being tasked to write drivers for a new military laser for shooting down incoming missiles, simply because he's certified as "the computer guy." Obviously, the militaries around the world are more competent than that, but are they competent enough? This scares me. Would they settle for someone who has experience writing CD-ROM drivers?
mrbadger wrote:I'm not quite sure what's going on with pure Computer Science.

I can only speak definitively for my university, but the subject doesn't really seem to have realised yet that it's not preparing people for academia.

That was true when I was an undergrad in the early 2000s, and it seems to still be true. They won't let go of this 'we are training academics' mindset, even if they say out loud that they have.
...
Experimental vs practical. Engineers vs architects.

Every time someone comes up with a good new tool, the user environment erupts. An MRI revolutionizes medicine; a new plow revolutionizes farming.

Are the students tool builders or tool users? What is it that is being produced versus what it is that one desires to produce?
This is the kicker with computers, and why it's especially difficult. We see it a little with Arduino, but in reality we both make and use the tools. This topic is often hard for me in terms of game dev. I really want to make a game that's moddable at the core. Thus, I'm a tool maker, not a tool-using game maker.

I have had ideas for awesome sandbox games since before Minecraft ever came out. I had this really cool idea for an Elder Scrolls kind of thing in space, where users could build scripts within a specific API and just script new races in, new ships, new campaigns, etc., like everyone's doing right now. But even cooler was the idea that the AI would actually try to live in this universe despite the changes, and actually try to adapt to the mods.

Those plans have since evolved quite a bit, and I've replaced them all with this idea of a virtual machine where programs can call other programs as abstractions, but can also override functions in higher-level programs (much like classes in Java and C++), all without having to recompile the entire code base (a tiny sketch of that dispatch idea follows below). On top of that, I would make a game with a world where only three languages are spoken, none of which actually exist in the real world. The languages would be based on what makes sense for existence in the game world, so the AI could legitimately communicate with each other (instead of sharing common knowledge through global variables). As a player, your goal is to integrate yourself into this world and learn the language of the starting village (where you would have "parents" who teach you the language like real parents do). To get out of the village and experience the real game, you would have to show proficiency in casting magic, talking to NPCs, etc. This would be tiered rather than a single test, so one simple test would have you learn how to ask for food and water, and slowly you could ask for more materials. The AI wouldn't just listen to your commands; you'd have to earn your rank, and you could lose rank by being demanding and never contributing to the projects.

The best part? The project is not as ambitious as it sounds. The hardest part would be building the compiler/assembler for the virtual machine. However, a bigger problem is that I know it would need a really patient starting community who can get into the gameplay for it to take off. Therefore, I know the project would fail. Granted, none of the ideas are actually new; they have existed in several games before (NetHack, Adventure Bar Story [3DS], Barony, etc. have had artificial languages randomly generated on each new playthrough, for example; Fantasy Life has shown there's an interest in playing various different roles in a world; Cave Story and other retro games showed you don't need a complex 3D world to make people happy).
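Since I mentioned overriding functions without recompiling, here's a minimal C sketch of that dispatch idea (the names game_api, greet, and both behaviours are made up purely for illustration):

Code: Select all

#include <stdio.h>

/* A hypothetical "higher level program" exposes its functions
   through a table of pointers, so a mod can override one at
   load time without anyone recompiling. */
struct game_api {
	void (*greet)(const char *name);
};

static void default_greet(const char *name) { printf("Hello, %s.\n", name); }
static void modded_greet(const char *name)  { printf("Well met, %s!\n", name); }

int main(void)
{
	struct game_api api = { default_greet };
	api.greet("traveler");      /* base behaviour */
	api.greet = modded_greet;   /* a mod swaps the function in at runtime */
	api.greet("traveler");
	return 0;
}

Virtual functions in C++ and Java do the same trick one layer down; the VM idea just moves it between whole programs.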
Students want a degree and knowledge that they can use to pursue their desired goal. That's true in every institution of learning. If the student wants to create new tools, they need an engineering focus. If they want to learn how to use tools and how to learn to use new ones, they need a practical focus. Both? Well, then they can get a dual degree in both. :)
In reality, it really is both. People familiar with the CLI especially will quickly find themselves making tools, even if they didn't get into it expecting that. Programming really is all about making tools. Some tool making is just more like tool making than other tool making.
Computer "Science" to me indicates something more tool-creation based than tool-use based inasmuch as the desired goal of such things are different. It helps if a race-car driver knows many of the details of how their automobile works, but that doesn't help them achieve the goal of winning automobile races. They have specialized mechanics to ensure they have the best tool possible for that.
You'd be surprised how well a race-car driver knows their automobile. I personally know a few small-fry professionals. Mechanics seem to be more-or-less sidekicks or hired hands rather than a completely different part of the same team. Just think of how well you know the various ships in X. Now, if it were realistic, every ship of the same race and class would have little quirks and be slightly different. Ever notice this about the cars you've driven? Even between two cars of the exact same year and model, maybe one steering wheel has a larger "deadzone," or one accelerates faster than the other, even though you feel those differences shouldn't exist. As such, a race-car driver is very familiar with his cars, and many are often under the hood with their mechanics: they want to make sure things are tweaked to their own personal preferences. You see this less with, say, an air force, but you still see it. You wouldn't believe how much a pilot learns about his airplane. Falcon 4 BMS has given me an ever-so-small taste of that reality. The basic rule is: the more competitive the market, the more you have to know to give yourself an edge.
Morkonan wrote:
mrbadger wrote:My argument on that has been that we have post graduate courses for specialists.

Undergraduate courses are for generalists, and should be aimed at things people need for the workplace and their careers.
A good solution, probably. Nobody wants a half-trained tool maker. :)

PS - Geometry!

Geometry is a fascinating subject. I love proofs! Proofs, everywhere! This is this, and this is this... so this must be this, always. Oh, but only if it's just two dimensions... :) If it's three, you'll have to sign up for the advanced courses. :D
Honestly, I'd much prefer a grading system that wasn't based on "pass or fail" but on "this is your level of competence." In other words, if you can't handle writing functions, you're on the web-design team, because you passed the class with a 10%.

pjknibbs
Posts: 41359
Joined: Wed, 6. Nov 02, 20:31
x4

Post by pjknibbs » Sat, 13. Jan 18, 07:10

kohlrak wrote: Assembly and BASIC are popular for this reason: they're lower level tools with simpler things, upon which the newer higher level languages are based.
BASIC is in no way a low-level language. The reason it's popular is because of the home computer boom of the 80s, where almost all of them had a BASIC interpreter built in--many of today's older programmers can trace back their start in programming to one of those computers, at least in Europe.

User avatar
Observe
Posts: 5079
Joined: Fri, 30. Dec 05, 17:47
xr

Post by Observe » Sat, 13. Jan 18, 07:31

pjknibbs wrote:
kohlrak wrote: Assembly and BASIC are popular for this reason: they're lower level tools with simpler things, upon which the newer higher level languages are based.
BASIC is in no way a low-level language. The reason it's popular is because of the home computer boom of the 80s, where almost all of them had a BASIC interpreter built in--many of today's older programmers can trace back their start in programming to one of those computers, at least in Europe.
This ^^

You basically have: machine language -> assembly language -> Basic and all the other high-level languages. Basic, in its turn, was influenced by Fortran, which is still used to this day; primarily in certain scientific applications.

BTW, my first programming was using machine language for the Signetics 2650 microprocessor in 1975, to control a variable frequency, synthesized A.C. power supply for the U.S. Air Force. :)

User avatar
Morkonan
Posts: 10113
Joined: Sun, 25. Sep 11, 04:33
x3tc

Post by Morkonan » Sat, 13. Jan 18, 09:15

kohlrak wrote:Computer science isn't all that fluid, either. Everyone just comes up with their own new ideas that they think are new just because it has a new name, but really isn't....
Then, distill it - What is it? Here's a "definition" - "the study of the principles and use of computers." (courtesy of Google)

As a rule, anytime you use the same word to define a thing, it's not a "definition." What's a "truck?" "Well, I'm glad you asked that question! It's a truck!" /sigh

I get it, it's the "science of computers." OK. But, courtesy of Wikipedia (All Praise to The Almighty Wiki) -

"... An alternate, more succinct definition of computer science is the study of automating algorithmic processes that scale. A computer scientist specializes in the theory of computation and the design of computational systems.[1]..

Now, that is much more betterer. It's a heck of a lot more meatier than " it's about computers."

IF that is the most succinct definition, that's where one should start when teaching or learning it, right? Sure, there's lots of other stuff, but all of that builds on that concept. If it doesn't, then it's "something else." That's true of every subject, each having their branches of specialization, some becoming fully independent subjects themselves as a result.
What changes instruction in Computer Science? What is the active operator? What is considered "new knowledge?"
A person with an idea, a promise, and the belief that something they're doing is actually different.
Too general. :) A promise of innovation isn't innovation. A claim that a new ball will roll further isn't innovation unless it does. It won't create a paradigm shift if it doesn't do... what? What must it effect or affect in order to drive true change? There's a nugget deep in Computer Science that will respond to that thing. What's that nugget and what does it pay attention to?

No, it's not easy, since the notion itself screams out that it's not easily predictable. But, like porn, any student of it should know it when they see it. :)
Rerepresentation of math is more accurate, to the degree that it can have functional properties. ...
THIS IS GOLD. I'm purposely ignoring the part where you went deeper into explaining this sentence. No more explanation is needed, nor should the focus be clouded with needless words. Unfortunately, I'm talking now, supplying them all for you.

Is math an "abstraction?" Point to ponder, perhaps. But, what you're really doing is manipulating an existing system, mathematics, to control another system, physics. Mathematics, through all the funky controls implemented, manipulates physical controls to yield a desired result. But, mathematics itself, if it isn't truly the "language of the Universe" isn't the system you're manipulating.

You're re-representing the controls in mathematical terms. Why? Because, mathematics is an abstraction, representing the controls in place over a physical system. And, why math? Precision and rigor, no matter how it's defined. It's also much better than using Portuguese... Besides, things get difficult to really understand at their most basic level without maths. Maybe it is the language of the Universe, just not very well-liked by its inhabitants. I suppose some people like Portuguese, so let them be the "mathematicians."
Math is more of an art than a science, even though we use it in science. I see computers as no different, aside from the fact that we use science to make computers. And if you understand the connection between the three, you start to view modern scientific theories (especially quantum physics, computer modeled events, etc) in a very, very new light, and probably not a positive one when you also realize how wildly inaccurate it is. Add psychology to the mix and you'll be a skeptic of human "knowledge."
I'm a skeptic of what people call knowledge. I think "knowledge" is a misleading term. I also think that most people don't consider our reality very well or very deeply, but that's another subject.

"There was a crooked man, and he walked a crooked mile,
He found a crooked sixpence against a crooked stile;
He bought a crooked cat which caught a crooked mouse,
And they all lived together in a little crooked house.
" - Mother Goose
... As such, the more you know of a processor's assembly, the more you know about what you should and shouldn't do on it.
"Its assembly."

If it changes from time to time, something changes it and it's not fundamental. (Different architectures may require different instruction sets.) There are controls further below, operators that respond to the commands and "make do" stuff. Assembly, in this interpretation, is simply the most basic instruction set for those other things. And, it's the one that is open for manipulation to those uninitiated in the mysteries, since they'd set the house on fire if they were allowed to go any deeper...

If you type out assembly code, you're not moving electrons. You're telling something else to move them. Something in a particular chip or on the board responds, shunting things around according to your command. The existence of 101101101 is not reflected in little ones and zeroes lining up, since ones and zeros don't do anything themselves. There's a fuzzy electron bouncing or reaching, depending on how you look at it, being influenced in its movement, however you look at that, to avoid burning the whole thing down while still making something happen in the "aggregate."
It's not inconceivable to teach from the atomic level. I've tried to do this a couple of times, but your average Joe gets too bored. Which I understand: Arduinos are cool, but most people don't want to use them beyond picking up a "shield."
A blacksmith must have an understanding of metallurgy. That's the boring part to people who aren't blacksmiths. :) In a way, a blacksmith is a practical chemist. A "computer scientist" may need a deeper, more functional understanding of the physics of the quanta, but a programmer needs some portion of that understanding as well. A cert-carrying Cisco engineer, some Novell guy, someone humping footballs writing netcode? Not so much. It might help, though.


On the subject of teaching and prediction of performance:

So, what works in other disciplines? That's the subject, right? "How do we make this betterer?"

History is a good subject to start with. It's all about "starts" anyway. Most teaching seems to include a lot of it in early classes. "The history of our subject, part one" is usually <insert class subject here> 101 level. Every textbook is full of "the history of the evolution of the making of the design of the study of the idea that is the result that we now call _____." Surely, that's important in some fundamental way besides an instructor bragging on how far we've come since the days when we actually had to chase our food.

I've had friends, mostly women, who decided to go into the profession of teaching. I can only assume, since I paid little attention, that such subjects start off with discussing Plato and a bunch of Greeks arguing with each other as students watched.

But, I also assume that they don't actually... do this. They don't move chairs around and then get students to stand up and start "teaching" in Teaching 101. If they don't do that, how can they be reasonably expected to understand how teaching has evolved and how it has improved? Or... has it?

If Computer Science is looking for a better way to teach and, perhaps, be able to predict a student's success given a set of teaching conditions, it may need to look to other professions, even ones some would consider to be unrelated. How do these professions achieve their goals with success? Copy that. Paste it. Save.
This is the kicker with computers, and why it's especially difficult. We see it a little with Arduino, but in reality we both make and use the tools. This topic is often hard for me in terms of game dev. I really want to make a game that's moddable at the core. Thus, I'm a tool maker, not a tool-using game maker.
I had a similar idea. I wanted a game that I couldn't predict and one that readily accepted complexities and responded to them. And, not just through some RNG, either. I wanted a game that built its own rules and changed them in response to "x" much like a person would, but still within certain limits. (Basically, "AI.")

A long time ago, in a basement far far away, I started coming up with a grand strategy modern warfare game. It was all just numbers with some primitive graphics, but I had loads and loads of data on units, strengths, weaknesses, terrain, etc. But, I sucked at programming, so it never got to the point where it would "do." I made a character generator and a sort of adventure maker for D&D, instead. That was easy. :) Thus ended my dream.
You'd be surprised how well a race-car driver knows their automobile...
Well, kinda. I had a race-car driver that would constantly solicit me for sponsorship. We bought some tires for him and he plastered our name on the side of his car. Tiny letters. You get bigger letters if you pay more. :)

Race-car driver "professionals" are Professionals. They have an in-depth knowledge of their field including how to change blinker fluid and all the other stuffs they should know about their tools. They have a specialized knowledge and capability inside their field, though.

A point spawned from another post, can't remember whose or where: As a college student, I once got offered a job working on a classified military project, simply because I could "computer." WTF? My "do computer" was pretty much limited to "make this fookin' thing run game and make this other thing get pr0n from teh bbs." And one of the leads on the project, who wasn't computer literate, was scrambling to get staffed. I told him I sucked. He didn't care. "We'll get you taught, just show up and you're golden." The project got beaucoup funding and is all over the place today, revealed in the open, mostly, and still managing to get bunches of dollars thrown at it. Eventually, it'll all hit "sci-fi-reality" in a few years. I keep kicking myself whenever I think about it. But, no regrets, really, as I have no desire to make more of an idiot of myself in public on that scale. I'll just stick it out here in OT and accomplish much of the same thing, with less volume. ;)

User avatar
JSDD
Posts: 1378
Joined: Fri, 21. Mar 14, 20:51
x3tc

Post by JSDD » Sat, 13. Jan 18, 09:49

"ability to learn" is an inherent capability of all living creatures, isnt it ?

"programming" is nothing more than a sequence of instructions, what varies is the "language" in which these instructions are written ... i'm sure that every monkey can learn "program" a machine to give them a banana if you teach them the language ... there's no magic behind it
To err is human. To really foul things up you need a computer.


Mission Director Examples

kohlrak
Posts: 136
Joined: Thu, 28. Dec 17, 11:47

Post by kohlrak » Sat, 13. Jan 18, 12:08

pjknibbs wrote:
kohlrak wrote: Assembly and BASIC are popular for this reason: they're lower level tools with simpler things, upon which the newer higher level languages are based.
BASIC is in no way a low-level language. The reason it's popular is because of the home computer boom of the 80s, where almost all of them had a BASIC interpreter built in--many of today's older programmers can trace back their start in programming to one of those computers, at least in Europe.
Some processors execute BASIC without compilation. Technically, this makes BASIC a dialect of assembly. Not my argument, per se, but I've heard it before and I think it's a legitimate stance. It is certainly low-level, even if interpreted.
Observe wrote:
pjknibbs wrote:
kohlrak wrote: Assembly and BASIC are popular for this reason: they're lower level tools with simpler things, upon which the newer higher level languages are based.
BASIC is in no way a low-level language. The reason it's popular is because of the home computer boom of the 80s, where almost all of them had a BASIC interpreter built in--many of today's older programmers can trace back their start in programming to one of those computers, at least in Europe.
This ^^

You basically have: machine language -> assembly language -> Basic and all the other high-level languages. Basic, in its turn, was influenced by Fortran, which is still used to this day; primarily in certain scientific applications.

BTW, my first programming was using machine language for the Signetics 2650 microprocessor in 1975, to control a variable frequency, synthesized A.C. power supply for the U.S. Air Force. :)
If you do know machine language, then you should also know that assembly is only a level of transcription rather than translation. Either that, or you skipped learning assembly, at which point I feel really sorry for your programming experience.
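To illustrate the transcription point with a worked example of my own (x86 chosen arbitrarily): the bytes and the mnemonic are a one-to-one spelling of each other, which a few lines of C can show:

Code: Select all

#include <stdio.h>

/* The x86 encoding of "mov eax, 42": opcode B8 (mov eax, imm32)
   followed by the immediate 42 in little-endian order. The mnemonic
   is a transcription of these bytes, not a translation. */
int main(void)
{
	unsigned char mov_eax_42[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00 };

	printf("opcode 0x%02X, immediate %d\n",
	       mov_eax_42[0],
	       mov_eax_42[1] | mov_eax_42[2] << 8 |
	       mov_eax_42[3] << 16 | mov_eax_42[4] << 24);
	return 0;
}

An assembler maps each mnemonic to its byte pattern and back; a compiler, by contrast, has to invent structure that was never written down.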
Morkonan wrote:
kohlrak wrote:Computer science isn't all that fluid, either. Everyone just comes up with their own new ideas that they think are new just because it has a new name, but really isn't....
Then, distill it - What is it? Here's a "definition" - "the study of the principles and use of computers." (courtesy of Google)

As a rule, anytime you use the same word to define a thing, it's not a "definition." What's a "truck?" "Well, I'm glad you asked that question! It's a truck!" /sigh

I get it, it's the "science of computers." OK. But, courtesy of Wikipedia (All Praise to The Almighty Wiki) -

"... An alternate, more succinct definition of computer science is the study of automating algorithmic processes that scale. A computer scientist specializes in the theory of computation and the design of computational systems.[1]..

Now, that is much more betterer. It's a heck of a lot more meatier than " it's about computers."

IF that is the most succinct definition, that's where one should start when teaching or learning it, right? Sure, there's lots of other stuff, but all of that builds on that concept. If it doesn't, then it's "something else." That's true of every subject, each having their branches of specialization, some becoming fully independent subjects themselves as a result.


Well, to that degree, you would expect the main purpose of getting into the field as a whole, without specialization, to be acquiring a functional level of knowledge of all the major layers, to the point where you can continue into your own research. But by that definition, comp sci becomes a 101, an entry point into a more specialized field.

But the truth is, the definitions you find and the definitions in the wild are different. When you sign up for a "comp sci" course, like I did in high school, it basically translates to "learn the flavor-of-the-day programming language." In my case, the flavor of the day was Java (yep, I have Computer Science credits, but never went to college, so they're worthless to me). By that definition, the old Comp. Sci. was C++, which I taught myself years before. And the difference between Java and C++ from a language point of view? Very, very little. In fact, the stuff the book I was reading failed to teach me about C++, I ended up learning through the Java course.

I had a potential employer ask me to write him some code to show my worth, and I passed the interview writing C# when I had never coded in C# or taken a course on it. How? I googled the API and looked at the compiler errors when it wouldn't compile. He was really anxious to hire me, but I turned him down for a very special set of reasons (I met someone who knew the guy; he was overambitious and likely to go under [to this day, I don't know what happened to his company, whether it moved or he gave up]). For perspective, C# is probably the Comp. Sci. flavor of the day right now, begging to be replaced with something new, maybe C++17.
What changes instruction in Computer Science? What is the active operator? What is considered "new knowledge?"
A person with an idea, a promise, and the belief that something they're doing is actually different.
Too general. :) A promise of innovation isn't innovation. A claim that a new ball will roll further isn't innovation unless it does. It won't create a paradigm shift if it doesn't do... what? What must it effect or affect in order to drive true change? There's a nugget deep in Computer Science that will respond to that thing. What's that nugget and what does it pay attention to?
Well, my answer leaned a little toward sarcasm. The reality of innovation right now, as far as programming languages go, is basically everyone patting each other on the back and calling new features a welcome addition. Pretty much everyone's converting their language into the same thing, so they get brownie points for adding features that people enjoy from some other programming language. From what I can tell, the end goal is "how do we make it easier for people to learn programming, and make fewer mistakes while doing so?" It never accomplishes that goal, because the task of nannying the program makes certain tasks difficult, which makes it harder for students to get into programming, because they can't do what they wanted to do that got them learning in the first place. Or the language ends up being slow on real machines doing real programs. Or, perhaps, something else makes the whole "innovation" less practical. Ultimately, the "newer, easier, and safer" programming languages end up being like giving someone a dull knife: they just try that much harder to do what they want to do and end up cutting themselves (giving up, or doing something they weren't supposed to do).

And if you really want some comedy, go look up the "goto debate." It's one of those debates where it usually takes a beginner, or someone not studied in the field, to tell the nanny side that they're absolutely full of it. The gist of the arguments boils down to the first code block below somehow making more sense than the second.

Code: Select all

/* structured version: run the commands forever */
while (true) {
	command1();
	command2();
	command3();
	command4();
	command5();
}

Code: Select all

/* goto version: run the commands forever */
the_beginning_of_the_end:
	command1();
	command2();
	command3();
	command4();
	command5();
goto the_beginning_of_the_end;
If the goal is to do something over and over and over again, which one seems more "readable"? Yep, welcome to the politics of Computer Science. If you think it's anything more than preference, note that the debate revolves around programming languages that make both options available to the programmer.
No, it's not easy, since the notion itself screams out that it's not easily predictable. But, like porn, any student of it should know it when they see it. :)
How about the emperor's new clothes? People are seeing it when it's not even there. I'm still trying to see how lambdas in C++ are innovative when Java et al. have had them for a long time. Don't get me wrong, I appreciate the complication of the tool (I honestly do appreciate having an option I probably won't use, simply because it's an option), but don't say it somehow makes things easier or is somehow innovative.
Rerepresentation of math is more accurate, to the degree that it can have functional properties. ...
THIS IS GOLD. I'm purposely ignoring the part where you went deeper into explaining this sentence. No more explanation is needed, nor should the focus be clouded with needless words. Unfortunately, I'm talking now, supplying them all for you.

Is math an "abstraction?" Point to ponder, perhaps. But, what you're really doing is manipulating an existing system, mathematics, to control another system, physics. Mathematics, through all the funky controls implemented, manipulates physical controls to yield a desired result. But, mathematics itself, if it isn't truly the "language of the Universe" isn't the system you're manipulating.
I'm glad I finally heard someone agree with me that mathematics is not the language of the universe (odds are you haven't even heard me make this argument, but it's a staple of mine where science and politics intermix, and it equally destroys common atheist and common religious doctrines). The thing is, if mathematics is what it is, we're essentially making a new branch, one mixed with physics and common notation and interpretation, in order to manipulate a small amount of matter in this universe. If you think about it, it's incredibly inefficient, but still way more efficient than anything we've had before. The idea that wire 1 carrying electricity represents 1, wire 2 carrying electricity represents 2, wire 3's electricity representing 4, and so on, is merely a "standard" we conform to. We don't even have a standard for which values mean addition, subtraction, etc. Programming languages above assembly are basically meant to hide this basic fact.
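As a concrete illustration of that convention (the code and names are mine, not anything standard), here's a few lines of C treating each "wire" as one bit of place value:

Code: Select all

#include <stdio.h>

/* The convention described above: an energized wire k means 2^k.
   A number is just the set of wires currently carrying current. */
int main(void)
{
	int wires[4] = { 1, 0, 1, 0 };   /* the "1" and "4" wires are live */
	int value = 0;

	for (int k = 0; k < 4; k++)
		value += wires[k] << k;  /* wire k contributes 2^k */

	printf("value on the bus: %d\n", value);  /* prints 5 */
	return 0;
}

Nothing in the hardware "knows" this is a five; the meaning lives entirely in the standard we agreed on.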
You're re-representing the controls in mathematical terms. Why? Because, mathematics is an abstraction, representing the controls in place over a physical system. And, why math? Precision and rigor, no matter how it's defined. It's also much better than using Portuguese... Besides, things get difficult to really understand at their most basic level without maths. Maybe it is the language of the Universe, just not very well-liked by its inhabitants. I suppose some people like Portuguese, so let them be the "mathematicians."
The point is that the abstractions allow you to return to math, which is a common language that smart humans speak more widely than any other. The messages don't need anything more than numbers, plus letters to stand for numbers and for sequences of numbers you can't be bothered to remember. At the end of the day, it's still math; we've just manipulated the rules to sandbox it (or rather, without sandboxing we couldn't make a common interface with it).
Math is more of an art than a science, even though we use it in science. I see computers as no different, aside from the fact that we use science to make computers. And if you understand the connection between the three, you start to view modern scientific theories (especially quantum physics, computer modeled events, etc) in a very, very new light, and probably not a positive one when you also realize how wildly inaccurate it is. Add psychology to the mix and you'll be a skeptic of human "knowledge."
I'm a skeptic of what people call knowledge. I think "knowledge" is a misleading term. I also think that most people don't consider our reality very well or very deeply, but that's another subject.

"There was a crooked man, and he walked a crooked mile,
He found a crooked sixpence against a crooked stile;
He bought a crooked cat which caught a crooked mouse,
And they all lived together in a little crooked house.
" - Mother Goose
We seem to disagree on pretty much everything except this. So I have a feeling our disagreements revolve around a fundamental difference in how to handle this.
... As such, the more you know of a processor's assembly, the more you know about what you should and shouldn't do on it.
"Its assembly."

If it changes from time to time, something changes it and it's not fundamental. (Different architectures may require different instruction sets.) There are controls further below, operators that respond to the commands and "make do" stuff. Assembly, in this interpretation, is simply the most basic instruction set for those other things. And, it's the one that is open for manipulation to those uninitiated in the mysteries, since they'd set the house on fire if they were allowed to go any deeper...
The instruction sets tend to vary wildly. While we do have some common elements like add, sub, etc., some of the even simpler operations like call, ret, and mov may or may not exist in that form, or exist with a completely different syntax. To the degree there's any commonality in assembly, it boils down to one command per line. GNU Assembler syntax for x86 even violates the usual "mnemonic destination, source" order (Intel syntax writes "mov eax, 42"; GNU as writes "movl $42, %eax"). That said, there are fundamentals to it, but everyone's got their own ideas. For any given assembler, though, the basic syntax of the instructions can easily be learned in about 30 seconds. The instruction-set learning curve depends on the CPU family, and the macro and conditional-assembly features depend on the assembler, but they often follow a syntax relatively similar to another assembler's, with some minor changes. It's like the difference between C, C++, C#, Java, etc.: set them side by side and you might have trouble telling them all apart, but you usually can't just drop one in to replace another, since they differ on "fundamental levels."
If you type out assembly code, you're not moving electrons. You're telling something else to move them. Something in a particular chip or on the board responds, shunting things around according to your command. The existence of 101101101 is not reflected in little ones and zeroes lining up, since ones and zeros don't do anything themselves. There's a fuzzy electron bouncing or reaching, depending on how you look at it, being influenced in its movement, however you look at that, to avoid burning the whole thing down while still making something happen in the "aggregate."
More than one, but pretty accurate otherwise, and unnecessarily abstracted. It really is as simple as: we have a thing called a transistor with wires A, B, and C. If electrons flow from C (ground) to A (voltage source), then electrons can also flow from C to B; otherwise it doesn't happen. Accepting that electricity flows through the path of least resistance, you string a couple of these ABC combinations together and you get a simple AND or OR gate. Then you string a few of those together to get some more advanced logic: one number in the standard representation mixes with another number in the same standard, and the overall circuit outputs another number, again in the same standard. Now, if the line representing 1 controls whether the lights have power, the line representing 2 controls whether the locks have power, and the line representing 4 controls whether the security system has power, you punch in 6 to go to bed and 7 if you're going to the store. It's scarier than it is complicated, and when you face it, it really does end up being simple.
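That house-wiring picture is plain bit flags. A minimal sketch in C (the names and the scenario are made up to match the example above):

Code: Select all

#include <stdio.h>

/* One bit per power line, exactly as described above. */
enum {
	LIGHTS   = 1 << 0,  /* the line representing 1 */
	LOCKS    = 1 << 1,  /* the line representing 2 */
	SECURITY = 1 << 2   /* the line representing 4 */
};

static void punch_in(unsigned state)
{
	printf("%u: lights %s, locks %s, security %s\n", state,
	       (state & LIGHTS)   ? "on" : "off",
	       (state & LOCKS)    ? "on" : "off",
	       (state & SECURITY) ? "on" : "off");
}

int main(void)
{
	punch_in(LOCKS | SECURITY);          /* punch in 6: going to bed */
	punch_in(LIGHTS | LOCKS | SECURITY); /* punch in 7: off to the store */
	return 0;
}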
It's not inconceivable to teach from the atomic level. I've tried to do this a couple of times, but your average Joe gets too bored. Which I understand: Arduinos are cool, but most people don't want to use them beyond picking up a "shield."
A blacksmith must have an understanding of metallurgy. That's the boring part to people who aren't blacksmiths. :) In a way, a blacksmith is a practical chemist. A "computer scientist" may need a deeper, more functional understanding of the physics of the quanta, but a programmer needs some portion of that understanding as well. A cert-carrying Cisco engineer, some Novell guy, someone humping footballs writing netcode? Not so much. It might help, though.
It certainly does. I guess your argument makes sense if you apply the Pareto distribution, but employers are kind of sick of getting the 80% of applicants who understand 20% of the big picture. Sure, they can do scripts and such, which is the main task, but sometimes the task calls for a little more than the bare minimum. That also follows the Pareto distribution: it's not every day, but it's far from rare either, and it's causing problems.
On the subject of teaching and prediction of performance:

So, what works in other disciplines? That's the subject, right? "How do we make this betterer?"

History is a good subject to start with. It's all about "starts" anyway. Most teaching seems to include a lot of it in early classes. "The history of our subject, part one" is usually <insert class subject here> 101 level. Every textbook is full of "the history of the evolution of the making of the design of the study of the idea that is the result that we now call _____." Surely, that's important in some fundamental way besides an instructor bragging on how far we've come since the days when we actually had to chase our food.

I've had friends, mostly women, who decided to go into the profession of teaching. I can only assume, since I paid little attention, that such subjects start off with discussing Plato and a bunch of Greeks arguing with each other as students watched.

But, I also assume that they don't actually... do this. They don't move chairs around and then get students to stand up and start "teaching" in Teaching 101. If they don't do that, how can they be reasonably expected to understand how teaching has evolved and how it has improved? Or... has it?

If Computer Science is looking for a better way to teach and, perhaps, be able to predict a student's success given a set of teaching conditions, it may need to look to other professions, even ones some would consider to be unrelated. How do these professions achieve their goals with success? Copy that. Paste it. Save.
Well, there are places in programming where that works, and places where it doesn't. Success in another profession usually means knowing the information your rival professional doesn't know. That isn't the attitude we have with technology. Instead, we've tried to measure performance by "lines of code," without comparing it to the complexity of the problem. With math, this is akin to saying the guy who can solve 100 addition and subtraction problems in an hour with blocks is better than the guy who can calculate the square root of 10 numbers to 20 decimal places in his head in an hour. You would imagine they'd abandon such a system, knowing that even if they compensated for the difficulty of the task, they still couldn't compensate for other factors, like bad management assigning the wrong tasks to the wrong people and then not even letting people whose code depends on each other communicate. Nope, Microsoft committed to that idea anyway. We have to hire a professional to evaluate the craftsmanship and stability of a house, and we've been living in houses for how long now? Now try evaluating someone's code.

But when it comes to the history point, that's where my idea of teaching from the electron comes in. Teach from that, skip the vacuum tubes we don't need anymore, land on the language of the day, and you have your history. Teach the stages to at least a level where people can see them functioning (get out a multimeter, grab a few transistors, grab a few quad-op chips and LEDs, grab some switches, grab a basic adder, grab an Arduino, and show the stuff working exactly how you describe it). What's cool about this is that we don't have to rob a museum to get our ancient artifacts of the field, and the prices are reasonable. Heck, for less than the price of one student's textbook, a class of 20 can have their own artifacts.

I think what it boils down to is what one comp sci guy said (I wish I could find the video): people aren't tinkering anymore (and my arguments in this post boil down to the fact that we're straight-up discouraging it), so people are having trouble understanding the lower levels that make up what they teach. My teacher in high school clearly didn't even know what assembly looked like (I had it on my screen after "finishing the book"); he was straight-up teaching the entire class that it looked like hexadecimal. He wasn't too pleased when the programming language on my screen didn't look like hexadecimal and I called it assembly. Granted, I live in a rural area and it wasn't college, but the United States is mostly rural. This is what the college professors are picking up. And I've indirectly found out through some students that it's this bad in colleges, too. It's ugly.
This is the kicker with computers, and why it's especially difficult. We see it a little with Arduino, but in reality we both make and use the tools. This topic is often hard for me in terms of game dev. I really want to make a game that's moddable at the core. Thus, I'm a tool maker, not a tool-using game maker.
I had a similar idea. I wanted a game that I couldn't predict and one that readily accepted complexities and responded to them. And, not just through some RNG, either. I wanted a game that built its own rules and changed them in response to "x" much like a person would, but still within certain limits. (Basically, "AI.")

A long time ago, in a basement far far away, I started coming up with a grand strategy modern warfare game. It was all just numbers with some primitive graphics, but I had loads and loads of data on units, strengths, weaknesses, terrain, etc. But, I sucked at programming, so it never got to the point where it would "do." I made a character generator and a sort of adventure maker for D&D, instead. That was easy. :) Thus ended my dream.
Which is a shame. Yet we see these things happening now, so they're ultimately possible; you just need to get back to the "do."
You'd be surprised how well a race-car driver knows their automobile...
Well, kinda. I had a race-car driver that would constantly solicit me for sponsorship. We bought some tires for him and he plastered our name on the side of his car. Tiny letters. You get bigger letters if you pay more. :)

Race-car driver "professionals" are Professionals. They have an in-depth knowledge of their field including how to change blinker fluid and all the other stuffs they should know about their tools. They have a specialized knowledge and capability inside their field, though.
That's the thing, though: programmers should be that specialized, and oftentimes they're far from it. Someone who drives a truck should be specialized like that, too. Someone who just drives to and from work should know how to fill the tank, check the oil, and other basics. If your profession uses a computer (which, in modern times, it usually does), you should be able to do at least basic scripting or something, to build some degree of self-sufficiency. I'm not asking them to implement PGP encryption on data on the fly or something. I'm just asking that they can "git commit." Yet, in this thread, that seems to be too specialized even for degree seekers.
A point spawned from another post, can't remember whose or where: As a college student, I once got offered a job working on a classified military project, simply because I could "computer." WTF? My "do computer" was pretty much limited to "make this fookin' thing run game and make this other thing get pr0n from teh bbs." And one of the leads on the project, who wasn't computer literate, was scrambling to get staffed. I told him I sucked. He didn't care. "We'll get you taught, just show up and you're golden." The project got beaucoup funding and is all over the place today, revealed in the open, mostly, and still managing to get bunches of dollars thrown at it. Eventually, it'll all hit "sci-fi-reality" in a few years. I keep kicking myself whenever I think about it. But, no regrets, really, as I have no desire to make more of an idiot of myself in public on that scale. I'll just stick it out here in OT and accomplish much of the same thing, with less volume. ;)
That stuff scares me, too. The general idea is: don't go near anything that has been touched by a project with that level of stability, otherwise you might not remember it.
JSDD wrote:"ability to learn" is an inherent capability of all living creatures, isnt it ?

"programming" is nothing more than a sequence of instructions, what varies is the "language" in which these instructions are written ... i'm sure that every monkey can learn "program" a machine to give them a banana if you teach them the language ... there's no magic behind it
You'd think so, right? We do call some people "code monkeys" for a reason, but they seem to be hard to find, according to businesses. Personally, I just think businesses are looking in the wrong place, and people desiring to learn for real are also putting their trust in the wrong place. I'm sure there's some good education out there, somewhere, through the traditional college route, but it's not working, and that's not specific to programming. I think saturation is a big contributor to the problem.

And thanks to all the talk about top-paying jobs, I imagine people in law and med school have similar proble--oh wait, I just remembered that my girlfriend was told her menstrual cycle (affected by the endocrine system) had no connection to her thyroid (an organ in the endocrine system). She went to nursing school a month or two later, then apologized for making fun of me for calling BS. Within a week of starting classes, she was learning about the endocrine system and how finicky it is. It only took me five minutes, a guess, and some googling to cross-reference reputable articles, plus the general knowledge that the endocrine system is a bit touchy (thanks to all the meds people take, you can see imbalances in people all the time, where one med leads to another med).

I tell people all the time that I'm not all that smart or knowledgeable, so me having a better track record than a professional medical practitioner in this area isn't me praising myself. It's me insulting the doctors. I could go on, but that was the most recent story.
