" there seems an emphasis ... on using DEF PROC, familiar to BBC Micro owners, to define subroutines"
Back in the day of the board computers of the late 1970s - your Scrumpi, your Nascom 1, your UK-101 et al - it was customary to build a case for them out of wood. If you were a better-equipped "constructor" - what we used to call "makers" in those days - you'd build a box out of metal. Folk like Tangerine offered optional cases …
That sparks a very faint glimmer of recognition in my mind - recognition of a term I had absolutely no idea the meaning of back then, and am none the wiser now; though as a grown-up I'd hazard a guess it's an abbreviation for "define procedure" or "process". I'm probably wrong. Perhaps it is "cool public relations orange county".

The problem was that IT teaching back then fell to the maths teacher and basically (no pun intended) consisted of typing out code from a book. Or maybe the problem was just me. Either way, I rarely understood a line of it; and being a slow and bad typist, the computer rarely understood me. To me it seemed like unintuitive abbreviations of unintuitive words, and if the computer could understand weird, obtuse stuff, why didn't they just make it understand words that were easy to understand? It really did seem that it had all been deliberately designed to be as incomprehensible as possible. I think most of the class was the same, but a couple of kids just "got it" and raced a couple of books ahead of everyone else. They are probably .com millionaires now. Needless to say, I don't have anything to do with code as a grown-up.

I kind of like the idea of Raspberry Pis and similar as I love gadgets and Making, but unfortunately the memory of programming misery the first time around is still enough to keep me from dipping a toe in the water at this stage. I'd like to think that the education system has had a lot more time to get to grips with teaching programming effectively; but to be honest I still think it is far, far more important & useful to teach kids how to use web browsers & Office than how to code. There will always be a couple of adept kids honing their skillz in their bedrooms & after-school classes who will become the programmers of the future; but for the majority, knowing how to use software is more important to their career prospects than knowing how to make software.
Probably the wrong place to air a view like that - but I come here mostly for EMP weapons, molten salt reactors and Playmobil reconstructions...
No. No. No.
The key question is: How much BASIC can you cram into *ONE* line of code?
What's the character count maximum per line of BASIC?
About 240 characters in just one line of BASIC code should be enough for any project, such as replicating SAP in its entirety.
At one time Motorola used to make a 6800 variant with a built-in terminal interface (LILbug).
It would be nice (I think) to have an equally low level kit today, as even the Pi and the Arduino have a considerable learning curve at the machine code level. Maybe I'm just extremely old fashioned, but I feel that with these new toys there are just too many levels of abstraction between monitor and LED. The essential connection between machine code and I/O is lost.
To use the usual analogy, the Pi is like a small but very customisable car, but where is the learning equivalent of the bicycle?
It is important that we still teach machine coding at some level. I have included coding very simple programs on a simulated microprocessor in our course "Introduction to Computing Science". It helps people understand what goes on "under the hood" when coding in C (in the course "Imperative Programming" running in parallel). These simulators can run on the Pi or Arduino controllers, I suppose. In particular, simulators can show what is going on graphically, and that helps understanding as well.
Procedural languages, Functional languages, OOP, etc., are mostly just different kinds of organisational scaffolding. No mainstream CPU actually gives a toss about any of that stuff; it all ends up as machine code in the end. Often on an Intel or ARM CPU, neither of which care one whit about whether you like to organise the original code in your source files as subroutines or as objects. Same meat, different gravy.
None of that stuff matters.
What matters is understanding how computers "think", because programming is just a synonym for "translation" and is actually pretty easy to learn at that level. I was far more productive coding Z80 or MC680x0 code in assembly language than I ever was writing in C++. I used to be proud of writing bug-free code, and it really *was* bug-free. But those days are long gone. The hardware has become orders of magnitude more powerful and capable, but the tools we use to program it all have barely changed since the flint axes of the 1970s.
It's 2013 and we're *still* writing code using artificial languages that require us to walk on eggshells due to their cruel and unusual punctuation. My *phone* can render a Mandelbrot set in real-time and run full-on 3D First-Person Shooters at HD resolutions, and yet we insist on forcing *humans* to do stupid grunt-work like adding a semicolon at the end of a line to save the compiler from a picosecond of calculation? How the hell is this even acceptable? How is this not front-page news in The Register and all its rivals? THIS is the scandal of our time.
No wonder today's software comes wrapped in legalese instead of warranties.
Compilers can cope with syntax errors - they tell you there's a problem and to go away and fix it!
The problem with automatic syntax correction is that the correction may be ambiguous, and the automatic corrector may have a different idea of which fix to make than the engineer.
Better for the compiler to say 'I think this is the problem' and let you review and fix the error rather than blindly fixing it so you have no idea where the fixes are.
Don't talk to me about indentation levels in Python. Especially when someone uses an editor with spaces to modify code originally indented with tabs.
A stupid, stupid, stupid design decision.
Like making the assignment operator in C '=' and equivalence '=='. AND making assignments within conditional statements valid syntax!
Like making fall-through the default in switch statements.
Utterly brain-dead decisions that have cost millions of man-hours of debugging time.
Make the C assignment operator ':=' like Pascal. Allow fall-through only if you add 'continue' at the end of the case block. And just fucking introduce braces to Python. It's FAR easier, and visually more obvious.
> stupid grunt-work like adding a semicolon at the end of a line
That is (possibly) a task for the IDE, definitely *not* the compiler.
I really haven't got the faintest idea what programming a computer is supposed to look like by now, but it sure as hell isn't even in the same galaxy as what we are still using - yes, that includes whatever happens to be the cutting edge these days. Think of it for a moment: we all possess incredibly powerful machines now capable of incredibly useful stuff - but 99.99% of us can only use them for things someone else considered worthwhile doing.
Yes, there's no way to escape having to define rigorous logic, but someone able to clearly articulate the logic he wants implemented should never be required to master obscure and arcane tidbits of syntax, APIs, frameworks, and a thousand other arbitrary language and compiler idiosyncrasies. Right now, a compiler can't even point me to the ACTUAL syntax error I make - it just dumbly panics on the NEXT structural block that doesn't sync up. You need to be an anointed High Priest of Coding just to figure out what the actual error you committed is while the compiler whines on about something apparently entirely unrelated. This is incredibly wrong.
We need a usable, robust way to handle arbitrary levels of complexity - and OOP in anything like its current form isn't it. We need true code reusability, not the joke we have today that breaks without even touching it on a tool version change (let alone on a move to ANY different environment). We need tools that can proof logic and help pinpoint errors efficiently when they happen, tools that understand that complexity, not merely point to the instruction it happened on. We need standardized ways to implement unambiguous interfaces that can contain and encapsulate that complexity and mechanisms that deal with the unforeseen more effectively than the occasional tentative try-catch.
We'll know we got there when a randomly crashing flash plugin or a bloody Facebook button on an unresponsive internet link won't effectively freeze my entire Firefox window motionless, all 101 tabs of it, for as long as it wants; when I'll be able to effortlessly tell which recalcitrant piece of garbage causes my PC to apparently twiddle its thumbs at 98% idle while it's taking five minutes to boot; when instead of a blue screen or a stop error I get a tasteful notification saying "your webcam driver has repeatedly malfunctioned and has been unloaded"; but most of all, when all that just stops happening altogether.
...the funny thing is, there is an opportunity of staggering proportions here, and whoever finds a way to cater to it would overnight become bigger than Microsoft, Google and Apple put together. A pity that nobody seems to know how.
I want one, even though I stopped fiddling with electronics many years ago. So wish I had one of these when the first computers were coming out. I can easily imagine this sitting gathering dust in my home, just waiting for me to start that great home project (sigh). Still, nostalgia isn't what it used to be.
I used to have the Philips EE Electronics Engineer kit when I was a youngster (liberated from my brother) and it led me into a career in electronics, computing and now software. So who says these kits won't encourage today's youth? Something like this is so much more practical in schools than just a bare board Pi and a bunch of connectors. Most teachers don't have the time or the resources to assemble a workable experimental kit and develop teaching material to support it. This at least goes half way there.
"Philips EE Electronics Engineer kit"
The one where the transistors (a BC148 and a BF194 if I remember correctly) were mounted on little square PCBs and you built everything on little springs? I had one of those for Christmas when I was 11 and ended up with a career in electronics (which now seems to have degenerated to purely software) too!
The one I had (possibly an older version) had an AF116 and two AC126 transistors and these were raw unmounted components. Everything was built on a set of springs and clips pushed through a board which was overlaid with a circuit diagram with holes to place the springs at component junctions. Wonderful stuff, transistor radio, electric organ, flashing lights, etc. The disadvantage of the components not being set up as little modules was that the wires could break off after being used several times, especially before I learnt not to bend them too close to the component. For me, at least, this disadvantage was more than made up for by the fact that off-the-shelf replacements were cheap and easy to get, and you could also get bits not in the original kit and ad-lib. It was the perfect excuse to haunt establishments like Henry's in Edgware Road or Proops down at Tottenham Court Road. None of the emporia in Harry Potter's Diagon Alley could hold a candle to these for MAGIC! And now you can get all this and a computer too...
Your tales of visiting Henry's and Proops brought a little shiver of excitement to me as I remember my childhood days in Luton going to the electronics shop (I think it was in Leagrave Road) to buy components (all discretes I think in those days - if you wanted TTL you had to get it mail order) - ahhh ... the magic of a shop that smells of electronics.
Of course, it has to be an independent shop where the owner knows exactly what's what, Maplin's never has that magic. And Maplin never has that authentic smell - probably because everything that should smell of electronics is shrink wrapped!
On the Philips kit, I'd forgotten about the circuit diagram templates that told you where to poke the springs through.
The BC148 and BF194 (rings a bell) silicone based kits had their trannies on little PCBs (some with heat sink?) because their geranibum based predecessors (AC126, AC128, AF 116?) used to have factory-fresh transistors whose flying leads didn't always last all that long under the stress of repeated insertion and removal.
More at http://www.hansotten.com/index.php?page=ee8-a20-e20
The Interwebs still hasn't found me a copy of the Meccano Magazine articles on their modular digital computer which if I remember rightly was based on OC71-class stuff. Maybe it's out there somewhere but a quick look at e.g. http://meccano.magazines.free.fr/index.htm hasn't found it for me yet.
Yes I know it's not really silicone or geranibum.
If it were just a keyboard and breakout port, then it'd be so so.
The addition of a breadboarding kit in the top, inspired.
The inclusion of a pre-loaded SD card with a complete dev environment, genius!
To me this completes the Pi 'package'.
Shame it's a tad expensive all considered.
"Or as a great way to put the Pi safely into schools."
I went to the audiologist for my Tinnitus and they gave me a noise generator to put by my bed. (white noise helped me sleep). But oh no! "we cannot give you the power supply 'cos of health and safety".
It runs on 6 volts FFS!
So I doubt if any Pis will ever be seen in a school.
Edit: Just checked and it uses 6V at 200mA
I don't know what aspect of H+S your audiologist was referring to, but here are a couple of facts:
H+S understands all about safe voltages, the SELV (safety extra low voltage) specification allows voltages up to 70V absolute max to be put onto touchable connectors, this is known to be safe.
Supplying mains power adapters to members of the public requires that they are CE marked, which in turn requires they are tested against a proof voltage of several kV, they conform to EMC requirements, they don't overheat and (I think) they are fused or in some way protected against overdissipation.
PAT testing is used in addition to this if the parts are to be used at a given premises - a school or factory or office - and checks that each of the relevant type-approved items is not faulty.
I suspect it is this requirement that stops them offering you your power supply. If I were them I would ensure it uses a standard micro-USB then it can be your responsibility to source and use the adapter.
I'm pretty sure that as long as it's CE marked it can be sold in the UK to anyone; PAT testing only comes after the equipment is in use, usually as part of an annual inspection of electrical equipment.
Maybe your audiologist didn't get, or couldn't hear, the appropriate advice?
> H+S understands all about safe voltages, the SELV (safety extra low voltage) specification allows voltages up to 70V absolute max to be put onto touchable connectors, this is known to be safe.
I could be wrong but I thought SELV was max 50V, but my memory is not what it used to be.
But a tad out of the RPi price range, I feel. And I'm not exactly a RPi advocate despite owning one of the first models (I'm not an advocate BECAUSE I own one of the first models)... just too much geek-cred and not enough educational focus.
This brings it closer but I still can't see kids wanting to hack on it much. The breadboard is inspired, it has to be said. But £70 plus all the internal faffing about and USB power (which is already a problem)... it's sad. Maybe if it came down in price a little I'd treat myself to one, but it's fallen at the usual hurdles for RPis in education - no real documentation, no focus, no support, just a geeky device and "there you go".
Think of 30 in a class x £180 = £5400 - Sure, you could do it cheaper if you have someone on staff "who knows", but you can get a netbook trolley for that, or a ton of tablets, or even an entire ICT suite if you're careful (I know - I've done it for that price). Let's not mention that you might need display devices too, if you want to do anything useful, and having the kids reaching down the back of the ICT Suite machines to disconnect VGA/HDMI cables to use the PC monitors probably isn't the best idea. And - again - the lack of educational support materials is going to hit hard unless you have someone "who knows". And schools with someone "who knows" will be doing all this stuff cheaper and easier already.
The BASIC is a good idea but I thought that we weren't supposed to teach children that anymore? Every time I mention it to other programmers, I get universally derided for doing so. Isn't it supposed to be the antithesis of good programming? (Note, I believe that all to be rubbish, personally).
It's a REALLY cool gadget. For me. For schools, etc.? Not so much. Same as the RPi.
Really I don't see this as the solution for the Computer Science / Computer Studies / ICT (or whatever it's called these days) room. More something for an electronics project lab where a few students may be working together on a project, or where not every project needs to be microprocessor based, so equipping enough for a whole class to use individually may not be necessary.
Biting the hand that feeds IT © 1998–2019