IoT puts assembly language back on the charts

Let's do the time warp again: according to an outfit that tracks programming languages, the Internet of Things is re-igniting demand for assembly language skills. Software consultancy TIOBE has turned up the re-emergence of assembly programming in its monthly Programming Community Index (the definition of the index is …

  1. Destroy All Monsters Silver badge
    Gimp

    You can "learn" assembly?

    No you can't.

    "Assembly" doesn't even exist as any "assembly" is completely chip (and might I say, board dependent) and using it is anyway similar to killing cockroaches by throwing lego bricks at them. While being gimp'd up.

    But the assembler (the program) and the editor to write the code can fit into 15 KByte. That's a plus.

    Anyone who expresses the desire to "learn" it is on a hiding to nothing. You can just "do" it, and best print out the PDF manuals first, because there will be much cursing and flipping through pages.

    I also fail to see who would want to use this approach in 2016 even for the IoT unless stuff is being shat out so quickly that there isn't even enough time to wire up a proper compiler backend to generate the binary specific to said "Thing". Go figure.

    1. J.G.Harston Silver badge

      Re: You can "learn" assembly?

      Something that's so small that it doesn't need (or have space for) a runtime library? Or for actually *writing* the runtime library and CPU startup code that you need to actually get to the high-level language code.

    2. Mark 85

      Re: You can "learn" assembly?

      If they're using the Apollo lander code as an example... it's tight code. Nothing fancy like startup code or libraries. Just pure "do this".

      Writing for the assembler isn't the hard part. It's writing for the chip and board. Methinks if it's embedded, they're not giving the coder a lot of space.

      Hmmm.... maybe this is one reason security is crap on these devices. The attitude is "oh, we only have enough space for the make-it-work stuff... something else will have to provide the security."

    3. Old Used Programmer

      Re: You can "learn" assembly?

      More like... instruction-set based. I've programmed in SPS, Autocoder, Compass, and ALC. All of them assembly languages, and none of them ran on microprocessors.

      1. BillG
        Holmes

        Re: You can "learn" assembly?

        I program in assembly because I can. It's an exercise that makes me a better C programmer because I can get a feeling for what the compiler is doing.

        In the end, it's not the language, it's the programmer behind the language.
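
        As a rough illustration of that "feeling for what the compiler is doing" (a minimal sketch; the assembly shown is hand-written AVR flavour, not captured compiler output):

            /* A trivial C routine and, below it, the kind of code an
             * AVR compiler might emit for it. Illustrative only. */
            #include <stdint.h>

            uint8_t add_one(uint8_t x) {
                return x + 1;
            }

            /* Plausible avr-gcc output:
             *   add_one:
             *       subi r24, 0xFF   ; AVR has no add-immediate, so the
             *                        ; compiler subtracts -1 instead
             *       ret
             */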

    4. Jeffrey Nonken

      Re: You can "learn" assembly?

      ... Yes, and no.

      Yes, assembly is tied directly to the architecture, so varies accordingly. Learning one is NOT learning all.

      No, because there is a paradigm, a way of thinking, that is different from higher level languages. Once you learn how to program in machine language, moving to a new machine is mainly just learning the architecture, syntax and op codes. I know, that sounds like all of it, but it's not really... There's that difference in thinking from other languages.

      I was lucky enough to have access to a very simple machine when I was young, an IBM 1620, where I was able to teach myself its machine code. In spite of the fact that it was a register-less system, applying the understanding to new CPUs has never been a problem for me. Generally takes me about a week to get comfortable, a month to get proficient.

      Then again, I'm one of those weirdos who LIKES programming in assembly. So maybe my experience doesn't count.

      Anyway, I agree it's not a language per se, but I feel that knowing a machine language gets you closer than knowing only high-level languages. (Or 'C'. Which is basically an abstraction layer over machine language. IMHO. Still protects you from the underlying architecture.)

      Sorry you don't like it; you are in the majority. Your contempt is noted.

      1. Anonymous Coward
        Anonymous Coward

        Re: You can "learn" assembly?

        In the 1960/70s you needed to program in assembler for about three very different hardware architectures - before you were mentally prepared to meet even more variants.

        Using an assembler was sometimes a luxury; in an emergency in the field you had to write the machine code in hex/octal on a sheet of paper, calculating all the data/jump addresses yourself. The binary was created by punching holes in paper tape, sometimes using just a simple dibber tool. On occasion you even had to enter the program on the key switches of the mainframe's engineers' panel.

      2. VeganVegan

        Re: You can "learn" assembly?

        I totally agree with you. My 1st job was to do assembly programming. It taught me how to go about learning new languages, low-level and high.

        Assembly language programming itself is not particularly difficult. It's just damn demanding, in a tedious sort of way: you have to take care of bloody everything yourself. Most of the time, you don't get to call on some module written by someone else, because it simply doesn't exist.

        Maybe with the IoT demand, there will be the equivalent of modules that will become available. That would be nice, if you can trust it, that is.

    5. Kevin McMurtrie Silver badge

      Re: You can "learn" assembly?

      There are, in fact, assembly language courses at schools. They cover techniques for managing the call stack, passing parameters, multi-threading and interrupts, building and parsing data structures, breaking down mathematical formulas into bitwise operations, macros, optimizing instruction pipelines, virtual addressing, various means of interacting with hardware, and playing nice with an operating system. The details vary with each system but the basics remain the same.
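
      For instance, "breaking down mathematical formulas into bitwise operations" usually means exercises like this (a minimal sketch in C, the way you'd decompose it for a chip without a hardware multiplier):

          #include <stdint.h>

          /* Multiply by 10 using only shifts and adds:
           * 10x = 8x + 2x = (x << 3) + (x << 1). */
          static inline uint16_t mul10(uint16_t x) {
              return (uint16_t)((x << 3) + (x << 1));
          }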

    6. Anonymous Coward
      Anonymous Coward

      missing the forest for the trees

      Assembler is interesting, yes, but it doesn't change the fact the IoT can go fsck itself, as well as the 1%ers who want to data mine all the peons while the only ones who will be able to afford appliances with privacy are said 1%ers. It's coming, have no worries. They will sell it to the mouth breathers by using terms like "smart appliance X" and other whiz-bang marketing terms. Look no further than Android and Windows 10 to see said Trojan-horse free "goodies". Already your car's mandatory black box can be used against you by the megacorps. It's not like the Millennials understand privacy, so worst case they have to play the long game a bit.

  2. Anonymous Coward
    Anonymous Coward

    I have 30 years' experience of C and assembly, so that should cover me coming and going. So why won't anybody pay me to use those skills? Instead everybody demands 30 years' experience of being paid to use Java.

    1. DainB Bronze badge

      The most probable explanation is that you're applying for the wrong jobs.

  3. Anonymous Coward
    Anonymous Coward

    Assembly is OK, but...

    ... can we use DevOps to be productive and proactive and six-sigma-hyperconverge (or whatever the current BS terms are) on it?

    1. DropBear
      Trollface

      Re: Assembly is OK, but...

      Sure, but only if you've got at least five years of hands-on experience with Agile DevOps. As a service. In a Docker container. Oh, you don't...? Shame...

      1. bombastic bob Silver badge

        Re: Assembly is OK, but...

        "but only if you've got at least five years of hands-on experience with Agile DevOps. As a service. In a Docker container. "

        'What Color is your Parachute?' indeed... gotta find a way to get your resume to the HIRING MANAGER, an individual who understands enough about software and engineering to recognize talent, even when the resume doesn't have the "buzz words" that H.R. weenies live to screen by, or a 4-year degree (even in something irrelevant). "Wow, that guy has a basket-weaving degree! I bet he makes a GREAT programmer!"

        Smaller companies that don't have H.R. departments are your best bet anyway, especially for contractors.

  4. Denarius
    Flame

    Cobol, jokes?

    Not half as much as that disease of PHB dreams and affluenza, the abomination and love of all hardware vendors, java. In fact, all IoT should be coded in Java. At least there would be no security issues.

    OTGH, assembler teaches programmers to be precise, to plan ahead, to design. You know, all those skills oldtimers exercised as a matter of routine before the invention of RAD tools, and before resource considerations got consigned to the dustbin of history by OO code ideals.

    Horns of a dilemma here: either modern coders finally get to do it right and we lose privacy, or crap code continues with the only winners being the hardware floggers.

    I am still outraged that my newest PC has 80 times the RAM, double the cores, 100 times the network speed and 100 times the disk of the payroll server that ran an entire department only 20 years ago, yet is slower than the XT I used 30 years ago, the 386/486 of 20 years ago, and the 7-year-old current Linux box. It just got the latest Win10 patches - bigger than some entire hard disks of not so long ago.

    1. Pascal Monett Silver badge

      Re: In fact, all IoT should be coded in Java. At least there would be no security issues.

      Right, because it's the language that decides if there can be security issues, not coders writing sloppy code.

      I can't agree with that. In my opinion, coders the world over have demonstrated a disturbing knack for creating security issues in any language.

      1. This post has been deleted by its author

        1. Denarius
          Facepalm

          Re: In fact, all IoT should be coded in Java. At least there would be no security issues.

          @Pascal, perhaps the implied sarcasm tag was not clear enough. My apologies. I was thinking of small RFID-tag IoT. Vendors of bigger stuff have already demonstrated that (a) they have no interest in _any_ security, (b) they probably have no clue how to implement it, and (c) poor-quality coders seem attracted to Java. Something about the language instigates poor-quality practices, even if it is not implicit in the language itself.

          Must be losing my sarcasm ninja skills.

    2. nematoad
      Thumb Up

      Re: Cobol, jokes?

      OTGH,

      +1 for the Niven and Pournelle quote.

  5. Enno

    It's IOOPS not IOT in my opinion anyway...

    From what I've seen so far it's all about corporates getting their grubby little fingers on data about me. So sell me, say, a thermostat and make it phone home constantly. You know, for me.

    It's not IOT, it's IOOPS, the Internet Of Other People's Stuff.

  6. Anonymous Coward
    Anonymous Coward

    Assembly Language?

    Haven't touched it since 1985.

    I can't see its relevance to run-anywhere processor/platform independence.

    Is this another marketing press release dressed up as news?

    1. heyrick Silver badge
      Happy

      Re: Assembly Language?

      I write ARM code for RISC OS. More "because I can" than actual necessity. There's no such thing as platform independence in the IoT world. All the ARM SoCs are subtly different, and on the larger scale there are dozens of microcontrollers, each one different. You could have a common platform by running some sort of Java, but that's a lot of power and processing going to waste - not luxuries one always gets.

    2. Pascal Monett Silver badge

      Re: run-anywhere processor/platform independence

      That is a concept dedicated to full-blown computers with powerful CPUs at 3GHz or more, gobs of RAM and a fair amount of storage space. And even there, it's not guaranteed.

      IoT is a world of microprocessors that run in the kHz range, next-to-no RAM and zero storage space. There is no possibility of platform independence here, unless you're ready to buy a toothbrush that is connected to your PC.

      So, you're right, there is no relevance to run-anywhere in IoT. IoT is just-run-there-and-be-happy-you-can.

      1. Anonymous Coward
        Anonymous Coward

        Re: run-anywhere processor/platform independence

        I would say micro-controllers rather than micro-processors. But security will always be a problem because most programmers JUST DO NOT CARE. Their goal is to crank out code -- whether assembly or HLL -- that sort of works: no design, no security assessment, no reviews. Even the simple route of writing it out in an HLL (or pseudo-code) and then reducing it to assembly is rarely followed.

        (Incidentally, some of the best assembly code I have ever seen was written for 370s. One of the most insane projects was trying to write scientific code for an IoT chip startup, which supplied us with a prototype and an assembler that (1) core-dumped when it reached the first unknown opcode and (2) changed with every release sent to us, with previously legit opcodes now gone.)

    3. itzman
      Holmes

      Re: Assembly Language?

      Haven't touched it since 1985.

      Have you programmed any PICs, etc.?

      I can't see its relevance to run-anywhere processor/platform independence

      IoT won't be made of run-anywhere, processor-independent code.

      It might have a generic C library on top, but to talk to the (custom) hardware, like as not you will need assembler, and to write the lowest-level C library functions.
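
      Those lowest-level functions tend to look like this (a minimal sketch; the register addresses and bit mask are invented for illustration, not taken from any real part's datasheet):

          #include <stdint.h>

          /* Hypothetical memory-mapped UART registers. */
          #define UART0_STATUS (*(volatile uint8_t *)0x4000u)
          #define UART0_DATA   (*(volatile uint8_t *)0x4001u)
          #define TX_READY     0x01u

          static void uart_putc(uint8_t c) {
              while (!(UART0_STATUS & TX_READY)) {
                  /* spin until the transmitter is free */
              }
              UART0_DATA = c;
          }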

      1. Destroy All Monsters Silver badge
        Headmaster

        Re: Assembly Language?

        It might have a generic C library on top, but to talk to the (custom) hardware, like as not you will need assembler, and to write the lowest-level C library functions.

        This is 2016, not 1975, when "compilers" and "parsers" were magic shit that had to live on tapes and function in 32KiB RAM.

        Your statement just means you don't use the right DSL or can't be arsed to write one.

        And anyone who calls out for "tight code" in IoT nowadays is a few beers short of a sixpack. Unless the memory is expensive because it's radiation-hardened...

    4. bombastic bob Silver badge

      Re: Assembly Language?

      "I can't see its relevance to run-anywhere processor/platform independence."

      yeah, I guess you'll just have to continue coding C-pound for ".Not". Which was STILL under 5% on the TIOBE index, from what I can tell.

      On a related note, really good C code is a close second to tiny assembler code for microcontrollers. But sometimes you have to at LEAST do inline assembly, especially to take advantage of some special instructions (like in a bootloader, flashing NVRAM - did this for a project that ended up on GitHub, an XMega port for the Arduino IDE).
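
      A minimal sketch of that kind of inline assembly, assuming avr-gcc: AVR's SWAP instruction exchanges a register's nibbles in one cycle, which plain C has no direct way to say.

          #include <stdint.h>

          static inline uint8_t swap_nibbles(uint8_t x) {
              /* "+r": x is both input and output, in any register. */
              __asm__ __volatile__("swap %0" : "+r"(x));
              return x;
          }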

  7. frank ly

    re. the Twitter link

    I like the bagless Dyson soul harvester. It's the ultimate IoT device.

  8. Dan 55 Silver badge

    Why Assembler?

    Classic C will do the job, Shirley? Unless you get a better class of coder with assembler who's more careful about buffer overflows and so on, but I doubt it.
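
    The overflow in question, for anyone who hasn't been bitten (buffer size invented for illustration):

        #include <stdio.h>

        void greet(const char *name) {
            char buf[16];
            /* strcpy(buf, name) would overflow buf for long names;
             * the bounded version truncates instead of scribbling
             * over the stack. */
            snprintf(buf, sizeof buf, "%s", name);
            puts(buf);
        }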

  9. hattivat

    Stop, just stop

    Stop giving that joke of a ranking known as TIOBE publicity, it is nothing more than a glorified Google search rank graph. If you do that, perhaps it will finally suffocate and die.

    As anyone who has actually looked for a job in the last 10 years can tell you, Java is not 3 times more in demand than C#, C is not 3 times more popular than Python, and C++ is definitely not two times more in demand than Javascript. TIOBE's high ranking of C is due to their inability to effectively separate searches for C-as-a-programming-language from searches for "C-suite", "C-class driving license", "Arthur C. Clarke", "Boeing C-40 Clipper", and so on. If you want actual data, look no further than here: http://www.itjobswatch.co.uk/

    These guys index actual job ads, not Google searches. As can be seen from their graph, use of assembly in jobs where you actually get paid is in long-term decline, and currently the demand for it is on par with the likes of Erlang (i.e. almost non-existent): http://www.itjobswatch.co.uk/charts/permanent-demand-trend.aspx?s=assembly+language&l=uk

    1. bombastic bob Silver badge

      Re: Stop, just stop

      "Java is not 3 times more in demand than C#"

      *sniff* *sniff* - smells like a Microsoft Shill

      I mentioned C-pound already, and that ".Not" _thing_ that goes with it. It deserves its "under 5%" ranking on TIOBE.

      whereas, the ENTIRE point was that IoT and other micro-controller-based projects *DEMAND* the kinds of low-level coding that assembly language lets you do. And if you're mixing inline assembler with your C code, it's the same *kinds* of coding as pure assembler [except you now get to deal with some of the quirky syntax things that 'inline' forces you to deal with].

      explained HERE for AVR processors:

      http://www.nongnu.org/avr-libc/user-manual/inline_asm.html

      it takes the *right* kind of coder to deal with this kind of thing, yeah

      1. hattivat

        Re: Stop, just stop

        What C# "deserves", and what I or you wish its popularity was doesn't change what its popularity actually is in the real world. I dare you to find an online job board in the UK that has three times more positions advertised for Java than for ".not", or even better, three times more positions for C than for Javascript. I don't believe that you will find a non-obscure one that has more such positions at all, let alone three times as many. The manglement in most companies is shit and they demand their workers use shitty technologies, deal with it if you want a job.

        The entire point is that start-downs will code their Internet of Shit gizmos in javascript and ruby, because that's "cool", whereas asking a 50-yo engineer's opinion on what actually makes sense is "lame", and there is little we can do about it.

        1. VeganVegan
          Happy

          Re: Stop, just stop

          I like that term, manglement.

    2. David Roberts
      WTF?

      Re: Stop, just stop - search terms?

      Did I just read that the search ranking for Assembler has gone up because they didn't filter out "self" and Ikea just opened a new store?

      1. hattivat

        Re: Stop, just stop - search terms?

        Unlikely to be that banal, but it could certainly be due to something like the CS course requirements being revised in say India to include a mandatory course in assembler 101. That would generate a lot of artificial popularity, just like Scheme and Logo were "popular" 15 years ago (I love Scheme BTW, no intent to disparage it here).

        In any case, the point is that the TIOBE "ranking" is entirely disconnected from reality, both in the professional world (as can be seen on any job board, e.g. Monster, Indeed) and on the enthusiast side of things (as seen on GitHub and in game-modding communities), and should not be taken seriously by any IT journalist worth their salt.

  10. Only me!
    Happy

    Assembler

    Soooo complicated that I was having fun with it over 30 years ago, creating moving graphics to go in time with the music for a night club... on a Spectrum!!!! They ran it till the Spectrum died a few years later, possibly due to smoke inhalation!

    *The need was because most songs did not have a video to go with them!*

  11. Paul Kinsler

    Apollo assembler code on github

    I can only hope there's some 80 year old guy out there who has a quick nostalgic browse, spots something that needs improvement, and submits a patch.

  12. Anonymous Coward
    Anonymous Coward

    Java on IoT devices is quite feasible

    Java Card is around and is being used.

  13. Down not across
    Thumb Up

    Apollo 11 AGC

    Thanks for the link.

    It will be interesting to look through some of that code and should keep me entertained for a while.

    1. Destroy All Monsters Silver badge

      Re: Apollo 11 AGC

      I recommend reading Digital Apollo: Human and Machine in Spaceflight for the full-fat context!

  14. Mike 16

    Run anywhere?

    Maybe I am just cursed, but in my (deliberately small, for reasons that should become obvious) experience, Java tends to be "run anywhere with _exactly_ the same version of JRE that the author used, on the same OS version, on the same hardware"

    As for "job hunts", I used to include 1401 Autocoder on my CV, to save my time and that of the companies whose HR droids would reject me for having gray hair (and little of it) after an otherwise fruitful set of interviews.

    1. Destroy All Monsters Silver badge
      Headmaster

      Re: Run anywhere?

      Java tends to be "run anywhere with _exactly_ the same version of JRE that the author used, on the same OS version, on the same hardware"

      Well, that's obviously untrue, otherwise the Apache project (or anyone else, for that matter) would never offer precompiled packages for download at all.

      I don't know where you guys have been...

      1. Mike 16

        Re: Run anywhere?

        Just in case that was an honest question, rather than rhetorical snark, my (very) bad experiences with Java have been with one of my banks (No, I don't mean javascript or JSP. Their "Secure Document Delivery" _required_ I run their application), two cases of "control software" for instrumentation, one on Mac, one on Windows, and a couple of IDEs that I eventually had to abandon and punt back to Open Source command-line tools. Like I said, my experience trying to run Java apps is limited, by choice. Perhaps the Apache folks are a lot better at writing portable code than the average bear. That would not be hard from what I have seen in a number of languages, written presumably by average bears.

        My point is that Java does not (despite its adherents' claims) somehow magically confer "run anywhere".

    2. Anonymous Coward
      Anonymous Coward

      Re: Run anywhere?

      I was considered for a pdp-11 programming job a couple of years ago, but was living in the wrong country at the time.

      1. Destroy All Monsters Silver badge

        Re: Run anywhere?

        You mean programming a PDP-11 emulator, Shirley?

        (Might even be running on a RasPi ... or the emulator might be written in JavaScript for optimal browser experience ... )

  15. JLV

    Hmmmm....

    Why does letting loose a bunch of programmers doing willy-nilly Asm on IoT make me think of ...

    https://www.youtube.com/watch?v=Rrm8usaH0sM

    i.e. what could ever go wrong here, when we can't even manage security with higher level languages, let alone C, most of the time.

    1. Anonymous Coward
      Terminator

      Re: Hmmmm....

      Yeah, it's often worthwhile to be sure about the algorithmic basics in a Thing On The Internet and leave the rest to the compiler.

  16. bjr

    Are there chips with no development support?

    I find it hard to believe that assembly code is ever necessary anymore. I started programming in the 1970s when memory sizes were just a few K (PDP 8s had 4K max, PDP 11s and Novas had between 8K and 32K, a monster PDP 11 system had 256K of mapped memory). When you have only 4K of memory then every bit counts and there is a reason to use assembler. You can do a lot of work in very little memory, but at the expense of supportability, i.e. not only is it hard to read your own code, let alone someone else's, but if you are really aggressive about writing tiny code you would end up with something as fragile as crystal. One common trick was to hold a state bit in the CARRY flag for several instructions. You could write code that would leave the CARRY unchanged so that you could do a branch later. Of course, if you were to ever try and modify code like that you would break it.
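
    Roughly what that CARRY trick looks like, reconstructed here in AVR flavour inside GCC inline assembly (the original machines were PDP-era; this sketch only shows the shape of the idea):

        #include <stdint.h>

        static uint8_t carry_trick(uint8_t flag, uint8_t base) {
            uint8_t r;
            __asm__ __volatile__(
                "lsr  %1      \n\t" /* bit 0 of flag -> CARRY: hidden state */
                "mov  %0, %2  \n\t" /* MOV leaves CARRY untouched on AVR    */
                "brcc 1f      \n\t" /* branch on the state parked earlier   */
                "inc  %0      \n\t" /* INC does not touch CARRY either      */
                "1:"
                : "=&r"(r), "+r"(flag)
                : "r"(base)
                : "cc");
            return r;
        }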

    C took over in the early 80s because it was almost as efficient as assembly code but it was portable and far less fragile. The slogan at the time was "C: all the power of assembly language with the ease of use of assembly language". After 40 years of Moore's law I can't imagine that there is any device out there that has so little memory that you can't do a better job with C than you can with assembly code.

    1. Mike 16

      Re: Are there chips with no development support?

      One nitpick first: The PDP-8 could only directly address 4K words (or 256? words, for a more stringent definition of "directly") but they (most models? I don't know of an exception) could use banking to extend that to something like 32K max. Later models could go even larger, maybe a Meg (I remember an ad featuring an elephant).

      As for "too little memory", I recently did a small job where the customer had already committed to a PIC with something like 96 bytes of RAM and 1.5K of ROM. I suppose someone, somewhere might be daft enough to program such a chip in C, but it would be wrong in so many ways. Among others, I needed to use coroutines to meet the spec. Also, in my experience, the compilers for such machines are not exactly "best in breed" ("Standards? More like Guidelines, really"), and finally, I have rarely met a person who is just fine with C but draws the line at assembly. If they have made a choice based on a real evaluation of the needs and capabilities, maybe. But most of what I see, (especially) including these comments, is "gut revulsion", not "real evaluation".

    2. Andrew Commons

      Re: Are there chips with no development support?

      An old friend of mine refers to C as the "gentleman's assembler".

      There are games you can play in assembler, such as manually overlaying use-once initialisation code with read/write storage such as I/O buffers, that I'm not sure you can play with compiled languages. If you are really memory constrained you grab at every straw :-)
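
      A flavour of that overlay trick, sketched in C (names and sizes invented for illustration; overlaying the code proper would need linker-script games):

          #include <stdint.h>

          /* Run-once initialisation storage, reclaimed afterwards as an
           * I/O buffer. Only safe because the two uses never overlap
           * in time. */
          static union {
              uint16_t init_tables[64]; /* startup only           */
              uint8_t  io_buffer[128];  /* reused after init done */
          } overlay;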

  17. ShrNfr

    BR 14

    END
