Q. What's today's top language? A. Python... no, wait, Java... no, C

Among developers, Python is the most popular programming language, followed by C, Java, C++, and JavaScript; among employers, Java is the most sought after, followed by C, Python, C++, and JavaScript. Or so says the 2017 IEEE Spectrum ranking, published this week. IEEE Spectrum, a publication of The Institute of …

COMMENTS

This topic is closed for new posts.
  1. John Smith 19 Gold badge
    Trollface

    I suspect there are quite a few Java devs out there

    The question is how many of them are good?

    1. wolfetone Silver badge

      Re: I suspect there are quite a few Java devs out there

      That's the thing. It's all well and good saying that you have 10 years experience in Java. But that doesn't mean you have 10 years of being good at it. Hell, it doesn't even mean you're good at it now.

      1. HmmmYes

        Re: I suspect there are quite a few Java devs out there

        I have ~20 odd years of Java experience.

        I try to implement something in it. It's shit, it's slow, and I have to install 20MB of JVM and crap.

        I pull it and re-write the stuff in C + Python.

        This is just for internal tools. A quick read of the Java license scared me off doing any billable product in Java.

        I struggle to remember what problem Java is the solution to.

        1. JDX Gold badge

          Re: I suspect there are quite a few Java devs out there

          So after 20 years you still suck, basically? Java isn't slow and the JVM allows you to write code in many languages not just Java.

          It's not my favourite, but it's eminently capable.

          1. Roo
            Windows

            Re: I suspect there are quite a few Java devs out there

            "So after 20 years you still suck, basically?"

            He prefers other tools - I happen to share his viewpoint. I write Java where folks require it, but most new stuff is in Python by popular choice.

            I undertook a fairly small greenfield project that demanded minimal runtime footprint & cost; JVMs were far more costly at runtime in time & space than the equivalent C++ code - and I did actually code up the core loops & profile them in Python/C++ & Java and measure the cost as honestly as possible. The C++ solution was also easier to validate from a security point of view, because we didn't have to delve into a bunch of opaque third-party binaries - like the JVM, for example. There was an absolute minimum of third-party code, and the finished beast ran in a privilege-separated configuration - again, much more natural to code in C/C++ than in a JVM-hosted language + runtime.

            "Java isn't slow and the JVM allows you to write code in many languages not just Java."

            It's a lot better than it was - but it's not the quickest either - the memory footprint alone means that the electrons are putting in more miles across the various databuses & memory arrays. When push comes to shove, physics will always be on the side of compact runtimes.

            Firing up a JVM is *slow* in comparison to a comparable bit of C/C++, regardless of the merits of whatever you've compiled into bytecode.

            Having defended C/C++ a bit I have to point out that I write the majority of code in Python followed by Java - they have their strengths too, although I'd say Java is actually pretty much lumbering on by convention, toolchain & framework inertia. New stuff does tend to be Python for better or worse.

          2. werdsmith Silver badge

            Re: I suspect there are quite a few Java devs out there

            As my career started in resource-constrained times, I was formed as a C/assembler and then C++ guy.

            When Java appeared in the 90s I made the effort to learn it, but soon became disillusioned with it as a fat, lumbering sloth of a thing that would use many times the memory and cycles to do the same thing I could with my old, comfortable friend languages.

            I still don't see any reason to use so much resource just because it's available.

            Python is great - I got into it helping my son through school with it - and it does some exceptional things.

            But I still like to be as close to the metal as possible. If something doesn't need real programming then JavaScript will do it.

            Python will take over the world though; it's the new BASIC and the enthusiasts/makers love it.

        2. xXSwolGunzXx

          Re: I suspect there are quite a few Java devs out there

          Java has 2 key features: GC and not too much power. You can turn loose a bunch of average devs and get them to churn out a pile of Java code that meets the requirements. If they get into a tight spot, the staff bright spark can fix things without too much trouble because Java doesn't provide any sharp things that destroy maintainability when used badly--macros, decorators, metaclasses, etc.
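
          For readers who haven't met the "sharp things" listed above, here is a minimal Python decorator - a purely illustrative sketch with invented names, showing the kind of feature being referred to:

          import functools

          def log_calls(func):
              """Wrap a function so every call is printed - handy, but easy to abuse."""
              @functools.wraps(func)
              def wrapper(*args, **kwargs):
                  print(f"calling {func.__name__} with {args} {kwargs}")
                  return func(*args, **kwargs)
              return wrapper

          @log_calls
          def transfer(amount, to):
              return f"sent {amount} to {to}"

          print(transfer(100, to="bob"))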

      2. John Smith 19 Gold badge
        Unhappy

        "It's all well and good saying that you have 10 years experience in Java. "

        Indeed.

        My favorite job interview was a second one (after HR had confirmed I didn't drool on the carpet or attack people at random).

        "Here's your login details, the machine, the language manuals and your programs function description. We'll be back in two hours. If it runs you get the job."

        Only time I'd ever seen such an interview technique. Yes, it took some effort to set up, but the employer soon gets to separate the workers from the BS merchants.

        1. CrazyOldCatMan Silver badge

          Re: "It's all well and good saying that you have 10 years experience in Java. "

          after HR had confirmed I didn't drool on the carpet or attack people at random

          So - not fast-tracked to senior management then?

        2. bombastic bob Silver badge
          Devil

          Re: "It's all well and good saying that you have 10 years experience in Java. "

          "Here's your login details, the machine, the language manuals and your programs function description. We'll be back in two hours. If it runs you get the job."

          I had a similar 'interview' with the last on-site gig I did. It was a phone interview followed by an e-mail, with a request to take a particular data format and do something with it. "Any language" and it was timed.

          I did it in about an hour or two, with a nice robust C++ application. But one guy did it in 5 minutes (using Perl). If I'd known BSD/Linux as well as I do now, I'd have done it in about that much time using 'awk'.

          (after that it was 'meet everyone' so they could figure out if they could get along with me, get a tour of the place, and so on - small startup company)

      3. CrazyOldCatMan Silver badge

        Re: I suspect there are quite a few Java devs out there

        you have 10 years experience

        10 years experience or one year repeated 10 times? Quite an important difference..

      4. bombastic bob Silver badge
        Devil

        Re: I suspect there are quite a few Java devs out there

        Java programmers should learn C first, THEN C++, and THEN Java.

        That would help build some proper coding discipline, so they don't start out every function/process using "ginormous collection object", and instead use some "non-insider" readable code that looks a bit more like C or C++.

        THAT, and the discipline of explicitly cleaning up your objects when they're no longer needed...

    2. Tom 7

      Re: I suspect there are quite a few Java devs out there

      If it's their first and only language then most probably (but not certainly) not many. Languages are very closely tied to methodologies and seem to tie people into a certain way of thinking about how to solve the problem. People solve problems with the tools they are familiar with - that bloke solved Fermi's last theorem in 100-odd pages where Fermi couldn't quite fit his answer in the margin of his book. Different languages tend to take different approaches to things: some will go from A to D via B and C, and others via F, G and H. My memory of Java is that it went from A to B via A1, A2, A3 and A4 a lot of the time, but that may just have been because my A and B were not in Java at the time.

      If you try to learn other languages you will move into the other problem spaces those languages were designed to solve, and realise that problem space is not the same as language space and no language fits all. More importantly, you learn there are many ways to skin a cat, and when it boils down to it, when you have a job, the best way to do it is the way the local cat skinners do it for the local cats. Even after 50 years in the job you will still be learning shit your computer language has trouble with, because it was designed by some anal-retentive genius who had a problem with brackets or camel case or something else really irrelevant to a CPU or GPU - or, in the case of Java, by a committee of them.

      All computer languages are shit, but some people can at least drive them off road for a bit before they need a sky hook to get them out of trouble.

      1. oliversalmon
        FAIL

        Re: I suspect there are quite a few Java devs out there

        Fermi's last theorem?

        If you're trying to prove how much smarter you are than Java devs, at least realise that it was Fermat not Fermi.

        1. Tom 7

          Re: I suspect there are quite a few Java devs out there --- Fermat

          That's what you get from bloody code completion on a quantum computer.

        2. CrazyOldCatMan Silver badge

          Re: I suspect there are quite a few Java devs out there

          Fermat not Fermi.

          Fermat is what I have at home - just inside the cat door. For them to wipe their feet[1] rather than carefully preserving the cold-and-dampness so that they can walk all over me in the night[2]..

          [1] Sadly, this little exercise in cat-training hasn't really worked. Probably because I struggled to think of how to reward them for doing it. And, in general, cats are mercenary little blighters.

          [2] In oh, so many ways. After all, how many people would get up at 3am to let out the youngest cat[3] who really, really can't be bothered to use the cat door to go out. She's happy to use it to come in though. Probably because of the lack of suitable servants waiting outside to let her in.

          [3] She of the multiple paranoia-syndrome. Even her paranoia is paranoid about the other paranoias she has..

        3. Tronald Dump

          Re: I suspect there are quite a few Java devs out there

          Therein lies the paradox.

        4. Anonymous Coward
          Anonymous Coward

          Re: I suspect there are quite a few Java devs out there - Fermi's last theorem?

          "Whacking subcritical lumps of plutonium together is bad for you" easily fits in a margin. Describing how to make the plutonium probably takes more than 100 pages.

      2. Steve Channell
        Pint

        Proof of Fermat's last theorem used infinite sets

        Whilst you could not computationally prove the theorem using infinite sets, it's worth mentioning that {C#, C++, Python} together with every functional language (except Scala) support infinite sets; Java does not.. which is good.. Java (lacking tail call optimisation) would fall over randomly with a stack overflow (differently on every machine - develop anywhere, debug everywhere).
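
        A minimal Python sketch of the lazy "infinite set" idea mentioned above (illustrative only; the generator name is invented):

        from itertools import count, islice

        def squares():
            """Lazily yield 1, 4, 9, ... - conceptually an infinite set, evaluated on demand."""
            for n in count(1):
                yield n * n

        # Only the first five members are ever computed:
        print(list(islice(squares(), 5)))   # [1, 4, 9, 16, 25]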

        1. xXSwolGunzXx

          Re: Proof of Fermat's last theorem used infinite sets

          Nah java.util.Set supports that just fine. I guess implementing Iterable could be fun depending on the set, but that's all.

    3. Steve Channell
      Pint

      Re: I suspect there are quite a few Java devs out there

      Given that the PYPL measure is for google searches (heavily skewed by JavaDoc), it is not unreasonable to conclude that a large number of Java developers need help getting through the day.

      The RedMonk score looks more interesting, though I'm not sure C#'s "async execution of a LINQ closure" compared to Java's "which Date class is best" or Python's "why is 5 + 1 sometimes 6 or 51" is a fair comparison.

      1. Richard Plinston

        Re: I suspect there are quite a few Java devs out there

        > Python's "why is 5 + 1 sometimes 6 or 51"

        You are confused; that is a 'feature' of JavaScript, not Python.

        1. Flocke Kroes Silver badge

          Re: I suspect there are quite a few Java devs out there

          $ python3 -c 'print(5+1 == 6 or 51)'

          True

          $ echo | awk '{print "5"+"1"}'

          6

    4. CrazyOldCatMan Silver badge

      Re: I suspect there are quite a few Java devs out there

      The question is how many of them are good?

      I know at least one. Admittedly, he is related to me so I might possibly just be biased..

    5. Phil O'Sophical Silver badge

      Re: I suspect there are quite a few Java devs out there

      The question is how many of them are good?

      In my experience many of them might be using Java, but they write C programs.

      1. Someone Else Silver badge
        Coat

        @Phil O'Sophical -- Re: I suspect there are quite a few Java devs out there

        Remember, you can write FORTRAN in any language...

        1. Tim99 Silver badge
          Coat

          Re: @Phil O'Sophical -- I suspect there are quite a few Java devs out there

          @Someone Else

          But to be really good/bad at it, it should be the first one that you used - Like me, unfortunately.

      2. Dazed and Confused

        Re: I suspect there are quite a few Java devs out there

        > In my experience many of them might be using Java, but they write C programs.

        As the saying goes "A good Fortran programmer can write Fortran in any language"

        1. Bitbeisser
          Meh

          Re: I suspect there are quite a few Java devs out there

          > As the saying goes "A good Fortran programmer can write Fortran in any language"

          Then there must be a lot of APL programmers working on those C/C++ Open Source projects I looked at... :?

      3. Matt Bryant Silver badge

        Re: Phil O'Sophical Re: I suspect there are quite a few Java devs out there

        ".... but they write C programs." Yeah, guilty! Nowadays I write code once in a blue moon, but it's usually C code morphed into whatever wrapper is required.

        One strange practice I commonly run into is customers who have picked the coding language for a problem long before they have even determined what the problem is. I try to at least steer the project through a thorough pseudo-coding stage before going to language selection. Agilistas really hate that; they seem to think it insults their skills.

        1. jake Silver badge

          Re: Phil O'Sophical I suspect there are quite a few Java devs out there

          "Agilistas really hate that, they seem to think it insults their skills."

          Agilistas with skills? Must be a mutation of the breed that hasn't made it here yet.

    6. Anonymous Coward
      Thumb Up

      Re: I suspect there are quite a few Java devs out there

      The question is how many of them are good?

      Hard to tell. Java developers never finish anything properly - there isn't enough time.

  2. jake Silver badge

    In over 40 years of programming ...

    ... I have used a lot of criteria when choosing a language to utilize for a given project, or a new language to learn for my own edification. In all that time, I have never, not once, picked a language due to its popularity.

    I have never really thought about it before, but I suspect that choosing anything purely due to its popularity is a mug's game. Or rather, popularity brings out the lowest common denominator, making popularity quite the opposite of elegant. See music for a rather egregious example ...

    1. Anonymous Coward
      Anonymous Coward

      Re: In over 40 years of programming ...

      Surely these languages are popular for a reason? I certainly wouldn't call it a mug's game learning them.

      1. Rameses Niblick the Third Kerplunk Kerplunk Whoops Where's My Thribble?

        Re: In over 40 years of programming ...

        And to be fair, neither did Jake. He said choosing a language for any given project based upon its popularity is a mug's game.

        Learn whichever ones take your fancy though.

        1. John Smith 19 Gold badge
          Unhappy

          "Learn whichever ones take your fancy though."

          Or whatever ones are available on the hardware you're paid to develop for.

          Although probably a good idea to avoid "M," the language formerly known as mumps.

          That's reckoned to be like the disease.

          Very nasty if contracted in later life.

          1. Vic

            Re: "Learn whichever ones take your fancy though."

            Although probably a good idea to avoid "M," the language formerly known as mumps.

            A friend of mine made a very good living doing M[1] just after leaving University.

            We'd taken the piss all the way through Uni because she was reading Philosophy. Then she got a job in my field on more cash than I was getting...

            She's an undertaker now...

            Vic.

            [1] Yes, it was still Mumps at the time.

            1. John Smith 19 Gold badge
              Unhappy

              "A friend of mine made a very good living doing M[1] just after leaving University."

              "She's an undertaker now..."

              That's sort of my point.

              IIRC it was Forth based and allows abbreviations of commands. IOW it's for those who find C a bit too verbose.

              I think it can legitimately be said that after you've used it you won't want to use another programming language.

              Because you won't want to do programming ever again.

              1. John Smith 19 Gold badge
                Unhappy

                Re: "A friend of mine made a very good living doing M[1] just after leaving University."

                Ooops.

                MUMPS was not Forth based, as it preceded Forth by about 5 years (1966 Vs 1971).

                Although its terseness and design of breaking code into 2KB blocks is very Forth like.

                OTOH, variables <==> files <==> b-trees mean anything can be made persistent across all instances (i.e. a file) just by putting "^" in front of the name, which is not very Forth-like.

                And then there is the command abbreviation, combined with the number of spaces between some of them being significant. That could make for a complete mindf**k when reading through old code, to the point of writing a tool to expand such abbreviations to make the whole thing more readable.

      2. Christian Berger

        Re: In over 40 years of programming ...

        "Surely these languages are popular for a reason?"

        Yes, but that reason, more often than not, is the hype surrounding them. You have professors who grew up in the pseudo-OOP hype of the 1980s and 1990s and think C++ and Java are the ultimate languages as they are so OOP.

        1. sawatts

          Re: In over 40 years of programming ...

          OOP-Hype? That's just smalltalk.

          1. Ben1892

            Re: In over 40 years of programming ...

            one more squeak out of you and there'll be trouble

          2. big_D Silver badge
            Pint

            Re: In over 40 years of programming ...

            @sawatts ISWYDT! Have a Friday beer.

          3. kventin

            Re: In over 40 years of programming ...

            which reminds me: is it still true that you can identify Igor-level uber-minions by their lisp and comic reliefs by their oopsy-daisies?

            (and yes, may the forth be with me, for making such bad puns)

          4. John H Woods Silver badge

            Re: In over 40 years of programming ...

            (Languages sort: [:x :y | x excellence > y excellence]) first name = 'Smalltalk'

      3. regadpellagru

        Re: In over 40 years of programming ...

        "Surely these languages are popular for a reason?"

        Eat shit. Billions of flies can't be wrong ...

    2. Sinical

      Re: In over 40 years of programming ...

      What he said.

      Whenever I learn a language it's because I've found a new (or even old) language that can do something I need my software to do better than the languages I already know. They are all tools, you just pick the best one for the job. Yes you can use a screwdriver to hammer a nail, but using a hammer is just easier.

      1. CrazyOldCatMan Silver badge

        Re: In over 40 years of programming ...

        They are all tools, you just pick the best one for the job

        Many (subjective) eons ago, I was a mainframe assembler programmer, writing (bad)[1] code for a system running TPF.

        One of my siblings, having obtained various degrees and doctorates, was musing why we still bothered using such archaic languages when they had so many better ones in University..

        I managed to restrain myself from beating him to death with a POPS manual[2] and suggested that the 40+ years-worth of code we were maintaining couldn't be replaced in a hurry, especially with languages where very few programmers existed and where there was no long-term commercial experience.

        [1] One of the many, many reasons why I stopped[3] being a programmer and went into support.

        [2] We chucked ours[4] away some time ago. Then, a few weeks later, discovered how much they were worth online. Doh!

        [3] Some might claim I never really started. YMMV.

        [4] Senior Controller was also a programmer, in the same company (we were married before we went there). She stuck at it considerably longer, being considerably better at it than I was. I was more of a 'hack it together and then fix it in testing' sort. She is one of those tedious^W meticulous types that actually preferred to design things first.

    3. Jim 59

      Re: In over 40 years of programming ...

      @Jake same here, but if you are choosing a language in a commercial situation, a ready supply of people who know it will aid success of the project and reduce its future support costs. Hence, choosing on popularity makes sense.

      1. Blitheringeejit
        Holmes

        @Jim 59

        "if you are choosing a language in a commercial situation, a ready supply of people who know it will aid success of the project and reduce its future support costs. Hence, choosing on popularity makes sense."

        Isn't that exactly why so many organisations continue to use Windows, in spite of all the grief they incur by doing so?

        And does it really, honestly, cost less in the long run?

    4. Pen-y-gors

      Re: In over 40 years of programming ...

      Yeah, but serious geeks in late-middle-age still code everything in Fortran IV and then write a post-processor to translate it into COBOL, PL/1, Algol, Lisp or whatever the flavour of the month is.

  3. Admiral Grace Hopper

    COBOL FTW

    But I may be biased.

    1. big_D Silver badge

      Re: COBOL FTW

      I loved COBOL, at least the later COBOLs that weren't so restrictive over line lengths.

      A lovely, verbose language and you can make it very modular.

      I've used dozens of languages over the years; COBOL has a soft spot in my heart, as do Z80 and 6502 assembler. 68K assembler was also nice, but x86 assembler was a nightmare in comparison - the assembler equivalent of VHS, compared to 68K's LaserDisc...

      1. Anonymous Coward
        Anonymous Coward

        Re: COBOL FTW

        " ... x86 Assembler was a nightmare ..."

        reminds me of something someone wrote in the mid-80s:

        "There are two kinds of assembly programmers: those who hate the Intel chip, and liars."

        1. Dagg Silver badge

          Re: COBOL FTW

          >> " ... x86 Assembler was a nightmare ..."

          Even 6502 wasn't that good. I started with PDP PAL-11 then MACRO-11. Now THAT was an assembler. 8 general purpose registers (R6 was also the stack pointer and R7 the program counter) with 8 address modes.

          Lovely....

      2. CrazyOldCatMan Silver badge

        Re: COBOL FTW

        Z80 and 6502 Assembler

        Indeed. My first (real) computers were a Nascom 1 followed by a BBC Micro.

        I did write a bit of x86 assembler during my (short) programmer phase, even though I was (nominally) a mainframe programmer. But it was more fun to write an assembler utility that went round the (token-ring) LAN looking for OS/2 print servers and then enumerating all the stuff people had stashed on the file shares on the server.

        It was *fairly* network intensive, which is why I only ran it in the evening. Found some fairly 'interesting' stuff as well as quite an amount of warez.

        This was sometime in the early 90's. When I was young and foolish.

        1. jake Silver badge

          Re: COBOL FTW

          If you were young and foolish in the early 1990s, you are hardly "old" today ... That was barely a quarter century ago!

    2. Matt Bryant Silver badge

      Re: COBOL FTW

      During the Y2K bonanza/hysteria, I was amazed at the amount of COBOL code that was expensively edited to get round the Y2K issue, rather than rewritten in a more modern language. Seeing as the majority of those COBOL coders from 2000 are dead from old age, any young whippersnapper with COBOL in their skill set will probably make serious dough when the next Y2K-like issue arrives and all that code needs to be edited again.

      1. jake Silver badge

        Re: COBOL FTW

        Excuse me, Matt. The reports of my death are greatly exaggerated.

        Still making money quietly coding COBOL, and still recommending it as a language for kids to learn if they want a guaranteed income into the foreseeable future. I know lots of Java, Python, C# etc. coders who are out of work, but the COBOL folks are all gainfully employed. Can say the same for Fortran.

        COBOL is dead! Long live COBOL!

  4. Lusty

    Meaningless jabberings

    Overall popularity is meaningless as it depends on use-case. Chinese is a very popular language but it's of little use if you plan to live and work in France.

    Programming language popularity is heavily skewed by web and mobile app development. If you want to work in financial services or machine learning though you'd want to research what those industries need rather than look at the overall top 10 languages.

  5. Anonymous Coward
    Anonymous Coward

    Learn all of them, but NOT Java

    Learning Java will automatically turn you into an accessor function writing shitter, unable even to define a constant integer without creating 20 classes.

    You can spot a Java programmer even when they write in any other language.

    1. Anonymous Coward
      Anonymous Coward

      Re: Learn all of them, but NOT Java

      You can spot a Java programmer even when they write in any other language.

      That's soooo true. I used to have to keep reminding my ex-Sun boss that you can compare strings directly in C# instead of having to use .equals().

    2. GrumpenKraut
      FAIL

      Re: Learn all of them, but NOT Java

      > You can spot a Java programmer even when they write in any other language.

      Change Java to anything, still true. One of my "favorite" examples is a guy who does assembler style optimization in Mathematica.

      Regarding Python, people seem to assume that just because they use Python, their code is good, even when it is actively terrible.

      Regarding Java, I once reviewed a paper where the author had put something like "Java is the highest form of programming" and gave one of the most eye-watering pieces of shit code I have ever seen. He seemed to come from a (bad) C-background and managed to make about every mistake you can make in both languages in just one (printed) page. Strong rejection.

      For everyone's entertainment I'll mention FizzBuzz.
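
      For reference, a minimal Python take on the FizzBuzz screen mentioned above (an illustrative sketch, not anyone's canonical answer):

      def fizzbuzz(n):
          """Multiples of 3 print Fizz, of 5 print Buzz, of both print FizzBuzz."""
          for i in range(1, n + 1):
              if i % 15 == 0:
                  print("FizzBuzz")
              elif i % 3 == 0:
                  print("Fizz")
              elif i % 5 == 0:
                  print("Buzz")
              else:
                  print(i)

      fizzbuzz(15)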

      1. AMBxx Silver badge

        compare strings directly in C# instead of having to use .equals().

        That's one of the things I really don't like about C#. It never seems to know whether a string is an object or not.

        1. bombastic bob Silver badge
          Devil

          Re: compare strings directly in C# instead of having to use .equals().

          "strcmp(pointer1, pointer2)" ... what's so hard about that?

          object-oriented is *HIGHLY* overrated (and often inefficient - not always, just often, especially when done for the sake of doing it)

      2. Paddy
        Gimp

        Re: Learn all of them, but NOT Java

        "Regarding Python, people seem to assume that just because they use Python, their code is good, even when it is actively terrible."

        Not true. The community seeks "pythonic" code showing Python good practice. "The Zen of Python" asks new users to think more deeply about what constitutes good code (import this). PEP-8 is a style *guide* for readability.

        You can write bad code in any language, but blog it for comment and the Python community usually give helpful and constructive criticism. :-)
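
        A small, invented example of the kind of "pythonic" rewrite the community usually suggests (running import this in a REPL prints the Zen itself):

        names = ["ada", "grace", "dennis"]

        # Common newcomer style:
        for i in range(len(names)):
            print(i, names[i].capitalize())

        # The idiomatic suggestion:
        for i, name in enumerate(names):
            print(i, name.capitalize())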

      3. Nolveys
        Headmaster

        Re: Learn all of them, but NOT Java

        Regarding Python, people seem to assume that just because they use Python, their code is good, even when it is actively terrible.

        import time

        def get_tomorrows_date():
            time.sleep(24 * 60 * 60)
            return time.strftime('%Y-%m-%d')

    3. big_D Silver badge

      Re: Learn all of them, but NOT Java

      I remember one of my first projects, I had to maintain an ancient corporate accounting data collection system, written in MS BASIC running on CP/M80 and MS-DOS.

      It wouldn't have been so bad, but it had been written by FORTRAN programmers and maintained by COBOL programmers (I kid ye not). Unfortunately, neither group had ever read the BASIC manual further than IF and GO TO. Every loop in the program was performed by "IF A > 0 THEN GO TO 100". They had never heard of FOR...NEXT or WHILE...WEND, let alone a REPEAT.

      There were also dozens of computed GO TOs in the code! There were reams of code that were commented out, and we were running into size restrictions, so I tried deleting old, commented-out code - only the thing just fell over, because it was computing a jump into the middle of a commented-out section of code!

      I managed to tidy up the code somewhat and optimize it. I got the data collection, preparation and transmission down from over 4 hours to under 20 minutes!

      1. HmmmYes

        Re: Learn all of them, but NOT Java

        I can top that.

        Having to port a networked Unix server to DOS using a TSR and a 3rd-party TCP/IP stack. I almost quit the profession.

        1. JDX Gold badge

          Re: Learn all of them, but NOT Java

          If you're writing accessors yourself, you really should learn how to use your IDE properly - knowing how to use the tools is a big part of being effective in any language.

          1. Infernoz Bronze badge

            Re: Learn all of them, but NOT Java

            Or for most Java accessors, use project Lombok for much shorter code; it also covers lots of other common boilerplate code, including constructors, and common logging declarations.

            Explicit accessors are sometimes compulsory, for validation and security-copying (to prevent mutable object exploits), and trace logging.

            A lot of Python frankly looks like write-only code, because it never required type declarations in method/function declarations, and I also suspect a lot of security/performance issues given how many easy but dangerous assumptions it makes! I also view the Python API docs web pages as quite primitive and fugly compared to other languages' API docs, like JavaDocs.

      2. John Smith 19 Gold badge
        Unhappy

        "the thing just fell over,..it was computing a jump into the middle of a commented out section"

        Ahhh. They just don't write code like that any more.

        Thank f**k.

        What an abortion.

        1. CrazyOldCatMan Silver badge

          Re: "the thing just fell over,..it was computing a jump into the middle of a commented out section"

          Ahhh. They just don't write code like that any more.

          One of the big crash-landings[1] we had when I was a programmer was writing self-modifying code. Since we were writing for an environment where a single code segment couldn't exceed 4K (and that had been raised from the original 1K), some of the previous generations had done some fairly aggressive things to keep their code small..

          Like having self-modifying code. Which is fine[2] when, in the old days, you only had a single thread to worry about and nothing would grab the CPU while your code was running, but by the time I got there, we had to code stuff so that it was re-entrant and could be used by multiple CPUs at once.

          Which, of course, negated the advantage of self-modifying code since you could never guarantee how many CPUs were running your (single instance) code.

          [1] Crash-landing was the term we came up with for "if you do this it's an instant P45". Stuff like telling the CEO that he was an idiot..

          [2] For a particularly difficult to maintain version of "fine". And trying to debug a core dump where the bit of code you are looking at doesn't match the source code isn't fun.

          1. John Smith 19 Gold badge
            Unhappy

            "crash-landings[1] we had when I was a programmer was writing self-modifying code."

            I was taught about this in High School. Mostly that it could be done, but it was a Very Bad Idea to do it.

            It took me years to find any actual cases of it being used.

            They were

            a) The Apollo Guidance Computer b) The Bell Labs "Blit" bit mapped terminal.

            Both of which had (for different reasons) severe resource constraints.

            So I'm curious, what was your hardware environment?

            1. bombastic bob Silver badge
              Devil

              Re: "crash-landings[1] we had when I was a programmer was writing self-modifying code."

              PDP-11 in effect needed you to be able to write to where the code was running in order to efficiently pass parameters.

              JSR PC, MYFUNC

              ARG1 .WORD

              ARG2 .WORD

              etc.

              and MYFUNC would use the old 'PC' value as a frame pointer (from the stack), and do a kind of 'PC cleanup' on the program counter so that you returned to the correct address. Or you could call with 'JSR Rx, MYFUNC' and put the old PC into 'Rx' and use it as a frame pointer. I forget the details, but that's kinda how it worked.

              And so, you needed to write the arguments to ARG1 and ARG2 (etc) before doing the function call. They might even be general-use memory variables if you're really clever with the design. You could even implement the subroutine call by referencing the actual address you call (the last word in the instruction, I think) as 'ARG1 - 2' and poke that before doing the call (making it a dynamic function call of some sort).

              Anyway, this was common in the PDP-11 world. I think DEC was kinda proud you COULD do this. But single-thread only, no recursion...

            2. Dagg Silver badge

              Re: "crash-landings[1] we had when I was a programmer was writing self-modifying code."

              From what I remember the original platform that COBOL was developed for had no stack so the perform start through end was implemented by overwriting the end label with a jump instruction to return to just after the original perform.

              The other major gotcha was a sort of self-modifying code: the jump into an overlay when the wrong overlay was loaded.

      3. disgruntled yank

        Re: Learn all of them, but NOT Java

        Sorry, how can you jump into commented-out code and what happens when you do? Or was it "commented out" in the manner of

        if (False) { // old stuff I'm scared to cut but doubt we need

        doThis();

        doThat();

        ...

        }

        1. Fonant

          Re: Learn all of them, but NOT Java

          I understood it to mean that the code calculated a variable line number to GOTO, which broke the program if you removed a bunch of comment lines and the lines were renumbered: the calculation would no longer give the right line to GOTO.

          BICBVR

          1. Richard Plinston

            Re: Learn all of them, but NOT Java

            > I understood it to mean that the code calculated a variable line number to GOTO ...

            I suspect that it was much simpler. There were GOTOs to numbered lines that were comments. This would then drop down to the next executable line. When the commented lines were deleted there was then no target for the GOTO.

          2. fritsd

            Re: Learn all of them, but NOT Java

            If you remove a bunch of comment lines, why would your computer renumber the line numbers afterwards? That just doesn't make sense. I'm quite sure the Commodore 64 BASIC didn't do any such stupid shit.

        2. alisonken1

          Re: The way Basic worked in the old days

          In Basic - there is no labels.

          10 IF x = 5 GOTO 50

          20 REM THIS IS A COMMENT AT LINE 20

          30 REM THIS IS A COMMENT AT LINE 30

          40 REM THIS IS A COMMENT AT LINE 40

          50 PRINT "X = 5"

          60 REM THIS IS A COMMENT AT LINE 60

          70 REM THIS IS A COMMENT AT LINE 70

          Now, if you delete comments at line 20 and 30:

          10 IF x = 5 GOTO 50

          20 REM THIS IS A COMMENT AT LINE 40

          30 PRINT "X = 5"

          40 REM THIS IS A COMMENT AT LINE 60

          50 REM THIS IS A COMMENT AT LINE 70

          (edited for missing rem statements)

          1. Richard Plinston

            Re: The way Basic worked in the old days

            > In Basic - there is no labels.

            BASIC is not _a_ language, it is a large group of approximately similar, or not so similar, languages. Some do allow labels, even named subroutines.

            > Now, if you delete comments at line 20 and 30:

            No, no, no, not for any variation of 'BASIC' that I am aware of. For the BASICs that only use line numbers there is _NO_ automatic line renumbering, that would be a complete fail. Deleting lines 20 and 30 would leave lines 40 and beyond with their original line numbers. The whole point of numbering by 10s is so that lines can be inserted, such as 51, 52, etc.

            The problem described would arise if the original line 10 had GOTO 30 (which would work correctly) and then lines 20 and 30 were deleted because they were 'merely comments'.

          2. bombastic bob Silver badge
            Devil

            Re: The way Basic worked in the old days

            OK line numbers don't auto-re-order on any BASIC I've ever seen...

        3. big_D Silver badge

          Re: Learn all of them, but NOT Java

          @disgruntled yank

          Something like:

          10 print "Enter option number: "

          20 input a

          30 go to a*1000

          ...

          1000 rem b=56*c

          ...

          2000 print "sub menu"

          ...

          3000 rem input c$

          Only it wasn't in multiples of 1000. You see the commented out code (rem statement) and think it is no longer needed, so you can delete it to save space and make room for new code... Only to find out later that the code falls over when run.

      4. John Smith 19 Gold badge
        Unhappy

        "I got the data collection, preparation and transmission down..over 4 hours to under 20 minutes!"

        Which suggests just how much cruft this code had accumulated, and how badly the task had been implemented.

    4. John Smith 19 Gold badge
      Unhappy

      "You can spot a Java programmer even when they write in any other language."

      Or as they used to say "You can write FORTRAN in any language."

      I'll leave others to decide if not being able to write Java in other languages is a good or bad thing.

      1. Anonymous Coward
        Anonymous Coward

        Re: "You can spot a Java programmer even when they write in any other language."

        >> Or as they used to say "You can write FORTRAN in any language."

        I'm one of the few people I know who can make JavaScript look like Perl.

        This, it seems, is not a marketable skill.

        1. Richard Plinston

          Re: "You can spot a Java programmer even when they write in any other language."

          > I'm one of the few people I know who can make JavaScript look like Perl.

          In my experience, most people can make PERL look like chicken scratchings.

          1. dajames

            Re: "You can spot a Java programmer even when they write in any other language."

            In my experience, most people can make PERL look like chicken scratchings.

            I find it more remarkable that some people can make PERL not look like chicken scratchings ... and, indeed, can write useful, constructive, and efficient programs in that unlovely language.

            Why they don't apply their undeniable talents to something else instead remains a mystery, though.

        2. EmilPer.

          can make JavaScript look like Perl

          Easy, use jquery.

    5. YARR

      Re: Learn all of them, but NOT Java

      I use Java EE and other languages, having learnt with sequential languages, but I don't regard these criticisms as valid reasons for disliking the language. OO programming in Java isn't so different from other OO languages. If someone you know is overly applying OO concepts / design patterns then that's just their convoluted programming style rather than a fault with the language. When applied effectively, those concepts benefit large applications that are maintained over a long lifespan. Hence why OO features were added to languages like C and PHP.

    6. bombastic bob Silver badge
      Thumb Up

      Re: Learn all of them, but NOT Java

      "Learning Java will automatically turn you into an accessor function writing shitter, unable even to define a constant integer without creating 20 classes."

      you, sir, get my upvote!

  6. 45RPM Silver badge

    I love Python. It’s a great language - the new ‘Basic’. It’s great for teaching kids how to program, and it’s great for doing real work in as well but…

    …for me my one true love is C. It’s powerful (and, yes, dangerous if abused). It doesn’t hide anything or do anything automagically. Memory is yours to play with as you will. Even my C++ looks like C (which I realise makes it bad C++ - except, sometimes, to other C programmers).

    I quite like Objective C and Swift. I’ve been paid to develop in Pascal (which was my favourite teaching-kids-to-code language until I discovered Python) and APL (which was a vile experience). But, in my experience, if you can do C then you can pick up most modern programming languages quite easily. If you can do C well then even Assembly comes fairly naturally.

    1. Paul Crawford Silver badge

      Same here, Python is really handy for many tasks that otherwise would mean something like MATLAB or worse.

      "If you can do C well then even Assembly comes fairly naturally" may be true, but even truer is that C is really a universal assembler - there are very VERY few cases when assembly is justified, and even in those cases the fact that it can be in-lined in many C compiler's extensions is good.

    2. HmmmYes

      I like Python, on top of C.

      In fact, I regard any language written in C as a C language. Apart from Java, that's shit. And Perl.

      1. 45RPM Silver badge

        @HmmmYes

        I like Perl for short bits of text processing. I use it like a more readable version of sed when I need to share code with a non-programmer. Anything more than that and Perl falls down badly - I had to maintain an application written in tens of thousands of lines of (badly written) Perl code. The original developer had left out the "use strict" pragma because in his words "it didn't run when he put that in". I fixed that, and improved overall reliability somewhat - but it still wasn't as good, or as fast, as it could have been if it had been written in a language which was up to the task in the first place.

        As for Java, that's a sad tale. So much potential - and ruined by Oracle. You have to admit* though that Microsoft really ran with it and has, latterly at least, come up with a real gem in C#.

        *you don't have to admit of course. You could spew coffee over your keyboard and disagree vehemently. There are some strange idioms in English.

    3. bazza Silver badge

      C is Getting Rusty

      I too am an ardent C programmer. It's been a superbly useful tool over the decades. I have done some pretty big systems very successfully with C.

      However, I am intrigued by Rust. If they standardise that, there's a very good chance that I'll convert. It's usable as a systems language, it doesn't need a runtime, but it has some nice high-level language ideas, and it does Communicating Sequential Processes too. There's lots to like!

    4. CrazyOldCatMan Silver badge

      I’ve been paid to develop in Pascal

      My Polytechnic code assignment was to write a stock-control system in Pascal. In a dialect that had no random access file handling..

      So I gave that up as a bad job and just wrote a reasonable demo instead. I got marked down a bit for not sticking to the brief and then marked up for my creative approach :-)

    5. kdd

      APL was my first language. I still have a soft spot for it, having written my own interpreter years ago that I still use as a desk calculator from time to time. But it has two serious flaws - one, it's truly write-only, as even your own code becomes incomprehensible in record time given its tendency to inspire complex one-liners. And two, it's optimized for the 2741 Selectric printing terminal with an APL typeball and keyboard, which no one has anymore. It is, however, the only language I know of that uses real mathematical multiply and divide symbols for the math operations rather than repurposing asterisk and slash. A lovely language, if you ask me, but I'd never advise anyone to learn it.

      1. 45RPM Silver badge

        @kdd

        I know others who like it too. Maybe if it had been my first I’d feel the same way, but I was a C programmer, and I got tasked with working on an APL system because of my aptitude for quickly picking up new languages. I might be good at learning new languages - doesn’t necessarily mean that I enjoy using them!

        APL isn’t the only language, incidentally, that can use real mathematical divide symbol for the maths operations. AppleScript (and IIRC HyperTalk) can too - but only because it’s very flexible as to the syntax (which can, in fairness, be A Bad Thing, if only because no two developers will write code in the same way)

        For example, in AppleScript, for this sum, these are synonymous:

        display dialog 10 ÷ 2

        display dialog 10 / 2

        display dialog 10 div 2 (div is integer only)

    6. Matt Bryant Silver badge

      Re: 45RPM

      "....Pascal...." ah, yes, that was a fun starter language, but most learning establishments seem to have just treated it as an intro to Modula 2 and/or C, and never as a viable language in its own right.

      I like intro'ing kids to code with HTML, especially as they all use websites every day, and you can show them structuring and file calls etc. with quick and easy results. From there it's easy to get them into a backend in C or whatever language you like.

  7. Rosie Davies

    On Another Note

    I'm not going to get involved in a 'my language is better than your language' discussion; I've got the battle scars from too many of those already.

    What did strike me as a bit strange was characterising a professional institute with a royal charter as "a technical advocacy organization". I wonder if that doesn't sell them a little short. Mad Bob the technology yogi is a technical advocacy organisation, albeit not for any technology that exists, and I'm not sure that's really comparable.

    Rosie

    1. Graham Cobb Silver badge

      Re: On Another Note

      The IEEE has a royal charter? That must annoy the IEE.

      1. Anonymous Coward
        Anonymous Coward

        Re: On Another Note

        Or even the IET as it's now called... My charter certificate is old enough to say IEE.

        I once had a very confusing conversation with someone who hadn't heard of the Institute of Electrical Engineers, but had heard of the Institute of Explosives Engineers. Now like every decent electronics engineer I have blown up various things in my time, but only tantalum capacitors or power supplies. This was an aspect of the conversation that took a while to resolve, via puzzled enquiries about really being allowed to do such things in one's bedroom at University...

  8. anonymous boring coward Silver badge

    I learned Swift, but it made my head explode.

    All Swift learning spilled out.

  9. anonymous boring coward Silver badge

    BTW, for me the top language has to be Swedish, closely followed by English, and then C. Quite like Tcl and Awk too. Awk because it's awkward, which I like. Never liked C++.

    1. Vic

      Awk because it's awkward, which I like

      I dislike awk intensely. But for some tasks, there is no sane substitute...

      Vic.

      1. anonymous boring coward Silver badge

        "I dislike awk intensely. But for some tasks, there is no sane substitute..."

        One can use sed as well for quite a lot. Even hairier than using Awk.

        Done quite a lot in "sh" as well. (Think this was before Bash, so was more limited. Bourne Shell, as opposed to Bourne Again Shell, if my memory serves me?)

        1. Anonymous Coward
          Anonymous Coward

          "One can use sed as well for quite a lot. Even hairier than using Awk."

          Matters got somewhat hairy when I ended up using sed to generate more sed.... I don't think I've ever come up with something so utterly unreadable to me. Worked though :-/

          1. anonymous boring coward Silver badge

            Re: "One can use sed as well for quite a lot. Even hairier than using Awk."

            That's the kind of programming that gives hair on the chest! (Or is it a neckbeard?)

      2. Roo
        Windows

        Intuitively awk feels like a mash-up of grep, c-shell and a stream fed spreadsheet. I'd have been curious to find out what drove the authors to produce that immortal chimera.

        1. anonymous boring coward Silver badge

          Doing as much as possible with as few resources as possible in a simple interpreted language, I would suspect.

  10. Anonymous Coward
    Anonymous Coward

    anything but python

    simply because of the dumb fucking idea of spaces.

    1. Tom 7

      Re: anything but python

      You can tell it to rely on parentheses rather than indentation - comes in handy when doing things like keyboard/mouse handlers and other complicated things that just won't be broken down into smaller functions in a desperate attempt to fit in 80 chars. I can only thank god that no-one has written a Python ASP thing like PHP - imagine following indentations down through that code!

      1. Richard Plinston

        Re: anything but python

        > I can only thank god that no-one has written a python ASP thing like PHP

        http://www.4guysfromrolla.com/webtech/082201-1.shtml

    2. Anonymous Coward
      Anonymous Coward

      Re: anything but python

      I wish I could upvote you a million times. Significant whitespace has to go down in history as one of the dumbest decisions ever (along with the GIL).

      The reason is simple. Your code becomes nightmarishly difficult to refactor. Refactoring is one of the most important tasks in large-scale software development since it allows your design to develop with the changing use-cases. But when you have to be ridiculously careful about making sure that things end up at the correct indent level, it becomes a nightmare problem. I've lost count of the number of times I've fixed someone's bug where they accidentally changed the indentation of the last line of a loop or similar.

      Personally, I would have had a terminating token for the end of functions/loops, but made it a syntax error to have invalid indentation. That way after a refactor you can use your editor/IDE to fix the indentation.

      I have a few other Python gripes - one of the biggest being the absence of a Perl-like 'use strict' type of construct. If you have ever had to debug a problem where a thread dies due to a syntax error in a little-used code path, you will totally get my annoyance.

      Incidentally, and without wishing to start a flame war, my favourite language is C++, and I think Julia is the one to watch for.
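
      A minimal, hypothetical sketch of the indentation hazard described above (the function and field names are invented for illustration):

      def total_shipping(orders):
          total = 0
          for order in orders:
              total += order["weight"]
              total += 5        # per-order handling fee, applied inside the loop
          return total

      def total_shipping_refactored(orders):
          total = 0
          for order in orders:
              total += order["weight"]
          total += 5            # one accidental dedent: the fee is now applied once, not per order
          return total

      orders = [{"weight": 2}, {"weight": 3}]
      print(total_shipping(orders), total_shipping_refactored(orders))   # 15 vs 10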

      1. bazza Silver badge

        Re: anything but python

        Seconded, Python is horrid.

        Plus there's no such thing as Python. There's Python 2, and then there's Python 3. Which one to use? Solving this by going polyglotal is madness.

        1. Richard Plinston

          Re: anything but python

          > Plus there's no such thing as Python. There's Python 2, and then there's Python 3.

          What's your point? There is Java [1], Java 2, Java 3, ..., Java 8; C++ 3, C++ 11, C++ 14, C++ 17.

          1. Anonymous Coward
            Anonymous Coward

            Re: anything but python

            I've not used Python 3 yet because I haven't had to, but it would appear to be very different from 2.

            I hate the whitespace thing too. Syntactically, C and Perl have always made the most sense to me.

            1. Richard Plinston

              Re: anything but python

              > c and Perl have always made the most sense to me.

              Python 3 does have differences from Python 2, but Perl has been through several rewrites, each of which was incompatible with previous versions' source code. If Perl makes "most sense" then you obviously never used Perl 4 and haven't looked at Perl 6, because these are quite different languages.

              https://docs.perl6.org/language/5to6-nutshell

              1. Anonymous Coward
                Anonymous Coward

                Re: anything but python

                I was talking about two different things...

          2. This is my handle

            Re: anything but python

            >> What's your point? There is Java [1], Java 2, Java 3, ..., Java 8; C++ 3, C++ 11, C++ 14, C++ 17.

            Yes, but by & large they have backward compatibility. At least java does. I haven't touched C++ in years but as has been amply pointed out in this thread (I'm paraphrasing a bit) "You can write K&R C in any language" (including C++ last time I checked) at least with a few gcc flags and ignoring the warnings; . This is *not* true of Python (or for that matter, my own favorite scripting langauge perl, though in fairness I'm guessing that the vast majority of all perl ever run in production anywhere was perl 5.x).

            My $0.02.

            1. Richard Plinston

              Re: anything but python

              > Yes, but by & large they have backward compatibility.

              Yes, but it is only "by and large". When moving from one version of C++ or Java to the next there will always be some issues which need resolving, except in trivial code.

              Python3 is a new version of the language designed to be a significant improvement. Python2 is still developed and supported, and has 'futures' and other tools to ease the transition to the new language. This has been done by numerous languages; extreme examples are Pascal to Modula-2, and Visual Basic - numerous times.

              Python3 vs. Python2 should be compared to Kotlin vs. Java. Kotlin is designed to make Java into a modern language and drop 22 years of baggage that it still carries. C++ has 36 years of baggage.

              > "You can write K&R C in any language" (including C++ last time I checked)

              Actually you can't. K&R C (edition 1) was replaced by ANSI C and few modern C/C++ compilers support the original K&R (though gcc may still do so). And that is hardly "any language".

              And I don't know that anyone said that; what they did say was "You can write FORTRAN programs in any language", which is quite a different thing.

      2. Richard Plinston

        Re: anything but python

        > But when you have to be ridiculously careful about making sure that things end up at the correct indent level it becomes a nightmare problem.

        I don't have problems with that, but then I have chosen tools, and configurations of those, that would seem to be more appropriate than the ones that you are using.

        > If you have ever had to debug a problem where a thread dies due to a syntax error in a little used code path, you will totally get my annoyance.

        Syntax errors are discovered during the load/compile phase so you are probably referring to something different. There is usually an exception trace produced unless you deliberately ignore exceptions.

    3. Fruit and Nutcase Silver badge
      Happy

      Re: anything but python

      @AC

      I went to upvote this post and got the following response

      "Oops. Alert the Moderatrix - the coders need whipping"

      ...my internet connection had dropped

  11. Christian Berger

    Note that there were "popular" shitty languages in the past

    Like PL/1 for example, a language trying to do "everything".

    Here's a review of it:

    https://plg.uwaterloo.ca/~holt/papers/fatal_disease.html

    1. bombastic bob Silver badge
      Unhappy

      Re: Note that there were "popular" shitty languages in the past

      and there are a few in the present - like "C-pound" which relies on ".Not". Both equally shitty.

      I think Python has its uses, but is ripe for ABuse and I see this in poorly written DJango code (and imported objects) INCLUDING the DJango implementation itself.

      And too many people say "Write that in Python" or "I can write that in Python" when it SHOULD be done as a C utility, at least for efficiency. [converting binary data in python is the *WORST* possible implementation I have *EVAR* seen, because Python is afraid of pointers and C-style structures, apparently, and YES, I'm currently tasked with maintaining code that actually *DOES* this, because python 'expert' did a rage-quit].
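      For what it is worth, the standard struct module is the usual way Python maps C-style records onto binary data; a minimal sketch, with a record layout invented purely for illustration:

      import struct

      # hypothetical little-endian header: uint32 id, uint16 length, 4-byte tag
      HEADER = struct.Struct("<IH4s")

      raw = b"\x01\x00\x00\x00\x10\x00DATA"
      record_id, length, tag = HEADER.unpack(raw)
      print(record_id, length, tag)    # 1 16 b'DATA'

      Whether that is fast enough compared with a small C utility is exactly the sort of thing worth measuring rather than asserting.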

      1. Richard Plinston

        Re: Note that there were "popular" shitty languages in the past

        > "I can write that in Python" when it SHOULD be done as a C utility, at least for efficiency.

        Not all C programs are efficient. I wrote a text merge program in C. It was quite slow due to the str..() library, in particular strcat() having to scan along the strings to get the length. A rewrite in Python was 10 times faster.
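        The same trap exists inside Python, for what it is worth: building a big string by repeated concatenation keeps copying the growing result, while "".join() sizes it once. A rough sketch (absolute timings will vary by machine):

        import timeit

        pieces = ["line %d\n" % i for i in range(10000)]

        def merge_by_concat():
            out = ""
            for p in pieces:
                out += p    # may re-copy the ever-growing string on each pass
            return out

        def merge_by_join():
            return "".join(pieces)    # total length computed once, single allocation

        print(timeit.timeit(merge_by_concat, number=100))
        print(timeit.timeit(merge_by_join, number=100))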

        1. jake Silver badge

          Re: Note that there were "popular" shitty languages in the past

          Not all programmers know how to write efficient code in C.

          FTFY, no charge.

  12. John Smith 19 Gold badge
    Trollface

    OT. Way to get a C dev to fall in love with another language.

    Simple. Let them take a pointer to a procedure (or function) and put it in an array in that language.

    Which (IIRC) even Ada allows (called "reference" variables), but strongly discourages.

    Give them that, and at least 8 character variable names and they're yours.

    They'll even tolerate garbage collected memory
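    In Python, at any rate, that particular test is passed for free: functions are first-class objects, so the "array of function pointers" is just a list, and nobody rations the length of your variable names. A minimal sketch:

    import math

    # dispatch table: the scripting-language cousin of an array of function pointers
    numeric_transformations = [math.sqrt, math.exp, abs]

    for transformation in numeric_transformations:
        print(transformation.__name__, transformation(4.0))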

    1. GrumpenKraut
      Mushroom

      Re: OT. Way to get a C dev to fall in love with another language.

      > They'll even tolerate garbage collected memory

      NEVARRRR!!!

      1. John Smith 19 Gold badge
        Unhappy

        They'll even tolerate garbage collected memory. NEVARRRR!!!

        I did say tolerate.

        "Liking" is a bit too much to ask for, given the loss of absolute control.

        But I may have been a bit too optimistic there.

    2. Anonymous Coward
      Anonymous Coward

      Re: OT. Way to get a C dev to fall in love with another language.

      void (*my_functions[100])(void);

      or in C++

      std::array<std::any, 100> my_functions;

      if you want to be particularly smart, wrap it in a simple class that has a std::enable_if that only allows you to store things for which std::is_invocable is true...

      1. bombastic bob Silver badge
        Pirate

        Re: OT. Way to get a C dev to fall in love with another language.

        'std' class template-based implementations are HIGHLY overrated. They try to be too much, are sometimes collection (instead of array) based, have some cryptic built-in requirements for memory manager objects and other irritating things, and can be best re-implemented in only a few lines of code by someone who knows what he is doing (like me).

        But the C++ language doesn't require 'std' usage so it's all good. I think that the 'std' class templates were written by "Academic Arrogance" types that haven't coded in production EVAR in their entire lives, nor had to MAINTAIN someone else's crap-code. So they're clueless about the real world. And it's reflected in the design.

        pirate icon, just because I'm a rebel

    3. Flocke Kroes Silver badge

      Re: OT. Way to get a C dev to fall in love with another language.

      return_type (*array_name[])(parameter0_type, parameter1_type, ...) = {function0, function1};

      Works fine. Eve's C compiler was limited to 8 letter variable names because of the limited storage capacity of flint chips. Ancient Greek clockwork compilers allowed arbitrary length identifiers, but only the first 63 bytes were significant.

  13. Anonymous Coward
    Anonymous Coward

    Xcode / Swift

    I've decided to start learning Xcode / Swift, only as a hobby mind. If that leads to the ability to work from home and be on better wages than I'm on now, perhaps I need to ramp up my speed of learning it! Would welcome comments from anyone with knowledge about this language and career prospects.

  14. Anonymous Coward
    Anonymous Coward

    Broken link http://spectrum.ieee.org/ns/IEEE_TPL_2017/index/2017/1/1/1/1/1/25/1/25/1/50/1/25/1/25/1/50/1/25/1/25/1/100/1/100/1/25/1/40/

  15. Clive Galway

    Any language where whitespace dictates what is inside a conditional and what isn't (eg Python) needs to die a slow and painful death IMHO.

    1. vincent himpe

      You mean the creators of such languages need to die a slow and painful death.

      The language itself can't die fast enough...

    2. Richard Plinston

      > Any language where whitespace dictates what is inside a conditional and what isn't (eg Python) needs to die a slow and painful death IMHO.

      No one cares if you don't use Python (unless your managers do). If you are forced to use it, then get better tools and learn how to use it better.

      In what way is 'what is inside a conditional' not determined by the colon that terminates it? Perhaps you are thinking of some other language.

      1. Clive Galway

        I meant what is inside the code block, not what is part of the condition.

        numbers = [2, 4, 6, 8]

        product = 1

        for number in numbers:

        ....product = product * number

        product = product * number is only part of the loop because of its indentation. This is what I was referring to.

        Try to post some Python code somewhere that does not allow indenting (I cannot figure out how to do it on this site; even pre blocks strip out leading whitespace) and you simply cannot post valid code.

        What happens if somehow you end up with source that mixes spaces and tabs? The code could then surely APPEAR to mean one thing, but in fact mean something completely different.

        1. Richard Plinston

          > Trying to post some Python code somewhere that does not allow indenting ...

          That is not the fault of the language, but of the site. This site recognises <code> and <pre> tags but fails to implement them in a useful way.

          > What happens if somehow you end up with source that contains spaces and tabs?

          You fire the programmer, and/or get better tools, and/or learn how to configure them.
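          Python 3 also refuses to guess in that situation: indentation that mixes tabs and spaces ambiguously is rejected at compile time with a TabError rather than being silently reinterpreted. A minimal sketch (the tab lives inside a string literal, so nothing can mangle it):

          ambiguous = "for n in (1, 2, 3):\n    total = n\n\ttotal *= 2\n"
          try:
              compile(ambiguous, "<posted-code>", "exec")
          except TabError as err:
              print(err)    # inconsistent use of tabs and spaces in indentation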

          > The code could then surely APPEAR to mean one thing, but in fact means something completely different.

          Like in C:

          total=0;

          j=0;

          for (int i=0; i<10; i++)

          ....total+=i;

          ....j++;

          1. Clive Galway

            Nothing like C.

            With C, the rule is that if the conditional has no braces, then the first line after the condition is in the block and nothing else.

            Whitespace in this case is utterly irrelevant

            1. Richard Plinston

              >> The code could then surely APPEAR to mean one thing, but in fact means something completely different.

              > With C, the rule is that if the conditional has no braces, then the first line after the condition is in the block and nothing else.

              Exactly. That is why my example in C was illustrating that "the code could then surely APPEAR to mean one thing [according to the indent], but in fact means something completely different."

    3. Flocke Kroes Silver badge

      @Clive Galway

      Could be worse. Imagine what could go wrong if a language allowed:

      ⎵⎵⎵⎵⎵⎵⎵⎵if (is_elephant(animal))

      ⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵if (is_white(animal))

      ⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵white_elephant(animal);

      ⎵⎵⎵⎵⎵⎵⎵⎵else

      ⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵⎵not_elephant(animal);
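      The Python rendering of the same logic, for contrast, has no dangling-else to mis-read, because the indentation is the block structure (the helper functions are stubs invented for illustration):

      def is_elephant(animal): return "elephant" in animal
      def is_white(animal): return "white" in animal
      def white_elephant(animal): print("white elephant:", animal)
      def not_elephant(animal): print("not an elephant:", animal)

      def classify(animal):
          if is_elephant(animal):
              if is_white(animal):
                  white_elephant(animal)
          else:
              not_elephant(animal)    # can only belong to the outer if, as written

      classify("white elephant")      # white elephant: white elephant
      classify("zebra")               # not an elephant: zebra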

    4. bombastic bob Silver badge
      Devil

      "Any language where whitespace dictates what is inside a conditional and what isn't (eg Python) needs to die a slow and painful death IMHO."

      I just think of it as 'Allman Style' without the curly braces

      https://en.wikipedia.org/wiki/Indent_style#Allman_style

      And if you put the ':' right after the control statement, it doesn't look a THING like K&R style (which I HATE)

      (and I also dislike hard-tabs, so multiple spaces are fine, and pluma does auto-indent, and highlights things in a readable manner)

  16. vincent himpe

    Write it out in mnemonics, translate it, by hand, from memory into opcodes (hex notation) and send it directly into unalterable masked read-only memory. Should work first time, right?

    Anything else is for wannabes.

    ps: use lots of absolute jumps (GOTO) and globals with hardcoded addresses, just to piss off the modular/reusable crowd (because such programs run much faster, since they don't have to waste CPU cycles pushing and popping stuff onto/off the stack and moving data).

    1. TheElder

      quote: use lots of absolute jumps (GOTO) and globals with hardcoded addresses, just to piss off the modular/reusable crowd (because such programs run much faster, since they don't have to waste CPU cycles pushing and popping stuff onto/off the stack and moving data).

      Heh. Exactly what I am doing. Ruby has crappy garbage collection. Unrolling some of the loops can also work well.

  17. TheElder

    Coding 54 years

    Started when 13 on a Bendix G-15. A machine with little glass tubes filled with nothing. Writing code that resembles BrainFuck. Grew up spitting distance from the Valley. Moved on to Fortran at UC Berkeley and also studied psychology since I wanted to get into AI. AI still doesn't exist.

    Moved on to BASIC but quickly dropped into machine code for the speed. Then figured out how to overclock everything. Designed and built bit mapped video cards before the concept existed. Rewrote the BASIC interpreter. Wrote a little program called DamBusters for the PET as well as some others. Published in Transactor. Worked on the computer side of Xerox and also met with Gates and others at the first Computer Fair in San Francisco. Most likely was watching Jobs helping himself to everything he could at PARC.

    Since then have worked with various languages but am mostly interested in anything to do with graphics. Python works well in that respect. Have helped to develop the AutoCAD 3D turbulence modeling. C is OK but don't need the speed for what I am now doing. Have figured out how to do brain mapping using Ruby along with some cool sonification using Sonic-Pi.

    The thing that gripes me the most is the FLAT everything. If it is a button it should look like a button. It is all about users. Thinking like a user isn't easy. I have been teaching users for many years. I developed the very first computer science course for Thompson Rivers University back in the early '80s.

    1. Jaybus

      Re: Coding 54 years

      "AI still doesn't exist"

      Of course not. It's to be expected, as natural intelligence barely exists and was some 4 billion years in development.

  18. Anonymous Coward
    Anonymous Coward

    Fermi/Fermat's last theorem ...

    I suspect that Fermat was wrong, and if we had a chance to see his much vaunted proof, we would have found it flawed.

  19. Marcus Fil
    Alien

    Fermi's last theorem

    I have discovered their truly marvelous location, which this margin is too narrow to describe.

  20. herman
    WTF?

    Easy to learn?

    Any programming language is easy to learn if you already know a bunch of other programming languages...

  21. Brian Allan 1

    Use the languages you know!

    No need to be elitist...

  22. Prndll

    meanwhile.....

    90% of the world is now writing apps.

    Java, Python, C++ ...........oh my!

    There's an app for that!

  23. venkatesanm

    Python will rule IT as a language

    Python will be the language of choice in almost every domain in IT, including cloud computing, artificial intelligence, machine learning, data analytics, IoT and DevOps. In fact, Python is the most preferred language for software test automation, mobile test automation, application development and IT infrastructure.

  24. Mothan

    Java has seen heavy use building enterprise applications; as such, there are a multitude of mature frameworks (Grails, Spring, etc.) and experienced developers to make use of. This means that Java can be used for a large variety of use cases, from small applications and single-page sites all the way through to large enterprise applications with lots and lots of data.

    PHP has seen use in a large variety of applications as well; it is well known that PHP powers a large share of applications on the web. Facebook used it in its earlier days, as did Wikipedia and WordPress, among many others. More recently, we’ve seen a lot of nice-looking frameworks become available, Laravel for one.

    Personally, I’m a big fan of Java and Ruby for web development. They provide the syntactical consistency that I enjoy working with, along with language features and frameworks that make web development a breeze. While PHP isn’t a bad option for web development, whether it's better or not is going to come down to the developers you have on hand and the type of application you intend to build.

    If you're interested, check out this comparison of the two languages; I hope it helps.

  25. the_north

    Java still the best and will be the best!

    A quick search on Dice.com shows that Java work is on offer in bulk. Where iOS has about 2,500 offers, Java has more than 17,000. Of course, one cannot rely completely on these numbers. But the fact that the market for Java on Dice.com is potentially seven times larger than for the most fashionable iOS suggests that “old Java” feels pretty good.

    Java certainly has its own problems. Java haters will continue to sputter and hammer at the keyboard, posting malicious comments on the Internet. The garbage collector can cause hiccups and jitter. Static typing is a chore and still cannot reject really bad code. Annotations are too complex. New Java features aren't arriving as fast as they used to. Braces add some confusion. The list goes on and on.

    However, none of the competing technologies has landed so widely and deeply on the shores of the IT industry, and some of Java's problems are fairly easy to fix.

    Another question is where and how to learn Java.

    Java is the primary language for Advanced Placement Computer Science (Advanced Placement (AP) is a curriculum and exam programme for high school students in the US). This means that Java is often a student's first programming language, and so it stays with them "both in sorrow and in joy".

    Let's talk about ways of learning Java. Who here is learning or studying it? It would be very interesting to hear. Maybe I can even offer a couple of practical training tips myself.

    1. Richard Plinston

      Re: Java still the best and will be the best!

      > If for iOS there are about 2500 offers, for Java it is more than 17000.

      You are basing the 'popularity' of a language on the number of empty desks?

      It may be that an iOS offer is filled quickly and thus the offer is taken down while the Java offer stays up for months and thus there are more of these at any one time.

      Just as with any religious dogma, support is sought out while counter-arguments are ignored.

      1. the_north

        Re: Java still the best and will be the best!

        These are actual offers. I think it makes no sense to argue about the superiority of Java over the iOS platform.

        Did you not consider that implementing Java requires more skills than iOS? Did you not consider that Java covers a wider range of tasks? iOS is one platform; Java is multi-platform. Hence the greater number of offers, and I don’t need to prove here that iOS offers fly off like hotcakes while Java offers go stale on the table.

    2. the_north

      After initial excitement and motivation, you gradually hit a brick wall (figuratively speaking) from time to time, which can be quite demotivating.

      Once I started Multithreading, several times I caught myself thinking "Why am I doing this?!" But you get through it, gradually grinding it out.

      I personally think motivation is not really the key for any beginner. You need discipline, more than anything, and as you get used to gradually getting through the hard topics, you gain more experience and when it finally clicks you feel on top of the world...and recharge your motivation in the process.

      I strongly believe it is also very important to have the right tools and information.

      So, let's talk about the books.

      My first one (not surprisingly) was "Head First Java".

      This book is for fans of an informal presentation of the material. When you read it, you get the impression that you are not studying, but just talking with friends.

      The next one was "Java: A Beginner's Guide" by H. Schildt.

      This is the best book for newbies. I could not read Head First for long; it is not my kind of book, and I got halfway through and realized I could go no further.

      In this book everything is structured and neatly laid out. The author managed to write a book that is neither dry nor padded. This is the best book for newbies!

      The legendary "Thinking in Java" by B. Eckel.

      I recommend this book after reading Head First or Schildt's book.

      If you are starting to learn Java but have experience in other programming languages, such as C++, then you can safely take up this book straight away.

      It can be read in a couple of nights, and you will know the Java core at a good level.

      I also have a personal shortlist of online resources, which I hope helps beginners:

      CodeGym.cc

      + : free, good design, a lot of practical tasks, game-like course geared for complete beginners, quick switch between light and dark themes.

      - : Java only website.

      Edabit.com

      + : free, interesting concept with a lot of “challenges” of various complexity, can add your own theory resources to each challenge.

      - : not for beginners, no theory apart from links from users to outside sources.

      Mooc.fi

      + : free, includes exercises/tasks, examples of code included in the theory, more advanced topics also included.

      - : reads a bit like a very long manual with no ‘back to top button’, too much white on the page so hard on the eyes after a while, not much theory.

      SoloLearn.com

      + : free, good design, step-by-step process, and test questions.

      - : very little theory, no proper tasks to cement the knowledge.

      Good luck, guys! I hope it will be helpful.

This topic is closed for new posts.

Other stories you might like