Python creator Guido van Rossum sys.exit()s as language overlord

Guido van Rossum – who created the Python programming language in 1989, was jokingly styled as its “benevolent dictator for life”, and ushered it to global ubiquity – has stepped down, and won’t appoint a successor. In a mailing list post on Thursday titled, “Transfer of Power,” he wrote: “Now that PEP 572 is done, I don't …

  1. Charlie Clark Silver badge

    Re: I like Python and C

    Some design decisions were, in hindsight, wrong. ‘print’ as a statement rather than a ‘print()’ expression, for example. Means you can’t write ‘[print(c) for c in x]’.

    I actually want my print statement back. Why on earth would I ever wrap a print call in a comprehension? Never ever felt even the urge to do this in the past. And if I ever did, two lines would be fine.
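For what it's worth, now that print() is a function in Python 3, the comprehension form does work, whether or not anyone wants it. A quick sketch, capturing stdout just to show what happens:

```python
import io
from contextlib import redirect_stdout

x = ["a", "b", "c"]

buf = io.StringIO()
with redirect_stdout(buf):
    # print() is an expression in Python 3, so this is legal,
    # though it builds a throwaway list of None return values
    result = [print(c) for c in x]

assert buf.getvalue() == "a\nb\nc\n"
assert result == [None, None, None]
```

As the commenter notes, a plain for loop over two lines is usually the clearer spelling.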

    Code that works on both 2 and 3 requires minimal changes if you can start with 3-style. Then it's really just unicode literals, ints instead of longs. It's a bit more work if you have extensions. All in all less work than changing a major component.

  2. jake Silver badge

    Re: I like Python and C

    "DOS’s 640k RAM limit"

    Assuming you mean the MS-DOS/PC-DOS twins, they didn't actually have a 640K limit. They could use all the contiguous memory that you could give them. Really. Look it up. The actual limit was in the hardware. IBM, in its infinite wisdom, decided to put the system ROM at the top of memory. Had they put it at the bottom, DOS would have been able to use as much memory as the CPU could address.

    And it wasn't really 640K, it was more like 704K, if you knew what you were doing. Later, memory tricks allowed up to about 720K, later still around 950K. I find it absolutely amazing that the "640K DOS limit" piece of incorrect trivia is still being parroted as fact after all these years ...

    On the other hand, I personally remember Steve Jobs saying that "128K ought to be enough for anybody", at a meeting of the Homebrew Computer Club in late 1983, as he was demonstrating the original 128K Mac, just before the public unveiling. At the time, he had a point ... people were running flight simulators in 64K!

  3. JLV Silver badge

    Re: I like Python and C

    Well, I didn't know that. But I'm not sure I agree with your viewpoint that DOS had nothing to do with it.

    It's not that I disagree with the hardware info you state, you know more than I do about it, apparently.

    However, I well remember mucking around with config.sys to get DOS to see enough RAM to run a flight sim on my first pc. So, the OS, which should have abstracted away the hardware specifics, wasn't quite up to that role. Ditto tons of memory manager utilities floating around to help out.

    Also, IIRC, things like QNX had no trouble running 4MB RAM workstations.

    Regardless of the actual causes, it's still one of the better known examples of (hardware or software) design issues sticking around like a wart.

    Lastly, "640K is enough for everyone" is, perhaps wrongly, attributed to Gates, so it's hardly surprising the meme would survive as a "DOS sin".

    https://www.computerworld.com/article/2534312/operating-systems/the--640k--quote-won-t-go-away----but-did-gates-really-say-it-.html

  4. Adrian Harvey
    Boffin

    Re: I like Python and C

    “IBM, in it's infinite wisdom, decided to put the system ROM at the top of memory.”

    I’m not sure the blame doesn’t sit with Intel there - the 8086 processor bootstrap begins by executing code at FFFF:0000 - right at the end of memory (for those too young to remember segmented addressing, that’s 16 bytes shy of the 1MB highest possible address on the 20-bit address bus). So you would have to have some ROM there to handle the bootstrap process. And putting the system ROM somewhere else in the memory map would probably have required a second chip or some custom part.

    For all I know it may sit further back in computing history than that....

  5. W@ldo

    Re: I like Python and C

    IBM owning a big chunk of a then faltering Intel gave us the joys of segment:offset memory addressing. That set us back years in assembly coding quality software. There were Zilog and Motorola chips around at the time that were much better and could handle direct memory addressing. We all paid a price for 15+ years until Intel reached that point.

    If you don't know how segment:offset works, take some time and you'll see the futility we all faced back in the day....I moved on to C, never embraced C++ and Python is the only interpreted language I became fond of. Too bad about the situation as it has been a fun ride.
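For readers who never met it, real-mode segment:offset addressing can be sketched in a few lines. The 16-byte segment granularity is why many different pairs alias the same physical address, and why the reset vector sits where it does:

```python
def linear(segment, offset):
    # Real-mode 8086: physical address = segment * 16 + offset,
    # wrapped to the 20-bit address bus (1 MB)
    return (segment * 16 + offset) % (1 << 20)

# The reset vector FFFF:0000 sits 16 bytes below the 1 MB top
assert linear(0xFFFF, 0x0000) == 0xFFFF0

# Many segment:offset pairs alias the same physical address
assert linear(0x0010, 0x0000) == linear(0x0000, 0x0100) == 0x100

# Addresses past 1 MB wrap around to zero on a 20-bit bus
assert linear(0xFFFF, 0x0010) == 0
```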

  6. Charlie Clark Silver badge

    Re: I like Python and C

    And it wasn't really 640K, it was more like 704K, if you knew what you were doing.

    Like running DOS inside OS/2…

  7. tfb Silver badge

    Re: I like Python and C

    The actual decision that was wrong about print was having statements. Expression languages have none of these problems. But I realise that that's far too radical an idea to ever succeed.

  8. jake Silver badge

    Re: I like Python and C

    JLV, if you configured your hardware such that the CGA memory space was replaced with RAM (CGA was at the bottom of the so-called "Upper Memory Area"), that 64K of RAM would be recognized by DOS automatically, giving you 704K of so-called "Lower Memory" without the use of third-party memory management tricks, or even any mucking about with config.sys or autoexec.bat.

    "So, the OS, which should have abstracted away the hardware specifics, wasn't quite up to that role."

    You are quite correct, DOS wasn't up to that kind of thing. But then DOS wasn't really an operating system, it was just a glorified program loader.

  9. jake Silver badge

    Re: I like Python and C

    True enough, Adrian Harvey. Note where I said "contiguous"?

    "And putting the system ROM somewhere else in the memory map would probably have required a second chip or some custom part."

    As happened starting with the 80286.

  10. jake Silver badge

    Re: I like Python and C

    W@ldo, IMO, Intel's biggest problem back in the day was a lack of MMU.

    Frankly, I never had an issue with the segmented address space. Every CPU has its quirks, some are more quirky than others. They all suck, but we use 'em anyway.

  11. jake Silver badge

    Re: I like Python and C

    Indeed, Charlie Clark :-)

  12. Charlie Clark Silver badge

    Re: I like Python and C

    Frankly, I never had an issue with the segmented address space.

    x86's memory addressing and "context switching" chained CPU performance to the 1970s for decades. Intel had admirable processes for a dreadful architecture. But, as with VHS over Beta (feel free to add your own examples), it's often not the best technology which succeeds initially. Eventually, however, the better technology is likely to be adopted.

  13. jake Silver badge

    Re: I like Python and C

    Rather clumsy wording on my part. What I meant was that I had no real issues with programming Intel's kludge of a flagship. It was there, it worked, we used it. Some were quite vocal in their displeasure, I just learned the faults & foibles & got on with it. Seemed easier than griping about a problem that was never going to go away.

  14. onefang

    Re: I like Python and C

    "Eventually, however, the better technology is likely to be adopted."

    You had me nodding my head in agreement up until this line. Marketing wins over good tech every time, and then the better tech disappears from view, coz the marketing worked so well. By the time the better tech is generally accepted as actually being better, both techs are obsolete, and the current tech winner is the new shiny with the better marketing.

    I think the reason is that the better tech knows they are better, and don't bother to market it so well, but the not so good tech knows it's not so good, so they pour money into good marketing to get their higher market share.

  15. Charlie Clark Silver badge

    Re: I like Python and C

    Marketing wins over good tech every time, and then the better tech disappears from view, coz the marketing worked so well.

    Tech that is "good enough" is likely to win through, but I think we've also seen examples of good technology winning through in the end. For example, similar to the VHS versus Beta battle, the Micro Channel architecture of the PS/2 was most definitely better than the ISA of the PC, but it wasn't that much better, plus IBM didn't want to repeat the BIOS problem so they didn't want to licence it. A few years later, when a replacement for ISA became essential, we got VESA local bus and PCI. VESA was first to market and the devices were cheaper, but PCI won out. Okay, Intel's backing did help. Intel was also behind the push of USB for everything but at some point jumped ship to back Apple's FireWire follow-up that is "USB in name only". But Intel came a cropper by betting on Sprint and WiMAX: not sufficiently good to get networks to dump LTE for it. The same goes for trying to convince the developers of embedded devices that x86 can do the same work as an ARM with the same power.

  16. Potemkine! Silver badge

    Whoever is not a misanthrope at forty can never have loved mankind

    Facing the stupidity, violence and hatred from the many despicable assholes who make up a big part of humankind is a harsh experience. It's getting even harsher with the so-called 'social' media, which are so good at propagating this stupidity, violence and hatred even further.

    I'm going back in the garage.

  17. Gordon 10 Silver badge

    Re: Whoever is not a misanthrope at forty can never have loved mankind

    Nice Weezer song, have an upvote.

  18. Anonymous Coward
    Anonymous Coward

    IIRC the use of ":=" for (all) assignments used to be the sensible choice since using the mathematical equality symbol for this was obviously inappropriate. But then people complained that not only was it two keypresses instead of one, but it also involved finding the shift key - adding "short hand" to programming languages has a long history.

  19. Wensleydale Cheese Silver badge

    Re introducing ":="

    "But then people complained that not only was it two keypresses instead of one, but it also involved finding the shift key - adding "short hand" to programming languages has a long history."

    On a recent personal project which involves the manual entry of lots of time stamps, I decided to drop the colon mainly because it requires shift. I simply use HHMM format rather than HH:MM.

    It's my data and if I want to use it in other applications which expect a colon, it's a simple enough matter to let the computer do the work (via an extra workflow step, if necessary).

    There, saved myself a lot of keyboard work, and a useful by-product is fewer typos which need correcting.
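Letting the computer re-insert the colon really is trivial. A sketch of that extra workflow step using the standard library, with the HHMM convention from the comment (which validates the entry as a bonus):

```python
from datetime import datetime

def hhmm_to_colon(stamp):
    """Convert a shift-free 'HHMM' entry back to 'HH:MM' (and validate it)."""
    # strptime rejects nonsense like "2570", catching typos at entry time
    return datetime.strptime(stamp, "%H%M").strftime("%H:%M")

assert hhmm_to_colon("0930") == "09:30"
assert hhmm_to_colon("2359") == "23:59"
```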

  20. Anonymous Coward
    Anonymous Coward

    Re: Re introducing ":="

    Isn't that the sort of thing which caused the year 2000 bug, saving all the keyboard work on not typing 19. Recently I came across some files which shouldn't of been destroyed in 1910 which they hadn't been, maybe because they didn't exist then.

    Your data so you can do want you want, I've just spent to much time dealing with other peoples and there typos, when there wasn't anything in place to highlight the typos when they happened.

    Anon of course

  21. FrankAlphaXII Silver badge
    FAIL

    Re: Re introducing ":="

    >>I've just spent to much time dealing with other peoples and there typos

    Instead of complaining about other people's typos, maybe you should worry more about your own, as your own sample of typing here isn't exactly a shining example of perfection.

  22. FeRDNYC

    Re: Re introducing ":="

    Two-digit years wasn't so much about typing as it was about data storage, back in a time when saving two bytes per date seemed like real economy. If it was just about typing, they could've allowed operators to type a two-digit year, but stored it as 19xx, in which case Y2K would've been a snooze. (The same way UNIX has always stored dates as integers — seconds since the epoch — for efficiency, and therefore UNIX/Linux were largely immune to the Y2K bug, at least in terms of system code.)

    Pascal did indeed use ':=' for assignment (and '=' for equality test), whereas C went with '=' and '=='. Which probably was about typing laziness regarding the shifted character.

    (Or possibly internationalization — did the ':' character appear on all early keyboard layouts? Does it appear on all of them now, for that matter? Obviously it's already required if you want to write Python code, so it's not an issue using it for ':='.)

  23. Anonymous Coward
    Anonymous Coward

    "Python can be a fraction of the number of lines as a program which does the same thing in C."

    Interesting... What are the top things people like about Python specifically? Does it have easy syntax, rich libraries of functionality, no overbearing pointers, headers, macros, virtual funcs, or memory management?

  24. Anonymous Coward
    Anonymous Coward

    Re: "Python can be a fraction of the number of lines as a program which does the same thing in C."

    Yes. And ease of use, and ubiquity across platforms, and the ability to do very rapid development, and built-in debugging/trace aids and...

  25. Flocke Kroes Silver badge

    Re: "Python can be a fraction of the number of lines as a program which does the same thing in C."

    Python has a buffer protocol that effectively wraps pointers and allows you to experience all the joys of debugging double free, use after free and memory leaks - if you want to.
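A tame illustration of the buffer protocol via memoryview (the genuinely dangerous pointer games the comment alludes to need a C extension or ctypes):

```python
buf = bytearray(b"hello world")
view = memoryview(buf)

# The view shares memory with buf: writing through it mutates buf in place
view[0:5] = b"HELLO"
assert bytes(buf) == b"HELLO world"

# Releasing the view is polite; using it afterwards raises ValueError
view.release()
```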

    Importing python modules is a bit like including a header file.

    You can emulate macros with string templates and the codeop module - if you want to.

    All python methods are virtual functions.

    What I like about python is that programming styles from different languages are (to a varying extent) supported. I can pick the style best suited to each part of the problem and the interpreter will not add pointless road blocks to send me in the direction Bjarne Stroustrup knows is the only possible way to solve a different problem.

  26. Anonymous Coward
    Anonymous Coward

    'programming styles from different languages are (to a varying extent) supported'

    Cheers!

    How does that actually work in practice, can you say more?

  27. JLV Silver badge

    Re: 'programming styles from different languages are (to a varying extent) supported'

    Different paradigms, rather than languages:

    Functional - the list comprehensions and iterators, maps, reduces, lambdas, all sorts of goodies (which I know little about).
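A few of those functional goodies in one place, as a sketch:

```python
from functools import reduce

nums = [1, 2, 3, 4, 5]

squares = [n * n for n in nums]                   # list comprehension
evens = list(filter(lambda n: n % 2 == 0, nums))  # filter + lambda
doubled = list(map(lambda n: n * 2, nums))        # map
total = reduce(lambda a, b: a + b, nums, 0)       # reduce with a seed value

assert squares == [1, 4, 9, 16, 25]
assert evens == [2, 4]
assert doubled == [2, 4, 6, 8, 10]
assert total == 15
```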

    Object Oriented - pretty much everything is an object, including classes themselves and functions/methods. There's a lot of depth in the data model that few people use. For example, classes are themselves objects with their class being a metaclass. That's useful, for some use cases, or for some people's programming preferences - I have a bit of a blind spot for them, they're an unused tool in my case. You can generate classes on the fly as well - say a class for each database table you are reading from.
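Generating a class on the fly, as described, uses the three-argument form of type(). A sketch with hypothetical table columns standing in for a real database schema:

```python
# Hypothetical columns, as if read from a database table's schema
columns = ("id", "name", "price")

# type(name, bases, namespace) builds a class at runtime
ProductRow = type("ProductRow", (object,), {"columns": columns})

row = ProductRow()
assert ProductRow.__name__ == "ProductRow"
assert row.columns == ("id", "name", "price")
assert isinstance(ProductRow, type)  # the class itself is an object of type 'type'
```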

    (one thing to beware of : mutable objects as attributes, at a class level or as default arguments, bites everyone sooner or later. self._list = [] looks like cls._list = [], but in the first case appending stuff self._list.append(hit) affects your instance, in the second all that class's instances.)
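The class-attribute trap from that parenthetical, made concrete:

```python
class Bad:
    _list = []           # one list object, shared by every instance

class Good:
    def __init__(self):
        self._list = []  # a fresh list per instance

a, b = Bad(), Bad()
a._list.append("hit")        # mutates the shared class-level list
assert b._list == ["hit"]    # b sees a's append: shared state

c, d = Good(), Good()
c._list.append("hit")
assert d._list == []         # d is unaffected
```

The same trap applies to mutable default arguments: `def f(items=[])` evaluates the list once, at definition time, not per call.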

    Procedural - if you want to write something with a main calling all sorts of functions, there's really nothing forcing you to use classes or objects as your building blocks - sticking to functions is perfectly permitted. Ditto avoiding list comprehensions.

    Multiplatform/scripting - rare is the case where you really have to worry about Windows vs Nix. os.path.join("foo","bar","zoom") will do the right thing on either, barring issues with Windows C:/D: drive names.
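A sketch of that portability, with the more modern pathlib spelling alongside:

```python
import os
from pathlib import Path

# os.path.join picks the right separator for the running platform
p = os.path.join("foo", "bar", "zoom")
assert p == os.sep.join(["foo", "bar", "zoom"])

# pathlib expresses the same idea with the / operator
assert str(Path("foo") / "bar" / "zoom") == p
```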

    Since functions are objects, you can assign any attribute - say, a template - to a function. The reasons why you might want to do this are not common, but it can be helpful at times. For example, I explicitly assign template file paths to webserver functions because it allows you to automatically introspect which urls use which templates.

    def f_view(**kwds):
        print(f_view.template % kwds)

    f_view.template = "my foo is: %(foo)s"

    f_view(foo=2, bar="1")

    All these tricks need to be sanity-checked against clarity - it is just as possible to write incomprehensible code in Python as in C!

  28. JLV Silver badge

    Re: 'programming styles from different languages are (to a varying extent) supported'

    things it doesn't do (I've probably missed some):

    - compile time checks - there are some basic syntax checks (dangling commas, bad indents, etc...), but nothing like a real compiler. It is an interpreted language, albeit a strongly typed one. Even the 3.x type annotations are more intended for 3rd party library parsing than real compile time type checking. That's a hard separation - you either want compile checks or you don't.

    - information hiding and encapsulation. There is no privacy as such to class and module attributes, though single underscore, _my_somewhat_private, by convention means non-public and double underscores, __my_almost_private, are obfuscated, but still accessible.
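What "obfuscated, but still accessible" means in practice: double-underscore names are mangled with the class name, nothing more.

```python
class Account:
    def __init__(self):
        self._balance = 100      # single underscore: non-public by convention only
        self.__secret = "pin"    # double underscore: mangled to _Account__secret

acct = Account()
assert acct._balance == 100              # nothing actually stops access
assert acct._Account__secret == "pin"    # mangling merely obfuscates
assert not hasattr(acct, "__secret")     # the unmangled name doesn't exist
```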

    - full-on threading. There's something called the Global Interpreter Lock (GIL) in the main (C-based) implementation of the language, which means only one thread executes Python bytecode at a time. It looks like full threading from the POV of the coder, but threads take turns on the CPU. Different ways exist to mitigate this, and it looks fine from the dev's POV, but it's still there.

    - speed. You can find cases of quick Python programs that compare fairly favorably to C alternatives, but that's just because the algos are not CPU-bound. Or they are, but the heavy lifting can be left to objects which are implemented in C. For example, the built-in hash maps are very clever and can often make a huge difference in speed, but they're C-based, not native Python. Ditto things like pandas or numpy, used in data science - the libs are all in C, but the dev need not care. Generally, Python knows full well that it can't do everything quickly and goes out of its way to facilitate interfacing to native compiled code.
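A small sketch of the "let the C-implemented builtin do the heavy lifting" point: a dict lookup and a pure-Python linear scan give the same answer, but the dict's hashing runs in C.

```python
# Build a lookup table once (a C-implemented hash map under the hood)
inventory = {f"sku{i}": i for i in range(1000)}

# Pure-Python linear scan over pairs: same answer, far more interpreter work
def scan(pairs, key):
    for k, v in pairs:
        if k == key:
            return v
    return None

assert inventory["sku500"] == 500
assert scan(list(inventory.items()), "sku500") == 500
```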

    Pure Python CPU-bound code? Slow. Writing a driver in Python? Not a great idea.

    - manual memory management. Which means you may experience the joys of garbage collection kicking in at inopportune moments.

  29. druck

    Re: 'programming styles from different languages are (to a varying extent) supported'

    Multi-threading is pretty useless in Python unless you are launching system commands, any real parallelism has to be done using multi-processing (multiple processes), and is far more difficult to get right. It's more akin to writing code for MPP systems in that all data has to be transferred between sub processes using slow mechanisms.

    On Linux, fork allows the sub-processes to get any existing data already calculated in the main program (although they can't modify anything not explicitly shared), but on Windows each sub-process starts with a clean sheet and all the data it needs must be sent to it. Unless you are doing a lot of processing with a small amount of data, there's a big risk that multiprocessing ends up slower than just using a single process.

  30. petef

    Larry Wall for BDFL

    Let Parrot flourish

  31. CAPS LOCK

    I hope he goes on to start a robotics company...

    ... general purpose ones...

  32. Paddy

    Change is inevitable.

    Thanks for your role as BDFL for so long. You did so well in helping to bring Python to where it is, and in fostering such a great community.

    Now I have to sit and think of what more I might need to/could do to help grow Python and its community.

  33. Someone Else Silver badge
    Coat

    Hmmmm...

    So, all you have to do is post a bunch of disagreements with der Führer to have a disruptive change in top level management? I guess it’s good that Guido isn’t leading Linux. (But I wish it was that easy to affect the "top management" in Washington....)

  34. JLV Silver badge

    Re: Hmmmm...

    yessssss, our first Godwin.

  35. jelabarre59 Silver badge
    Joke

    Fictional leader

    Well, I understand Miss Kobayashi is a crack Python programmer (according to this screenshot). You don't have to deal with a real person, and anyone who disagrees with her decisions will have to deal with the wrath of her maid/dragon.

  36. IGnatius T Foobar ✅

    This happened to Perl too...

    Actually, Guido wasn't the first to do this. Larry Wall stepped down some time ago, but no one was able to read his resignation letter.

    *rimshot*

  37. FeRDNYC

    Re: This happened to Perl too...

    ...I don't want to find that joke baked from low-hanging fruit funny ("Hey, this guy is finding humor in the common perception that Perl syntax is hard to read, do they have a Nobel Prize for Humor yet?") ... ... ... but dammit, it IS funny.

    There's More Than One Way To Earn An Upvote.

  38. Anonymous Coward
    Anonymous Coward

    Is this equivalent to the following, widely recognised as horrible in any C-like?

    if (a = f(b)) { ...

    rather than

    a = f(b);

    if (a) { ...

    Because if so, I'm absolutely on the side of the haters.

  39. FeRDNYC

    Re: Is this equivalent to the following, widely recognised as horrible in any C-like?

    Soooooorta, but you're also inside a for loop (which is what a list comprehension is, an expression evaluated for every member of a given list), so there's also a sense in which it's a shorthand for (in Perl):

    foreach my $x ( @input_data ) {
        my $y = f($x);
        push(@output_data, [$x, $y, $x/$y]) if ($y > 0);
    }

    Yes, the assignment is inside the if statement, but that's because you've already compressed the for loop into a one-line comprehension and there's really no way to locate it anywhere else. The point is to call f(x) only ONCE, assign it to y, and then output the tuple iff y > 0;
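With PEP 572, that Perl loop collapses into a single Python comprehension in which f(x) really is called only once per element (f and the data here are hypothetical stand-ins for the ones in the comment):

```python
def f(x):
    # hypothetical stand-in for the f() being discussed
    return x - 2

input_data = [1, 2, 3, 4, 5]

# y is bound once per element via :=, then reused in both the filter and the tuple
output_data = [(x, y, x / y) for x in input_data if (y := f(x)) > 0]

assert output_data == [(3, 1, 3.0), (4, 2, 2.0), (5, 3, 5 / 3)]
```

The filter clause runs before the output expression, which is what lets the `:=` binding be reused on the same line.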

  40. FeRDNYC

    Re: Is this equivalent to the following, widely recognised as horrible in any C-like?

    And, in fact, PEP 572 isn't JUST about list comprehensions. It adds in-expression assignment generally, a very common language pattern which Python previously lacked. So, while these two expressions are equivalent:

    foreach my $x (@input_data) { # Perl

    for x in input_data: # Python

    other common in-loop assignments such as (again, Perl):

    while (my $x = $parser->get_token()) {
        do_stuff_to_x;
    }

    had no direct syntactic Python equivalent. But with PEP 572, this can now be written in Python as:

    while x := parser.get_token():
        do_indented_stuff_to_x

    In-expression assignments like that are a common pattern. They can be abused, of course. But C is the king of "enough rope to hang yourself", so a feature being open to abuse isn't an argument against the feature. It's an argument against abusing it.

    (And I would argue that the list comprehension form does not count as abuse. There's a reason they went with := instead of =, and it was to avoid exactly the type of horrible C code you mentioned.)

    Because while

    if (a = f(b)) {

    is absolutely horrible C due to its "hidden" assignment, if the syntax were

    if (a := f(b)) {

    that would be far less problematic, because it's clearly different from a == test for equality.
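A runnable version of the while-loop pattern above (parser.get_token is hypothetical; a plain iterator with a None end-of-stream sentinel stands in here):

```python
tokens = iter(["if", "x", ":", None])  # None marks end-of-stream

seen = []
# The walrus binds and tests in one expression, just like the Perl idiom
while (tok := next(tokens)) is not None:
    seen.append(tok)

assert seen == ["if", "x", ":"]
```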

  41. mfabian

    It seems to me you would find it useful to read this: https://webcase.studio/blog/advantages-and-disadvantages-python-programming-language/

    You will see the pros and cons of the Python programming language compared with Java, C and C++, and understand why companies prefer Python.


Biting the hand that feeds IT © 1998–2018