* Posts by Kristian Walsh

1817 publicly visible posts • joined 10 Apr 2007

Apple redecorates its iPhone prison to appease Europe

Kristian Walsh Silver badge

Re: Apple is likely to try to make Safari more competitive

Much as I hate the “works on Chrome” argument too, the truth is that “web standards are web standards” supports them in this case. Safari on iOS simply is not up to date with modern standards. It’s deliberately crippled to guide publishers away from simply offering their services through their own website, and towards writing a native iOS app.

Kristian Walsh Silver badge

Re: The Law Has No Teeth

The “actual requirements”? The naïvety here is shocking. Laws cannot, and do not, attempt to cover every edge case of behaviour. When something is believed to be in violation of a law, but not explicitly forbidden by it, we have legal test cases: one party asserts that the behaviour is unlawful, the other defends that it is lawful, they show their supporting evidence, and a judge decides… by considering the evidence, and also the intention of the lawmakers and the general behaviours that they sought to restrain or support (or, if you will, the “spirit” of the legislation).

The commercial courts are filled with such cases. Apple won’t get a free pass, and they shouldn’t: their decision to enforce additional, arbitrary licensing fees on popular apps is very much against the idea of fair competition. We had similar cases in the past on telephony, with incumbents setting “carriage fees” structured to prevent competition, and they were found to be anti-competitive too. The EU has decades of experience, and decades of wins, in breaking up monopolies and anti-competitive practices - I wouldn’t bet against them.

Missed expectations, zero guidance: Tesla's 'great year' was anything but

Kristian Walsh Silver badge

To be honest, there are vintage car models that have better service and parts support than Tesla.

Tesla ignored life-cycle support so that it could spend its cash on sales growth. Now that it has a large customer base, it really needs to invest heavily in its repair and parts operation, because customers who lose the use of their car for a month while waiting for a simple part to arrive will remember that when choosing the brand of their next car. But that investment isn't happening. They're still chasing growth, and not paying enough attention to the customers who have trusted Tesla.

Anecdotally, I hear that while EV owners stick with EVs, Tesla is starting to look like a "one and done" brand, with owners moving up to models from other manufacturers.

Atari 400 makes a comeback in miniature form

Kristian Walsh Silver badge

Re: What's with that stupid "The 400" name on the case?

Image-search says you’re right. It was A1000s that had the tick. Early A500s had the chicken-head alone, and later ones had a different case pattern with a rectangular, rather than square, sticker that read “C= Commodore”.

Kristian Walsh Silver badge

Re: What's with that stupid "The 400" name on the case?

The logo they replaced was the Amiga “rainbow check-mark”, not the Commodore chicken-head.

Kristian Walsh Silver badge

Re: Side question: Why does the Atari 2600 have the cartridge slot built like at an angle like that?

The reason was simple: design and ergonomics. The 2600 was expected to be mounted either on top of the family TV, or on a shelf below it. A vertical cartridge slot would have made it harder to insert or remove cartridges - especially in the “shelf under the TV” situation. The NES has flat-loading cartridges for the same reason - being able to access them easily when the unit is stacked under a TV.

But the socket is not “built at an angle”. It’s mounted directly on the PCB, and then that PCB is tilted at an angle of 30° inside the case! (iFixit did a retrospective teardown of a 1980 2600 here: https://www.ifixit.com/Teardown/Atari+2600+Teardown/3541 )

Later Atari reissues of the 2600 had the cartridge opening pointing straight up, because the boards were mounted flat in the case.

Kristian Walsh Silver badge

Jay Miner designed the video hardware for both the Atari 400/800 and the Amiga. The Amiga chipset was originally developed with the aim of licensing the design back to Atari for a re-entry into the home gaming market. Atari (under Warners) had had a loose agreement with Amiga Corporation for first dibs on the results of their work, but with the collapse of that Atari, Amiga found themselves in desperate need of a customer... and they found Commodore. After a lawsuit or two, the whole thing was settled out of court, with much bad blood.

The origin as a console chipset explains the odd clock-rate of the Amiga’s CPU: it was chosen so that the video clock could be exactly divided down to the NTSC colour signal time-period, thus ensuring stable colour reproduction when attached to a domestic TV. And even the famous HAM mode of the Amiga was actually a quick RGB repurposing of what was originally a component-video mode allowing the luminance signal to be adjusted halfway through a TV colour-pixel (in composite video, the colour signal has exactly half the bandwidth of the luminance signal).
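
If you want to check the arithmetic, here’s a rough sketch in Python, using the commonly quoted NTSC figures (treat the exact values as from memory rather than gospel):

ntsc_subcarrier_hz = 315e6 / 88           # NTSC colour subcarrier, ~3.579545 MHz
amiga_master_hz    = 28_636_360           # NTSC Amiga master crystal, ~28.636 MHz
amiga_cpu_hz       = amiga_master_hz / 4  # 68000 clock, ~7.15909 MHz

print(amiga_master_hz / ntsc_subcarrier_hz)   # ~8: the master clock is 8x the subcarrier
print(amiga_cpu_hz / ntsc_subcarrier_hz)      # ~2: the CPU clock is 2x the subcarrier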

You’re absolutely right that spiritually Atari and Commodore exchanged places in 1984; Commodore, despite the C64, was a company that had traditionally targeted the business market, and they initially pushed Amiga as a workstation (the A1000). Atari, on the other hand, was entirely home-focused until the ST, especially in its hi-res monochrome monitor configuration, drew them into professional sales (especially in Europe). Meanwhile, once developers unearthed the gaming power of the Amiga’s chipset, the centre of Commodore’s business was pulled firmly into the home. Yes, Amigas were used for video production work in the 1980s, but there were more STs being used for “work” than Amigas.

Kristian Walsh Silver badge

Re: US-centricity and the significantly different UK and European markets of the 80s and early 90s

Yes, Europe was very different, largely due to timing. Nintendo launched the NES in the USA first, in mid-to-late 1985, but European sales were a phased roll-out from late 1986 to 1987. That’s a long time in the technology of the mid-1980s.

In terms of console competition, Sega’s Master System launched a year later than Nintendo in the USA, but pretty much at the same time as NES in Europe. But the real competition wasn’t other consoles. Europe also had a stronger 8-bit gaming market than the USA, and the popular 8-bit home computers had their development costs paid by 1987, which meant they could compete strongly with the consoles on purchase price. Consoles also suffered in two other ways: first, a computer was a far easier sell to parents than a system designed entirely for games; and second, cost of software. Even if you didn’t copy your friends’ game tapes, computers had much cheaper games. By the latter half of the 1980s, a top-tier computer release was £10-12, but there were many budget games titles available to buy at around £2, while console cartridges ran to £25-35.

That trend continued into the 16-bit era, with Amiga and Atari ST sales in Europe dwarfing those in the USA, again because European consumers wanted a single device that their kids could use for games and other, more educationally-worthy, purposes too: parents would rather spend more on an Amiga than buy their kids the much cheaper SNES or MegaDrive. Of my teenage friends who played video games back in the late 1980s, I don’t think any one of them had a console.

How Sinclair's QL computer outshined Apple's Macintosh against all odds

Kristian Walsh Silver badge

Re: outSHONE

The reason there were no 68060 Macs is that Apple, Motorola and IBM had formed the PowerPC Alliance in 1991 to promote that new family of microprocessors, and Apple’s engineering staff was busy developing PowerPC chipsets (yes, Apple used to do their own support ICs back then). The 68060 would have been well underway by this point, and I suspect Motorola let it run to completion as a hedge against PPC being a total flop (the 601 was not a Motorola chip, but IBM’s design).

The 68060 was a faster CPU than the P5 Pentium that had come out six months before it, and it was at least as fast, MHz-for-MHz, as the first-generation PPC, the 601. A rumour at the time said Mac hardware upgrade specialist DayStar had planned a 66MHz 68060 upgrade card for older Macs with Apple’s “Processor Direct Slot”, but Apple refused to license the necessary ROM changes because they were afraid that the combination was fast enough to steal sales from the new PowerMac line.

I would 100% believe that story - the PPC601 machines ran an OS that was about 75% emulated code. Given that the 060 could match the 601 on native code, it would have absolutely smoked the PowerMac, and it would also have run QuarkXPress properly – the only application on Apple’s “must-support” list that broke the 680x0-to-PPC code-translator!

Kristian Walsh Silver badge

Re: outSHONE

The “-ed” also functions to distinguish between different meanings of what looks like the same verb.

“Let us hang the Constable in the drawing room” is either an incitement to the murder of a serving police officer or a suggestion for decorating a wealthy home. When you review the day’s work using “Hung in the drawing room” versus “Hanged in the drawing room”, you are removing all doubt.

The 'nothing-happened' Y2K bug – how the IT industry worked overtime to save world's computers

Kristian Walsh Silver badge

And if you guessed “(R, Texas)” then congratulations, but don’t get too big-headed about it - it was pretty obvious.

Kristian Walsh Silver badge

Do bomb-disposal teams get this bullshit too?

I mean, who’s to know if the “bomb” was ever “live” in the first place. And isn’t it funny how these “bomb” ‘“disposal”’ “‘“experts”’” are always so quick to arrive whenever a ‘“‘“bomb”’”’ is found... but nothing ever explodes, does it?

Whole thing is a scam. Next time I dig up a WW2 munition, I’m going to give it a good shake and a rap with a hammer. That’ll show them...

NASA, Lockheed Martin reveal subtly supersonic X-59 plane

Kristian Walsh Silver badge

And fuel cost per passenger would be...?

What killed Concorde, more than the sonic boom (which cynics may be forgiven for thinking was an objection over-egged by the US at the behest of Boeing, whose supersonic plane was such a catastrophe), was fuel costs and the democratisation of air travel. Concorde could seat about 100 passengers, already on the low side when designed, and its faster speed couldn't beat 300+ passenger 747s for throughput; airlines make money per passenger, not on travel time.

The other problem was efficiency: Concorde burned three times the fuel of an equivalent subsonic jet. That was okay with pre-Oil Crisis pricing, but once fuel became a significant share of costs, Concorde was never going to thrive.

Unless those shortcomings are fixed by this plane, it will be nothing more than a tech demo.

It's uncertain where personal technology is heading, but judging from CES, it smells

Kristian Walsh Silver badge

Re: Very big indeed

I suspect they hid it inside an atom... SI prefixes are lowercase only for magnitudes below 1.0 (the exceptions being da, h and k, which pre-date the SI).

Silicon Valley weirdo's quest to dodge death – yours for $333 a month

Kristian Walsh Silver badge

In fairness, living in Silicon Valley drastically reduces one’s chances of being flattened by a bus...

X's 2024 plans include peer-to-peer payments in app push

Kristian Walsh Silver badge

Re: But why?

There’s one fundamental difference that makes Twitter a bad candidate for this. All the “everything apps” you cited started as Instant Messenger systems, and in an IM app, you choose the accounts you see, and they tend to be heavily weighted toward people you have a real-world relationship with: friends, family, co-workers, team members, customers, suppliers, whatever.

Twitter/X, by contrast, has much more of a “random strangers” model for communications: even if you ignore the shit-cannon that is “For You”, most people’s Twitter roll contains people that they have no other interactions with except that they follow them on Twitter.

That lack of real-life contact puts Twitter on a far lower level of trust than an IM app. Even if I still maintained an account there, I wouldn’t have any need to send money to anyone I interacted with on Twitter - either I know the person already and can do it via a more secure channel, or they’re someone I’d have no reason to ever give money to.

One use-case does come to mind, though, especially in a year evenly divisible by four: political donations. Luckily, Elon Errolovich has completely eradicated all bots and automated accounts, so it should now be perfectly fine to allow, say, the Trump campaign to collect campaign finance via X, safe in the knowledge that the money can only be small amounts given by individual donors, rather than the kind of very large donations from rich backers that would otherwise attract (entirely warranted) scrutiny...

CLIs are simply wizard at character building. Let’s not keep them to ourselves

Kristian Walsh Silver badge

Re: -h or --help

No, your example isn’t right. If apt had followed the Unix pattern, the syntax would be something like “apt --purge <packagename>” and “apt --install <packagename>”: one tool, one task, but you select the action using options, not magic-word “sub-commands”. (Compare with how “tar” both archives and unarchives files).
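
To make the distinction concrete, here’s a rough sketch of the two styles using Python’s argparse - a made-up “pkg” tool, nothing to do with the real apt:

import argparse

# Style 1: "magic-word" sub-commands, e.g.  pkg install hello
def subcommand_style(argv):
    parser = argparse.ArgumentParser(prog="pkg")
    sub = parser.add_subparsers(dest="action", required=True)
    for action in ("install", "purge"):
        sub.add_parser(action).add_argument("package")
    return parser.parse_args(argv)

# Style 2: one tool, one task, action chosen by options, e.g.  pkg --purge hello
def option_style(argv):
    parser = argparse.ArgumentParser(prog="pkg")
    actions = parser.add_mutually_exclusive_group(required=True)
    actions.add_argument("--install", action="store_true")
    actions.add_argument("--purge", action="store_true")
    parser.add_argument("package")
    return parser.parse_args(argv)

print(subcommand_style(["install", "hello"]))   # action='install', package='hello'
print(option_style(["--purge", "hello"]))       # purge=True, package='hello'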

“apt” isn’t the big villain here, though. Look at “ip” and how many things it does, and how different the syntax gets depending on where you are.

One of the unfortunate side-effects of the trend for sub-commands is that they tend to be chosen from a misguided desire to make the whole command invocation read like an English sentence. That’s a fool’s errand (you quickly get to a function that cannot be cutely expressed like this) that also makes the system less usable for people whose first language is not English. Unix did not have this problem to the same extent: the commands were odd abbreviations, and knowing that “ls” was short for “list” wasn’t really much of a help in learning the system (I, a native English speaker, actually only learned that particular command origin a year or so after I’d already been using Unix)

Kristian Walsh Silver badge

Re: GUIs were and are intended to demystify the computer

Yep. It silently changed in 1997 with MacOS 8.0. If you clicked and held a menu title or a popup-menu control, you got the old behaviour (release to select), but if you just clicked, you got the new, industry-standard behaviour instead (click again to select).

The original feature itself had poor usability, but what existed after MacOS 8.0 is actually an example of good UI, in that it allows you to discover different ways of accomplishing the same thing. When it changed, people who were used to the old way of using menus on Macs didn’t even realise it had changed, while people who were unfamiliar with Macs never knew that Mac menus had ever worked differently from those on other windowing systems.

(Bonus shout-out to Atari GEM with its drop-down menus, which automatically appeared on mouse-over... I can’t decide if they’re better or worse than the Mac pull-down variety)

Kristian Walsh Silver badge

Re: NeXTSTEP

It’s well documented that Steve Jobs hated text-driven interfaces (or at least he did once he’d visited Xerox Palo Alto). The Macintosh project, as led by Jef Raskin, was originally a keyboard-controlled product; Lisa* was the mouse-and-graphics computer that would lead Apple into the 1980s. But when Lisa bombed in the marketplace, Steve Jobs effectively took over the Macintosh development project and turned it into a cheaper Lisa (largely by following all of the advice he had ignored during the Lisa’s development).

Jef Raskin’s original vision for Macintosh became the Canon Cat - a very different kind of computer, entirely driven by the keyboard (but with a couple of very powerful additional function keys): it had a longer learning-curve than a mouse-driven UI, but it’s the sort of thing that you could become extremely proficient at.

At NeXT, the command-line came for free due to the OS being built on top of BSD/Mach, and in any case, the target market for NeXT wasn’t the same as for Macintosh. The Mac was a general-purpose home/office computer, but the NeXT computer was to be a high-performance workstation for scientific research: in that market, shipping without support for Unix made no sense.

__

* it takes a special kind of personality to deny being the father of a baby, and then use that same child’s name for the most important project in your company - but that was Steve Jobs...

Kristian Walsh Silver badge

Re: -h or --help

I usually use “-?”, as I’m afraid that someone will decide that -h is a synonym for “--hangup”, “--hard-delete”, or something equally destructive, but yes, that does annoy me.

The problem with tools like npm (or apt, git, or ip) is that they’re just dispatchers to other commands: the first argument is the actual command that you’re asking for. That’s against the basic philosophy of Unix: “one command, one function”, but it has become the norm for modern tools, and we’re stuck with it now.

Kristian Walsh Silver badge

Re: Both have their places

Have you seen the man-pages for modern Linux tools?

Actually, even the Unix ones aren’t great these days: once tools gained more options, it became fairly clear that the “Examples” section should have been the second one after the basic description, with the most common use-cases listed there. In too many pages, “Examples” is an afterthought, but it’s the one place that shows you how to use the tool to do something useful (the best set of examples on a man-page I’ve seen recently is for the hex parser/display tool xxd).

Kristian Walsh Silver badge

Re: GUIs were and are intended to demystify the computer

The problem was that inexperienced users would have to keep looking at their hands to ensure they pressed the correct button, thus breaking their gaze. It has nothing to do with the consistency of functions or colour-coding - and colour-coding isn’t useful if it makes you break your gaze to see which colour button you’re pressing. The Smalltalk environment’s use of two different menus on two different buttons isn’t that good a UI: your buttons are: “poke this”, “a menu” and ... “a menu again, but a different one”. Fine when you know there’s a distinction between contextual actions and system actions, but new users need to learn this.

Apple solved the “small targets for menus” problem by putting the menu-bar at the top of the screen - the boundaries of the screen are, in effect, infinitely-sized when you use a mouse. Apple’s other very good idea was to make the commands that were buried in menus directly invokable by simple keyboard shortcuts, and then standardising those common key-commands across all applications: other systems allowed you to navigate the menus by keyboard (= more keystrokes to get what you wanted), but never tried to enforce any kind of standard for keyboard commands.

However, as noted in a post below this, Apple failed by not ditching the “double-click” action, which is the most difficult gesture to learn with a mouse - the default double-click time was just too short for new users, but because of how the shell was written, making the interval too long would have made other parts of Finder unresponsive to clicks. (The other hard-to-learn mouse action on Macs, later copied by Windows, was renaming a file: click on the file name, hold for just long enough, and hope that it causes the label to turn into an editable field)

The comparison between Smalltalk and desktop PCs isn’t really valid: Smalltalk was by its own definition a tool for experienced computer users - as much as it pretends not to be, it is a programming environment, not an application. The desktop computers of the 1980s on the other hand, needed to be far more accessible out-of-the-box. The Macintosh in particular was designed as a computer that would be used primarily at home: you got a manual with it, but there would be no friendly co-worker at the next desk to help you get to grips with the system, let alone an administrator - the idea that there would always be an “IT department” or “Admin user” to fix things or do difficult configuration tasks is the single biggest factor that prevented Windows getting to the level of “it just works” that you saw in the 1980s-1990s Macintosh.

Kristian Walsh Silver badge

Re: training

Your argument works against CLI tools too. Features often get added to commands along with inconsistent ways of enabling them: lots of projects commit the sin of not reviewing the interface to a new feature, only its code. There is definitely a failure to understand that User Interface also applies to CLIs, and that the same principles of being able to reapply things you learned already to a new tool should be followed. But once an invocation method is in place, it’s there for good, and you end up with batshit-crazy commands like ip or git whose options and submodes form a contradictory ball of crap that you just have to memorise (or keep looking up on StackOverflow).

Linux throws this into sharp relief because it has the worst examples of CLI tool design (e.g., anything networking related) installed beside the best (e.g., the GNU coreutils, which have carefully preserved the principles of the Unix userspace tools while adding new features)

Kristian Walsh Silver badge

Re: GUIs were and are intended to demystify the computer

You keep saying “Unix” when you mean “X11” there. Unix has no GUI, and therefore has no mouse.

The three-button mouse was dropped because it was hard to use. It was the one part of the Xerox Alto design (inspiration for Macintosh) that consistently confused users. Even when test users had realised that there were three buttons, shifting their grip on the mouse often resulted in mis-clicking as their fingers shifted to cover different buttons. Add to this that the third finger is also less dextrous than the first two, which made the right button slightly harder to use, and some kind of simplification was always going to happen. A single-button design has the advantage of no possibility of confusion, so the learning-curve is made easier, but it does limit the usefulness of the mouse once the user has become accustomed to using it. The two button design is a compromise that allows for additional function while solving the three-button device’s problems - not just Microsoft, but almost every other vendor of mouse-controlled PCs of the mid-1980s used a two button device; the only personal computer of this era I can think of that shipped with a three-button mouse was Acorn’s Archimedes (now more famous for being powered by the very first ARM CPU).

Apple stuck with a one-button mouse far longer than it should have (resulting in the awkward “control-click” invocation for contextual menus), but it’s 18 years since Apple finally offered a two-button mouse with its computers.

I’m surprised that the bigger ergonomic failure of the original Mac hasn’t been mentioned: pull-down menus. Before MacOS 8 gave you an option, on old Macintoshes, in order to select from a menu, you had to move to the menu bar, press the button, and then keep it held so that the menu would stay visible until you reached the item you wished to select, and then you released the button to select that option. [Contrast with the current behaviour: click menu title, move mouse to desired item, click again to select]. “Pull-down” worked okay for short menus, but it got progressively more difficult to navigate for long ones, and became an ergonomic nightmare when System 7 introduced hierarchical submenus. For users with limited mobility or strength, this one feature made Macs almost impossible to use.

EU launches investigation into X under Digital Services Act

Kristian Walsh Silver badge

Re: Ministry of Truth

No, it doesn’t.

Hate speech has a very clear legal definition. Exact laws differ, but in my country, it can be summarised as “public communication with the intention or likelihood of being threatening or abusive and likely to stir up hatred against people because of their origin, appearance, religious beliefs, sex, or sexuality”. That isn’t the sort of subject you can accidentally stray into during a normal conversation.

If you have difficulties with social interaction, I can understand that: I’ve worked in IT long enough to have encountered many such people. But you and I both know there’s a big difference between committing a faux-pas and wishing harm on a category of people. And yes, people can be thin-skinned and sometimes take offense where genuinely none is intended, but that’s life. It doesn’t remove the obligation on all of us to not be a dick.

Kristian Walsh Silver badge

Re: Ministry of Truth

Good to see such diligence and care after making a decision to use an untested treatment based on nothing but hearsay on the internet.

This might hurt a little, but let’s apply some logic to the observed facts, shall we?

A drug as cheap as Ivermectin was not recommended for treating Covid-19. Other drugs, such as corticosteroids, were. Ivermectin was investigated, but the results showed it to be of no significant benefit as a treatment, and no benefit at all as a preventative. Its poor side-effect profile makes Ivermectin a bad option to prescribe on the off-chance that it might do good.

Ivermectin is a drug from the 1970s, is easy to synthesise, is out of patent, and can be made as a generic by anyone. Lots of small, independent drug companies could have made a fortune off this… if it worked.

So, what’s the more likely reason for the medicine not being made widely available for Covid-19? a. It didn’t work ; b. Vague conspiracy about big drug companies, Bill Gates, the Government, and probably Jews as well (because sooner or later you lot always end up shitting on the Jews…)

Kristian Walsh Silver badge

Re: Ministry of Truth

Oh that’s all right then. Only 1% of people die.

If you want to parade your utter innumeracy to the world, saying things like that is a good way to do it. What’s the argument you’re trying to make here? That one is a small number, so actually hardly anybody died of Covid?

What about “less than 1%” of 24 million, that being the number of Covid cases recorded in the UK? Is 230,000 still a small enough number for you to make idiotic statements about it not being a real thing? As you’ve already shown you lack the ability to reason about magnitudes, here’s a hint for calibration: deaths from traffic accidents in the UK for the same 3-year period as the Covid pandemic would be about 6,000 people; for deaths from violent crime, the total is around 2,000 over that period. (Fatality figures for the USA, with a population of around 5x the UK’s: Covid=110 million cases, 1.2 million deaths, Traffic=120,000*, Homicide=45,000)

As a society, we put huge resources into reducing traffic deaths and deaths from violent crime. But having twenty times more people die from something that’s now preventable? Nah.. let’s let them die because fools with only a half understanding of logical reasoning think it’s a fucking conspiracy...

__

* Yes, that is the right number of zeroes.

Kristian Walsh Silver badge

You know the article made that very same pun, right?

England's village green hydrogen dream in tatters

Kristian Walsh Silver badge

Re: The EU...

To me, the EU nations’ plan shows evidence that they looked into how hydrogen could actually work in reality: they have planned a large-scale bulk distribution pipeline to carry the gas efficiently from where it’s cheap to make to where it will be burned by electricity generators, and they are now proceeding with building that, even if it will take nearly a decade to complete.

In contrast, the UK’s plan - feeding hydrogen into an existing infrastructure that was designed for natural gas (and whose state of maintenance is largely unknown) so that it can be burned domestically for heating - has all the hallmarks of a plan conceived simply to show “something” was being done before the next election comes around.

Tesla to remote patch 2M vehicles after damning Autopilot safety probe

Kristian Walsh Silver badge

Re: Autopilot may undermine the effectiveness of the driver's supervision

You’ve hit the root of the problem - it’s inconsistent no matter what you do: do you keep Left and Right aligned to the wheel (and thus your hands), or to their orientation within the cabin? If you do decide to flip the meaning, there has to be a point (at about 90°) where that changeover happens, which itself makes the system unpredictable (you cannot add a dead zone in the middle, because the rules say that the indicators always have to work in a car). If you don’t make the sense flip, then the driver has got to mentally map “left” to their right hand when making turns (or worse - they have to look at the wheel). I know first-hand that this is error-prone: my car has radio controls on the rear surface of the steering wheel - left hand side changes source/track/station, right hand side does mute/volume - and I have regularly pressed the wrong one when trying to mute or turn down the radio while steering.

The obvious accident case here is when you’ve parked at the side of the road, with the wheel turned 180° due to the parking manoeuvre, you hit the road-side indicator button, and then you pull out. Indicating toward the kerb is not how you tell drivers already on the road that you want to join traffic... in some places it gives the exact opposite message.

This is nothing but more cost-cutting by Tesla - mounting all the controls on the steering wheel means there’s no need to bring signal/power lines out to the stalks. It’s the same reason why the Model 3 has no instrument binnacle, and why Teslas don’t have secondary controls (e.g., air conditioning) on buttons, despite the proven safety advantages of physical controls allowing the driver to operate them by feel without needing to look away from the road.

Kristian Walsh Silver badge

Re: Autopilot may undermine the effectiveness of the driver's supervision

Holy shit, you’re right.. they actually found a way to make the cabin controls even worse in a Tesla. I remember the Model 3 indicators being the inferior “return-to-centre” type, but they were definitely mounted on the steering column and did not rotate with the wheel; on the latest cars, they have indeed been replaced by buttons on the fucking steering wheel.

To quote TopGear magazine: “Sounds like a terrible idea in theory, and having sampled the system on the Model S Plaid, [we] found it to be a terrible idea in practice too. You have been warned.”

I hope they’ve got good lawyers at Tesla..

Kristian Walsh Silver badge

Re: a sticky brown smear

Your quote is from Mark Forsyth’s book “The Elements of Eloquence”.

As a native speaker, I find that “a brown sticky smear” is a totally natural construction, and is a more humorous formulation than the canonical “sticky brown smear”. The placement of the word “sticky” after a colour determiner tips you off that it is functioning as a euphemism for the “material” part of the phrase, and that’s what makes it work.

Humour frequently bends or breaks grammatical patterns, and this is one example of that. The greatest humourists in English frequently mess with the strongest rules in English for comedic effect: P. G. Wodehouse did it so naturally that you don’t even notice it; Douglas Adams and Terry Pratchett were more obvious about bending the rules, sometimes to near destruction, but no less funny when doing so.

Kristian Walsh Silver badge

Re: a sticky brown smear

Capitals are important too: “Excuse me while I help my Uncle Jack off his horse!” expresses a sentiment that’s a lot more wholesome than the lowercase equivalent...

Kristian Walsh Silver badge

Re: "recall"

LIDAR is a technology, it’s not a company. Many manufacturers produce LIDAR systems, most of those are broad-range auto parts manufacturers like Bosch, Aptiv, etc. LIDAR is desirable because of what the R stands for. It is entirely valid to bring up a readily-available technology that would have avoided a large number of Tesla’s autonomous driving incidents given that so many of them had a failure to detect distance to hazard as a contributing factor.

On your other whine, yes, Tesla’s fire rate is lower than ICEs - so is any EV’s. The grown-ups compare EVs to EVs, and there, the picture is less rosy. Tesla’s choice of battery chemistry comes with longer range and a higher inherent fire risk. That is a fact. They may take steps to mitigate that risk, but the baseline risk is higher. (this is true for the NCA batteries used in most Teslas, not the LFP batteries fitted to lower-range models; LFP is the least combustible of Li-Ion chemistries)

But really, with a population of only 2 million vehicles, they found 350 accidents significant enough to be included in the investigation (i.e., damage to vehicle or harm to occupant) where “Autopilot” was a factor. That’s actually a pretty high number for anything with safety-of-life implications. Especially on a vehicle fleet of relatively new cars, that spends most of its time on suburban multi-lane divided highways.

But whatever... who am I to spoil someone’s dream: you keep on sucking Elon’s teat - I’m sure he’ll tweet you a discount code for a Tesla!

China's Loongson debuts processor that 'matches Intel silicon circa 2020'

Kristian Walsh Silver badge

Re: Fake benchmarks though

If I’m not mistaken, the kind of kids you refer to, who regularly win those national math competitions in America are… American.

ASML joins with Samsung, SK hynix, for chip research lab in South Korea

Kristian Walsh Silver badge

Re: What's the Problem?

Not smarter, they just had the advantage of living in a much more stable society for a long period when Europeans were more occupied with constantly fighting each other. While it’s true that war accelerates invention, repeated conflicts erode the society’s ability to support long-term research and progress.

Whenever anywhere in Europe had a period of sustained peace, clever people were able to pop up with all sorts of interesting things too (clockwork, optics, combustion engines); and China’s regression in the 20th century coincided with its half-century of revolution, civil-war, forced relocation and internal purges.

The belief that any one nation or another is cleverer is just racism dressed up as praise. Nations end up being “good” at things mainly due to geology, geography or history... not DNA.

Microsoft floats bringing a text editor back to the CLI

Kristian Walsh Silver badge

GPL isn’t an issue - any editor would be a self-contained binary that would ship with Windows in order to be invoked from the CLI. Dynamic linking against a binary library does not require that library to be GPL-licensed, so it’s fine to link against the MS C runtime library, for instance. As long as Microsoft publishes the modified source of the version of Nano they built, they’re compliant with the GPL.

But really, licensing is not the reason why Nano is a bad idea.... I see someone else has pointed out how antagonistic its keyboard commands are to someone familiar with a system where O means Open, X means cut, and W means Close.

Vi mightn’t be “friendly” but at least it doesn’t turn your muscle-memory into a weapon against you.

AMD thinks it can solve the power/heat problem with chiplets and code

Kristian Walsh Silver badge

Re: Glad to see

To answer your question, Windows 11, at 3.5 Gbyte, would require 2,431 floppy discs... or 5 to 6 CD-ROMs.
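
Back-of-the-envelope, in Python, assuming decimal gigabytes and the nominal media capacities:

win11_bytes  = 3.5e9      # "3.5 Gbyte", taken as decimal
floppy_bytes = 1.44e6     # nominal "1.44 MB" floppy capacity
cd_bytes     = 700e6      # CD-ROMs hold roughly 650-700 MB

print(win11_bytes / floppy_bytes)   # ~2431 floppy discs
print(win11_bytes / cd_bytes)       # 5 CDs at 700 MB (6 at 650 MB)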

The main reason for what you call “bloat” really is additional functionality, plus resources for multiple languages and higher-definition displays. Yes, NT 3.5 came on a handful of disks (the Workstation installer is 25 Mbyte - less than 1% of the size of a Windows 11 install), but: it only just had a TCP/IP stack (built-in TCP was an advertised feature of 3.5!), had no SSH, no web-browser, no support for GPU acceleration (not even a thing back then), no management features, limited graphics drivers, limited command-line shell tools, no assistive technologies, no resources for languages other than English, a handful of fonts (all bitmapped, all Latin-only), and the list goes on...

Then there’s resources: I remember back in 2001, a friend noting that the 256x256 RGBA icon file he had just included in his MacOS X app bundle was larger than the entire system ROM of the original Macintosh (in the System 6 days, the Macintosh OS was mostly in ROM; the System disk added the localisable resources and additional machine-specific drivers). Higher-resolution displays with more colours mean every graphical element gets bigger.

The install isn’t what’s loaded, though. Resources and libraries don’t get loaded until they’re actually needed, so startup times haven’t grown much between 7 and 11: you can find out how fast Windows 7 loads on modern hardware by using a VM - it really isn’t significantly faster than 10/11. If you want to see just how I/O-bound the OS startup process is, load something like Win 3.1 or Mac System 7.x from SSD on a modern emulator. Back in the days of MacOS 8 on spinning disks, I used to turn on my Mac in the morning, then go to the coffee machine, because there was no point in staring at icons appearing along the bottom of the screen for two minutes. I launched the same OS on an emulator recently, and the whole thing sprang to life in three seconds!

Kristian Walsh Silver badge

Re: "as Moore's Law has slowed and more power is required to push higher performance"

He was slightly out, but not by much - and for a technology prediction made in the mid-1960s, the accuracy is astonishing. The first RAM chips to break the 65,000-transistor barrier came not in 1975, but in 1977. The first >65k-gate CPUs arrived in 1979 with the Motorola 68000 (the clue to the transistor count is in the name).

In 1975, memories were running around 16k gates, and CPUs were at around 5k gates. CPUs lagged behind the “Moore’s Law” curve through the late 1970s and early 1980s, and only started to follow that pattern of doubling transistor counts every year once on-board cache became commonplace from the mid 1980s.

The reason CPUs had fewer gates is that much more of a CPU is active at any given time than in a memory, and thus you need to space things out a bit more so that heat doesn’t become an issue.

Systemd 255 is here with improved UKI support

Kristian Walsh Silver badge

Good use for systemd...

In a tech interview I did once, I was asked the question “What is systemd?”

It turned out later that the question was irrelevant for the work - I can’t even remember if we had a systemd-equipped distro on the hardware. What really mattered was the candidate’s reaction to being asked. Anyone taking a very strong position on the matter was immediately docked points. After I was hired, I interviewed one candidate who very much nailed their coffin shut with a long rant in response to this very question, so it definitely had value as an interviewing tool. That person was already an unlikely hire due to other weaknesses on the simple programming element, but this sealed the deal.

Kristian Walsh Silver badge

Re: /usr

/Users came to Apple from NeXT. The distinction mattered there, but is moot on Macs because Apple’s filesystems are case-preserving, but case insensitive. This is because they treat the directory entries as text (with a filesystem-defined encoding), rather than a sequence of bytes (with encoding left as a matter for client libraries to impose). Actually, Apple goes further (or used to), and decomposes filename text to a canonical form so that you don’t go insane when files are created using inconsistent combinations of pre-composed characters and combining accents, as here:

$ ls

bómánta bómánta bómánta bómánta

(you’ll have to use hexdump to see which of these is which)
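
If you don’t fancy hexdump, here’s a quick Python sketch of the same trap - plain NFC/NFD forms, not the exact (slightly modified) decomposition Apple’s filesystems apply:

import unicodedata

precomposed = "b\u00f3m\u00e1nta"     # "bómánta" built from pre-composed ó and á
combining   = "bo\u0301ma\u0301nta"   # the same word built with combining acute accents

print(precomposed, combining)                                  # the two look identical on screen
print(precomposed == combining)                                # False - different code points
print(unicodedata.normalize("NFD", precomposed) == combining)  # True once both are decomposed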

Microsoft's code name for 64-bit Windows was also a dig at rival Sun

Kristian Walsh Silver badge

Re: Codenamed

In the early 1990s, Apple had a pair of PowerPC systems named after famous hoaxes: Piltdown Man (somehow shortened to “PDM”) and Cold Fusion. At the same time, another product, which would become the PowerMac 7100, was codenamed “Carl Sagan” after the TV astronomer’s catchphrase “billions and billions” - a prediction by Apple of how financially successful the model would be.

Sagan got wind of all three names, and got the hump so badly about being associated (in his head) with two famous hoaxes that he had his lawyers send a cease-and-desist to Apple. Over an internal codename. Management then leaned on the hardware team, and “Carl Sagan” became “Butt-head Astronomer” (BHA). And Sagan got on the phone again. Apple said that the product was actually codenamed “BHA” and that the B was for “Bald”, but to no avail. In the end, the product was called “LAW”, for “Lawyers Are Wimps” internally. And it did not make anything like “billions and billions” for Apple.

I joined Apple a little after this, so nothing I ever worked on had a cool name, except I do remember one of the endless PowerMac G4 variants was called “Yikes!” for reasons best known to the product team.

Dump C++ and in Rust you should trust, Five Eyes agencies urge

Kristian Walsh Silver badge

Re: I must be a bit thick

“Specifically, what's returned from a C function is whatever is in the processor's accumulator.”

The standards don’t actually mandate any specific mechanism for return, but for the sake of interoperability, all compilers for a given architecture will follow the same rule, whatever that happens to be. Almost all compilers I’ve inspected the output of used return-by-register where possible, usually using the lowest-numbered suitable register, but I know some architectures where that is a problem for certain types (“double” on CPUs without FP registers).

I do agree with the tendency to hoard code, though. Sometimes it’s from lack of knowledge about what’s outside (anything that touches a global state is very hard to remove, for fear of destabilising other parts of the system), but I do think that there’s a mindset in some developers that all changes should be as minimal as possible - often at the expense of the legibility of the resulting code. That might make sense in a hugely complex piece of software like the Linux kernel (where your chances of having a pull accepted are inversely proportional to the size of the diff), or any other large codebase that has been “battle-hardened” rather than formally tested, but on small modules that are functionally tested, re-writing and culling code should not be out of the question.

Kristian Walsh Silver badge

Re: your specific example is a genuine bone-fide compiler bug

Actually, your example is flagged by the default warning settings of clang, but we all know people who don’t consider warnings to be a code problem...

I agree that the general case, which often relies on runtime state, can be impossible to detect, but it’s not a decision between detecting all instances (impossible) or none (useless) - it shouldn’t be too much to ask that when the compiler does detect something that produces undefined behaviour, it reports that as an error by default. After all, we all know how much shipping code is compiled with warnings off, just so it can get through automated build-system checks.

And again, for particularly old or ratty code that used to get through GCC (and don’t start me on the shit that GCC used to allow just because it made writing the linux kernel a little easier…) but now does not, locally suppressing errors is the way to go, if only to highlight that the source-code probably has other issues too.

(Incidentally, thank you for an interesting discussion – they’ve become harder to find online lately, even on El Reg)

Kristian Walsh Silver badge

Re: your specific example is a genuine bone-fide compiler bug

Yes, I did notice that -O0 fixed this (but good god, next-instruction branches!), but I feel that’s a bit of a copout from the compiler writers; the same kind of passive-aggressive bear-trap that GCC used to hide under developers’ feet - it’s disappointing to see that attitude creeping into clang too.

If the behaviour of a function is undefined, then by definition the function is un-optimisable, so locally tagging it as -O0, or making it “optimisable” to { void; }, and anything in between are all okay, but I’d argue that allowing the optimiser loose on a graph that is known to be defective is just asking for trouble, so it really highlights my main complaint: program-code from which the compiler cannot produce a defined behaviour should be an error by default, not a warning.

(And yes, I believe things like “x = x++;” also fall into this category. That’s an error of thought, but even if the standard specified an outcome for this, I’d make it a warning because either reading is equally sound, logically.)

Polish train maker denies claims its software bricked rolling stock maintained by competitor

Kristian Walsh Silver badge

I wouldn’t be surprised if the impetus to employ the security researchers was a story from someone who had worked in Newag and had heard rumours of such a kill-switch. Railway vehicle servicing is not a big field, and people move between companies.

I didn’t check, but I had a sinking feeling that a diligent searcher would find numerous links between executives at Newag and key figures in PiS (the political party that until the recent election had a stranglehold on power in Poland). PiS is an amazingly corrupt organisation, even by the standards of populist parties, and Poland is well rid of them.

Swedish Tesla strike goes international as Norwegian and Danish unions join in

Kristian Walsh Silver badge

Re: Det är jävla rolig

Yeah. I can’t help thinking Musk insisted on having a factory in Germany for no reason other than a desire to say that Tesla were making “German cars”, and had no idea of how manufacturing is organised in Germany - a country where union representation on company boards is commonplace; a country where, one Thursday afternoon in a week where I’d had to put in two long days to fix an issue, I was told to leave the office immediately because I’d exceeded the working time limit.

Funny how the good employers are the ones that produce the highest quality cars, isn’t it? The simple truth is that you cannot get any kind of high-quality manufacturing without employee buy-in. Treat your line staff like drones, and they will return that contempt by doing the absolute minimum to keep themselves employed. This is why Musk is so in love with robotics, but robots are a dead end. Robots don’t find problems, they just repeat what you told them to do, and the key to quality is giving your workers the confidence and authority to find problems with the process and fix them.

Kristian Walsh Silver badge

Re: Funny Old World

Secondary action is not illegal in the countries where it is happening (why would UK law have any bearing on Swedish law?). And it is not the kind of secondary action that occurred in the UK. In this dispute, the workers at other companies are refusing to handle business specifically for Tesla. They are not refusing to attend work at their employers. All post in Sweden is being delivered, except for specific post from Tesla; all vehicles are being unloaded at Danish docks.. except for vehicles made by Tesla. See a pattern? The workers at other employers are joining in a dispute against Tesla, while continuing to perform the rest of their duties.

This is a big difference to what happened in the UK, where a single union would call all of its members out on strike, even those in workplaces where no dispute existed. They did this for leverage, to make their localised labour dispute into a national political issue, and so force the government into taking a side in the dispute. The major difference in the Nordic countries is that there is a long-standing principle that the government never gets involved in labour disputes, so unions have nothing to gain from blanket action except unpopularity. On the other hand, narrowly-focused secondary action, like this case, gives even small groups of workers a much stronger hand than they’d have alone, but most importantly, it keeps the unions on the right side of public opinion.

Amazon's game-streamer Twitch to quit South Korea, citing savage network costs

Kristian Walsh Silver badge

Re: is this better for SK Broadband?

It’s not about how rich you are - that’s exactly how the likes of Netflix are framing this argument. It’s about how much of the utility you use, and the way that you use it.

To cite your two examples: the electricity companies do charge you more if you consume a lot of electricity, or use it in such a way that it places higher demand on the infrastructure - even the exact same power consumption can attract different rates depending on the type of load you’re placing on the grid. Similarly, freight vehicles pay higher taxes and road-tolls because they consume more of the road (both in terms of physical space and weight-loading). You pay for the resources that you use; what profit you make from using those resources is nobody’s business except yours.

The principle of Net Neutrality says only that one individual customer cannot be favoured over another (so, for example, Disney can’t slip a few million to your ISP to prioritise its streams over those of Netflix or Amazon), but that doesn’t mean that ISPs cannot favour one class of customers over another. Some types of traffic place higher demand on network infrastructure, and high-bandwidth streaming is one of the most demanding. Twitch is actually worse than the movie streamers, as its content is generally live, which rules out the use of an edge CDN with cached content. However, the companies that profit from these streams usually want to pay only a data-centre peering fee, which does not accurately reflect the costs of ensuring that traffic arrives at the customer premises.

The content streamer’s view of “fair” is like Steinway driving a customer’s new piano to the local postal sorting office and saying “Here you go: Get that to Apartment 6, 113 Main Street, thanks. Oh, and we’re only paying you five bucks, considering we’ve brought it most of the way for you.”.

Kristian Walsh Silver badge

Re: is this better for SK Broadband?

If they are like pretty much every broadband provider, then their income is a fixed monthly fee that does not take into account traffic volume beyond a “fair usage” cap. On the other hand, their expenses are very much dependent on traffic volumes.

Nobody will cancel their broadband because Twitch is gone, so I think they’re happy with the arrangement, and any future streaming entrant will know that it really is the ISP’s way or the highway, so overall they’re winning.

My personal view on this is that the large corporate traffic generators do not pay their fair share for carriage, but that what the ISPs are looking for is also not a fair reflection of cost versus value - the right solution is somewhere between, but that would require big tech companies to actually pay their fixed operating costs themselves, something that their business models are designed to avoid. There’s a lot of “net neutrality” bros out there who are very vocally against the idea of charging different rates to different customers, but it’s funny that when you actually look into who’d be charged more in such a regime, it’s not the FOSS distributors or the journalists, but the big tech media companies… astroturfing is alive and well.