* Posts by Kristian Walsh

1817 publicly visible posts • joined 10 Apr 2007

Rust is eating into our systems, and it's a good thing

Kristian Walsh Silver badge

Re: Yeah They're Wrong

f64 makes it obvious to new programmers how big a “double” is. I’ve seen code where people forget this, read 32 bits from a stream, cast it to double, store it somewhere, and then all hell breaks loose later.

I write a lot of C. With the exception of loop-counters, I always use the <stdint.h> names for integers, not the built-ins.
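To make both points concrete, here’s a minimal sketch of the trap (my own illustration, not from any real codebase; byte-order handling is deliberately ignored, and read_f64 is just a name invented for the example):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* A "double" is 64 bits wide (IEEE 754 binary64) on every platform I care
   about. The classic mistake is reading only 32 bits off the wire and
   casting, e.g.:
       float f; memcpy(&f, stream, 4); return (double)f;
   which compiles cleanly and quietly mangles every value. */
double read_f64(const uint8_t *stream)
{
    double value;
    memcpy(&value, stream, sizeof value);   /* copy all 64 bits, no cast */
    return value;
}

int main(void)
{
    uint8_t buf[sizeof(double)];
    double in = 3.141592653589793, out;

    memcpy(buf, &in, sizeof in);    /* pretend this arrived from a stream */
    out = read_f64(buf);
    printf("%.15f\n", out);
    return 0;
}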

I've looked into Rust a few times. It seems to be a way of automating some of my own (hard-won) good practices when using C++: use RAII, avoid "new", methods take objects as const references not pointers unless you've a really good reason to do otherwise, try to use return by value of small types wherever you can. Memory leaks and violations weren't ever a class of bug I got in my C++ code.

C is a simple language. That makes it easy to master, but it is very, very low-level, and that puts everything on your shoulders. Some people like to think that makes them a better programmer, but after a long time doing this stuff, I think that honestly it just gives you more chances to plant landmines in your code. One thing I like about Rust is that it still allows you to do the dangerous but fast things if you need to, but the path of least resistance produces safer code than taking the same unthinking approach in C.

Kristian Walsh Silver badge

Re: Meh!..Meh, Meh

Yes, 6502 was her original influence, but ARM was always a 6502 programmer’s wish-list for a better ISA.

The first time I saw the “conditional instruction” idea I was amazed at how something so simple could fix such a common performance issue - branches are fairly expensive, but code is filled with short conditional jumps. For example, code like "if (x==0) y=1 else y=2" needs five instructions on a typical ISA (load, branch-if-not-zero +2, load #1, branch-unconditionally +1, load #2), but only three on ARM (load, load-if-zero #1, load-if-not-zero #2). Saves both space and time. Genius.
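To make the comparison concrete, a rough sketch (mine, not from the original post; the ARM mnemonics in the comments are hand-written classic 32-bit ARM for illustration, not real compiler output):

int pick(int x)
{
    int y;
    if (x == 0)   /* generic ISA: test x, conditional branch over the    */
        y = 1;    /* "then" arm, load 1, unconditional jump, load 2      */
    else          /*                                                     */
        y = 2;    /* ARM:   CMP   r0, #0                                 */
                  /*        MOVEQ r1, #1    ; runs only if Z flag set    */
                  /*        MOVNE r1, #2    ; runs only if Z flag clear  */
    return y;
}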

Is it a bird? Is it Microsoft Office? No, it's Onlyoffice: Version 7.2 released

Kristian Walsh Silver badge

Re: They keep using that word, but I do not think it means…

Yes. Early typefaces used a lot of ligatures, in order to make (cheap) printed books look like (expensive) written manuscripts - Gutenberg’s Bible uses about sixty different ligatures, as well as variations of each standalone letter, in an attempt to make the printed pages look like they were produced from the much more expensive woodcut process.

Most modern “handwriting” fonts use carefully designed beginning and end strokes such that when the letters are printed right up against each other, the strokes join up. But, when type was set with metal, you couldn’t easily overprint one character with another like this, so “handwriting” faces had to use lots of ligatures to deal with letter combinations - an early (and famous) example is Claude Garamond’s “Grecs du roi” Greek typeface from the mid-16th century. ( https://en.wikipedia.org/wiki/Grecs_du_roi )

Kristian Walsh Silver badge

They keep using that word, but I do not think it means…

…what they think it means. The feature they’re describing isn’t “ligatures”.

Ligatures in English and other European languages are basically stylistic choices, not a requirement of the language. Other European languages might use what look like ligatures to English speakers, but are actually letters in their own right: To Danes, “æ” is just as much a letter as “w” is to English-speakers, and the French “œ” represents a different sound to the two letters “oe”, and “ß” isn’t interchangeable with “ss” in German (Swiss notwithstanding). Every other joining of letters is done to make text look “interesting” or to solve kerning problems, like ff, fl, fb, ffi, and fi in typefaces where the f glyph has a long overhang to the right. Anyone who actually types the “fl” codepoint (U+FB02) into a document is causing more harm than good - that codepoint only exists for round-trip encoding compatibility with the original MacRoman 8-bit encoding from 1984 - it is not a character.

(As an aside, I have a pet hate of monospaced fonts that form ligatures, and I reserve a special place in hell for the ones that try to get all arts-and-crafts-movement with the punctuation combinations used by programming languages: after decades of training my eyes to find &&, <=, !=, +=, ++, ==, ?. and other digraphs in code, rearranging them into artsy shapes only hides them from view.)

Unlike Latin scripts, in calligraphic scripts like Arabic, ligatures are essential to creating legible text, rather than something that looks like a jumble of individual letters thrown on a page.

But from the sound of it, and the examples used, this release isn’t talking about ligatures at all, but what’s called contextual rearrangement: the way that some writing systems lay out glyphs (the shapes that we in English call “letters”, “numbers” or “punctuation”) in a different order to the reading or storage order. This is common to pretty much all of the scripts used to write the native languages of India, Bangladesh and Sri Lanka.

Here's an example in Sinhala, as used in the Sri Lankan language of the same name: ඵ + ේ → ඵේ

If your browser rendered that right, you'll notice that the second character has “wrapped around” the first, with one part of it being rendered to the left, and the small diacritic mark from the second character (the bit that looks like a p) now sitting to the right.

These days, most software can display this sort of thing properly (it uses a finite-state automaton described in the text renderer and/or the font itself to choose the appropriate cluster glyph based on the input sequence), but for an editor, locating your cursor within one of these clusters as you move it “left” or “right” is a more complex problem.
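If you want to play with this yourself, here’s a rough sketch (my own, assuming ICU4C is installed; build with something like `gcc demo.c -licuuc`; the two code points are, I believe, the ඵ + ේ pair from the example above). ICU’s character break iterator finds the grapheme-cluster boundaries, which are the only positions an editor should let the cursor stop at:

#include <stdio.h>
#include <unicode/ubrk.h>
#include <unicode/ustring.h>

int main(void)
{
    /* Sinhala base letter followed by a vowel sign that renders on both
       sides of it - two code points, but one user-perceived character. */
    UChar text[] = { 0x0DB5, 0x0DDA, 0 };
    UErrorCode status = U_ZERO_ERROR;

    UBreakIterator *bi = ubrk_open(UBRK_CHARACTER, "si_LK", text,
                                   u_strlen(text), &status);
    if (U_FAILURE(status)) return 1;

    /* Expect only positions 0 and 2 to be reported: "cursor right" has to
       jump over both code units in a single step. */
    for (int32_t pos = ubrk_first(bi); pos != UBRK_DONE; pos = ubrk_next(bi))
        printf("legal cursor position: %d\n", (int)pos);

    ubrk_close(bi);
    return 0;
}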

In Rust We Trust: Microsoft Azure CTO shuns C and C++

Kristian Walsh Silver badge

Re: C and C++ are different

You think the language that’s eating the planet is... C++? Oh man, when you see how much JavaScript is out there, you’re going to have a fit...

Microsoft highlights 'productivity paranoia' in remote work research

Kristian Walsh Silver badge

Goodhart’s law (paraphrased: “any statistical measure will become inaccurate once used as a target”) doesn’t apply to the particular solution they’re hawking here, as it appears to be primarily a business communications system, rather than something that measures productivity itself.

Microsoft warns of bugs after nation pushes back DST switchover

Kristian Walsh Silver badge

Re: Hard Coding Ahoy

Time-zones are a very complicated problem to solve, and that approach was ruled out a long time ago, because time-zones, DST observance, and territories are an overlapping set with no hierarchical arrangement: even your “simple” US example is thwarted by the Navajo Nation lands located in the State of Arizona (the former observes DST, the latter does not).

Having debugged a lot of date/time code over the years, I'm going to say that whether or not Microsoft issued that fix on time would be largely irrelevant. By default, databases store UTC for DATETIME unless you go out of your way to be an idiot, so DST is invisible to them. The issues are in calculations and presentation of those stored dates, and that’s where the hackery begins.

I can guarantee that a huge number of developers will have hard-coded the DST changeover dates in their code - especially if they were using Java, whose date libraries are historically so bad that they encourage hand-rolling of alternatives. (Yes, Joda-Time, now java.time, is acceptable, but it took nearly two decades to become standard!)

Calculating durations and repeating events that cross the DST changeover is something I’ve rarely seen done right first time, and often the “second time” fix is something like “if (start<'hardcoded_DST_changeover_date' && end>'hardcoded_DST_changeover_date') ...” - that kind of code is just waiting around for something like this to happen.
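For what it's worth, a minimal sketch of doing it right in C (my own illustration; the dates are made up, and the exact result depends on your TZ setting):

#include <stdio.h>
#include <time.h>

int main(void)
{
    /* Two local wall-clock times that straddle an autumn DST changeover
       (tm_isdst = -1 tells mktime to consult the timezone database). */
    struct tm start = { .tm_year = 2022 - 1900, .tm_mon = 9, .tm_mday = 29,
                        .tm_hour = 12, .tm_isdst = -1 };
    struct tm end   = { .tm_year = 2022 - 1900, .tm_mon = 9, .tm_mday = 31,
                        .tm_hour = 12, .tm_isdst = -1 };

    /* Convert to UTC epoch seconds first, then subtract: the changeover
       cannot corrupt the result. Naive "2 days x 24 hours" wall-clock
       arithmetic would be an hour out in most of Europe (49 h, not 48 h). */
    time_t t0 = mktime(&start);
    time_t t1 = mktime(&end);

    printf("elapsed: %.0f seconds\n", difftime(t1, t0));
    return 0;
}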

Tesla faces Autopilot lawsuit alleging phantom braking

Kristian Walsh Silver badge

Re: From the owners' manual

Not a valid comparison.

The steering servo, as its name suggests, does not have any decision logic in it: it simply ensures that the angle of the steered wheels matches the angle you have selected using the steering wheel. If it fails to do this task, and this failure causes a crash, then the manufacturer is at fault for preventing the driver from controlling the vehicle, just as much as if a mechanically-coupled system had failed. (Although all “electrical” steering systems retain a mechanical linkage to allow the vehicle to be safely moved out of harm’s way in the event of a total electrical failure.)

An AEB system, on the other hand, makes decisions that control the car: if it detects an object in the vehicle’s path, it warns the driver, but then if the object gets too close without any braking input from the driver, it engages the brakes directly, removing control from the driver. However, failure of the system does not result in loss of control for the driver, because there is no point in this process at which the driver cannot apply the brakes themselves. Legally speaking, the responsibility to not harm other road users always rests with the driver, because it is only the driver who can be expected to be aware of the entire situation.

And if someone rear-ends you, whether as a result of manual braking or an AEB event, it is they who are responsible, because they were not keeping a safe distance from you.

Kristian Walsh Silver badge

The problem is that aside from the phantom braking there are also instances of Tesla cars on Autopilot driving straight into genuine obstacles: eighteen-wheeler trucks, highway gore barriers, and even parked police cars with full lights flashing.

It’s hard to make the argument that the system is over-cautious.

... Yes, in each of those, the driver is ultimately responsible, but in each of those, it’s also very hard to see how a normal Autonomous Emergency Braking system that doesn’t rely solely on visual image processing would have allowed the car to strike a large, solid object.

AI detects 20,000 hidden taxable swimming pools in France, netting €10m

Kristian Walsh Silver badge

Re: cost vs return

€24 M to get it running and run it for two years, according to a post above, so that makes it net positive.

Kristian Walsh Silver badge

Re: Chemicals & Services

And how they find the construction companies is by tracking their inputs. It’s hard to build anything without materials, and the suppliers are far more diligent with their paperwork - if they’re not, they don’t stay in business for long.

It’s actually pretty hard to fiddle your taxes these days if you have any kind of input costs. Once caught, you open yourself to some very severe penalties, and there’s very little room for argument.

The reason is that there are two taxes in play: income tax on profits, and VAT on inputs; and the amount you claim/remit in VAT and the income you report for income-tax need to tally - the ratio doesn’t matter, so long as it’s within the acceptable range for your business type and region, but the revenue offices have very good models for different types of business, and they know within a couple of percent how these figures need to line up.

To avoid tax, therefore, you have to underdeclare not just profit but also inputs. The problem is that if you underdeclare VAT, you’re directly stealing from the government, and the penalties for this are swift and severe. But, if you underdeclare income for tax, while paying your correct VAT to avoid prosecution, now your input costs don’t match your declared profits, and you get audited and eventually end up with a big tax demand.

Kristian Walsh Silver badge

Re: Chemicals & Services

It’s very different. In both of these cases, you gave your consent in advance of the purchases for the authorities to have your details; and you are the direct beneficiary of the purchases (I don’t know French law on this, but I suspect it is a criminal offence to buy controlled poisons for an unidentified third party).

That’s very different to the government going to a pool-cleaning supplier and demanding their customer lists for the sole purposes of conducting a tax investigation. There’s no public safety grounds for doing this that would override the lack of consent from the purchasers.

However, if the French taxman is like his Irish cousin (who I do know about), they can —and will— look at the tax returns of the construction and pool-cleaning companies to estimate how many pools there are in each company’s catchment, and they can then compare that with the property improvement declarations in that area to see if they tally. If not, an on-the-ground investigation can be conducted to find out who’s dodging. All of this can be discovered without knowing one customer name or address, so GDPR does not apply. This story sounds like they used the satellite imagery as a replacement for that on-the-ground survey in certain départements where there was already an anomaly, rather than starting from a blank sheet.

That emoji may not mean what you think it means

Kristian Walsh Silver badge

Re: Eggplant

The shit emoji came, like emoji themselves, from Japan, and its original meaning was “good luck”.

It's a pun based on the re-arrangement of the syllables in the Japanese for “good luck” (literally “happiness-luck”, 幸運, read as ko-u-u-n, こううん) and the childish word for shit, unko (うんこ, u-n-ko), most commonly translated as “poo” or “poop” depending on which side of the Atlantic you reside on.

Emoji itself is a Japanese term, 絵文字 (emoji), that just means “picture-letters”.

Businesses should dump Windows for the Linux desktop

Kristian Walsh Silver badge

Almost exactly on time, we get the "kickback" story. It doesn't stand up to scrutiny.

1. In terms of city revenues, Microsoft's HQ never really moved. Unterschleißheim (the former location) may not be in Munich, but it is within Munich's administrative region (Landkreis in German), so the town authority there pools resources with the city of Munich and in turn receives services from the city. The notional increase in taxes from a move into Munich city proper was negligible.

2. Munich actually paid, in the form of tax reliefs, for Microsoft's move into the new location, an innovation hub it was developing between the city's two universities. (It gave the same reliefs to other businesses too.) MS wanted to move anyway (the shift in how tech companies recruit staff since the 1980s makes a campus-adjacent HQ a big plus), but it wasn't going to turn down the tax benefits.

It's stretching credulity to claim that this was some kind of bribe by MS. The story got legs from people who have no idea of how local government functions in Germany, or for that matter how big the economy is around Munich: Microsoft is just another medium-sized fish in a very, very rich pond of both domestic and foreign companies.

Kristian Walsh Silver badge

As we’ve somehow lurched from enterprise applications to microcontrollers (...?), my experience of embedded is that it’s Linux that has the worse support. Or rather, it has either good support, or none, depending on the device you’re using. That’s not a viable platform to work from if you want to do this as a living (yeah, tools really matter, but clients often have invested in a particular micro for lots of good business reasons, and they won’t throw away those years of code just because some bolshy dev says Linux is cool).

If you’re lucky enough to target a device in the "has Linux support" category, then you can certainly feel that Linux is “better”, especially if all of your experience (and thus all of your ability to script and automate the horrible bits of developing on limited systems) is on Linux already. Even then, a client will send you a datasheet for some peripheral IC, and that’ll have a programmable core, and... oh look, a Windows-only toolchain to code for it.

I do embedded development, on Windows, and I use the Windows Subsystem for Linux to manage a lot of the nasty stuff around the edges. However, I’d never move to Linux. Firstly, my target chip has even poorer tool support on Linux than it has on Windows (and that is saying something!), but to be honest, for me the only good thing about Linux is the commandline userspace, and WSL gives me those already, so moving over to just Linux means mostly losing out on Windows-only stuff.

Kristian Walsh Silver badge

Your recollection is incomplete. 90% was the peak figure for user desktop replacements, but very quickly those users began to request a return to Windows as the Linux-based solutions were missing key functions.

The Linux ecosystem has no centre, and its continuous development model makes it much more of a moving target for internal application developers than Windows is, unless you expend a fair amount of effort maintaining your own distro. (Package maintainers are the real heroes of Linux - it’s a thankless, time-consuming job, but without them the whole ecosystem would collapse).

The nail in the coffin was the prospect of decentralised working. There simply wasn’t any good private-cloud solution for collaborative work. For several classes of application, Munich basically came to the realisation that to make Linux work for it, it would have to permanently invest in its own team of programmers and maintainers in addition to its existing IT staff (who also had their own grievances regarding provisioning and monitoring). It was a buy/build decision, and the real costs of “build” were now known to be far higher than simply paying for a ready-made product. After all, the city doesn’t build and maintain its own trams - the latest lot was bought from Siemens - so what made software any different?

Other organisations have fared better with Linux, but they were more pragmatic: adopting it where it was a viable and mature alternative. The mistake Munich made was the (ultimately ideological) decision to do a top-to-bottom replacement of every software system with a FOSS alternative. That really makes no sense - you might not like Exchange, but is there really a FOSS alternative that’s anywhere near as good at managing office communications?

China's 7nm chip surprise reveals more than Beijing might like

Kristian Walsh Silver badge

Re: Ours

And yet American companies do build and operate state-of-the-art fabs in the EU. So, if the EU was trying to stop it, they’re going a funny way about it. Certainly, you don’t give a billion euro in grants to stop someone coming into your territory.

Intel just kicked off two more CPU fabs in Italy and Germany, to go with the one it already has ready to go in Ireland. The UK didn’t make the shortlist for the new sites partly because of Brexit (you need a stable supply-chain for a semiconductor plant), and mostly because of the lack of skilled labour.

That latter one is what has really crippled English industry (I say English, because education is devolved) - underinvestment in education, especially at secondary level, means it costs far more in training to hire a plant operative in England than it does in Germany, France, Italy or Ireland.

Your strange assertion that “diversity” is incompatible with a meritocracy shows you’re not thinking things through. I’ve worked with and for enough white thick-shits who had rich parents to get them through college to know where the real problem is in hiring practices, and it’s not ethnic minorities.

Kristian Walsh Silver badge

Re: Ours

The meaning of that original statement is very clear: all of the UK Conservative Party’s bleating about being the defenders of British industry, and all of their anti-foreigner rhetoric pales into insignificance when presented with the opportunity to earn quick cash (and pocket a personal payoff as a consultant after retirement).

Software issues cost Volkswagen CEO Herbert Diess his job

Kristian Walsh Silver badge

Re: Stand by for subscription services like the ones BMW recently announced.

You’re thinking of the problem as if the seat heaters were a standalone product, instead of one component of a system where there’s a power and data bus already in place, with a high-performance general-purpose computer attached to that bus. A simple solid-state current driver IC, uniquely addressable on the bus and controlled by the cabin software, is far more flexible and cost-efficient than the electro-mechanical system you propose, especially as that part can be mass-produced to suit any actuator application within any vehicle.

Also, by having these loads centrally controlled, you can very closely manage total current draw on the 12 V electrical system, and keeping maximum current down can save you a lot of money.

If that wasn’t enough reason, there’s the bill of materials: replacing the cost of your timer and relay with a common solid-state part across 4 million units a year is an example of They’re Doing It Right, and that’s before you consider the knock-on costs of maintaining supplies of additional unique parts.

Kristian Walsh Silver badge

Re: Fast chargers

Everyone homes in on DC fast-charging because it looks most like the way you fill up a petrol car, but it’s really not how the majority of EV recharging will happen. Private cars spend 95% of their time parked, so given the ubiquity of electrical supply, that parked-time is when they will charge. Any residence with a driveway can use AC home charging at 7 kW (rising to 22 kW in countries with 3-phase domestic supplies), which is cheaper and more convenient than forecourt charging. AC charging can be rolled out across pay-parking facilities, commercial premises and workplaces too without much technical difficulty. On-street charging is still a challenge, but it’s a very solvable problem: the difficulties are mostly political and commercial, not technological.

You say filling up with petrol is “convenience”, but is it really? Having the car charge up overnight — or while you do your day’s work— is in a different league when it comes to convenience: just unplug, and go. No wasting your time detouring to a station, or queueing at the till... and it’s cheaper too.

The only place where fast-charging becomes relevant is for very long distance journeys (exceeding 75% of your battery’s range). But with current capacities, those journeys are very rare. Yes, there are people out there who have to drive more than 200 miles at a go, but they really are outliers. For them, a diesel car is still the best option, but eventually EVs will catch up there too.

The electrical grid is entirely able to accommodate EVs. Demand on the grid today is not in any way constant, and it has to be able to operate at a peak that is far in excess of the normal loading. Nightly charging actually solves a problem that has plagued electricity generators since Edison’s day: the most efficient way to operate most types of generation is to run them continuously, but demand fluctuates wildly over the day and week, and if you over-generate, you have to dump the power. (Search for “duck curve”, and follow the articles you find to learn more about this topic.)

EVs actually help the grid. They aren’t steady-state dumb-loads; they have intelligence, and are all capable of negotiating with the supply (the charging connector has a serial datacomms link, and the protocols used on it include a Vehicle-to-Grid facility). Right now, you can schedule cars to take electricity at any time of your choosing (to reduce your home bill), but there’s no technological impediment to the electrical utilities and vehicles negotiating their charging periods to react to instantaneous power surpluses in the grid. There’s also the prospect of using your EV battery as a domestic demand buffer, for which the utility will reward you with cheaper electricity.

Regarding capacity, it might interest you to know that domestic electricity demand peaked in the early 1990s, and has declined since (datacentres have picked up some of the slack, but not all). A lot of infrastructure was put in place based on that level of demand, and mass adoption of EVs will just bring things back towards those projections.

But, it’s not like grid infrastructure is immutable once installed. Unless you’re misfortunate enough to live in Texas, the electrical grid is constantly maintained and expanded to meet demand for electricity; and it’s not like a hundred million EVs are just going to wink into existence and plug in next week: the transition will take decades, and decades is long enough to adapt any grid.

This credit card-sized PC board can use an Intel Core i7

Kristian Walsh Silver badge

Re: Fahrenheit?

... OR GIVE ME CLYDE ?

Kristian Walsh Silver badge

Re: Fahrenheit? More Precise?

True, although once we’re being technical, I’m sure you remember that water actually freezes at its triple-point, which is 0.01 °C (273.16 K), not zero, and that the boiling point is only 100 °C at 1013.25 hectopascals of air pressure (= 1 atmosphere); in a hard vacuum, pure water has no liquid state, and so boils at any temperature over its triple-point, and freezes at any temperature below.

Kristian Walsh Silver badge

Re: Fahrenheit? More Precise?

We have decimal places for when we want to be precise. There's no need to convert out to an odd-shaped unit with an offset baseline.

Big Tech silent on data privacy in post-Roe America

Kristian Walsh Silver badge

Re: Democracy

"So if a Democratic president, a Democratic Congress and a Democratic senate were unable to stop this, exactly what was the point in voting for them ?"

For the purposes of “stopping this”, the Democrats (or, more correctly, Senators who would uphold Roe v Wade) do not hold the Senate.

Why? Well, like the President, the Justices of the Supreme Court can be impeached and removed from office by Congress. However, just like the President, doing so requires a majority of the House of Representatives to vote for impeachment, which triggers them to be sent for trial in front of the Senate, after which a two-thirds majority of Senators must vote for conviction in order to remove them from office.

The first part is easy: there are reasonable claims that the appointments of Neil M. Gorsuch, Brett M. Kavanaugh and Amy Coney Barrett were all invalid, as all three committed perjury during their appointment hearings. Under questioning, each explicitly denied that they would even consider repealing the judgement in Roe v Wade, and yet, on the very first occasion that the matter came before the Court, each agreed wholeheartedly to overturn it. I can believe in Justices changing their opinions over many years, but such a rapid and complete about-face on this issue stretches credulity.

But impeachment isn’t enough. The conviction must be voted for by the Senate, and by 67% or better. In a normally-functioning government, a proven case of perjury would be enough to have one or more of these Justices removed from the court, but the highly partisan reality of US politics, where most of the 50 Republicans will pretty much vote down any Democratic impeachment, makes such an outcome impossible… unless the Democrats were to somehow secure 67% of the seats in the Senate.

... and that is exactly the point in voting for them.

Jeffrey Snover claims Microsoft demoted him for inventing PowerShell

Kristian Walsh Silver badge

Your use of the present tense is odd in that post, given Microsoft’s current championing of PowerShell—and, for that matter, Linux—on Windows today. This is not the 1990s anymore.

And Microsoft never had any relationship with IBM they could “split” from apart from being a vendor of DOS, and a development partner in OS/2 - both in the 1980s: anyone managing then retired long ago.

The rot in the 1990s was what happens to any dominant player in a market: gaining share becomes more difficult, so management decides instead to rent-seek, and adopt defensive positions to block competition (we’re seeing this with Apple on mobile). Of course, these tactics are contrary to the needs of customers, and can sow a long-lasting poison that damages the chances of any genuinely good product that comes from the company later. As an example, Windows Phone was the most technically advanced mobile SDK, with amazing low-resource performance, and I believe that it would have been a success had it not come from Microsoft, with all the baggage and industry distrust that that entailed...

Microsoft, like other successful companies, also attracted the bullshitters. I worked for Apple in the 1990s, and we used to say that the only good thing about working for a dying company (which it was then) was that the kind of MBA-but-no-intellect manager-class tended to avoid us when choosing a place to inflict themselves upon. Instead they flocked to Oracle, Netscape (heh), Microsoft and Sun, who were all on the up, and thus offered a low-risk environment for their inept ideas. (Not to say we didn’t have home-grown assholes, but nobody tried to bring in morale-destroying “new ways of working”.)

Kristian Walsh Silver badge

Re: powershell command missing

The bash solution becomes ugly very quickly once your needs diverge from "results sorted by directory size". The PowerShell version, on the other hand, is trivial to adapt to sorting by any attribute of the system.

You can also trivially adapt the PS version to give you sizes, sorted by most-recently modified directories, or excluding directories that have not changed since a given date. Try that in a bash one-liner if you really want to learn the darker corners of bash (but really, don’t do it: complex functions belong in scripts, not one-line alias macros)

Lest I be accused of fanboyism, I should mention that I don’t use PS very much: I overwhelmingly write scripts in bash (or more accurately, the subset that approximates the original Bourne Shell - a habit formed from writing for embedded systems using dash and other lightweight shells). But I’m not some kind of idiotic zealot who thinks that just because something is from Microsoft, it’s inferior to whatever came from Unix. The reason I don’t use PS is that it’s not well supported on Linux, where I do most of my scripting; and on Windows, its default commands are all different to what I know from Unix.

But if you’ve ever had to parse the crap emitted by Linux tools like ip, you’d wish that Linux had something like PowerShell’s object-stream approach to pipelines.

Making the most common Linux tools produce key-value output (even if it has to be in a lousy format like JSON) would be a start, but that requires cross-project collaboration, which is something the Linux world has proven to be very bad at.

Kristian Walsh Silver badge

Re: I would get it fired for inventing Powershell

I suspect you don’t do much shell programming. Consider this bash snippet:

X=009

test $X = 9

... should that return true or false? It returns false, because '=' compares text. There’s a difference between lexical comparison (characters must all match) and value comparison (parse to numeric, then the numeric results must match), and numeric comparison is what -eq, -le, -gt and company do on the Unix `test` command ([ is an alias for test, chosen to make shell-scripts a little more readable).

PowerShell is similar, but you first need to know that PS is a typed runtime (like Python). The equivalent code:

set X 009

if ( $X -eq 9 ) { echo "true" }

will report true, because X is a number. Change that initial assignment to be a string, and the comparison now fails.

set X "009"

if ( $X -eq 9 ) { echo "true" }

If you want to force integer comparison, you can cast the operand:

set X "009"

if ( [int]$X -eq 9 ) { echo "true" }

Datacenters in Ireland draw more power than all rural homes put together

Kristian Walsh Silver badge

Re: Datacenters vs grid power.

Because that hot air is typically below 35 °C (i.e., only 10-15 °C above indoor ambient), efficient energy extraction is expensive (although it's getting better). The low temperature also precludes use for district heating except in places with high-density accommodation and harsh winters, and Ireland has neither.

Kristian Walsh Silver badge

It would be a massive reduction for Ireland, given that so much of the DC equipment located in Ireland is providing services for EU and global customers.

The uptick in demand is one of the knock-on factors of Brexit. EU customers want their data-processing to be carried out within the EU. Most of the transatlantic fiber makes landfall in the British Isles, but with the UK now outside the EU, Ireland has become a preferred location, despite its thorny planning process.

The usual suspects will say this is all about tax rates of course, but datacenters are cost-centres, and their location has nothing to do with where profits are booked. Amazon, for instance, located its first AWS Europe region in Dublin, but it has no business operations in Ireland at all: even its shop website is a redirect to amazon.co.uk. Amazon’s European operations are run by a Luxembourg company.

Growing US chip output an 'expensive exercise in futility', warns TSMC founder

Kristian Walsh Silver badge

Re: Wait for it, TSMC. Just wait for it.

It's not about “top notch” minds. The belief that it is is part of what got the USA in this mess, but the high cost of education in the States is the real killer for manufacturing competitiveness. (I won’t mention healthcare costs, as that cripples all sectors’ competitiveness, not just that of the manufacturers)

High levels of college debt from even the most modest of institutions make it impossible for companies to hire high-skilled operatives at internationally-competitive wage rates.

That same system of amassing debt just to get educated discourages manufacturing workers from even bothering, leading to lower skill levels in the work-force, thus exacerbating the previous wage issue through lack of supply.

Basically, the whole education system in the USA discourages “making things” as a career option, and this is why, as Chang noted, the USA has pivoted away from mass manufacturing towards services and niche, cutting-edge technology. The alternative would require a ground-up overhaul of how Americans are educated, and that’s both impractical (districts, counties, states, private colleges and federal government all have fingers in the pie, but nobody is in charge of enough of it to change anything), and politically lethal for either party.

GNOME 42's inconsistent themes are causing drama

Kristian Walsh Silver badge

Re: I hate "modern" UIs.

“Creatives” shouldn’t be let near user interfaces until those interfaces have been properly designed. UI is, strictly speaking, a branch of ergonomics, and the people I know of who did it for a living are serious engineering types.

The problem is that a device’s user-interface is now seen as a marketing point, not a functional one. That means lots of flashy but sub-optimal solutions get chosen over those that are more functional.

Unfortunately this trend is creeping out of computing/web/phones and into areas where it’s starting to pose genuine problems. Those cars with touch-screen-only secondary controls are effectively unusable for elderly or other users with poor fine motor skills (= the ability to accurately position and work with your fingers), whose motor skills would otherwise be perfectly adequate to safely pilot a car. Previously, both the secondary (air, radio, seats...) and the primary (steering, speed, brake, indicators) controls in a car have gone through decades of usability engineering to ensure that they can be used by people with pretty much any level of fine motor control, but the shift to touch-screen for secondary controls has undone that good work. It has, however, saved money (if you think Tesla uses touchscreens because they’re cool, you should look at the costs of designing and making physical control buttons sometime).

Kristian Walsh Silver badge

I suppose when you got a laser printer, you started to print all your documents at 3 point text too because you could...

Resolution is not the same as size. Bumping up the number of pixels on your display doesn’t give you more space to lay out your work - or improve your eyesight. And making the display itself bigger also comes with a penalty.

First off, the limiting factor for information density on any interactive display is the human operator. Text needs to be large enough to be clearly readable at the typical viewing distance. That means larger type size (in pixels) and greater line-spacing. The reason your modern UI’s default settings on a 4k monitor still use the same physical character size today as a VT100 (around 2.0 x 3.5 mm) is because that character size was not pulled out of a hat; it is the result of detailed study of human vision and ergonomics.

UI controls also need to be larger, but for a different reason. Monitors have got bigger physically. Whereas twenty years ago a single 17-inch monitor was the norm, and 30 inches the practical limit, now that norm is 24 inches, and the maximum can be multiple 35-inch displays. Because the distance you have to move the mouse cursor can now be much, much bigger than before, it is much harder to accurately position that cursor on buttons that are further away - so as the average display surface gets bigger, the controls must get bigger too for the system to remain usable. (The theoretical basis of this has been understood since the 1950s, see https://en.wikipedia.org/wiki/Fitts's_law)
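(For anyone not following the link: the usual Shannon formulation of Fitts's law is T = a + b·log2(1 + D/W), where T is the time needed to hit a target, D the distance to it, W its width, and a and b empirically fitted constants - so as D grows with bigger and multiple displays, W has to grow too if pointing time is to stay roughly constant.)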

But the problems with this KDE release aren't about density; they're about misunderstanding the rules they're working with, and instead aping the mistakes made by an existing popular system (the fallacy of assuming that popularity means high quality). They have made the same mistake that Apple made with iOS 7 when they tried to copy Windows Phone's minimalist UI - they went too far, and simply removed all the button-borders and other visual signifiers from their UI without providing anything else in their place. The result is a mess.

Windows Phone gets a lot of blame for this, but it’s misplaced. Anyone who has actually used a Windows Phone device will tell you that its UI may have been flat, but it was still distinguishable: yes, there were only a handful of visual cues, but they were used consistently across the whole UI - and the Windows Phone UI did feature boxed buttons; frameless buttons are an abomination that the world can thank Apple for. As an example, colour was consistently used as an interaction cue: pretty much any text drawn in the UI highlight colour would react to touch - Apple badly broke this basic rule when it did iOS 7, indiscriminately using colour both as a decoration and as an interaction cue. The irony that it was Apple being so ignorant of basic design principles isn’t lost on this ex-employee, but that’s what you get when you fire all your UI designers (the Human Interface Group was closed back in the late 1990s) and expect graphic artists to have the same skillset. “Looks like” is not “works like”.

Intel counters AMD’s big-cache PC chip with 5.5GHz 16-core rival

Kristian Walsh Silver badge

Re: Let's Qualify that 5.5GHz Figure

Yeah, it’s crazy isn’t it. The Motorola 68000, the first CPU I learned to program in assembly-language, was so-named because that was the number of transistors on the die!

…and the process node for the first 68000 chips was 3.5 μm, meaning that each of those transistors was around 500 times larger than those of a modern CPU.

Kristian Walsh Silver badge

Re: Let's Qualify that 5.5GHz Figure

Yes, it was fine when the node measurement was just something that was used internally at the fabs, but tech bloggers with not quite enough knowledge have unfortunately turned it into a Holy Number. No surprise, as we’ve been here before several times with “numbers”, most recently with the confusion about high versus low TDP, cores versus threads, 64-bit versus 32-bit, and of course, the granddaddy of them all: Megahertz.

Alder Lake is built on what, until very recently, Intel called its “10 nm Enhanced Super-Fin” process. Despite having that “10 nm” in its name, this is actually able to pack more gates onto a die (100 million transistors per square millimetre) than the slightly older TSMC “7nm” process (92 million transistors per square millimetre) that AMD is using right now.

However, TSMC has a newer “5nm” process (albeit only for ARM SoCs) that packs in around 165 million transistors per square millimetre, and Intel has no answer to that until its “Intel 4” process starts up, allegedly later this year, with an expected density of around 180-200 million transistors per square millimetre. In other words, when it comes to wafer fabrication, whoever has the newest thing has the best thing.

But AMD is expected to jump to that TSMC “5nm” process later this year, and so will take back the process lead; Intel’s first “Intel 4” CPUs, meanwhile, will be “Meteor Lake” in mid-2023.

114 billion transistors, one big meh. Apple's M1 Ultra wake-up call

Kristian Walsh Silver badge

Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

If you understand your workload, why don't you just max out the RAM when you buy the system?

It’s precisely because I do understand my workloads that I am able to defer the cost of any upgrades. “Maxing out” is a clear sign that a buyer doesn’t have a handle on their real requirements. You can get away with it for one-off purchases, but those mistakes multiply quickly.

I make my living writing software systems of various sizes, operating from a few kilobytes to a terabyte in data-size. Right now, I know the work I’m doing has low RAM requirements, but as soon as I shift to anything cloud-based again, with lots of containers, I’ll buy the extra RAM. By then it will also be cheaper, both nominally, and in terms of spending power. But then, maybe I’ll still be writing small systems through the life of this machine. If so, also great: I haven’t wasted a cent.

But that’s just my single PC. My point was that if you’re making that decision for thirty to a hundred workstations, the consequences of such a cavalier approach to provisioning as you’re suggesting will add up quickly, even before we consider Apple’s steep price ramp.

As for downtime, it’s twenty minutes, including the memory check (five, if I do what you did and don’t self-test the system hardware before putting it into use). I’m fairly confident in my ability to safely fit a pair of DIMMs without damaging my computer. But don’t imagine that’s any kind of a boast: computers are designed to be assembled by low-skilled labour - there’s nothing magical or “elite” about putting components into one yourself.

Kristian Walsh Silver badge

Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

My experience is different: I have upgraded RAM in every system I’ve ever owned, both Mac and PC (my last Mac laptop, a 2015 MacBook, had a RAM upgrade in its lifetime too). It’s the one thing that I do tend to run out of in my workloads, but then, I had a period of developing systems that required a lot of containers and virtualised systems, and they tend to consume lots of RAM.

Storage is another thing that I have always ended up upgrading, but then it has been for speed (SATA 3 to SATA 6 to NVMe), rather than capacity. I don’t think I’ll need more than 1 Tbyte onboard for a long time to come.

The issue of expandability came to my mind because I have just replaced my own desktop system, and I fitted it with 32G of DDR5. I may end up needing 64G in a year or two, but the price of DDR5 is so crazy right now that it just wasn’t financially sound to buy that much until I really needed it. By then, the usual pricing curve for electronics will make it much cheaper for me to upgrade. Apple’s design decisions with the M1 chip take that option away from me, and I find that position difficult to swallow in something that claims to be a professional tool.

Kristian Walsh Silver badge

Re: I saw the reveal presentation, and, while I'm no fanboy, I was amazed

You’ll hit that RAM barrier pretty quickly, though, and then you’ll have nowhere to go except sell and replace. If you’re a professional, your time is your income, and the downtime of buying and setting up a new system compared with unscrewing a panel and adding DIMMs is significant.

That’s my big problem with these: you’re basically given a choice of grossly overpaying for the memory you’ll need over the unit’s lifetime, or ending up at a dead-end. Also, at 128 Gbyte, that maximum memory purchase isn’t even that high.

That inability to cost-effectively expand means that this isn’t a professional system; and the specifications are more than any regular user could ever find a use for - the only mainstream high-performance computing application is gaming, and there’s almost no Mac gaming market. Even if there were, games require GPU rather than CPU muscle, and the M1 Ultra is nowhere near as competitive as a GPU.

I think that’s why the response to the M1 Ultra has been so muted. It’s great, in theory, but who would actually buy one? At the other end of the range, I also feel that there’s too much of a gap between the Mac mini and the entry-model Studio: the entry Studio model is nearly double the price of the top mini.

Kristian Walsh Silver badge

Those industries are supposed to buy the Mac Pro, which can accommodate 1.5 Tbyte of RAM.

I say “supposed to” because pricing on the Mac Pro lives in its own private fantasy-land - anyone paying €6k+ for an 8-core Xeon W-3223 and 32 Gbytes of RAM would need their head, and/or budgetary approval privileges, examined.

Just two die for: Apple reveals M1 Ultra chip in Mac Studio

Kristian Walsh Silver badge

Re: Mac Studio

I had one of those in my mac-using days. Probably the last good Mac desktop: a function-first case design, internally expandable, could be placed under a desk easily, and (surprisingly for Apple) it wasn’t priced at a multiple of competing systems (I think my 2x Xeon model cost about 25% more than an equivalent HP at the time).

This new thing is neither fish nor fowl. The low-end model is actually pretty decent for someone who needs more than the €1000 M1 Mac mini and cannot leave the world of Apple, but more than doubling the price of the top Mini to get to a €2200 entry-level Studio is a real deterrent. (I suspect some of this pricing is supply-chain related, but I also don’t expect Apple to drop the prices when things improve, either.)

But it’s the upper-spec “Ultra” one that has no use-case that I can see: it’s a sort-of workstation CPU, but you can’t expand the RAM or storage, and it runs an OS that is poorly supported by vendors of the sort of software you normally need a workstation to run. At €4,500+, no rational purchaser would buy one (especially as it lacks the “tech furniture” appeal of the Mac Pro).

But, looked at in the hermetically-sealed bubble of Apple’s product-range, it does fulfil a function - it bridges the chasm in price between the midrange Mac Mini and the workstation Mac Pro, whose €6500 entry-price for an 8-core Xeon system with just 32 GB of RAM is nothing short of offensive.

Apple has missed the video revolution

Kristian Walsh Silver badge

Re: Apple had spectacularly bad timing

M1 is a great personal-computer SoC, but once you get into the high end, it quickly becomes unsuitable for demanding workloads. The biggest reason is that RAM is capped at 64 Gbyte, and has to be chosen at time of purchase.

64G of RAM sounds like a lot, but it won’t be in 2-3 years, and it already isn’t enough now if you’re working with 8k video.

For context, consider that Apple’s own Xeon-based Mac Pro systems, aimed at this space, max-out at 1.5 Terabytes of RAM. That gives you an idea of the sort of configurations people are looking for in this space (although any production house with proper budgetary control won’t be buying Mac Pros if they can avoid it when comparable rack-mountable systems can be had at much lower cost)

Of course, the solution would be to design an M1 using a traditional offboard RAM architecture, but once you pull the DRAM off the package, you lose a lot of the M1’s performance advantages, which were made possible by not having to deal with different RAM specifications.

But, even if M1 kept its performance figures, it’s still outclassed in this space: once you’re on a desktop, power consumption and cooling become less of an issue, and it’s hard to argue with the much higher peak performance-per-core figures achieved by AMD and Intel chips.

Kristian Walsh Silver badge

Re: My favorite irony…

These huge LED-matrix displays are normally driven by Windows, because the makers of the display controller hardware only supply drivers for Windows systems; even getting Linux drivers for these things is hard: the OS licence saving is a pittance compared to the complete cost of the rig, and Linux requires a lot of tuning before it will work acceptably with this kind of real-time processing.

But even if it was a macOS system, it would not be Apple hardware (you need a full, multi-slot PCI chassis, which Apple no longer makes), but Apple themselves would still have to take responsibility for keeping the hardware drivers up to date. That’s a lot of effort for one single installation.

But if you’re looking for more examples you can go further down the chain: all of the high-tech automated assembly of Apple’s products is controlled by Windows PCs, because that’s the OS that all the equipment used in PCB and computer-controlled manufacturing runs.

(Why is this? Well, apart from cost, there’s long-term support. I once had a client who came to me for advice for his business which sold a very expensive piece of specialist machinery built around a MacOS-based controller. When Apple killed support for CarbonLib, in which all of the control software had been written - for performance reasons - it became impossible for him to source new controllers, as the new macs would only run versions of OSX that did not support Carbon. Remember, you cannot legally run OSX on generic Intel boards. My advice was to hire someone to rewrite everything on Linux, by the way. “Rewrite everything” was exactly not what he wanted to hear, but it was the only way out of the trap.)

But none of it matters. For a long time, most of Intel and Microsoft’s advertising and promotional material was produced on Macs. I was an Apple fan in those days (partly because the systems were good, but also because, as an employee, I was able to buy that quite good hardware without paying the full, extortionate, prices), and this fact used to raise a smile, but really it did nothing to convince people using Windows systems that Macs were better. The same goes now, in reverse.

Kristian Walsh Silver badge

Re: Apple dumbed down and threw pros under a bus

No, he means video and audio. Avid Media Composer launched on Macintosh and was Mac-only through most of the 1990s; basically the time it took to become the preeminent non-linear editing system for video production. Even today, people in film and TV production will use the term “Avid suite” as a generic term, even if the actual software is a competitor system such as DaVinci Resolve or Adobe Premiere Pro.

Part of the reason for this early advantage was that all Apple systems (until the Macintosh 4400 and later, the iMac) supported SCSI hard-drives. That higher throughput and ability to attach up to 6 external drives was essential if you were trying to do video editing in the 1990s.

I used to work at Apple in the late 1990s, and audio-video production, pre-press and education were the only markets where Apple desktops still had any traction, and it was a big worry when first Adobe, and then Avid started to produce Windows versions of their flagship products.

Apple did purchase Final Cut to keep a foot in the game, and while it had initial success at the higher end of the market, Apple’s lack of competitive workstation products throughout the 2010s meant that it has now slipped into a kind of “prosumer” space, with most large productions moving to Windows-based systems, which offer much greater flexibility (and value for money) in terms of the hardware you can run them on.

A Snapdragon in a ThinkPad: Lenovo unveils the X13s

Kristian Walsh Silver badge

Re: Seduced and abandoned by yet another attempt at a non-x86 Windows?

For me, Google is still the leader when it comes to offering stuff to developers and then leaving them high and dry later. Apple also has history of abandoning developers at dead-ends with no migration path except “rewrite everything” (OSX CarbonLib?), although it has improved immeasurably if you compare now to its Don Juan attitude in the 1990s, when APIs would be trumpeted as the Second Coming at one year’s WWDC, then completely written out of history by the time the next one arrived (Just off the top of my head I can remember OpenDoc, Dylan, MacApp, OS 8 “Copland”, Appearance Manager, QuickDraw GX... and I may even have contributed my own tiny part toward this too while at Apple)

Returning to Silverlight as the example, at least there was a forward migration path to converting apps into WinRT 8.1 applications - a class-path rename would get you about 75% straight carryover of application code, and Microsoft went to some effort to provide you with workarounds when that didn’t work, which isn’t bad considering the complete platform change underneath. Sure, you lost the opportunity to run in-browser, but by that time using these browser-plugin runtimes was rightly seen as far less efficient, and no more secure, than just downloading and running a signed application.

The issue for users with these changes is that Microsoft’s third-party devs are pragmatic when it comes to the use of their time, and are slow to rewrite apps just so they can say they are using The Latest Library (in contrast to Apple’s cohort who tend to leap over themselves to be first to have rebuilt their app against whatever Apple announces).

That’s not to say that Microsoft’s current clusterfuck with GUI toolkit APIs is in any way okay, mind you. I happen to really like WinRT, or UWP as it’s known these days (and I especially like its responsiveness on slow CPUs), but I’d be happy to abandon it and use something else that has a long-term future if only MS would just make its damned mind up about what that “something else” should be.

As for x86, Microsoft isn’t to blame for these failures. Intel has very effectively used its position as the dominant supplier of desktop and laptop chips to discourage vendors from producing non-x86 Windows machines. Remember when Windows NT was launched on about five ISAs? Microsoft really wanted to break with “WinTel”, but the hardware builders weren’t so free to do so. Same story today - x86 ISA chips have a higher price and higher profit-per-unit than ARM parts, and so the suppliers of these really want hardware vendors to stick with x86. Maybe if AMD applied some of its Ryzen know-how to an ARM ISA part, we’d see real competition, but I wouldn’t hold my breath.

Kristian Walsh Silver badge

Re: Seduced and abandoned by yet another attempt at a non-x86 Windows?

There was indeed a PowerPC build of Windows NT4 Workstation (for devices conforming to the PowerPC Reference Platform, so it wouldn’t work on Apple hardware). I worked for Motorola at the time, and NT was one of the options available with its “PowerStack” PPC workstations.

Microsoft gives tablets some love in latest Windows 11 build

Kristian Walsh Silver badge

The middle finger has been in the emoji set since Unicode 7.0 in 2014, as U+1F595.

The closest you’ll get to “wanker” is Waving Hand, U+1F44B, which has the delightful side-effect of making its recipient think you were actually saying hello.

(... much like replying “Good For You!” to an email, with the capital letters conveying your real meaning.)

Meet Neptune OS, an attempt to give seL4 a Windows personality transplant

Kristian Walsh Silver badge

"a bad programming interface".

As this is the NT kernel itself, not the Windows user-space, I don’t think Linux supporters should be shouting too loudly about bad interfaces. (x86 and x64 syscall numbering...?)

Actually, even in user-space, my fairly long experience of Linux development would still advise a more conciliatory attitude: once you get away from the core APIs and tools that were shared with Unix, Linux APIs get very inconsistent, and you see the same concepts modelled in different ways in different libraries. Whatever its deficiencies, Windows (and MacOS) user-space libraries are designed to a “house style” and the concepts learned from using one library are helpful when learning others.

Nothing’s perfect, and that inconsistency is the downside of the Linux development model whose large number of semi-autonomous projects with the freedom to do what they want allows fast feature improvement and bugfixes.

Canalys: Foldable shipments could 'exceed 30 million by 2024'

Kristian Walsh Silver badge

The “will” you refer to appears in a sub-clause of that sentence, a sub-clause which functions as the object of the main verb in the sentence, and that verb is “to forecast”. The meaning of that main verb implies that the sub-clause is in fact a conditional statement, despite looking like the simple future tense. (Consider the example: “maybe I’ll go”, or, as written longhand: “it may be that I will go”. That is not a definite statement, despite the apparent use of the future simple “will go”.)

So, because the sentence is making a conditional statement, it is entirely correct to summarise it using “could”

…or “might” if you’re really sceptical of the source.

20 years of .NET: Reflecting on Microsoft's not-Java

Kristian Walsh Silver badge

Yeah, only the exact kind of writing code against a supplied API that you happen to be doing is “real” programming.

Kristian Walsh Silver badge

Re: Notably missing in action...

That’s the problem everyone has on Linux. There is no native UI, just a half-dozen pretenders to the title of “Linux UI”.

I have faced it in my own work over the years: you start a project, and you first have to pick a UI library - a task that itself robs you of time, as you try to pick the facts out of various fanboy postings on the web. I’ve often regretted decisions that I made, or had forced on me, at project outset (e.g., the arse-pain of having to wrap existing datatypes in GObject boilerplate just to be able to have custom list-item views* in GTK+, or every single aspect of using Enlightenment).

In the end, I settled on Qt for Linux GUI work, but it took a lot of dead-ends to get there (and did I mention the living hell of having to write a UI in Enlightenment?).

Ironically, Windows suffers the same problem right now, with multiple competing UI kits leaving developers completely at a loss about what they should use for a new project (UWP, MFC/WPF, Win32??? Dammit, just tell me I have to use one, and I’ll use that, but you’re making me learn multiple platforms just to discard all but one of them). At least .Net 6 is fixing that issue.

__

* yes, this GTK+ example is probably over a decade out of date, but I have not forgotten how unpleasant, error-prone and time-consuming a task it was to do something so basic, and as a result, I resolved to never use GTK+ for a new project again. That’s another example of why this fragmentation is such a problem on Linux: because there are so many other options, I bailed out on GTK+, rather than pushing the developers to fix the ball-ache, or writing a utility library to do it for me. As a result, the development of the toolkit slows down, and everyone ultimately loses out. And it’s the same for every library.

IBM looked to reinvigorate its 'dated maternal workforce'

Kristian Walsh Silver badge

Re: The funny thing is they compare themselves to Accenture...

A good point. Accenture is known for its phenomenal churn-rate of young employees, and that keeps their median age artificially low (if you keep losing lots of 23-year-olds before they hit 24, and hire in new 22-year-olds to replace them, it's going to push down your numbers).

A higher median age is normally a sign of a company that people want to stay working with for the long term; so you can’t help but think that IBM is actively trying to make its working environment less hospitable...