* Posts by Kristian Walsh

1290 posts • joined 10 Apr 2007


Your next laptop will feature 'CMF' technology

Kristian Walsh
Silver badge

Re: Cost vs spec

There are plenty of systems like that already. Lenovo, Dell and HP's enterprise lines, to name but a few, have models that are bristling with ports, where you're paying purely for the components inside. I had an office-special Dell 15" at a previous job - it was dull, but the specs were really good, it would survive a fair amount of abuse, and it could connect to pretty much anything.

It's not a zero-sum game: companies are just addressing buyers who have different priorities to yours; but your attitude to purchases is still the prevalent one. If anything, the higher margins from the style-led systems will make the function-over-form models more affordable.

2
2
Kristian Walsh
Silver badge

Re: Translation :

On the contrary. This focus on aesthetics is because laptops and computers in general have a longer working life than before. There was a time you'd have a new machine every 18-24 months just to keep up with the needs of your software, but now a combination of better silicon and much more remote application hosting means that a typical business user's laptop will see four years or more of use.

It's worth investing a little bit in making something look and feel pleasant if you're going to be looking at it for that long.

Apple isn't the only one with a history of doing this, and it's a mistake to assume that this strategy is only about "high fashion accessories". Gaming brands like Alienware have also had a lot of success investing in styling and non-functional frippery to shift their hardware to their target demographics. But somehow, making a bog-standard PC look like some kind of extra-terrestrial combat equipment doesn't attract the same kind of derision as daring to make a portable computer that a style-conscious woman, or man, would be happy to be seen carrying.

Personally, I'd rather have the burgundy laptop with the fabric inlay, but I'm not going to pour scorn on someone whose gaming rig is their pride and joy.

7
6

Brace yourselves, fanboys. Winter is coming. And the iPhone X can't handle the cold

Kristian Walsh
Silver badge

I miss the days...

... when mobile phones were designed for use in the wintery depths of the Swedish or Finnish countryside

Never heard of a Nokia screen that didn't work in the cold... they were the first to bring out a phone you could use while wearing thick gloves, something I'm not sure iPhones can do even now (the use of fingerprint unlocking forces you to de-glove anyway).

12
0

First iPhone X fondlers struggle to admit that Face ID sort of sucks

Kristian Walsh
Silver badge

@coolcity - Re: Same as the Lumia 950

Where Microsoft (or rather Microsoft and Intel) did have face recognition two years ago was in the Surface Pro 4, Surface Book and all subsequent devices. I've seen that system in action, and it's really good - I'd go as far as saying it's faster than fingerprint reading, especially if you've just opened up the laptop.

Lumia's system was "iris recognition". Neither my wife nor I could get it to work. We both wear prescription glasses full-time, though, and I think that's the major factor that separates your experience from ours.

For a phone, I think fingerprint is probably the quickest, simply because you can get a head-start on unlocking the phone as you're picking it up or taking it from a pocket. It's not really a surprise that Apple has ditched an ergonomically good mechanism in favour of a much more complex and less useful one that sounds better right up until you use it... it's their modus operandi these days.

0
0
Kristian Walsh
Silver badge

Re: Same as the Lumia 950

Strictly speaking, the Lumia had iris recognition, not face recognition.

And honestly speaking, it didn't really work. Doubly so if you wear glasses. Even when it did work, the tiny delay between registering and unlocking was far longer than just using a fingerprint sensor, and that's without considering that you can start the sensor's job as you're taking the device out of your pocket.

Some Windows laptops and tablets support a far superior face ID system from Intel which uses a secondary IR-spectrum camera as well as the laptop's front-facing one. In this, it's like Apple's system. Where it differs is that, as far as I have read, the laptop setup works reliably. Hardly a surprise, as Intel and the PC OEMs are using the technology appropriately: you're usually looking straight at a laptop when you want to log in to it, so facial recognition can genuinely save time.

2
0

LTE it snow: Microsoft to punt out LTE-tastic Surface Pro in December

Kristian Walsh
Silver badge

I can't see it either. The Surface division makes money for Microsoft. Sure, it's not the same kind of return on capital as they get from their other divisions, but making and selling "stuff" is always lower margin than selling services or software.

And as well as earning money, the Surface line has done a lot of work to dispel the myth that only Apple can make attractive hardware, and it has given Windows OEMs a benchmark to aim for.

3
1

Footie ballsup: Petition kicks off to fix 'geometrically impossible' street signs

Kristian Walsh
Silver badge

@Stoneshop - Re: 1st world problems.

Actually, the new German train sign (introduced in 1992) is an "interesting" (depending on how interested you are in graphic design and signage, of course ;) ) example of how to change a design without losing recognition. I cannot find a link to the story I read about this now, sadly, so this is a summary:

This new version of this sign replaces a steam-train design ( https://commons.wikimedia.org/wiki/File:Bild_12_-_Unbeschrankter_Bahnübergang,_StVO_DDR_1964.svg ). Replacing this existing diagram posed problems because first, the old signs would remain in place; and second, other countries used (and continue to use) the "steam" version of this sign, and Germany, being in the centre of Western Europe, has a lot of transit traffic on its road network, so whatever the new one was, it had to be compatible with the other, older ones.

As a result, one of the concerns when drawing the new one was that it should not depart significantly from the dark-light patterning of the old-style "steam train" version when viewed from a distance. This is why the train curls rather than being shown side-on or head-on (which is the norm for pedestrian signs directing people to trains): a side-on view of a modern train would look too similar to the existing "trams sharing the road" sign, and a head-on version would be sufficiently different to not be recognisable at distance to drivers who are familiar with the old sign.

If you squint at the new version and imagine a steam-train, you should hopefully see that the roof and pantograph of the train form an area that could be considered the "smoke", and the carriages curling behind look somewhat like the rear standing area of a steam engine (I know nearly nothing about steam-trains, so those are definitely the wrong terms).

It's not identical, of course, but it still suggests the outline and pattern of the old version when seen at a distance in peripheral vision, and that was one of the goals for the new design: A driver who's familiar with the old sign will recognise the new one at distance as being "like" the old one. When they get closer, it'll look like a modern train, but the most important part of road signage is the recognition at distance.

0
0
Kristian Walsh
Silver badge

Re: 1st world problems.

If your first impression after seeing the sign was "that car has spun around", then the sign has succeeded.

Signs aren't pictures; they're intended to convey an idea quickly and unambiguously. Some signs don't seem to make much sense if you analyse them, but the "odd" choices are there for a reason: "Food available" shows a spoon-and-fork because the more "obvious" fork and knife looks, at a distance, like a crossed-out fork - "no food available". The bellows-camera is like that because no other sign looks like it, and it is recognisable as a camera. Same goes for the "choo-choo" train. The skid is missing tracks because showing two more of them adds visual clutter and obfuscates the meaning.

If you want a genuinely dumb UK road sign that needs changing, I would nominate this: http://www.key.co.uk/img/W/KEY/nt/IC/nt-img20070322130940_101357.jpg

If you said "footpath", you're in for a nasty surprise, but it's a perfectly sane assumption to make. Especially if, as a non-driver, you're someone who has never opened a copy of the Highway Code...

It actually means "No pedestrians", and while it is consistent with the rules of the road signage system, it's inconsistent with how people interpret symbols. There's a reason why "no left turn", "no right turn" and "no parking" all got diagonal bars through their sign designs, but for some reason, this one escaped.

(Ireland uses a slight variation of the UK signage system, but one in which all round, "prohibition" signs have a diagonal through them to make the meaning clear; here's our "no pedestrians": http://trafficsigns.ie/rus-038/ )

My favourite UK sign-trivia is about the "School children" sign used in the UK: In the original international signage design, this sign depicted an older boy leading a younger girl ( https://image.shutterstock.com/display_pic_with_logo/60395/60395,1217349674,4/stock-vector-warning-children-on-road-sign-illustration-15493621.jpg ), but the UK version has an older girl leading a younger boy ( https://static.independent.co.uk/s3fs-public/styles/article_small/public/thumbnails/image/2016/05/09/08/children-road-sign-new.jpg ) because Margaret Calvert, who designed the signs, used to accompany her sometimes unwilling younger brother to school. That's about as close as a road-sign designer can get to self-portraiture, I think...

(Many countries now mix and match the "big-sister" and "big-brother" versions of this sign, but the UK exclusively uses young Master Calvert being dragged to class by his big sister...)

10
0

How many times can Microsoft kill Mobile?

Kristian Walsh
Silver badge

Whatever is being denied you can count on the opposite being true.

Well they're doing the opposite of denial here: basically claiming that Windows 10 Mobile is finished, so what to make of that? Certainly, there's still lots of multiply-sourced rumours concerning Microsoft's hardware division, Qualcomm, the Snapdragon 835, and x86 emulation that need to be resolved.

My feeling is that Windows 10 Mobile is "dead" simply because the regular Windows 10 build is going to extend to portable, ARM-based, LTE-connected devices, with third-party Win32 app support through emulation. If it does that, there's really not much else that's unique to the "Mobile" branch of the OS except voice calls, which is just an app.

0
1
Kristian Walsh
Silver badge

Re: Andromeda: Apple

"iOS and MacOS were totally different GUIs for different kinds of use/screen. They maybe need to rethink the latest versions of MacOS a little and think harder about the big iPads and the surface like big tablet with pen, keyboard and iOS."

I think what you're saying is that Apple has yet to confront the problem Microsoft was trying to solve back in 2012: how to make a device work well on both touch and mouse input. Windows 8 favoured touch to the exclusion of mouse input; Windows 10 treats both more equally.

They superficially look the same, but Windows 10 is not the same UI for mouse and touch. It's about 80% the same, but things like click-targets and pop-up controls change their size and spacing when you're using touch input. It's also actually pretty easy for app developers to do complementary mouse and touch interaction on controls, once they decide that it's something that should be done (hint: fingers can't generate MouseEntered events).

I think Apple would cancel macOS entirely before they'd make it touch-capable. The only value in iOS is its app catalogue, and running those apps on a Mac will either compromise the security sandboxing that iOS guarantees app authors (if Apple still allows root access), or make macOS a similar low-function proposition to ChromeOS (if Apple blocks root access). The latter would kill Apple's sales to all those trendy tech companies; the former could cause some apps to be completely withdrawn from iOS.

0
2

Microsoft's foray into phones was a bumbling, half-hearted fiasco, and Nadella always knew it

Kristian Walsh
Silver badge

Re: "That's not a technical issue, it's something the developers need"

"Yes, it's not technical - it's not many developers want to share 30% of their revenues with Microsoft - it's a model that can work for mobe indie developer selling apps for 4.99, won't work for applications sold for hundreds or thousands by not so small companies..."

Certainly. There definitely should be a "flat-fee" or at least a capping of that 30% amount. Even $50 a copy would be good value for some of the $600 software packages. That, and finer-grained pricing control would make it possible to get the big-name packages onto the Store.

But, there's also the existing mechanism of putting the software up on the Store as a free download, and selling the activation key through your own channels. This is very close to how Adobe manages its products right now: the software is just a client for using an existing subscription.

3
3
Kristian Walsh
Silver badge

Re: And yet

Let's keep some perspective here. Windows 10 S turns into Windows 10 Pro with a mouse-click and reboot. You're prompted to make this change the first time you try something that's not allowed by "S". If you need the features of "full" Windows, that process is not going to be beyond your abilities.

But if you don't need PowerShell, the Linux Subsystem or Visual Studio, then 10 S is the way things should be done on Windows, or any other OS aimed at non-technical people. "Store" applications install and run in isolated containers, which makes it a hell of a lot easier to manage what's on the computer - deleting an application no longer leaves old registry settings, services or libraries hanging around; and you don't have twenty "update assistant" processes all running in the background - the Store manages the updates. No more needing to periodically "nuke" a system just because it's become filled with old software that was never uninstalled properly.

10 S also can't be given that un-removeable manufacturer bloatware that makes home Windows PCs so painful to maintain.

There's no technical restriction on the vast majority of apps being in the Store, and no legal restriction on FOSS software binaries being there either. Inkscape and VLC are already there; others like LibreOffice (of course) and Gimp are holding out for political reasons ("we don't agree with app stores") rather than doing something that would actually make life better for their users.

Shipping Surface Laptop with Windows 10 S was a way of raising awareness that Windows 10 S exists. Unless you work in Education, you'd probably never have heard of it but for the Surface Laptop launch.

The only problem with 10 S for the vast majority of users is that the software they might want to install is not yet available from the Store. That's not a technical issue, it's something the developers need to look at. And yes, Microsoft also needs to delete the 80% of what's already on the Store that's fake. Nobody cares about "number of apps" anymore, so it should be about the quality of what's there.

10
17
Kristian Walsh
Silver badge

Re: Awwww shut up and quit your whining.

That strategy assumes that app developers are rational actors. It is not a safe assumption to make.

Famously, the owner of Snap, Inc. refused point-blank to port Snapchat to Windows Phone, even when Microsoft themselves offered to provide the necessary engineering. When third-party clients appeared on the platform, they were threatened with legal action. For an application that required a wide user-base, the decision was bizarre.

Google also refused access to YouTube via an app on Windows Phones; but at least here there's the explanation of a company engaging in anticompetitive practice to protect its marketshare. The Snapchat decision was purely "I don't like Microsoft".

Now, I have no use for Snapchat, but it's a "must have" app for a lot of users, and its absence from the Nokia/Microsoft phones prevented their combination of low price and smooth performance from gaining traction with younger buyers at a time when Android and iOS could not offer this.

16
3

Microsoft Edge shock: Browser opts for Apple WebKit, Google Blink

Kristian Walsh
Silver badge

Re: Seriously.

Arrow was not acquired, it was developed in-house by Microsoft: https://www.microsoft.com/en-us/garage/profiles/arrow/

0
0

Hollywood has savaged enough sci-fi classics – let's hope Dick would dig Blade Runner 2049

Kristian Walsh
Silver badge

Re: For me the moment of Roger Batty's death is extraordinary and deeply moving.

Saw a documentary on this (on one of the many, many Director's releases), and surprisingly, his version is no shorter than the originally scripted part. The scripted piece was pretty flat - in Hauer's words, "opera speech" - so he dumped almost all of it. Ironically, given that "opera" description, it was he who brought in the famous "Tannhäuser Gate" phrase that had been in an early draft of the script but was later cut.

The closing lines, with "... like tears in the rain" were entirely Hauer's, and he got an ovation on set after the take.

5
0
Kristian Walsh
Silver badge

Re: Hollywood being moribund

I know it's a typo, but "Marvell Comics" gave me a smile. You'd need superpowers to get that damn NIC to show up in Linux....

But, on your point, the problem is that studios are businesses, and a business can't drop a quarter of a billion dollars on something that only might return its money. The days of releasing a wide range of pictures to catch a cult hit are largely gone in the big studios. Whatever your views on torrenting etc., one thing it has done is dramatically shorten the earning life of the average movie (and mega-budget movies often turn out to be very average movies). Now, a picture has to make back its costs in the first three or four weeks of theatrical release - if it doesn't, disc sales and streaming won't rescue it.

Marvel's endless movies are successful because the generations of Americans who grew up reading those comics provide a ready-made audience for anything they put out. People hear the title and most will know what it's about without the studio having to spend a cent on publicity, so the hundred million or so they do spend has a much greater effect in getting as many people buying tickets in that brief window before the thing shows up on every torrent site.

Streaming TV services are the only place you'll see slow-burners like the original Blade Runner getting commissioned now. The charging model there at least allows a story that doesn't have instant appeal to be commercially viable over time as more people get into it.

6
1

Shock: Brit capital strips Uber of its taxi licence

Kristian Walsh
Silver badge

Re: 40,000 drivers out of work

"EGR blanking and rechipping for economy and high NOx is perfectly legal."

No it's not. "You probably won't be caught" doesn't make something legal. If it was perfectly legal, the cars would ship without EGR from the factory.

The car has to be certified as legal to use in the United Kingdom (or any other country with laws). In the UK this is the Vehicle Type Approval, and it is separate from your MOT, which is a roadworthiness and safety test. Part of the Type Approval lists the exhaust emissions standard that the vehicle complies with. The current standard for passenger cars is "Euro 6", and it sets strict limits on NOx emissions among other things. (Which standard you need to comply with depends on when the model was introduced, although if you're Mercedes or Volkswagen, it seems you can bribe your way into complying with the old type approval rules...)

If an owner modifies the emissions controls such that their car is no longer compliant with the emissions standard on its Type Approval documents, it will be illegal to operate that vehicle on public roads. End of story.

Whether you'll be caught and punished is a different matter, but of all the things that come out of a car's exhaust pipe, nitrogen oxides are the most dangerous to long-term human health. They are the major cause of urban smog, which causes respiratory illness in children and the elderly. You're free to not believe in global warming, safe in the knowledge that it'll take decades before you're proven wrong, but NOx pollution is much more immediate and direct in its consequences.

32
2
Kristian Walsh
Silver badge

Why Uber was stripped of its licence

Uber lost its licence for not adequately following up on reports of passenger assault and rape, and not providing evidence of adequately screening drivers for prior violent offences.

It doesn't matter what they charged. It doesn't matter that "Black cabs are too expensive". Uber operated a company where a driver could attack a passenger and get away with it, and so it lost its licence to operate in London.

It doesn't matter that you, personally, never had a rapist driving any time you booked an Uber. It mattered that in the cases when people did, Uber didn't follow up on the police reports, and didn't take action against the drivers.

If someone else started an Uber competitor tomorrow that did everything Uber did, but obeyed the actual laws of the land, properly screened its drivers and co-operated with police investigations of assault on passengers, they would not have their licence revoked by TfL.

The usual suspects are, of course, free to assume that "Uber ignored cases of rape and battery" isn't really a reason to have a licence revoked, and that this is all a smokescreen to stop the "exceptional" people achieving their birthright as rulers of the world.

149
6

Uber sued by Uber for tarnishing the good name of Uber

Kristian Walsh
Silver badge

Re: Trademarks...

Trademarks are granted within domains. Uber-the-dickhead applied for a mark in the domain of "transportation services" (not sure that's the exact name, but they are all quite vague), whereas Uber-the-Floridian applied and got one in the domain of IT.

This happens a lot: For instance, there are two entirely independent companies using the trademark "Kenwood": a UK one, making kitchen equipment, and a Japanese one, making electronics and radio communications equipment. An American butter company also trades freely as "Finlandia", two aisles away from the better-known vodka brand.

Canon EOS cameras are in a different domain to Volkswagen's poor-selling EOS Golf-with-a-metal-roof, so both names were granted. "Dove" is granted to Mars and Unilever for a chocolate bar, and a range of soap products, respectively - to further avoid confusion, Mars uses the "Galaxy" name for the chocolate in countries where the soap is popular, but "Galaxy" is also a trademark of Ford, although not for chocolate... meanwhile "Puma", one of Ford's other names, is also a globally-known brand of athletics gear, and so on. Basically, without domain limitations, there'd be very few good trademarks left by now.

The problems happen when a company starts in one domain and then grows into another, or when a company gains such a bad rep that it tarnishes the name for everyone else using it.

Uber is that second type, but a famous example of growing into infringement is Apple: the trademark "Apple" was given to the computer company, despite already being held by a record label (The Beatles' "Apple Corps"), because in 1976 (and now) these were different domains of business. In the 1980s, when Apple added audio recording and playback to their products, the record company secured a legal agreement preventing the computer company from operating in the music business, which didn't seem like a problem for Apple until they started selling music on iTunes. In the end, lots of money changed hands, and everyone is sort-of friends now, as evidenced by the availability of all the Beatles' albums on iTunes, but Apple-of-Jobs did need to get permission from Apple-of-Beatles for the right to use the name "Apple" in connection with selling music.

[Trivia: the Mac alert sound "Sosumi" is a reaction to the original, 1980s-era spat between these two. The somewhat Beatles-like organ chord was originally called "Let it Beep", and then changed to "Sosumi" in reaction to complaints from Apple's Legal department. The developer passed it off as being a Japanese word, but it's actually three English ones.]

1
0

Google sued by Gab over Play Store booting

Kristian Walsh
Silver badge

By calling themselves "conservatives", they're eroding the traditional conservatives' rational ideas: that you should earn your living through work; you should understand why things are as they are, rather than blindly change them; that you should be personally responsible for your actions; that you should be charitable, and that you should actively help to look after the less fortunate in your own community.

[contrast the socialist/liberal position that the state should support those who are not working without question; that many of society's structures are outdated and need to change; that poverty, upbringing and societal deprivation can explain away some crimes; and that the state should provide for the less well off, with taxation replacing charity]

The first mistake people make about the "alt-right" is thinking that they're conservatives at all: they're actually radical Libertarians. They don't want to conserve anything: like the radical Left, they want to destroy what's here now (and just like the radical Left, they can't come up with anything that wouldn't be at least a thousand times worse than what we've got).

Forget work-ethic and personal responsibility: these dickheads want the whole pie and they want it now; they want to destroy any kind of authority so they can do what they feel like without thought for consequence; and when they screw up, they'll blame "the media/the left/the deep-state/bias/positive discrimination" - basically anyone except their own dumb self. They're the spoilt brats of the 1980s generation that idolised Ayn Rand's philosophy of "I've got mine, so fuck you", but without the benefit their grasping parents had of growing up in modest circumstances, during a period of history where "right on" socially liberal causes were strongly in fashion.

Conservatism is almost the opposite of the "alt-right" in many respects, but in the US, the name "conservative" has been pretty much co-opted by the radical Libertarians, to the point where it's hard to separate them anymore.

(For what it's worth, my own personal politics are left-liberal: socially strongly progressive, but fiscally mildly conservative)

5
0

Unloved Microsoft Edge is much improved – but will anyone use it?

Kristian Walsh
Silver badge

Re: The interface is terrible

Each to their own. I like the fact that there's so little "interface" around the content I'm trying to read. I actually wasn't aware of the F11/Full-screen trick, but I'll be using it more.

As for Chrome, I've pretty much resolved to never install it again. Its habit of grabbing all my CPU cycles was bad enough, but a couple of days ago, I logged in to the gMail website on a new (Windows) system, and of course I got the standard "we've seen a new sign-in, was it you?" mail afterwards from Google - but now they've added a bit at the top saying "you were using Edge; why not use Chrome instead? Get Chrome here." No thanks; I don't react well to coercion.

Google has become the Old Microsoft: whatever you like or don't like about Edge, it is a standards-compliant browser, as is Safari, as is Firefox... so why do so many Google services (Meet was this week's example) tell me that I need to install Google Chrome to use them?

Isn't that behaviour exactly why Microsoft was sued by the U.S. Government ... and lost?

46
4

Five ways Apple can fix the iPhone, but won't

Kristian Walsh
Silver badge

@John Robson- Re: Sound

"one you complained that at the nyquist frequency you can lose signal - which is never in question

For another with a band limited signal there is no loss of phase information from sampling."

But both of those "complaints" are real things.

Okay, the Nyquist frequency is the maximum frequency component that can be reproduced by a sampling system. Obviously this will be 0.5fs, because two samples are required to capture the positive and negative half-cycles of a sinewave. Anything above that frequency can't be adequately captured. That's Nyquist's Theorem (and Shannon's).

But, and this is the bit I think you've missed, this assumes that the sampling clock and signal component at 0.5fs are phase-coherent. Nyquist describes the theoretical maximum information capture, which requires an assumption about phase.

Consider a signal with only a single sinewave component at exactly 0.5fs of amplitude ±1.0, sampled at fs. If the sampling clock and the signal are in the appropriate phase, the samples will be obtained at the peak and trough of that waveform, resulting in a train of +1.0, -1.0, +1.0, -1.0. Perfectly captured, perfectly reproducible. That's the situation Nyquist described.

But: shift the phase of that signal by 90 degrees. Now the sampling occurs at the zero-crossing points, and the output is a train of 0, 0, 0, 0, 0, ... How is that distinguishable from silence? Yet there was a component at the Nyquist frequency in the input. The thing is that Nyquist's theorem assumes that the phases of the components are compatible with the sampling clock.

90 degrees is the worst case, but at other phase offsets you lose amplitude accuracy. Shift that input to be 45 degrees out of phase with the sampling clock and the component is still captured, but as +0.7071, -0.7071, +0.7071... right frequency, wrong amplitude.

As you shift the phase of an input component that's close to 0.5fs, the recorded amplitude will appear to change - this is a loss of information (the amount of error depends on how close the component is to the Nyquist frequency). That is not a controversial or "wrong" position; it's a fundamental property of sampling, and it's the main reason why signals with a 20kHz bandwidth are sampled at 96k and 192k.
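Those three worked cases are easy to check numerically. A minimal sketch (NumPy; the eight-sample window is an arbitrary choice, and the phase reference is picked so that 90 degrees lands on the peaks): a component at exactly 0.5fs sampled at fs gives sample values sin(πn + φ) = (-1)ⁿ·sin(φ), so the captured amplitude depends entirely on the phase offset φ.

```python
import numpy as np

n = np.arange(8)  # eight sample instants, one per sampling period

# A sinewave at exactly 0.5*fs, sampled at fs: sample values are sin(pi*n + phase)
for phase_deg in (90, 45, 0):
    phase = np.deg2rad(phase_deg)
    samples = np.sin(np.pi * n + phase)
    print(f"{phase_deg:2d} deg offset: {np.round(samples, 4)}")

# 90 deg: +1, -1, +1, ...          (sampled at the peaks - perfect capture)
# 45 deg: +0.7071, -0.7071, ...    (right frequency, wrong amplitude)
#  0 deg: 0, 0, 0, ...             (sampled at the zero-crossings - "silence")
```

The same tone, differing only in phase, comes out of the sampler at full amplitude, at ~71% amplitude, or not at all.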

I do agree with your other points: final mastering has done a lot to ruin the reputation of CDDA (although lousy DACs that weren't linear to 16-bits did their damage before then), and yes, the differences are marginal at the end. I don't think that the higher rates are very useful in themselves, but rather in the way they give adaptive reproduction equipment more "real" information to work with, so that when they've finished mangling and munging the samples, what's left is still as good as 44.1/16.

1
0
Kristian Walsh
Silver badge

Re: Sound

"and demonstrates confusion over things like nyquist frequency, and the accuracy of phase information in a sampled signal..."

I can't see how you came to that conclusion - the problem of phase affecting the recorded amplitude is well known and pretty easy to demonstrate, and it is significant when your goal is fidelity of reproduction, rather than simply producing an intelligible signal. Phase differences in high frequencies between the Left and Right signals are the whole reason stereo recording works at all, so it's a very important factor.

But really the point I was making was that there's nothing special about 44.1k / 16 bit, and that without the particular constraints that existed at the time, the industry would have gone with higher bitrates. Particularly, that 44.1k sampling rate was a result of the need to create masters affordably, rather than any solid engineering analysis of the problem. It's telling that every subsequent format has used a base rate of 48kHz or a multiple thereof.

If 44.1k at 16 bits had been "perfection", there wouldn't have been such an immediate jump in bitrates so soon after its introduction. For comparison, it took nearly 30 years for 24-bit RGB to be challenged as a display system; consumer DAT recorders were already at 48kHz less than ten years after the introduction of CD.

I take your point about adjustment to high sound levels, but it's not just the absolute dynamic range, it's the non-linearity of that range, and everyone's hearing is different. It's generally accepted that 16-bit PCM divided over 96 dB (okay, 110 with dithering) isn't quite good enough to deal with the peak sensitivity of the ear. 24 bits is definitely a touch of overkill, but it comes out as a nice multiple of bytes, and gives more headroom for mixing and signal processing that is becoming much more common in reproduction equipment.

Dithering the LSB is simply overlaying a 15-bit PCM signal with a PWM signal - you get the downsides of PWM in exchange for a lower noise floor over a part of your frequency range. It isn't adding any information, just hiding the errors in a different place.

I didn't say a correctly upsampled signal would lose information; I said that common methods of upsampling a signal cause information loss, especially from 44.1 to the 48/96/192 rates.

Now the non-technical advantages:

One of the plus points of high bitrate audio is that it has resulted in better quality DACs. Just as CD's higher dynamic range caused an improvement in the quality of amplifiers... that were then used to play vinyl; so the requirement for affordable parts that can handle "24-bit at 192kHz" results in much better reproduction of the 16-bit at 44.1kHz sources we mostly have. Paradoxically, it's only the spread of high-bitrate audio that allowed people to see how marginal its advantage is.

(another non-technical argument is that because 44.1/16 is so wedded to those metal discs that get played in anything, such recordings have recently been mastered to utter mush just so that they will sound "loud" on cheap equipment; the other formats tend to escape this last step in the process)

8
0
Kristian Walsh
Silver badge

Re: Sound

"16 bit, 44.1kHz isn't an arbitrary playback standard. It's chosen to match the capabilities of the human ear - the complete capabilities of the perfect human ear."

Nope. It was chosen to match the vertical-blanking insertion period used by 60Hz U-Matic videotape equipment. In the late 1970s, it was the only affordable recording medium with the bandwidth to hold a CD master, so it dictated the sampling rate. ( https://cardinalpeak.com/blog/why-do-cds-use-a-sampling-rate-of-44-1-khz/ )

So, that gives a maximum reproducible frequency of 22.05 kHz. While it's true that few humans can sense audio signals over 20kHz, there are many steps in the chain of reproduction that make 44.1kHz not quite good enough to reproduce the full audio spectrum, especially if you wish to provide a stereo signal.

First off, before you can get any kind of digital signal, you need to encode it. That means sampling. However, before you sample a signal, you need to remove any signal components whose frequency is too high for you to sample. If you don't do this, you get aliasing, and a worthless digital input (https://en.wikipedia.org/wiki/Aliasing). Thing is, the analogue filters you need to do this removal of un-sampleable signals do not have a perfect on/off response - in effect, if you want a filter that will pass frequencies of, say, 16 kHz, you may also have to allow frequencies as high as 25 kHz through too, because they're still within the tail-end of the filter's "pass band". You can make that cutoff sharper, but doing so can create "ripples" in your pass-band, and/or allow higher frequencies through again (analogue filter design is a special kind of hell...). But if you raise your sampling rate to 48kHz, you've got at least 4kHz of headroom above the highest frequency you need to preserve.

Down-converting a multiple of 48kHz to 44.1 kHz is possible, but if it's not done correctly (and it often isn't), it introduces similar artefacts to the aliasing problems during sampling.

The second reason for higher rates is for better preservation of signal phase. The human auditory system uses phase differences between higher-frequency signals to determine the spatial positioning of a sound source, but phase and amplitude interfere with each other in digital sampling systems as you approach the maximum permitted signal frequency. The extreme case is that a signal with a frequency of half your sampling frequency will not register at all if it is 90 degrees out of phase with the sampling signal (the sampling points would fall on the zero-crossings of the input, so you get 0,0,0,0,0... as your output). With mono, phase isn't usually an issue, which is why most sampling tutorials gloss over it; with stereo, phase accuracy is very important.

The third reason is that most modern replay equipment processes its signal before converting it back to analogue. Equalisation, driver response correction (as used in "direct digital" speakers and headphones), room parameters, delay, noise cancellation and dynamic compression all happen on the digital signal, but all take their toll on the output. If you start with more information, even if that information is not audible, the accumulated errors from DSP will still be in the inaudible part of your signal (you don't get the same benefit by simply "upsampling" to 192kHz/24-bit before processing, because upsampling itself cannot add information; in fact, it removes it).

Finally, your hearing isn't linear, but PCM audio is. 16 bits is about 96 dB of dynamic range, but your hearing has about 130 dB of dynamic range, albeit with a non-linear response. You could use non-linear PCM to extend the same 16 bits over a wider range of amplitudes, but that means non-linear DACs, which are much harder to make than linear ones (and it can increase audible distortion where high-amplitude, but very low frequency, tones are overlaid with higher frequency tones - as often occurs in music). It's easier to just use more bits, and capture the full dynamic range of human hearing.

With lossless coding, high bitrate audio doesn't take very much more space than 44.1/16 (mainly because the signal is only 0-24 kHz), and as it makes improved reproduction much simpler to implement, there are plenty of reasons to prefer it to 44.1.

49
4

The future of Python: Concurrency devoured, Node.js next on menu

Kristian Walsh
Silver badge

Re: Trivial ?

You could have made that objection without coming across as a condescending git, you know.

I was discussing the type-enforcement features of the language itself: Enumeration types and switch-case illustrate an advantage of time-of-compilation ("static") type knowledge, versus time-of-execution ("dynamic") type knowledge.

As I was talking about the Python language, it's entirely correct to say that there's no enforcement of data types, because the language itself has no concept of expected types for function arguments. And, while you are also entirely correct that the Python runtime enforces datatypes, that's too late for any feature, such as enumerations, that requires compile-time type knowledge.

0
0
Kristian Walsh
Silver badge

Re: Trivial ?

Switch/Case is a code smell, that can be completely done away with. Especially in languages such as Python.

If it's a code smell, it's a smell of good design. Switch/case is designed to enforce small sets of values, especially when coupled with enumeration types. "Languages such as Python" don't enforce types at all (by default; I know about Python 3's type hints), so attempting to enforce values is a little meaningless.

In languages that support it, a switch/case block is telling you something very important about the author's mental model of the code at the time they wrote it. It's saying "At this point in the program, I expect this variable to have one of this limited set of constant values"*

If you're using enumerations, switch/case additionally allows the compiler to do coverage checking for you (warning you when your switch block doesn't test for all possible cases).

If/elif/else cannot convey that information.

Saying something "can be completely done away with" is not a useful argument. Ultimately, all you need is 'if zero then goto' for any programming language, but filling your code with such constructs strips it of any hint of what the hell you were trying to achieve when you wrote it. There's a strong argument that whole point of having high-level languages in the first place is to capture the intentions of the programmer, because hand-optimised machine code is pretty opaque to a maintainer.

* There are, sadly, exceptions: Swift's switch/case "value bindings" feature ignores the "limited and constant" nature of switch/case in an attempt to be "helpful", and in doing so reduces the structure down to a pretty-printed way of writing if/elseif/else. If you're using "clever" value bindings in Swift, you really should be using if-elseif-else, because all you're doing with value bindings is hiding one kind of test, if (i.e., "evaluate an expression and compare the result"), inside a language structure, switch/case, that is normally used for a different kind of test.

3
0

Massive iPhone X leak trashes Apple's 10th anniversary circus

Kristian Walsh
Silver badge

Re: Facial recognition

Intel had a much superior face-ID technology to that of the Lumias, and Microsoft used it in some of its Surface products. It made use of an additional front-facing infra-red camera beside the existing visible-spectrum one. Having two cameras provided depth perception, while the IR one allowed the system to distinguish between a warm-blooded human face and a wax/plaster/plastic model of one.

I guess there wasn't space for yet another camera in a phone, but it does work much better than relying on a single visible-light camera.

2
0

Node.js forks again – this time it's a war of words over anti-sex-pest codes of conduct

Kristian Walsh
Silver badge

Agree. Neurologically atypical people can unintentionally cause offence by saying the wrong thing, but being told "I was offended by that" usually resolves such issues. And at least in written communications, it's easier for that message to be conveyed; many of the problems around Aspergers particularly are due to an inability to pick up on tonal or facial-expression cues in face-to-face or verbal communications - the things that convey the real meaning of negative responses like "ooo-kay..." and "yeah, thanks for that".

Knowing that a behaviour is offensive to somebody, and then choosing to continue with it in their presence cannot be excused on the basis of being neurodivergent. "Choosing" is the key word; people with Tourettes cannot choose; those with Aspergers most definitely can.

I've worked in software a long time. Long enough to meet many people who'd be described these days as "neurologically atypical". Of those, the percentage of assholes was pretty much in line with the percentage in the "neurological normal" population... If there is really a particular cluster or clusters of neurons that makes someone a dickhead, it's not those ones.

3
0
Kristian Walsh
Silver badge

Re: Anyone read the article?

The complaint is that he acted like a jerk with other contributors, had a history of responding overly aggressively to requests to stop doing so, and tried to dominate discussion through bad behaviour rather than demonstrating better alternatives. (Even by Node's poor standards, he's an outlier)

CoC documents are a bit dumb and cringeworthy, and they do tend to be framed in the language of Identity Politics, rather than starting from the simple rule of treating every other person with basic respect and manners. But that doesn't remove the fact that there are a lot of people out there who do need to be told what manners and civility are.

I will not accept "oppression of neurological minorities" as a counter-argument, as it's grossly offensive to the majority of people with Autism spectrum conditions who are not dickheads, because they have chosen to not be dickheads. (Frankly I find the idea underlying this "defence", that someone with Aspergers or similar cannot tell right from wrong, more insulting)

6
1
Kristian Walsh
Silver badge
Mushroom

Re: Well, I don't know who's done what to whoever

I didn't need a demonstration of poor interpersonal communications skills from steering committee members to make that decision; this four-word description sufficed:

JavaScript on the server.

6
0

US Navy suffers third ship collision this year

Kristian Walsh
Silver badge

Re: What do they all do? @SkippyBang

From the grossly-simplified understanding I have of the laws of maritime navigation, where there's a danger of collision, isn't it always the vessel that can get out of the way fastest that must change course to avoid the collision?

In that case, wouldn't it be the destroyer that had to move? Tankers aren't known for turning quickly.

15
1

FYI: Web ad fraud looks really bad. Like, really, really bad. Bigly bad

Kristian Walsh
Silver badge

This has nothing to do with ad blockers. It's about ad networks "serving" adverts to automated bots, in order to collect the display fee. The advertiser pays for the ad, the broker (usually Google) gets its fee, the bot operator gets an income for "showing" it... but nobody ever sees it.

It's fraud, but you'll notice that a. the company uniquely placed to turn off the flow of ads to these bot networks doesn't do so, and b. that company gets its cut regardless.

32
1

Don't buy Microsoft Surface gear: 25% will break after 2 years, says Consumer Reports

Kristian Walsh
Silver badge

Re: Updraft102 Not really a "survey"

Apple are the nadir of design and reliability.

If those are his exact words, you could agree wholeheartedly with him, and then point out that a nadir is the lowest point.

On the topic, I had a Surface RT 2 (ARM) whose touchscreen failed after 15 months. Microsoft replaced it, free of charge and without quibble, with a Surface 3 (Intel) on the grounds that the RT2 wasn't made anymore. And they also included a keyboard with the replacement, on the grounds that the keyboard for the RT (which I never told them I owned) wouldn't quite cover the taller screen of the 3.

That said, Apple's customer returns policy is pretty good too - friends of mine have had no-question replacements for dud iPhones, although you do have to go to one of the company's Stores, which is pretty inconvenient if you're not in one of those cities. But that's why people think so highly of them: It's not the absolute reliability, it's how well the company deals with the problems that occur.

Consumer Reports is famous for extrapolating "findings" from wholly inadequate sample sizes. 300 is simply not a big enough sample size for a population of millions, but then to use data on one product to extrapolate to another, later model of a different form factor, is as nonsensical as saying that because some BMW motorbike owners had problems five years ago, you shouldn't buy a new 3-series Hybrid. (How does the reliability of a tablet accurately model the reliability of a laptop?)

2
0

New Amiga to go on sale in late 2017

Kristian Walsh
Silver badge

Re: Just remember... @Mike 16

Thanks for the "inside story" - my info was just picked up from extensive reading, but often there are things that those sources won't write down... :)

The Amiga being a console only makes more sense, in hindsight - Atari Inc already had a 68k system for cabinets before (Marble Madness, Paperboy, etc...), and it fits with the HAM mode being originally a composite video generator... arcade machines were all RGB displays, so there'd have been no need for composite output.

Lots of accounts from the time suggest that Tramiel really did miss out on Amiga, and the attempts to kill it were after he realised how much of a threat it would be. There's also a bit of Amiga Inc deliberately keeping a low profile to avoid being noticed by Atari's new owners.. It was probably animosity toward Commodore, as the only likely buyer, rather than Amiga, that led to "Tramiel" Atari trying so hard to secure injunctions against Amiga, and I wonder if Commodore would have been so keen on Amiga had it not been seen as a poetic way to stab Tramiel.

I didn't intend to claim that the Amiga chipset was based on the 400/800, just that the concepts are very similar - nobody else so fully embraced the idea of displaylists, or the idea that a graphic display mode is a property of the current scanline, rather than the entire field. Those ideas started in the 2600 out of necessity, and the mindset carried over into 400/800, and later Amiga.

JT essentially bought the Atari logo to slap on the designated heir

Absolutely. The complete shift in product focus after ST is amazing in hindsight. Atari, the company that pretty much invented video-gaming, suddenly became a business-machine provider: the ST's value proposition was its high-resolution monochrome display, easy porting of DOS software, and cost-effective printers and hard-disks.

Meanwhile, with Amiga, Commodore Business Machines launched what would become the ultimate games machine of the late 1980s.

I wasn't sure of the exact timing of ST, but it was definitely a quick design, and most unusually for the time, it came to market on time and on price - I suppose on that basis, it must have been started before Tramiel took over.

When I studied microprocessor systems, all of the textbook and data-sheet reference designs for 68k were eerily reminiscent of the ST that I had learned to program 68k assembler on, so I suspect that there was a lot of wholesale lifting of reference designs there.

ST was a really good piece of engineering in terms of "most performance for least cost", even if that meant some dated component choices that didn't stand the test of time. Most unforgivably, ST didn't even have a rudimentary DAC, just that Yamaha square-wave generator that wasn't even up to the capabilities of the 400's POKEY chip. ST did have a rudimentary DMA controller, if I recall, which would have made a PCM audio system in the mould of the original Macintosh's (DMA to a DAC, with an interrupt when the buffer empties) relatively easy to implement.

2
1
Kristian Walsh
Silver badge

Re: I'd say..

Jack Tramiel had already left Commodore before Amiga was launched. Had he been a little easier to get along with, I suppose he may have guided the Amiga product launch at Commodore (Atari's death meant that Amiga Inc pretty much had to go to Commodore for help).

I've a lot of respect for Tramiel - he never bought into the myth of exceptionalism that the other, West Coast computer pioneers did. He was always clear to correct anyone who attributed his business success to anything other than hard work and luck.

Also, as someone who could never have afforded a Macintosh, the ST's pricing made it much more of a "computer for the rest of us" than Apple's Macintosh. It and Amiga opened up opportunities to learn "commercial" programming that had not been available before.

5
0
Kristian Walsh
Silver badge

Re: Just remember...

ST owner here. My first ever paid development work was a magazine cover-disk game for the ST. A meagre fee, but it's still "for reward"...

The ST is most definitely the newer system. It's a classic story of corporate politics. Before its magnificent implosion, Atari (the Warner subsidiary) had commissioned Jay Miner (developer of the video hardware in both the 2600 and the 400/800 computers) to develop a new chipset for arcade and home use based around a 68000 CPU.

In the mess of the takeover of Atari by Jack Tramiel (recently ousted from Commodore), that contract lapsed, and it seems there was also considerable "anti-Atari" feeling from the new management team too. In fairness, the old Atari had been spectacularly badly run as a business, so this could be justified. In any case, Amiga Inc had lost its customer, so went looking for a new buyer, and Commodore seemed a natural choice, as there were really only a few serious options (Atari, Apple, Commodore - but Apple had just put out their Macintosh, so that just left Commodore)

Anyway, having lost Amiga either through ignorance of its existence, or risk aversion, Atari (the new Tramiel-owned company) then needed a proper 16-bit computer, and fast, so the ST was built in about a year using mostly off-the-shelf components. The Blitter chip was the most complex custom silicon on the ST, but it missed the deadline and got dropped from the launched product; the underlying graphics library (on the 68000's Line-A trap) in the ST's ROM was, however, clearly designed with this chip in mind.

The ST was a better computer design: it followed through on the 68000's clean architecture to produce a system that was logically arranged and easy to program. The Amiga was far superior as a multimedia machine, but it had some quirks. AmigaOS was a bit too adventurous for a CPU without memory protection (or a way to restart a bus error). I did own an Amiga for a while, but the poor stability of its OS for "work" tasks brought me back to the ST. Amiga, hands down, had the best games.

If you're into the Amiga, it's worth looking into the Atari 2600 and the 400/800. There's a lot of what became Amiga in those two machines (e.g., the 400's ANTIC was the forerunner of the Amiga's Copper display-list processor)

Fun fact: Amiga's impressive 4096-colour Hold-and-Modify graphics mode was never designed as such: it was instead a relic of abandoned circuitry to directly produce a composite video output, as befitting the product's games console origins. In that mode, you'd hold the chroma signal(s) for two pixel clocks, and modify the luma every pixel clock to produce a full colour display with lower memory requirements, as television video standards all had a higher resolution for the luminance signal than the colour signal. When the video output requirement changed to RGB, the circuitry was repurposed to provide the HAM modes, on the basis that it was already in the chip and might be useful for something...

21
0

Google's macho memo man fired, say reports

Kristian Walsh
Silver badge

Re: No diversity of opinion from "progressive" conformity

I remember 20 odd years ago our (female) science teacher telling us interesting extra-curricular science facts

I hope, for the quality of your education, that what she actually said was "when observing a large sample, the female participants showed this trait slightly more often than the male participants did." Because these differences are statistical, they are small, and the degrees overlap considerably between individual males and females. Reducing any sex-linked personality trait to a binary flag would be an egregious mistake.

I'm male, and have below-average spatial calculation skills, but high linguistic proficiency and positional memory - all of which are supposed to be "female" traits. However, I also have very high logical reasoning scores, which is allegedly a "male" trait. A group of us (mixed sex) did these tests once, and there was nobody who fit the clear-cut "male brain" or "female brain" categories. But I'm pretty sure that had 10,000 people taken the test, there'd be a slight difference by sex. But only slight.

We're in danger of amplifying the noise floor here: ignoring a huge "same" to focus on the tiny "different" is a common fallacy I've seen in tech people when they move outside of technical decision making (it's not unique to technical people, but it's more common there). That thinking is absolutely the right approach when trying to track down a bug, but not when trying to hire a team. (Over-weighting these small differences between people explains both entrenched misogyny/racism and poor "diversity hires" - the only difference is in the hirer's preconceptions)

3
1

Microsoft Surface laptop: Is this your MacBook Air replacement?

Kristian Walsh
Silver badge

Re: Macbook Air replacement my arse

macos users also have access to a rather large library of open source tools with a wide range of options on how to obtain them from pre-compiled binaries for some to packagers like Homebrew or even bare vanilla compilation from source.

All true, but the implication that this isn't on Windows is out of date. These days Windows 10 supports pretty much the entire Ubuntu userland, which is a much more useful proposition. Once you enable the feature, you can install packages using apt-get, and they're the exact same binaries as Linux uses, so there's no delay for porting and no "missing" packages. Plus, it really is a Linux-compatible userland - for proof: about an hour ago, I used gcc on my Ubuntu-on-Windows 10 desktop to compile up a simple C tool, ran it locally, then scp'd that binary over to a native Linux host, and it ran perfectly there too. Do that on macOS, and I'll be impressed.

I was pretty much an exclusive Mac user for most of the last 20 years. Brew and Ports are nowhere near as easy or up-to-date as you describe, and there are also differences between the BSD core tools and the Linux ones that will bite you from time to time too (for instance, macOS comes with a different implementation of 'grep', with different defaults; same for 'netstat'; and 'route' shares only a name with its Linux equivalent). I used the Mac as a simple scratchpad to try fragments of shell-script, but it really wasn't compatible enough with Linux for real prototyping. Windows 10's Linux subsystem still isn't as good as an actual Linux target for everything, but for a surprising number of developer use-cases, it is.

0
1
Kristian Walsh
Silver badge

Re: "But don't you have to pay another £70 to do so?"

The 10S free upgrade is to Windows 10 Pro, not Home.

0
0
Kristian Walsh
Silver badge

Re: Betteridge's law of headlines...

In my case, the answer is "yes". I've a MacBook Air that needs replacement. I'll either retire the Mac, or replace it with one of these. Either way, I won't be getting a MacBook - the new generation Apple keyboards are horrible (but at least the obnoxious light-up branding is finally gone).

The keyboard on this is phenomenally good for a lightweight laptop - I would agree that only the ThinkPad keyboards are better, but I could never warm to the ThinkPads otherwise.

0
2
Kristian Walsh
Silver badge

Re: ... but will it

If they hadn't made such an effort to lock down the BIOS to prevent people installing Linux then I might even have considered buying a Surface at one point.

Er, "such an effort"? Really?

The Surfaces are all pretty standard UEFI systems. Installing Linux on one is trivial; getting everything to work once it boots is not. The problem is getting Linux drivers for the custom hardware (sound, touchscreen, pen, keyboard, webcam), but that's not Microsoft's "efforts" - it's a matter of how helpful the peripheral makers are towards open-source driver maintainers, and that's a complete spectrum from "totally onboard" to "downright hostile".

Driver availability has nothing to do with the BIOS. If you can install and boot Linux on the hardware (which you can), then the BIOS is not "locked down".

So, if you want to continue to blame Microsoft for nVidia or Marvell's sins in respect to Linux, go ahead. But be aware that shouting at the cat is not a good way to stop the dog shitting on your carpet.

4
1

Windows Subsystem for Linux to debut in Windows 10 Fall Creators Update

Kristian Walsh
Silver badge

Re: Reminds me of

for example I can't do a build of the Android system I build on my Linux machine since the build system has some 32bit dependencies

Yes, I've been bitten by that before... Not Android, but a product we had to build for both 64 and 32-bit systems. Our build system was x64, and while it's theoretically possible to build both target architectures on either platform, it's also theoretically possible to wrap a rope around the moon. We ended up creating a chroot filled with an i386-native toolchain for the 32-bit builds rather than try to fix all the cross-compile bugs in all the packages we used: lots of packages are still in the "building these sources, for this machine only" mindset, and like everyone else, we only had so much time to ship our product.

But I don't see WSL as a replacement for a real Linux system - instead, it's a way of getting some very useful Linux-based server tools running on a Windows machine, to help me develop the front end. It's the same thing as the Mac's BSD underpinnings: the existence of that not-quite-Linux shell in macOS is the reason why so many developers now use Macs (even after excluding the iOS people who have no choice).

The big selling point of Macs for me and other developers was always that Macs had all the "Unix-y tools" built in, while still having all the "working in an office" stuff that one also needed, plus working hardware drivers; WSL gives Windows the same thing, only much better, and it's honestly something that Microsoft should have done back in 2005 or so...

0
1
Kristian Walsh
Silver badge

Re: Reminds me of

There is one enormous difference: when you run 'apt-get' in MobaXterm it looks to the set of packages that have been specially written to work with MobaXterm; when you run 'apt-get' in WSL, it looks to Canonical's Ubuntu repos (but you can add any other by editing your sources file).

Basically, if a tool runs in a Linux terminal, that same binary will probably run on WSL(*). To me, that's worth losing X11. (Besides, X11 is reported as working in the current WSL, it's just that Microsoft is not prioritising the fixing of any bugs people report with it)

* some sysfs/proc stuff notwithstanding, but the "unsupported" list keeps getting smaller.

6
4

The life and times of Surface, Microsoft's odds-defying fondleslab

Kristian Walsh
Silver badge

Re: Of course everyone hates the Metro interface!

A bit late to reply, but I've gone the other way to you (about two decades of Mac use, but got a bit tired of Apple's hardware removing connectors I used, and their OS releases assuming that my whole computing life revolved around an iPhone, one of which I have never owned).

With Windows 10, I not only have touch, but also a better bash shell than macOS had... (unless you specifically want the BSD versions of the core Unix commands, that is; and while I wholeheartedly agree that BSD is a better Unix than Linux in many, many ways, people won't pay me to build software that runs on it, so Linux is what I need...)

Having apt-get alone has convinced me to stay with Windows, rather than go back to Macs. And having now used one recently, I'm pretty much decided on an i5 Surface Laptop as my next hardware purchase (as opposed to a 13" MacBook).

1
0

Confessions of an ebook eater

Kristian Walsh
Silver badge

What's a "Pointer-to-Book"?

First, I know nobody who uses the word "book" to refer to an electronic document, so the "pbook" nonsense is totally redundant. There are, at this point in history, "books" (paper) and "e-books" (electronically stored and displayed). When paper books do become the tiny minority (and it's not going to be for decades), we'll coin a term for them; and my money is on that term being the straightforward "paper books", just as "film camera" versus "camera" has replaced "camera" versus "digital camera" since digital photography became the universal technology.

Second, when coining English words, it's generally a good idea to use the consonant patterns of the language. "vlog" was bad enough (no other English word begins with "vl"), but "pbook" can only be sounded by a native English speaker as "pook" ...or "book", and I don't know a language where "pb" would be sounded separately. (Say "drop bear" and listen carefully to the sounds you're making. The consonant at the end of "drop" will be shared with the start of "bear". Now concentrate on sounding both separately. Awkward, isn't it?)

So, someone decided to coin a new word that's spelled only slightly differently to an existing word that's already universally understood to describe the exact same thing (and with that meaning is one of the thousand most common words in English), but the new word also has the added stupidity that it can only be pronounced in a way that makes it sound exactly the same as the existing word it's supposed to replace. Bravo!

(and the biggest problem with e-books is that you can't hold multiple pages open with your fingers as you cross-reference an index, the place you were reading, and the description of the term that the place you're reading has just referenced)

3
2
Kristian Walsh
Silver badge

Re: Great article

16:9 panels were chosen for laptops because they were cheaper: the same 13" part could be used in a portable TV.

There's no evidence that they're better for reading text, whether side-by-side or one-up. The problem is that they lack height, and text is invariably written as tall pages of narrow columns. Look at how many websites (Reg included) pad out their layout left and right with the content in a narrow vertical column. There's a well-researched reason for this: shorter lines reduce the amount of horizontal tracking your eye needs to do between the end of one line and the start of the next (it's why newspapers set text in narrow columns rather than long lines), so you can read faster.

Wide displays don't offer any benefit for reading - even if you have two documents side-by-side, you will get to see more of them on a 3:2 or (better) a 4:3 display than you do with any of the widescreen displays. And that's before we look at how every GUI design peels precious vertical space away with toolbars/docks/menubars and window headers...

If there were laptops with 4:3 displays, I'd rave about them, but nobody makes them anymore. Surfaces seem to be the only range that has tried to move away from 16:9 and back towards the more work-friendly 4:3 ratio -- if there are others, I'd be delighted to hear about them.
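The height argument is easy to check with a bit of trigonometry: at a fixed diagonal, the squarer the panel, the taller it is. A quick sketch, using an illustrative 13.5-inch diagonal (the figures and the helper name are mine, not from the post):

```python
import math

def panel_dimensions(diagonal_in, ratio_w, ratio_h):
    """Width and height (inches) of a panel with the given diagonal and aspect ratio."""
    r = ratio_w / ratio_h
    height = diagonal_in / math.sqrt(r * r + 1)
    return r * height, height

# Same 13.5" diagonal, three common aspect ratios:
for w, h in ((16, 9), (3, 2), (4, 3)):
    width, height = panel_dimensions(13.5, w, h)
    print(f'{w}:{h} -> {width:.1f}" x {height:.1f}"')
```

At that diagonal the 4:3 panel comes out roughly an inch and a half taller than the 16:9 one, which is where the extra lines of text come from.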

7
0
Kristian Walsh
Silver badge

Re: Great article

...Sadly, that neat rule holds only for the DIN A, B and C series of paper sizes. If you have to deal with the American paper sizes, you've just got to memorize the ratios, because they're all effectively random numbers: Letter is 1.29:1; Legal is just as wide, but taller, so 1.65:1. "Tabloid" is actually just 2 x Letter side by side, but because Letter's sides aren't in the √2 ratio that makes halving work for the A-series, the ratio of a double-sheet of it is a not-very-obvious 1.54:1... (American business resists metrication because it would cause "confusion"...)

The other neat A-series trick is how easy it is to calculate document weights: A0 has a surface area of exactly one square metre, so sixteen sheets of A4 paper (each 2^-4, or 1/16th, of that area) have a combined weight that's exactly the gsm rating of the paper stock (80g for standard 80gsm copier paper). Try working the equivalent out starting from "pounds per uncut ream"...
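That arithmetic is simple enough to write down: each step down the A-series halves the nominal area, starting from A0's one square metre. A minimal sketch (the helper name is hypothetical):

```python
def a_sheet_weight_g(gsm, n):
    """Weight in grams of one nominal sheet of A<n> paper.

    A0 is defined as exactly 1 m^2, and each step down the
    series halves the area, so A<n> is 2^-n square metres.
    """
    return gsm * 2.0 ** -n

# Sixteen sheets of standard 80gsm A4 copier paper weigh exactly 80 g:
print(16 * a_sheet_weight_g(80, 4))
```

(In practice the cut sizes are rounded to whole millimetres, so real sheets are a hair off the nominal figure, but the back-of-envelope version above is the one everybody uses.)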

Regarding screens, Microsoft's Surface line has the best screen aspect for reading A-series documents: its 3:2 display ratio (i.e., 1.5:1) is as near as dagnabbit the A-series 1.414:1, plus a tiny bit for a menu bar.

Hopefully, other manufacturers will follow suit: 16:9 is only good for movies.

20
0

Google goes home to Cali to overturn Canada's worldwide search result ban

Kristian Walsh
Silver badge

Re: Internet governance

One thing: "free speech" is the right to call the head of your particular government a sack of shit without them sending the police to your house in the middle of the night to spirit you away to a salt mine. It's a right of free expression, without threat of government reprisal. It is not a carte blanche to say whatever you want to whomever you want without having to deal with the consequences. If you call the guy in the bar a sack of shit, and he rightly punches you in the face for the insult, "free speech" is not a valid defence for your provocation.

A second example, which seems to be a problematical topic for the EFF and Google: if you "find" a movie/book/router design that someone else made online and you then make it available to other people as if you had been given the right to do so by its owner, that's not "free speech" either: it's fraud, just as if I found a roll of ticket paper and used it to try to sell fake subway tickets.

Ultimately, all this "internet freedom" talk is just so much diversionary hogwash. Bluntly, Google doesn't want to police its index because it's an expensive thing to do, and as a monopoly, Google gains no value from efforts like this to make its product "better". Pretty much everyone already has to advertise via Google now, so any investment that doesn't directly shore up that monopoly is wasted spending. Every monopoly behaves like this eventually, and that, in a nutshell, is why monopolies are so bad for customers...

5
1

HMS Frigatey Mcfrigateface given her official name

Kristian Walsh
Silver badge

Re: great names like Revenge, Glorious, Implacable etc.

"Puncher" - not a good name for a rigid inflatable, though.

8
0

Good news: Samsung's Tizen no longer worst code ever. Bad news: It's still pretty awful

Kristian Walsh
Silver badge

Re: NaN

Just to state it clearly: the NaN test exemption doesn't apply to the code fragment shown in the article, as the expression shown in that code, "(x>x)", is never true for any type or value of x, NaN or otherwise.*

* unless you overload operator>() to do something unrelated to testing for greater magnitude... In which case, I will find you and I will do bad things to you.
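The IEEE-754 semantics at issue are easy to demonstrate; a quick check in Python for brevity (the same holds for C++ floats, barring the operator-overload crime mentioned above):

```python
import math

nan = float("nan")

# x > x is false for every value, NaN included, so it tests nothing:
print(nan > nan)        # False
print(1.0 > 1.0)        # False

# ...whereas x != x is the self-comparison that actually singles out NaN:
print(nan != nan)       # True
print(1.0 != 1.0)       # False
print(math.isnan(nan))  # True
```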

3
0


Biting the hand that feeds IT © 1998–2017