* Posts by Kristian Walsh

1274 posts • joined 10 Apr 2007


Shock: Brit capital strips Uber of its taxi licence

Kristian Walsh
Silver badge

Re: 40,000 drivers out of work

EGR blanking and rechipping for economy and high NOx is perfectly legal.

No it's not. "You probably won't be caught" doesn't make something legal. If it was perfectly legal, the cars would ship without EGR from the factory.

The car has to be certified as legal to use in the United Kingdom (or any other country with laws). In the UK this is the Vehicle Type Approval, and is separate from your MOT, which is a roadworthiness and safety test. Part of the Type Approval lists the exhaust emissions standard that the vehicle complies with. The current standard for passenger cars is "Euro 6", and it sets strict limits on NOx emissions among other things. (Which standard you need to comply with depends on when the model was introduced, although if you're Mercedes or Volkswagen, it seems you can bribe your way into complying with the old type approval rules...)

If an owner modifies the emissions controls such that their car is no longer compliant with the emissions standard on its Type Approval documents, it will be illegal to operate that vehicle on public roads. End of story.

Whether you'll be caught and punished is a different matter, but of all the things that come out of a car's exhaust pipe, nitrogen oxides are the most dangerous to long-term human health. They are the major cause of urban smog, which causes respiratory illness in children and the elderly. You're free to not believe in global warming, safe in the knowledge that it'll take decades before you're proven wrong, but NOx pollution is much more immediate and direct in its consequences.

30
2
Kristian Walsh
Silver badge

Why Uber was stripped of its licence

Uber lost its licence for not adequately following up on reports of passenger assault and rape, and not providing evidence of adequately screening drivers for prior violent offences.

It doesn't matter what they charged. It doesn't matter that "Black cabs are too expensive". Uber operated a company where a driver could attack a passenger and get away with it, and so it lost its licence to operate in London.

It doesn't matter that you, personally, never had a rapist driving you any time you booked an Uber. It mattered that in the cases when people did, Uber didn't follow up on the police reports, and didn't take action against the drivers.

If someone else started an Uber competitor tomorrow that did everything Uber did, but obeyed the actual laws of the land, properly screened its drivers and co-operated with police investigations of assault on passengers, they would not have their licence revoked by TfL.

The usual suspects are, of course, free to assume that "Uber ignored cases of rape and battery" isn't really a reason to have a licence revoked, and that this is all a smokescreen to stop the "exceptional" people achieving their birthright as rulers of the world.

146
6

Uber sued by Uber for tarnishing the good name of Uber

Kristian Walsh
Silver badge

Re: Trademarks...

Trademarks are granted within domains. Uber-the-dickhead applied for a mark in the domain of "transportation services" (not sure that's the exact name, but they are all quite vague), whereas Uber-the-Floridian applied and got one in the domain of IT.

This happens a lot: For instance, there are two entirely independent companies using the trademark "Kenwood": a UK one, making kitchen equipment, and a Japanese one, making electronics and radio communications equipment. An American butter company also trades freely as "Finlandia", two aisles away from the better-known vodka brand.

Canon EOS cameras are in a different domain to Volkswagen's poor-selling EOS Golf-with-a-metal-roof, so both names were granted. "Dove" is granted to Mars and Unilever for a chocolate bar, and a range of soap products, respectively - to further avoid confusion, Mars uses the "Galaxy" name for the chocolate in countries where the soap is popular, but "Galaxy" is also a trademark of Ford, although not for chocolate... meanwhile "Puma", one of Ford's other names, is also a globally-known brand of athletics gear, and so on. Basically, without domain limitations, there'd be very few good trademarks left by now.

The problems happen when a company starts in one domain, then grows into another, or when a company gains such a bad rep that it defames the name of everyone else using that name.

Uber is that second type, but a famous example of growing into infringement is Apple: the trademark "Apple" was given to the computer company, despite already being held by a record label (The Beatles' "Apple Corps"), because in 1976 (and now), these were different domains of business. In the 1980s, when Apple added audio recording and playback to their products, the record company secured a legal agreement preventing the computer company operating in the music business, which didn't seem like a problem for Apple until they started selling music on iTunes. In the end, lots of money changed hands, and everyone is sort-of friends now, as evidenced by the availability of all the Beatles' albums on iTunes, but Apple-of-Jobs did need to get permission from Apple-of-Beatles for the right to use the name "Apple" in connection with selling music.

[Trivia: the Mac alert sound "Sosumi" is a reaction to the original, 1980s-era spat between these two. The somewhat Beatles-like organ chord was originally called "Let it Beep", and then changed to "Sosumi" in reaction to complaints from Apple's Legal department. The developer passed it off as being a Japanese word, but it's actually three English ones.]

0
0

Google sued by Gab over Play Store booting

Kristian Walsh
Silver badge

By calling themselves "conservatives", they're eroding the traditional conservatives' rational ideas: that you should earn your living through work; that you should understand why things are as they are, rather than blindly change them; that you should be personally responsible for your actions; that you should be charitable; and that you should actively help to look after the less fortunate in your own community.

[contrast the socialist/liberal position that the state should support those who are not working without question; that many of society's structures are outdated and need to change; that poverty, upbringing and societal deprivation can explain away some crimes; that the state should provide for the less well off, with taxation replacing charity]

The first mistake people make about the "alt-right" is thinking that they're conservatives at all: they're actually radical Libertarians. They don't want to conserve anything: like the radical Left, they want to destroy what's here now (and just like the radical Left, they can't come up with anything that wouldn't be at least a thousand times worse than what we've got).

Forget work-ethic and personal responsibility: these dickheads want the whole pie and they want it now; they want to destroy any kind of authority so they can do what they feel like without thought for consequence; and when they screw up, they'll blame "the media/the left/the deep-state/bias/positive discrimination" - basically anyone except their own dumb self. They're the spoilt brats of the 1980s generation that idolised Ayn Rand's philosophy of "I've got mine, so fuck you", but without the benefit their grasping parents had of growing up in modest circumstances, during a period of history where "right on" socially liberal causes were strongly in fashion.

Conservatism is almost the opposite of the "alt-right" in many respects, but in the US, the name "conservative" has been pretty much co-opted by the radical Libertarians, to the point where it's hard to separate them anymore.

(For what it's worth, my own personal politics are left-liberal: socially strongly progressive, but fiscally mildly conservative)

5
0

Unloved Microsoft Edge is much improved – but will anyone use it?

Kristian Walsh
Silver badge

Re: The interface is terrible

Each to their own. I like the fact that there's so little "interface" around the content I'm trying to read. I actually wasn't aware of the F11/Full-screen trick, but I'll be using it more.

As for Chrome, I've pretty much resolved to never install it again now. Its habit of grabbing all my CPU cycles was bad enough, but a couple of days ago, I logged in to the Gmail website on a new (Windows) system, and of course I got the standard "we've seen a new sign-in, was it you?" mail afterwards from Google, but now they've added a bit at the top saying "you were using Edge; why not use Chrome instead? Get Chrome here." No thanks; I don't react well to coercion.

Google has become the Old Microsoft: whatever you like or don't like about Edge, it is a standards-compliant browser, as is Safari, as is Firefox... so why do so many Google services (Meet was this week's example...) tell me that I need to install Google Chrome to use them?

Isn't that behaviour exactly why Microsoft was sued by the U.S. Government ... and lost?

45
4

Five ways Apple can fix the iPhone, but won't

Kristian Walsh
Silver badge

@John Robson- Re: Sound

" one you complained that at the nyquist frequency you can lose signal - which is never in question

For another with a band limited signal there is no loss of phase information from sampling."

But both of those "complaints" are real things.

Okay, the Nyquist frequency is the maximum frequency component that can be reproduced by a sampling system. Obviously this will be 0.5fs, because two samples are required to capture the positive and negative cycles of a sinewave. Anything above that frequency can't be adequately captured. That's Nyquist's Theorem (and Shannon's).

But, and this is the bit I think you've missed, this assumes that the sampling clock and signal component at 0.5fs are phase-coherent. Nyquist describes the theoretical maximum information capture, which requires an assumption about phase.

Consider a signal with only a single sinewave component at exactly 0.5fs of amplitude ±1.0, sampled at fs. If the sampling clock and the signal are in the appropriate phase, the samples will be obtained at the peak and trough of that waveform, resulting in a train of +1.0, -1.0, +1.0, -1.0. Perfectly captured, perfectly reproducible. That's the situation Nyquist described.

But: shift the phase of that signal by 90 degrees. Now the sampling occurs at the zero crossing points, and the output is a train of 0, 0, 0, 0, 0, ... How is that distinguishable from silence? But there was a component at the Nyquist frequency in the input. The thing is that Nyquist's theorem assumes that the phases of the components are compatible with the sampling clock.

90 degrees is the worst case, but at other phase offsets, you lose amplitude accuracy. If you were to shift that input signal to be 45 degrees out of phase with the sampling signal, the signal would be present, but as +0.7071, -0.7071, +0.7071... right frequency, wrong amplitude.

As you shift the phase of an input signal that's close to 0.5fs, the recorded amplitude will appear to change - this is loss of information. (The amount of error depends on how close your component is to the Nyquist frequency.) That is not a controversial or "wrong" position, it's a fundamental property of sampling, and it's the main reason why signals with a 20kHz bandwidth are sampled at 96k and 192k.
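
It's easy to check this numerically. A minimal NumPy sketch of the three cases just described (the sampling rate and sample count are arbitrary):

```python
import numpy as np

fs = 8                        # sampling rate (arbitrary units)
f = fs / 2                    # component exactly at the Nyquist frequency
n = np.arange(8)              # eight sample instants

for phase_deg in (0, 45, 90):
    x = np.cos(2 * np.pi * f * n / fs + np.deg2rad(phase_deg))
    print(phase_deg, np.round(x, 4))

# 0  -> [ 1. -1.  1. -1. ...]     full amplitude, the textbook case
# 45 -> [ 0.7071 -0.7071 ... ]    right frequency, wrong amplitude
# 90 -> [ 0.  0.  0.  0. ... ]    indistinguishable from silence
```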

I do agree with your other points: final mastering has done a lot to ruin the reputation of CDDA (although lousy DACs that weren't linear to 16-bits did their damage before then), and yes, the differences are marginal at the end. I don't think that the higher rates are very useful in themselves, but rather in the way they give adaptive reproduction equipment more "real" information to work with, so that when they've finished mangling and munging the samples, what's left is still as good as 44.1/16.

1
0
Kristian Walsh
Silver badge

Re: Sound

and demonstrates confusion over things like nyquist frequency, and the accuracy of phase information in a sampled signal...

I can't see how you came to that conclusion - the problem of phase affecting the recorded amplitude is well known and pretty easy to demonstrate, and it is significant when your goal is fidelity of reproduction, rather than simply producing an intelligible signal. Phase differences in high frequencies between the Left and Right signals are the whole reason stereo recording works at all, so it's a very important factor.

But really the point I was making was that there's nothing special about 44.1k / 16 bit, and that without the particular constraints that existed at the time, the industry would have gone with higher bitrates. Particularly, that 44.1k sampling rate was a result of the need to create masters affordably, rather than any solid engineering analysis of the problem. It's telling that every subsequent format has used a base rate of 48kHz or a multiple thereof.

If 44.1k at 16 bits had been "perfection", there wouldn't have been such an immediate jump in bitrates so soon after its introduction. For comparison, it took nearly 30 years for 24-bit RGB to be challenged as a display system; consumer DAT recorders were already at 48kHz less than ten years after the introduction of CD.

I take your point about adjustment to high sound levels, but it's not just the absolute dynamic range, it's the non-linearity of that range, and everyone's hearing is different. It's generally accepted that 16-bit PCM divided over 96 dB (okay, 110 with dithering) isn't quite good enough to deal with the peak sensitivity of the ear. 24 bits is definitely a touch of overkill, but it comes out as a nice multiple of bytes, and gives more headroom for mixing and signal processing that is becoming much more common in reproduction equipment.

Dithering the LSB is simply overlaying a 15-bit PCM signal with a PWM signal - you get the downsides of PWM in exchange for a lower noise floor over a part of your frequency range. It isn't adding any information, just hiding the errors in a different place.
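
To illustrate the mechanics (a toy NumPy sketch; the tone level and rates are arbitrary): an undithered quantiser truncates a sub-LSB tone to nothing, while TPDF dither lets it survive as modulation of the LSB:

```python
import numpy as np

rng = np.random.default_rng(0)
lsb = 1.0 / 32768                                       # one 16-bit LSB
n = np.arange(48000)
x = 0.4 * lsb * np.sin(2 * np.pi * 1000 * n / 48000)    # a tone below one LSB

def quantise(sig, dither=False):
    if dither:
        # TPDF dither: two uniform sources summed, peak +/- 1 LSB
        sig = sig + (rng.uniform(-0.5, 0.5, sig.size) +
                     rng.uniform(-0.5, 0.5, sig.size)) * lsb
    return np.round(sig / lsb) * lsb

print(np.abs(quantise(x)).max())         # 0.0: undithered, the tone truncates to silence
print(np.abs(quantise(x, True)).max())   # 1 LSB: the tone survives, toggling the LSB amid noise
```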

I didn't say a correctly upsampled signal would lose information; I said that common methods of upsampling a signal cause information loss, especially from 44.1 to the 48/96/192 rates.

Now the non-technical advantages:

One of the plus points of high bitrate audio is that it has resulted in better quality DACs. Just as CD's higher dynamic range caused an improvement in the quality of amplifiers... that were then used to play vinyl; so the requirement for affordable parts that can handle "24-bit at 192kHz" results in much better reproduction of the 16-bit at 44.1kHz sources we mostly have. Paradoxically, it's only the spread of high-bitrate audio that allowed people to see how marginal its advantage is.

(another non-technical argument is that because 44.1/16 is so wedded to those metal discs that get played in anything, such recordings have recently been mastered to utter mush just so that they will sound "loud" on cheap equipment; the other formats tend to escape this last step in the process)

8
0
Kristian Walsh
Silver badge

Re: Sound

"16 bit, 44.1kHz isn't an arbitrary playback standard. It's chosen to match the capabilities of the human ear - the complete capabilities of the perfect human ear."

Nope. It was chosen to match the vertical-blanking insertion period used by 60Hz U-Matic videotape equipment. In the late 1970s, it was the only affordable recording medium with the bandwidth to hold a CD master, so it dictated the sampling rate. ( https://cardinalpeak.com/blog/why-do-cds-use-a-sampling-rate-of-44-1-khz/ )
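
The derivation, roughly, as that article lays it out: the U-Matic-based PCM adaptors of the day stored three 16-bit samples per usable video line, and 44.1kHz is the rate that comes out the same on both NTSC and PAL equipment:

```python
print(245 * 60 * 3)   # NTSC: 245 usable lines/field x 60 fields/s x 3 samples/line = 44100
print(294 * 50 * 3)   # PAL:  294 usable lines/field x 50 fields/s x 3 samples/line = 44100
```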

So, that gives a maximum reproducible frequency of 22.05 kHz. While it's true that few humans can sense audio signals over 20kHz, there are many steps in the chain of reproduction that make 44.1kHz not quite good enough to reproduce the full audio spectrum, especially if you wish to provide a stereo signal.

First off, before you can get any kind of digital signal, you need to encode it. That means sampling. However, before you sample a signal, you need to remove any signal components whose frequency is too high for you to sample. If you don't do this, you get aliasing, and a worthless digital input (https://en.wikipedia.org/wiki/Aliasing). Thing is, the analogue filters you need to do this removal of un-sampleable signals do not have a perfect on/off response - in effect, if you want a filter that will pass frequencies of, say, 16 kHz, you may also have to allow frequencies as high as 25 kHz through too, because they're still within the tail-end of the filter's "pass band". You can make that cutoff sharper, but it can create "ripples" in your pass-band, and/or allow higher frequencies through again (analogue filter design is a special kind of hell...). But, if you were to raise your sampling rate to 48kHz, then you've got at least 4kHz of headroom above the highest frequency you need to preserve.
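
To make the aliasing point concrete, a small NumPy check (the frequencies are chosen for illustration): a 25kHz tone sampled at 44.1kHz produces exactly the same samples as a 19.1kHz tone, so once the signal is through the ADC the damage is undetectable and irreversible:

```python
import numpy as np

fs = 44100
n = np.arange(32)
f_in = 25000                   # above the 22050 Hz Nyquist limit
f_alias = fs - f_in            # 19100 Hz: where the tone "folds" down to
a = np.cos(2 * np.pi * f_in * n / fs)
b = np.cos(2 * np.pi * f_alias * n / fs)
print(np.allclose(a, b))       # True: the two sample streams are identical
```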

Down-converting a multiple of 48kHz to 44.1 kHz is possible, but if it's not done correctly (and it often isn't), it introduces similar artefacts to the aliasing problems during sampling.

The second reason for higher rates is for better preservation of signal phase. The human auditory system uses phase differences between higher-frequency signals to determine spatial positioning of sound source, but phase and amplitude interfere with each other in digital sampling systems as you approach the maximum permitted signal frequency. The extreme case is that a signal with a frequency of half your sampling frequency will not register at all if it is 90 degrees out of phase with the sampling signal (the sampling points would fall on the zero-crossings of the input, so you get 0,0,0,0,0... as your output). With mono, phase isn't usually an issue, which is why most sampling tutorials gloss over it; with stereo, phase accuracy is very important.

The third reason is that most modern replay equipment processes its signal before converting it back to analogue. Equalisation, driver response correction (as used in "direct digital" speakers and headphones), room parameters, delay, noise cancellation and dynamic compression all happen on the digital signal, but all take their toll on the output. If you start with more information, even if that information is not audible, the accumulated errors from DSP will still be in the inaudible part of your signal (you don't get the same benefit by simply "upsampling" to 192kHz/24-bit before processing, because upsampling itself cannot add information; in fact, it removes it).

Finally, your hearing isn't linear, but PCM audio is. 16 bits is about 96 dB of dynamic range, but your hearing has about 130 dB of dynamic range, albeit with a non-linear response. You could use non-linear PCM to extend the same 16 bits over a wider range of amplitudes, but that means non-linear DACs, which are much harder to make than linear ones (and it can increase audible distortion where high-amplitude, but very low frequency, tones are overlaid with higher frequency tones - as often occurs in music). It's easier to just use more bits, and capture the full dynamic range of human hearing.
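
For reference, the rule of thumb behind those figures is about 6dB of dynamic range per bit of linear PCM:

```python
import math

# Dynamic range of N-bit linear PCM: 20 * log10(2^N), i.e. ~6.02 dB per bit
for bits in (16, 24):
    print(bits, "bits:", round(20 * math.log10(2 ** bits), 1), "dB")
# 16 bits: 96.3 dB
# 24 bits: 144.5 dB
```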

With lossless coding, high bitrate audio doesn't take very much more space than 44.1/16 (mainly because the signal is still only 0-24 kHz), and as it makes improved reproduction much simpler to implement, there are plenty of reasons to prefer it to 44.1.

49
4

The future of Python: Concurrency devoured, Node.js next on menu

Kristian Walsh
Silver badge

Re: Trivial ?

You could have made that objection without coming across as a condescending git, you know.

I was discussing the type-enforcement features of the language itself: Enumeration types and switch-case illustrate an advantage of time-of-compilation ("static") type knowledge, versus time-of-execution ("dynamic") type knowledge.

As I was talking about the Python language, it's entirely correct to say that there's no enforcement of data types, because the language itself has no concept of expected types for function arguments. And, while you are also entirely correct that the Python runtime enforces datatypes, that's too late for any feature, such as enumerations, that requires compile-time type knowledge.

0
0
Kristian Walsh
Silver badge

Re: Trivial ?

Switch/Case is a code smell, that can be completely done away with. Especially in languages such as Python.

If it's a code smell, it's a smell of good design. Switch/case is designed to enforce small sets of values, especially when coupled with enumeration types. "Languages such as Python" don't enforce types at all (by default; I know about Python 3's type hints), so attempting to enforce values is a little meaningless.

In languages that support it, a switch/case block is telling you something very important about the author's mental model of the code at the time they wrote it. It's saying "At this point in the program, I expect this variable to have one of this limited set of constant values"*

If you're using enumerations, switch/case additionally allows the compiler to do coverage checking for you (warning you when your switch block doesn't test for all possible cases).

If/elif/else cannot convey that information.

Saying something "can be completely done away with" is not a useful argument. Ultimately, all you need is 'if zero then goto' for any programming language, but filling your code with such constructs strips it of any hint of what the hell you were trying to achieve when you wrote it. There's a strong argument that the whole point of having high-level languages in the first place is to capture the intentions of the programmer, because hand-optimised machine code is pretty opaque to a maintainer.

* There are, sadly, exceptions: Swift's switch/case "value bindings" feature ignores the "limited and constant" nature of switch/case in an attempt to be "helpful", and in doing so reduces the structure down to a pretty-printed way of writing if/elseif/else. If you're using "clever" value bindings in Swift, you really should be using if-elseif-else, because all you're doing with value bindings is hiding one kind of test, if, (i.e., "evaluate expression and compare result") within the language structure, switch/case, normally used for a different kind of test.
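
Going back to the Python point: a tiny sketch of the gap I mean (the State enum and function are invented for illustration):

```python
from enum import Enum

class State(Enum):       # invented for illustration
    IDLE = 1
    RUNNING = 2
    STOPPED = 3

def describe(state):
    # An if/elif chain: nothing here tells a reader (or any tool) that the
    # set of values is closed, or that State.STOPPED is unhandled. A C-style
    # switch over an enum would get a coverage warning at compile time.
    if state is State.IDLE:
        return "idle"
    elif state is State.RUNNING:
        return "running"
    return "???"         # silently absorbs STOPPED, and any non-State value too

print(describe(State.STOPPED))   # "???" - the gap is only discovered at runtime
```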

3
0

Massive iPhone X leak trashes Apple's 10th anniversary circus

Kristian Walsh
Silver badge

Re: Facial recognition

Intel had a much superior face-ID technology to that of the Lumias, and Microsoft used it in some of its Surface products. It made use of an additional front-facing infra-red camera beside the existing visible-spectrum one. Having two cameras provided depth perception, while the IR one allowed the system to distinguish between a warm-blooded human face and a wax/plaster/plastic model of one.

I guess there wasn't space for yet another camera in a phone, but it does work much better than relying on a single visible-light camera.

2
0

Node.js forks again – this time it's a war of words over anti-sex-pest codes of conduct

Kristian Walsh
Silver badge

Agree. Neurologically atypical people can unintentionally cause offence by saying the wrong thing, but being told "I was offended by that" usually resolves such issues. And at least in written communications, it's easier for that message to be conveyed; many of the problems around Asperger's particularly are due to an inability to pick up on tonal or facial-expression cues in face-to-face or verbal communications - the things that convey the real meaning of negative responses like "ooo-kay..." and "yeah, thanks for that".

Knowing that a behaviour is offensive to somebody, and then choosing to continue with it in their presence, cannot be excused on the basis of being neurodivergent. "Choosing" is the key word; people with Tourette's cannot choose; those with Asperger's most definitely can.

I've worked in software a long time. Long enough to meet many people who'd be described these days as "neurologically atypical". Of those, the percentage of assholes was pretty much in line with the percentage in the "neurological normal" population... If there is really a particular cluster or clusters of neurons that makes someone a dickhead, it's not those ones.

3
0
Kristian Walsh
Silver badge

Re: Anyone read the article?

The complaint is that he acted like a jerk with other contributors, had a history of responding overly aggressively to requests to stop doing so, and tried to dominate discussion through bad behaviour rather than demonstrating better alternatives. (Even by Node's poor standards, he's an outlier)

CoC documents are a bit dumb and cringeworthy, and they do tend to be framed in the language of Identity Politics, rather than starting from the simple rule of treating every other person with basic respect and manners. But that doesn't remove the fact that there are a lot of people out there who do need to be told what manners and civility are.

I will not accept "oppression of neurological minorities" as a counter-argument, as it's grossly offensive to the majority of people with Autism spectrum conditions who are not dickheads, because they have chosen to not be dickheads. (Frankly, I find the idea underlying this "defence" - that someone with Asperger's or similar cannot tell right from wrong - more insulting.)

6
1
Kristian Walsh
Silver badge
Mushroom

Re: Well, I don't know who's done what to whoever

I didn't need a demonstration of poor interpersonal communications skills from steering committee members to make that decision; this four-word description sufficed:

JavaScript on the server.

5
0

US Navy suffers third ship collision this year

Kristian Walsh
Silver badge

Re: What do they all do? @SkippyBang

From the grossly-simplified understanding I have of the laws of maritime navigation: where there's a danger of collision, isn't it always the vessel that can get out of the way fastest that must change course to avoid the collision?

In that case, wouldn't it be the destroyer that had to move? Tankers aren't known for turning quickly.

15
1

FYI: Web ad fraud looks really bad. Like, really, really bad. Bigly bad

Kristian Walsh
Silver badge

This has nothing to do with ad blockers. It's about ad networks "serving" adverts to automated bots, in order to collect the display fee. The advertiser pays for the ad, the broker (usually Google) gets its fee, the bot operator gets an income for "showing" it... but nobody ever sees it.

It's fraud, but you'll notice that a. the company uniquely placed to turn off the flow of ads to these bot networks doesn't do so, and b. that company gets its cut regardless.

32
1

Don't buy Microsoft Surface gear: 25% will break after 2 years, says Consumer Reports

Kristian Walsh
Silver badge

Re: Updraft102 Not really a "survey"

Apple are the nadir of design and reliability.

If those are his exact words, you could agree wholeheartedly with him, and then point out that a nadir is the lowest point.

On the topic, I had a Surface RT 2 (ARM) whose touchscreen failed after 15 months. Microsoft replaced it, free of charge and without quibble, with a Surface 3 (Intel) on the grounds that the RT2 wasn't made anymore. And they also included a keyboard with the replacement, on the grounds that the keyboard for the RT (which I never told them I owned) wouldn't quite cover the taller screen of the 3.

That said, Apple's customer returns policy is pretty good too - friends of mine have had no-question replacements for dud iPhones, although you do have to go to one of the company's Stores, which is pretty inconvenient if you're not in one of those cities. But that's why people think so highly of them: It's not the absolute reliability, it's how well the company deals with the problems that occur.

Consumer Reports is famous for extrapolating "findings" from wholly inadequate sample sizes. 300 is simply not a big enough sample size for a population of millions, but then to use data on one product to extrapolate to another, later model of a different form factor, is as nonsensical as saying that because some BMW motorbike owners had problems five years ago, you shouldn't buy a new 3-series Hybrid. (How does the reliability of a tablet accurately model the reliability of a laptop?)

2
0

New Amiga to go on sale in late 2017

Kristian Walsh
Silver badge

Re: Just remember... @Mike 16

Thanks for the "inside story" - my info was just picked up from extensive reading, but often there are things that those sources won't write down... :)

The Amiga being a console only makes more sense, in hindsight - Atari Inc already had a 68k system for cabinets before (Marble Madness, Paperboy, etc...), and it fits with the HAM mode being originally a composite video generator... arcade machines were all RGB displays, so there'd have been no need for composite output.

Lots of accounts from the time suggest that Tramiel really did miss out on Amiga, and the attempts to kill it were after he realised how much of a threat it would be. There's also a bit of Amiga Inc deliberately keeping a low profile to avoid being noticed by Atari's new owners. It was probably animosity toward Commodore, as the only likely buyer, rather than Amiga, that led to "Tramiel" Atari trying so hard to secure injunctions against Amiga, and I wonder if Commodore would have been so keen on Amiga had it not been seen as a poetic way to stab Tramiel.

I didn't intend to claim that the Amiga chipset was based on the 400/800, just that the concepts are very similar - nobody else so fully embraced the idea of displaylists, or the idea that a graphic display mode is a property of the current scanline, rather than the entire field. Those ideas started in the 2600 out of necessity, and the mindset carried over into 400/800, and later Amiga.

JT essentially bought the Atari logo to slap on the designated heir

Absolutely. The complete shift in product focus after ST is amazing in hindsight. Atari, the company that pretty much invented video-gaming, suddenly became a business-machine provider: the ST's value proposition was its high-resolution monochrome display, easy porting of DOS software, and cost-effective printers and hard-disks.

Meanwhile, with Amiga, Commodore Business Machines launched what would become the ultimate games machine of the late 1980s.

I wasn't sure of the exact timing of ST, but it was definitely a quick design, and most unusually for the time, it came to market on time and on price - I suppose on that basis, it must have been started before Tramiel took over.

When I studied microprocessor systems, all of the textbook and data-sheet reference designs for 68k were eerily reminiscent of the ST that I had learned to program 68k assembler on, so I suspect that there was a lot of wholesale lifting of reference designs there.

ST was a really good piece of engineering in terms of "most performance for least cost", even if that meant some dated component choices that didn't stand the test of time. Most unforgivably, ST didn't even have a rudimentary DAC, just that Yamaha square-wave generator that wasn't even up to the capabilities of the 400's POKEY chip. ST did have a rudimentary DMA controller, if I recall, which would have made a PCM audio system in the mould of the original Macintosh's (DMA to a DAC, with an interrupt when the buffer empties) relatively easy to implement.

2
1
Kristian Walsh
Silver badge

Re: I'd say..

Jack Tramiel had already left Commodore before Amiga was launched. Had he been a little easier to get along with, I suppose he may have guided the Amiga product launch at Commodore (Atari's death meant that Amiga Inc pretty much had to go to Commodore for help).

I've a lot of respect for Tramiel - he never bought into the myth of exceptionalism that the other, West Coast computer pioneers did. He was always clear to correct anyone who attributed his business success to anything other than hard work and luck.

Also, as someone who could never have afforded a Macintosh, the ST's pricing made it much more of a "computer for the rest of us" than Apple's Macintosh. It and Amiga opened up opportunities to learn "commercial" programming that had not been available before.

5
0
Kristian Walsh
Silver badge

Re: Just remember...

ST owner here. My first ever paid development work was a magazine cover-disk game for the ST. A meagre fee, but it's still "for reward"...

The ST is most definitely the newer system. It's a classic story of corporate politics. Before its magnificent implosion, Atari (the Warner subsidiary) had commissioned Jay Miner (developer of the video hardware in both the 2600 and the 400/800 computers) to develop a new chipset for arcade and home use based around a 68000 CPU.

In the mess of the takeover of Atari by Jack Tramiel (recently ousted from Commodore), that contract lapsed, and it seems there was also considerable "anti-Atari" feeling from the new management team too. In fairness, the old Atari had been spectacularly badly run as a business, so this could be justified. In any case, Amiga Inc had lost its customer, so went looking for a new buyer, and Commodore seemed a natural choice, as there were really only a few serious options (Atari, Apple, Commodore - but Apple had just put out their Macintosh, so that just left Commodore)

Anyway, having lost Amiga either through ignorance of its existence, or risk aversion, Atari (the new Tramiel-owned company) then needed a proper 16-bit computer, and fast, so the ST was built in about a year using mostly off-the-shelf components. The Blitter chip was the most complex custom silicon on the ST, but it missed the deadline and got dropped from the launched product; even so, the underlying graphics library (on the 68000's Line-A trap) in the ST's ROM was clearly designed with this chip in mind.

The ST was a better computer design: it followed through on the 68000's clean architecture to produce a system that was logically arranged and easy to program. The Amiga was far superior as a multimedia machine, but it had some quirks. AmigaOS was a bit too adventurous for a CPU without memory protection (or a way to restart a bus error). I did own an Amiga for a while, but the poor stability of its OS for "work" tasks brought me back to the ST. Amiga, hands down, had the best games.

If you're into the Amiga, it's worth looking into the Atari 2600 and the 400/800. There's a lot of what became Amiga in those two machines (e.g., the 400's ANTIC was the forerunner of the Amiga's Copper display-list processor)

Fun fact: Amiga's impressive 4096-colour Hold-and-Modify graphics mode was never designed as such: it was instead a relic of abandoned circuitry to directly produce a composite video output, as befitting the product's games console origins. In that mode, you'd hold the chroma signal(s) for two pixel clocks, and modify the luma every pixel clock to produce a full colour display with lower memory requirements, as television video standards all had a higher resolution for the luminance signal than the colour signal. When the video output requirement changed to RGB, the circuitry was repurposed to provide the HAM modes, on the basis that it was already in the chip and might be useful for something...

21
0

Google's macho memo man fired, say reports

Kristian Walsh
Silver badge

Re: No diversity of opinion from "progressive" conformity

I remember 20 odd years ago our (female) science teacher telling us interesting extra-curricular science facts

I hope, for the quality of your education, that what she actually said was "when observing a large sample, the female participants showed this trait slightly more often than the male participants did." Because these differences are statistical, they are small, and the distributions overlap considerably between individual males and females. Reducing any sex-linked personality trait to a binary flag would be an egregious mistake.

I'm male, and have below-average spatial calculation skills, but high linguistic proficiency and positional memory - all of which are supposed to be "female" traits. However, I also have very high logical reasoning scores, which is allegedly a "male" trait. A group of us (mixed sex) did these tests once, and there was nobody who fit the clear-cut "male brain" or "female brain" categories. But I'm pretty sure that had 10,000 people taken the test, there'd be a slight difference by sex. But only slight.
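
A toy simulation of that "slight difference, huge overlap" point (all the numbers are invented for illustration):

```python
import random
from statistics import mean

random.seed(1)
# Two populations with a small difference in means and identical spread
a = [random.gauss(100.0, 15.0) for _ in range(10_000)]
b = [random.gauss(101.5, 15.0) for _ in range(10_000)]

mean_a, mean_b = mean(a), mean(b)
print(round(mean_b - mean_a, 2))             # ~1.5: a real, but tiny, group difference
print(sum(x > mean_b for x in a) / len(a))   # ~0.46: nearly half of group A still exceeds group B's mean
```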

We're in danger of amplifying the noise floor here: ignoring a huge "same" to focus on the tiny "different" - it's a common fallacy I've seen in tech people when they move outside of technical decision making (it's not unique to technical people, but it's more common there). That thinking is absolutely the right approach when trying to track down a bug, but not when trying to hire a team. (Over-weighting these small differences between people explains both entrenched misogyny/racism and poor "diversity hires" - the only difference is in the hirer's preconceptions)

3
1

Microsoft Surface laptop: Is this your MacBook Air replacement?

Kristian Walsh
Silver badge

Re: Macbook Air replacement my arse

macos users also have access to a rather large library of open source tools with a wide range of options on how to obtain them from pre-compiled binaries for some to packagers like Homebrew or even bare vanilla compilation from source.

All true, but the implication that this isn't on Windows is out of date. These days Windows 10 supports pretty much the entire Ubuntu userland, which is a much more useful proposition. Once you enable the feature, you can install packages using apt-get, and they're the exact same binaries as Linux uses, so there's no delay for porting and no "missing" packages. Plus, it really is a Linux-compatible userland - for proof: about an hour ago, I used gcc on my Ubuntu-on-Windows 10 desktop to compile up a simple C tool, ran it locally, then scp'd that binary over to a native Linux host, and it ran perfectly there too. Do that on macOS, and I'll be impressed.

I was pretty much an exclusive Mac user for most of the last 20 years. Brew and Ports are nowhere near as easy or up-to-date as you describe, and there are also differences between the BSD core tools and the Linux ones that will bite you from time to time too (for instance, macOS comes with a different implementation of 'grep', with different defaults; same for 'netstat'; and 'route' shares only a name with its Linux equivalent). I used the Mac as a simple scratchpad to try fragments of shell-script, but it really wasn't compatible enough with Linux for real prototyping. Windows 10's Linux subsystem still isn't as good as an actual Linux target for everything, but for a surprising number of developer use-cases, it is.

0
1
Kristian Walsh
Silver badge

Re: "But don't you have to pay another £70 to do so?"

The 10S free upgrade is to Windows 10 Pro, not Home.

0
0
Kristian Walsh
Silver badge

Re: Betteridge's law of headlines...

In my case, the answer is "yes". I've a MacBook Air that needs replacement. I'll either retire the Mac, or replace it with one of these. Either way, I won't be getting a MacBook - the new generation Apple keyboards are horrible (but at least the obnoxious light-up branding is finally gone).

The keyboard on this is phenomenally good for a lightweight laptop - I would agree that only the ThinkPad keyboards are better, but I could never warm to the ThinkPads otherwise.

0
2
Kristian Walsh
Silver badge

Re: ... but will it

If they hadn't made such an effort to lock down the BIOS to prevent people installing Linux then I might even have considered buying a Surface at one point.

Er, "such an effort"? Really?

The Surfaces are all pretty standard UEFI systems. Installing Linux on one is trivial; getting everything to work once it boots is not. The problem is getting Linux drivers for the custom hardware (sound, touchscreen, pen, keyboard, webcam), but that's not Microsoft's "efforts" - it's a matter of how helpful the peripheral makers are towards open-source driver maintainers, and that's a complete spectrum from "totally onboard" to "downright hostile".

Driver availability has nothing to do with the BIOS. If you can install and boot Linux on the hardware (which you can), then the BIOS is not "locked down".

So, if you want to continue to blame Microsoft for nVidia or Marvell's sins in respect to Linux, go ahead. But be aware that shouting at the cat is not a good way to stop the dog shitting on your carpet.

4
1

Windows Subsystem for Linux to debut in Windows 10 Fall Creators Update

Kristian Walsh
Silver badge

Re: Reminds me of

for example I can't do a build of the Android system I build on my Linux machine since the build system has some 32bit dependencies

Yes, I've been bitten by that before. Not Android, but a product we had to build for both 64 and 32-bit systems. Our build system was x64, and while it's theoretically possible to build both target architectures on either platform, it's also theoretically possible to wrap a rope around the moon. We ended up creating a chroot filled with an i386-native toolchain for the 32-bit builds rather than try to fix all the cross-compile bugs in all the packages we used: lots of packages are still in the "building these sources, for this machine only" mindset, and like everyone else, we only had so much time to ship our product.

But I don't see WSL as a replacement for a real Linux system - instead, it's a way of getting some very useful Linux-based server tools running on a Windows machine, to help me develop the front end. It's the same thing as the Mac's BSD underpinnings: the existence of that not-quite-Linux shell in macOS is the reason why so many developers now use Macs (even after excluding the iOS people who have no choice).

The big selling point of Macs for me and other developers was always that Macs had all the "Unix-y tools" built in, while still having all the "working in an office" stuff that one also needed, plus working hardware drivers; WSL gives Windows the same thing, only much better, and it's honestly something that Microsoft should have done back in 2005 or so...

0
1
Kristian Walsh
Silver badge

Re: Reminds me of

There is one enormous difference: when you run 'apt-get' in MobaXterm it looks to the set of packages that have been specially written to work with MobaXterm; when you run 'apt-get' in WSL, it looks to Canonical's Ubuntu repos (but you can add any other by editing your sources file).

Basically, if a tool runs in a Linux terminal, that same binary will probably run on WSL(*). To me, that's worth losing X11. (Besides, X11 is reported as working in the current WSL; it's just that Microsoft is not prioritising the fixing of any bugs people report with it)

* some sysfs/proc stuff notwithstanding, but the "unsupported" list keeps getting smaller.

6
4

The life and times of Surface, Microsoft's odds-defying fondleslab

Kristian Walsh
Silver badge

Re: Of course everyone hates the Metro interface!

A bit late to reply, but I've gone the other way to you (about two decades of Mac use, but got a bit tired of Apple's hardware removing connectors I used, and their OS releases assuming that my whole computing life revolved around an iPhone, one of which I have never owned).

With Windows 10, I not only have touch, but also a better bash shell than macOS ever had... (unless you specifically want the BSD versions of the core Unix commands, that is; and while I wholeheartedly agree that BSD is a better Unix than Linux in many, many ways, people won't pay me to build software that runs on it, so Linux is what I need...)

Having apt-get alone has convinced me to stay with Windows, rather than go back to Macs. And having now used one recently, I'm pretty much decided on an i5 Surface Laptop as my next hardware purchase (as opposed to a 13" MacBook).

1
0
Kristian Walsh
Silver badge

Re: It is more a copy of ultrabooks than iPad Pro is a copy of it

The idea that the iPad Pro is a copy of Surface is laughable, they serve two completely different markets!

They don't, really. And that argument has been used multiple times in the opposite direction to explain why tablets like iPad didn't matter to the PC market, and it was equally incorrect then. You can get Microsoft Office on iPads these days, and that covers over half of the things people use computers for in "industry" - for everything else, there's RDP/Citrix.

The iPad Pro is absolutely Apple's copy of the Surface, and you'd have to be living well within Apple's world not to realise this. It adds the same physical advantages that Surface had over using an iPad for prolonged productivity work, and it adds them exactly the same way: with a bigger screen (obvious), a clip-on keyboard (why not a clamshell? why not wireless?) and a pen (oh, sorry, it's a "pencil"...)

One can claim that iPad Pro is "a better iPad for doing work" (not your words, but it's the gist of Apple's marketing of it), but that slogan also describes Surface.

For what it's worth, I've a Macbook Air and a Surface 3. The Macbook has a nicer keyboard, but the Surface lets me use it as a notepad. My next laptop will have touch input.

6
1
Kristian Walsh
Silver badge

"Metro" is dead. It was the touch-centric UI toolkit for Windows 8.x. Desktop users (understandably) hated its large controls, and it never got momentum as a tablet API.

Windows 10 UWP, the replacement, is best described as a touch-friendly desktop application toolkit. Hover your mouse over a scrollable pane, and the scrollbar appears, but if you drag the panel with your finger, it does inertial-flick scrolling as people now expect. More subtly, the spacing of popup menu items is wider when you tap to reveal the menu, but narrower when you click with the mouse to do so.

The downside is that app developers have to provide access to features through both mouse actions and finger gestures, but it can be done.

However, the big deal for Surface has been the stylus/pen. Unlike earlier attempts at PC-stylus interaction, the stylus and your finger are recognised as two separate types of input device, so you can pan and zoom with your left hand while writing with the pen in your right.

Actually, if you've got a pen, do get the "Plumbago" app from the Store - I've found it's the best replacement for a paper notepad for sketching out ideas (basically: it works like a good old-fashioned paper pad that also has undo and cut-n-paste)

6
3

Confessions of an ebook eater

Kristian Walsh
Silver badge

What's a "Pointer-to-Book"?

First, I know nobody who uses the word "book" to refer only to an electronic document, so the "pbook" nonsense is totally redundant. There are, at this point in history, "books" (paper) and "e-books" (electronically stored and displayed). When paper books do become the tiny minority (and it's not going to be for decades), we'll coin a term for them; and my money is on that term being the straightforward "paper books", just as "film camera" versus "camera" has replaced "camera" versus "digital camera" since digital photography became the universal technology.

Second, when coining English words, it's generally a good idea to use the consonant patterns of the language. "vlog" was bad enough (no other English word begins with "vl"), but "pbook" can only be sounded by a native English speaker as "pook" ...or "book", and I don't know a language where "pb" would be sounded separately. (Say "drop bear" and listen carefully to the sounds you're making. The consonant at the end of "drop" will be shared with the start of "bear". Now concentrate on sounding both separately. Awkward, isn't it?)

So, someone decided to coin a new word that's spelled only slightly differently to an existing word that's already universally understood to describe the exact same thing (and with that meaning is one of the thousand most common words in English), but the new word also has the added stupidity that it can only be pronounced in a way that makes it sound exactly the same as the existing word it's supposed to replace. Bravo!

(and the biggest problem with e-books is that you can't hold multiple pages open with your fingers as you cross-reference an index, the place you were reading, and the description of the term that the place you're reading has just referenced)

3
2
Kristian Walsh
Silver badge

Re: Great article

16:9 panels were chosen for laptops because they were cheaper: the same 13" part could be used in a portable TV.

There's no evidence that they're better for reading text, whether side-by-side or one-up. The problem is that they lack height, and text is invariably written as tall pages of narrow columns. Look at how many websites (Reg included) pad out their layout left and right with the content in a narrow vertical column. There is a scientifically proven reason for this: it's to reduce the amount of horizontal tracking your eye needs to do between one line of text and another (similarly to how newspapers lay out text in narrow columns rather than long lines), so that you can read faster.

Wide displays don't offer any benefit for reading - even if you have two documents side-by-side, you will get to see more of them on a 3:2 or (better) a 4:3 display than you do with any of the widescreen displays. And that's before we look at how every GUI design peels precious vertical space away with toolbars/docks/menubars and window headers...

If there were laptops with 4:3 displays, I'd rave about them, but nobody makes them anymore. Surfaces seem to be the only range that has tried to move away from 16:9 and back towards the more work-friendly 4:3 ratio -- if there are others, I'd be delighted to hear about them.

7
0
Kristian Walsh
Silver badge

Re: Great article

...Sadly, that neat rule is true only for the DIN A, B and C series of paper sizes. If you have to deal with the American paper sizes, you've just got to memorize the ratios, because they're all effectively random numbers: Letter is 1.29:1; Legal is just as wide, but taller, so 1.65:1. "Tabloid" is actually just 2 x Letter side by side, but because Letter's sides aren't in the √2 ratio that makes halving and doubling preserve proportions, the ratio of a double-sheet of it is a not-very-obvious 1.54:1... (American business resists metrification because it would cause "confusion"...)

The other neat A-series trick is how easy it is to calculate document weights: A0 has a surface area of exactly one square metre, so sixteen sheets of A4 paper (2^-4, or 1/16th of the area each) have a total weight that's exactly the gsm rating of the paper stock. (80g for standard 80gsm copier paper.) Try working the equivalent out starting from "pounds per uncut ream"...
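
The sum in full, as a quick sketch (80gsm stock assumed, as above):

```python
a0_area = 1.0                  # A0 is defined as 1 square metre
a4_area = a0_area / 2 ** 4     # each size halves the area, so A4 = 0.0625 m^2
stock = 80                     # grams per square metre (80gsm copier paper)
print(16 * a4_area * stock)    # 80.0 g: sixteen A4 sheets weigh the gsm number in grams
```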

Regarding screens, Microsoft's Surface line has the best screen aspect for reading A-series documents: its 3:2 display ratio (i.e., 1.5:1), is as near as dagnabbit the A-series 1.414 : 1 plus a tiny bit for a menu bar.

Hopefully, other manufacturers will follow suit: 16:9 is only good for movies.

20
0

Google goes home to Cali to overturn Canada's worldwide search result ban

Kristian Walsh
Silver badge

Re: Internet governance

One thing: "free speech" is the right to call the head of your particular government a sack of shit without them sending the police to your house in the middle of the night to spirit you away to a salt mine. It's a right of free expression, without threat of government reprisal. It is not a carte blanche to say whatever you want to whomever you want without having to deal with the consequences. If you call the guy in the bar a sack of shit, and he rightly punches you in the face for the insult, "free speech" is not a valid defence for your provocation.

A second example, which seems to be a problematical topic for the EFF and Google: if you "find" a movie/book/router design that someone else made online and you then make it available to other people as if you had been the given the right to do so by its owner, that's not "free speech" either: it's fraud, just as if I found a roll of ticket paper and used it to try to sell fake subway tickets.

Ultimately, all this "internet freedom" talk is just so much diversionary hogwash. Bluntly, Google doesn't want to police its index because it's an expensive thing to do, and as a monopoly, Google gains no value from efforts like this to make its product "better". Pretty much everyone already has to advertise via Google now, so any investment that doesn't directly shore up that monopoly is wasted spending. Every monopoly behaves like this eventually, and that, in a nutshell, is why monopolies are so bad for customers...

5
1

HMS Frigatey Mcfrigateface given her official name

Kristian Walsh
Silver badge

Re: great names like Revenge, Glorious, Implacable etc.

"Puncher" - not a good name for a rigid inflatable, though.

8
0

Good news: Samsung's Tizen no longer worst code ever. Bad news: It's still pretty awful

Kristian Walsh
Silver badge

Re: NaN

Just to state it clearly, the NaN test exemption doesn't apply to the code fragment shown in the article, as the expression shown in that code, "(x>x)", is never true for any type or value of x, NaN or otherwise*

* unless you overload operator>() to do something unrelated to testing for greater magnitude... In which case, I will find you and I will do bad things to you.
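
A quick check in Python, whose floats follow the same IEEE-754 comparison rules as C's doubles:

```python
nan = float("nan")
print(nan > nan)    # False: every ordered comparison involving NaN is false
print(1.5 > 1.5)    # False: and (x > x) is false for ordinary values too
print(nan != nan)   # True: the usual NaN self-test is (x != x), not (x > x)
```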

3
0

Ubuntu Linux now on Windows Store (for Insiders)

Kristian Walsh
Silver badge

Nokia's Store and non-Store apps (Was Re: new fangled Windows Subsystem for Linux)

Nokia's app store was never exclusive. Symbian still allowed you to install anything you wanted, from anywhere you wanted. All you got was a warning dialog saying "This isn't from a verified publisher. Do you still want to install it?"

This made it very easy to send beta-test or distribute review copies of paid apps - just email your .sis file to the relevant people, and have them install it directly by tapping the file attachment in their email.

Back on topic, the reason MS has put this into Windows Store is the same reason that their other built-in apps are kept in the Windows Store: it puts all the non-OS parts under the same distribution and update infrastructure.

The product is called "Ubuntu on Windows" (although the technology that allows it is called the "Windows Subsystem for Linux"). Canonical are 100% involved in this, which is why the "Ubuntu" name can be used. The confusion is that people use "Linux" to describe the kernel as well as the kernel plus various GNU-based userspace tools that make a usable system, of which Ubuntu is one.

For "Ubuntu on Windows", WSL plus NT is the "kernel", Ubuntu's set of tools is the userspace.

And the reason that WINE implements the Win32 API rather than emulating the "Windows kernel" (a misnomer - the current kernel is called "NT"; "Windows 10" is the name of the current userspace) is that there wasn't just one "kernel" when WINE started: Win32 is the abstraction layer that gave Windows NT and Windows 3.1/9x the same APIs, so that's the correct layer to intercept.

3
0

Bonkers call to boycott Raspberry Pi Foundation over 'gay agenda'

Kristian Walsh
Silver badge

Re: The Bible

Well, from this viewpoint I cannot see that drawing as anything other than two women and a couple of kids. So, pass on my small congratulations for depicting an adult female human in a cartoon illustration without resorting to just drawing a boy with boobs and long hair*

(* Yes, I know it was good enough for Michelangelo, but that was a long time ago...)

5
0

Exposed pipes – check. Giant pillows – check. French startup mega-campus opens

Kristian Walsh
Silver badge

Is that the office, or the on-site creche?

That first picture looks like nothing so much as a pre-school.

8
0

Researchers solve screen glare nightmare with 'moth-eye' antireflective film

Kristian Walsh
Silver badge

Re: Not new

Looked amazing though - I saw one in the basement bit of Selfridges in London, which is lit up like a Christmas tree, and there was no visible reflection on the screen. Very expensive, though, and a little too fragile for the general market - if you didn't use the special cleaning cloth when wiping the screen, you could end up polishing the screen and reintroducing reflections.

4
0

Uber culture colonic cleanses CEO Kalanick

Kristian Walsh
Silver badge

Re: Common?

The words "Individuals in a Reporting Relationship" mean that the prohibition applies only to two people who are in a reporting relationship; i.e., where one reports to the other.

Individuals who are not in a reporting relationship are not covered by the prohibition.

Dating and the Org-Chart: doing it horizontally is fine; doing it vertically may be complicated.

1
0

Labour says it will vote against DUP's proposed TV Licence reforms

Kristian Walsh
Silver badge

Re: @Pen-y-gors

If I were a crooked government official, feathering the nests of my pals with some other country's tax money, I wouldn't be in favour of an organization like the BBC either.

Anyone who wonders why the DUP are so strongly against the BBC should look at BBC Northern Ireland's investigative reporting, which has exposed them as being on the take on more than one occasion (the same goes for their opposition, the equally morally dubious Sinn Féin). Most recently, look at the Renewable Heat Incentive scandal in NI, where the DUP gave its chums a nice little profit at a cost of £450 million in UK taxes.

10
4

Hotel guest goes broke after booking software gremlin makes her pay for strangers' rooms

Kristian Walsh
Silver badge

Re: @baspax ma1010 "Sounds like a lawsuit"

The retailer stores the CC information. They have to in order to charge the customer. So they have the full credit card number.

The retailer doesn't store this information.

That's done on behalf of the retailer by a card acquirer (the company who "does your credit card payments"). If the retailer wants to do additional charges against the same card later, they can ask the acquirer to return a reference (not calculated from the card details) to the customer's card. To make a charge, the retailer sends that reference, plus the desired amount, back to the acquirer. They can repeat this as often as they want. As the reference is just a random number, and can only be used to make purchases that benefit the one merchant, it doesn't fall under PCI-DSS rules.

No retailer would want to store card info in their computer system. Doing that opens them up to a £15k-a-year PCI-DSS compliance audit. By offloading the storage to a card acquirer, the merchant/retailer only has to fill out a fairly simple self-assessment questionnaire to verify that they're following "good practice" (i.e., not scribbling down card numbers, expiry dates and CVVs on post-it notes).
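
As a rough illustration of that flow - with a stubbed-out, entirely hypothetical acquirer API, since the real interfaces vary by provider - note that the merchant side only ever holds an opaque reference:

    #include <iostream>
    #include <random>
    #include <string>
    #include <unordered_map>

    // Hypothetical acquirer-side stub, purely to illustrate the flow:
    // the merchant keeps the returned reference, never the card number.
    class Acquirer {
        std::unordered_map<std::string, std::string> vault_;  // ref -> card
    public:
        // Tokenise at first payment. The reference is random, not derived
        // from the card details, so it can't be used to reconstruct them.
        std::string storeCardOnFile(const std::string& pan) {
            static std::mt19937_64 rng{std::random_device{}()};
            std::string ref = "ref-" + std::to_string(rng());
            vault_[ref] = pan;
            return ref;
        }
        // Later charges need only the reference plus an amount; the money
        // can only settle to the one merchant bound to that reference.
        bool charge(const std::string& ref, long amountPence) {
            return vault_.count(ref) != 0 && amountPence > 0;
        }
    };

    int main() {
        Acquirer acq;
        std::string ref = acq.storeCardOnFile("4111111111111111");  // test PAN
        std::cout << std::boolalpha << acq.charge(ref, 12000) << '\n';  // true
    }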

1
0
Kristian Walsh
Silver badge

thoughts... Re: ma1010 "Sounds like a lawsuit"

My guess is that they never accessed her card details at all, but instead repeatedly charged against a card-on-file token that they'd legitimately kept for her card. How they ended up doing that probably comes down to one of the few differences between debit cards and credit cards, and to their booking system not handling that difference properly. (I'd be surprised if the hotel's booking system itself ever handled the card details - that's normally handed off to a third-party service due to the high PCI-DSS compliance cost of doing it yourself.)

What they're most likely doing is asking for the customer's details to be retained on file with their card acquirer, for later use. The result of that operation is a random-ish payment token that can be given back to the acquirer to make charges against that card in future.

My guess is that she was, unfortunately, the first customer to present a debit (not credit) card to the hotel. Debit cards have a subset of the functions of a credit card, so the information returned by the acquirer will be a smaller set of fields than for a credit card, with some values set to NULL (or missing, which can be the same thing depending on how you process the response). And that's where I think the fun would have begun...

If I wanted to make this happen, here's how I'd do it (there's a code sketch of the same failure after the steps):

1. The response from the card acquirer has NULL for a field that "always" has a value when used with a credit card, but despite that, it still contains a usable "charging" token that can be used to raise charges against the card.

2. The unexpectedly-NULL field is used as the first term in a concatenation that generates a key to identify that card, but because concatenating with NULL yields NULL, the whole result ends up as NULL.

3. As there isn't already a card on file with NULL as its local "unique" ID, the victim's card token gets stored in the "cards on file" table under the "unique" ID of NULL, but with the correct token.

4. That card-on-file ID (NULL) gets stored against the first customer's booking record.

5. (later) The hotel booking system looks up the token stored against the first customer's booking and raises the charge.

But...

Another customer with a debit card arrives, and steps 1..3 repeat as before, but this time, because there's already a card on file with the "unique" ID of NULL, the first customer's charging token gets associated with the new customer's booking.

...Repeat until the first customer gets very, very mad...
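
Here's a minimal C++ sketch of that failure mode (all the names are hypothetical, and the real system was more likely SQL, where concatenating with NULL famously yields NULL):

    #include <iostream>
    #include <map>
    #include <optional>
    #include <string>

    // Hypothetical acquirer response: for a debit card, the field the
    // key derivation relies on comes back missing, but the charging
    // token is still perfectly usable.
    struct AcquirerResponse {
        std::optional<std::string> issuerRef;  // empty for debit cards
        std::string chargingToken;
    };

    // Buggy key derivation: a missing first term collapses every debit
    // card's "unique" key to the same sentinel value.
    std::string cardKey(const AcquirerResponse& r, const std::string& last4) {
        if (!r.issuerRef) return "";       // concatenation-to-NULL, in effect
        return *r.issuerRef + "/" + last4;
    }

    int main() {
        std::map<std::string, std::string> cardsOnFile;  // key -> token

        AcquirerResponse first{std::nullopt, "tok-victim"};
        AcquirerResponse second{std::nullopt, "tok-other"};

        // insert() won't overwrite an existing key, so the second debit
        // card's booking silently keeps pointing at the victim's token.
        cardsOnFile.insert({cardKey(first, "1234"), first.chargingToken});
        cardsOnFile.insert({cardKey(second, "5678"), second.chargingToken});

        std::cout << "Token charged for the second booking: "
                  << cardsOnFile[cardKey(second, "5678")] << '\n';  // tok-victim
    }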

Incidentally, PCI-DSS doesn't cover the handling of stored tokens like this, as they cannot be used to reconstitute a customer's credit card details, and they bind exactly one merchant to the card (you can give someone else the token you acquired, but if they use it, the money still goes into your merchant account, not theirs).

7
0

The open source community is nasty and that's just the docs

Kristian Walsh
Silver badge

Re: >Behind closed doors, in the dark underworld of proprietary software

Well, I can only speak from direct experience, but while I worked at Apple, the bug database (RADAR) was viewable by any employee with an account. People who knew I worked at Apple, and had filed bugs against MacOS components, sometimes asked me to "informally" see how -- or if -- the issues they reported were being followed up.

While doing so, I never saw an offensive or disparaging comment on any of these. Timewasters (both internal and external) simply got a neutral "can't reproduce, behaves as documented; recommend close" and that was that. At worst it was "yes, this is a problem; yes, we know; no, we're not going to fix it any time soon".

Maybe Apple was "different", but I really don't think so: I know people who worked at Microsoft and have had similar experiences. The difference is probably that when you're representing your employer in public, you think twice before acting the dick, because your employer might take offence at what you're saying in their name... and decide that you'd be better off with a different employer.

Perhaps FOSS doesn't have that high a penalty to pay for being a shit.

3
0

Apple gives world ... umm ... not much new actually

Kristian Walsh
Silver badge

Workstation with disposable screen...

Apple is calling this a "professional" system, but it's not. A professional system in the markets where Apple still has a presence will usually have multiple colour-calibrated displays attached to it, each of which probably costs more than the $5,000 Apple is asking for this all-in-one unit. Bundling the monitor is nuts: it just makes the customer pay more for something they didn't want and won't use.

Really, how hard is this for Apple? All they've got to do is look at their own website in the Wayback Machine and find the "Xserve" product from the early 2000s. Update the CPUs, and make one of those. It'd fit neatly into the rack with the dedicated disk array holding the enormous asset files the user is working on.

(I've seen the rubbish-bin Mac Pros in use, and they quickly turn into a rats' nest of cables and external storage devices)

This, like all of Apple's current range, is a product for hobbyist users. Nothing wrong with that, but to say that it's a solution for professional customers is naive at best, and insulting at worst.

14
4

Microsoft totters from time machine clutching Windows 10 Workstation

Kristian Walsh
Silver badge

Re: That's probably 10 years to late

Very little workstation-class software runs on macOS anymore, because Apple stopped making workstation-class hardware nearly a decade ago. (People forget that before the "dustbin" Mac Pro was left to wither, the previous Mac Pro tower was also left in semi-abandonment by Apple for a couple of years).

7
0

Windows is now built on Git, but Microsoft has found some bottlenecks

Kristian Walsh
Silver badge

300 GB isn't everything...

Some divisions of Microsoft are not part of this project. Based on a comment on an earlier posting about this migration, the Bing group does use git, but doesn't keep its code in this super-repo. Apparently they use sub-repos in their setup, and - again, apparently - it has been a major pain in the metaphorical ballsack to keep everything in sync.

This is the big problem with sub-repos: you end up having to keep things in sync manually, whereas a "one big repo" solution makes it a lot harder for a dev to commit something to the head of their component that won't build against the head of every other component.

0
0

The revolution will not be televised: How Lucas modernised audio in film

Kristian Walsh
Silver badge

Lucas and the importance of sound

I'm in the camp that says George Lucas was a groundbreaking director rather than a great one: others took his techniques and made much better films with them than he did, but in sound design particularly, he was a real pioneer. Lucas's early features, "THX-1138", "American Graffiti" and "Star Wars (Episode IV: A New Hope)", all deserve their places in the list of best films of the 1970s.

For anyone in doubt about how far back George Lucas's understanding of the importance of audio in a film goes, have a quick skip through this, which is his student film from USC (later used as the basis for Lucas's 1971 debut feature, "THX-1138"):

Electronic Labyrinth - THX-1138 4EB

Okay, as a student film, this isn't just low budget; it's no budget, so it's a trying watch at times. But the thing to note is how well Lucas used the audio track. After the pre-title sequence, there's nothing you could call "dialogue", just a layered soundscape of radio chatter that builds up over the running time to provide tension.

7
0

Emissions cheating detection shines light on black box code

Kristian Walsh
Silver badge

FCA case...

The article doesn't go into much detail about the Fiat-Chrysler case, and judging by the discussion of the code, it's really about the VW emissions cheating. FCA's case is different; here's a summary:

The EPA examined the engine-control software of FCA's 3.0-litre V6 diesel (made by FCA subsidiary VM Motori), fitted to some RAM pickups and Jeep vehicles, and found a few control loops that it felt would affect emissions in some cases. The behaviour changes weren't major, but the EPA claimed they could affect the emissions results (resulting in failures in some on-road tests that had passed in the lab).

The charge against FCA is not that it actively attempted to defeat the emissions test, but rather that it did not fully disclose these additional control systems in the engine software. Unlike the VW trick, these can all be triggered in normal driving conditions; but like the VW trick, they are always active in the EPA test cycle.

Strangely, had FCA told the EPA about these up front, there might not have been any case to answer - the US rules state that you disclose everything to the EPA, which then judges whether it's material to the real emissions performance of the vehicle.

The Volkswagen case is an order of magnitude more serious than anything else in the industry: it represented co-ordinated planning by VW to misrepresent the engine under test in order to allow them to sell a diesel engine in the USA that they knew beforehand would never meet US emissions regulations in any normal mode of operation.

6
0
