* Posts by Deltics

355 posts • joined 15 Jun 2014


Turn on, tune in, drop out: Apple's whizz-bang T2 security chips hit a bum note for Mac audio


Possible solution...

I haven't read the threads and maybe someone has tried this and found it doesn't work, but what about dongling a USB 2.0 hub via Thunderbolt ? Would that take T2 out of the critical path for anything connected through the dongled USB hub, or does it still get involved ?

Maybe ?

Forget snowmageddon, it's dropageddon in Azure SQL world: Microsoft accidentally deletes customer DBs


Re: roll forward logs

From the information about the incident it seems this is more-or-less what they DID have, albeit the snapshot was at most 5 minutes older than the moment of droppage.

But even if the snapshot had been taken immediately before the DB was dropped, there would still have been an elapsed period between the database becoming unavailable, the screw-up being realised, and the database being restored from that snapshot.

The statement that "5 minutes of data was lost" comes from El Reg and appears to naively extrapolate from the claimed oldest age of the backup, ignoring the question of just how long it was between the database ceasing to exist, the realisation of same, and the restoration of the backup.

You like JavaScript! You really like it! Scripting lingo tops dev survey of programming languages


Re: Yeah...

"Re-Inventing the wheel" is a phrase that is right up there with the most over-used, abused and mis-appropriated in history.

Nobody EVER re-invented the wheel. The wheel was invented precisely ONCE (discounting for simplicity any question of parallel invention).

What DOES happen is that new types of wheel are constantly being designed to suit purposes and applications for which existing wheel designs are not well suited.

The designers of the Boeing 747 were not castigated for re-inventing the wheel when, in designing the undercarriage, they refused to simply slap on the wheels from a Model T Ford.

Mine's the re-invented human enclosure garment with the re-invented telecommunications device in the re-invented integrated garment storage solution.

Ooh, my machine is SO much faster than yours... Oh, wait, that might be a bit of a problem...


Re: Well that was an invisible problem

The other thing to remember is that in 1990, Cat 5 cabling was still 10 years in the future!

RIP 2019-2019: The first plant to grow on the Moon? Yeah, it's dead already, Chinese admit


Re: Puzzled

I'd challenge the assertion that science doesn't try to prove (or disprove) things "everyone" knows won't work.

Once upon a time well over 90% of people "knew" that the Earth was the centre of the Solar System, if not the Entirety of Creation. So to keep taking observations to create a model that placed the Sun at the centre of our particular local system and establish that as just one of countless such systems would be a complete waste of time, yes ?

Once upon a time well over 90% of people "knew" that life could not exist without heat, light and oxygen. So there would be no point looking for life where one or more of that trinity was not to be found and we would still be entirely ignorant of the vast range of extremophiles that defy our previous "certainties" about where life could be found.

Once upon a time well over 90% of people "knew" that Earth is a flat disc, so to try to sail beyond the Great Ice Wall that kept everything on said disc would be to condemn oneself to an eternal descent into an infinite abyss. Of course, science would have no truck with such a reckless and self-evidently suicidal endeavour.

In a sense you are right about science being interested in "new things", but sometimes those "new things" are ideas which may very well contradict or at least undermine our previous or current known truths. i.e. scientists absolutely are concerned with proving (or disproving) what we all already "know" because surprisingly often what we all "know" turns out to be wrong.

Sometimes they do indeed simply confirm what we already know, and we ridicule them for wasting their time (and our money). But every once in a while .......

On this occasion, perhaps the thinking was that the germinated seed might react quite differently to the very different nature of the onset of Lunar Night (as opposed to any process we might have on Earth for "freezing" a germinated seed to approximate that process).

We wouldn't know unless we tried it.

And yes, it's a good PR stunt as well. But 'twas ever thus... ask the Montgolfiers about the scientific value of PR.

Come mobile users, gather round and learn how to add up


Re: Aproximate

That's not really approximation, just inefficient representation.

0.9999..... == 1.0

Ergo 3.9999...... == 4

'Proof' (probably much over-simplified, and possibly not really a 'proof' at all): if there are infinitely many 9s after the decimal point of a value that you wish to increase to the next whole number, then you need infinitely many 0s before the 1 to be added; but if there are infinitely many 0s, then you can never reach the point at which to place that 1. If you cannot express the difference then there is no (meaningful) difference, other than in the representation.

By extension I suppose n.0000.....N is always also equal to n.0
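For what it's worth, the usual algebraic argument for the same conclusion (a sketch, not a rigorous limit proof) goes:

```latex
\begin{align*}
x        &= 0.999\ldots \\
10x      &= 9.999\ldots \\
10x - x  &= 9.999\ldots - 0.999\ldots \\
9x       &= 9 \\
x        &= 1
\end{align*}
```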

Mine's the one with reciprocal ∞ in the pocket.

Google settles Right To Be Forgotten case on eve of appeal hearing


The "right to be forgotten" is specious nonsense. There is no such right. Or if there is, it is in constant conflict with everybody else's "right to remember".

Lest we forget.

JFrog to open freebie central repository for Go fans in the new year


Re: So... trying to replace GitHub?

Yes, GitHub is version controlled, but it doesn't mandate or enforce a standardised mechanism or convention for obtaining or specifying a specific version "X" of module "Y" without knowing the specific repository in which it is held, the layout of that repository, and the particular tagging conventions used by the repository owner to indicate specific versions in the commit history.

It seems, rather, that JFrog are simply trying to establish for Go what nuget, npm, maven et al have established for other languages (and note that GitHub doesn't make those any less useful).

Having code in nuget, npm and maven doesn't make it more trustworthy, simply more convenient and reliable (*).

Same rules apply.

(* reliability in the sense that when two developers say that they use version X of package Y, they can now be sure that they are indeed talking about exactly the same thing, in a way that they might not be if they each pulled the code from a different source, each with its own idea of which particular revision of that code should be labelled with that version. To that extent you might also consider it **more** trustworthy on those terms, even if it does nothing to assure anyone that "this code works and doesn't contain anything dangerous")
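As it happens, Go modules (which arrived at around the same time) standardise exactly this: a go.mod file pins module "Y" at version "X" by import path and semantic version. A minimal illustration - the module path and version here are invented for the example:

```
module example.com/myapp

go 1.12

require github.com/gorilla/mux v1.7.0
```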

Keen for much-hyped quantum computing to finally land? Don't expect it for a decade


* However in certain limited applications we already have AI. The google search engine for example

The classic mistake of re-branding data-processing as AI.

* Getting a flying car that a non-pilot can fly and not crash into the myriad other flying cars is hard.

And land. Let's not forget landing. Any idiot can take off and fly and largely be successful at avoiding other things in the air. But getting safely back on the ground again? That's the REALLY hard part.

Almost as hard as building a personal vehicle with sufficient energy store and efficient propulsion to have a range that makes it useful for anything beyond saying "See! We built one!".

* Expect military applications [for driverless cars] in the next few years

But don't hold your breath for the broad and tumultuous changes to infrastructure and legal liability laws required to make widespread, meaningful personal use feasible.

* [cure] To all cancers, no. To some cancers, yes

Yes, the problem here is the categorisation of many disparate conditions with a common underlying cause into one "curable" thing. One might just as well posit a "cure for viruses". A cure for cancer is a null proposition.

* the problems [with developing fusion power] are engineering and material science not physics

That's true of any new technology. There was nothing stopping Neanderthals from developing fusion power. The physics was the same; the lack of engineering and material science was the only problem. So sure, we can have anything (subject to the laws of physics being observed), we just need something unspecified, currently unknown and possibly unknowable, to make it possible. You might extrapolate from this that it is only a question of time, possibly. But to put a timebox on that discovery is to attempt to draw a long bow where the bowstring is not fixed at one end. Try it.


Falcon 9 gets its feet wet as SpaceX notch up two more launch successes


Re: NASA has concerns over SpaceX culture??!?

And what sort of a record do we think SpaceX would have had in the 1960s (never mind in their actual timeline) had they not had the benefit of learning from NASA before them (and, in some cases, of NASA stepping in directly to correct some pretty basic blunders before they had a chance to embarrass the wunderkind)?

"On the shoulders of giants" and all that.


> Falcon fairing halves missed the net, but touched down softly

> in the water. Mr Steven is picking them up.

He may not be much cop at a game of catch but it's good to know that the Death Star's Head of Catering is not above retrieving his own dropsies. Just a shame that this one is wet. And this one is wet. And this one is wet! And this one is wet! And this... What?! Did you dry these in a f*****g rain forest ?

£10k offer to leave firm ASAP is not blackmail, Capita told by judge


Well, you could take the view that money was being demanded of her, in the form of the savings to the organisation from terminating an employee at a fraction of the cost of the salary that would have been paid over the period during which they were being "performance managed" out, if indeed such was the intent.

Douglas Adams was right, ish... Super-Earth world clocked orbiting 'nearby' Barnard's Star


Fascinating discussion about how to slow down a probe 6LY distant...

... but I don't see anyone addressing the question of WHEN to do so.

We're talking about a star system so far away that we're really only estimating the distance, and a probe being sent to investigate a planet that we only think is there, with no data for exactly where it is, let alone where precisely (or even approximately) it will be at the time that the probe arrives.

Even if the distance calculation is accurate, and in astronomical terms might be considered "good enough", once you're in the locality you need to be assuredly more precise. In the same way that if I were to send a package from here (Auckland) to a friend back in the UK, just saying "Please take this package 18,360km that-a-way" (points toward London) is not going to result in a successful delivery or even arrival in the right county, let alone town.

Any probe will need to be fully autonomous: able to determine its arrival at the star system, identify and locate any and all planetary bodies, and perform the calculations necessary to achieve a stable orbit around any particular thing of interest, avoid missing everything of interest and sailing on past into the void, or (irony of ironies) avoid colliding with the very thing it has been sent to probe (or with something else whose existence we haven't yet determined from our current, remote observation station).

And it needs to be able to choose when and where to perform any delta-v itself. Anything reliant on Earth-based technology (requiring data to be sent to Earth and then commands or laser light destined for any sails sent back to the probe) would result in that delta-v occurring almost 12 years too late.
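The "almost 12 years" is just the round-trip light delay. Taking Barnard's Star at roughly 5.96 light-years (the commonly quoted figure, assumed here purely for the arithmetic):

```python
# Round-trip signal delay to Barnard's Star, at lightspeed.
# The 5.96 light-year distance is an assumed figure, not a measured input.
distance_ly = 5.96

# A signal travelling at c covers one light-year per year, so the one-way
# delay in years equals the distance in light-years; double it for the
# telemetry-out, commands-back round trip.
round_trip_years = 2 * distance_ly

print(round_trip_years)  # -> 11.92
```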

Brit boffins build 'quantum compass'... say goodbye to those old GPS gizmos, possibly


Re: Space?

Which then just leaves the question of figuring out where Earth is once you've returned to the spot it was at when you left (assuming that you can compensate for the expansion of The Universe, that the duration of the trip is such that this expansion doesn't amount to a significant drift, or that there is no such expansion after all so it simply doesn't figure).

'Pure technical contributions aren’t enough'.... Intel commits to code of conduct for open-source projects


Re: what.

Either you work for HR (or People and Culture, if your org has supped from that kool-aid syphon) or you have a very dry line in sarcasm, to the point of being indistinguishable from sincerity.

Microsoft reveals xlang: Cross-language, cross-compiler and coming to a platform near you


Re: Not a new thing

Yeah, but that stuff is about 25 years old now, so cannot possibly be any good.

Besides, .NET "killed" COM, so for the last 20 years COM hasn't been cool, meaning we have a new cohort of developers who can re-invent this wheel all over again and think it's new. Sadly they will likely fail to learn from previous iterations because - again - it's old, so it is either forgotten or immediately discarded as irrelevant with nothing to teach us.

I wonder why this industry is seemingly so uniquely susceptible to this tendency to reject the past instead of learning from it?


Re: Stupid conflicting name

How did .net Core not make the list ?

I mean, .net Core is at the core of .Net right ? Oh no wait, that would be .Net Standard, which is the thing which everyone has been running for the past nigh on 20 years right ? Oh no, wait that's .net Framework. Standard is the common pieces shared between .net Core and .net Framework.

So we have .net Core which isn't at the heart of .net Framework and both of which are SUPER-sets of .net Standard which isn't the (previous) standard .net that everyone has been used to.


That 'Surface will die in 2019' prediction is still a goer, says soothsayer


Re: SP?

You, dude... how do you like living in a territory where a manufacturer can abdicate their responsibility for producing a product of reasonable quality after some arbitrary time period, and then charge you extra for the privilege of ensuring that the product does indeed last a reasonable period of time?

Come to NZ, where the Consumer Guarantees Act (CGA) provides every consumer the assurance that every product sold in NZ will last for a "reasonable" length of time (literally, what a "reasonable person would expect"), taking into account how the product has been treated by the consumer, its price, and the typical longevity of that type of product.

The CGA also applies a reasonableness test to whether replacement or repair is acceptable - e.g. if a product breaks after a few days of use (possibly weeks, depending on the cost of the product), or out of the box, then a new replacement is warranted (the consumer paid for a new product and is entitled to fair use of a new product). After a few months or years, a consumer may need to accept a repair or a reconditioned unit. Consumers may also exercise their right to reject the product entirely, as not being of merchantable quality (for a full refund), if it is not capable of performing as advertised/claimed or if it suffers repeated failures (the CGA continues to apply to any replacement or repair).

Manufacturers must replace or repair within that period regardless of what they may say in their warranty terms and conditions - it is impossible to opt or contract out of the CGA. Even Apple.

AppleCare or no AppleCare.

Plex plucks media cloud service, sends users scurrying to exit


"Plex allows people to stream their content from other places – media servers, cloud storage providers like Google One and Dropbox, etc – to any device they own for $5 a month, $40 a year, or $120 for "lifetime" usage."

And this capability is not affected by the shuttering of Plex Cloud, except in so far as it means no longer being able to stand up a media server in the cloud, directly connected to cloud storage.

Anybody running their own Plex server and using Plex Pass to access their content remotely can still continue to do so - that has _never_ relied on Plex Cloud. I think that should be made clearer, lest El Reg be accused of misrepresenting facts in the interests of fomenting discontent.

It liiives! Sorta. Gentle azure glow of Windows XP clocked in Tesco's self-checkouts, no less


Re: Cross platform development is EASY

Yep a cross-platform UI for kiosks would be straight-forward. Otherwise... not-so-much.

Kiosks are much simpler for such things because the UX is constrained by the very specific nature of the device. But the vast majority of software in the world does not run on kiosks. It runs on various different devices and form factors and operating environments, many of which specifically differentiate themselves from others by differences in the UX.

If cross-platform development were EASY, everyone would be doing it by now and it wouldn't need to be constantly re-invented to get-it-right-this-time-no-honestly-we-have-nailed-it-now. I am guessing you don't remember (perhaps weren't even born in) the time of the likes of Omnis and various other 4GLs that had "nailed" cross-platform development in the 90s - a time when there was far less diversity in platforms to contend with.

Which is of course why Omnis and its ilk went on to dominate the software development industry and why we are all using those tools now.

Oh, but wait. Then came Java with its Ultimate Solution to the write-once-run-anywhere problem. Hmmmmm.

Then .NET. Then Qt. Then FireMonkey. Then Xamarin. Then .net Core (Jeez even .NET is taking two bites at the cherry).

It's almost as if this is a bit harder than some people seem to think.

.NET Core 2.1 – huh, yeah – what is it good for? Bing, apparently


Compilation in the deployment process...

So .net core is faster because they finally fixed a bunch of problems with ngen that prevented it from delivering on the promise ? Cool.


Re: What is JIT good for?

Benchmarketing can be used to "prove" anything. There is only limited correlation between "performance tests" and real-world performance.

(which isn't to say that Java isn't slower in the real-world, only that performance tests don't prove that - the real world does).


You missed one:

5. Alternative to Bing

(probably should be higher up the list than #5 as well)

Prank 'Give me a raise!' email nearly lands sysadmin with dismissal


Re: do you really want a complete list?

I'd also offer into evidence "By His Bootstraps".

But to be fair, my take is that the time-loop is very much at the centre of Bootstraps and Zombies, whereas in TEFL (and To Sail Beyond the Sunset) the loop was only really a device to facilitate a much wider exploration of societal and cultural norms (very much the recurring theme in Heinlein's work) through the character of LL.

Microsoft Visual Studio Code replumbed for better Python taming


Re: 'IntelliSense autocomplete system'

I think perhaps the more pertinent question would be: where have you ever seen crotchets, minims, quavers, breves or clefs, let alone sharps or flats... in programming source code?

Eh ?

'#' has numerous "names" in different contexts. Its use as "sharp" in C# is purely whimsical, not due to any domain accuracy. As such, C-Octothorpe, C-Gate, C-Hash or C-Pound are equally valid whimsy (just not MS-compatible whimsy).

Google Chrome: HTTPS or bust. Insecure HTTP D-Day is tomorrow, folks


Re: stuck on HTTP

> Every HTTP site creates an attack surface exposing every visitor to MITM, injection, and other attacks.

Ironically, of course, every HTTPS site is also by definition an HTTP site. The presence of SSL doesn't change the fact that the basic protocol is the same.

The "ironically" part therefore comes from the fact that what you say about HTTP is also true of HTTPS. As soon as you put a publicly accessible site out there, you have created an attack surface exposing every visitor, etc. etc. Whether that site employs HTTP or HTTPS doesn't alter the accuracy of that statement, only the difficulty involved in exploiting the attack surface you are generously providing.

Elon Musk, his arch nemesis DeepMind swear off AI weapons


Elon Musk, of Tesla autonomous vehicles fame, claims he will never develop autonomous weapons.

I'll just leave this here.


Microsoft's TextWorld gives AI a Zork-like challenge


The real test, surely...

... would be to set an AI the test of getting this thing up and running in the first place, like our plucky correspondent. i.e. In the face of incomplete and inaccurate instructions can an AI actually achieve an outcome it has been given in the form of a verbal or written instruction ?

The answer of course is "Don't be ridiculous."

With that in mind, I hereby declare the end of supposedly tech-literate journalists throwing the term "AI" around as if it were either meaningful or relevant to any current technological endeavour.

Thank you.

Azure Dev Spaces has hit public preview, so El Reg took it for a spin


Pish and tosh. The truly modern way:

Write Tests

Write Code

Run Tests

Fix Code

Run Tests

Deploy Code

Run Code

Change Code

Suppress Test Failures

Run Code

Your phone may be able to clean up snaps – but our AI is much better at touching up, say boffins


Am I the only one...

... completely fed up with the way that "AI" has become the go-to term for what we used to call "Data Processing" ?

Sure, the volumes of data involved have increased dramatically, and the complexity of the algorithms along with them, but essentially the technology is no different than it has been for the past 30-50 years.

It's not AI, it's a computer. Dammit.

BlackBerry KEY2: Remember buttons? Boy, does this phone sure have them


Why would "lefties" mind which side the buttons are on ...

Is there a phone which has an entirely symmetrical design w.r.t. left/right handedness AND the ability to re-assign button functions to suit (i.e. completely reverse/mirror the configuration between left/right hand "modes")?

I don't think so. Even if there are buttons on BOTH sides, the buttons on one side typically perform an entirely different set of functions from the buttons on the other.

So buttons on one side or buttons on both sides, left/right handedness is not a consideration, except in so far as the designers presumably already are taking *both* into account and ensuring that people aren't forced to resort to pinkie gymnastics, regardless of any left/right bias.

Developer’s code worked, but not in the right century


Why not use a standard date format ? Time Machine.

You saw the part where it was mentioned that this was running in a mainframe environment ?

ISO-8601 was first published in 1988.

Chances are the data on the mainframe had been around for decades before anyone thought of the need to standardise on date formats, and was governed by more practical (at the time) considerations such as the need to save space, optimise processing efficiency, etc., within the constraints of what are likely to have been some very idiosyncratic/esoteric qualities of the runtime environment.
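The classic space-saving convention - two-digit years - is precisely the kind of thing that lands you in the wrong century. As an illustrative aside, Python's strptime still follows the POSIX pivot rule when expanding them:

```python
from datetime import datetime

# POSIX convention for two-digit years (%y):
#   values 69-99 are mapped to 1969-1999, values 00-68 to 2000-2068.
d1 = datetime.strptime("690101", "%y%m%d")
d2 = datetime.strptime("680101", "%y%m%d")

print(d1.year)  # -> 1969
print(d2.year)  # -> 2068
```

Two date strings one year apart, parsed a century apart. A mainframe format with its own in-house pivot (or none) would trip up anyone assuming this convention, and vice versa.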

So fine, years later these new standards come along, and processing power and storage efficiency are no longer the constraints they once were, and now your only challenge is to convince the bean counters that they really should invest their beans in refactoring ALL of the date-handling code in their systems - code which currently isn't broken - at the opportunity cost of any amount of other value-add work in those systems, together with the risk of introducing defects into systems that otherwise are working perfectly.

Good luck with that.

As for the documentation part of your question: what is highly likely to have happened here is that the date conventions in use were indeed very thoroughly documented, but thanks to wilful misinterpretation of the Agile Manifesto (among other things) nobody believes in documentation these days (until AFTER they have learned how important it is). The developer might even have been told they had to read that documentation. Perhaps they had even read it, in order to pass a control gate, but didn't actually take it in. Or perhaps they had read it years before, hadn't had to deal with date values in the data for so long that they had forgotten, or simply overlooked what they knew.

Bottom line is: This sort of screw up happens. That's why we say that working software is more important than documentation, but "working" doesn't mean "compiles and runs". It does mean that, but so much more.

Which is why testing is so crucial and why you never run new code for the first time in PROD.

That's the real mind-blown aspect of this story.

Keep your hands on the f*cking wheel! New Tesla update like being taught to drive by your dad


Re: Auto-crash-pilot

For many vehicles a wetware driver doesn't need any technology to "see through" the vehicle in front, other than the front and rear windows already installed and their own ocular sensor array. (Translation: you can simply, literally see through the car in front.)

Now, following a truck, SUV or family saloon loaded with crap on the parcel shelf, the wetware obviously has to make a suitable adjustment.

And that's the thing - for all the talk of "advanced" tech and AI and other brands of snake-oil, the simple fact is that as "impressive" as what might have been achieved might be, it is still a long, LONG way from being able to match a human. Even one of only average intelligence.

Shatner's solar-powered Bitcoin gambit wouldn't power a deflector shield


World leading Geothermal nation ... Iceland ?

So they're doubling their output to 100MW. That's nice. But also actually wrong.

100MW is the capacity of just ONE of Iceland's geothermal stations, with the overall total capacity somewhere closer to 705MW.

Let us know when they get close to New Zealand's 854MW.

Microsoft returns to Valley of Death? Cheap Surface threatens the hardware show


Re: Low Cost? at $499!

"but all Apple major products are massively over priced for what you get"

Wrong. Linus (of Tech Tips) demonstrated this by doing a tear down of an iMac, or possibly an iMac pro - I can't be bothered looking it up right now.

The point being that if you went out and bought the same components used in the assembly of that Apple product (with the exception of the case, which of course would have to be a commodity case since you cannot buy the proprietary iMac chassis off the shelf), then your total spend would have been several hundred dollars MORE than buying the same components assembled by Apple in the form of an iMac/Pro.

Now, they also went on to establish that you could get equivalent or better performance with some smarter buying decisions, but in terms of a like-for-like component list, Apple was actually cheaper.

Google accidentally reveals new swipe-happy Android UI


Pursuing swipability by removing/hobbling .. swipability

Google feeds - you USED to be able to swipe articles to dismiss them. NOW you have to stab at the individual burger menu and tap "Hide this story" from the menu thus summoned.

GMail - you can swipe email notifications, but only to snooze them or to get to the GMail settings (!), wherein you cannot make any further changes to what swipe does. As a bonus, this useless swipe action works equally pointlessly in either direction.

I'm sure things were more swipable in the past but if we are assured this is "progress" then I guess it must be.

Samsung Galaxy S9: Still the Lord of All Droids


Unable to remove unused camera mode options? Au contraire, mon ami...

Simply long press on one of the camera options in the swipe menu/list. Bingo... pick and choose which modes you want included (two are not even enabled by default: Sport and Slow Mo [as opposed to SUPER Slow Mo]).

You can even change the order of them to group your preferred/most often used modes together.

A similarly hidden AWESOME feature is the ability to long press the shutter release button and DRAG IT TO WHERE-EVER YOU FIND MOST COMFORTABLE!!! Sorry for shouting, but this is so ridiculously useful when trying to take selfies and getting cramp from trying to hold the phone steady and crab your finger to reach the button.

I've historically always found phone camera apps to be a frustrating mess. The S9+ app on the other hand... brilliant.

A developer always pays their technical debts – oh, every penny... but never a groat more


Like any Debt, Not all Technical Debt is equally bad

Weigh the debt against the cost to repay and the value/cost of delay over the life of the debt.

A credit card debt is a wholly different class of liability than a mortgage.

For example.

Airbus plans beds in passenger plane cargo holds


Re: Glossing a commercial turd

In the commercial passenger air business, passengers are already referred to - or at least thought of - as "self loading cargo".

The problem is, you can't slap a barcode or RFID tag on a passenger that they couldn't/wouldn't remove, lose or give to someone else. So all that automated cargo tracking that works so well for manually loaded cargo is exactly the bit that doesn't work for the self-loading stuff, and you have to keep checking that each piece of cargo really is the piece of cargo that the documentation it carries claims it is.

Also, a box of auto components being shipped around the world cannot start having villainous thoughts and plan to bring down the plane, or hijack it in order to fly to its preferred destination of Maranello instead of the scheduled arrival at Dagenham (via Heathrow), or secure the release of fellow auto components imprisoned in JIT parts bins in Sunderland.

Complications, complications.

If you could organise to collect passengers from their homes or designated pick-up points, verify their cargo papers at that point, and place them in a secure container (from which they cannot leave) for delivery to the airport to be directly loaded on board, luggage and all, then sure, that could work. You just need passengers willing to be treated even more like cargo than they already are.

No password? No worries! Two new standards aim to make logins an API experience


Re: WebAuthn

Don't worry, it will soon become w15n (or possibly web13n, to keep the web generation happy).
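For anyone who hasn't met the i18n/l10n convention the joke leans on - first letter, count of the letters in between, last letter - a throwaway sketch:

```python
def numeronym(word: str) -> str:
    """Abbreviate a word i18n-style: first letter, middle-letter count, last letter."""
    return (word[0] + str(len(word) - 2) + word[-1]).lower()

print(numeronym("internationalization"))  # -> i18n
print(numeronym("WebAuthentication"))     # -> w15n
```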


Re: And when your biometric data gets stolen?

The bigger problem is that, as things currently stand, biometrics are not even _as good_ as your username. Biometrics often cannot tell that I am me (poor lighting, not holding the sensors in the "correct" relative position to the biometric being measured, etc.).

If biometrics cannot reliably identify me (false negative) then it must be allowed that they might sometimes be confused that someone else is me (false positive). Even if that is not the case, only being able to identify me as me when particular environmental conditions pertain is equally dumb. I am only me when I am well-lit me? Hmmm, maybe that's why some people prefer to get intimate with their partners only with the lights off?

Are usernames really any better? For sure I can deliberately provide a false username if using a username as... my username... but nobody relies on _just_ a username for authentication. That would be stupid.


Application publishing gets the WebAssembly treatment


What in the name of all that's angled or curly does WebAssembly have to do with C/C++ ?

WebAssembly is just a VM standard. It may reference C/C++ as examples of the sorts of languages that could target it, but this is primarily because those languages are built on compiler stacks with re-targetable back-ends, not because WebAssembly is specifically aimed at C/C++ (or Rust, which also gets a nod).

Rather, it's an alternative to JavaScript - which, incidentally, some have used as a back-end target for alternative front-end language implementations, e.g. Smart Mobile Studio, which transpiles Pascal into JavaScript, or even the likes of TypeScript et al.

RemObjects recently added WebAssembly as a back-end to their compiler stack, which means you can target WebAssembly using C#, ObjectPascal, Swift or Java, since these are the front-ends already available on that stack.

All the king's horses and all the king's men could probably put Huawei's P20 Pro together again


Re: But wait...

Um, surely that depends on the case ?

In the Samsung LED View cover the only "unprotected" parts of the phone are (when the cover is open) the screen (duh!), about 1" around the power, 2.5" around the volume and Bixby buttons and about the same at the bottom for the ports and speaker.

"Unprotected" in scare quotes because I would have to drop the phone onto a particularly pointy thing to impact those specific areas in a way that wouldn't be protected by the rubber case either side of those areas.

And the phone still looks pretty lovely, even in the case (and the LED View cover functionality is almost as lovely and a real conversation starter when people notice it).

Sure, the LED View cover is not cheap, but actually not much more expensive than the official but basic Samsung flip covers. And I've never understood the logic that results in someone being happy to spend many HUNDREDS of dollars on a phone but baulk at spending more than 20 bucks on a case to protect it.

NUC, NUC! Who's there? Intel, warning you to kill a buggy keyboard app


Why not simply make this vulnerability abundantly clear and then leave users to make up their own minds ?

I have a NUC which is in an accessible situation in my living room and I use this app for convenient control from my phone, without having to reach for RDP via a laptop or struggling with the non-mouse/soft-keyboard disconnect that a tablet provides. Plugging a mouse and keyboard into the front-firing USB-A's is easier than that.

Which also provides an attack vector for anyone wishing to "inject keystrokes" or indeed mousey gestures, that removing this app will not address.

Yes, if anyone can get on my wifi from outside the house then this is a greater attack surface than physical access to my NUC, but securing my wifi is a separate problem which, if adequately addressed, surely renders attack vectors which depend on that access moot ? No ?

Hookup classifieds ad sheet Backpage.com seized in Feds shutdown


First of all, let me make it abundantly clear that I am not advocating or in any way supporting sex trafficking or any other form of trafficking or exploitation, not even the legally sanctioned exploitation of workers by tilting the legislative and regulatory regimes in favour of employers, in WHATEVER line of work their employees may be engaged in but which don't attract the liberal left penchant for campaigning and pyrrhic victory celebrations. But I digress...

If you wonder how the people of Backpage feel about this, then you must also ask yourself how the inventor of email feels. Or the telephone. Or the printing press. Or the first human-like creature to make an utterance which was understood by a fellow human-like creature to be a form of communication.

Backpage is - like innumerable other examples - a medium by which people are able to connect and communicate and exchange wants and needs. Shutting down the entire medium because of a small number of people using that system to engage in an exchange of wants and needs that are illegal or morally reprehensible is the proverbial sledgehammer to crack a nut.

What about all of the sex workers using Backpage around the world as a legitimate part of their legitimate business ?

What about all of the people around the world using Backpage for purposes wholly unrelated and unconnected to sex trafficking, the sex industry or any illegal or even legally questionable activity ?

We should console ourselves with the thought that all of those legitimate activities can simply find alternative channels ? Well so can the professed intended targets.

Hard problems demand hard solutions, not clumsy ones.

Blackberry snaps, yakkity-yak Snapchat app brats slapped with patent trap rap


Re: Ahhh, the optimism of lawyers

Maybe not how it works in the US but elsewhere you may have to demonstrate actual losses, since your claims for recompense are limited to actual, demonstrable losses; otherwise your claim is punitive, not restorative, and in some jurisdictions there is no - or very limited - scope for punitive damages.

Optimism of lawyers is directly (or perhaps geometrically) proportional to the scope for them to make hay in the sunshine of the legal system in their particular jurisdiction.

UK tech whale Micro Focus: Share price halves as CEO quits, sales slide


Re: Legacy stuff?

Visual COBOL.NET is both a thing and actually pretty damned cool (if you're in the business of trying to migrate legacy code off of a mainframe and onto the .net platform).

COBOL was arguably the first "managed runtime" - one of the reasons that 50 year old code still compiles and runs on modern hardware (the code is old, the kit it runs on is not) is that the source language was abstracted so far from system characteristics. The same applies to the COBOL abstractions for .NET. In a PoC I was involved in a few years ago we found that mainframe COBOL code could simply be re-compiled for .NET - the only tricky part was any EXEC interface code, in our case switching from Oracle Pro*COBOL SQL integrations to ODBC/ADO, but that was a tiny, tiny fraction of the code and quickly and easily dealt with (most of it could be automated).

Like I said, pretty cool. But I can't say I'm unhappy that I didn't then get involved with the larger project to actually migrate the entire codebase. :)

Facebook suspends account of Cambridge Analytica whistleblower


Re: Oldsters?

"Young folk have no such qualms, understand the transactions they participate in and are more familiar with the privacy controls of the services they use."

This is clever, platitudinous patronising.

What he means is that anyone under 25 has not yet experienced enough unnerving consequences of sharing their data and so are still in that naive state where their complacency can be exploited.

Of course, if he actually said that then he would draw attention to the true thinking and insult and offend his market, both and all at the same time. So instead it is put in terms that appeal to the same under-25-year-old's vanity and sense of superiority.

10 PRINT "ZX81 at 37" 20 GOTO 10


My first step in a journey of 1,000 miles...

I remember my dad taking me to the local sorting office 37 years ago to collect the parcel from Sinclair, a shiny (well, actually no) new ZX81. One of the pre-assembled ones. I imagine this is why I ended up in software rather than hardware - no soldering required. I never could get the damned thing to load anything from tape, so I had to bash in listings from magazines, and then from that took the step to figuring out that by understanding this "gibberish" I could then change those games and make them do what I wanted.

And so began my life as a self-taught software developer.

My own personal computing history then went down a slightly esoteric route. While most of my friends had Speccies or Breadbins (C64's), with a couple of Electrons and even one Dragon32, my upgrade path took me to a TI-99/4a, since a lot of my family (a large number of aunts and uncles), not to mention my own parents at an earlier time, worked for Texas Instruments. When they pulled out of the home computing market, they sold off stock to employees at bargain basement prices. Idiosyncratic as it was, it was a damned fine machine.

My path then returned to a more conventional route, with an Amiga 500 and after teaching myself C and Pascal programming with the multitasking GUI OS I then landed a job building software for this new fangled "Windows" on PC's which seemed positively antique compared to the Amiga (this was before even Windows 3.0).

Thus began a career in computing and software, all thanks to my late dad and that ZX81.

I've recently started assembling a museum of that personal computing history (the original equipment having been on-sold or disposed of, sadly not being appreciated at the time for the future value - sentimental or otherwise - that it would hold).

The one exception being my original ZX81, but that hasn't actually worked since the late 80's. But I have now replaced it with one from ebay in near-mint condition, including original box and sleeve.

There's something magical about switching on these old machines and seeing them spring to life as eager to please as ever, oblivious to the passing of time that has rendered them obsolete as anything other than curios and objects of affection.

Us? Reverse engineer HoloLens? No way, not us, nuh-uh – Magic Leap


1st of March is the first day of March, not spring. Check back on the 21st.

It makes sense that some people want a nice neat calendar that lines up arbitrary, human-relatable ideas with (or sometimes despite) observable astronomical phenomena. It's happened many, many times over the centuries.

But some things still don't line up neatly, no matter how much we might like them to.

The equinoxes are one of those things (well, two technically).


Biting the hand that feeds IT © 1998–2019