* Posts by Andrew Commons

226 publicly visible posts • joined 22 Dec 2007

Chinese server builder Inspur trains monster text-generating neural network

Andrew Commons

Re: "Work on natural language processing"

@Mike 137

The 1950s and 1960s are a really interesting period where computing is concerned. WW2 saw the development of special-purpose machines aimed at breaking ciphers. After that the concept of a generalized computer emerged, and people were figuring out how to build them (they were one-offs and had individual NAMES), how to program them, and what they might be used for. AI/ML was of immediate interest right there beside A-bomb yields and other number-crunching areas. The fact that business applications were different emerged quite rapidly in this period and influenced hardware development.

Now it's all kind of boring.

Andrew Commons

Re: that is more or less expected

"So, there will be systems in the not so far future that will be able to add meaning to the language they input and use that to generate meaningful language in the output."

Didn't Microsoft try something like that with Tay?

https://en.wikipedia.org/wiki/Tay_(bot)

Andrew Commons

Re: "Work on natural language processing"

"Work on natural language processing has been going on since the '80s and it's nice to see it coming near fruition."

I think you can trace it back to the 1950s. A small sample of papers from that period.

W. A. Clark and B. G. Farley, “Generalization of Pattern Recognition in a Self-Organizing System,” in Proceedings of the 1955 Western Joint Computer Conference, Los Angeles, California, 1955, pp. 86–91.

Datamation, “Sarnoff foresees Voice-Controlled Systems,” Datamation, vol. 3, no. 7, p. 23, Oct. 1957.

D. L. Johnson, “The Role of the Digital Computer in Mechanical Translation of Languages,” in Proceedings of the May 6-8, 1958, Western Joint Computer Conference: Contrasts in Computers, Los Angeles, Calif., 1958, pp. 161–165.

W. W. Bledsoe and I. Browning, “Pattern Recognition and Reading by Machine,” in Papers Presented at the December 1-3, 1959, Eastern Joint IRE-AIEE-ACM Computer Conference, 1959, pp. 225–232. [Online]. Available: https://doi.org/10.1145/1460299.1460326

It was certainly well and truly on the radar, and chess-playing programs were being produced on the hardware of the day.

React team observes that running everything on the client can be costly, aims to fix it with Server Components

Andrew Commons

"Working software over comprehensive documentation"

You reap what you sow.

I was coding for a living way before this, when comprehensive documentation was part of the process. The process was applied on a per-project basis, and the problem that wasn't addressed very well was centralising the documentation so that it reflected the system as it evolved over several decades.

The documentation was far from shit, it was the documentation management that sucked. Now you don't even have the documentation to mismanage.

Zero. Zilch. Nada. That's how many signs of intelligent life astroboffins found in probe of TEN MILLION stars

Andrew Commons

@Andy Non

"I'm beginning to think that intelligent life is a temporary phenomenon and self destructs beyond a certain level"

This is actually one of the theories floated to explain lack of a signal. They only last a very short time.

Hidden Linux kernel security fixes spotted before release – by using developer chatter as a side channel

Andrew Commons

Re: Linux kernel doesn't do too badly with this intractable problem

@Glen Turner 666

You are spot on.

With regard to point (2) many organisations have formal procedures for vetting people allowed into the 'inner circle'. Whilst these are fallible they at least raise the bar to some extent. I have no idea if such processes are applied in critical open source development environments.

The kernel is only one area where this problem exists and is probably not the best option for exploitation. The sweet spot is probably some component that is widely used and is not a standard component of major distributions.

If you use something direct from the (open) source then you are responsible for the due diligence.

Andrew Commons

Re: Security by obscurity, yawn

"What does security by obscurity offer?"

It can buy you a bit of time. The analysis demonstrates that, in this case, it may not buy much time.

Andrew Commons

Re: Fear mongering

"First, they are very few, highly trusted individuals. Second, the results of their activity is available for all to see after the fact."

A bit like Guy Burgess, Donald Maclean and co then?

We know that critical bugs can hide in plain view in open source software for years. I would be surprised if this attack vector has not been considered by actors who are prepared to take their time.

Relying on plain-text email is a 'barrier to entry' for kernel development, says Linux Foundation board member

Andrew Commons

Re: So not just about plain text email

The thing is that HTML email clients provide really good support for tracking technology.

Now that's got to be a good thing!!!

Well, actually no, and maybe the folk who are aware of and concerned by the tracking implications are the sort of folk you want in the kernel.

Have an up-vote.

Australian PM says nation under serious state-run 'cyber attack' – Microsoft, Citrix, Telerik UI bugs 'exploited'

Andrew Commons

Coincidence?

I did a WHOIS lookup on the first IPv4 address in the published IOCs....

It's assigned to a certain Vultr Holdings...

Has The Reg got something they are not telling us?

Cheshire Police celebrates three-year migration to Oracle Fusion by lobbing out tender for system to replace it... one year later

Andrew Commons

Re: Requirements issues?

Different problem. Not a good analogy on my part.

You need the ability to have some information in the systems treated as classified.

In addition there may be unusual requirements around work hours which will not be found in 'normal' work environments.

They are paramilitary organizations which brings a bit of baggage with it.

Andrew Commons

Requirements issues?

Law enforcement systems can have unique requirements where HR and Payroll (and maybe other systems) are concerned. The inclusion of participants that are not law enforcement bodies is very strange and may be part of the problem here.

For example...someone is on the payroll but they will have an uncertain future if it is known they are on the payroll. But they still have to pay taxes on the income....

Making mug shots disappear is actually a real requirement.

Google shifting workloads to run when the sun will shine and the wind will blow

Andrew Commons

Re: Local regulation?

@Olius, this may be the way things go....but the only thing we can be sure of is that the resulting API will be subject to abuse...so grid stability will be even more compromised.

Some areas may also have their own plans for excess power...pumping a bit of shit uphill never hurts...so this will have to be factored into the equation.

Google is always self centered.

Andrew Commons

Local regulation?

Unless Google is using its own generating capacity I would expect this would require some regulation. Having all your green energy sucked out of the grid without notice by Google would cause quite a few problems. An exaggeration of course but managing the electricity grid can be very complex and I don't think sudden load shifts are welcome.

RIP Katherine Johnson: The extraordinary NASA mathematician astronauts trusted over computers

Andrew Commons

Re: 7090

The DEC PDP-1 and the IBM 7090 were both rolled out in December 1959. So while the term may not have been invented, comparisons were possible.

EFF warns of 'one-way mirror' of web surveillance by tech giants – led by Google

Andrew Commons

Cookie Zip Bomb?

I believe you can have compressed cookies so you could maybe exploit this to deliver a zip bomb.

Are you coming to the party dressed as an IMP? ARPANET @ 50

Andrew Commons

Re: History is written by the winners

In the late 1990s DECnet inter-networks were as big as IP inter-networks. I had DECnet on Macs as well as microVAXes and Ultrix workstations; it was widespread and pretty easy to use.

If DEC had not crashed and burned it could have been a very different world.

Google to bury indicator for Extended Validation certs in Chrome because users barely took notice

Andrew Commons

This is hilarious.

Once upon a time, long, long ago - well, in the late 1990s anyway - when eCommerce was becoming a thing, ".com" certificates were only issued after verifying that the applicant was a genuine legal entity. You had to produce a lot of paperwork and it was not a quick process.

Roll forward to the mid-2000s and all that has gone. Getting a ".com" is a trivial exercise. The certificate authorities responded by running road shows for "Extended Validation Certificates" that were only issued after verifying that the applicant was a genuine legal entity...and would cost more than the original ".com" that you had jumped through hoops to get. Oh...and they had this green stuff in the "chrome" of the browser that could not be manipulated.

Roll forward...and it's all shit again. And it will always be shit. The technology works, the process doesn't.

IT outages in the financial sector: Legacy banks playing tech catch-up risk more outages, UK MPs told

Andrew Commons

@Glen 1

In my experience (more than a decade and a half in the finance industry) it was records management failures that gave rise to documentation voids. It was documented, management insisted on it, and subsequent teams kept it up to date. It reaches a point where no further change is required. Then, over time, just like the Saturn V, the documentation gets lost. This is generally tied to internal structural reorganizations.

"If the docs don't exist, then we just have to learn the hard way." If you think having the working COBOL source as a starting point is 'the hard way' you have a bit to learn yet. If the source code has also been lost you are about to learn why those who came before you resisted the urge to change this code every other day in order to 'surprise and delight' the customer.

Andrew Commons

And for decades it has been handling all the really obscure edge cases and convoluted regulatory requirements that are also not understood by the people writing the shiny new systems. Outages will be the least of their problems when all that starts to bite.

Andrew Commons

Bragging rights

"Internal bragging rights"..Yup, Dev and Ops orbiting so closely and so rapidly they are being picked up by LIGO.

The whole development methodology space seems to have become completely unglued.

White House mulls just banning strong end-to-end crypto. Plus: More bad stuff in infosec land

Andrew Commons

Goodies and Baddies

Telling the difference between 'goodies' and 'baddies' when dealing with encrypted traffic is nothing new. The same problem has existed with physical messaging forever, that's why the plain brown paper envelope was invented. More recently we have "burner" phones. Traffic analysis can potentially fingerprint software but sticking with widely used applications provides the anonymous envelope.

Traditional methods, such as human intelligence sources, still work but scaling them to deal with the Internet is the unsolved problem.

Andrew Commons

Back to the future

In the days of COCOM, and in fact early Wassenaar, encryption was recognised as dual-use and export controlled. Banning strong e2e is just 'back to the future' and, having been there already, we know how that works out. The algorithms leak, new algorithms are created, and those who are outside the immediate reach of the authorities roll their own. And, of course, you can always resort to a one-time pad. Difficult-to-decrypt communication is not easy to ban unless you ban encrypted communication completely...but then you have things like steganography.

Out-of-office email ping-pong fills server after server over festive break

Andrew Commons

Re: Exchange?

@jake

"...because Microsoft uses DCE (Distributed Computing Environment) as developed by the Open Software Foundation in the early 1990s"

I think that should read:

"...because Microsoft butchered DCE (Distributed Computing Environment) as developed by the Open Software Foundation in the early 1990s"

From memory, they 'tweaked' certain 'standards' and built a wall between the NT and DCE worlds. Then, seven years later, they realised that this had not been such a great idea and sucked up to Kerberos. But it had badly damaged DCE by then.

Microsoft debuts Bosque – a new programming language with no loops, inspired by TypeScript

Andrew Commons

Re: What's Wrong With a Loop?

@Paul Crawford

FORTRAN was quite happy with multiple RETURN statements as well as multiple ENTRY statements. In a memory constrained world the RETURN would save you one or two bytes over a GOTO to a single RETURN statement. When running out of code space meant resorting to manually loaded overlays this was a serious consideration.

Andrew Commons

Re: What's Wrong With a Loop?

@Headley_Grange

HP calculators and plotters...memories. The desktop calculators of the early 1970s were programmable. The programs occupied register space, starting with the high registers and coming down. The program could reference those registers, which led to the interesting, and immediately grasped, option of modifying itself. Take the square root of R15 and see what happens next. Now hook a plotter up.... What we learnt from that was that those early plotters were tough. :)

China Mobile, you can kiss good Pai to America: FCC to ban 'spy risk' telco from US

Andrew Commons

Re: China Mobile / Cisco

They are all doing it and they know they are all doing it. The world then gets divided into two groups - those who can manufacture stuff and those that have to buy stuff because they have lost the ability to manufacture. Those that can manufacture are obviously in a better position than those who have lost this ability. That's globalization for you.

Who needs foreign servers? Researchers say the USA is doing a fine job of harboring its own crimeware flingers

Andrew Commons

Re: This is not exactly news

@AC

The Centre for Applied Internet Data Analysis (caida) has a slightly dated list that might work for you.

http://www.caida.org/data/as-classification/

There are others and you can probably pay money for more current data.

Don't be an April Fool: Update your Android mobes, gizmos to – hopefully – pick up critical security fixes

Andrew Commons

You want to check their update policy. I've seen some suggestions that it is 2 years from device release but you seem to have gone past that point.

http://www.xperiablog.net/2018/02/01/sony-mobile-official-android-upgrade-policy-firmware/

Andrew Commons

@Dan Melluish

You should probably look at this: https://support.google.com/pixelphone/answer/4457705?hl=en

Nexus support officially ended in November; I think they did ship a December update, but it has been quiet since then. Pixel is now the supported device, but only for three years from the release of the model.

All other manufacturers have their own policies that are independent of Google's.

I think the list you looked at was misleading.

Andrew Commons

I cared about support...

Nexus 5X, latest available at the time of purchase. Support ended at the end of last year. Only Pixel getting security updates now. Nexus 5X still works so I'm now three months worth of critical updates out of date and that will just keep on going up.

Andrew Commons

Re: Pixel only

I would certainly agree that it should be much longer than 3 years, but the reality is that the majority of Android phones out there are unpatched at the operating system level because the manufacturers and telcos don't bother with the updates.

Nothing incredibly bad has happened because of this. The stagefright vulnerability was a non event. The action is in the Apps.

That could all change overnight of course but until that happens there will be no pressure to change things and the 3 years of updates matches the replacement cycle of the majority of phone users.

Andrew Commons

Pixel only

I think anything other than the Pixel series is now on its own. The 3 year security update window has closed on all other 'Google' branded phones.

What bugs me the most? World+dog just accepts crap software resilience

Andrew Commons

Re: Contracts cannot override statute

Very good points. Maybe there is an opportunity for a bit of 'innovation' and 'disruption' here in the form of service that does the heavy lifting for the consumer, a 'small claims broker'. Make it 'convenient', add an 'App', it ticks all the boxes. Create the tsunami of claims.

Andrew Commons

Contracts cannot override statute

That is true. But software companies that issue patches for critical vulnerabilities month after month are still in business.

So then we get into another interesting discussion, which I think is part of some other discussions on this piece: are the bugs actually hurting? Or are users just conditioned to the inconvenience? Or is action against the vendor not a practical option for the average user?

I've certainly experienced issues that have required days of work to recover from. Taking action against the vendor would have made that effort look insignificant. My financial capacity to take any action would also be questionable....they have very deep pockets if they are big and they can just go bankrupt if they are small.

So we come back to constructing an appropriate and proportional legislative framework to take on the problem.

Andrew Commons

Re: Reliable code

The EULA, or in olden times Terms and Conditions, have always stated that the software was not warranted to work as described on the box. When I was writing commercial software using refurbished, but still hideously expensive, microVAXes in the late 1980s we copied the DEC Terms and Conditions almost word for word. No guarantee that the software was going to work. If it didn’t work then we were going to be in deep shit so, with a few relatively minor exceptions, it worked as advertised.

The economics have changed now. Minimal development platforms do not represent 25% of the value of your house. Failure costs you nothing as a developer. And the EULAs still have that big out.

Changing that will require legislation. Safety and Privacy are possible avenues that can be used to achieve this. But consumer apathy will make this an uphill struggle, convenience and shiny will win every time.

Make America buy phones again! Smartphone doom 'n' gloom crosses Atlantic to cast shadow stateside

Andrew Commons

Global impact?

I had the same thought. In terms of percentage I think you would probably need to use a population figure excluding the very young and the very old so maybe a third is not unrealistic.

So where do the 1.8 billion replaced devices go? How much energy and pollution is involved in their disposal?

The planet is right, it needs to get rid of us, quickly.

Radio gaga: Techies fear EU directive to stop RF device tinkering will do more harm than good

Andrew Commons

Re: What's the problem....

My reference was Australian regulations. The point is that these regulations are widely accepted as a 'good thing' for a good reason.

Andrew Commons

Re: What's the problem....

Smart heaters? Smart ovens?

Botnets created from rooted routers to take down critical infrastructure.

My point is that we actually have reached the point where insecure devices can cause harm and destruction and we need to start thinking about that because there are billions of them out there.

Andrew Commons

What's the problem....

Now, electrical equipment that is plugged into the electrical grid is expected to be safe. There are regulations in place that attempt to protect consumers, and the grid, from unsafe equipment. The electricity grid has safeguards built into it to minimise the impact of unsafe equipment. I don't think anybody thinks this is a bad thing.

So why such strong objections to equipment plugging into the RF grid, which, I think, lacks the kind of safeguards that apply to the electricity grid?

We know that all the gadgets being plugged in are completely fucked. They are full of bugs and are actually dangerous when you consider how they can be exploited. So this legislation is basically saying you need to be compliant before you get on the grid...just like electrical equipment, just like cars before they get on the road, just like aircraft before they carry passengers,..... I don't hear objections in these cases.

Is this really so bad?

Cue down votes.

Nice 'AI solution' you've bought yourself there. Not deploying it direct to users, right? Here's why maybe you shouldn't

Andrew Commons

Re: "No one really understands why machine-learning code is so brittle"

The last link given in the Reg piece goes to a neat piece of research that used adversarial examples thrown at an image recognition application to conclude that it triggered on texture rather than shape.

This highlights the real problem...they do stuff but we don't know how and as a result have no idea how they will behave if the input goes off-piste. So, dressing a wolf in sheep's clothing actually works with the current technology.

Who needs malware? IBM says most hackers just PowerShell through boxes now, leaving little in the way of footprints

Andrew Commons

No, cue a policy that uses native access control to allow only those who need to run PowerShell to access the executables.

Simple and sensible.
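A minimal sketch of what that could look like with plain NTFS ACLs on Windows (the "EXAMPLE\No-PowerShell" group is hypothetical; note that deny ACEs take precedence, so anyone who legitimately needs PowerShell must simply stay out of the denied group):

```shell
:: Hypothetical sketch: put ordinary users in a "No-PowerShell" group
:: and deny that group execute access on the PowerShell binaries.
:: Deny ACEs win over allows, so operators who need PowerShell are
:: kept out of the group rather than granted a separate allow.
icacls "%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe" /deny "EXAMPLE\No-PowerShell:(X)"
icacls "%SystemRoot%\SysWOW64\WindowsPowerShell\v1.0\powershell.exe" /deny "EXAMPLE\No-PowerShell:(X)"
:: The ISE and powershell_ise.exe would need the same treatment.
```

At scale, AppLocker or similar application-control policy is the more manageable way to express the same rule, but the idea is identical: access control on the executable, not removal of the tool.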

This image-recognition neural net can be trained from 1.2 million pictures in the time it takes to make a cup o' tea

Andrew Commons

Re: You can't make a cup of tea in 90 seconds

The answer is TeaOps...

1. Rapid delivery of tea - kettle still hot from previous pot

2. Pot still warm and half full, just add more tea

3. Brew?? What's that, we don't care what it tastes like!

Linus Torvalds pulls pin, tosses in grenade: x86 won, forget about Arm in server CPUs, says Linux kernel supremo

Andrew Commons

DEC HALs

DEC were certainly users of hardware abstraction layers. They had one in VAX/VMS, I think; it was always instructive to go through the BLISS header files looking at the comments to see next year's models emerging.

Crash, bang, wallop: What a power-down. But what hit the kill switch?

Andrew Commons

Placement of kill switch and other quirks

Probably a similar timeframe - VAX 11/780s in the late 1970s. Kill switch next to a phone on the wall... Engineer makes a call on the phone, leans against the wall... lights out!

Same installation. Telecomms 'electricians' removing a cabinet. Not sure if the power is off. A large screwdriver between active and earth shows, momentarily, that power WAS on; lights out once again.

Some air-conditioning fun from same location if On-Call is interested in that sort of thing.

Object-recognition AI – the dumb program's idea of a smart program: How neural nets are really just looking at textures

Andrew Commons

It seems to be an extension of this study...

https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1006613

Keep in mind that the smallest change required to get an image classification algorithm to misclassify is .... 1 pixel.

This must be some kind of mistake. IT managers axed, CEO and others' wallets lightened in patient hack aftermath

Andrew Commons

Re: Seems legit

@Peter2

As far as I understand it, they segment the network so that if Internet access is required for work purposes then you (the employee) have Internet access. If Internet access is not required for work purposes then no access. This includes email. Devices with Internet access do not have access to the protected segment.

There are many roles that do not require Internet access in an organisation. Technical roles are often considered an exception but there are ways that this can be minimised.

Andrew Commons

Re: Seems legit

Indeed, and the western world should probably follow Singapore in removing Internet access from most public service accounts. They committed to this in mid-2016. See this commentary related to this incident:

https://www.gov.sg/news/content/internet-separation-could-and-should-have-been-implemented-in-public-healthcare-system

Cyber-insurance shock: Zurich refuses to foot NotPetya ransomware clean-up bill – and claims it's 'an act of war'

Andrew Commons

Re: An opposing point of view

That would be interesting. I suppose you could also extend negligence to include using software that you knew was faulty regardless of how much you patched it. Proving you weren't negligent does indeed become a challenge.

Andrew Commons

An opposing point of view

Interestingly there is an opinion from Marsh LLC, part of Marsh & McLennan, who are in the same business as Zurich and about the same size as Zurich, that it was NOT Cyber War.

[PDF] https://www.marsh.com/content/dam/marsh/Documents/PDF/US-en/NotPetya-Was-Not-Cyber-War-08-2018.pdf

I would have thought that contributory negligence - failure to patch - would have been the tack used by the insurance companies.
