* Posts by Sproggit

10 posts • joined 8 Jul 2012

World's most complex cash register malware plunders millions in US


Escape Route?

We're seeing more of this sort of thing every day, week and month. One thing remains curiously absent from developments, though: any form of consequence for the vendor. [Aside, perhaps, from the reputational damage - but memories seem short.] If companies were

1. Taking all reasonable steps to protect their data

2. Not grabbing data that they should not take and do not need

3. Keeping all their technology patched and secure

4. Deploying cyber controls adequate to the risks

then we would likely be seeing less of this. If these retail outlets were vehicle manufacturers shipping cars and trucks with defective brakes, you would expect to see government getting involved and prosecutions for corporate negligence in the works. So why don't we see lawmakers offering to step in and protect the little people from cyber security negligence?


AMD sued: Number of Bulldozer cores in its chips is a lie, allegedly


Re: Reread the Article (@bri)

The original post from td97402 basically called out the fact that the AMD chip design involved sharing some components between pairs of cores, seeming [to my mind] to imply that because the cores shared a branch prediction engine, plus the fetch and decode stages of the instruction pipeline, the chip didn't really have properly independent cores.

I'll repeat for the record that I'm not a chip architect and that what follows may be factually incorrect...

However, what I wanted to say was that it is entirely possible that sharing the fetch and decode units across multiple cores is perfectly reasonable. For example [here comes the fiction] suppose that, on average, each instruction takes 4 clock ticks to execute. Suppose that the fetch and decode units can each retrieve and decode an instruction in one clock tick and then switch between different threads in a second clock tick.

If this theoretical model were in any way reflective of the actual CPU, then AMD might have been able to determine that one fetch and one decode unit [with adequate state switching] would be sufficient to "service" two processor cores.
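Just to make that toy model concrete, here's a quick back-of-the-envelope sketch in Python. Every number in it is my own invented illustration, not a real Bulldozer timing.

# All figures below are made up purely to illustrate the argument above.
EXEC_TICKS = 4          # ticks a core spends executing one instruction
FETCH_DECODE_TICKS = 1  # ticks the shared front end needs to fetch+decode one instruction
SWITCH_TICKS = 1        # ticks to switch the front end to the other core
CORES = 2

# Demand: each core can consume at most one instruction every EXEC_TICKS.
demand = CORES / EXEC_TICKS                        # 0.5 instructions per tick
# Supply: the shared front end delivers one instruction every fetch+switch cycle.
supply = 1 / (FETCH_DECODE_TICKS + SWITCH_TICKS)   # 0.5 instructions per tick

print("demand:", demand, "supply:", supply)
print("shared front end keeps both cores fed" if supply >= demand
      else "cores stall waiting on the shared front end")

With those (fictional) numbers a single shared front end exactly keeps pace with two cores - which is the sort of sum AMD's architects could have done with the real figures.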

Terrible analogy: I drive a car with a relatively simple 4-cylinder 2-litre engine. The car has one fuel pump. That pump is essential to the engine, since without it those 4 cylinders simply won't get the fuel/air mixture needed for combustion. But the engine only needs one pump, since that pump is plenty capable of supporting all 4 cylinders. In a similar way [again, I have no way of knowing if this is true] the "effort ratio" between what the fetch/decode units do and what the processor does could *easily* be such that these two components can be very effectively shared.

I really didn't want to pick an argument with the original post, just to point out that there could be all sorts of design reasons for [and, in modern CPUs, there are all sorts of examples of] sharing or time-slicing components across the broader system design.

To my way of thinking, in order to show that sharing single fetch and decode units between 2 CPU cores is deliberately misleading, someone would first have to show that having one fetch and one decode unit per core could actually produce more throughput. Without this, the plaintiff's case is based on conjecture and lacks a basis in fact. Now that presents a massive problem for the plaintiff, since the only way they could demonstrate this would be to have AMD build such a chip. Which I can't see AMD being inclined to do...


Re: Reread the Article

I'm not a chip architect, but doesn't the presence or absence of duplicate numbers of those components depend entirely upon things like chipset timing?

Specifically, is it not possible to have a pre-fetch unit that is running at [in practical terms] double the clock speed of the cores? Or put another way, is it safe to assume that the throughput of the pre-fetch unit is tightly tied to that of the processors?

Let's put this another way...

If you fire up any modern task manager on an Intel Core i7 powered machine, you will see that the "core count" is precisely double what Intel claim for the chip, thanks to Hyperthreading. But what all but the nerdy are unaware of is the fact that Intel chips will typically "sleep" one or more of these "cores" in order to manage the temperature of the chip... So [being argumentative] we could argue that Intel can't claim the number of cores they do if the chip isn't designed to use them all simultaneously?
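If you want to see that logical-vs-physical split for yourself, here's a quick check in Python (psutil is a third-party package you'd need to install; this is just an illustration, nothing official from Intel):

import os
import psutil  # third-party: pip install psutil

# What the task manager counts: logical processors (2x the cores with Hyperthreading)
print("Logical CPUs:", os.cpu_count())
# What is physically on the die
print("Physical cores:", psutil.cpu_count(logical=False))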

I'm not trying to pick an argument with you, I'm just trying to offer a view that says that modern chip design has become so hideously complex that this entire [and seemingly frivolous] case seems to be built entirely on semantics.

The irony here is that anyone truly concerned with this "nth level" of performance from their CPU is not actually going to count or measure things like this, but will instead review results from industry-accepted measurement and benchmarking tools like (I think) SiSoft Sandra. [I might be a bit out of date with that example!] So for someone to come along at this point with an argument like this is not far short of spotting an ambiguity in the documentation for a 10-year-old car and thinking they can sue. Caveat emptor!


Is Windows 10 slurping too much data? No, says Microsoft. Nuh-uh. Nope


We've All Lost The Plot

If you look up the "traditional" definition of an Operating System, it is designed to provide just a few basic services:-

1. Task Switching

2. Resource (i.e. Memory) Management

3. Hardware Abstraction

4. Bound to be something I forgot

If you go back to the DoJ antitrust trial, with Microsoft fighting to argue that Internet Explorer - a *web browser* - was an inseparable part of the OS, we find the early signs of Microsoft trying to shore up their continual upgrade cycle by folding more functionality into the OS, which forces user upgrades and thus keeps the cash cow churning out income.

When we get into discussions such as this one [and it's fascinating, with lots of terrific and thoughtful insight] it's interesting to see how far the "question" has been moved by Microsoft [and others] and their marketing. Maybe the GNU/Linux model of shipping a platform that can pull in any of tens of thousands of apps in an instant even helped push MS in this direction. But the point here is that Microsoft are playing fast and loose with the definition of an OS as an excuse to push functionality that nobody wants. Once before, the DoJ took them to task for it - but it is sad, though perhaps inevitable, that nobody is even discussing that here, let alone seeing a chance of it happening.

The argument we're seeing from all the major platform vendors is, "As long as there is another platform we can argue you have a choice. As long as we can make that argument we are free to do what we want."

Sadly the simple fact is that W10 is shipping in sufficient volume, and people are using it in sufficient numbers, for MS to be able to continue undiminished. If there had been a mass movement in the industry to boycott the platform, we might have seen a partial retreat on this. But they astutely decided that a retreat from something new, shiny and free was unlikely, especially given that fewer than 1% of users will be aware of, much less care, how badly they are being abused.

I have nothing but empathy for everyone posting and sharing their frustration here in this comments thread, but I suspect the truth is that the lack of change/concern/remorse from MS since W10 launch means we can only expect this to become more restrictive and intrusive with time.

It boils down to this [and I apologise in advance for being so blunt about it]:-

Either use the product and shut up, or don't [and shut up]. Those are your only options. Don't bother to argue, complain, plead, beg, cajole, implore, impress, entreat or otherwise attempt to influence Microsoft to change W10. They are not listening. Get over it.


So what the BLINKING BONKERS has gone wrong in the eurozone?


On German Economics Between the Wars


You wrote, "It should be said here that they don't think that inflation produces Nazis, no. Because the great German inflation was actually in 1921 to 24 and that didn't bring Adolf Hitler to power at all."

I'm not an economist, but I do remember being told [by a history teacher, which of course does not make it true] that one of the primary factors in the German economic turmoil of the early 1920s was the somewhat unreasonable reparations burden placed on Germany by the Allies at the end of the First World War. I'm not sufficiently familiar with the economic trends of the time, but I believe that whilst the reparations might have looked like a firm and somewhat punitive response in fair economic conditions, they became untenable as the pan-European situation worsened into the early 20s.

Further, whilst I would be very happy for someone to provide more details and correct me if I am wrong, I also understand that one of the reasons for the present Greek financial crisis could easily be described as Germany's own fault. Specifically, at the time that Greece joined the Euro as one of the "first tranche" of countries to do so, it had in fact failed to meet the criteria for economic convergence as set down by the Bundesbank [um, sorry, ECB]. Because Germany [um, sorry, the Bundesbank... no, wait, the ECB] wanted the Euro to be successful, they turned a blind eye to the massive hole in Greek tax receipts. [The same was true, to a greater or lesser extent, of Portugal, Italy and Spain.] As a result, the Eurozone accepted into its midst not one but four economies that each carried significant fiscal risk. As if that wasn't enough, what then followed [to varying degrees] were economies [and Britain under Gordon Brown's Chancellorship was the same] that spent far more than they earned and borrowed to cover the shortfall, relishing the availability of cheap money thanks to falling interest rates.

Except, of course, that the credit had to run out at some point; the issue for the Eurozone was that the wreck happened at the same time for many countries around the world, leaving them with no room to work around the problem.

But the "root causes" of the Greek situation can be summarised with two simple observations:-

1. At the time that Greece was admitted to the Eurozone it failed to meet the convergence criteria and therefore was nothing more than a problem stored up for the future;

2. At the time that Greece was admitted, the oversight of the convergence process was handled by German banking regulators [since of course the ECB is basically the Bundesbank with a new logo over the door, and run entirely along German banking lines].

This last point is relevant given that Germany runs her economy in a way that is in harmony with ECB policy; Greece does not. Unless or until Greece is willing to run her national economy like Germany, this friction/tension/trouble will remain in the Eurozone. It is a philosophical challenge as much as an economic one.

Oh, one more thought...

If the Eurozone and the ECB don't get their act together and sort this out pronto, then other EU economies will start to drift perilously close to the trouble that Greece is in now. And just wait for what *could* happen if France gets into difficulty. Not because their economy is weak: it isn't. But because such a *huge* percentage of France's economy is reliant entirely upon the state, whilst Germany has a smaller state machine and more activity in the private sector. Trouble in France could potentially see the Euro fall apart completely...

p.s. Not an economist, or a historian, don't have a clue what I'm talking about. ;)


That shot you heard? SSLv3 is now DEAD


If only there were a way to shame companies into upgrading their security promptly.

Bank of Scotland online, for example, is still using TLS 1.0 [not exactly the same as SSLv3, but not far enough removed to be considered significantly more secure] for all its banking activity.
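For the curious, it's easy enough to check which protocol versions a site will still negotiate. A rough sketch in Python - the hostname is a placeholder rather than the bank's real endpoint, and whether your client will even attempt TLS 1.0 depends on how your local OpenSSL was built:

import socket
import ssl

def negotiates(host, version, port=443):
    # Returns the negotiated protocol string if the server completes a
    # handshake pinned to `version`, or None if it refuses.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    ctx.minimum_version = version
    ctx.maximum_version = version
    try:
        with socket.create_connection((host, port), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.version()
    except (ssl.SSLError, OSError):
        return None

host = "www.example.com"  # placeholder host, not a real banking endpoint
for ver in (ssl.TLSVersion.TLSv1, ssl.TLSVersion.TLSv1_2):
    print(ver.name, "->", negotiates(host, ver))

A site that answers for TLSv1 but refuses TLSv1_2 would be in exactly the state complained about above.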

What are they thinking?

If you try and email their support line, you get an auto-reply which begins, "Thank you for alerting us to the suspicious e-mail you have received."


How can one of the big national banks (part of Lloyds Group these days) have the temerity to operate like this?


SpaceX touts latest gear: new module, rocket demo


Program Costs

I think Kharkov makes some interesting points about the cost of the U.S. space program, but in large part I suspect they miss the truth of it. America's race into space was launched by Kennedy with two aims in mind. One was as a means of demonstrating the 'superiority' of Capitalism on the world stage - the chief rival being the U.S.S.R. at the time. The other was as a means of boosting the US economy. In economic terms, the 1950s had largely been about the US shifting back from a war economy (and recovering from the costs). Kennedy's plan was simple - pour vast amounts of public funds into the space program at the top (via NASA) and then ensure that it trickled out into the broader economy by allowing NASA to award contracts. This wasn't quite a "cost no object" approach, but remember that NASA had a sponsor (Kennedy) who was very keen to demonstrate American technical prowess with a moon landing. So the start of the space program was done on a "just make it happen" budget.

The second aspect to this is to think about contemporary capability at the time: materials science was in its infancy compared with what we've learned since. Moreover, one of the major challenges in getting the moon program started was the development of an avionics computer capable of adjusting the vectored thrust of the launch vehicle 50 times a second. With today's technology, our reaction to that would be "Pfft!" (too easy) - back then they had to develop that capability from scratch.

Since the 1950s mankind has learned that a space program isn't merely "science fiction" but that it has a broad range of commercial benefits, including better communications, SatNav, (satellite dish) entertainment and even R&D. Operating in space has become a legitimate commercial goal.

When you combine these two major factors (the pressure to be commercially viable and the *massive* advances in the relevant sciences) it stands to reason that we should be able to start today and develop a program that is massively cheaper than NASA's original 1950s offering.

However, despite all of the above, I do think that what we're witnessing today is special - but for different reasons. Consider not just the design but the entire mindset behind Rutan's SpaceShipOne, for instance: the articulated tail section is not just engineering genius, it's a very elegant solution to the deceleration/re-entry problem... Or look at Musk's idea of allowing the "spent" first stage of a rocket to retain just enough fuel to safely land itself... It's becoming clear that we've moved beyond the occasionally wasteful "just enough to work" mentality of the "public sector" space program of the 1950s, and are now looking at a commercially viable future.


Haswell micro: Intel’s Next Unit of Computing desktop PC



It looks as though Intel are leaving the best of their kit for others (Apple) to debut. When you look at the best of what's been announced (i.e. Haswell with Iris Pro 5200 graphics), the only vendor selling actual kit is Apple, with the MacBook (inc. Air/Pro) ranges that include the Retina Display.

Hopefully 2014 will see availability spread to other manufacturers. When it does, this particular offering will be consigned to the parts bin. Personally, I'd wait for something like a 16GB, i5, 5200-based Intense from fit-pc.com. That would be entirely passively cooled (completely silent) and have two digital monitor output ports - likely HDMI and DisplayPort. The 5200 GPU has the bandwidth to comfortably support a pair of 1920x1200 monitors (or a single 2560x1600) and we'll likely also see something with decent optical output.
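The pixel arithmetic behind that claim, roughly - 60Hz refresh assumed, and blanking intervals and colour depth ignored, so treat it as a ballpark only:

# Rough pixel-throughput comparison at an assumed 60Hz refresh rate.
dual_1920x1200   = 2 * 1920 * 1200 * 60   # ~276 million pixels/s across two monitors
single_2560x1600 = 2560 * 1600 * 60       # ~246 million pixels/s for one big monitor
print(dual_1920x1200, single_2560x1600)

In other words the two setups are in the same ballpark, which is why a GPU happy with one should be happy with the other.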

This gets us close to an ideal configuration for, say, a web developer, with one screen for a browser and one for an IDE, with decent sound output and no noisy fans. I can live in hope, I suppose...


Tesla's Elon Musk v The New York Times, Round 2


The Balance Of Evidence

This second round of claim and counter-claim brings some useful additional data to the table. Fortunately, we can ignore most of it, as stated here, as a bit of "he said, she said". What we can't do, however, is ignore the facts.

Our NYT journalist, in their article, made very clear mention of the fact that they used cruise control in order to preserve battery life. If you look at the analysis data provided by Tesla, it is impossible to spot any period of the test drive during which cruise control was active - the vehicle speed is just a series of irregular spikes, even when on a sustained run. Point 1 to Tesla...

Where our NYT journo did get specific about aspects of the journey, he was very clear and precise in reporting vehicle speeds, taken from the dashboard of a car on which the speed is very, very easy to read. The data from the trace - and this is clearly visible - reports very different speeds. Point 2 to Tesla...

Our journalist is also very specific about the timing of charges during the journey. Once again, the Tesla trace data reports this very differently. It is important to note that in his latest response our journo replies that he was doing what the people from Tesla told him to do... or that he stopped charging when the range indicator said the vehicle had enough charge... On the surface, that almost makes sense. However, if you gave me a car capable of 40mpg for a test drive, and I put 2 gallons of fuel in the tank for a 50-mile drive, I think you'd agree that in reasonable conditions it would get me there. But if I drove around at 6000rpm in 1st gear, chances are it would not. So the response that "I stopped charging when the gauge said I should get there..." is a little specious if the unfinished remainder of the sentence is "... and then drove like a plonker to ensure I wouldn't." Point 3 to Tesla.
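Spelling out the arithmetic in that analogy (every figure is my invented illustration, nothing to do with the Model S itself):

rated_mpg = 40        # what the car can do, driven sensibly
gallons   = 2         # fuel put in for the trip
trip      = 50        # miles to cover

print("Driven sensibly:", rated_mpg * gallons, "miles of range")  # 80 - comfortably covers 50
thrashed_mpg = 15     # hypothetical '6000rpm in 1st gear' economy
print("Driven like a plonker:", thrashed_mpg * gallons, "miles")  # 30 - you're walking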

Final thought. Whilst I'd concede that Musk's rebuttal is a bit heavy on the righteous indignation, it is very clearly supported by graphically presented, factual data, captured from the actual vehicle performing the actual test drive. I notice with keen interest that whilst the journalist is very heavy on responses, at not one point does he respond with: "Your data is wrong." That speaks volumes.

[ Oh, FWIW, I'd consider myself a complete petrolhead and have zero interest in EVs... but in this case it looks like a journalist trying to make up a salacious story and being caught in the act... ]


Review: Samsung Chromebox


And The Target Was?

Lots of interesting comments already made, and I agree with the assessments of the review and the comments that question the point of this product. Maybe we all got it wrong by considering it to be aimed at consumers? Suppose it was aimed at Apple instead - a device intended to compete with the latest incarnation of the Mac Mini, but one which was critically compromised by the choice of OS, limited thanks to Samsung's partnership with Google over Android?

As others have posted, there is no shortage of viable (and much better) alternatives out there.

Quite a few people mention the Acer Revo PC. I had one of these, but just replaced mine with the Shuttle XS35. The advantages of the latter are:

Faster Processor

Completely silent fanless design

More powerful on-board graphics extends display capability to 1920x1200 pixels

Add an SSD and you've got a seriously quick, fully featured machine capable of stand-alone operation.

I partnered mine with Ubuntu 12.04 (works fine with Mint, too) and I get accelerated graphics courtesy of an Intel driver, snappy response and ultra-low power consumption.

Yes, I must concede that it costs more than the Samsung - especially if you opt for a decent SSD - but it's so worth it.

This Samsung mess has just got to be prompted by the desire to poke Apple in the eye, as opposed to real consumer demand...