Re: So, Apple is lazy and greedy, Samsung a victim?
4 thumbs up and 8 thumbs down – people don't like the truth!
423 posts • joined 6 Jun 2014
Register presents Apple as lazy and greedy, but takes pity on Samsung. That is 180° wrong. The only innovations left in this space are high-end features, so we see new high-end products attracting a high price. With Apple you can still buy the previous models. The truth is there is little innovation left to be made in this space. Apple did such a good job of innovating 10 years ago that not much else could be done.
Along come Samsung and others. Samsung really is greedy and wants to take over the whole world. Samsung is lazy: it copies Apple. There are no famous names at Samsung, as there are at Apple, Microsoft, IBM, DEC, Burroughs, Unisys, etc. Samsung lets others innovate and then copies. Register puts it down to Samsung not being dependent on phones, since they can cross-subsidise from other markets.
Anything based on Android is also subsidised by advertising. They collect data on you and make you the product.
These indeed are concerning times for anyone involved in the IT industry. What should be very helpful to mankind is becoming a millstone around our necks.
I'll give Register some credit for once. An article that just states the facts, no editorial.
Now if it were an Apple story you could guarantee much editorial on how Apple is evil and anyone who buys Apple is an idiot, followed by much trolling and vitriol (vitroll, I just made that word up!) in the comments.
I'm not sure this is the right approach for the future. In fact, it seems more political than anything. Politics aside, what we need is not so much CPU architectures as entire system architectures. Traditional CPUs were designed for old-style computing, when scientific calculations were run and the machine really was dedicated to that one job.
But now we have multiprogramming and new apps being loaded onto machines by innocent users who know little about security. Current CPU architectures (and languages) offer very little in the way of security, being based on the thinking that you own the whole machine and can see the whole of memory as a flat space.
Modern systems – even at the low level of an OS – need structured memories that respect boundaries. A quick browse through the RISC-V documentation revealed no clues as to any such support for real modern computing.
That seems a shame to me, and most probably a lost opportunity to rethink things. Performance is still put way ahead of user protection, which – built into the lowest levels of the architecture – could be implemented in the most performance-effective way, rather than piling on layers of software that are far more effective at sapping CPU cycles.
You could look at this idea as the inverse of distributed computing. Instead of a process being distributed, many processes are implemented on a single machine on virtual processors (this is hardly a radical idea either), but the very ability to do this is baked into the system (CPU) architecture. Smalltalk was also an attempt to view the world in this way.
Once systems are designed and implemented in this manner, real distribution becomes easy (but that is another subject).
These are not really new ideas though – they need revisiting.
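To make the idea of memory that respects boundaries concrete, here is a toy Python sketch – my own illustration, not any real CPU, and the `Descriptor` class and its names are invented – of descriptor-based access in the Burroughs style, where every access goes through a bounds check that cannot be skipped:

```python
# Toy sketch of descriptor-based memory protection, loosely inspired by
# Burroughs-style architectures: every access goes through a descriptor
# carrying a base, a length, and access rights.

class Descriptor:
    """A hardware-style segment descriptor over a flat memory array."""

    def __init__(self, memory, base, length, writable=False):
        self._memory = memory
        self._base = base
        self._length = length
        self._writable = writable

    def _check(self, index):
        # The mandatory bounds check: there is no unchecked access path.
        if not (0 <= index < self._length):
            raise MemoryError(
                f"bounds violation: index {index}, segment length {self._length}")
        return self._base + index

    def read(self, index):
        return self._memory[self._check(index)]

    def write(self, index, value):
        if not self._writable:
            raise PermissionError("segment is read-only")
        self._memory[self._check(index)] = value


flat_memory = [0] * 1024                       # the machine's "physical" memory
seg = Descriptor(flat_memory, base=100, length=8, writable=True)
seg.write(3, 42)
print(seg.read(3))                             # 42
try:
    seg.read(8)                                # one past the end: trapped
except MemoryError as e:
    print("trapped:", e)
```

A real architecture would do this in hardware at access time; the point is that the check is structural, not something software layers bolt on afterwards.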
"It would be no different than GM being sued because you can't put a cheaper Kia engine in your Cadillac"
Actually, the App Store does allow you to do that – but Apple makes sure the new engines are safe to use. That seems like a good compromise.
Meanwhile the lawyers are set to make good money!
"...when it fails, is unrepairable and goes into landfill in a year or two."
The only thing going to landfill is rubbish comments like this.
Did you ever stop to consider that making things repairable makes them more susceptible to failure in the first place? Plug connectors fail. Things that are soldered into place are less likely to fail.
Macs do not fail after 2 years. I have given a few of mine to relatives and they have lasted over 10 years.
Then when they are past their life, there is a good recycling scheme in place so they don't go into landfill.
Are they really overpriced? Stop and think about the amount of technology in these devices both hardware and software. Then there is the factor of miniaturisation – the smaller the form factor the more expensive.
Microsoft's and Apple's prices reflect the costs more directly. Others cross-subsidise from advertising, from selling your details, and from other parts of their business, and then use off-the-shelf software not tailored to the end user.
Everyone always wants to pay less for any item they buy. That does not make them overpriced.
It sounds like something is not equal in your story. What were the service contracts you entered into? Was it a very old iPhone compared to a new Samsung? Was the Samsung under a corporate service contract and the iPhone an individual one? Reading between the lines, I get that impression.
Now Samsung are worried that others will do to them what Samsung has done to the market. Of course Samsung can see this happening to them – they know the tactic well. Bring out some cut-price hardware to destabilise the incumbents (they have not been 100% successful here). Make prices cheaper by not having such good software or support – the hardware looks good in a shop, and most people do not go into all the complexities of computing and software; they decide on 'swankiness' and price.
So now Samsung is acting like an incumbent trying to deflect the attacks of others who are now using Samsung's exact tactics.
"the manufacturer doesn't make lots of money"
Which manufacturer would that be?
Apple puts more than others into R&D. The others copy that because it is cheap. The others make money out of selling your data.
"rather than the Apple phone that just sends the vast majority of the overinflated price to the bank..."
That is just a stupid comment.
"Unless you're a sad loser who likes spending too much money on stuff just to look 'hip' at a coffee bar"
That's a silly thing to say.
Besides, with your story we'd have to take your word for it. Then look at why there might be a £9 difference. Maybe they are selling cheaper because they are getting a kickback.
Yes, that happens in this crazed world where companies are making you buy cheap junk.
>>>Re: Define "win"
"The Failure of Open Source
Open-source software is supposed to promote the idealistic notion that software should be freely available and cheap for all. It is actually achieving the opposite effect. Here is why. [...]"
"While Google might have developed Android (???is it open source???), Android is mainly based on Linux (more warm, fuzzy open source sentiments) – a system developed for speed, not security. "<<<
>>>BS of the highest order.<<<
Really? Don't you have anything sensible to say? You have to provide a counter-argument. The fact is that Linux IS built for speed, not security.
Try to have something intelligent to say.
"The Lisa was actually a commercial failure"
The Lisa was a very good machine, really better than the Macintosh. I saw lots of them. But the price was not right, and could not be made lower. It needed a 10MB (from memory) ProFile disk drive. Not a bad machine for $10,000, and a 1/10th of the cost of what Xerox could do.
But the brilliance of the Mac was to do it for 1/10th of that.
The expansion slots of the Apple II were also among its worst features, though probably not at the time. Any extra hardware like that is now built into the motherboard, or provided by software.
Glad to see you are reading the history. But what Microsoft did to Apple and what Apple did to Xerox are completely different things.
"Noah Webster had a tremendous influence on American spelling, and he was down on "ou"s sounded as "o". I do see your point: yet the English-speaking world does manage to get along with quite a few words spelled and perhaps sounded the same."
Some words share spelling but are pronounced differently. The most prominent one in this industry is router and router. One is based on rout, pronounced 'raut', and refers to putting an enemy to flight. The other is based on route, pronounced 'root', and we use a lot of these in the Internet. This is related to routine, and no one says 'rautine'. There are plenty of examples where you (there's one) pronounce 'ou' quite differently: you, group, routine, should, tour, through.
The final 'e' on a word most frequently changes the pronunciation of the final vowel:
on(e), plan(e), hop(e), rag(e), sit(e), din(e), min(e), pin(e), sin(e), quit(e), rat(e)
Thus rout and route are different words that should be pronounced differently.
"Bill Gates for Windows claiming he'd copied Apple's idea to which Bill pointed out they'd both been to see the Xerox OS"
No, Apple did not steal from Xerox, but Microsoft did steal from Apple. I don't believe Gates saw anything before Jobs demonstrated the Macintosh to him.
Douglas Engelbart invented the mouse around 1963, not Xerox PARC. Jef Raskin at Apple was doing similar work to PARC's and knew those guys. Raskin did his Ph.D. in the 1960s on the graphics package that became Apple's QuickDraw. It was Raskin who suggested to Jobs that he take up PARC's invitation to go and see what they were doing.
PARC invited industry players including Apple, Tektronix, and IBM to view their work, because they had been ordered by Xerox HQ on the East Coast to drop what they were doing - it wasn't Xerox's core business. Tektronix and IBM didn't get it, but Jobs did. The Xerox PARC guys were amazed that Jobs got it when Xerox, Tektronix, and IBM didn't. Some at PARC realised it was the end of the road there, so people like Alan Kay and Larry Tesler left PARC to further this technology at Apple. They went on Apple's payroll, so were rewarded for their efforts.
Apple still took considerable risks to develop this technology. The other part of the story is how PARC machines cost nearly $100,000, but Apple managed to put it in a machine selling for $10,000 (the Lisa), and then $2,000 (the Mac).
Apple also did not exactly copy the PARC interface. Pull down menus at the top of the screen were Apple's innovation.
Now, Bill Gates did illegally copy Apple's stuff - particularly QuickDraw, which was Raskin's.
"I built one recently with my daughter"
I commend you for your educational efforts, but do remember that computing is about far more than building hardware. It really is about end-user experience and software that is independent of hardware.
I hope you enjoyed it - I wish my soldering skills were better, but I hardly ever have to do any these days.
"materials cost $443"
That's just materials. There are lots of costs on top of that: assembly, transportation, and the wages of the many people involved.
I knew this article, which has nothing to do with current technology, would attract the usual irrational anti-Apple criticism and hatred so frequently seen, and this proves the point.
"People being blatantly ripped off"
No, you are using dodgy figures. It is also not as if you have to pay this. I'm quite happy with my iPhone 7 for the next few years. Apple also have cheaper options.
People say Apple is not innovating. But since all the basic phone capabilities are out there, the only innovations must come at a cost.
Remember the competition sell their phones cheaply because their business model is based more on advertising and selling your data. So you pay in ways other than the initial cost.
"you'd be better taking your money and setting fire to it"
Now that really is silly and shows you don't have much to say.
"How'd you work that out?"
To what comment are you referring?
It is undeniable that Samsung have copied most of their stuff from Apple via Android. A few things Apple does later. But usually when Apple does something after another company, it is because the other company has rushed a half-baked product to market. This started with Windows 1.0, rushed out to beat Apple, but pathetic compared to the Mac when it came out.
Lately it has been face recognition. Samsung had a form of it first, but it could be fooled with a photograph. Apple spent extra time to get Face ID right.
"revisionist history of the PC as well"
What 'revisionist' history are you talking about?
The history is that IBM wanted to put Apple out of business like many other companies they had put out of business.
Read Richard DeLamarter's "Big Blue: IBM's Use and Abuse of Power".
It is a fact that Microsoft stole source code from Apple.
Nothing revisionist about what I said.
The original PC might have been aimed at businesses, but it was still IBM's attempt at taking the PC market and killing Apple. They only did it for that reason. IBM did not want to cannibalise its own profitable market, but nor did it want to leave another company to do so.
"OTS component based business machine"
The business observation has some truth. Windows smacks of an office machine where a worker just comes in and does a small set of tasks directed by the machine.
"Microsoft stealing Apple code?"
Absolutely. It sounds like you don't know the story (or are denying it). Gates and Microsoft did steal Apple's code. Gates wanted access to the Macintosh source code to develop Word. That code then turned up in Windows; it was used without any agreement from Apple, and it heavily used Jef Raskin's QuickDraw, which was not part of Xerox's work.
Now Gates put around that Apple had stolen from Xerox. That was not true, but many anti-Apple people perpetuate this myth.
Windows 95? We are talking about the early 1980s.
"Under the hood was totally different" proves nothing. Similar software can run on very different hardware. It's basic computer science.
"It can only be a matter of time before certain Android phone makers slavishly follow suit."
That sentence neatly sums up the problems in this industry. Apple does research and develops products. Others see the success, greedily want to grab those profits, and simply copy instead of doing things their own way. Their business model is "put Apple out of business so we can grab ALL the profits". That is not a good business model, and it even killed IBM (the ridiculous PC was IBM's attempt at killing Apple, and they almost succeeded; Microsoft tried the same with the cheap Windows knock-off, which even stole Apple code).
Perhaps when anyone criticises Apple customers as Apple 'fanboys', they should consider that it is the other manufacturers who are the real Apple fanboys – and any of their customers demanding a product because it looks like Apple are Apple fanboys too.
Is Linus really committed to quality?
If so, he would have adopted a microkernel approach for security, rather than the monolithic kernel for speed. That is speed over quality and security.
He seemed to use the same tactics in the argument with Andrew Tanenbaum.
Trump says "The US is the world's piggybank". Well how does he think the US filled that piggybank in the first place? Empires have always syphoned funds from the rest of the world.
Now the world is paying taxes to the US on everything, mainly in the form of transaction charges.
Programming is not coding anyway. Coding is what a code generator in a compiler does.
But programming does not have to be imperative; like functional programming, it can be declarative. This means telling the computer what you want, rather than how to do it. Relational databases are also founded on this principle.
We need to get away from code.
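A tiny Python illustration of the difference (the data and field names here are made up): the imperative version spells out how to loop and accumulate; the declarative version, like SQL's `SELECT customer FROM orders WHERE total > 50`, states what is wanted and leaves the how to the language.

```python
# The same query expressed two ways: imperative ("how") versus declarative ("what").

orders = [
    {"customer": "alice", "total": 120},
    {"customer": "bob",   "total": 45},
    {"customer": "alice", "total": 80},
]

# Imperative: spell out the loop, the accumulator, and the branching.
big_spenders = []
for order in orders:
    if order["total"] > 50:
        big_spenders.append(order["customer"])

# Declarative: state the condition; the iteration is the language's problem.
# (Compare SQL: SELECT customer FROM orders WHERE total > 50)
big_spenders_decl = [o["customer"] for o in orders if o["total"] > 50]

assert big_spenders == big_spenders_decl == ["alice", "alice"]
```

The second form is also what an optimiser can work with: since it does not commit to an evaluation order, the system is free to choose one.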
"The chip also appears to support pointer authentication, a feature Arm introduced to its Armv8.3-A"
I looked at that link to the ARM slide presentation.
My impression (could be wrong) is that it is like prison officers knocking off at 5 pm and asking the prisoners to lock themselves in the cells at 10 pm. It looks like the checks are optional.
Yes, we need to get away from insecure pointer languages like C and C++, but we also need to make non-optional checks in hardware.
We need architectures like the Burroughs B5000, which has been with us since 1963 (and still going). Any program that violates security is unceremoniously dumped.
Tony Hoare also notes that protection should never be turned off.
When many CPUs were designed in the past, the emphasis was for speed for scientific applications that basically ran on one machine. There was not the need for security.
We now need architectures and programming techniques that build in security (and correctness). We should not be worried that processor cycles are spent on security checking. Security has become fundamental in the modern world of computing. We should stop ignoring it for the sake of performance and optimisation.
But bringing about this change needs a change in attitude across almost the entire industry. The concepts of security, software correctness, and verification have been around for over 50 years, but have been largely ignored in favour of dubious notions like 'programmer freedom'.
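As I read the slides, the signing-and-checking idea is roughly as in this toy Python model – my own sketch with invented names, not ARM's actual algorithm, which packs a truncated signature into unused address bits rather than carrying it alongside. A pointer is signed with a keyed MAC when created and must be authenticated before use:

```python
# Toy model of pointer authentication: sign a pointer with a keyed MAC,
# verify before use. Purely illustrative; not ARM's real PAC scheme.

import hashlib
import hmac
import os

KEY = os.urandom(16)  # per-process key, held by the "hardware"

def sign(ptr: int, context: int) -> bytes:
    """Produce a short MAC over the pointer value and a context value."""
    msg = ptr.to_bytes(8, "little") + context.to_bytes(8, "little")
    return hmac.new(KEY, msg, hashlib.sha256).digest()[:8]

def authenticate(ptr: int, context: int, sig: bytes) -> int:
    """Return the pointer only if the signature checks out."""
    if not hmac.compare_digest(sign(ptr, context), sig):
        raise RuntimeError("pointer authentication failure")
    return ptr

ptr, ctx = 0x7FFF1234, 0
sig = sign(ptr, ctx)
assert authenticate(ptr, ctx, sig) == ptr   # legitimate use passes
try:
    authenticate(ptr + 8, ctx, sig)         # tampered pointer: rejected
except RuntimeError:
    print("tampering detected")
```

The complaint above is that in the real scheme the authentication step is an instruction the software may or may not issue; in this sketch, `authenticate` is the only way to get the pointer back, which is the non-optional behaviour I am arguing for.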
"So with C you have a chance of writing correct software if you are _really_ careful and have good experience in at least one assembler."
You can write correct software if you can hold all the details in your head and are really careful. However, there is a contradiction in your sentence: this complexity is actually brought about by C, because C gives little help in this area where a modern language would.
Programming is also not about machine code or assembler. Maybe knowing those helps with C, because C exposes these details and expects you to think in this way. But that is precisely what makes C an outdated and ultimately insecure language.
C and C++ have long ignored the modern issue of software correctness. These are bugs that could be automatically discovered and guarded against, but are put there by unwitting programmers. The C philosophy has always been some misguided notion of 'programmer freedom'. But this is really naive and stupid – malicious hackers can use these bugs to attack systems, and what is more, C and C++ are the perfect hacker tools. Software correctness and security are two sides of the same coin.
It is time the industry moved to modern tools and languages.
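A concrete example of the kind of bug at issue, with Python standing in for any bounds-checked language: the classic off-by-one loop that C compiles without complaint and executes silently, scribbling past the end of the buffer.

```python
# In C, `for (i = 0; i <= N; i++) buf[i] = 1;` with a buffer of size N
# silently writes one element past the end. A bounds-checked language
# refuses the bad access at the moment it happens:

buf = [0] * 8            # buffer of size N = 8
try:
    for i in range(9):   # the `<=` mistake: one iteration too many
        buf[i] = 1
except IndexError:
    print("off-by-one caught at index 8")
```

The first eight writes succeed and the ninth is trapped, so the bug is discovered automatically instead of becoming a latent vulnerability.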
So the author wants to downgrade from Pie to Oreo. That says a lot about where Android is going. Releases are named after junk food, and Oreo must be the most tasteless cardboard of a biscuit, appealing only to children who like sugar and know no better. Seems to sum up Android.
Remember the three views of computing: "Computers are there to control people" – IBM, and then Microsoft. "Computers are meant as tools for people, to augment their intellect" – Doug Engelbart, then Silicon Valley, Xerox PARC, and Apple. "AI will replace human intellect" – John McCarthy (LISP) and the AI researchers.
Well, I'm sick of AI 'helpers' popping up and telling me what they think I might have wanted. You can see this in the way Google and Facebook do things – they are based on the push paradigm. Sometimes, yes, you want to be notified of something; but more often than not it is just advertising that should stop. Marketing people think pestering you is doing their job, from stuffing waste paper in your letterbox to popping up all the time with what they think should be useful.
We need an electronic "No Junk Mail" which says NO AI!!!!!!
People need to take control again. Let the systems know "No, I'll tell YOU when I want something".
“the Cupertino idiot-tax giant”
Stop insulting people for buying Apple by calling them idiots. Apple purchasers buy Apple for good reasons, including overall quality of software and hardware (software always comes first).
Calling them “idiots” implies they have not really looked into things. Well, all the things that go on in the IT world are difficult to determine. But how do the others get their prices low? Because they advertise and pass your details on to third parties. It is naive not to ask where the cost of your appliance comes from – with Apple you own the appliance; with advertising-subsidised cheap prices, your appliance owns you.
Now, I’m not going to insult those who buy these other products by calling them idiots; rather, I’ll try to educate them about what is really going on.
It seems the Register is wilfully naive, or maybe just being idiots. Usually those who repeat the same thing over and over are just trying to promote a lie. The ‘idiot tax’ phrase is used over and over by the Register. It is not true, and it is insulting. Has the Register been taking lessons from Trump?
Biting the hand that feeds IT © 1998–2019