Re: You try to block our products
Intel could get some Knights Whatsit wins, replacing Tesla? Or is that too far-fetched?
26 posts • joined 16 Aug 2011
Looking at the rainy day video, I would say the engineers have successfully achieved the simulation of a driver with about 0.2% blood alcohol level.
Can you be both relieved and frightened at the same time? A driverless future seems much farther away than I feared.
It is the AI controlling the nukes that is the dangerous bit. Obviously.
Oh good. Expect reverse engineered zero day exploits (well, practically "zero day") in 3...2...1...
I don't know how reliable the numbers are. Could be IBM used some creative accounting to put some of the incoming bad stuff on the recently-sold division, so that the other divisions as well as future quarters will look better.
Nah... I'm sure Big Blue would never do anything so shady.
I understand not wanting to utilize shingles (because they have technical drawbacks) or the Kinetic drives (because they require that bugaboo, software work), but the argument to avoid helium drives was not very convincing.
Who cares if there is helium inside or not? If your competitors offer the option of 6 TB drives in their arrays, you would be a fool not to sell them just because of some (probably temporary) single-supplier risk.
Perhaps the risk comes up in contract negotiations with Hitachi (how to guarantee the risk does not actualize) - but really, it is the same thing as with, say, x86 server processors for the past few years. Intel has had long periods of performance dominance over AMD, but I don't see server vendors refusing to sell any Intel processors that are faster than AMD processors (just in case Intel decides to raise their prices)...
Meanwhile, the 28 nm GTX Titan apparently contains about 7 billion transistors, and Nvidia's GTX 680 about 3.5 billion (the GTX 680 on a die slightly under 300 square mm).
The Xbox One GPU is far less powerful, but it will still eat up a significant amount of space. Then there is the 8-core CPU, not to mention the cache that takes up half the transistor budget... To me, it looks like the NSA had to settle for a few hundred million transistors at the most.
It's not just about benchmarketing. If you could get, say, 10% better battery life simply by compiling everything native with a better compiler, everyone using Atom-based Androids would benefit. And Intel *should* know all the tricks about minimising the power consumption of their own chips better than anyone else.
Windows 8? I upgraded immediately. Speed was the most important reason: it just works quicker than Windows 7, especially on slightly older hardware.
The UI mostly worked ok, with the occasional annoyance caused by the Modern world intruding on my workflow. So after six or seven months of trying to come to terms with the inconsistencies, I gave up and installed the Start8/ModernMix combo. Especially the windowed Modern apps allowed by ModernMix removed just about every annoyance I had, and suddenly, for me, Windows 8 became the best Windows version ever: fast, stable, and usable.
Now, based on what I have read about Windows 8.1, it seems I will still be using the Stardock add-ons. This is quite odd - after all, the UI clearly does not need major updates to improve the user experience in a dramatic fashion, and the Windows 8 UI could do with some old-fashioned evil empire embrace and extend.
I will give the UI tweaks in 8.1 a chance, but I'm worried that there is no one left in power at Microsoft that can actually look outside their Microsoft-centric bubble and copy someone with good taste. They clearly can't do it on their own...
Java? The language that has outsourced all the insanity into the build and dependency management systems and the thousand libraries, frameworks and domain-specific languages of varying quality and ancestry, many of which still drown you in AbstractProxyFactoryInterfaceBeans? The language with probably the most lines of bad code written in the past decade? With a dozen tribes gathered around different ecosystems, separated by a common language...
The biggest performance jump since the late 90s happened sometime over the past four years, and the low-cost and mainstream (prebuilt) PCs have missed out so far. No wonder non-gamer people are hanging on to their old PCs longer than before, there's not much difference between a Core 2 Duo and an Ivy Bridge when everything is slowed down by a spinning disk.
However, I don't really see the point in sticking with pre-Core 2 Duo hardware at this point, when you can pick up an old C2D system for not much money at all. That gives a noticeable jump in performance and power efficiency compared to anything before 2006, and most likely gives you SATA II ports to use for - yes, you guessed it - an SSD system drive.
You are probably correct. The 4" form factor in an ultra-thin phone is clearly one of the reasons why the iPhone 5 has so much worse battery life than the S III, even with a non-user-replaceable battery (5.45 Wh vs 7.98 Wh battery capacity).
Saving on screen resolution and CPU frequency may be the only way to get an acceptable compromise between screen size, battery life, and "sexy" thinness.
I expect to see 4k TVs in all the most popular form factors, eventually even down to 32". It doesn't matter if there is a detectable difference at 32" between upscaled 4k and Full HD; some shills and fools will always find some freeze-frame, zoomed-in picture where things look "sharper". Before you know it, the common wisdom of the Internet has shifted to recommending a 4k TV.
I guess this is just the way marketing works nowadays.
In hindsight, the Full HD label was very successful for the TV makers, but the 3D didn't get so much traction. There was very limited native content in either 1080p or 3D, but perhaps the difference was this: nobody believes the autoconversion from 2D to 3D really improves the picture or even produces a watchable result, while people are willing to be persuaded that upscaling to a higher resolution improves picture quality.
No wonder they are going back to the "improved resolution" marketing. OLED might prove to be surprisingly hard to sell to mainstream TV buyers, even if the real PQ improvement potential going from LED LCD to OLED is far higher than going from 1080p to 4k.
Lock-in is one thing. Not being backward compatible with yourself is quite different. That is just about the biggest FU from a company towards their customers, and almost the opposite of lock-in - perhaps you could call it lock-out?
With IE6, there is some security justification for Microsoft abandoning the old interfaces - they had painted themselves into a corner and only had bad options.
But there was absolutely no reason to orphan Visual Basic 6, except for language snobbery! I still see plenty of well-working, business critical VB 6 applications, often written by bright people who were only semi-pro programmers. The apps are often quite nicely structured from the point of view of extending the application, but would be a nightmare to convert to .NET or whatever.
Metro apps with Windows 8, as well as each new, improved version of Windows Phone, are just this philosophy taken to its logical conclusion: with each new version, you have to leave the old apps behind and start from scratch.
This is the kind of philosophy change that is preventing the old, well-oiled, ruthless, lock-in leveraging, world dominating Microsoft money machine from effectively competing with Apple and Google. You can't build momentum if you have to start over every two years. And if you keep screwing the developers, they will stay away from your world until and unless you actually have a market position too big to ignore.
It's always been the curse of the successful Windows developer: if you create something too popular, Microsoft will create their own version and include it in Windows for free. I'm sure that is a big reason why Gabe Newell is so pissed off at Microsoft - he's been on the inside and knows how ruthless Microsoft can be.
Why is this bad for anybody except Valve employees, then? There's a reason Steam is currently the runaway #1. Valve's response so far has been exemplary: instead of suing Microsoft, they add value for their users by releasing an OS X version of Steam, and are making noises about releasing a Linux version. None of that may matter - Steam may fall in the end and Valve have to resort to lawsuits - but it points to a work culture that is able to adapt and innovate.
GOG is another example of a "good", innovative company. It would be a pity if Microsoft manages to kill off these kinds of stores. You just know their store will be much more corporate and never quite as good, and you just know Microsoft *will* throw developers under the bus if and when the internal strategy changes (see: Windows Phone 6.5 apps, Windows Phone 7 apps).
Plenty of happy ZTE Blade owners out there, and quite a few will start looking for a new phone in the next year or so. No need for them to look for Samsung or HTC or whoever if they want to upgrade - ZTE is already a proven brand for them.
Perhaps there will be some handholding in the release version to try and help 60-somethings (and 30-somethings and 15-somethings, for that matter!) get to grips with the new UI. But if you have to support your parents or other folks in computer matters, tell them to stay away from Windows 8. You don't want those nightmare support calls, trying to teach them to use the new touch-centric UI with a mouse.
Really, I think the only way to teach someone like my father to use Windows 8 with a mouse would be to give a training course in the essential keyboard shortcuts - it would be like WordPerfect 5.1 for DOS, which he mastered with the help of a keyboard template (anyone remember those paper templates you could put over the keyboard to show which function key did what?). This is progress?
I think it shows more that once you lose momentum by making a few serious missteps, turning corporate, and/or becoming uncool, it is very hard to turn that around. And if there are no real competitive barriers to entry, you can easily go all the way back to zero. Not just web companies - RIM and Nokia have fallen *hard*.
So from that perspective, I actually don't think Facebook is the best example of a bound-to-fail-eventually swindle (mild but sustainable profitability for some time would actually be a good sign to me), and I have no opinion on github and their funding. Groupon, on the other hand, seems like the canonical example of a bad web company - importantly, most people seem to have already noticed that, which means they will have major problems turning it around.
OTOH Ebay, as an example, has the kind of momentum/inertia from its scale which has proven comparatively resilient to their corporate boneheadedness. If you have an addicted customer base, like Activision with their WoW subscribers, you can really go into corporate plunder mode without (immediate) effects. And finally, if you are part of a monopoly or oligopoly like the phone/cable companies, you can do almost any kind of evil and benefit seemingly endlessly. Perhaps that is part of the problem - if a rising web startup without that kind of inbuilt advantage looks at these kinds of "success stories" and tries to follow a similar path, they are very likely to fail.
The usual "value" argument about Macs is that while it does cost a bit more to buy, you can also sell it on for more money than a same-vintage PC - and you come out even or ahead of the PC buyer while getting to use Apple hardware. Meanwhile, the older sold-on Mac, especially with some well-thought, cheap hardware upgrades, will be much more useful than those nasty crashing Windows PCs of the same age...
There is some truth to this - I recently bought a four year old, 2007-model Mac Mini (first Core 2 Duo) for about 200 euros, about 40% of the original price, and the little thing is almost ridiculously fast after replacing the 80 GB, 4200 rpm hard drive with a 128 GB SSD for about 80 euros (of course, you could do this upgrade with similarly pleasing results on most any PC of the same vintage, but the point is the "greenness" of extending the useful life of old hardware).
If you can't do this any more in the future, it is wasteful for the environment but also for the wallet of the Apple buyer: the value of used Macs is going to plummet - you definitely won't get 40% of your money back in 2016 if you spend 2500 euros on a Retina Macbook Pro today - at least if you can't replace the dead battery, add more memory, or swap a bigger SSD to the old Retina.
"...what 'complaints' would those be..?"
Lack of keyboard backlight, keyboard feel, touchpad feel, display quality and resolution, as noted above. I am of course talking about the last Zenbook generation. What impresses me is that Asus is addressing the complaints and even exceeding expectations. If they continue doing this - removing pain points and delighting their users - Apple will have a real battle on their hands. After all, they are used to the competition just making incremental improvements - a faster chipset, faster RAM, faster SSD, faster USB, faster graphics between generations - something that just improves a few benchmark scores, basically. So from that perspective Apple is just keeping up with general technological progress with the 2012 Airs.
What annoys me about this particular case is that Apple is overpricing their new Macbook Airs. Not against some nebulous PC competition which you can always dismiss, but against the very, very similar 2011 Macbook Airs. The base model euro price increased by 120 euros, for example.
In any case, this is the same crap AMD and Nvidia have been pulling for a couple of generations - a new generation card will cost a bit more and draw a little less power than the similarly-performing older model. And just look at what the hard drive producers are getting away with. Are these the first signs of Moore's law really breaking down - instead of getting more for less you get more for more? Or is this just a sign of a company or an oligopoly with pricing power just jacking up prices because they can?
To some extent, I understand what Turner might be talking about. I already do a lot of single-tasking, especially when programming - too many levels of thinking to handle to be distracted by anything not relevant blinking in the corner of my eye. I never seem to get much use out of multiple monitors either. So I guess I'm supposed to like Metro.
Thing is, I don't see much there that would help my current workflow. The current "Dock" metaphor works well for me: it stacks the 10-30 other currently open apps, browsers, IDEs etc. in the background for quick task switching, but out of the way of the single thing I'm working on at the moment. One potential improvement would be better visual identification of multiple instances of the same open app, but I haven't really figured out how Metro would help there. All it seems to do is add multiple new options for how something could be accomplished, plus jarring visual transitions from the old desktop to Metro. In-order app switching is useless beyond 10 open apps in any case, no matter whether it is Alt-Tabbing or Windows Phone-like. There's probably some promise in better use of widescreen aspect ratios for single-tasking, but having played with the latest preview a bit in a VM, I'm rather underwhelmed so far.
The other thing, of course, is that while most people probably can get used to Metro and not have it hinder their productivity too much - all in all it is not *that* different from Windows 7 if you stay away from Metro apps - I also know very well from observing my colleagues that there definitely isn't one single way to write code: many, if not most, of them really like to have multiple monitors and several windows open at the same time. In fact, I believe my single-tasking style is somewhat in the minority. Why try to make the grand new UI concept some kind of office productivity enhancer when that UI is pretty much useless to most people in your average work environment?
This year, Firefox development has really brought some meaningful improvements. Firefox 6 was faster than 5, 7 was faster than 6 (and used significantly less memory too), 8 was faster than 7... yeah, I can live with the new Firefox release schedule of a new version every six weeks. Long may it continue.
I haven't hit bugs either, perhaps thanks to the people subscribed to the beta or aurora channels testing the new versions for 12 weeks before I get the update.
"We are seeing solid progress against our strategy"
- Things keep going in the opposite direction to what we planned
, and with these planned changes we will emerge as a more dynamic, nimble and efficient challenger,"
- We will soon be a very small company with a small marketshare compared to the big players ("a challenger")
Message received and understood.
So: Intel is hurt (and AMD perhaps even more) - Microsoft really is serious about breaking Intel and putting Windows on ARM. And by "serious", I mean determined to put Windows users into their own walled garden alongside all the other little controlled gardens springing up everywhere. Oh crap - Linux never looked better than today. Hope that particular alternative won't be taken away too.
Integrated 3G, integrated 3G... where are you? At least there is now some hope of getting that in one of the ultrabooks; it seems Apple just isn't interested in providing that option.
Toshiba might be the one to provide it as an option, at least - this is the least-MBA like ultrabook so far.
In what way would the below C++98 be clearer than Tom38's version? What does the added verbosity gain?
list<pair<string, string> > words;
words.push_back(make_pair("About bloody", "Time"));
list<pair<string, string> >::const_iterator i = words.begin();
while (i != words.end()) {
    cout << i->first << " " << i->second << endl;
    ++i; // mustn't forget this
}
I thought Tom38's post made some everyday C++11 advantages clear: the old way required a loop condition test and manual iterator increment (chance for errors) as well as a redundant declaration of the iterator type (more work to change to use a different container if needed, as well as the general fiddliness and verbosity of STL iterators).
What important information is the auto keyword hiding? Now that the features are in the standard, these should become common C++ idioms that any programmer worth anything will recognise and read fluently.
This is just the kind of syntactic sugar that raises the abstraction level slightly and makes programs more readable, as well as making the initial and subsequent programmers touching the code more productive (get rid of a common class of bugs, make reading the code faster and make refactoring easier - what's not to like?).