You forgot something
What? No mention that AVX is the underpinning of Larrabee and hence a crossover from GPGPU?
Points for looking toward private enterprise.
Major loss of points for doing it the Big Contractor way.
Why did it mull it over, pick two companies, and ink huge deals? That's the lazy bureaucratic way: find a contractor who'll promise to do everything, pay them, and hope for the best.
The market way would be to say, "we'll pay /anyone/ who can put cargo into space." Commit to spending x amount of money in the long run, but leave the doors wide open and scale it up gradually.
For one thing, entrepreneurs can then write up sensible business plans, talk up the market to investors, while knowing that all they have to do is produce the good and NASA will buy it. (Right now the players are those who are so rich they don't need a business plan.) It would be invigorating. Sure, there'd then be a mess of suppliers. Some of them deadbeats. Some launches would fail. But that's what a market is. Dynamic.
But NASA is lazy. It doesn't want to deal with a dozen firms. It doesn't want to deal with the fits and starts. It doesn't want a real market. It just wants to outsource to a couple guys who'll promise to take care of everything. That's how the defense industry works, and it's an overpriced wasteland of bullshit.
This won't get us a cheap private path to space.
No need to waste money on this. You can build a computer with four GTX280s that will in fact have better performance. (It's a complete misconception that you need Tesla to use NVIDIA's tech.)
It might very well be easy to use the personalities, but the packaged vector-op personalities are worthless. FPGAs cannot get good performance masquerading as CPUs, especially on floating point code. ALL the magic will come from creating custom "personalities": custom software written in a hardware description language that transforms your algorithm into transistors. And doing that is hard.
The problem is that such languages are backwards, left behind by the demands of a semiconductor industry that doesn't value usability and hates change. The first object-oriented HDL was released barely three years ago, few tools support it fully, and there are hardly any books on it. New generations of languages that automate many of the basic problems of hardware design (particularly the rather difficult task of passing data from one part of your "program" to another, and making sure everything gets where it needs to in the right number of cycles) have been prophesied for some time but have not materialized. Writing code that compiles to transistors is painful, few people know how to do it, and few people study the subject in school or in their spare time.
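That cycle-alignment headache is easier to see with a toy model than with a real HDL. Here's a minimal Python sketch (purely illustrative, not actual HDL code) of a two-stage pipeline where forgetting the matching delay register silently pairs each input with the previous cycle's result:

```python
# Toy cycle-accurate model of a 2-stage pipeline: stage A squares x,
# stage B adds the original x. In hardware, x must be delayed one
# cycle through a register so it lines up with stage A's output;
# omitting that register pairs each x with the PREVIOUS square.
def run(xs, delay_x=True):
    a_reg = 0          # pipeline register holding stage A's result
    x_reg = 0          # matching one-cycle delay register for raw x
    out = []
    for x in xs:
        paired = x_reg if delay_x else x   # the value stage B sees
        out.append(a_reg + paired)         # stage B: add
        a_reg = x * x                      # stage A: square
        x_reg = x
    return out

xs = [1, 2, 3, 4]
print(run(xs, delay_x=True))   # after 1-cycle latency: x*x + x
print(run(xs, delay_x=False))  # misaligned: x*x from the prior cycle
```

In three lines of Python the bug is obvious; spread across a real design with dozens of pipeline stages, it's exactly the kind of thing those prophesied languages were supposed to handle for you.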
If an independent software industry springs up made up of a few brilliant coders who create useful personalities and resell them, then the FPGA idea could thrive. (Like I said, utilizing already-made personalities seems easy.) But the vertical markets that this thing targets don't work like that. Every one of their algorithms is unique, they develop everything from scratch, and the FPGA programming model and the transistor languages will be hell for them.
It's a wonderful idea, but I just don't think its time has come yet. And if all you do is use the bundled vector personality, then a GPU will be ten times faster for, literally, one hundredth the price.
Although, if you can really just offload code a few lines at a time (to create your own instruction), then rewriting those few lines in an HDL is not nearly as hard as rewriting an entire algorithm. There could be hope, if the granularity is there. Although to get even those few lines working you need to write many more to interface to the DRAM and the FSB. More information is needed on what tools Convey has really created (in those fleeting dozen months) to automate such chores.
That's hilarious. I'm not sure it's "patent trolling" per se, though. It's more bizarre. They want to figure out others' technologies, and then turn around and demand money from those people for having invented something but not disclosed how it works. (Most likely, they'll be doing that to other defense contractors. Or, stopping other defense contractors from doing that.)
This is all rather ironic, if you keep in mind that one of the main points of patents isn't to "protect property because what you think of you can stop anyone else from thinking" (yeah right), but to encourage inventors and manufacturers to disclose and document their secrets, so that industry can learn and improve on them. This was at least the selling point during the Industrial Revolution.
There were studies saying essentially the same thing a while back, but let's be clear about what they're saying:
Smoking weed won't induce schizophrenia in someone who isn't going to get it.
BUT, smoking weed WILL induce a schizophrenia-prone person to develop symptoms at a younger age. That's not an argument against weed, but this point has to be repeated! For the sake of the small population who honestly does need to put down the fn bong.
And as for weed in general, the argument is simple: The dose makes the poison! Nuff said.
Thank god for the new feedback rules. A few months ago, the buyer would've just gotten negged too, for telling the truth. Happened to me, fn asshole sellers.
Of course, the whole feedback system is shot. Everyone always gives +1 unicorns and rainbows. You get one negative feedback and it's a huge deal. There's no flexibility in the system. Then sellers expect that if they sell you the wrong thing but make up for it afterwards, they don't deserve the atomic bomb that is a -1.
I was expecting an article that would attack head-on the question, "ok, global warming's happening. so what?" Instead it was a lot less direct, and eventually wavered somewhere in talking about Kyoto and money. I wanted stuff about plants loving CO2 and needing less water as a result of its abundance. Stuff about how the well-established animals, not the waif pseudo-species in the rain forest, don't care about global warming because they'll hardly notice relocating themselves if the timescale is an entire century. (Same regarding humans displaced by sea level.) Stuff about how hurricanes, even if they increase, aren't scary if you just build your house out of brick and don't live in a city under the sea.
I want an article that attacks that which isn't talked about. We keep going on and on about whether or not global warming will happen (or worse, bitching about whether it has), and rarely stop to actually question if it's something to unabatedly fear. Or if it's all just airy hype.
The drawing is a lie. It's gonna have powerful engines giving forward thrust. The disk is just like a stationary wing.
Think of it as an Osprey that works. P.S. that "dragonfly" link is the right idea.
The Wii costs pennies to make. It's just one SoC. It has about as much power as an Xbox 1, from seven (7!!) years ago. And costs now as much as the xbox 1 did back then!!! HAHAHA. Nintendo is rolling in dough.
To be fair, the margins on GPUs are a pittance compared to CPUs. For all their machinations, it's still been an incredibly competitive industry. Much more so than what goes on in Intel monopoly land, AMD notwithstanding. I'd say give these guys a break and deal with the real problem.
But the wikipedia entry was correct, wasn't it? What are we bitching about, again?
Ok, ok, the judge should have double-checked elsewhere. Still...
Windows 7 Server (internal codename) == Windows Server 2008 R2 (product name)
Holy @#$. Those bastards! I knew Windows 7 was a lie! It's gonna be a point release of Vista two years too late.
22nm in three years. That's crazy!
You guys all know that the END comes at 16nm, right? No molecular computer could be any smaller...
You guys... you're such amateur theorists.
Obviously this guy *knew* something. The knowledge drove him to depression, and right before he was about to crack, the powers took action. Remember he was a key investigator in this whole mess.
I'm sure Oracle and Microsoft have put in all the fine-grained locks that a SQL database can possibly use. They went through the pain and the bugs, and it's done now. Transactional memory is in no way, shape, or form faster than fine-grained locks, which are by far the most efficient implementation.
Transactional memory is NOT faster than fine-grained locks. It's just way easier. It's a boon for new software, and developers will like it. But to say that MySQL will suddenly leapfrog over age-old databases that have perfected their locks is just a non sequitur.
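For anyone who hasn't fought this fight: "fine-grained" means one lock per piece of data instead of one big lock over everything, so threads touching different pieces never wait on each other. A minimal sketch in Python (names here are illustrative, not from any real database):

```python
import threading

# Striped (fine-grained) locking: one lock per bucket instead of a
# single global lock, so writers to different buckets don't contend.
class StripedCounter:
    def __init__(self, stripes=8):
        self.locks = [threading.Lock() for _ in range(stripes)]
        self.counts = [0] * stripes

    def add(self, key, n=1):
        i = hash(key) % len(self.locks)
        with self.locks[i]:          # only this one stripe is locked
            self.counts[i] += n

    def total(self):
        # take every stripe lock for a consistent global read
        for lk in self.locks:
            lk.acquire()
        try:
            return sum(self.counts)
        finally:
            for lk in self.locks:
                lk.release()

c = StripedCounter()
threads = [threading.Thread(target=lambda: [c.add(k) for k in range(100)])
           for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(c.total())  # 4 threads x 100 adds = 400, no lost updates
```

Getting this right for every structure in a database (and debugging the deadlocks when you don't) is the decades of pain transactional memory promises to spare you; it doesn't make the result faster.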
Well, I don't know that. But... this guy has done the worst job imaginable over the past 7 years. He's turned an idea that is pretty fn cool and surprisingly practical and affordable (gliding on an AI-powered unicycle) into a laughingstock. The initial design was atrocious and universally reviled. It was damn ugly, had crippled performance, and scared people around it (which is why its performance got crippled). What did this guy do? Nothing. He sat on his ass for years until finally a slightly redesigned (and slightly less awkward-looking but just as imposing) version came out to no one's attention.
This guy is worthless. Apple is gonna hate him. I don't think Apple's products will get ruined by him (I know they won't put up with his stupidity), but Segway is really lucky and now has a much brighter future.
P.S. what segway should have done is made a seated version. Maybe like the original wheelchair that introduced the tech. Or maybe like a bicycle. Anyway, it's the awkward standing that makes you look like a dangerous douche.
P.P.S. fucking patents. If there was more than a single monopoly allowed to make self-balancing scooters, this idea could have really developed into something by now.
These are not "virtual worlds." That label makes the concept sound cool and hype-worthy.
In fact, use the correct language: They are 3D chat rooms. That's right. All they are are chat rooms... with a gimmick. And the gimmick is pretty lame.
But the real problem is that they broke the concept of the chat room itself. They took out the "room", the confined space where you enter to talk to a group of people who are already there. It may not seem like a big deal, but it's actually a huge difference from wandering a world coming up to strangers. The social dynamics change profoundly (think about it).
In MUDs or MMOs you still have those groupings (eg clans or quests or servers). Even better if the groupings have a purpose. MS 3D Chat was also fairly good because it just had ordinary chat rooms. But a wide open world just doesn't work. [Insert poignant prose about urbanism and the alienation of modern society...]
The amount of data will fundamentally change the "scientific method." In the narrow sense of the word.
We'll stop mucking about with dumb clinical trials and ridiculous social experiments. Absolutely. And it will profoundly change those fields.
But the data and the math will merely make fuzzy bullshit fields like psychology more similar to very concrete fields like physics. Physics has long been dominated by data and math.
Models will absolutely have a place. For starters, data alone will be far from enough to make predictions. (A physicist can build an airflow simulation for airplanes because of the mathematical models his data has let him construct. It is impossible to build such a simulator from a raw database.) For enders, raw statistics hardly satisfies humans, who will always apply their heads to figure out the models behind the things they see. Models which, perversely, are not arbitrary but essentially embedded in reality itself.
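The "data alone can't predict" point fits in a few lines. A toy sketch (the linear law y = 3x + 1 is invented purely for illustration): a raw database can only answer about points it has seen, while a model fitted to the very same points extrapolates beyond them.

```python
# Toy data from a known linear law y = 3x + 1 (illustrative only).
data = {x: 3 * x + 1 for x in range(10)}    # "database": x in 0..9

def lookup(x):
    return data.get(x)                       # None outside the data

def fit_line(points):
    # least-squares fit of y = a*x + b, done by hand (no libraries)
    n = len(points)
    sx = sum(x for x, _ in points); sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda x: a * x + b

model = fit_line(list(data.items()))
print(lookup(50))   # None: the database has never seen x = 50
print(model(50))    # 151.0: the model predicts beyond its data
```

The database is the measurements; the fitted line is the physics. Only the second one flies the airplane.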
Dude... you guys need a section on your website for all the excellent climate change articles you're writing. Something that's a list of all the best ones that'd be easy to link to.
CherryPal, get it?
These people are fn brilliant. They sell these things in the developing world. Out of the box it only does one thing: surfs porn (err... the "web"). Everyone buys them just for that. And then CherryPal can sell Terminal Services to this install base. Besides porn, you can have Office and Photoshop. HELL, you can even have Crysis (with StreamMyGame). Without going out to the internet cafe, that is.
(On top of that there's an ulterior factor: telcos like China Telecom need to move away from piss-poor voice by offering other services, and are looking to jump into thin clients hooked to their DSL lines.)
Only problem is that these app subscriptions will have to compete with free. The free you get by forking out for real hardware and pirating all the apps from your friends.
But it's not about apps. Not yet. For now they just want you to buy the machine and keep it in your bedroom.
And those Nvidia GPUs can more than handle all of the calculations that the SpursEngine does. (See the article on Tesla.) This is all about software. The Cell is there for no good reason, except to justify its own existence. Yet if it wasn't there to be justified, the really nifty software wouldn't get written.
But... I'll be waiting till all this (including super-resolution upscaling) gets ported to normal video cards.
This is just the next-gen 10-series GPUs. And yeah, they'll play Crysis fn sweet.
They're connected to 20 carriers, and play them off one another to get the cheapest deals. Ohhh. Why didn't you just say so?
Been reading up on .NET 3.0 (and XAML), have we? I think "separation of programmers and designers because both are idiots in their own ways" is an offensive and dumb position. Separation is good because it makes things easier and _logical_. The new "separated" APIs are just better APIs (for many reasons), no matter if it's one person doing both things or different departments.
As for this story... there is no separation. Java is java, Flash is flash. I think the point is that Java has a shitty graphics API that no one bothered to fix because Java fanboys are academic dweebs who don't know or care anything about UIs or ease-of-use. It's good that Java on mobiles is getting elbowed aside by Flash and .NET (silverlight). Maybe we'll have some progress now.
According to this guy, properly recorded hard drives (start from a blank, write to it once) last at least 20-30 years. It'll be a lot longer if you "refresh" them: read the data and write it back, remagnetizing the platter. Even if the drive has trouble reading itself, the data can be recovered with specialized but widespread HD data-recovery machines. Try doing that with a holographic disk. The biggest advantage of hard drives, besides speed and capacity, is that it'll be much easier to find a working SATA port than some niche holographic reader.
Besides, NUMBERS LIKE "50 years" ARE RETARDED. All storage devices use error-correction algorithms. If you want your data to stay around for 100 years, just tune the algorithm for enough redundancy.
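"Tune for enough redundancy" is literal. The crudest possible scheme already shows it: store each bit k times and majority-vote on read, and the error rate falls as k grows. (Real drives use far better codes, Reed-Solomon and the like; this repetition-code sketch is just the principle.)

```python
import random

# Repetition code: store each bit k times, majority-vote on read.
def store(bits, k):
    return [b for b in bits for _ in range(k)]

def corrupt(raw, flip_rate, rng):
    # simulate bit rot: each stored bit flips with probability flip_rate
    return [b ^ 1 if rng.random() < flip_rate else b for b in raw]

def read(raw, k):
    # majority vote over each group of k copies
    return [1 if sum(raw[i:i + k]) * 2 > k else 0
            for i in range(0, len(raw), k)]

rng = random.Random(0)
bits = [rng.randint(0, 1) for _ in range(1000)]
for k in (1, 3, 9):
    damaged = corrupt(store(bits, k), flip_rate=0.1, rng=rng)
    errors = sum(a != b for a, b in zip(bits, read(damaged, k)))
    print(k, errors)  # error count drops sharply as redundancy grows
```

At a 10% flip rate, no redundancy loses around a tenth of your bits, triple storage loses a few percent, and ninefold storage almost nothing. Pick your century, pick your k.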
Vista is very scalable. Just take a look at windows 2008 or vLite.
The question is why the fuck Microsoft wants to shove the same Aero, Windows Search, UAC, etc. bullshit on us when it can offer a lite version with all of that disabled (and re-enableable later, like in Win2k8). People would actually LIKE Vista...
!!!!!!!!!!!!!!!!!!! Windows 2008 !!!!!!!!!!!!!!!!!!!!!!!!!!!
It's the way Vista was meant to be. Microsoft should've made a "Workstation" version and it would sell millions, but running "Server" is no problem since everything is componentized and nothing is pre-installed. It's also free for 6 months' use. Download it from Microsoft.
Biting the hand that feeds IT © 1998–2018