978 posts • joined 16 Oct 2007
Re: Can we at least be a little bit smug?
I guess you've never heard of AT&T UNIX. Stallman's petulance far predates Microsoft.
Anyway, his software isn't free in the sense that research is free. No research paper says "you can use the results of this research so long as you give away the complete design for any product you make with them". This idea is a clever invention of Stallman's which serves his political purpose. Equating it to old-style giving-away of research is profoundly disingenuous on his part.
If Newton had done that and set a precedent, never mind steam fax machines, we'd have had almost no progress at all. Stallman seems hell-bent on making this happen in computing. Thankfully, the talented are now moving to MIT and BSD licenses, so we have things like LLVM compiling to iOS, and the world is a much better place for it.
And this on the day I saw Google's amazing 3D city rendering technology that's coming down the pipe ...
Re: Virtual Memory confusion
Oh, well, quite possibly ... except RISC OS 3.5 was pretty much the end of the line for the machines, and Dynamic Areas required special magical coding to use. Still a fair way from virtual memory as familiar to most of us today. Dynamic Areas were like the DOS Extenders of RISC OS. I'd mercifully forgotten about their existence until now :) It's possible I used one for DOOM, although it's equally possible I didn't.
Virtual Memory confusion
The Archimedes had no paging of RAM to hard disk, and the MEMC certainly wasn't capable of doing it on its own! :)
What MEMC did was offer a virtualized address space so that each application could have its own memory paged into the same address space while it was running, with other applications' memory protected by being hidden. This is a basic feature of most 32-bit processors but it was a first for a home computer.
MEMC had all the hooks for proper virtual memory with disk paging, but it wasn't a feature of either Arthur or RISC OS (although presumably RISC iX had it). Similarly, neither Arthur nor RISC OS offered pre-emptive multithreading - the application had to specifically yield so other apps could get a timeslice.
Re: I made money out A310 Archimedes
Awesome, Chris - I bought one of those upgrades, and it completely saved my ass while working on Wolf 3D :) :) :) I think you did my ARM3 and MEMC1a upgrades at the same time!
That's not what Taleb is saying in the Black Swan at all. He's saying that you can't just assume everything is a normal distribution. The stuff from the low probability parts of the Gaussian occurs with low probability - Black Swans occur too often, because they come from non-Gaussian distributions. You do *not* need massive amounts of data to show this - two events out of a thousand, with supposed probability of one in a billion, is sufficient to show you're almost certainly dealing with a non-Gaussian distribution.
Taleb's book is mostly an exhortation for professional statisticians - especially in finance - to remember their training about other distributions and stop pretending everything is Gaussian (or, your version, that you supposedly need so much data to prove it is non-Gaussian that you may as well assume that it is). ISTR there's also some stuff about statistical independence too (e.g. if one person gets foreclosed on, it's likely others will too - foreclosure is not an event independent of wider circumstances) but I presume you're not arguing against that.
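The arithmetic behind that point is easy to sketch. A minimal, hypothetical example using the comment's own made-up numbers (1,000 observations, two supposedly one-in-a-billion events):

```python
# Hypothetical numbers echoing the comment above: 1,000 observations, and two
# events the assumed (e.g. Gaussian-tail) model rates at one in a billion.
# How likely is seeing at least two such events if that model were right?
n, p = 1000, 1e-9

# P(X >= 2) for X ~ Binomial(n, p), via the complement of the k=0 and k=1 terms
p_at_most_1 = (1 - p) ** n + n * p * (1 - p) ** (n - 1)
p_at_least_2 = 1 - p_at_most_1

print(p_at_least_2)  # ~5e-13: the assumed distribution is almost certainly wrong
```

So two such events really are enough: under the claimed model they happen with probability around five in ten trillion, which is why you don't need massive amounts of data to reject it.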
Read Stafford Beer
Taleb is retreading waters Beer walked on in 1973 ("Brain of the Firm"). Beer not only explains why storing every bit is irrelevant and pointless, he also predicted the rise of big data. Nearly 40 years ago. In fact, you can see that "big data" has always grown, and always will grow, to consume available capacity, until people finally realize that all the data is a useless waste of space. Once you understand how to analyze it, you analyze it in real time and then throw it away.
Beer would disagree with Taleb about the frequency of analysis. In Beer's theory, data that is averaged over a year is a year out of date, and hence almost completely useless. It is better to have 95:1 noise:signal than to have data that is too late to act upon. Plenty of statistical methods can cope with 95:1 noise:signal. Not the human brain, perhaps.
I look forward to reading Taleb's new book. His last one was awesome.
"Open" does not mean "we want competitors to steal all our data which cost us money to gather so they can compete on an unbalanced playing field and eventually beat us thanks to having lower costs".
The GPL exists *explicitly* to prevent this from happening with source code, and it does so by requiring reciprocal responsibility from the users of the open stuff.
So yeah maybe if you can define "open" to mean there is some reciprocal responsibility from the users of the data, you'd be fine. Which is *exactly* how LinkedIn and Facebook are doing it, and exactly the point Asay is completely failing to grasp with this nonsense rant about the "principle of openness". Wake me up when he has a workable GPL-style API license that takes all this into account. Otherwise, he can STFU with his "lessons" for companies that actually have successful products and real competitors and pretty damned open (within reason) APIs.
There is no such legal right. Read the article, which explains it properly.
El Reg's headline summarizes this research differently than most other reports do.
And yet, if you know more science than Lewis Page, you're probably more worried about climate change than he is. El Reg's position on this is political, not scientific. And this political position informs the way they pick and choose which evidence to report.
Which is EXACTLY how everyone else is reporting this story - that your belief on climate change typically depends on your politics, and not on your level of scientific literacy.
But any El Reg reader realized this a long time ago, at least as it applies to journalists. The fact that Page pretends he's smart enough to reinterpret the paper using a magically privileged viewpoint ("this is what the scientists are *really* trying to do") underlines this point quite nicely. To Page, not only is it a political issue, but everyone involved has purely political motivations. One sees the world through one's own eyes, after all.
What does this article really say?
ANY strategy is a losing strategy for the vast majority of companies. That's the numbers game. One in ten become profitable. The question is, which strategy is best? And that's going to depend on your product.
You're not going to make a billion dollars with a ten-million-dollar product. Unless you're Instagram and you sign on the dotted line while Zuckerberg is half cut.
OTOH, there are still billion dollar businesses to be built. They're just not going to be built in someone's garage in 3 months, on top of someone else's product. Because it's not like anyone got rich using someone else's microprocessor, or software stack. *cough*
But there is still a big difference between writing a Facebook app and making a product that changes the way the world uses computers. Except when it's a Facebook app that does that.
So, business as usual then :) Contradictions all round. If anyone actually *knew* how to build a $1bn business, they'd just do it, time and time again.
Re: Groundbreaking, 20 year old game appears on web
When I ported this to ARM in 1994, John Carmack suggested using BSP trees for rendering; it would have been much faster but I was too far along to replace my renderer. If they did that here I think it would pretty much fly.
This still costs real money?
One would have thought the licensing cost to MPEG-LA for DVD-era technology is now rather trivial. Is this not the case? You still have to pay real money for MPEG-2 and *cough* CSS?
Re: It's still proprietary
Yeah maybe they can use gcc 3 as a backend. It's not like they're aiming for speed of compilation or anything.
Almost brought a tear to my eye. A beautiful article about a beautiful piece of technology. My jaw dropped when I read the ARM worked without applying Vcc :)
Re: Britain's IP laws...... We are all Criminals
@ DavCrav, you are using two incompatible definitions of "rip off".
In one case, The Man "rips off" customers by offering something they want at a price they are willing to pay (if they are not willing to pay, they do not pay, and are not ripped off). So the customer is given an offer, which he accepts, and then, because to his mind it is too expensive, he turns around and says he's been "ripped off". This is "ripped off" as in "rip off Britain", and I don't see this being the exclusive preserve of the IP-related industries. It's not like food or petrol, either, where you have little choice but to pay up. You can always not have that CD. The whole thing is legal because this is how markets work.
In another case, a fraction of the public "rips off" The Man by getting something they want without having to pay for it. Technically it's not theft but morally it pretty much is. That's why it's illegal. It has nothing to do with "The Man" making the laws he wants. Things like stealing = illegal. Things like making offers and having them accepted or declined by agents capable of free will = legal. Pretty damned simple if you ask me.
Comparing these two "rip offs" using the same term is disingenuous in the extreme. I suppose it is equally valid to steal petrol or food if one decides they are "too expensive"? No, of course not. An exception is made that taking something without paying is OK so long as it doesn't *directly* cost the producer anything. It's a bullshit distinction. To my mind this moral high-ground crap is far more distasteful than the actual execution of the crime. It would be better if people had the intellectual honesty to admit they take stuff without paying simply because they can.
PS: great article, Mr Orlowski! For once I agreed with everything :)
The interestingness of this article is to some extent destroyed by Lewis' repeated insistence that 10kWh per 1,000 litres translates to 1kWh per 167 litres. If he just used the honest approach of rounding up 1.67kWh to 2kWh, it would hardly damage his point, but it would make it seem less like he's fiddling the numbers to suit his case.
Paris, because she can't divide 10,000 by 1,000 either.
I for one welcome all these Dragon 32 remembrances. Although I've never seen the Dragon's startup screen look so crisp :)
Great article. One niggle:
"established that moderate use of a mobile phone reduced the rate of cancer (though well within the margin of error)"
If it's well within the margin of error, it's not really "established" then, is it. Unless tossing a coin 100 times and getting 51 heads means that it's now "established" that the coin is biased to heads (within the margin of error).
Call me a pedant but it's a shame that such a tour de force of rational reporting on the subject of statistical interpretation is spoiled by this small apparent misunderstanding of statistical interpretation :)
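The coin example can be checked directly. A quick sketch of the one-sided tail probability (my own illustration of the comment's 51-out-of-100 figure):

```python
from math import comb

# The example above: 51 heads in 100 tosses of a fair coin. How often does a
# fair coin give a result at least that head-heavy?
n, k = 100, 51
p_tail = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

print(round(p_tail, 2))  # ~0.46: utterly unremarkable, so nothing is "established"
```

A fair coin lands on 51+ heads almost half the time, which is exactly why a result "well within the margin of error" establishes nothing.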
I'm not sure there are any quantum algorithms developed yet which use floating-point. Quantum computation is all about integers and set theory, at this stage. When they say "accuracy" I think they mean correctness, not floating-point precision. That would be correctness as in not broken, plus I believe quantum computers always have some probabilistic chance of giving the wrong answer; that probability is also a kind of accuracy (level of confidence).
For brokenness the problem seems easy - try a bunch of NP problems and verify them (in P) on a classical computer. But the result is a mixture of brokenness and probabilistic accuracy. You then have to figure out what proportion is brokenness.
So I believe the issue is that they really need to simulate the quantum computer on a classical computer and calculate what the probabilistic accuracy should be, so they can cross-check with the empirical figures and see what the brokenness level is. But a 300-qubit computer is so large that it can't be simulated.
Presumably you're an idiot
"Presumably, he'd also like second-hand books and DVDs banned too. Maybe even clothing, while he's at it."
Clearly, from what he's said, he doesn't give a shit about those other markets, since he doesn't develop in those markets. This is about what's good for Crytek, not what's good for Paramount or Gucci. But nice try on the whole "if someone thinks A they must necessarily think B and C too" strawman bullshit.
Full kudos for including a Dragon 32-only game. Although, meh, I'd have perhaps chosen Shocktrooper.
Re: Some of these people should have done maths and not engineering
This thread is retarded. It doesn't take 5 developers to develop an iPhone app, nor does it take a year. You can do most games with a programmer and an artist in 3 months, and some can be done with a programmer and a few OTS art resources in a few weeks. Around £60K PA including costs, or £15K for the whole thing. £15K / 66p = 23,000 sales to break even. The 1-month project costs more like £5K, so 7,500 sales to break even. This is assuming an up-front pricing model, which no one is using any more, since you can get 100,000 downloads of a free app and monetize that to £15K a lot more easily than you can get to £15K on upfront purchases.
Current app in development: art budget £5K, programmer budget £2K. So you can go even lower than the above figures if you think hard about it.
People are typically making a few hundred quid per month per title; over the lifetime of a title (say 2 years) that's going to get you within the right ballpark. One in ten titles will do 10x better, though, and once you get one of those you're in profit overall. Other titles do 100x to 10,000x better, which is the obvious attraction of iPhone development. There is a power law at work; about 1/N of titles will make Nx the typical figure.
The idea that iPhone software development isn't a viable business is obvious bullshit given the size of the industry. It's a marginally profitable enterprise with the possibility of huge returns if you strike the right note in just the right way.
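The break-even arithmetic above can be sketched in a couple of lines (the figures are the comment's own assumed budgets and per-sale revenue, not real data):

```python
def break_even_sales(dev_cost_gbp, net_per_sale_gbp=0.66):
    """Sales needed to recoup a development cost at a given net revenue per sale."""
    return dev_cost_gbp / net_per_sale_gbp

# The two scenarios from the comment: a ~3-month project and a ~1-month project
print(round(break_even_sales(15_000)))  # 22727 -- the "23,000 sales" figure
print(round(break_even_sales(5_000)))   # 7576  -- roughly the "7,500 sales" figure
```

The same few lines make it easy to play with the comment's other assumption: at a few hundred quid a month over a two-year title lifetime, a typical title does indeed land in that ballpark.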
Oddly enough, I assumed that was what had happened. Having kids myself.
A good article
I disagree with a few points though.
Firstly there is this Orlowski-style simile where he compares compulsory coding over a 1- to 3-year period with a journalist doing it for a day. Doesn't really mean anything. You can delete the whole journalist doing HTML part from the article and lose nothing.
Secondly there is the part about where does the time come from for coding. Well, the time comes from not teaching kids how to use Word so much. There already is time on the curriculum for IT, the question is how do we use that time.
Thirdly there is this Wizard of Oz nonsense. Abstraction is what computers do. The whole point of using computers, at every level, is to rely on the man behind the curtain. But each man behind a curtain is a piece of software, or hardware, which talks to a man behind another curtain. Ultimately it's all governed by the quantum wave equation, but fortunately you don't need to be able to solve that in your head in order to add a div element to a webpage. That's progress, Andrew, not philistinism.
But I think the main reason Orlowski falls down here is his lack of imagination. This article is good in that it does acknowledge the current state of things rather than appealing purely to historical precedent, but we're not teaching kids to enter the workforce of 2012, we're teaching them to enter the workforce of 2016 or 2020, where they will stay until 2060 or so.
Now, it is noticeable today that not many people program - it is a vocation, not a skill like driving a car. But the benefits from using computers are always greater to those who understand them, while the ease of programming and understanding the programming models is increasing. It is perfectly conceivable - perhaps inevitable - that by the middle of these people's careers it will be quite normal for everyone to be doing programming at some level. Even now, a manager who can write an Excel macro is going to be more productive than one who has no idea.
This idea may be speculative, but the idea that the industry will be unchanged in 30 years is obviously wrong. What we can agree on, surely, is that computers are pretty important, using computers beyond a certain level requires some programming knowledge, that basic programming knowledge is easy enough to teach, and that teaching kids how to drive the UIs of programs which may not be around in 10 years is stupid? In which case, teaching programming at years 7-9 seems to be a no-brainer.
I also find it somewhat bizarre to find anyone arguing for the status quo in education today, but hey.
Re: @Geoff Campbell So which is it?
Actually, if you take the time to read what Geoff in fact said, he isn't necessarily giving credence to the BBC either. He's just giving zero credence to Lewis Page's spin. He doesn't have to take a position himself to say this.
Re: Is there any point me posting any clarifications on yet another disingenuous knobwit blog?
Ah, the Samantha Brick come-back. Nicely played.
Re: Isn't life a physical process?
All physical processes produce pink noise, in fact - perfectly flat all-spectrum white noise doesn't exist. Presumably this is some specific shape of pink noise that requires life to produce. I haven't read the paper yet.
Still, the results of the original experiment have perhaps been unfairly discredited. The device gave a positive result as designed - it detected life on Mars. It's only because the other two failed that the announcement was made that no life was detected on Mars. It's good that they're going back to try to get more data from the original signals; maybe they can finally resolve it one way or another.
"it’s not entirely safe to assume anthropogenic antibiotics have created an evolutionary hothouse that forces bacteria to defend themselves."
Sigh, and thus we are once again invited by El Reg to believe that scientific knowledge is just something someone assumed one day.
The fact that bacteria can evolve to be antibiotic-resistant by perhaps switching existing genes on, rather than by developing new genes, has absolutely no bearing on the question of whether or not those bacteria are in fact evolving to become antibiotic-resistant. A misunderstanding of the mechanism of a phenomenon does not disprove the phenomenon itself.
Re: Kids eh?
You're thinking about *telepathy* :)
The freetards are out in force today.
Doesn't make any difference how expensive textbooks are. If you can't make them cheaper on the same level playing field you've got nothing to offer anyone. Making them cheaper by ripping off existing textbooks is not actually making them cheaper, it's just deferring most of your cost to other people. Eventually those people will go out of business and who are you going to copy then? This business is a machine for putting the textbook industry out of business, while making a short-term killing. And people are fine with this? Thank God it's illegal.
Of course, it could just be ...
Of course it could just be that no-one over here has heard of Interview Street, and that even if they had, they wouldn't be spending enough time on it to rank up in the tests, as they don't value what looks like a pretty faddy and high-effort way to try to get a job.
An article a few cm away on the front page says half of IT hires in Asia suck.
And you know what "self-selecting samples" are in stats, right? They're useless, that's what they are.
And this story is based on an Interview Street press release, right?
Too many signs pointing to bullshit for my liking.
A ringer says ...
Realms of Ancient War is looking pretty good :)
Disclaimer: I'm working on it
Re: lets be honest
Is this wilful ignorance on your part? No one is proposing to store all your emails and SMSes. The information that will be stored is who you emailed and who you SMSed. I'm not sure you can claim copyright on that data.
I enjoyed the article but now I'm wondering if Orlowski is making the same categorical error as you are in relating privacy and copyright in this way.
I get confused sometimes between the words "nostalgia" and "nausea". Which one am I experiencing now?
I love the endian-swap-based plot premise :)
I wonder how long before LLVM supports DCPU-16?
100m users? 100m accounts more like. Everybody got accounts. Nobody uses them.
Also, the bill seems to refer only to "pornographic images" - that is, any image, from any source, which is pornographic. I don't see anything about specific websites or anything else. It would appear ISPs would need to block pornographic images that may appear on YouTube, Facebook, Flickr, Something Awful ... or indeed absolutely any website on the internet.
The only way for ISPs to do this is to block everything on the connection until an over-18 has verified with them. Which would be highly amusing.
Pork barrel (fnarr fnarr)
Age verification is only used on pay sites. These politicians must have powerful porn-mongering donors if they're this desperate to wipe out YouPorn, XVideo, RedTube, *cough* or whatever those sites are called.
"Global warmings deniers" used in an El Reg article? Awesome :) Now we need a fight between Chirgwin and Orlowski to determine the real truth about global warming - convenient or otherwise. A fight using PlayMobil, of course.
No it hasn't. RISC OS used traditionally-anti-aliased fonts, displayed on CRT. ClearType uses the precise positioning of the different colour cells within a pixel to get 3x actual horizontal resolution (plus slightly odd discolouration). It only works on LCD screens, and it was invented by Microsoft late enough in the game that RISC OS was already a distant memory.
Re: WHY does it look worse?
While this is factually true, you should be careful not to underestimate how much spatial information is in an image. For instance, by analyzing a rasterized line it is possible to determine the original line coordinates with enough accuracy to re-render the line in higher-res. Many anti-aliasing algorithms on PS3 / X360 do exactly this, rather than render at high-res in the first place, and while not perfect they work damnably well.
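As a toy illustration of that point (my own sketch, not the actual console algorithms): for a simple vertical edge, the partial-coverage value of the one anti-aliased pixel encodes the edge's sub-pixel position, so the original geometry can be recovered from the rasterized image.

```python
# Toy version of the idea above: anti-aliased coverage carries sub-pixel
# position information. A vertical edge at x = 3.3 rasterizes to per-pixel
# coverage, and the partially-covered pixel lets us recover 3.3 again.
# (Illustrative only -- real analytical-AA post-processes are far more involved.)

def rasterize_edge(edge_x, width):
    """Coverage of each pixel column [i, i+1) lying left of a vertical edge."""
    return [min(max(edge_x - i, 0.0), 1.0) for i in range(width)]

def recover_edge(coverage):
    """Estimate the edge position from the first partially-covered pixel."""
    for i, c in enumerate(coverage):
        if c < 1.0:
            return i + c
    return float(len(coverage))

cov = rasterize_edge(3.3, 8)   # ≈ [1, 1, 1, 0.3, 0, 0, 0, 0]
print(recover_edge(cov))       # ≈ 3.3 -- recovered to sub-pixel accuracy
```

The same principle, applied to coverage gradients along arbitrary edges, is what lets those console-era filters re-render edges as if at higher resolution.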
Misconceived idea of Lewis' work
All of CS Lewis' work is Christian allegory. These are no more sci-fi than Christian metal is metal.
Re: I'm confused...
Really? You must skim-read the Orlowski and Lewis stories. They debate every single point, and plenty of stories have taken the stance that the world is not warming (they wouldn't shut up about the hockey-stick a year ago) or that the world isn't warming as fast as they say. Others, to be sure, have taken contrary positions such as "the world is warming; so what" and "the world is warming; but it's not our fault".
In fact it would appear that their precise position depends on which counter-consensus research they've picked up that week, and the only general theme running through their work is "throw as much mud as possible because some may stick", and basic grass-roots contrarianism.
Re: I doubt it will be a "steady rate of shrinkage"
Why would you assume that, when such a device at such a scale is completely unimaginable with current technology, even if Moore's law were to continue unstopped for another 17 years (which it probably won't)?
For instance, never mind miniaturization, think about surface area to volume ratios.
I just went to Physics World and saw the article (which is free, not paid) and I don't see any particular indication that Physics World is a peer-reviewed journal; certainly, the article in question is clearly marked as a "Feature", so the only real fact I can surmise is that Physics World runs Features which are timely.
No shit, Sherlock.
Re: But, but, but ...
Yeah, but that's so bad it's good.
The Thick of It just wasn't that funny. Amusing, yes, but not laugh-out-loud funny like YPM was. The same scriptwriters give hope that it will reach the same heights as the older series. Cautiously optimistic.
"actual real climate scientists who know some maths"
So you accept that actual real climate scientists know something of what they're talking about?
That's a turnaround.
Perhaps you could stop listening exclusively to the dissenting voices amongst them and take notice of the consensus now.
So it's a PC then. Well, I guess that will make porting easier.