The art of public speaking
The guy in the video says "uuhm!" too many times for me to watch it all.
Broadcom has released open-source drivers and documentation for the graphics processor that's used in the Raspberry Pi microcomputer, among other devices. "To date, there's been a dearth of documentation and vendor-developed open source drivers for the graphics subsystems of mobile systems-on-a-chip (SoC)," Eben Upton, a …
Huh?
You want Broadcom to fix another manufacturer's video drivers?
Or are you saying you want the open source community to ignore Broadcom's drivers until they've fixed this issue you're seeing with a specific configuration?
Well, either way, here's what's going to happen. Each member of the open source community is going to work on whichever project rewards them best. Whether that's financial remuneration, resolving an issue they themselves experience regularly, respect from the community, their own self-actualization, or some other reward depends on what they want.
You can give some of them an incentive to work on the issue you're seeing (one of the simplest incentives is plain, detailed information on the issue itself), or, and this is the beauty of open source, you can even work on fixing it yourself.
You can get involved, or you can continue bitching. Your choice.
Wuh?
Did you read the title of the post? About Raspberry Pis? And Broadcom?
You know, a couple of companies with no relationship to ATI at all.
(Oh, and actually these Broadcom drivers do not need fixing; they work fine. But this release makes it possible for an open-source set of drivers to be developed.)
I will take any down votes for this as you wish.
The Pi crowd are really making things move, and not just with the Broadcom announcement, which is cool.
Look at how they have pushed other projects with investment, kept a level head and never lost track of their goal.
2.5+ million units in 2 years. If I were not a Brit, I would be patting them on the back and saying well done, good show!
Looks a bit high for what the foundation describes as a reasonably straightforward port to the BCM2835. Perhaps not quite that straightforward?
It is good to see binary blobs being acknowledged as a significant problem, when the Foundation have long been saying they were not a problem for the Raspi.
Smells like a way to search and hire good coders. They already have the source code to the raspberry pi part, after all. Why not simply release that too?
If they were to discover devs the traditional way, they'd pay at least twice that to the recruiter and still not be sure. This way, there's less risk and they get to find people they'd never have come across.
The graphics stacks currently run on the internal Videocore VPUs on the Raspi, talking to the 3D hardware blocks. There is an ARM-side shim that takes graphics calls made on the ARM and passes them to the GPU for execution. With the release of this software (originally for a different chip with no VPU, so all the work is done on the ARM talking directly to the VC4 3D hardware blocks), there is now the ability to port the stack to work entirely in ARM space, effectively an open-source 3D driver. The 3D would then no longer require any work to be done on the VPUs. It should even be faster.
It also opens the way to an Android port.
The Raspi Foundation does not have the manpower to do either of these things, hence the bounty.
As for blobs being a 'problem': I personally think the aims of the Foundation can be reached with blobs all over the place. But they do get a load of flak from people about it, so why not get rid of some of the reasons people whinge?
The issue with blobs is that you have no idea what they're doing internally, if they're well behaved or if they'll walk all over other parts of the system you're using.
Yes, Nvidia have released Tegra code, but the issue doesn't just apply to the RasPi, and I expect the single biggest constraint on releasing source code is the fear that someone will use it to show the hardware is breaching a (hitherto unknown) patent.
What Broadcom have done is a good step in the right direction.
I suspect your $500,000 figure is high. I also suspect that the $10,000 figure is too low, since it completely ignores the loss of IP involved in publishing the necessary documentation. (As I understand it, this second factor is the official reason that folks like ATI don't just document their hardware and let the experts write the drivers.)
"(As I understand it, this second factor is the official reason that folks like ATI don't just document their hardware and let the experts write the drivers.)"
At a guess, you could probably remove the word "just" from that sentence, and then the rest of it couldn't really apply anyway? As I said, though, a guess...
> If you don't have patents and don't assert copyright, you don't really have any IP
Trade secrets are a real thing. Sometimes keeping an invention secret is an attractive alternative to disclosing it in a patent publication.
For a long time, I and others resisted the term "intellectual property", as it seemed to me to blur the boundaries between patents, trademarks, copyrights and trade secret material. Now that the UK has an Intellectual Property Office, I think that battle is lost!
But I suspect the main reason that large companies are reluctant to release their source code is the amount of work involved in making it fit for public consumption, starting with removing all the comments that say things like "//Fred is an idjit, look at this crap!" and then ensuring that something embarrassingly piss-poor hasn't slipped through the net.
Oh, and management thinking that "this cost us 300K to develop, therefore it is *worth* 300K"...
The big disclaimer here is that no one has ever accused me of being a programmer so this is as much a question as anything else.
I remember an interview with an MS person who explained why a great many of the functions in their APIs were undocumented. The reason was that once a function was documented, people would start coding against it, and thus MS were more or less required to maintain the functionality across versions, new OSs, etc. He explained that many third-party compatibility issues were caused by programmers using undocumented functions that were then changed or removed in later updates.
Of course it seems that MS left whole APIs undocumented for their own commercial advantage but I still see the point this chap was making.
Does this kind of reasoning apply here as well?
Compared to some, and to how they once were, Microsoft are actually quite good with their API documentation. They even go so far as to say 'this will be deprecated in a future version' several versions before something finally goes; for instance, many of the stored procedures in SQL Server have been marked as deprecated for several versions, but are still there. If you then find an SP you were using in SQL Server 2000 is no longer there in SQL Server 2012, because they told you back in 2005 it would be deprecated, and it breaks your code, you only have yourself to blame.
I don't think this is a major issue. Companies like this already have tools and procedures in place to prevent people seeing code comments and other things they shouldn't see. They will already, as a matter of course, release the source code to various partners and customers (manufacturers, not end-users) that have signed NDAs etc.
My opinion about Fred and Barney is the least of the problems in releasing the source code.
I can get away with code that is indented the Klingon way, unused variables, compilation warnings, etc in corporate code provided that it works and is up to spec.
That is unacceptable for open source projects. You have to clean up all of that so it looks like an illustration from Kernighan and Ritchie in order to release it. That may be doable for a driver or a module which is to go into a larger project. Doing it for an established closed-source project is a major undertaking (often on a par with writing it in the first place).
I was going to post similar.
I remember, in the mid/late 90s, a friend had a top-of-the-range PC with a Voodoo 2 graphics card, and I stared in amazement at the anti-aliasing and the lighting effects (back then, every game had red and blue lights everywhere; it was atmospheric at the time).
Now it runs on a 20 quid 'toy'
In 20 years time, looking forward to playing Crysis and Bioshock on a RaspberryBake....
If only Qualcomm Atheros would pay attention and do the same for some of their iffy wireless module drivers, groan.
... good news and well done Broadcom. I can see which chipset manufacturer I'm going to bump to the top of my list for future hardware purchases if you do this across the board for all your Linux chipset drivers.
Sure, this will enable you to connect LCDs to it, since you can finally change the code to support the one you have, and all that. That is already a huge step forward.
However, for me the more exciting part is that this unleashes the GPU for general-purpose computing. That will allow you to do fairly complex SDRs (software-defined radios) on this hardware.
A game that played fine on a Dreamcast.
200MHz Hitachi SH-4, 16MiB RAM, 100MHz PowerVR2 8MiB VRAM. Optical disc.
Over 13 years ago.
Even had 4 player split screen.
Not exactly a monumental achievement for a supposedly far more powerful device.
700MHz ARM11, 512MiB RAM, 250MHz Broadcom Videocore IV. NAND flash.
Yes, 20 FPS on an embedded Linux system that runs on 5V and is smaller than your hand. You do realise that the 1990s system you are implicitly referring to would have cost many multiples of the £24 a Raspberry Pi costs, before you even take inflation into account, and would have been the size of a small piece of furniture?
To stretch an analogy, you are not even comparing apples to oranges, it is more like you are comparing apples to the cube root of 47.
Yes, indeed there are.
But as for your completely missing the point...
The competition is to port the ARM driver to work on the Raspi. A very good way of demonstrating you have done it is to get it to run Quake 3 *, so that is what has been chosen.
* Quake 3 being a heavy/comprehensive user of the OpenGLES stack that needs to be ported.