* Posts by juice

136 posts • joined 16 Nov 2010


Full Linux-on-PS4 hits Github


Re: It's a fun experiment...

Performance-intensive stuff: most of this comes down to the GPU these days. The Pi itself is a key example of this; the fairly underpowered ARM chip (at least in the original iteration) relied heavily on the Broadcom GPU.

A fairly quick glance online shows the PS4 GPU to be roughly equivalent to a Radeon 7850 (http://wccftech.com/playstation-4-vs-xbox-one-vs-pc-ultimate-gpu-benchmark/). These look to be available for around 75 quid online, and come with 2GB of dedicated RAM.

Admittedly, there's something of an apples/oranges comparison here, since I'm looking at second-hand prices. Then too, the PS4's custom-tuned architecture may well have some speed advantages - though conversely, GPU performance under linux is still generally behind that of Windows, and that's even assuming a hack like this is able to get access to all the hardware, and that drivers are available to take advantage of it.

Still, for around £150, you can get a quad-core machine with 8GB of RAM, 2GB of dedicated GPU RAM and a GPU equivalent to the PS4's. And generally, that'll include a Windows 7 licence which can be upgraded to Windows 10 or junked and replaced with Linux.

And then you can spend the rest in the pub ;)


It's a fun experiment...

But these days, it does seem a bit redundant.

Getting Linux running on the PS3 was interesting at the time, as it was pretty powerful for the price in some number-crunching scenarios, thanks to the Cell architecture. But Moore's law had already marched on a fair amount by the time Sony withdrew support for Linux, thanks in no small part to the rise of the GPU as a device for massively parallel processing.

These days, "consumer" hardware is very much a commodity. Android-based USB-powered thumbsticks can be picked up for less than 15 quid - or, if you want to build something for scientific purposes, for the same price as a PS4, you could pick up ten Raspberry Pis and slap them together into a cluster.

Or you could nip onto eBay and pick up an OEM small-form-factor PC; at a glance, there's plenty of multi-core, 3GHz machines with 8GB of RAM available for less than a third of the price of a PS4[*].

And with all of the above, you don't have to worry about the functionality vanishing if/when Sony patches the exploit.

It's still an interesting experiment, but it's definitely of limited use in the real world!

[*] This is exactly what I did a while ago; said box fits comfortably under the TV and does a good job of running Windows 10 with Kodi, Steam, iTunes and a few other bits and pieces. Plus, it's all controllable from my phone - including the TV itself!


A Logic Named Joe: The 1946 sci-fi short that nailed modern tech


A million monkeys is a bit unfair...

These people were actively thinking about the future, rather than just hammering random keys. Though admittedly, it can sometimes be hard to tell the difference ;)

There's plenty of other interesting nuggets out there, too.

EE Doc Smith produced some spectacular space-opera cheese; much of this was the cliche "hero saving heroine from Certain Doom with the power of Science", but his Lensman series included some interesting concepts, and his exploration of how to handle complex space battles was cited as a key inspiration for the US military's development of command-centre capabilities in World War 2.

Robert Heinlein produced some equally interesting stuff - the military concepts and tactics in Starship Troopers are well thought out (and the way these were ignored by the film is a major reason why I despise it). Along the way, he also invented things like waldos (named after his story) and the water bed; his story was actually used as an example of prior art when someone tried to patent the concept!

Keith Laumer is much less well known, but produced some interesting concepts, especially in his Retief series, where a diplomat wanders the cosmos, cleaning up after his incompetent bureaucratic superiors. Admittedly, it's hard at this distance to determine how much was original and how much was drawn from other sources, but he dabbled with concepts such as virtual reality, remote-controlled robotic bodies and cloning. It's possible at least some of this was driven by the fact that he suffered a stroke which restricted his mobility.

There's many more out there - for instance, the British government ignored Arthur C Clarke's ideas about geo-stationary satellites.

Sadly, one area where the Golden Age of sci-fi seemed quite weak was around computing science (though again, EE Doc Smith did come up with the concept of "robot controlled" spaceships as the first line in massed assaults). I suspect this was down to editors/publishers not being comfortable with the concept (and/or assuming the reader wouldn't be interested); Science was there to be controlled, not self-governing!


DARPA to geeks: Weaponize your toasters … for America!


I'm mildly surprised...

That no-one's mentioned the Atomic Toaster from MDK 2 yet!



You've seen things people wouldn't believe – so tell us your programming horrors


Bad code? Don't talk to me about bad code...

I spend a lot of time trying to fix things with a codebase which dates back over 15 years and has been hacked on by dozens (if not hundreds) of people with highly varying levels of knowledge and experience.

The bit of code I'm looking at *today* is a prime example: it's meant to deal with account cancellations. How does it do this, you ask? Well, it runs a query to pull back every account with a cancellation date set *regardless of whether the date is in the future or not*, and then performs a pass in the code to filter this down to the customers we're actually interested in. Because everyone knows databases are bad at applying date and primary-key constraints to queries.
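For illustration, here's a minimal sketch of the difference - the table and column names are invented, and I'm using an in-memory SQLite database as a stand-in, not the actual codebase:

```python
import sqlite3
from datetime import date

# Hypothetical schema - table/column names invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, cancel_date TEXT)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [
    (1, "2015-01-01"),  # cancelled in the past - the one we care about
    (2, "2099-01-01"),  # cancellation date set, but still in the future
    (3, None),          # no cancellation at all
])

# The approach described above: pull back *every* account with a
# cancellation date set, then filter in application code.
rows = conn.execute(
    "SELECT id, cancel_date FROM accounts WHERE cancel_date IS NOT NULL"
).fetchall()
cancelled_bad = [r[0] for r in rows if r[1] <= date.today().isoformat()]

# Letting the database apply the date constraint instead: it can use an
# index on cancel_date and never ships the irrelevant rows back at all.
cancelled_good = [r[0] for r in conn.execute(
    "SELECT id FROM accounts"
    " WHERE cancel_date IS NOT NULL AND cancel_date <= date('now')"
)]
```

Both produce the same list, but the second version does the filtering where it belongs - in the database.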

Then there's the code which used a switch statement to round a timestamp to the nearest 15 minutes. Y'know, instead of using the modulus operator.
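For the curious, a sketch of the modulus approach (my own code, not the original):

```python
QUARTER = 15 * 60  # fifteen minutes, in seconds

def round_to_quarter(ts: int) -> int:
    """Round a Unix timestamp to the *nearest* 15-minute boundary."""
    # Add half an interval, then truncate - no giant switch statement needed.
    return (ts + QUARTER // 2) // QUARTER * QUARTER

def floor_to_quarter(ts: int) -> int:
    """Truncate a Unix timestamp down to the previous 15-minute boundary."""
    return ts - ts % QUARTER
```

e.g. `round_to_quarter(449)` gives 0, while `round_to_quarter(450)` rounds up to 900.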

Or the code which used the "last modified" timestamp on a file to determine the next polling period, rather than using the "YYYYMMDD-HHIISS" metadata embedded in the filename - and the two could differ significantly as the process could take over an hour to run. Though to be (un)fair, this same code also mandated a two-hour overlap between polling periods, because who doesn't love reprocessing data?

Or the code which compared an array to itself and surprisingly always got a match!

Or the web-application page which was showing configurable options which should only be displayed to certain users. Aside from adding around 100,000 extra items to the document's DOM - each with active JavaScript code registered against it - this also added several megabytes to the overall page size. Entertainingly, said page was the default landing page, so fixing this issue sliced over 50GB of data per day off the internal network.

And the list goes on...

I'll be the first to admit that I've written some bad code in the past, and newer code in the system is (generally) of a higher quality. But even so!


From Zero to hero: Why mini 'puter Oberon should grab Pi's crown


Re: So many strawmen, so little time...

Fair point - I forgot about the RAM upgrade on the Pi2b - I'm still running RaspBMC on a 512MB B+, as it Just Works :)


Commentee comment - could the author miss the point any more widely?

"The culture of computing for several decades has been C and Unix or Unix-like OSes"

Off-hand, I can think of a lot of operating systems which haven't fallen into these categories. The Japanese Tron OS, BeOS, RiscOS, Amiga OS, QNX, Warp, VMS, MS-DOS, Palm OS, etc. Some may have been written in C and some may have been a bit *nixy, but not at the same time.

They just haven't caught on in the same way as *nix systems. To me, a big part of the reason for this is that Unix came from a mainframe/multi-user/batch-processing background, and therefore had a head start when it came to modern "networked server" paradigms (e.g. LAMP), where a given machine may be running dozens if not hundreds of tasks in parallel. And, y'know, the whole "free as in speech and occasionally beer" thing for OSs such as Linux and BSD; combined with the dropping cost of hardware, this led to a huge takeup of *nix systems by amateur enthusiasts, which then fed back into the workplace.

Other systems - including Oberon - generally came from a consumer or real-time/single-user perspective, and weren't able to adapt. Windows is a notable exception, though it's telling that Microsoft accomplished this by essentially ditching their old codebase and switching over to their multi-process/multi-user New Technology system - which itself was built by engineers from DEC, who had previously worked on VMS, a server-orientated competitor to *nix...

"The IT industry assumes that operating systems have to be written in C to work -- wrong -- and must by nature be big and complex -- wrong."

I don't think anyone is claiming that an OS absolutely has to be written in C [barring the odd flareup of flamewars on places like Slashdot]; it's just that the most popular operating systems have been written in it.

As to whether or not an OS should be big and complex: that's a full-blown topic all by itself. Modern hardware is so much more complex than hardware from even just a decade ago, and we expect it to do far more: more data, more threads, more peripherals, more displays, higher throughputs, more parallel processes, virtualised hardware, etc - and we expect all this to happen flawlessly and reliably on low-cost, commodity hardware. Handling everything that can go wrong - from dropped network packets to processor stalls - is complex and needs lots of defensive code.

It's also worth noting that there have been many efforts to go down the micro-kernel route - QNX and GNU Hurd being two prime examples - and the latter shows how "theoretically superior" concepts don't always come out as expected in the real world.

"But it should be something simple, clean, modern, written in a single language from the bottom of the OS stack to the top -- and that language should not be C or any relative or derivative of C, because C is old, outmoded and there are better tools: easier, safer, more powerful, more capable."

I'd love to hear suggestions on what should replace it. It sounds like you're rejecting things like Java and C# (and hence, by extension, things like the Android runtime).

Other than these, the last real attempt to do this was BeOS, and this failed: partly due to allegedly dodgy behaviour from a certain industry giant, partly because they targeted the consumer market, and partly because they couldn't get a critical mass of applications and developers.

"We should start over, using the lessons we have learned. We should give kids something small, fast, simple, clean, efficient. Not piles of kludge layered on top of a late-1960s hack."

Perhaps the biggest lesson to learn is that reinventing the wheel is expensive, time consuming and generally pointless. Most if not all of the technical lessons we have learned are already encapsulated in the current popular operating systems - they've survived and grown because they've evolved and rearchitected themselves along the way. Both Windows NT and Linux have moved towards a "hybrid" kernel design - not quite microkernel, but not entirely monolithic. They handle a wide range of physical hardware from a vast range of manufacturers - CPUs, network/audio/video/etc. They handle as many real-world issues (packet drops, security attacks, parity errors, etc) as they can. There's literally thousands of man-hours which have been ploughed into making them as robust as possible.

Dismissing all of that as "hacks" is simply foolish. I'm reminded of the article by Joel Spolsky, written back when Mozilla decided to reinvent the wheel and reimplement Netscape Navigator from scratch, and in doing so essentially conceded the browser wars to Microsoft. http://www.joelonsoftware.com/articles/fog0000000069.html

"No, we should not be teaching children with "real world" tools. That is for job training. Education is not job training, and vice versa. You don't teach schoolkids woodwork with chainsaws and 100m tall trees"

Oddly, to my mind, that's exactly what you've proposed. In fact, you're essentially expecting them to first assemble the chainsaw before firing it up. And therein lies the thing which this article seems to have misunderstood; it's about fifteen years out of date. The computer as a singular device has long since stopped being the primary thing people need to learn about; these days, it's all about what you can plug into it (or what you can plug it into), whether that's a camera, a network, a mechanical device, a sensor or a coffee machine. To do this, you need a development environment and tools (e.g. an IDE and support libraries), and that's precisely what things like the Pi - and the Linux ecosystem - offer.

So no, we shouldn't be pointing schoolkids at a tree and passing them the parts to a chainsaw. We should be giving them some planks of wood, a saw, some nails and a hammer and telling them to build a birdhouse based upon an existing template. Said template may have been sketched out in the sixties and look a bit crap, but it's tried and tested and the children are free to innovate and reinterpret it - maybe they can use a 3D printer to give it a tiled-roof look, or a CNC milling device to etch the face of their mum on the side...


So many strawmen, so little time...

"The Pi's strength is its cheapness and the simplicity of its hardware, but at heart, software-wise, it's a PC... <rant about ARM vs x86>"

This is an odd complaint. At heart, the Pi is a mobile-phone chipset married to a low-end ARM chip, and it will run whatever OS is provided. It only takes a few seconds of looking at the official website (https://www.raspberrypi.org/downloads/) to see that there's a number of "officially approved" OS builds available for it, ranging from various flavours of Linux to Windows 10 /and/ RISC OS. And it doesn't take more than a few seconds to find ports of FreeBSD, Android and even more obscure OSs such as Haiku.

It's also worth noting that the Pi isn't bundled with an OS by default, which means that people are actually choosing to run Linux on it - as indeed, are many other "non-PC" devices, especially in the IoT landscape. After all, it's free and there's lots of existing dev tools and support libraries.

"There were some missed opportunities in creating the Raspberry Pi. The Foundation could have brought harmony and a de facto standard for firmware on ARM hardware"

The Pi was never intended to be a high-volume device. Instead, it was intended to be a relatively low-volume educational device, and it wasn't clear until after it had launched how popular it would become. Setting industry standards was never part of the Foundation's remit.

Also, the Pi had only sold 5 million units as of February this year. Even if we assume that volumes have since managed to double to 10 million, that's a drop in the bucket compared to the "billions" of other ARM-based devices which the article itself notes have been sold in the same timeframe. So the Pi is hardly in a commanding market position!

Finally: as the article itself comments, the Pi deliberately sidestepped the firmware issue. What it doesn't mention is that this was for several pragmatic reasons - the impact on manufacturing costs being the main one. Because, once again, it was intended to be a low-cost, low-volume educational device.

"Failing that, the Foundation could have bundled RISC OS with it"

It's available on the website for free, and there was a fair amount of excitement/publicity when the Pi first launched about the fact that RISC OS was available. Which suggests that, as fun as tinkering with obscure OSs can be, people actually wanted to use an OS which has lots of existing tools and libraries available...

"Pi project founder Eben Upton fondly recalls his first machines, a BBC Micro and an Amiga 600. A kid could fathom those; I did, with my ZX Spectrum."

Ah, the humble Speccy - the grey +2 was my introduction to the wonderful world of computing. And in truth, I think it's a lot easier to learn how to use a computer these days. The 8-bit machines did offer a BASIC prompt on startup, but there was generally little or no support structure for people other than the official manual, whatever the local library had in stock and the odd magazine type-in (which quickly died off as the commercial world moved towards the use of machine code). These days, you can use the internet to search for documentation/prior examples, or post queries to somewhere like Stack Overflow.

I'd also argue that it became significantly more difficult to learn how to code when the 16-bit era landed. You no longer had BASIC bundled with the machine, and commercial C/Pascal compilers were relatively rare, slow and usually badly documented. So you had to either learn assembly or pick up a third-party program such as AMOS.

Then too, if your code crashed or went into an infinite loop back in the 8/16-bit days, you generally crashed the entire computer and lost all your hard work in the process. And let's not go into the time-cost of backing up to tape or floppy disk - especially the latter, since most home coders used repurposed magazine cover disks with distinctly variable levels of quality control...

"Twenty-first century Unix is too big, too weird, too full of arcane 1960s strangeness."

"Conventional wisdom is that this complexity is an unavoidable consequence of modern software's age and maturity, and that it's worth it. You just ignore the stuff you don't need."

The ZX Spectrum was basically a 16K ROM bolted to 16/48K of RAM, a 3.5MHz Z80 CPU, and a custom ULA which did some magic to reduce component counts (and led to the infamous colour-clash issues).

To take the current "high-end" Pi, the Pi 2 features 1GB of RAM, a multi-core processor, an OpenGL-capable GPU, an audio chip, a DMA controller (and an MMU), a mass media controller, a serial controller and a few other things for good measure. All essentially built into the one chip. The complexity of modern software goes hand in hand with the fact that the hardware is so much more capable. And since you can't chisel bits of silicon off the CPU, you pretty much have to ignore the stuff you don't need...

"Which brings me to the other cheap little educational computer you've never heard of: the OberonStation ... No, it won't do the 1,001 things Linux will, but it's no toy ... But what it shows is that complete, working, usable operating systems – not single-trick ponies – can be built, essentially single-handed, and can be understandable by ordinary mortals"

Hmm. An effectively proprietary OS, no USB ports, no soundcard, no network capabilities, PS/2 keyboard/mouse ports and VGA-only output. That sounds like a toy to me!

From a quick glance at the manual, Oberon was a vanity/sabbatical project built by two people in the eighties. I.e. it's pretty much idiosyncratic by definition and was designed back before the concept of networked computers/IoT became mainstream. Also, the manual states that the system "can be understood by a reasonably determined person", which is definitely a step beyond being understandable by an "ordinary mortal"! So I really can't see any justification for using it these days. Especially since any OS-level skills/knowledge you pick up can't be reused on other devices.

So no, Oberon shouldn't grab the Pi's crown. If there's even a crown to grab. Which there probably isn't, since there's so many competitors, starting with the millions of Arduino devices already in circulation. The fact that the Pi Zero is so low cost may well cause it to grab some more "makers" market share from Arduino and others of the same ilk, but there's still plenty of choice out there!


Microsoft working hard to unify its code base, all the way down to the IoT


The right tool for the right job...

You don't use a sledgehammer to put a screw into a piece of wood. Unless, y'know, it's right next to you and the screwdriver is still in the toolbox...

It sounds like we're going back to the "write once, run anywhere" ethos that Java once enthused about, and it's likely to encounter the same issues that Java did: the levels of abstraction needed to get the same code running on devices A and B mean that you need more physical storage, more run-time memory, more processing power, and more electrickery to keep things ticking over. And for the IoT ecology, all of these - especially the electricity - are generally in short supply. It's the age old "cheap, powerful, efficient: pick two" dilemma, and in a commodity market, cheapness is generally mandatory.

Also, there's a question about what's going to be done with all the data spewing from these devices. Is someone really going to gather all the stats needed to monitor a fridge compressor - and even if they do, are they going to be able to put together a realtime monitoring mechanism *and* have some way of exposing it securely for customers to access? That sort of thing costs time and money and unless there's some sort of high-value support contract in place, there's little or no reason to provide it. Especially since in a few years time, there'll probably be a new model of the compressor and the entire thing will have to start again...

(To be fair, there is a case to be made for having a widget sat atop the freezer that monitors for pre-defined, short-term issues - a change in the compressor's RPM or power usage, a prolonged change in temperature, etc - and punts out an alert via email to the butcher and/or the company which provides the support contract for the freezer. But that's very different to the kind of real-time monitoring/tuning/statistical analysis that Microsoft are talking about, and requires far fewer resources to implement)


Doctor Who's good/bad duality, war futility tale in The Zygon Inversion fails to fizz


Re: We've been here before...

1) Maybe - looking back now and reading the wikipedia summary, I guess it can be taken either way. But even if that is the case, it still feels unethical - the Doctor is basically refusing to take "no" for an answer and forcibly wiping people's memories until they agree with him!

2) True, but that doesn't address the issues which led to the splinter group becoming terrorists.


We've been here before...

I'm starting to regret being so hard on RTD back in the day, as this felt fairly similar to some of the stuff he turned out during his reign. The Doctor bumbled around without actually achieving anything, there were some heavily telegraphed "plot twists", and lots of people died because the Doctor was faffing around. Then too, the entire ending hinged on a macguffin/Deus Ex Machina, and the story fizzled out with an implicitly contradictory message, a plot hole large enough to migrate the entire Zygon race through and nothing done to address the consequences of the various events (e.g. lots of dead people) [*]...

On a brighter note, the dramatic speech actually was quite dramatic.


1) The cease-fire has failed /fifteen/ times, and given that Kate doesn't look to have aged drastically, this has happened within the space of no more than a couple of years. I.e. things keep breaking down to the point of a full-blown MAD scenario within 3-6 months. Surely that's a sign that the peace treaty is a complete failure?

2) As much fun as stealing the plotline from Eternal Sunshine of the Spotless Mind must have been, the memory-wipe only affected the people in the room. What about the millions of Zygons outside the room and the unknown number of humans who knew that the uprising had occurred? Is there to be no justice for the people affected by the atrocities carried out by the splinter group? If nothing else, the Zygons are going to have to choose some new leaders...

3) Similarly, even if the Doctor did manage to magically erase the memories of everyone on the planet, what about all the people who died - all their friends, family, medical and legal records, etc. If the Doctor is prepared to go back and wipe out people's memories of their loved ones to artificially maintain a demonstrably unsustainable peace, he's a much bigger monster than anyone else could ever be!

4) Why did the Doctor keep clumsily asking if Osgood was human or Zygon? Of all the entities in the universe, he should be the one most aware of the power of an anonymous symbol (e.g. such as a question mark...). It would have made more sense for Clara or possibly even Kate to ask that question - Kate especially had good reason to demand an answer!

5) And since someone will no doubt spark up with a "you don't have to watch it" comment: I've actually enjoyed some of the episodes this season; it does feel like there's an effort being made to steer things towards a more interesting path. And with some fairly rare exceptions, there isn't exactly a huge amount of British sci-fi to pick from!


After Burner: Sega’s jet-fighting, puke-inducing arcade marvel


Re: The music in this game was awesome

The soundtrack was included on a Your Sinclair covertape (http://www.ysrnry.co.uk/ys36.htm) and I have fond memories of listening to those tunes while playing various games on my humble speccy.

It's also worth noting that Afterburner is part of a lineage at Sega which essentially started with Space Harrier (which also offered a deluxe seated edition[*]) and continued with the Afterburner/G-Loc games in the arcade, before moving onto the home consoles in the shape of the Panzer Dragoon series and going out on a high note with Rez. And the person who designed Rez (Tetsuya Mizuguchi) then went on to create Child of Eden, though I'd personally say Rez is the better of the two...

[*] I've got memories of a trip to Blackpool as a young'un, and I could swear that I saw my cousin playing Space Harrier while perched on some funky mechanised fighter-pilot style seat, but the only images I can find for the SH seated version show a fairly boring wooden all-in-one cabinet...


To save mobile web, we must destroy JavaScript, HTML and CSS


To save the village...

We first had to destroy it.

Let's be honest here (as several people already have been): yes, there's a lot of JavaScript and HTML cruft out there, with varying degrees of effectiveness and efficiency. And yes, optimising it would reduce CPU usage and maybe improve battery life a bit. But that's not the main problem for mobile devices - or indeed deskbound computers.

Instead, it's all about the network traffic - both the size of the data and the lookups/translations required to determine the route to said data. And while HTML/JS contribute to the size of the data, I'd be willing to bet that for 90% of the websites out there, the binary data (i.e. images) far outweighs the size of the code.

F'instance, on this very page... if I download it, there's about 1.35MB of data. 34KB of this is the page/content. There's another 180KB for jQuery and another 85KB of CSS.

There's then around 500KB of what looks to be advertising-related JS and a further 1280KB of data spread across some 120 images. And that all needs to be cached, decompressed and generally tinkered with to get the page rendered.

And that's not going to change, no matter what form the code wrapped around it takes.


Google Chromecast 2015: Puck-on-a-string fun ... why not, for £30?


"Phone apps masquerading as remote controls"

Eh? The remote app for my old WD TV Live worked pretty well (much better than the WD TV Live's SMB mounting, but that's a different story) even on my old Samsung S3.

Meanwhile, my LG G4 (and the G3 before it) works fairly well as a basic TV remote, and does a good job with both the HTPC (as a KVM), Kodi/Xbmc - and potentially also the Xbox 360, though I've never actually tried using it for that!

I can also use it to VNC into my desktop machine upstairs, and it also does a good job of acting as a remote control for the iTunes install sat on the same machine; Retune even lets me tell iTunes to stream from the desktop down to Kodi so I can have good tunes and psychedelic visuals running whenever I'm downstairs.

Overall, both my original TV remote and the Logitech Harmony have been gathering dust for a wee bit now...


Hide the HUD, say boffins, they're bad for driver safety


Re: Wrong question

Having just come back from a 2,500 mile drive to Austria and back[*]... I found cruise control to be the best thing since sliced bread. However, the long roads do seem to encourage some bad driving practices on the continent; on the dual carriageways, people tend to overtake with their cruise control set to just a few kph faster than the speed you're driving at. So if you're coming up to a slower-moving vehicle, you either have to brake or rev your engine to nip out before the cruise-controller blocks you in...

Anyhow, back to HUDs, and it's the same as anything else (e.g. smart-phone interfaces): it'll take time to evolve something which offers relevant information in a non-obtrusive way. Simple shapes/icons, use of colour-coding, etc. The article's point about fighter-plane HUDs is a good one; not only do the military spend lots of money on trying to make the HUDs effective, but the pilots themselves are heavily trained to make best use of them. Something which can't be guaranteed when it comes to Joe Bloggs in his company BMW...

In fact, I suspect the main issue will be that car manufacturers will have to downplay the expectations of people who've seen the heavily contrived VR/HUD displays in things like Minority Report and Iron Man. Slapping something like those onto someone's windscreen is pretty much a guaranteed recipe for disaster...

[*] And the worst bit of this journey? It wasn't the French potholes or the German trucks. It was the M1 and M25, thanks in no small part to the huge swathes of 50mph semi-permanent roadwork zones and the enforced slowdowns for accidents/closed lanes/temporary roadworks. Especially since at least two of the latter proved to not exist at all! And that brings up another point about HUDs and "smart" roadways: information needs to be both relevant and timely...


LG slaps SIX CORE Snapdragon 808 in LEATHER G4 dog&bone – not overheaty 810


As a G3 owner...

I'm not sure this is worth an upgrade!

About the only thing which stands out on the list is the camera - though it'll be interesting to see how well this performs, given that image/video processing is one of the few things out there which can greatly benefit from speedy multi-core CPUs...


Oxford chaps solve problem in 1982 Sinclair Spectrum manual



obDisclaimer: I've known Matt for years, and attended a talk he gave in Sheffield about this very performance, which I recorded - https://www.youtube.com/watch?v=r1a3JYp-VFs

Said recording covers most of the points above, but to summarise:

1) It's not some sort of publicity stunt. It was just a bit of fun for his local museum, who were putting on a "Geek is good" season - and they approached Matt with a suggestion for hooking up some Speccies and BBC Micros together.

2) Memory limitations. The symphony is over an hour long, so can't be crammed into 48k of memory, especially in BASIC.

3) Hardware. Unsurprisingly, getting hold of lots of *working* Spectrums is pretty tricky these days. Matt had to search high and low to find enough kit for this - and even then, several failed to work on the day.

4) Raspberry Pi. Each model of Speccy has slightly different timings, so the Pi was used to keep them in sync; the code for the music ran locally on each Spectrum, rather than using them as dumb terminals.

In the end, it was a bit of fun for a museum display. If anyone wants to go one better, then grab a 48k Spectrum (or emulated equivalent thereof) and get tinkering!


Doctor Who's tangerine dream and Clara's death wish in Last Christmas


Why are people quoting Inception...

When it was a fairly blatant rip-off of eXistenZ with a bit of Half-Life 2's headcrabs thrown in. Or am I in the wrong reality again?

As story-mashups go, it wasn't too bad, though it's perhaps telling that no-one in the family (save myself) could be bothered watching it on Xmas day; instead, we watched it on iPlayer on Boxing day...


Goes like the blazes: Amazon Fire HDX 8.9 late 2014 edition


Wake me up...

When someone produces a decent/sensibly priced 12" tablet with a 4:3 viewing ratio, so I can read magazines and web-browse at a sensible viewing scale - I'm getting tired of poking and prodding the screen to try and hit the tiny links on my company's mobile-unfriendly Outlook web client, even on my LG G3's 5.5 inch screen!

All these variations on a 16:9 7"/9" tablet are missing the point - not only is the market glutted with the beasties, but they're not really big enough to offer a significant advantage over the many 5"+ phones which are now available...


Post-Microsoft, post-PC programming: The portable REVOLUTION


Re: What tosh

Round our way, the vast majority of developers use Macs... because our estate is basically LAMP/LAMJava, so it's possible to do most things "natively" in MacOS without having to install some variant of Linux that the IT team won't support and that doesn't quite work with the systems forc^H given to us by the corporate mothership. And as with most Apple stuff, the hardware is pretty nice; certainly more so than the Thinkpads that non-tekkies get.

Though it is quite fun watching them run around trying to find a VGA adapter dongle whenever they need to do a presentation - those things have become like hen's teeth around here...



"I should note that to really get any development done on an iPad you'll need a real keyboard"

We do have these things called laptops... maybe you should try one of them?


Trickle-down economics WORKS: SpaceShipTwo is a PRIME EXAMPLE


There's a difference between trickle-down technology and trickle-down economics, and this article seems to be confusing the two. It's also conflating "industrial" technological developments with "personal" technological developments, as well as the /purchase/ of luxury items versus the /creation/ of luxury items. Oh, and it's also completely omitted the role of government in technological developments.

Overall, if it was a piece of GCSE homework, I'd give it a C-. Anyhow, to justify my marking...

Trickle-down technology: yep, yesterday's high-end/luxury feature is today's mid-tier value-add and tomorrow's low-end commodity, whether it's something like car air-conditioning or a quad-core mobile phone with a HD-resolution screen. Though to counterpoint this, it's worth noting that companies often withhold technology trickle-down to avoid cannibalising the high-end market. This also means that the high-end market has higher profit margins: same equipment, different configuration, higher price. Microsoft and Adobe are key examples of this in the software world; AMD and Intel offer similar examples when it comes to CPU frequency/clock-speed locking.

And whichever way you cut it, TDT is absolutely nothing to do with rich hobbyist tinkerers, except for the fact that they're part of the initial "rich" group who can afford to buy things before they're commoditised.

The other point is that the industries mentioned in the article (cars, rockets, IT, etc) have pretty much all developed out of government investment - which in turn has been mostly driven by war - WW1, WW2, the cold war, etc. In fact, the technological underpinnings which have allowed Branson and others (e.g. John Carmack) to try and progress space travel can be traced directly back to when the German government took a bunch of amateurs tinkering with rockets and threw lots of money at them; when WW2 ended, the same people ended up working for the USA and USSR governments.

Trickle-down economics. In the simplest form, the idea is that giving tax breaks to the rich will improve the overall economy. However, there's a few flaws in this, the biggest of which is the assumption that the rich will go out and spend the extra money. Given that the rich (pretty much by definition) already have everything money can buy, they're far more likely to invest the money in abstracted, minimal-tax financial schemes - and these schemes are likely to be offshore and hence deliver little or no benefit to the local economy.

Similarly, if the rich do spend more money, it's likely to be on "luxury" brands and services, and as highlighted above, a significant percentage of the cost for these items is likely to be for the "brand" rather than the resources needed to produce it. To use Apple as an example (as it's cited in the article), a 64GB iPod Touch currently costs £250 at PC World whereas a 32GB iPod Touch is £180. That's an extra £70 for just 32GB of flash memory, which can currently be had elsewhere for £10-£15, so (to grossly simplify things) Apple is getting £60 of additional profit from the "luxury" model. And that profit's going straight back to Apple at the high end of the economy; the low end of the economy isn't seeing a single penny of it.
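(For the pedants: a quick back-of-envelope sketch of that margin estimate. The two retail prices are as quoted above; the £12.50 flash cost is just an assumed midpoint of the £10-£15 range, not a sourced figure.)

```python
# Rough margin on Apple's storage upgrade. Retail prices as quoted in the
# post; the flash cost is an assumed midpoint of the £10-£15 range.
price_64gb = 250.0   # 64GB iPod Touch at PC World
price_32gb = 180.0   # 32GB iPod Touch
flash_cost = 12.50   # assumed cost of an extra 32GB of flash

premium = price_64gb - price_32gb      # extra paid by the buyer: £70
extra_profit = premium - flash_cost    # crude estimate: roughly £60

print(f"Upgrade premium: £{premium:.0f}, approx extra profit: £{extra_profit:.2f}")
```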

Industrial technologies vs "personal" technologies. In brief, a personal technology is one where the technology can be mass-produced, it offers a significant improvement over "muscle power" and the ongoing cost of use is low enough for an individual to fund it. Bicycles, cars, computers, mobile phones: each one in turn offered a quantum leap forward in terms of travelling times, carrying capacity, processing capacity and communication capabilities - and the ongoing costs for each are relatively low, thanks in no small part to the fact that (to a greater or lesser degree) they're built on infrastructure derived (again!) in no small part from government investment.

Industrial developments however, carry too high a cost for most individuals to afford them or are simply impractical for the majority of individual uses, so tend to be used for mass-transit. And guess what: these are (again!!) usually funded or at least partially subsidised by the government - aviation fuel and train infrastructure being two good examples.

In fact, to keep with the aviation example: it costs around £7,000 to get certified on a light plane. Then, there's also the cost of maintaining the plane and buying the fuel, not to mention storage. Even hiring a plane is expensive; the cheapest I found at a glance online was £350 an hour; conversely, a low-end hatchback car can be hired for 3 days for just £40 - or effectively around £0.55 per hour!
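(A minimal sketch of that hourly-cost comparison, using only the hire prices quoted above:)

```python
# Effective hourly hire costs, using the prices quoted in the post.
plane_per_hour = 350.0          # cheapest light-plane hire found online
car_cost, car_days = 40.0, 3    # low-end hatchback, 3-day hire
car_per_hour = car_cost / (car_days * 24)

# Works out at roughly £0.55/hour for the car - i.e. the plane costs
# several hundred times more per hour.
print(f"Car: £{car_per_hour:.2f}/h, plane/car ratio: {plane_per_hour / car_per_hour:.0f}x")
```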

It's therefore unsurprising that a lot of pilots get their certification through a stint in the military... which is funded (again!!!) by the government.

It's not unreasonable to expect that even when "commoditised", space-flight will prove to have a similar cost ratio when compared to aviation. Whichever way you cut it, climbing out of a gravity well requires a lot of energy, puts an incredible strain on components during flight and requires a lot more technology (e.g. vacuum seals, etc). And as for the cost of getting certification, there's probably going to be at least one extra zero tacked onto that £7,000!

So, to summarise: Branson's investment is a good thing: he's not wasting his cash on "luxury" items and there's likely to be trickle-down technology. But space-flight will never be a commodity technology, nor will it ever be something that the average individual can personally own, and the trickle-down effect is likely to take years - if not decades - to manifest. And the vast majority of the money which comes out of developing space tourism will be going straight back into the high end of the economy. And it's all only possible thanks to that bogeyman of Republicans and Conservatives: big government. And any further significant developments (e.g. space elevators) will almost certainly have to be backed by big government, in much the same way as the Chunnel and other similar infrastructure projects have been.

To offer a final counterpoint to the article: if you want to boost the economy, then a better approach would be to increase spending power at the bottom end, where it's much more likely to be spent on physical and/or low-margin goods and services which need (relatively speaking) much higher levels of resource to produce - and in far higher volumes, to boot. For instance, if you give a multi-millionaire an extra £250,000, he might go out and buy a single high-end car. Give ten non-millionaires £25,000 apiece, and they'll go out and buy ten mid-level cars. And aside from the ten-fold increase in resources needed to produce those ten cars, where a luxury car is likely to involve significant levels of imported resources (ranging from engines up to the assembly of the entire car), a much higher percentage of the resources for a mid-range car will have been drawn from the local economy - as will the resources needed to maintain those ten cars (e.g. garages, mechanics, etc).


In fact, there's evidence to suggest that giving people a guaranteed basic income actually has a major benefit to the economy as a whole; not only does it simplify administration and thereby /reduce/ government, but it also has significant social benefits: crime drops, child nutrition and school attendance improves, people save more and produce more startups. In fact, that's pretty much the key premise behind the article - but instead of a small handful of Bransons and a small number of indirect long-term economic benefits, we get major direct ongoing economic benefits, hundreds - if not thousands - of entrepreneurs *and* Branson will still be free to tinker with spaceships - in fact, he may even have more cash to do so, if tax revenues rise to the point where government can cut taxes.

Admittedly, the above is simplified and there's plenty of other factors to take into account. But hey... 'tis the end of the day.


Xperia Z3: Crikey, Sony – ANOTHER flagship phondleslab?


Re: Five Hundred and Forty Nine???

A quick glance dug up a sim-free price for the iPhone 6 of 539 quid, or just a tenner less - and that's for the low-end 16gb model.

Either way, there's not going to be many picking one up sim-free; most will pick one up on a contract. And interestingly, another quick glance at carphone warehouse shows that contracts for the iphone 6 are currently trending around a tenner (per month) higher than the equivalent contract for a Z3 (e.g. £43 vs £34.50 for a bare-bones 24-month contract with no up-front costs).

Then too, give it a few months and the Z3'll have come down in price - I wouldn't be surprised if it could be picked up for under £400 after Christmas...


Z3 and G3...

I picked up the LG G3 a month or so back - as tempting as the previews of the Z3 were, my contract was up, the S5 didn't look particularly inspiring and I didn't feel like waiting for the Z3 to come out, partly because I'm never keen on buying expensive/high-end kit during the initial launch - I much prefer waiting for the inevitable hardware/software issues to be debugged and patched out.

Speaking of which, the G3's now had 3 software patches, two of which specifically targeted performance and battery life (the third one doesn't seem to have done much other than tinker with the keyboard a wee bit). And even without enabling the "experimental" ART runtime, so far the battery life is pretty impressive; even with fairly heavy use (2-3 hours a day) as an ebook reader plus the usual online timewasters such as Facebook, I'm comfortably getting 2-3 days of usage on a single charge; it's easily triple what I was getting out of my (admittedly slightly aging) Galaxy S3.

... which is all a long-winded way of saying: were the comments in the article about the G3's battery life based on how well it performs now, or how badly it performed when first released? I know re-testing kit can eat up a huge amount of time, but it'd be nice to get a proper assessment of how the various flagship models compare to each other "now", rather than how things were several software revisions ago - especially since these days, software affects everything from battery life and call quality to camera performance and video quality!


Want to see the back of fossil fuels? Calm down, hippies. CAPITALISM has an answer


The problem with this article...

Is that we don't just use fossil fuels for energy. Aside from the obvious uses (plastics, fertilizer), any industry which uses chemicals (pharmaceuticals, cosmetics, etc.) needs them...

Cracking the energy issue is a start. But it's by no means the full picture.


Read The Gods of War for every tired cliche you never wanted to see in a sci fi book


It wasn't until I got to the end of the review...

That I realised that it wasn't one of John Ringo's books ;)

From the review, it does sound like one of the cookie-cutter military-scifi books that Jim Baen used to love. And tucked away on the Baen website (http://www.baenebooks.com/c-1-free-library.aspx) is a nice set of free novels from various authors - Baen was one of the first to realise the marketing potential of giving away older books for free, especially if they're part of a series.

Among many others, there's David Drake's Redliners (military leading civilians through a hell-planet's carnivorous jungle), a number of titles from David Weber's Honor series (space opera with epic quantities of ship battles) and even a few of John Ringo's titles.

Alas, the online archive seems to have shrunk over time. However, Baen also gave away CDs with many titles on, which are available - with Baen's knowledge - at http://baencd.thefifthimperium.com/. So you can still brush up on AI-tank battles (the Bolo series), or go to war with the Romans against an evil empire controlled by an intelligence sent back from the far future (Belisarius). Or you could even join some genetically modified bats and rats - armed with a Shakespearian data download - as they take on an insectoid army despite the worst efforts of some incredibly inept human commanders (Rats, Bats and Vats)...


Ofcom will not probe lesbian lizard snog in new Dr Who series



First: Dr Who isn't Sci-fi (and arguably hasn't been ever since RTD picked up his pen): it's fantasy with a bit of technobabble and the occasional[*] Deus-ex-machina thrown in.

Past there, the kiss scene was pretty blatantly crowbarred in[**], but to be fair, the entire episode was pretty much made up of heavy-handed, self-indulgent and distinctly clumsy set-pieces, most of which were intended to establish this series' overarching plot-thread rather than progress the story at hand.

So overall, I'd say there's far more worthy things to complain about[***] ;)

[*] Alright, more than occasional, especially if you throw in the way the sonic screwdriver gets used these days. I was trying to be generous...

[**] Given that the robots stopped moving instantly when you stopped breathing, the characters could have gulped a breath every 30 seconds and gotten away scot free without any issues at all...

[***] No, I wasn't impressed. And I am getting bored of footnotes, so I'll stop now ;)


Microsoft: Just what the world needs – a $25 Nokia dumbphone


46 hours playback, 32GB, USB rechargeable?

I'm actually tempted to pick one up as a pure MP3 player - there's not many out there which can claim to have that level of battery life...


NO TIME to read Facebook? Delegate the task to your FUTURE SELF


Interesting... though "nuggets" sounds a bit pants. Maybe they could call them... hmm... I dunno... bookmarks?


ZX Spectrum cassette player lost? There's an app for that

Thumb Down

"with a few notable exceptions"

I shouldn't feed the troll, but...

WOS has managed to get permission from approx. 250 publishers (including notable companies such as Gremlin, Firebird and Hewson), and around 800 individual developers. And more people grant permission every month, as even a cursory glance at the What's New page will show.

Then too, where the copyright owner has stated that they don't grant permission (e.g. Ultimate, Codemasters), WOS removes the software from their website.

It's not perfect - the original developer may not own the copyright on their games, and for some titles, it's far from clear who the copyright owner is. And there is an argument to be made that if permission hasn't been explicitly granted, the software shouldn't be offered for download at all.

But equally, WOS (or at least the volunteers behind it) is making a proactive effort to track down copyright owners and obtain permission - and they've achieved this for a significant percentage of the gamebase. Dismissing that as "a few notable exceptions" is rude at best and trolling at worst.


Retro-tech fan seeks cash for Commodore 64 clones


As other people have pointed out...

There's already a "hardware" C64 emulator, which was embedded in a Quickshot-styled joystick that could be hooked straight into your TV - the Direct-to-TV thing Jason mentioned.

There's also a PC which is built into a "breadbin" case - see http://www.popgive.com/2011/04/commodore-64-is-back.html for details.

(A friend has one; it looks quite funky but is also prone to overheating...)

Overall, this guy is basically reinventing the commodore-shaped wheel...


Are biofuels Europe's sh*ttiest idea ever?


Re: Yeah right

" It's the US that's gone massively for biofuels. They don't cause anyone to 'starve'. They've just dealt with the US surplus of maize."

Mmm. That's why there were riots in Mexico due to the rise in maize prices... which was due in part to crops being diverted for biofuel:


And don't forget the upcoming pig-apocalypse, as herds are thinned out due to rising feed prices:


In both of the above, there's lots of other factors, both human (e.g. speculators) and natural (e.g. droughts). And there's a lot that could be done to address wastage in other areas, both at the point of production and consumption. But fundamentally, using maize to produce biofuels means that there's less maize for other things...


China strikes blow for property rights, British move to collectivism


Wow. Is this article meant to be serious, or was it just a paid-for trolling piece?


A new draft of China's copyright law strengthens the rights of artists and writers who write anonymously - in other words, artists who create orphan works. "Many users have been avoiding payment by using works that are written anonymously or in pen name. The new draft will effectively end this practice,"


First: strictly speaking, I guess "anonymous writings" are orphan works - after all, you can't identify the creator. At the same time though, this was a deliberate choice by the creator, rather than being due to the work having lost its identifiers and/or having unclear ownership.

Equally, if the writer has chosen to be anonymous, then this pretty much means by definition that they didn't want to be associated with it or reimbursed for their efforts. Indeed, there may be legal and political reasons why they don't want to be identified as the creator of the work.

Beyond that, I'd point out that China is currently undergoing a lot of political change, and that the government is putting a lot of effort into managing and censoring discussion about this. The changes to the copyright law would seem to be part of this effort: barring the use of anonymous writings seriously limits the options for people to criticise the government and/or disseminate information.

Overall, trying to tie a law designed to stifle political debate into the argument about the reuse of IP seems more than a little disingenuous...


Latest PS3 hack hits Sony with massive migraine


This would have been interesting a few years ago...

Y'know, when the Cell processor was still a relatively low-cost option for parallel processing research.

But now, there's so many low-cost and/or multi-core processors out there, you're almost certainly better off going for something which isn't subject to the whims of corporate imperatives

(e.g. http://www.phoronix.com/scan.php?page=article&item=phoronix_effimass_cluster&num=1 - a 6-board, 12-core ARM cluster; each board has 1GB of RAM and the entire ensemble uses just 30 watts and could probably be tucked into a phat-PS3's case with room to spare...)


Raspberry Pi patch adds warranty-safe overclocking


Re: Why?

"Overclockers have always puzzled me. The manufacturer knows exactly which are the critical pathways in the CPU. They can test and appropriately speed-grade their chips by exercising these pathways. Intel turbo-mode is supported, meaning that Intel has tested your CPU at the highest turbo speed they support. You'll get correct results, as long as you stay within the thermal envelope."

This isn't necessarily true: the manufacturer may choose to sell "underclocked" parts, as it's cheaper to produce everything on a single process rather than having multiple production lines. Also, their stress-tests are (presumably) based on relatively low thermal tolerances and assume that the customer is using standard voltages and a cheap OEM heatsink, rather than one of the mega-fancy liquid-cooled, silver-plated, multi-fan, mega-finned uber-heatsinks that your average overclocker likes to bolt atop their CPU.

So there's often a fair amount of mileage to be had by going beyond the manufacturer's recommended specs...

And in answer to "why overclock a Pi": why not?


Game devs beg UK taxman: Can we pay 30% less?


Re: "How'd I do?"

"But what the hell - I agree with your tone. These buggers should focus on having a profitable business plan and not expect an already bankrupt system to make their lives any easier."

That's a bit simplistic. The key problem is that other countries - Canada and France, most notably - are already offering significant tax breaks, which has resulted in UK development being too expensive, which in turn leads to 1st/2nd party studios being shut down[*] and a brain-drain effect as developers are forced to leave the country. So establishing tax breaks in the UK is arguably just levelling the playing field.

(insert argument about protectionism here)

There's also other factors to consider: the economic benefits of the UK Film Tax Relief have been pretty impressive.



On a turnover of £3.4 billion, the core UK film industry directly contributed around £1.6 billion to UK GDP in 2009. This means that the core UK film industry contributed more than twice as much to GDP as the machine tools manufacturing industry [and] three times as much to the economy as the designer fashion sector.


without the UK Film Tax Relief in place, we estimate that its film production would be around 75% smaller, reducing UK GDP by around £1.4 billion a year and Exchequer revenues by about £400 million a year. Since the Film Tax Relief costs HM Treasury around £110 million a year, this means it generates about £13 in GDP for every £1 invested.


we estimate that around £1.9 billion of visitor spend a year might be attributable to UK films.

In 2009, this additional spending was estimated to be worth £950 million to UK GDP and £210 million to the Exchequer.


In other words, the Film Tax Relief has turned out to be a highly profitable investment for UK plc. And while the games industry arguably doesn't employ as many people and doesn't have quite the same potential for "visitor spend", there's a lot of potential for a higher ROI. For instance, Grand Theft Auto 4 (which, incidentally, was made in Scotland by Take Two Interactive) grossed nearly one billion pounds by itself - more than any of the individual Harry Potter films...
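(As a sanity check, that headline ratio does follow from the report's own figures - a quick sketch, using only the numbers quoted above:)

```python
# Sanity-check of the Film Tax Relief ROI claim, using the report's figures.
gdp_loss_without_relief = 1.4e9   # annual GDP reduction without the relief
relief_cost = 110e6               # annual cost to HM Treasury

roi = gdp_loss_without_relief / relief_cost
print(f"~£{roi:.0f} of GDP per £1 of relief")   # matches the ~£13 quoted
```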

[*] Sony Liverpool - nee Psygnosis - are a very recent and visible victim - though this is arguably at least partly because Sony decided to have them doing little else other than pumping out Wipeout sequels for the last decade




Toasted marshmallow

It's worth bearing in mind that the original Ghostbusters game (for the C64) was rushed out in 6 weeks [*] - not quite as bad as the infamously short dev time for Atari's ET game, but still pretty bad.

Even so, it was surprisingly fun to play, though the gameplay was somewhat unbalanced - the driving/ghost-hoovering element quickly became repetitive and once you'd earned enough cash, you just had to sit and wait for the EP levels to reach maximum and trigger the end-game: you couldn't even go back to the shop to upgrade your equipment.

(though for Spectrum gamers (and possibly others - I haven't checked), there was a glitch/easter egg which could save you some cash during the initial game: entering "0" as the car-type gave you a solid black rectangle which was cheaper than the rest of the options!)

It's also worth noting that a nice person produced a remake of the Ghostbusters game - http://www.classic-retro-games.com/GhostBusters_193.html

Unfortunately, they then suffered a HDD failure and lost all of the source-code, which killed dead any chance of further updates and improvements...

[*] Wikipedia claims 8 months; David Crane states six weeks - http://www.edge-online.com/features/making-ghostbusters . Take yer pick ;)


3D: 10% of LCD TVs in 2011, 25% in 2012


Re: 3D Popularity

"3D itself is more popular than 10%. According to the MPAA statistics, 21% of box office sales in 2010 were 3d. That's up 91% from 2009."

It's worth bearing in mind that cinemas charge more for 3D showings, so they're always going to generate more sales revenue than the equivalent number of 2D showings.

Also, while I don't think it was the case in 2010, there's been a recent trend for films to only be shown in 3D - Underworld 4 and Ghostrider 2 were both only available in 3D at all of the local cinemas. It'd be interesting to find out if the additional per-ticket revenue is offsetting the potential drop in cinemagoers - anecdotally, my friends and family have started to actively avoid going to 3D showings thanks to the high prices and lack of benefits!


Nintendo cuts full-year forecast – by BILLIONS


The question is...

How can a company which is infamous for making a profit on *every unit sold* (hardware and software) be making a loss? Where is all the money vanishing to?

It's not being pushed into new games - for all that Nintendo has a significant number of first/second party development houses on its books, the number of releases they produce is tiny. Wikipedia's "Wii game list" page indicates that they only published 13 games in 2011 - and several were from publishing agreements with third-party developers (Arita, Artoon, etc).

And from memory, their half-year results indicated that only 50% of their expected losses came from currency issues; the rest came from sales failing to meet expectations. And even then, they'd managed to write off a significant chunk of their losses via some tax-law jiggery-pokery.


Dizzy: the Ultimate Cartoon Adventure



I dunno if elReg is interested, but I strongly suspect Retrogamer would like to take a look. And some photos for the WorldOfSpectrum bunch wouldn't go amiss, either ;)


Beeb rescues old Who episodes


It's not the cost of storing the tapes...

But the cost of the tapes themselves. Remember, this was all back in the 1960s:


"When videotape was introduced into the BBC back in the '60s it was very expensive. The machine to replay it on was the price of a very expensive Rolls Royce, and the tape itself cost the price of a Mini."

Another article suggests the tapes cost around £200 apiece, which in today's money is somewhere around £3500!

It's therefore not too surprising that they were taping over older, "lower-value" recordings - after all, you couldn't just nip to Staples to grab a ten-pack of blank tapes!


US stealth bombers finally get nuke-nobbling super bomb


I wouldn't agree with that. It may not have had a huge direct impact on the German war effort (though it severely affected German agriculture), but it did have several important indirect effects:

1) It significantly boosted morale in the UK

2) It boosted the UK's standing among its allies (America, Russia)

3) It showed that precision-bombing attacks could be effective (rather than just carpet bombing everything in the area)

4) It paved the way for Barnes Wallis to produce his tallboy/grand-slam bombs (the bouncing bomb came about because he initially couldn't get anyone to sign up to his "earthquake bomb" strategy) - and these arguably had a far bigger impact on the war, being used to take out railways, bridges, V2 bomb factories and the V3 site.

(for more details, see http://en.wikipedia.org/wiki/Operation_Chastise )


EU recording copyright extension 'will cost €1bn'


@El Presidente

Y'know... posting a link to 72 parliamentary submissions by various special interest groups isn't really providing a clear example of how strong copyright benefits society and the economy. In fact, I'd suggest it's a breach of DeMyer's law (excessive quoting), which means you automatically lose the argument ;)

Beyond that, while I haven't read them all - I've no intention of performing your research for you - it's pretty clear that few - if any - are concerned with copyright extension. F'instance, the British Video Association is only concerned with the impact of allowing format shifting, and the British Beer and Pub Association (which is a consumer, not a creator) is only interested in performance royalty management.

"So, to recap, no /money/ involved yet the money/mouth imbalance prevails ?"

I'm not sure if you're deliberately missing the point (and you've clearly decided to ignore all the other points I've provided rebuttals for), but to repeat: I believe that people should have the opportunity to profit from their works *and* that there is greater cultural and economic benefit from sharing creative works. Given that my work is generally low-value, I've therefore decided to freely give it away, in the hope that other people will then build on it to create new works of increased economic and cultural value.

(and if by a miracle, I do someday create a high-value work, I'll be happy to release it to the public domain once I've had my opportunity to profit from it. And that could even generate more revenue for me - releasing older works for free has worked really well for the authors over at Baen/webscriptions.net)

Now, they might not - and in truth, I'd be surprised if more than 0.1% of what I create gets reused. But at least the opportunity is there, which is more than could be said if I'd kept it all locked away!

Is that really so hard to understand?


One more time ;)

Your "category 1" is "people who have a vested interest in making money from the work of others"

How does reducing copyright help this? The main abusers of copyright are the corporations who buy up copyrights and then sue other people for infringement, as per the Men at Work example. Reducing copyright terms reduces their ability to do this and gives individual creators more freedom to create new works.

Your "category 2" is "proponent of freetardism because they have grown up with the internet in their bedroom and, like, everything is free on the web, isn't it ?"

Given that I've explicitly declared that I believe creators should have the opportunity to profit from their work, how do I fall into this category? The issue is that a blanket extension of copyright terms affects everyone and only benefits a few. I pay for my media, through amazon, emusic, play.com and more: my concern is absolutely not about "freetardism".

"unsubstantiated c'n'p facts": which facts are these? The Men at Work debacle is clearly documented; the British Library's study is freely available for review; the PRS figures came directly from the PRS. If you can point me to an unsubstantiated claim, I'll be happy to dig out the details for it. The only thing that comes close is my stated belief that relaxed copyright laws produce greater benefit for society and creators - and even then, I've provided examples (Keep Calm and Carry On, Northern Soul) of where limited copyright (and/or the bypassing of copyright) has led to cultural and economic benefit, both for society and the original creator.

And I can easily dig out several more: Charles Dickens was able to undertake several very highly profitable tours of the USA, thanks to piracy of his books. The American industrial revolution, the 18th century European printing revolution, the current growth of the Chinese economy: all of these were only possible due to people refusing to obey industrial trade secret and copyright laws.

Internet rules: I haven't invoked Godwin's law, I haven't breached Poe's law, Rule 34 isn't in effect (though I'm sure I could find some photoshopped pictures of Cliff if you're really that interested); Skitt's, Scopie's, Danth's... in fact, with the possible exception of DeMyer's, I think I'm clear on all of those - and even then, I've provided references rather than block-pasting quotations. Are you sure you've been debating online long?

The websites both revolved around defending copyright while the work is still in copyright. My concern is about the duration of copyright. These are related but different things - to (ab)use the usual car analogy, I'm talking about the extended 5-year warranty while they're talking about the annual servicing. If you want to change my mind, point me to a study which shows that extended copyright terms are actually better for the economy and society as a whole, rather than a small group of super-rich artists and the corporations which support them.

As to having a dog in this fight: yes, I do. I believe that extended copyright terms (and the potential for abuse that comes with them) is bad for society and the economy as a whole, and I believe it's important enough to be worth fighting for and debating. And while I'm sorry that you've chosen to disbelieve me, I've been fully open on my role in the creative industry: I freely admit it's a small one, but it's still there and I've put my money where my mouth is, in the shape of declaring all of my works to be CC:NC and/or open source under the GNU public licence.


Time to stop feeding the troll...

Well, not just yet ;)

1) "Because Google say it is". So, the British Library is being influenced by Google when they release a study indicating that up to 40% of all copyrighted material is in an orphaned state? For what it's worth, their study was released in September 2010 - two months before the government decided to quote Google as a factor in deciding to review copyright law.

2) "You work in IT and draw a salary. To coin a phrase: you don;t have a dog in this fight."

Wow. So: only people who *only* make money from creative works have any say? That would seem to exclude... oh, about 95% of people who produce creative works. Talk about elitist...

Very few people make a full-time living from creative works - the PRS's own figures show that (as of the year 2000) over 50% of UK music artists made less than £100 per year from their works; less than 3% made what could be considered a living wage (i.e. over £25,000). (http://eprints.bournemouth.ac.uk/3704/1/Birkbeck_06_04_final.pdf)

(and there's also PRS fees (12.2%) and collection-agency fees (20% for royalties, up to 40% for further negotiations) to take into account. E.g. http://www.fidelitymusic.co.uk/index.php?option=com_content&view=article&id=78&Itemid=90)
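To put rough numbers on those fees, here's a back-of-the-envelope sketch. The rates (12.2% PRS admin, 20% collection-agency cut) come from the figures above; the assumption that the agency fee is applied after the PRS deduction is mine, purely for illustration:

```python
# Illustrative only: shows how stacked fees shrink a small royalty cheque.
# Rates taken from the figures quoted above; the order in which the fees
# are deducted is an assumption for the sake of the example.

def net_royalties(gross, prs_rate=0.122, agency_rate=0.20):
    """Return what the artist keeps after PRS admin and agency fees."""
    after_prs = gross * (1 - prs_rate)      # PRS takes its cut first
    return after_prs * (1 - agency_rate)    # then the collection agency

# An artist grossing £100/year (the median, per the PRS figures) keeps:
print(round(net_royalties(100.0), 2))  # 100 * 0.878 * 0.80 = 70.24
```

In other words, even before tax, roughly 30% of a typical artist's already-tiny royalty income can disappear into administration.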

So: do none of those people have a say, either? Guess it's up to you and Cliff to decide how copyright should be managed - after all, I'm sure Cliff is interested in enriching and developing our national heritage and culture, not his bank account.

3) Stop43.com is a UK-centric campaign to better protect the individual's rights. It says nothing about copyright duration: instead, it's focused on how to protect works while they're still copyrighted. Useplus is about a better photo-tagging system, which would be used to better track copyright claims. It says nothing about copyright duration; instead, it's focused on identifying the owner of a given work.

All told, I'm not entirely sure how those two websites are meant to leave me better informed...

(and arguably: a shorter copyright term would make it easier - and cheaper - to identify if a work is no longer copyrighted. Which would make it easier to identify and pay individuals for their still copyrighted work, thereby helping the cause of stop43...)

My entire point is that reducing copyright terms is good for society, good for the economy and good for creators as a whole. There will be fewer mega-rich people; instead, the money will be more evenly distributed across the entire "creative" economy.

Conversely, extending copyright terms results in a small number of rich people/corporations holding onto copyrighted materials for decades and using those copyrights to sue creative people: the Men at Work example I provided above is just one of the more egregious examples - as per above, they were sued by a subsidiary of Sony BMG, which had bought up the rights to a 70-year-old folk song *after* the original creator had died.

In truth, all they're doing in the long term is costing themselves - and everyone else - money: the only people who profit are the lawyers who handle the litigation; at best, you'll end up with something similar to the "patent warchest" system where major companies agree to not litigate against each other (screwing everyone else in the process)...


Hmm. Where to begin?

"No it does not. Copyright is either infringed or it is not. It's a binary thing, perception doesn't come into it. The method and ease/difficulty by which someone can be sued for infringement also remains unchanged."

In an ideal world, I'd agree with you. In the real world, it's sadly easy to come up with examples of people being inappropriately sued for infringement. Such as:

1) The Australian band Men At Work were successfully sued in 2010 for using a flute riff in their 1983 hit which sounded similar to one used in a folk song written in 1935. Better yet, the author of the folk song had died in 1988; the lawsuit was brought by a company which bought the rights to the song in 1990. So: they wrote their song 48 years after the original folk song was written, and were sued 27 years later by an industry middleman who had bought the song rights 20 years earlier.

2) Orion Pictures/James Cameron were sued by Harlan Ellison, as the 1984 film The Terminator allegedly plagiarised an episode of The Outer Limits (Soldier) he wrote in 1964. The two items take the "future soldiers travelling back through time" concept in radically different directions (Soldier does not involve robots, human extinction or the concept of using time travel as a weapon): Cameron is on record as calling Harlan a parasite.

(interestingly, T2 is far closer to the plot of Soldier, which makes me wonder if perhaps Cameron deliberately did this to extract a small measure of revenge - I'm assuming the settlement for Terminator was a one-off fixed sum!)

Fundamentally, there are only a limited number of plot devices: the longer copyright durations are, the greater the chance that there will be a similar piece of prior art. And if your creation is profitable, the odds are good that someone will then try to sue you, even if there is a good chance they'll lose; as with patent trolls, people will often choose to settle rather than go through a costly court battle.

""works are being "orphaned" - i.e. the ownership is unclear"

No, works are not being orphaned and the acid test of ownership is simple. Is it mine ? No it's not therefore it belongs to someone else and I must do due diligence in finding out who that person is."

Funny - if it's so easy, then why does the UK and EU consider it to be such a major issue?

Fundamentally, if the ownership of a work is unclear (and if it's passed through several corporations, the ownership rights can be very muddled - computer games have proven especially vulnerable to this), the cost of confirmation can be excessive.

"Creating another layer of bureaucracy to handle so called orphan works - in many cases works which can be identified if some effort is applied - will serve only to distance creators from their works and the end user which will, in turn act as a disincentive to the creator."

Who wants to create another layer of bureaucracy? The entire point of reducing copyright periods is that there will be fewer orphan works (as the age of the work is a significant factor in how likely it is to be an orphan), which in turn reduces the amount of due-diligence activity which is needed.

"In my experience, and I have a fair bit, I haven't met one creative who's in favour of less protection for their work."

It depends what you mean by less protection: there are certainly lots of people who are happy to release their work under a Creative Commons licence. Many choose to do so with a "non-commercial" tag, but there's still plenty who freely give away their work for any use whatsoever.

(and I'd like to think I'm at least vaguely creative!)

"Every person I'm aware of who is in favour of relaxation in copyright falls into three camps:"

Sadly, I'm not in any of those camps. I have a full-time job in IT, and anything I create is released under a Creative Commons Non-Commercial licence. And when I've been asked for permission to use work in a low-value commercial context (video clips for DVDs, photos for magazines, etc), I've freely given permission with no strings - or royalty demands - attached.

Now, you can argue that I can do that because I'm effectively supported by my "real" job. But there's many people who give away their work and still make a living - Jonathan Cauldwell (Still Alive) is a good example.

"Tarring the vast majority of people working in the creative industries with the U2/Sting/Madonna brush is a classic straw man approach as the vast majority of people working in the creative industries don't earn anything like the amounts made by those people and as a result would like to hold on to every penny they can."

To quote the article: 72% of the money from this copyright extension will go to the record labels. 24% will go to major-name artists. Only 4% will go to the "vast majority" of creative people you mention.

In other words: they're getting virtually nothing, at a significant cost to the public, as well as the non-tangible impacts to the economy and culture. Is that really a system worth implementing?

"Much in the same way as most people don't just give away chunks of their salary every month to random passers by."

No... but they do give money to the government in the form of taxes, and this money is then (theoretically) used to benefit society as a whole: to a greater or lesser degree, it's used to maintain and develop transportation, health, education and infrastructure.

And that's pretty much how I see shorter copyright terms: by pushing works back into the public domain earlier, the cost of producing (and defending) new works is reduced, which in turn allows more work to be created. And for 99% of people (i.e. everyone who's not U2/Sting/Madonna), it's a very small sacrifice which ends up benefitting the economy and society as a whole. The end result is more money for more people.

Beyond that: there often seems to be a perception that once an item falls out of copyright, the original owner can no longer sell it. That's 100% not the case: they can still sell it - and they can add value to it (e.g. new mixes, remastered audio, extra media); it's just that other people can now sell it. For instance, it'd be interesting to see how much money Sony BMG are making on the Elvis albums which fell out of copyright back in 2007: I suspect they've seen little or no drop in sales, despite the increased competition...


Feel the negativity!

Wow. Some people are missing the point quite nicely.

Purlieu, El Presidente, Anonymous: this is nothing at all to do with the "Freetards" you're trying to mock: I don't think anyone disagrees that creators should have the opportunity to benefit from their creations - if nothing else, it acts as an incentive for them to then create more work. The issue here is (at least) fivefold:

1) New works are not going back into the creative commons for reuse by other people - and vast quantities of works are being "orphaned" - i.e. the ownership is unclear, but no one dares touch them in case they get sued.

2) The extended copyright terms are reducing the incentive to create new works

3) Very few creators benefit from the extended copyright terms - (ironically), it's generally only those which have already greatly profited from their creations

4) Managing the extended copyright terms creates layers of self-propagating bureaucracy which costs money and creates nothing of worth

5) Extended copyright terms make it easier for people to sue other people for perceived copyright infringement

Conversely, reducing copyright terms would have the following advantages:

1) Works could be reused while they're still culturally relevant, which could well benefit the original artist. For instance, the Northern Soul movement in the UK brought in new revenue for Motown artists who had failed to succeed in the USA; this happened because their records were basically treated as worthless and were being sold by weight from American warehouses. There's also the "Keep Calm and Carry On" phenomenon: millions of pounds of economic activity has been generated by a single, tatty, out-of-copyright poster found in a bookshop

2) It encourages the original creator to produce new works *and* enables new work of cultural and economic value from other creators who build on that work. Pride and Prejudice is a good example of this: people have added zombies to the plot, or set the story in India. Dracula is another example: the (copyright infringing) Nosferatu added the concept of daylight being deadly to vampires, and there's been literally thousands of books, TV series, comics and films which have leaned heavily on Dracula and Nosferatu

3) It actually leads to more creators receiving revenue, as it means there's a greater chance of their work being reused, in effect acting as free advertising. Again, see the Northern Soul phenomenon

4) Reducing costs and bureaucracy means that consumers can spend more money on activities which directly benefit creators, rather than going on red tape.

5) It reduces costs for creators too, especially for complex projects such as songwriting, movies and TV shows, many of which often have to engage lawyers to fight off spurious copyright-infringement claims.

So, to recap: a shorter copyright term (e.g. 20 years) would have little or no negative impact on the vast majority of creators, and would potentially have significantly positive benefits for a large subset of creators - both the original creators and the people who can then build on their work.

Unfortunately, it's relatively easy for cynical corporations to hide behind rich, aging artists; it's even easier for them to drag out examples of impoverished aging artists (while simultaneously ignoring the fact that their creations obviously aren't returning any money at present; extending copyright terms isn't going to help that!). It's not as easy to sell the idea that more artists will actually earn more money if copyright terms are reduced...


Why Samsung won't open the Bada OS box


One word...

Maemo. An open-source mobile phone OS backed by a major telecoms company... and it flopped miserably.

Admittedly, it was something of a skunkworks project and Nokia was notoriously bad at OS development, but still: just because something's open source, it doesn't mean it's going to succeed.

(it's a shame though - my N800 is still the best ebook reader I've found: you can read it in the dark and with the backlight turned to minimum, it'll quite happily last for over 6 hours on a single charge...)


Would you be seen dead with a shopping computer?


To me, $250 sounds quite expensive for a 7" tablet

A quick look on amazon.com (i.e. not amazon.co.uk) shows that the average price for a 7" Android tablet is around $180 (f'instance: http://www.amazon.com/Velocity-T301-7-Inch-Android-Tablet/dp/B004CFF6ZI/).

Now, given that Amazon has far better economies of scale for manufacturing and distribution, I'd expect them to be able to retail a similar piece of hardware at around $150 and still make a profit - even with the custom UI they've stuck atop Android. So why's it coming out at $250?


UK slashes red tape in apprenticeships scheme

Thumb Up

Plenty of negativity here...

Blimey: the government cuts bureaucracy and helps to fund employment, and people act like it's a bad thing...

Admittedly, there's almost certainly an element of unemployment-number-fiddling going on here. But at the same time, it's giving a reasonable chunk of young adults a chance to earn a bit of cash (more than being on the dole, judging by how much my younger brother is getting now he's out of uni and searching for a job) and to gain some skills and experience; even if they're booted out of the scheme once the subsidies stop, they'll still be in a better position than when they started...