* Posts by juice

158 posts • joined 16 Nov 2010

Judge uses 1st Amendment on Pokemon Go park ban. It's super effective!

juice

"But now, a company far away can easily appropriate that park for their own uses with no permit at all. The result might be a very large crowd (just like with the concert) making heavy use of the park, interfering with local's use of said park, and if something bad happens, no liability for the 'organizers.'"

That scenario has been around for decades - see the old 90s illegal raves in the UK. Laws were passed in the UK to address that, and I'd expect the USA to have similar laws already in place.

1
0

Repairable-by-design Fairphone runs out of spare parts

juice

Re: 4 year life?

"It's cheaper to replace the entire thing with another second hand or an entirely new one. How do these parts sellers even sell anything with at least double the price a customer would pay!?!"

Because the cost of replacing the hardware is still cheaper than the cost of rebuilding your preferred/required setup on new hardware, especially in terms of time.

Admittedly, with things moving to "cloud" based sync'ing and storage, this is becoming less of a factor. But it is still a factor.

3
0

Sorry to burst your bubble, but Microsoft's 'Ms Pac-Man beating AI' is more Automatic Idiot

juice

Back to basics...

The thing is, they've managed to get a "perfect" score on the Atari 2600 version, not the arcade original. And for all that the port was well received at the time, it's a crude and heavily cut-down copy.

Sadly, there doesn't seem to have been much analysis of the way it was coded, though there is at least one hack out there which improves the graphics (http://atariage.com/hack_page.html?SystemID=&SoftwareHackID=5). But I'd be willing to bet that the algorithms controlling the ghosts are entirely deterministic, unlike the arcade original, where a random factor was included in the ghosts' behaviour [*].

Beyond that, it's worth noting that the AI was only responding tactically, not strategically. Which is fine for a game like Ms Pac-man: if you can put your death off long enough, you'll eventually reach the maximum score. It wouldn't work as well in a game where there are other criteria - e.g. in Defender, you have to survive, kill all the aliens and protect the humans.

So yeah. They managed to write an AI which could produce a tactical solution for a deterministic situation with only 4 negative factors (aka: the ghosts). It's pretty much the most basic proof of concept you could produce.

Wake me up when they manage to produce something capable of tackling Defender or something more chaotic such as Robotron or Bubble Bobble...

[*] Unlike the original Pac-man, which was entirely deterministic; there were even books written on how to game the algorithms!

4
0

What augmented reality was created for: An ugly drink with a balloon

juice

Re: Wetherspoons now offer table-service via a phone app...

The gentleman in question was actually sitting in the pub and observing the results, but it's a good question - I'll have to take a look at the app at some point :)

1
0
juice

Wetherspoons now offer table-service via a phone app...

So someone I know took great pleasure in ordering drinks to random tables. Much confusion all round. And free beer, so it's not all bad.

9
0

User loses half of a CD-ROM in his boss's PC

juice

Back in the 16-bit days, the only form of portable media (in fact, pretty much the only media for the Atari ST and Amiga) was ye olde 3.5" floppy disc. And as impoverished teenagers and students, we tended to use the cheapest of the cheap. Magazine coverdisks were one source - I can recall people selling bin-liners full of these at markets and car-boot sales - but if you were feeling flush, you'd fork out the cash for a box of no-name disks from some far-eastern company you'd never heard of before [*].

Needless to say, quality control was an issue; aside from the usual plethora of read/write problems, you'd often find the protective plate wouldn't slide open, or the disk would fail to spin correctly inside its sleeve. The best one I ever saw was when a friend enthusiastically hit the eject button on his Amiga, causing the disk to shoot out at a higher speed than normal; said disk literally disintegrated in flight, like an armour-piercing sabot round...

[*] Much the same happened with the earlier 8-bit machines (magazine cover-tapes and no-name C90s) - generally, the quality of the no-name cassettes was so low you could barely record audio on them, never mind the squeaks and squeals of a computer program. And then there were the budget VHS tapes as well; good luck watching anything you'd recorded on these, especially if you'd optimistically gone for LP recording; twice the duration and a quarter of the quality! Admittedly, budget CDs and DVDs were just as bad; there's nothing quite like picking up an old backup to find the aluminium peeling off...

8
0

Drunk user blow-dried laptop after dog lifted its leg over the keyboard

juice

Back in the day...

I once spilled Yop (drinking yogurt) all over a Toshiba laptop - it was back when Yop bottles had a stupid pop-off lid, rather than one which screwed off.

Initially, it didn't want to boot, but it eventually recovered after I'd removed/cleaned the keyboard and then left it for a few weeks - I'm guessing some of the liquid seeped under the keyboard and needed to dry out.

Ever after though, a slightly sour smell of strawberries lingered around it...

4
0

GiftGhostBot scares up victims' gift-card cash with brute-force attacks

juice

And there isn't a rate-limiter or captcha mechanism built into these websites because...?

Admittedly, rate-limiting gets a bit trickier if you're dealing with requests coming in from a botnet, but slapping up a captcha would seriously hamper this kind of trawling.
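For what it's worth, even a naive rate-limiter goes a long way against sequential brute-forcing. A minimal sketch (hypothetical in-memory token bucket, keyed here by client IP - a real deployment would key on the card number range and use shared storage):

```python
import time

class TokenBucket:
    """Minimal per-key token bucket: allow `rate` checks per `per` seconds."""
    def __init__(self, rate=5, per=60.0):
        self.rate = rate
        self.per = per
        self.buckets = {}  # key -> (tokens remaining, last-seen timestamp)

    def allow(self, key, now=None):
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get(key, (self.rate, now))
        # Refill proportionally to elapsed time, capped at the bucket size.
        tokens = min(self.rate, tokens + (now - last) * (self.rate / self.per))
        if tokens >= 1:
            self.buckets[key] = (tokens - 1, now)
            return True
        self.buckets[key] = (tokens, now)
        return False

limiter = TokenBucket(rate=5, per=60.0)
# Six balance checks a second apart from one source: the sixth gets refused.
results = [limiter.allow("203.0.113.9", now=100.0 + i) for i in range(6)]
```

Twenty lines of code won't stop a distributed botnet on its own, but it makes trawling a million card numbers from a handful of hosts impractical.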

0
0

'Clearance sale' shows Apple's iPad is over. It's done

juice

Re: As I have said a million times

Agreed. I've been after an A4-sized tablet for a while for reading purposes - I've got a lot of scanned magazines from the 80s I'd like to read, but there's a little too much squinting required on a 9.7" screen and widescreen tablets make things even worse - a 12" widescreen display is only 5.9 inches tall, or nearly 2.4 inches narrower than a piece of A4 paper[*].

Alas, the iPad Pro is a tad too expensive to use for occasional archaeological browsing. I do occasionally eye up convertible laptops on eBay, but they tend to either have archaic technology, or are increasingly also widescreen; at 16:9, I'd have to get a 17" display to be able to view A4 pages at their original scale, which in turn drives up the weight and lowers the battery life.

[*] Can we stop using old money to measure screen sizes?
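Working the 16:9 geometry through (in old money, alas) bears the figures above out - a quick sanity check:

```python
import math

def panel_height(diagonal_inches, w=16, h=9):
    """Height of a w:h aspect-ratio display, given its diagonal."""
    return diagonal_inches * h / math.hypot(w, h)

A4_WIDTH_INCHES = 210 / 25.4  # A4 is 210mm across, ~8.27"

h12 = panel_height(12)  # ~5.88" - nearly 2.4" short of an A4 page's width
h17 = panel_height(17)  # ~8.33" - the first common 16:9 size to clear A4
```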

1
2

Git fscked by SHA-1 collision? Not so fast, says Linus Torvalds

juice

Re: That's not how hashes work

"Electron, what you post is nonsense. A 160 bit hash _does_ in practice produce a unique result for any given input (unless you spend 6,600 years of CPU time to search for two given inputs with the same result)."

160 bits makes for a big number - 2^160 is a 49-digit number. And any halfway decent hash algorithm worth its salt [*] will be designed such that even a tiny change in the input will produce a significantly different output.

However, 160 bits is not infinity. It's not even a googolplex. And if someone's deliberately trying to force collisions, you'd be foolish indeed to assume that things are safe.
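That avalanche behaviour is easy to demonstrate with Python's hashlib - change one character of the input and the two digests agree at barely chance level:

```python
import hashlib

d1 = hashlib.sha1(b"hello world").hexdigest()
d2 = hashlib.sha1(b"hello worlc").hexdigest()  # one character changed

# Position-by-position agreement between the two 40-digit hex strings;
# for unrelated random digests you'd expect roughly 40/16 = 2.5 matches.
matches = sum(x == y for x, y in zip(d1, d2))
```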

Then too, assuming they're using the "1 GFLOP machine working for a year" definition of CPU-years [**], that figure of 6,600 years isn't as impressive as it sounds.

An Intel i7 can run at over 350 GFLOPS in double-precision mode, while a modern GPU (e.g. Radeon RX 480) can theoretically churn out up to 5 TFLOPS in single-precision mode; the Tesla K80 used by Amazon for their cloud-computing back-end can manage 8.74 TFLOPS in single-precision mode.

Then too, there's always the possibility of using distributed computing or specialised hardware - this type of problem is inherently parallelisable and bitcoin mining has shown how effective ASICs can be for this type of number crunching. Also, since the people most likely to want to force a collision are nation-states or hackers, they could well have access to a supercomputer or maybe even botnets - I wouldn't be surprised to see people offering collision-detection as a service, as more traditional profit streams continue to dry up (albeit with botnets increasingly being based on IoT low-power devices, there may not be many of these).
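Rescaling that 6,600 CPU-year figure to the hardware above - assuming, generously, that the search parallelises linearly - makes the point:

```python
# 6,600 CPU-years under the "1 GFLOP machine for a year" definition,
# rescaled to the hardware mentioned above.
cpu_years = 6600  # the quoted SHA-1 collision search cost

years_on_rx480 = cpu_years / 5000  # ~5 TFLOPS single-precision GPU
years_on_k80 = cpu_years / 8740    # Tesla K80, 8.74 TFLOPS
days_on_100_k80s = years_on_k80 / 100 * 365  # a modest cloud rental
```

On those (rough) numbers, a single consumer GPU does it in about 16 months, and a hundred rented K80s in under three days.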

So, yeah. 160 bits is good. But these days, it arguably ain't good enough.

[*] sorrynotsorry

[**] http://www.gridrepublic.org/joomla/index.php?option=com_mambowiki&Itemid=35&compars=help_documentation&page=GFlops,_G-hours,_and_CPU_hours

2
0

'At least I can walk away with my dignity' – Streetmap founder after Google lawsuit loss

juice

It's not about the technology...

Instead, the question being asked in court is: did Google abuse its search-engine dominance to starve out Streetmap, rather than just competing on merit?

It has to be said, the "objective justification" argument presented by Google seems somewhat strange ("In my opinion, I think I'm best for the job"?) - in fact, I'm mildly surprised there haven't been any comparisons to the browser-integration debacle which led to Microsoft getting a slapped wrist from the EU.

Anyhow, coming back to the technology, and Streetmap in particular...

I have to agree that I remember mapping technology being clunky back in the day - but I can't say exactly when that day was, or which suppliers earned my wrath - or indeed, how much of it was due to browser technology and connectivity speeds being much more primitive.

Because, y'know: it's been a decade or more. And beer.

Equally, I can't really comment about how good Streetmap was versus Google Maps back in 2007 - as far as I know, unless someone's maintained a video archive detailing their functionality at the time, the only way to accurately compare their relative merits would be to borrow a Tardis and jump back a decade.

Finally, I'd note that while £300,000 sounds like a large sum - I'd love to have that landing in my bank account every month - for an IT company, it's not actually a huge amount. Once taxes and overheads are accounted for, it's only really enough to fund maybe half a dozen staff.

In fact, I suspect Streetmap never really stood a chance against larger companies like Google, who could throw far more resources at their implementation, as well as integrating it with their other offerings and technology, such as natural language processing.

Perhaps if they'd gotten some investors behind them early enough, or if they'd been able to build up some sort of patent portfolio, Streetmap would have done better - other companies such as TomTom and Garmin have found themselves in similar situations and are having to evolve. But I can't help thinking that Streetmap simply weren't in a position to scale up at the rate needed to compete.

6
0

Shocked, I tell you. BT to write off £530m over 'improper' Italian accounts practices

juice

Re: BT NextGenitalia (aka. Forward thinking C***ts)

"Best Practice? Nothing to do with best accounting practice regards Tesco."

I'm pretty sure that unless you're working for the Mafia, best accounting practice never involves deliberate fraud. The point I was making was that BT makes it very clear at all levels that such activities are unacceptable and will lead to dismissal and possibly prosecution.

I'll also (un)happily note that upper management can often have a different view on "unacceptable" when compared to the lower tiers of management - and often seem to get away with far more at a much lower cost than other employees would be allowed to - but as I said above, in BT at least, everyone has to provide explicit annual proof that they're aware of the rules and regulations which pertain to their role. I may even still have some of my old printed certificates-of-completion hanging around somewhere!

""I'm guessing it was more a case of inflated numbers and stuff being rapidly shuffled between divisions and/or bank accounts to make things look good and keep bonuses high."

In my book what you have just written 'is' fraud/corruption. You seem to see this a "light touch" manipulation?"

No, it's still fraud and corruption. The point I was trying to make was related to the original poster's "how you don't notice £530 million going missing": it wouldn't have been a single lump sum which could be easily spotted by running =SUM(A1:A20) over an Excel spreadsheet. Instead, it'll have been lots of things overstated, understated, assigned the wrong depreciation rate, calculated using an incorrect exchange rate, unreceipted, double-charged and a dozen other things that will only be understood by a professional accountant with full access to the books and a thorough understanding of the company's processes and the applicable national/international legislation.
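As a toy illustration (entirely made-up numbers), thousands of individually unremarkable misstatements can add up to a headline-grabbing total without any one line standing out:

```python
import random

random.seed(1)

# 50,000 ledger lines, each overstated by at most a few hundred pounds;
# no single entry looks suspicious, yet the total error runs to millions.
errors = [random.uniform(0, 400) for _ in range(50_000)]
total_error = sum(errors)
worst_single_line = max(errors)
```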

Apologies if that wasn't entirely clear!

In the meantime, it looks like the knives have been sharpened and a few heads are starting to roll, starting with the head of BT's Continental Europe division (aka the ex-head of BT Italy). Though again, as per above, I'd guess his leaving statement will be phrased to handwave as much personal responsibility as possible away...

1
0
juice

Re: BT NextGenitalia.

That's a wee bit unfair. This wasn't BT, this was a subsidiary *in another country, with a (presumably) very different and complex regulatory process* who appears to have been cooking the books. I'm guessing it was more a case of inflated numbers and stuff being rapidly shuffled between divisions and/or bank accounts to make things look good and keep bonuses high.

I've no doubt that people will be running around within BT doing damage limitation, but this situation is absolutely nothing to do with BT's practices in this country, nor does it have anything to do with the politics of Openreach and Ofcom.

Also, it's worth noting that BT's employees in the UK - including in subsidiary companies - have to undertake a mandatory set of annual training courses which make it clear that equivalence has to be maintained and things like bribery and corruption are completely unacceptable. Which may not stop people, but at least they can't claim that they haven't been explicitly warned.

You have to wonder if other large-scale companies follow the same best-practice. Such as Tesco, with their £326m accounting scandal...

[ObDisclaimer: I worked for BT, about a decade ago. I don't have any shares in them, though, which may not be a bad thing at present!]

3
2

Boffins explain why it takes your Wi-Fi so long to connect

juice

I don't suppose

Anyone's got a list of which router chipsets are the most reliable/fastest?

0
0

Fake History Alert: Sorry BBC, but Apple really did invent the iPhone

juice

Apple invented the iPhone...

... in the same way that Ford invented the Model T, Sony invented the Walkman or Nintendo invented the Wii. They took existing technologies, iterated and integrated them, and presented them in the right way in the right place at the right time.

And that's been true of pretty much every invention since someone discovered how to knap flint.

As to how much of a part the state had to play: a lot of things - especially in the IT and medical field - have been spun out of military research, though by the same token, much of this is done by private companies funded by government sources.

Equally, a lot of technology has been acquired through trade, acquisition or outright theft. In WW2, the United Kingdom gave the USA a lot of technology via the Tizard mission (and later, jet-engine technology was also licenced), and both Russia and the USA "acquired" a lot of rocket technology by picking over the bones of Germany's industrial infrastructure. Then, Russia spent the next 40 years stealing whatever nuclear/military technology it could from the USA - though I'm sure some things would have trickled the other way as well!

Anyway, if you trace any modern technology back far enough, there will have been state intervention. That shouldn't subtract in any way from the work done by companies and individuals who have produced something where the sum is greater than the parts...

5
1

Soz fanbois, Apple DIDN'T invent the smartphone after all

juice

Apple may not have invented the smartphone...

But they pretty much set the mold by taking the existing technology and infrastructure of the iPod, adding a highly polished and well designed UI, and then integrating everything with network-agnostic functionality.

Back when the iPhone first launched, people were still thinking of smartphones as miniaturised computers - or at best, an upgraded PDA - and designed the UI and software accordingly. So, apps had scroll-bars which needed to be physically clicked and dragged with pixel-perfect precision, which was generally only feasible if you had a stylus or sharp fingernails. It was all clunky and clumsy, especially since many devices still used single-touch resistive touch-screens and often sacrificed screen size in favour of an awkwardly small physical keyboard.

Conversely, Apple ditched the physical keyboard and built the UI from the ground up to use capacitive multi-touch, with little bits of auto-assist technology built in everywhere. And to quote the old Apple Marketing slogan, It Just Worked - as long as you were happy with the functionality Apple was willing to give you.

Then too, Apple had some other advantages: the iPhone used the same connector as their iPod line. This meant that there was already a reasonably large third-party ecosystem out there (e.g. powered speakers, recharging docks) and the cost of buying replacement cables etc. was low. Also, you could share cables/battery packs/etc between your devices and you could charge via USB. And if you did want to use your iPhone for music, it had a 3.5mm headphone jack, unlike most other mobile phones at the time, which would at best have a 2.5mm jack for a mono earpiece.

(And yes, there's an irony there, given that Apple has now ditched the 3.5mm jack. And to be fair, manufacturers like Nokia and Sony-Ericsson had fairly standardised power connectors, but these didn't transmit audio and they only sold cabled wall-warts, so if you did want to recharge via USB, you had to track down third-party cables, sometimes of highly dubious quality)

Finally, for all that iTunes is looking very long in the tooth these days, at the time it was leagues ahead of the garbage supplied by other major manufacturers (Sony, Nokia, etc), which was often unstable/buggy or hamstrung by politics. Sony in particular were bad for this, presumably because the media division ranked higher than the hardware division; the MiniDisc in particular was one technology which could have made a much bigger impact if it hadn't been locked down so much to try and prevent music copying.

The use of iTunes also had a further impact, in that it provided a standardised and relatively simple way to push software updates to an iPhone, improving performance, stability and features. This was something other manufacturers simply couldn't begin to do, thanks in no small part to the fact that there were often network-specific elements embedded in the OS.

And iTunes had a further, unexpected benefit, in that it offered a way for people to easily download - and pay for - new applications for their phone. Everything I've seen/read/remembered suggests that Apple initially failed to realise the significance of this, despite the fact that even basic games like Nokia's Snake had become a part of popular culture. Still, in time, iTunes apps actually became a major driver of iPhone sales, thanks to effectively-exclusive titles such as Angry Birds, Fruit Ninja and Doodle Jump.

Mind you, for all that I admire and recognise the impact of the iPhone, I've never actually owned one: they've always been too expensive, especially if you wanted extra storage, and by the time I could justify buying one, Android phones were giving better bang for the buck, as well as offering far more flexibility - varied screen sizes, expandable storage, replaceable batteries, widgets, etc.

2
0

Blue sky basic income thinking is b****cks

juice

Haven't we just had this rant?

http://www.theregister.co.uk/2016/10/18/basic_income_after_automation_thats_not_how_capitalism_works/

7
0

Chap creates Slack client for Commodore 64

juice

Bang those bits together, guys...

I seem to recall people doing similar with a ZX Spectrum[*] at the Manchester Play expo a few years ago - I think it was either for Twitter or IRC. When all's said and done, you're just using Ye Olde Machine as a bare-bones terminal.

[*] Admittedly, as this was a Sinclair machine, it was probably done with a lot of bodged parts rescued from landfill and was at risk of crashing if there was too much wobbling. We'll have none of that freshly organic artisanal rubbish here!

1
0

Basic income after automation? That’s not how capitalism works

juice

Re: Where to start...

To be fair, I was thinking more historically, but even more recent wars have had something of an impact - according to the ever-reliable Wikipedia, up to 80 million people died "[including] 19 to 25 million war-related famine deaths". That was 3-4% of the entire world population at the time.

Then too, it was significantly worse at a country level - some (e.g. Poland) suffered 15%+ casualties.

Any which way, war sucks.

2
0
juice

Where to start...

As badly written articles go, this one is... quite badly written. Where to start?

First, the entire article is based on the strawman that a basic income policy is only intended to address a shrinking job market. However, there's far more to it than that: not only does it reduce inequality and poverty (pleasing the left) and reduce government bureaucracy overheads (pleasing the right), but it also leads to more entrepreneurial activity (pleasing both). After all, if you have a guaranteed economic safety net, you're free to experiment and take risks that would be unthinkable otherwise. And this isn't just a theory: this effect has been seen in many of the pilot schemes carried out to date (https://en.wikipedia.org/wiki/Basic_income_pilots) - in Madhya Pradesh, "The study also found an increase in economic activity as well as an increase in savings, an improvement in housing and sanitation, improved nutrition, less food poverty, improved health and schooling, greater inclusion of the disabled in society and a lack of frivolous spending".

Dismissing the concept as being just "charity" is therefore both foolish and misleading - as is the claim that it would "not be progressive or emancipatory".

Then, there's the shoe-factory example. There's an underlying assumption here that there's an infinite market for shoes - i.e. if you make 200 pairs instead of 100, you can sell all 200 for the same price as the original 100. In practice, the market will become saturated sooner or later, and then you'll have to either drop your prices or reduce your output. Either way, that shop-floor worker will lose out, as they'll either get fired, work less hours or get a reduced wage.

"There's no correlation between how burdensome and how well-compensated a job is". "Burdensome" generally isn't factored into compensation calculations because it's irrelevant (and often subjective, to boot). A manual job may require some degree of physical fitness but often requires little in the way of training or experience, so there's a very large number of people who can do it and the laws of supply and demand kick in again.

"The principle of production increase over leisure increase applies independently of the type of job in question". It does up to a point - the point where supply exceeds demand. At that point, you either scale back, start making a loss or end up with a large chunk of unsold/unsellable inventory - which essentially also means taking a loss.

Then, there's the claim that there will be new jobs to replace the old jobs. However, the current industrial revolution is different to previous ones in at least one important way: it's happening a lot quicker, and it's affecting many more economic areas. After all, a lot of it is being driven by software, and new apps and updates can appear virtually instantaneously across the world. Take Google Maps for an example: it's eliminated the need to keep a physical map in the car, and it's getting increasingly better at identifying and routing around traffic jams. So there goes the paper-map industry *and* the traffic-report DJs. Along with everything else that can now be handled by a mobile phone - checking your bank balance, taking photos, booking hotels, ordering food, checking mail, etc. There goes the bank-teller, the camera-manufacturer, the people on the phone and even the computer manufacturers...

Admittedly, people are using technology to create new jobs for themselves - t-shirts, 3D printed cosplay accessories, self-published media, etc. Sadly, the people I know who do this are generally making little or no money. Because with technology being so cheap and easy to use, anyone can do what they're doing and the rules of supply and demand have come into effect once more. However, having a basic income would give them more freedom to experiment, innovate and differentiate themselves, and therefore increase the revenue they earn.

It's actually worth looking at some of the classical civilisations to see how they handled over-production and over-population - Egypt, India, China, etc. Generally, what you ended up with was a heavily striated society with very limited movement between layers and increasingly complex social models and policies - such as the imperial examinations in China, where you had to study philosophy, poetry and even horseriding and archery. They also tended to have either low productivity or some form of resource-sink, such as the Great Wall of China or the Egyptian funeral industry. And in the long run, they also tended to be dominated by more efficient and less striated civilisations - the Romans, the Mongols, the British empire, etc.

Of course, there's always another approach to dealing with over-population: going to war - not only does it reduce your population (and that of whoever you target), but it distracts the general population and you also get to spend resources on equipping and training your troops. Win-win, except for the people at the sharp end of the axe.

So personally, if a Basic Income offers even the slightest possibility of avoiding a striated society or war, I think it's something we should be spending a lot more time looking into!

19
1

LG’s V20 may be the phone of the year. So why the fsck can’t you buy it?

juice

My last phone was a G4, but when it came to upgrade time, I looked at the G5 and jumped over to a Samsung S7 Edge. Not that there was anything specifically wrong with the G5, but the add-on technology seemed pointless and the S7's specs generally had the edge.

I passed the G4 onto a friend, and then things got a bit complicated, as it stopped working a month or two later due to the "reboot loop of doom", which turns out to be a well-known issue caused by component failure, to the point where LG has actually agreed to fix all affected phones regardless of warranty.

So. A bit of a pain, but at least we could get it fixed for free - I volunteered to help with this, as my friend's not technical. However...

I sent an email to their website, asking what was needed to submit a repair request. After two weeks, I got back an automated email apologising for the delay and asking me to resubmit the email if I still needed something...

In the meantime, I'd raised an RMA request, only for that to sit untouched for far longer than the 48 hours claimed on the website. I eventually rang them, only to get through to a human on an overflow line, who advised me that there wasn't anyone to take my call(!). The day after, I finally managed to get through to someone who could deal with the issue, and he finally got the process moving. He advised me that we'd have to post the phone without any additional items - i.e. no SIM card, memory card - even the back-cover and the battery should be removed.

Fair enough. Then, I received an email for one of those Inpost automated drop-boxes, printed off the return label, packaged up the phone by itself and popped it into the drop-box after it had scanned the appropriate QR code.

Two days later, I got another email from Inpost, and went back to discover there was something in the drop-box: some official LG packaging and a note telling us to include the battery and back-cover!

Thankfully, LG then confirmed that they'd received the phone anyway, and after about two weeks, it finally came back and my friend is happily using it once more.

Overall, it's put me off using LG for anything else in the future...

0
0

A journey down the UK's '3D Tongue' into its mini industrial revolution

juice

For prototyping, low-volume or custom items, 3D printing is great, whether you're printing out a replacement part on a US Navy ship in the Atlantic, making a high-precision part for a jet engine or printing out some props for your latest cosplay outfit.

In fact, prototyping is probably where 3D printing has made the greatest inroads and will continue to do so.

For mass-production, it's not so great, as it's a slow and expensive process. I was at a maker's fair this weekend and someone was using a 3D printer to make little rocketship models; each one took 2-3 hours to complete.

It'll be interesting to see how this improves as the technology matures - looking at the website for the HP Multi Jet Fusion, it's actually only claiming a 10x speedup against comparable 3D printers (i.e. ones with a six-figure price tag), which is still nowhere near fast enough for mass production.

Then too, if you have used 3D printing to produce a proof-of-concept and decide to use a more traditional method for mass-production, you'll have to redesign your widget from scratch to account for differences in material strengths and stress points.

Beyond this, some of my friends genuinely think that there's going to be a revolution of sorts, where every home will end up with a 3D printer sat in the corner; if you want a new item or something breaks, you'll just download the 3D model for it and set it printing.

I just don't see this happening. Partly because non-technical people have enough issues with standard printers - I get regularly summoned to help relatives change ink cartridges or install drivers.

Partly because I suspect there'll be issues with getting hold of the actual 3D models - as with cultural media (e.g. music, books, movies, etc), the people who make them will generally want to be paid for them, which means that there's likely to be a plethora of copy-protection mechanisms and formats.

And partly for the same issue/reason as per above: you can't always replace a mass-produced item with a 3D printed model. 3D printed items can be stronger, but they can also be weaker, as the guys who scanned and printed out gun components found out (http://arstechnica.com/tech-policy/2013/03/download-this-gun-3d-printed-semi-automatic-fires-over-600-rounds/).

There's also the question of other physical properties (e.g. heat resistance, expansion/contraction when temperature changes occur) and tolerances - the Multi Jet Fusion claims to have an accuracy of 0.2mm, but only after sandblasting, whereas injection moulding and CNC routers are usually accurate to around 0.127mm.

0
0

Full Linux-on-PS4 hits Github

juice

Re: It's a fun experiment...

Performance-intensive stuff: most of this comes down to the GPU these days. The Pi itself is a key example of this; the fairly underpowered ARM chip (at least in the original iteration) relied heavily on the Broadcom GPU.

A fairly quick glance online shows the PS4 GPU to be roughly equivalent to a Radeon 7850 (http://wccftech.com/playstation-4-vs-xbox-one-vs-pc-ultimate-gpu-benchmark/). These look to be available for around 75 quid online, and come with 2GB of dedicated ram.

Admittedly, there's something of an apples/oranges comparison here, since I'm looking at second-hand prices. Then too, the PS4's custom-tuned architecture may well have some speed advantages - though conversely, GPU performance under linux is still generally behind that of Windows, and that's even assuming a hack like this is able to get access to all the hardware, and that drivers are available to take advantage of it.

Still, for around £150, you can get a quad-core machine with 8GB of ram, 2GB of dedicated GPU ram and a GPU equivalent to the PS4. And generally, that'll include a Windows 7 licence which can be upgraded to Windows 10 or junked and replaced with Linux.

And then you can spend the rest in the pub ;)

0
0
juice

It's a fun experiment...

But these days, it does seem a bit redundant.

Getting Linux running on the PS3 was interesting at the time, as it was pretty powerful for the price in some number-crunching scenarios, thanks to the Cell architecture. But Moore's law had already marched on a fair amount by the time Sony withdrew support for Linux, thanks in no small part to the rise of the GPU as a device for massively parallel processing.

These days, "consumer" hardware is very much a commodity. Android-based USB-powered thumbsticks can be picked up for less than 15 quid - or, if you want to build something for scientific purposes, for the same price as a PS4, you could pick up ten Raspberry Pis and slap them together into a cluster.

Or you could nip onto Ebay and pick up an OEM small-form-factor PC; at a glance, there's plenty of multi-core, 3ghz machines with 8GB of ram available for less than a third of the price of a PS4 [*].

And with all of the above, you don't have to worry about the functionality vanishing if/when Sony patches the exploit.

It's still an interesting experiment, but it's definitely of limited use in the real world!

[*] This is exactly what I did a while ago; said box fits comfortably under the TV and does a good job of running Windows 10 with Kodi, Steam, iTunes and a few other bits and pieces. Plus, it's all controllable from my phone - including the TV itself!

0
0

A Logic Named Joe: The 1946 sci-fi short that nailed modern tech

juice

A million monkeys is a bit unfair...

These people were actively thinking about the future, rather than just hammering random keys. Though admittedly, it can sometimes be hard to tell the difference ;)

There's plenty of other interesting nuggets out there, too.

EE Doc Smith produced some spectacular space-opera cheese; much of this was the cliche "hero saving heroine from Certain Doom with the power of Science", but his Lensman series included some interesting concepts, and his exploration of how to handle complex space battles was cited as a key inspiration for the US military's development of Command Centre capabilities in World War 2.

Robert Heinlein produced some equally interesting stuff - the military concepts and tactics in Starship Troopers are well thought out (and the way these were ignored by the film is a major reason why I despise it) - and along the way, he also invented things like waldos (named after his story) and the water bed; his story was actually used as an example of prior art when someone tried to patent the concept!

Keith Laumer is much less well known, but produced some interesting concepts, especially in his Retief series, where a diplomat wanders the cosmos, cleaning up after his incompetent bureaucratic superiors. Admittedly, it's hard at this distance to determine how much was original and how much was drawn from other sources, but he dabbled with concepts such as virtual reality, remote-controlled robotic bodies and cloning. It's possible at least some of this was driven by the fact that he suffered a stroke which restricted his mobility.

There's many more out there - for instance, the British government ignored Arthur C Clarke's ideas about geo-stationary satellites.

Sadly, one area where the Golden Age of sci-fi seemed quite weak was around computing science (though again, EE Doc Smith did come up with the concept of "robot controlled" spaceships as the first line in massed assaults). I suspect this was down to editors/publishers not being comfortable with the concept (and/or assuming the reader wouldn't be interested); Science was there to be controlled, not self-governing!

2
0

DARPA to geeks: Weaponize your toasters … for America!

juice

I'm mildly surprised...

That no-one's mentioned the Atomic Toaster from MDK 2 yet!

http://maxdockurtmdk.wikia.com/wiki/Toaster

0
0

You've seen things people wouldn't believe – so tell us your programming horrors

juice
Mushroom

Bad code? Don't talk to me about bad code...

I spend a lot of time trying to fix things with a codebase which dates back over 15 years and has been hacked on by dozens (if not hundreds) of people with highly varying levels of knowledge and experience.

The bit of code I'm looking at *today* is a prime example: it's meant to deal with account cancellations. How does it do this, you ask? Well, it runs a query to pull back every account with a cancellation date set *regardless of whether the date is in the future or not*, and then performs a pass in the code to filter this down to the customers who we're actually interested in. Because everyone knows databases are bad at applying date and primary-key constraints to queries.
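To illustrate the point: a minimal sketch of what pushing the filter into the database could look like, using a hypothetical accounts table (the real schema will obviously differ) - one indexed WHERE clause instead of fetching everything and filtering in code.

```python
import sqlite3

# Hypothetical schema: let the database apply the date constraint,
# rather than pulling back every row with a cancellation date set.
def due_cancellations(conn, today):
    """Return only accounts whose cancellation date has already passed."""
    cur = conn.execute(
        "SELECT account_id FROM accounts "
        "WHERE cancel_date IS NOT NULL AND cancel_date <= ?",
        (today,),
    )
    return [row[0] for row in cur]
```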

Then there's the code which used a switch statement to round a timestamp to the nearest 15 minutes. Y'know, instead of using the modulus operator.
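For what it's worth, the modulus approach is a one-liner-ish affair (sketched here assuming a Unix timestamp in seconds):

```python
def round_to_quarter_hour(ts):
    """Round a Unix timestamp (in seconds) to the nearest 15-minute
    boundary using the modulus operator - no switch statement required."""
    period = 15 * 60
    remainder = ts % period
    return ts - remainder + (period if remainder >= period // 2 else 0)
```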

Or the code which used the "last modified" timestamp on a file to determine the next polling period, rather than using the "YYYYMMDD-HHIISS" metadata embedded in the filename - and the two could differ significantly as the process could take over an hour to run. Though to be (un)fair, this same code also mandated a two-hour overlap between polling periods, because who doesn't love reprocessing data?
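Parsing the embedded timestamp is hardly rocket science, either - a sketch, assuming a hypothetical filename scheme along the lines of "feed-20160101-123000.csv":

```python
import re
from datetime import datetime

def poll_start_from_filename(name):
    """Pull the embedded YYYYMMDD-HHMMSS timestamp out of a filename,
    rather than trusting the file's last-modified time (which can lag
    by however long the generating process took to run)."""
    m = re.search(r"(\d{8}-\d{6})", name)
    if not m:
        raise ValueError("no timestamp in filename: %s" % name)
    return datetime.strptime(m.group(1), "%Y%m%d-%H%M%S")
```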

Or the code which compared an array to itself and surprisingly always got a match!

Or the web-application page which was showing configurable options which should only be displayed to certain users. Aside from adding around 100,000 extra items to the document's DOM - each with active Javascript code registered against it - this also added several megabytes to the overall page size. Entertainingly, said page was the default landing page, so fixing this issue sliced over 50gb of data per day off the internal network.

And the list goes on...

I'll be the first to admit that I've written some bad code in the past, and newer code in the system is (generally) of a higher quality. But even so!

6
0

From Zero to hero: Why mini 'puter Oberon should grab Pi's crown

juice

Re: So many strawmen, so little time...

Fair point - I forgot about the RAM upgrade on the Pi2b - I'm still running RaspBMC on a 512mb B+, as it Just Works :)

0
0
juice

Commentee comment - could the author miss the point any more widely?

"The culture of computing for several decades has been C and Unix or Unix-like OSes"

Off-hand, I can think of a lot of operating systems which haven't fallen into these categories. The Japanese Tron OS, BeOS, RiscOS, Amiga OS, QNX, Warp, VMS, MS-DOS, Palm OS, etc. Some may have been written in C and some may have been a bit *nixy, but not at the same time.

They just haven't caught on in the same way as *nix systems. To me, a big part of the reason for this is that Unix came from a mainframe/multi-user/batch-processing background, and therefore had a head start when it came to modern "networked server" paradigms (e.g. LAMP), where a given machine may be running dozens if not hundreds of tasks in parallel. And, y'know, the whole "free as in speech and occasionally beer" thing for OSs such as Linux and BSD; combined with the dropping cost of hardware, this led to a huge takeup of *nix systems by amateur enthusiasts, which then fed back into the workplace.

Other systems - including Oberon - generally came from a consumer or real-time/single user perspective, and weren't able to adapt. Windows is a notable exception, though it's telling that Microsoft accomplished this by essentially ditching their old codebase and switching over to their multi-process/multi-user New Technology system, which itself was built by engineers from DEC, who had previously worked on VMS, a server-orientated competitor to *nix...

"The IT industry assumes that operating systems have to be written in C to work -- wrong -- and must by nature be big and complex -- wrong."

I don't think anyone is claiming that an OS absolutely has to be written in C [barring the odd flareup of flamewars on places like Slashdot]; it's just that the most popular operating systems have been written in it.

As to whether or not an OS should be big and complex: that's a full-blown topic all by itself. Modern hardware is so much more complex than hardware from even just a decade ago, and we expect it to do far more: more data, more threads, more peripherals, more displays, higher throughputs, more parallel processes, virtualised hardware, etc - and we expect all this to happen flawlessly and reliably on low-cost, commodity hardware. Handling everything that can go wrong - from dropped network packets to processor stalls - is complex and needs lots of defensive code.

It's also worth noting that there's been many efforts to go down the micro-kernel route - QNX and GNU Hurd being two notable examples, with the latter demonstrating how "theoretically superior" concepts don't always come out as expected in the real world.

"But it should be something simple, clean, modern, written in a single language from the bottom of the OS stack to the top -- and that language should not be C or any relative or derivative of C, because C is old, outmoded and there are better tools: easier, safer, more powerful, more capable."

I'd love to hear suggestions on what should replace it? It sounds like you're rejecting things like Java and C# (and hence by extension things like the Android runtime)

Other than these, the last real attempt to do this was BeOS, and this failed. Partly due to allegedly dodgy behaviour from a certain industry giant, partly because they targeted the consumer market and partly because they couldn't get a critical mass of applications and developers.

"We should start over, using the lessons we have learned. We should give kids something small, fast, simple, clean, efficient. Not piles of kludge layered on top of a late-1960s hack."

Perhaps the biggest lesson to learn is that reinventing the wheel is expensive, time consuming and generally pointless. Most if not all of the technical lessons we have learned are already encapsulated in the current popular operating systems - they've survived and grown because they've evolved and rearchitected themselves along the way. Both Windows NT and Linux have moved towards "hybrid" kernel design - not quite microkernel, but not entirely monolithic. They handle a wide range of physical hardware from a vast range of manufacturers - CPUs, network/audio/video/etc. They handle as many real-world issues (packet drops, security attacks, parity errors, etc) as they can. There's literally thousands of man-hours which have been ploughed into making them as robust as possible.

Dismissing all of that as "hacks" is simply foolish. I'm reminded of the article by Joel Spolsky, written back when Mozilla decided to reinvent the wheel and reimplement Netscape Navigator from scratch, and in doing so essentially conceded the browser wars to Microsoft. http://www.joelonsoftware.com/articles/fog0000000069.html

"No, we should not be teaching children with "real world" tools. That is for job training. Education is not job training, and vice versa. You don't teach schoolkids woodwork with chainsaws and 100m tall trees"

Oddly, to my mind, that's exactly what you've proposed. In fact, you're essentially expecting them to first assemble the chainsaw before firing it up. And therein lies the thing which this article seems to have misunderstood; it's about fifteen years out of date. The computer as a singular device has long since stopped being the primary thing people need to learn about; these days, it's all about what you can plug into it (or what you can plug it into), whether that's a camera, a network, a mechanical device, a sensor or a coffee machine. To do this, you need a development environment and tools (e.g. an IDE and support libraries), and that's precisely what things like the Pi - and the linux ecosystem - offer.

So no, we shouldn't be pointing schoolkids at a tree and passing them the parts to a chainsaw. We should be giving them some planks of wood, a saw, some nails and a hammer and telling them to build a birdhouse based upon an existing template. Said template may have been sketched out in the sixties and look a bit crap, but it's tried and tested and the children are free to innovate and reinterpret it - maybe they can use a 3D printer to give it a tiled-roof look, or a CNC milling device to etch the face of their mum on the side...

5
0
juice

So many strawmen, so little time...

"The Pi's strength is its cheapness and the simplicity of its hardware, but at heart, software-wise, it's a PC... <rant about ARM vs x86>"

This is an odd complaint. At heart, the Pi is a mobile-phone chipset married to a low-end ARM chip, and it will run whatever OS is provided. It only takes a few seconds of looking at the official website (https://www.raspberrypi.org/downloads/) to see that there's a number of "officially approved" OS builds available for it, ranging from various flavours of Linux to Windows 10 /and/ RISC OS. And it doesn't take more than a few seconds to find ports of FreeBSD, Android and even more obscure OSs such as Haiku.

It's also worth noting that the Pi isn't bundled with an OS by default, which means that people are actually choosing to run Linux on it - as indeed, are many other "non-PC" devices, especially in the IoT landscape. After all, it's free and there's lots of existing dev tools and support libraries.

"There were some missed opportunities in creating the Raspberry Pi. The Foundation could have brought harmony and a de facto standard for firmware on ARM hardware"

The Pi was never intended to be a high-volume device. Instead, it was intended to be a relatively low-volume educational device, and it wasn't clear until after it had launched how popular it would become. Setting industry standards was never part of the foundation's remit.

Also, the Pi had only sold 5 million units as of February this year. Even if we assume that volumes have since managed to double to 10 million, that's a drop in the bucket compared to the "billions" of other ARM-based devices which the article itself notes have been sold in the same timeframe. So the Pi is hardly in a commanding market position!

Finally: as the article itself comments, the Pi deliberately sidestepped the firmware issue. What it doesn't mention is that this was for several pragmatic reasons - the impact on manufacturing costs being the main one. Because, once again, it was intended to be a low-cost, low-volume educational device.

"Failing that, the Foundation could have bundled RISC OS with it"

It's available on the website for free, and there was a fair amount of excitement/publicity when the Pi first launched about the fact that RISC OS was available. Which suggests that, as fun as tinkering with obscure OSs can be, people actually wanted to use an OS which has lots of existing tools and libraries available...

"Pi project founder Eben Upton fondly recalls his first machines, a BBC Micro and an Amiga 600. A kid could fathom those; I did, with my ZX Spectrum."

Ah, the humble Speccy - the grey +2 was my introduction to the wonderful world of computing. And in truth, I think it's a lot easier to learn how to use a computer these days. The 8-bit machines did offer a BASIC prompt on startup, but there was generally little or no support structure for people other than the official manual, whatever the local library had in stock and the odd magazine type-in (which quickly died off as the commercial world moved towards the use of machine code). These days, you can use the internet to search for documentation/prior examples, or post queries to somewhere like stackoverflow.

I'd also argue that it became significantly more difficult to learn how to code when the 16-bit era landed. You no longer had BASIC bundled with the machine, and commercial C/Pascal compilers were relatively rare, slow and usually badly documented. So you had to either learn assembly or pick up a third-party program such as AMOS.

Then too, if your code crashed or went into an infinite loop back in the 8/16-bit days, you generally crashed the entire computer and lost all your hard work in the process. And let's not go into the time-cost of backing up to tape or floppy disk - especially the latter, since most home coders used repurposed magazine cover disks with distinctly variable levels of quality control...

"Twenty-first century Unix is too big, too weird, too full of arcane 1960s strangeness."

"Conventional wisdom is that this complexity is an unavoidable consequence of modern software's age and maturity, and that it's worth it. You just ignore the stuff you don't need."

The ZX Spectrum was basically a 16k ROM bolted to 16/48k of ram, a 3.5MHz Z80 CPU, and a custom ULA which did some magic to reduce component counts (and led to the infamous colour-clash issues).

To take the current "high-end" Pi, the Pi 2 features 1GB of ram, a multi-core processor, an OpenGL capable GPU, an audio chip, a DMA controller (and an MMU), a mass media controller, a serial controller and a few other things for good measure. All essentially built into the one chip. The complexity of modern software goes hand in hand with the fact that the hardware is so much more capable. And since you can't chisel bits of silicon off the CPU, you pretty much have to ignore the stuff you don't need...

"Which brings me to the other cheap little educational computer you've never heard of: the OberonStation ... No, it won't do the 1,001 things Linux will, but it's no toy ... But what it shows is that complete, working, usable operating systems – not single-trick ponies – can be built, essentially single-handed, and can be understandable by ordinary mortals"

Hmm. An effectively proprietary OS, no USB ports, no soundcard, no network capabilities, PS/2 keyboard/mouse ports and VGA-only output. That sounds like a toy to me!

From a quick glance at the manual, Oberon was a vanity/sabbatical project built by two people in the eighties. I.e. it's pretty much idiosyncratic by definition and was designed back before the concept of networked computers/IoT became mainstream. Also, the manual states that the system "can be understood by a reasonably determined person", which is definitely a step beyond being understandable by an "ordinary mortal"! So I really can't see any justification for using it these days. Especially since any OS-level skills/knowledge you pick up can't be reused on other devices.

So no, Oberon shouldn't grab the Pi's crown. If there's even a crown to grab. Which there probably isn't, since there's so many competitors out there, starting with the millions of Arduino devices out there. The fact that the Pi Zero is so low cost may well cause it to grab some more "makers" market share from Arduino and others of the same ilk, but there's still plenty of choice out there!

4
2

Microsoft working hard to unify its code base, all the way down to the IoT

juice

The right tool for the right job...

You don't use a sledgehammer to put a screw into a piece of wood. Unless, y'know, it's right next to you and the screwdriver is still in the toolbox...

It sounds like we're going back to the "write once, run anywhere" ethos that Java once enthused about, and it's likely to encounter the same issues that Java did: the levels of abstraction needed to get the same code running on devices A and B mean that you need more physical storage, more run-time memory, more processing power, and more electrickery to keep things ticking over. And for the IoT ecology, all of these - especially the electricity - are generally in short supply. It's the age old "cheap, powerful, efficient: pick two" dilemma, and in a commodity market, cheapness is generally mandatory.

Also, there's a question about what's going to be done with all the data spewing from these devices. Is someone really going to gather all the stats needed to monitor a fridge compressor - and even if they do, are they going to be able to put together a realtime monitoring mechanism *and* have some way of exposing it securely for customers to access? That sort of thing costs time and money and unless there's some sort of high-value support contract in place, there's little or no reason to provide it. Especially since in a few years time, there'll probably be a new model of the compressor and the entire thing will have to start again...

(To be fair, there is a case to be made for having a widget sat atop the freezer that monitors for pre-defined, short-term issues - a change in the compressor's RPM or power usage, a prolonged change in temperature, etc - and punts out an alert via email to the butcher and/or the company which provides the support contract for the freezer. But that's very different to the kind of real-time monitoring/tuning/statistical analysis that Microsoft are talking about, and requires far less resources to implement)

8
0

Doctor Who's good/bad duality, war futility tale in The Zygon Inversion fails to fizz

juice

Re: We've been here before...

1) Maybe - looking back now and reading the wikipedia summary, I guess it can be taken either way. But even if that is the case, it still feels unethical - the Doctor is basically refusing to take "no" for an answer and forcibly wiping people's memories until they agree with him!

2) True, but that doesn't address the issues which led to the splinter group becoming terrorists.

1
0
juice

We've been here before...

I'm starting to regret being so hard on RTD back in the day, as this felt fairly similar to some of the stuff turned out back during his reign. The Doctor bumbled around without actually achieving anything, there were some heavily telegraphed "plot twists", and lots of people died because the Doctor was faffing around. Then too, the entire ending hinged on a macguffin/Deus Ex Machina and the story fizzled out with an implicitly contradictory message, a plot hole large enough to migrate the entire Zygon race through and nothing was done to address the consequences of the various events (e.g. lots of dead people) [*]...

On a brighter note, the dramatic speech actually was quite dramatic.

[*] SPOILER/RANT ALERTS

1) The cease-fire has failed /fifteen/ times, and given that Kate doesn't look to have aged drastically, this has happened within the space of no more than a couple of years. I.e. things keep breaking down to the point of a full-blown MAD scenario within 3-6 months. Surely that's a sign that the peace treaty is a complete failure?

2) As much fun as stealing the plotline from Eternal Sunshine of the Spotless Mind must have been, the memory-wipe only affected the people in the room. What about the millions of Zygons outside the room and the unknown number of humans who knew that the uprising had occurred? Is there to be no justice for people affected by the atrocities carried out by the splinter group? If nothing else, the Zygons are going to have to choose some new leaders...

3) Similarly, even if the Doctor did manage to magically erase the memories of everyone on the planet, what about all the people who died - all their friends, family, medical and legal records, etc. If the Doctor is prepared to go back and wipe out people's memories of their loved ones to artificially maintain a demonstrably unsustainable peace, he's a much bigger monster than anyone else could ever be!

4) Why did the Doctor keep clumsily asking if Osgood was human or Zygon? Of all the entities in the universe, he should be the one most aware of the power of an anonymous symbol (e.g. such as a question mark...). It would have made more sense for Clara or possibly even Kate to ask that question - Kate especially had good reason to demand an answer!

5) And since someone will no doubt spark up with a "you don't have to watch it" comment: I've actually enjoyed some of the episodes this season; it does feel like there's an effort being made to steer things towards a more interesting path. And with some fairly rare exceptions, there isn't exactly a huge amount of British sci-fi to pick from!

2
1

After Burner: Sega’s jet-fighting, puke-inducing arcade marvel

juice

Re: The music in this game was awesome

The soundtrack was included on a Your Sinclair covertape (http://www.ysrnry.co.uk/ys36.htm) and I have fond memories of listening to those tunes while playing various games on my humble speccy.

It's also worth noting that Afterburner is part of a lineage at Sega which essentially started with Space Harrier (which also offered a deluxe seated edition [*]) and continued with the Afterburner/G-Loc games in the arcade, before moving onto the home consoles in the shape of the Panzer Dragoon series and going out on a high note with Rez. And the person who designed Rez (Tetsuya Mizuguchi) then went on to create Child of Eden, though I'd personally say Rez is the better of the two...

[*] I've got memories of a trip to Blackpool as a young'un, and I could swear that I saw my cousin playing Space Harrier while perched on some funky mechanised fighter-pilot style seat, but the only images I can find for the SH seated version show a fairly boring wooden all-in-one cabinet...

1
0

To save mobile web, we must destroy JavaScript, HTML and CSS

juice

To save the village...

We first had to destroy it.

Let's be honest here (as several people already have been): yes, there's a lot of JavaScript and HTML cruft out there, with varying degrees of effectiveness and efficiency. And yes, optimising it would reduce CPU usage and maybe improve battery life a bit. But that's not the main problem for mobile devices - or indeed deskbound computers.

Instead, it's all about the network traffic - both the size of the data and the lookups/translations required to determine the route to said data. And while HTML/JS contribute to the size of the data, I'd be willing to bet that for 90% of the websites out there, the binary data (i.e. images) far outweighs the size of the code.

F'instance, on this very page... if I download it, there's about 1.35mb of data. 34kb of this is the page/content. There's another 180kb for jQuery and another 85kb of CSS.

There's then around 500kb of what looks to be advertising-related JS and a further 1280kb of data spread across some 120 images. And that all needs to be cached, decompressed and generally tinkered with to get the page rendered.

And that's not going to change, no matter what form the code wrapped around it takes.

1
0

Google Chromecast 2015: Puck-on-a-string fun ... why not, for £30?

juice

"Phone apps masquerading as remote controls"

Eh? The remote app for my old WD TV Live worked pretty well (much better than the WD TV Live's SMB mounting, but that's a different story) even on my old Samsung S3.

Meanwhile, my LG G4 (and the G3 before it) works fairly well as a basic TV remote, and does a good job with both the HTPC (as a KVM), Kodi/Xbmc - and potentially also the Xbox 360, though I've never actually tried using it for that!

I can also use it to VNC into my desktop machine upstairs, and it also does a good job of acting as a remote control for the iTunes install sat on the same machine; Retune even lets me tell iTunes to stream from the desktop down to Kodi so I can have good tunes and psychedelic visuals running whenever I'm downstairs.

Overall, both my original TV remote and the Logitech Harmony have been gathering dust for a wee bit now...

0
0

Hide the HUD, say boffins, they're bad for driver safety

juice

Re: Wrong question

Having just come back from a 2,500 mile drive to Austria and back[*]... I found cruise control to be the best thing since sliced bread. However, the long roads do seem to encourage some bad driving practices on the continent; on the dual carriageways, people tend to overtake with their cruise control set to just a few kph faster than the speed you're driving at. So if you're coming up to a slower-moving vehicle, you either have to brake or rev your engine to nip out before the cruise-controller blocks you in...

Anyhow, back to HUDs, and it's the same as anything else (e.g. smart-phone interfaces): it'll take time

to evolve something which offers relevant information in a non-obtrusive way. Simple shapes/icons, use of colour-coding, etc. The article's point about fighter-plane HUDs is a good one; not only do the military spend lots of money on trying to make the HUDs effective, but the pilots themselves are heavily trained to make best use of them. Something which can't be guaranteed when it comes to Joe Bloggs in his company BMW...

In fact, I suspect the main issue will be that car manufacturers will have to downplay the expectations of people who've seen the heavily contrived VR/HUD displays in things like Minority Report and Iron Man. Slapping something like those onto someone's windscreen is pretty much a guaranteed recipe for disaster...

[*] And the worst bit of this journey? It wasn't the french potholes or the german trucks. It was the M1 and M25, thanks in no small part to the huge swathes of 50mph semi-permanent roadwork zones and the enforced slowdowns for accidents/closed lanes/temporary roadworks. Especially since at least two of the latter proved to not exist at all! And that brings up another point about HUDs and "smart" roadways: information needs to be both relevant and timely...

7
0

LG slaps SIX CORE Snapdragon 808 in LEATHER G4 dog&bone – not overheaty 810

juice

As a G3 owner...

I'm not sure this is worth an upgrade!

About the only thing which stands out on the list is the camera - though it'll be interesting to see how well this performs, given that image/video processing is one of the few things out there which can greatly benefit from speedy multi-core CPUs...

0
1

Oxford chaps solve problem in 1982 Sinclair Spectrum manual

juice

Well...

obDisclaimer: I've known Matt for years, and attended a talk he gave in Sheffield about this very performance, which I recorded - https://www.youtube.com/watch?v=r1a3JYp-VFs

Said recording covers most of the points above, but to summarise:

1) It's not some sort of publicity stunt. It was just a bit of fun for his local museum, who were putting on a "Geek is good" season - and they approached Matt with a suggestion for hooking up some Speccies and BBC Micros together.

2) Memory limitations. The symphony is over an hour long, so can't be crammed into 48k of memory, especially in BASIC.

3) Hardware. Unsurprisingly, getting hold of lots of *working* Spectrums is pretty tricky these days. Matt had to search high and low to find enough kit for this - and even then, several failed to work on the day

4) Raspberry Pi. Each model of speccy has slightly different timings, so the pi was used to keep them in sync; the code for the music ran locally on each Speccy, rather than using them as dumb terminals

In the end, it was a bit of fun for a museum display. If anyone wants to go one better, then grab a 48k Spectrum (or emulated equivalent thereof) and get tinkering!

27
2

Doctor Who's tangerine dream and Clara's death wish in Last Christmas

juice

Why are people quoting Inception...

When it was a fairly blatant rip-off of eXistenZ with a bit of Half-Life 2's head-crabs thrown in. Or am I in the wrong reality again?

As story-mashups go, it wasn't too bad, though it's perhaps telling that no-one in the family (save myself) could be bothered watching it on Xmas day; instead, we watched it on iPlayer on Boxing day...

0
2

Goes like the blazes: Amazon Fire HDX 8.9 late 2014 edition

juice

Wake me up...

When someone produces a decent/sensibly priced 12" tablet with a 4:3 viewing ratio, so I can read magazines and web-browse at a sensible viewing scale - I'm getting tired of poking and prodding the screen to try and hit the tiny links on my company's mobile-unfriendly Outlook web client, even on my LG G3's 5.5 inch screen!

All these variations on a 16:9 7"/9" tablet are missing the point - not only is the market glutted with the beasties, but they're not really big enough to offer a significant advantage over the many 5"+ phones which are now available...

2
3

Post-Microsoft, post-PC programming: The portable REVOLUTION

juice

Re: What tosh

Round our way, the vast majority of developers use Macs... because our estate is basically LAMP/LAMJava, so it's possible to do most things "natively" in MacOS without having to install some variant of Linux that the IT team won't support and that doesn't quite work with the systems forc^H given to us by the corporate mothership. And as with most Apple stuff, the hardware is pretty nice; certainly more so than the Thinkpads that non-tekkies get.

Though it is quite fun watching them run around trying to find a VGA adapter dongle whenever they need to do a presentation - those things have become like hen's teeth around here...

0
2
juice

Um...

"I should note that to really get any development done on an iPad you'll need a real keyboard"

We do have these things called laptops... maybe you should try one of them?

4
1

Trickle-down economics WORKS: SpaceShipTwo is a PRIME EXAMPLE

juice

There's a difference between trickle-down technology and trickle-down economics, and this article seems to be confusing the two. It's also conflating "industrial" technological developments with "personal" technological developments, as well as the /purchase/ of luxury items versus the /creation/ of luxury items. Oh, and it's also completely omitted the role of government in technological developments.

Overall, if it was a piece of GCSE homework, I'd give it a C-. Anyhow, to justify my marking...

Trickle-down technology: yep, yesterday's high-end/luxury feature is today's mid-tier value-add and tomorrow's low-end commodity, whether it's something like car air-conditioning or a quad-core mobile phone with a HD-resolution screen. Though to counterpoint this, it's worth noting that companies often withhold technology trickle-down to avoid cannibalising the high-end market. This also means that the high-end market has higher profit margins: same equipment, different configuration, higher price. Microsoft and Adobe are key examples of this in the software world; AMD and Intel offer similar examples when it comes to CPU frequency/clock-speed locking.

And whichever way you cut it, TDT is absolutely nothing to do with rich hobbyist tinkerers, except for the fact that they're part of the initial "rich" group who can afford to buy things before they're commoditised.

The other point is that the industries mentioned in the article (cars, rockets, IT, etc) have pretty much all developed out of government investment - which in turn has been mostly driven by war - WW1, WW2, the cold war, etc. In fact, the technological underpinnings which have allowed Branson and others (e.g. John Carmack) to try and progress space travel can be traced directly back to when the German government took a bunch of amateurs tinkering with rockets and threw lots of money at them; when WW2 ended, the same people ended up working for the USA and USSR governments.

Trickle-down economics. In the simplest form, the idea is that giving tax breaks to the rich will improve the overall economy. However, there's a few flaws in this, the biggest of which is the assumption that the rich will go out and spend the extra money. Given that the rich (pretty much by definition) already have everything money can buy, they're far more likely to invest the money in abstracted, minimal-tax financial schemes - and these schemes are likely to be offshore and hence deliver little or no benefit to the local economy.

Similarly, if the rich do spend more money, it's likely to be on "luxury" brands and services and as highlighted above, a significant percentage of the cost for these items is likely to be for the "brand" rather than the resources needed to produce it. To use Apple as an example (as it's cited in the article), a 64GB iPod Touch currently costs £250 at PC World whereas a 32GB iPod touch is £180. That's an extra £70 for just 32GB of flash memory, which can be currently had elsewhere for £10 - £15 so (to grossly simplify things), Apple is getting £60 of additional profit from the "luxury" model. And that profit's going straight back to Apple at the high end of the economy; the low end of the economy isn't seeing a single penny of it.
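To make that margin arithmetic explicit - a rough sketch using the retail prices quoted above (which will no doubt have changed since), with the flash cost taken as the midpoint of the £10-£15 range mentioned:

```python
# Rough margin estimate for the 64GB vs 32GB iPod Touch.
# All figures are the illustrative prices quoted in the post.
price_64gb = 250.0   # GBP, PC World price for the 64GB model
price_32gb = 180.0   # GBP, price for the 32GB model
flash_cost = 12.5    # GBP, midpoint of the quoted £10-£15 for 32GB of flash

step_up = price_64gb - price_32gb    # extra charged for the bigger model
extra_margin = step_up - flash_cost  # profit over the raw component cost

print(step_up)       # 70.0
print(extra_margin)  # 57.5 - i.e. roughly the "£60 of additional profit"
```

Grossly simplified, of course - it ignores assembly, distribution and the rest - but it shows where the bulk of that £70 step-up goes.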

Industrial technologies vs "personal" technologies. In brief, a personal technology is one where the technology can be mass-produced, it offers a significant improvement over "muscle power" and the ongoing cost of use is low enough for an individual to fund it. Bicycles, cars, computers, mobile phones: each one in turn offered a quantum leap forward in terms of travelling times, carrying capacity, processing capacity and communication capabilities - and the ongoing costs for each are relatively low, thanks in no small part to the fact that (to a greater or lesser degree) they're built on infrastructure derived (again!) in no small part from government investment.

Industrial developments, however, carry too high a cost for most individuals to afford them or are simply impractical for the majority of individual uses, so tend to be used for mass-transit. And guess what: these are (again!!) usually funded or at least partially subsidised by the government - aviation fuel and train infrastructure being two good examples.

In fact, to keep with the aviation example: it costs around £7,000 to get certified on a light plane. Then there's also the cost of maintaining the plane and buying the fuel, not to mention storage. Even hiring a plane is expensive; the cheapest I found at a glance online was £350 an hour. Conversely, a low-end hatchback car can be hired for 3 days for just £40 - or effectively around £0.55 per hour!
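A quick back-of-envelope check on those hourly rates (figures as quoted above, purely illustrative):

```python
# Effective hourly hire costs: light plane vs low-end hatchback.
plane_per_hour = 350.0          # GBP/hour, cheapest hire found at a glance
car_total, car_days = 40.0, 3   # GBP for a 3-day hatchback hire

car_per_hour = car_total / (car_days * 24)   # spread over every hour hired
print(round(car_per_hour, 2))                # 0.56 - i.e. "around £0.55"
print(round(plane_per_hour / car_per_hour))  # 630 - the cost ratio
```

A 600-odd-fold gap per hour is exactly why flying remains an "industrial" rather than "personal" technology by the definitions above.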

It's therefore unsurprising that a lot of pilots get their certification through a stint in the military... which is funded (again!!!) by the government.

It's not unreasonable to expect that even when "commoditised", space-flight will prove to have a similar cost ratio when compared to aviation. Whichever way you cut it, climbing out of a gravity well requires a lot of energy, puts an incredible strain on components during flight and requires a lot more technology (e.g. vacuum seals, etc). And as for the cost of getting certification, there's probably going to be at least one extra zero tacked onto that £7,000!

So, to summarise: Branson's investment is a good thing: he's not wasting his cash on "luxury" items and there's likely to be trickle-down technology. But space-flight will never be a commodity technology, nor will it ever be something that the average individual can personally own, and the trickle-down effect is likely to take years - if not decades - to manifest. And the vast majority of the money which comes out of developing space tourism will be going straight back into the high end of the economy. And it's all only possible thanks to that bogeyman of Republicans and Conservatives: big government. And any further significant developments (e.g. space elevators) will almost certainly have to be backed by big government, in much the same way as the Chunnel and other similar infrastructure projects have been.

To offer a final counterpoint to the article: if you want to boost the economy, then a better approach would be to increase spending power at the bottom end, where it's much more likely to be spent on physical and/or low-margin goods and services which need (relatively speaking) much higher levels of resource to produce - and in far higher volumes, to boot. For instance, if you give a multi-millionaire an extra £250,000, he might go out and buy a single high-end car. Give ten non-millionaires £25,000 apiece, and they'll go out and buy ten mid-level cars. And aside from the ten-fold increase in resources needed to produce those ten cars, where a luxury car is likely to involve significant levels of imported resources (ranging from engines up to the assembly of the entire car), a much higher percentage of the resources for a mid-range car will have been drawn from the local economy - as will the resources needed to maintain those ten cars (e.g. garages, mechanics, etc).

In fact, there's evidence to suggest that giving people a guaranteed basic income actually has a major benefit to the economy as a whole; not only does it simplify administration and thereby /reduce/ government, but it also has significant social benefits: crime drops, child nutrition and school attendance improves, people save more and produce more startups. In fact, that's pretty much the key premise behind the article - but instead of a small handful of Bransons and a small number of indirect long-term economic benefits, we get major direct ongoing economic benefits, hundreds - if not thousands - of entrepreneurs *and* Branson will still be free to tinker with spaceships - in fact, he may even have more cash to do so, if tax revenues rise to the point where government can cut taxes.

Admittedly, the above is simplified and there's plenty of other factors to take into account. But hey... tis the end of the day.

5
3

Xperia Z3: Crikey, Sony – ANOTHER flagship phondleslab?

juice

Re: Five Hundred and Forty Nine???

A quick glance dug up a sim-free price for the iPhone 6 of 539 quid, or just a tenner less - and that's for the low-end 16gb model.

Either way, there's not going to be many picking one up sim-free; most will pick one up on a contract. And interestingly, another quick glance at carphone warehouse shows that contracts for the iphone 6 are currently trending around a tenner (per month) higher than the equivalent contract for a Z3 (e.g. £43 vs £34.50 for a bare-bones 24-month contract with no up-front costs).
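Spelling out that contract comparison - a sketch using the Carphone Warehouse prices quoted above, which are obviously a snapshot and will have moved on:

```python
# Total cost gap between the two bare-bones 24-month contracts quoted above
# (no up-front costs on either; prices are illustrative).
iphone_pm = 43.00   # GBP/month, iPhone 6 contract
z3_pm = 34.50       # GBP/month, equivalent Xperia Z3 contract

monthly_gap = iphone_pm - z3_pm   # 8.50/month - "around a tenner"
total_gap = monthly_gap * 24      # over the full contract term
print(round(total_gap, 2))        # 204.0
```

So over the life of the contract, that "tenner a month" works out at roughly £200 - comfortably more than the sim-free price difference.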

Then too, give it a few months and the Z3'll have come down in price - I wouldn't be surprised if it could be picked up for under £400 after Christmas...

0
1
juice

Z3 and G3...

I picked up the LG G3 a month or so back - as tempting as the previews of the Z3 were, my contract was up, the S5 didn't look particularly inspiring and I didn't feel like waiting for the Z3 to come out, partly because I'm never keen on buying expensive/high-end kit during the initial launch - I much prefer waiting for the inevitable hardware/software issues to be debugged and patched out.

Speaking of which, the G3's now had 3 software patches, two of which specifically targeted performance and battery life (the third one doesn't seem to have done much other than tinker with the keyboard a wee bit). And even without enabling the "experimental" ART runtime, so far the battery life is pretty impressive; even with fairly heavy use (2-3 hours) a day as an ebook reader and the usual online timewasters such as Facebook, I'm comfortably getting 2-3 days usage on a single charge; it's easily triple what I was getting out of my (admittedly slightly aging) Galaxy S3.

... which is all a long-winded way of saying: were the comments in the article about the G3's battery life based on how well it performs now, or how badly it performed when first released? I know re-testing kit can eat up a huge amount of time, but it'd be nice to get a proper assessment of how the various flagship models compare to each other "now", rather than how things were several software revisions ago - especially since these days, software affects everything from battery life and call quality to camera performance and video quality!

0
1

Want to see the back of fossil fuels? Calm down, hippies. CAPITALISM has an answer

juice

The problem with this article...

Is that we don't just use fossil fuels for energy. Aside from the obvious uses (plastics, fertilizer), any industry which uses chemicals (pharmaceuticals, cosmetics, etc.) needs them...

Cracking the energy issue is a start. But it's by no means the full picture.

16
1

Read The Gods of War for every tired cliche you never wanted to see in a sci fi book

juice

It wasn't until I got to the end of the review...

That I realised that it wasn't one of John Ringo's books ;)

From the review, it does sound like one of the cookie-cutter military-scifi books that Jim Baen used to love. And tucked away on the Baen website (http://www.baenebooks.com/c-1-free-library.aspx) is a nice set of free novels from various authors - Baen was one of the first to realise the marketing potential of giving away older books for free, especially if they're part of a series.

Among many others, there's David Drake's Redliners (military leading civilians through a hell-planet's carnivorous jungle), a number of titles from David Weber's Honor series (space opera with epic quantities of ship battles) and even a few of John Ringo's titles.

Alas, the online archive seems to have shrunk over time. However, Baen also gave away CDs with many titles on, which are available - with Baen's knowledge - at http://baencd.thefifthimperium.com/. So you can still brush up on AI-tank battles (the Bolo series), or go to war with the Romans against an evil empire controlled by an intelligence sent back from the far future (Belisarius). Or you could even join some genetically modified bats and rats - armed with a Shakespearian data download - as they take on an insectoid army despite the worst efforts of some incredibly inept human commanders (Rats, Bats and Vats)...

1
1

Ofcom will not probe lesbian lizard snog in new Dr Who series

juice

Actually...

First: Dr Who isn't Sci-fi (and arguably hasn't been ever since RTD picked up his pen): it's fantasy with a bit of technobabble and the occasional[*] Deus-ex-machina thrown in.

Past there, the kiss scene was pretty blatantly crowbarred in[**], but to be fair, the entire episode was pretty much made up of heavy-handed, self-indulgent and distinctly clumsy set-pieces, most of which were intended to establish this series' overarching plot-thread rather than progressing the story at hand.

So overall, I'd say there's far more worthy things to complain about[***] ;)

[*] Alright, more than occasional, especially if you throw in the way the sonic screwdriver gets used these days. I was trying to be generous...

[**] Given that the robots stopped moving instantly when you stopped breathing, the characters could have gulped a breath every 30 seconds and gotten away scot free without any issues at all...

[***] No, I wasn't impressed. And I am getting bored of footnotes, so I'll stop now ;)

5
2

Microsoft: Just what the world needs – a $25 Nokia dumbphone

juice

46 hours playback, 32GB, USB rechargeable?

I'm actually tempted to pick one up as a pure MP3 player - there's not many out there which can claim to have that level of battery life...

0
1
