Re: That is all well and good but...
> So when I use Firefox on OSX why does nothing play? Safari is nowhere in sight.
Really? You found it quicker to type a question than to visit the links posted above? Tch.
>so it baffles me why those sites cannot default to that method of delivery.
Apparently the 'iPad' method of delivery is HLS, which doesn't currently allow full iPlayer functionality without more effort from the iPlayer development team. They don't want to make that effort for just one browser, since they are keen to get HTML 5 delivery working across all devices.
The dev team say that OSX Safari is missing something called AVC3, which is required for HTML 5 iPlayer delivery. I don't know, but maybe their assumption is that it wouldn't be that hard for Apple to add it to OSX Safari (since it's in iOS Safari already).
In the meantime, their unofficial advice to OSX users seems to be 'pretend to be an iPad or use Opera 32'.
>So they can show content without flash but for some reason are stubbornly using flash as their default.
The BBC team give their reasons here:
Something to do with OSX Safari not supporting AVC3, and HLS not allowing the full iPlayer functionality. Mac users can use Opera 32 instead to access BBC HTML 5 content, though (and I'm no expert) it would seem more sensible if Apple could tweak Safari.
The BBC is moving away from Flash, having had an HTML5 player in Beta since September:
Apple iOS, Windows 10 Mobile and BlackBerry users will get the HTML5 player by default, as will compatible desktop browsers where Adobe Flash is not installed or enabled.
However, the only OS X browser it works on is Opera 32. ( http://www.bbc.co.uk/html5 ). The reason the BBC give is that "Safari on Mac OS X doesn’t support AVC3 via its Media Source Extensions implementation. It does, however, support HLS, and whilst we could offer HLS streams to Mac OS X Safari users (some of you have noticed that you can pretend to be an iPad and you get a working player) we’ve deliberately not enabled it during the trial. " Apparently trying to achieve the full capabilities of iPlayer (HD programmes, Live Rewind etc) in HLS would be too much effort for just one platform, and slow the complete switch away from Flash.
( http://www.bbc.co.uk/blogs/internet/entries/8be5501d-43e7-4bf6-8f1e-e7037980a0f0 )
So, I don't know how this works, or even what AVC3 is, but could it be that tricky for Apple to implement? It doesn't affect me, but can someone make a 'feature request' to Apple? Or do all Mac users watch content through iPads and iTVs instead?
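For the curious, the sort of capability check the BBC is talking about is easy to do in a browser. A minimal sketch of how a web player might pick a delivery method - note the `avc3` codec string below is purely illustrative, not the BBC's actual parameter, and the function names are my own:

```typescript
// Pick a delivery method based on what the browser supports.
// The avc3 codec string is an illustrative example only.
function preferredDelivery(): "mse-dash" | "hls" | "flash" {
  const g = globalThis as any;
  // The full-featured HTML5 player needs Media Source Extensions
  // with AVC3 support (this is what OSX Safari reportedly lacks).
  if (g.MediaSource?.isTypeSupported?.('video/mp4; codecs="avc3.640028"')) {
    return "mse-dash";
  }
  // Native HLS (e.g. Safari, iOS) gives a working but reduced-feature player.
  const video = g.document?.createElement("video");
  if (video?.canPlayType("application/vnd.apple.mpegurl")) {
    return "hls";
  }
  return "flash"; // legacy fallback
}
```

This is also why 'pretend to be an iPad' works: the server then serves the HLS branch instead of demanding MSE+AVC3.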
Still, it is encouraging that the BBC is already on the road away from Flash.
“Women and cats will do as they please, and men and dogs should relax and get used to the idea.”
- Robert A. Heinlein
Similarly, I've been told that a sportsman like David Beckham can take advantage of some damned tricky physics and kick a ball so that it curves in mid flight. It's never been suggested to me that he understands the laws of physics that govern the flight of the ball, only that he has a damned good feel for it.
>I am not convinced that a proprietary dongle is any better than a proprietary 'smart TV'.
With respect to the topic of this thread, it is clearly better for a cheap dongle - proprietary or open-source - to be 'bricked' by malware than an expensive TV.
A discussion about Kodi and its sources is a different conversation.
> Apparently you think I live in the UK. I do not ;)
Your browser's address bar reads: Theregister.co.uk...
If you engage in a discussion about consumer rights, fine, but please respect that the UK is the default value.
Also, you have used that icon incorrectly - if you hover your mouse over each icon, you can read a guide to how to use it. For the icon you have used, we'd be expecting references to equations, rockets, or very, very small things at least!
Anyways, no worries, and welcome to the Reg! : )
>What if it connects to the neighbour's network? Or somebody walking past?
Android TVs don't do that! Stop plucking bullshit out of the air, FFS!
There is nothing stopping you from using these TVs as dumb screens - just don't connect them to the fucking network if you don't want to. It. Really. Is. That. Simple.
>But, sadly, that doesn't sound flash enough for marketing so this shite is wheeled out instead.
This 'shite' (iPlayer et al) is useful for many people and adds less than a tenner to the bill of materials of a £400+ television, if that.
Or for viewing in a wider range of lighting conditions: just buy a TV but don't connect it to a network. Easy.
If you want the 'smart' functionality, it can be delegated to a discrete and low cost dongle.
Projectors are good for some circumstances, but not for all.
> there will be nothing you can do
That is evidently incorrect. You can do something, and it is easy: Do not connect your TV to a network. Strewth.
'Reasonable time' is the phrase arrived at by our elected representatives, not the TV vendors. Five years is not the mean time before failure, either. Nothing is stopping you from making a case to Trading Standards if your TV fails after seven years - five years is merely a figure that the vendors are using, and what they offer has no effect on your statutory rights. However, what is 'reasonable' depends upon the product.
If you really have an issue with it, write to your MP.
People don't see themselves as being 'nickel and dimed' when the television they can buy for £500 today is far bigger and of higher resolution than a set the same money would have bought them a few years back. Oh, and don't suggest the general public are 'stupid' - it betrays your ignorance.
>What I really want to know is "How is the user going to fix this problem when they don't even have a PC any more?"
Well, if the product proves to be 'not fit for the purpose for which it was sold', the onus is on the retailer to sort the issue out for the buyer.
If you don't want a connected TV, don't connect it to your network.
Similarly: If you don't want your TV to pick up terrestrial broadcasts, don't plug in an aerial.
For smaller sizes, you could just buy yourself a monitor I guess, but at bigger sizes every TV using the latest panel technology (OLED, Quantum Dot, local dimming etc) will have a 'smart' functionality and a tuner or two. The functionality really doesn't add much to the cost of the TV.
Absolutely, James. And yeah, even if there is some good content on TV (the David Attenborough programme about Bioluminescence on BBC is awe inspiring) it might not be broadcast when I want to watch it - so yeah, streaming is how I watch most of my video.
For sure, it is *nice* to have streaming built into the TV, but the chances are that your PVR, Blu-Ray player or games console can do the streaming duties too. If not, a Chromecast or equivalent can be had for next to nothing (compared to a TV) and a phone or tablet makes a good remote control (especially when you need to enter text to search for content). Heck, I've got an old phone in a drawer that happily streams HD video over an HDMI cable.
A large reason that we are seeing 'smart TVs' being sold is that the required circuitry adds very little to the cost of a TV. It's a bit like 3D functionality - it doesn't add to the cost of a TV, because the refresh rates have been made higher for other purposes - so it is included even if the buyer isn't likely to use it.
(Actually, some of the BBC Nature stuff deserves to be bought on Blu-Ray - video of flocks of birds will upset streaming codecs. :))
>The whole thing is hideous and I would give it a couple of years at most before personalised ads start getting shoved down to the viewer and there will be nothing you can do
Er, just use the TV as a dumb screen and use the HDMI inputs for your choice of box, dongle, Blu Ray or computer. If you don't want the TV's smart features, just don't connect it to the internet. Easy. You have several HDMI ports, USB, Composite, and even DisplayPort on some models. A good number of people with Sky or Cable boxes don't even use the TV's built in tuners.
>because you agreed to the Google licence agreement when you bought the telly - yes really - I'm not making this up.
It seems that you are making this up. FFS, it's a Sony TV, and Sony will always have it work with other video sources (that they would like you to buy from them). Unlike their smartphones, Sony don't *need* to run Android on their TVs - their previous TV UIs were fine for select input / change volume / adjust picture - so Google don't have the same leverage to make Sony hobble their TVs even if they wanted to.
Yeah, that's all true, but you don't *have* to use the TV's built-in streaming hardware. You can get the same functionality from a discrete box or dongle in the event of the TV's OS becoming out of date or unsupported.
HD streaming dongles start at around £15, the 4K versions will have dropped in price by the time there is enough 4K content around to be worth bothering with.
>Why the joke icon????
Because unlike a phone, a TV doesn't need network access to perform its primary function - displaying video. All of the 'smart' or 'connected' functionality can be provided by an inexpensive (compared to the tv) box or dongle. Therefore, a TV with an out-of-date OS is still fit for purpose - hence the joke icon.
TV updates that affect the playback of video (adding new HDR formats, for example) can be done by downloading the update on a computer and transferring it to the TV with a USB stick.
Modern TVs come with a 5 year guarantee as standard, in keeping with our statutory right to have it last a 'reasonable' time.
All the Sony TVs have Android built in, and they make some of the best LED TVs, along with Samsung who use their own Tizen OS.
LG, who are the only ones making OLED TVs, use WebOS.
LED sets are brighter, so perhaps more suitable for watching in well-lit situations; OLED sets have perfectly black blacks, making them better for watching movies with the lights down.
You won't be able to buy a 'dumb screen' at TV sizes, but nobody is forcing you to plug an ethernet cable into it. By the time 4K content is more widely available, external HDMI 2.0 boxes should be cheaper.
Thank you Monty75, that was my reading of it too. Everything online says "This is a developer preview, and as such is currently case-sensitive only". Compare and contrast with the Reg article:
"The file system is also case-sensitive and that apparently cannot be disabled, which will lead to all sorts of knock-on compatibility issues. Yep, you will have to buy more Apple gear: a new watch to go with your new phone to sync with your new laptop. Apple is always looking after that bottom line."
Talk about adding 2 to 2 and getting 5. The lack of [fact checking] around here is getting beyond a bit daft.
>but its usage is spotty at best - the apple keyboard will pop up for certain passwords,
That sounds like a feature not a bug, if you can't trust the vendor of a 3rd party keyboard.
>no microphone on the new keyboard, I'd forgotten how constraining Apple products are.
Allowing 3rd party developers to use the Siri APIs has only just been announced at this WWDC, so it is possible that 3rd party iOS keyboards will allow voice input soon. Maybe.
>Apple's attitude to third parties – where it dictates terms and expects people to follow them – has not worked out so well in other markets
It seems to have worked well in the audio peripheral market - the 3rd party 'Made for iPod/iPhone' headphones and speaker dock market.
I'm an Android user, and in most stores the selection of iPhone headsets is much wider. They only ever half work with Android phones (because a, Apple is awkward and b, even within individual Android vendors, the implementation of the 3.5mm audio input/output socket varies).
You want change for change's sake?
One of the nice things about OS X (originally called Mac OS X) is that it doesn't force change upon users, like Windows has done over the same period. Ideas from touch-based OSs - such as multi-touch gestures on Apple trackpads - have been added to OS X, but they never stopped the user from doing things the way they already had been. Heck, unlike Office in Windows, Apple still let users use menus, if that is what they want to.
Oh, there was some Mac news that wasn't in their keynote: a new file system called APFS, still in Beta.
(For the record, I mainly use Windows - familiarity breeds contempt, I guess. I have administered a few Macs though, and found them to be pretty civilised. It could be that I haven't used OSX enough to discover any massive annoyances)
All this coverage of this acquisition on The Register, yet I haven't seen anyone make the point that potentially LinkedIn's biggest competitor is the long established recruitment agency industry. They were making money from their own silos of user/client-provided data long before Facebook et al were on the scene, agencies that would take a percentage of someone's earnings. What value would they add? Why, no more than consult their databases and liaise with employers and employees.
LinkedIn has the potential to disrupt that - if nothing else, it could automate the process of checking references, from the point of view of recruiters.
This isn't my point of view, but one that was given to me in a pub by the head of recruitment for a large company a few years back.
Yeah, I take your point. However, in the rapidly evolving world of IT, the public need to think about things *before* they happen. Analysts play a role in this - even if they have as much insight and foresight as a Wired.com hack, because here we all are, offering argument and counter-argument.
The analysts don't have 20:20 crystal balls, but they usually do offer their reasoning. We obviously can't go by the predictions offered to us by any players in the game, such as Google, Samsung, MS, Apple etc because they want to bend our perception to their ends. The big players do, however, have their own analysts, and I suspect that some of them are very good at what they do, and have more expensively-gained information to study.
tl;dr: 1, Laypeople speculating is healthy
2, We won't read here what the best analysts think
If I was serious about security (i.e., I needed to have clients' data on my phone, which would result in fines for me were my phone to be accessed by a third party) then the saying "If you want to go there, then I wouldn't start from here" would seem to apply.
Here's the thing: I can't easily find any information about just how vulnerable - or otherwise - Android (various versions) is to attack, both proof-of-concept and seen-in-the-wild. Perhaps the Reg could put together a sketch of the current Android security landscape?
I don't even know if it is safer to have an older version of Android, but with no extra apps installed, or to have a new version but with dozens of apps from the Google play store.
If I was a doctor or a lawyer, I'd just buy an iPhone and be done with it. If I was a terrorist, a whistle-blower, or a very high level executive or engineer, I'd spend some time and discipline studying operational security before making a decision.
>Google, with its perpetual attention deficit disorder, never sat down and thought properly about an update mechanism for Android.
They didn't have a choice - The way most ARM-based systems were designed doesn't allow for a one-ROM-fits-all (Linux distro-style) updating. Google bought Android in, as they were desperate to catch up with the iPhone. That was at the beginning.
In Act 2, silicon was advancing so much that two-year-old phones weren't really worth updating. It's only been the last couple of years that older phones have been good enough to keep using (though of course a new budget, but pretty good, Android phone won't break the bank).
>We need something like linux for phones. Something that users can install on a wide range of hardware and still have something functional.
>>Isn't that what AOSP ROMs like Cyanogenmod are?
Alas, no - those ROMs still need to be compiled beforehand to work on a specific handset.
The issue isn't always with the phone manufacturer, but with various chipset manufacturers... they don't always get a new Android binary blob over to the phone vendors. Where is their motivation to do so?
Saying that phone vendors don't do updates because they love built-in obsolescence is an art-school level of analysis. You might be right on occasion, but your reasoning is suspect.
>Just as Safari and other Apple services are an inextricable part of iOS.... ...Sure, using the Oracle lawsuit might be an easy excuse, but if the EU likes how Apple does things, might as well jump on the bandwagon.
The EU only tries to hobble companies that they consider to be abusing their dominant market position... Apple have a small (but lucrative) slice of the market, so they will be left alone.
>all the existing customizable phones are hopelessly out of date?
Out of date? If it still makes phone calls and sends texts, it won't be out of date. I'm aware of the good work done by people on XDA, but really, they are often trying to customise something that should have been good enough to begin with, bringing bugs and security holes into the process. I'd be interested to see a percentage figure for the number of phones that run an Android version that didn't come from the vendor. It's a phone, not a toy.
Android has slow updates because of its architecture - Google were in a hurry to catch up with the iPhone at the time. The way ChromeOS updates show how Google would like things to be done.
>Or are they going to leave it fifteen years and then look into a monopoly action against Apple costing more than it ever recovers, ala Microsoft vs EU?
What the hell are you thinking? Apple don't have a monopoly! How - or why - would you prosecute a company for the abuse of a monopoly it doesn't have? Shit, I'd be surprised if they enjoyed 25% market share, let alone 50%.
The EU mandated micro-USB for charging because Samsung used several different connectors, Sony used several different connectors, Nokia used several different connectors... and these proprietary connectors were hard-wired to their wall plugs. The only company that used the same charging connector for its gadgets over several years was Apple, and their wall plugs just had a USB-A socket.
You don't have to buy your Thunderbolt cables from Apple.
It's a bit like FireWire, some versions of which were faster than USB 2 - most people ('consumers') didn't really need the extra speed and features (not being packet-based, FW is a more natural fit for audio recording gear). Of course, the people who did need it, initially for high res scanners and then digital video cameras, made good use of it (or whatever the hell it was Sony called it) for many years.
Niche kit always looks a bit pricey, regardless of who makes it.
> file sizes and storage density having all but stagnated in the past decade.
4K televisions are becoming very inexpensive, and they like to be fed with a lot of data (their resolution is higher, but they also use more bits per pixel). I mention this because video files have driven consumer HDD sizes and interconnects in the past.
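A back-of-envelope sketch of that appetite, using uncompressed figures purely to show the scaling (real streams are heavily compressed, and broadcast chroma subsampling reduces this further):

```typescript
// Uncompressed video bandwidth in gigabits per second.
// Assumes three full colour channels per pixel (no chroma subsampling).
function rawGbps(width: number, height: number,
                 bitsPerChannel: number, fps: number): number {
  return (width * height * bitsPerChannel * 3 * fps) / 1e9;
}

const hd = rawGbps(1920, 1080, 8, 60);   // 1080p, 8-bit:  ~3 Gbit/s raw
const uhd = rawGbps(3840, 2160, 10, 60); // 4K, 10-bit:   ~15 Gbit/s raw
```

Four times the pixels and 10-bit colour instead of 8 multiply out to five times the raw data - which is why 4K pushes on storage sizes and interconnect speeds even after compression.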
You're not imagining. Intel developed the optical version of Thunderbolt - then called LightPeak - first, before reverting to copper on cost grounds. Apple contributed the Thunderbolt name, and it was mainly Apple who used it - though Sony, bless 'em, made a VAIO laptop with a USB-A Thunderbolt port for driving an external GPU.
This was some years ago now (indeed, VAIOs were still Sony), but it is only now that the idea of external GPUs is gaining traction amongst the gaming crowd. And Apple's next cinema display is rumoured to have its own GPU, because its existing MacBook Pros don't have a connection capable of shunting 5K video.
As always, I'll let the gamers and Apple users pay the first-adopter's premium and iron out the bugs, and look forward to it being cheap and reliable in a year or two.
Thunderbolt started off as an Intel concept called LightPeak that used optical fibres, but it proved too costly, so they reverted to copper, and Apple contributed the Thunderbolt name. LightPeak seemed an attractive idea to me at the time, because a noisy computer / server / GPU farm could be kept in the next room - or indeed the garden shed - without much compromise. These days though, computers good enough for my purposes are generally cooler and quieter.
I keep hearing about photonic circuitry too, but it seems to be a few years off at the very least.
As regards consumer and desktop devices, copper-cable based solutions offer a usability advantage over optical connections* - they can carry power, so a single cable can do 'everything' (power, video, storage, peripherals etc).
*Yeah, some people are working on power-over-fibre, but the use cases remain specialised (underwater robots, MRI machines, physics labs etc). My instinctual reaction to 'consumer fibre with 5W laser beams' is "Arghh, my eyes my beautiful eyes!!"
I can see scenarios where a fibre optic connection would be useful.
>She dropped her phone and felt she would have been unable to dial 000...
The advice given in CPR training is to administer CPR first, and shout for someone else to call the emergency services afterwards, such is the urgency.
(FFS do NOT take my word for it, but take a course yourself)
True, a Nokia 3310 *might* not have slipped from her hand, or might have been *easier* to dial [emergency number]. The nature of hypothetical questions is such that she might have left her Nokia in the car, instead of bringing it into her house to play Angry Birds.
The actual 'feature' she used was first seen on Motorola's Moto X handsets after they were bought by Google. They made use of a smaller co-processor that could continually monitor the microphone for an 'OK Moto' voice prompt. Amazon have subsequently taken the idea and built it into a speaker-like device.
If Siri or its competitors can ring for an ambulance *and* relay location data, that would be a potentially life-saving feature. As it is at present, paramedics only have cell-mast triangulation data, though in urban locations wifi-based location is often more accurate. If the desk also had the ability to instruct your phone to provide audio cues for the CPR rhythm - even better.
As always, the devil is in the details of the implementation.
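The compression-cue part, at least, is the easy bit. A sketch of the timing logic (the 100-120 compressions per minute range is the commonly taught figure; as I said above, take a real course rather than trusting a forum post, and the function name here is my own invention):

```typescript
// Milliseconds between audio cues for a given CPR compression rate.
// 100-120 compressions per minute is the commonly taught range.
function cueIntervalMs(compressionsPerMinute: number): number {
  if (compressionsPerMinute < 100 || compressionsPerMinute > 120) {
    throw new RangeError("rate outside the taught 100-120/min range");
  }
  return Math.round(60_000 / compressionsPerMinute);
}
```

The hard details are elsewhere: getting the dispatcher's instruction to the handset securely, and keeping the cue audible over a speakerphone call.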
>format sea cucumber
AND I have my new passphrase!
Right o', gotta get on and set it as the passphrase for my LinkedIn, MySpace, AshleyMadison, Beano and HSBC Bank accounts.
Might as well change my username whilst I'm at it... how does BlueTiger97$ sound to you guys?
When I do use Google voice search, I'm still surprised at how well it works. It's curious that I'm still surprised, I guess.
My accent is closer to RP than some people's, and for some odd reason I'm more likely to use it when I'm confident that it will understand me correctly (i.e., I know it will find 'star wars cinema' easier than some rare place name).
>Even then, privacy concerns were paramount. Yet there is no more or less privacy talking to a VAPDA than there is typing into Google.
Eh? That's clearly not true:
-Typing a Google search: Google knows you're searching for "Haemorrhoid cream".
- Speaking Google search: Google AND your friends / co-workers know you're searching for "Haemorrhoid cream".
A lot of people have made peace with Google, Amazon and KinkyStuff.com knowing things about them that their friends and neighbours in real life do not.
Well that's kind of the point: Henry Ford went in for an assembly-line approach. When you say he is only famous because he put his name on his cars, you are sidestepping the whole *reason* his cars became famous in the first place. He didn't invent assembly lines or internal combustion engines, but he put his resources behind a combination of the two.
Or: Is a man who invents cars a better engineer than the man who invents machines to make cars? It's clearly a nonsense question.
Sometimes a person becomes associated with a technology because they were in the right place at the right time, with whatever motivation and whatever resources (brainpower, reputation, money) they happened to possess.
Heck, Aldous Huxley adopted Ford's name as a signpost in a fork of human history in Brave New World. The novel Catch-22 was a warning about how the manufacturing techniques of WW2, echoing Dwight D Eisenhower's Farewell Speech, had continued into peacetime. Heller's mate Kurt Vonnegut was a straight-up journalist until his editor mistook his true-to-life reportage of a post-war factory as science fiction.
There is technology, and then there is the use that technology is put to. Would the name Oppenheimer be as well known had his bosses not decided to finance the Manhattan Project?
Maybe individuals are important, maybe not - I don't know - though the telephone is held up as an example of very similar invention patents being filed on the same day by different people on different continents.
If inventor X had been 'run over by a bus' as a child, would inventor Y have invented the same thing within a year?
If a Salesman or Military Commander had not promoted invention A, would others have done so sooner or later?
tl;dr: If you are interested in technology, study the scientists and inventors. If you are interested in how technology impacts upon people's lives, study people, scientists, inventors, manufacturers, salesmen, generals, presidents, etc.
I once worked in a UK nuclear site... Down some scarcely used corridors would be black and white photos of the physicists from the early days... Most of them smoking a pipe.
Yeah, but isn't the name Henry Ford as well known as Karl Benz or Rudolf Diesel?
For that matter, Isambard Kingdom Brunel is famous too - again, for the scale of his implementation of existing inventions.
>Nowadays you’re as likely to need an out of this world ego and background in Silicon Valley financing.
That'd be Buzz Aldrin, then! But seriously, his science fiction novel Encounter with Tiber (1996, written with John Barnes) reads almost as a manifesto for private enterprise getting mankind into space on a routine basis. It's actually a good primer on many of the spaceflight concepts being planned in the near-to-middle term (the parts of the book based around Sol) and longer term (the technologies used by the Tiberians of the title).
I'll see if I can dig out a link to a good outline in the next ten minutes!
>Who else read the Dr Strangelove lines in his voice? lol.
How could I not? :)
50:50? Dr Strangelove disagrees!
Dr. Strangelove: Well, that would not be necessary, Mr. President. It could easily be accomplished with a computer. And a computer could be set and programmed to accept factors from youth, health, sexual fertility, intelligence, and a cross-section of necessary skills. Of course, it would be absolutely vital that our top government and military men be included to foster and impart the required principles of leadership and tradition. Naturally, they would breed prodigiously, eh? There would be much time, and little to do. Ha, ha. But ah, with the proper breeding techniques and a ratio of say, ten females to each male, I would guess that they could then work their way back to the present Gross National Product within say, twenty years
Turgidson: Doctor, you mentioned the ratio of ten women to each man. Now, wouldn't that necessitate the abandonment of the so-called monogamous sexual relationship, I mean, as far as men were concerned?
Dr. Strangelove: Regrettably, yes. But it is, you know, a sacrifice required for the future of the human race. I hasten to add that since each man will be required to do prodigious...service along these lines, the women will have to be selected for their sexual characteristics which will have to be of a highly stimulating nature.
Russian Ambassador: I must confess, you have an astonishingly good idea there, Doctor.
I'm not downplaying the importance of regular updates (or defending the chain of OEM > ODM <> Carrier > Regulator > User), but Planty has made a valid observation - news coverage, or personal accounts, of attacks on Android in the wild are a bit thin on the ground. I say this not because I don't think they exist, but because I am curious.
Again, I'm not saying ignorance is an excuse for complacency.
Nice article, but doesn't Sir David MacKay FRS, FInstP, FICE deserve to have his name mentioned?
"In his final interview before his untimely death, DECC’s chief scientific advisor called it an “appalling delusion” that the UK could meet its energy needs from renewables."
Perhaps "In his final interview before his untimely death, DECC’s chief scientific advisor David MacKay called it an “appalling delusion” that the UK could meet its energy needs from renewables" might work just as well?
Also, it should be noted that the conductor of the interview asks viewers to "please do not quote him out of context or sensationalise what he said."