Lately? They've been around for as long as I remember.
"Windows is checking for a solution to the problem" always has me in stitches!
>this will increasingly devalue photographs as a source of future historical record
Patented in 1947, may I present this photo-retouching table?
It vibrates the photographic negative that the artist is working on, so that brush strokes are rendered invisible:
Before we used the term 'photoshopped', we would talk of people being 'airbrushed' from history.
>Does Google map thing replace an A to Z?
It can do. Other mapping and navigation solutions are available.
As an added bonus, Google Maps will show areas of slow-moving traffic in real time, and so suggest routes that are quicker at the time. It will also show you where you are on the map, be up to date with business addresses, and even show you a photo so you know when you're there. It will also tell you the opening hours of a public house, and at what times it is typically busy.
Of course the downsides are that you need some battery life in your phone (though most cars can be fitted with a phone charger) and either a data connection or the foresight to pre-load map data onto your phone.
In fact, this is a very good example of the 'herd benefit' of using anonymous data from many users - Google knows when there is congestion because some Android phones will be sending speed and location data to Google - so if everyone on a particular road is doing 40mph when an hour earlier they were doing 70, Google knows there is an accident or roadworks. Of course, Google being Google, they do have your identifiable location data too unless you opt out of it, but it still stands as an example of the concept.
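As a toy illustration of the herd idea (the function name and the 0.6 threshold are my own assumptions for the sketch, not anything Google has published):

```python
# Flag congestion when the herd's current average speed on a road
# drops well below its recent baseline - e.g. phones reporting ~40mph
# where an hour earlier they were reporting ~70mph.

from statistics import mean

def is_congested(recent_speeds_mph, baseline_speeds_mph, ratio=0.6):
    """True if the current average speed has fallen below
    `ratio` of the earlier baseline average."""
    if not recent_speeds_mph or not baseline_speeds_mph:
        return False  # too few anonymous reports to judge
    return mean(recent_speeds_mph) < ratio * mean(baseline_speeds_mph)

# Phones now report ~40mph on a road whose baseline was ~70mph:
print(is_congested([42, 38, 40], [68, 72, 70]))  # True - likely an incident
```

The real system would obviously be far more sophisticated (per-segment baselines, time of day, filtering out parked phones), but the principle is the same: many anonymous data points, one useful aggregate.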
>Odd that no one has mentioned that stupid Beacon rubbish thing that Apple were punting.
No one, except for the Bluetooth SIG in their specs for Bluetooth 5.0, which will have 8x the bandwidth for 'connectionless traffic' compared with previous versions.
>On the other hand there are undoubted benefits from Google's mining if, like me, you live a life unlikely to attract the attention of either the police or criminals but suffer from increasingly failing memory.
The drug smuggler Howard Marks was once asked how he, a man who smoked a lot of hash, could remember enough of his past to write a best-selling autobiography, Mr Nice:
"Oh, that was easy, the FBI had me under surveillance for years... I just asked them for their file on me under their Freedom of information laws."
I believe you may have confused Apple's differential privacy with something they implemented in OSX Safari a few years ago.
It was the feature in Safari that would make advertisers believe you had visited sites that you hadn't - presumably websites drawn from a pre-compiled whitelist (so no KinkyStuff.com or ISISareCool.org).
Differential Privacy is different, so take a few minutes to scan https://en.wikipedia.org/wiki/Differential_privacy
At this stage, exactly how Apple will implement it is not known, but the concept is that Apple will have data about all their users, but can't reverse engineer that data (because of maths) to identify anything about an individual user.
Of course it goes without saying that the implementation is key.
>IOW, they probably already have ways to differentiate differential privacy.
Akin to encryption, it depends upon how the differential privacy is implemented in the real world. From what I understand, it is built upon proven mathematical ideas.
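For anyone curious how injecting noise can protect an individual while the aggregate stays useful, here's a toy sketch of randomised response, one of the classic ideas behind differential privacy. To be clear, this is an illustration of the concept only - it is not Apple's implementation, and all the names are mine:

```python
# Randomised response: each user flips coins before answering a
# sensitive yes/no question, so no single answer reveals their truth,
# yet the aggregate still estimates the population rate.

import random

def noisy_answer(truth: bool) -> bool:
    if random.random() < 0.5:      # first coin: answer honestly
        return truth
    return random.random() < 0.5   # second coin: answer at random

def estimate_rate(noisy_answers):
    """Undo the noise in aggregate: observed = 0.5 * p + 0.25."""
    observed = sum(noisy_answers) / len(noisy_answers)
    return (observed - 0.25) / 0.5

random.seed(42)
true_rate = 0.3  # say 30% of users really answer 'yes'
answers = [noisy_answer(random.random() < true_rate) for _ in range(100_000)]
print(round(estimate_rate(answers), 2))  # close to 0.3
```

No individual answer can be 'reverse engineered' - a recorded 'yes' is roughly as likely to be a coin flip as the truth - but with enough users the estimate converges on the real rate. That's the maths doing the heavy lifting.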
>Google can read all my email if it so desires; but as that's an informed choice I don't see the problem. (I see it as the price for the convenience of Gmail).
Also, Google have not suffered any massive security breaches, a la Ashley Madison, Sony et al... Google seem to be capable of keeping your data out of the hands of blackmailers and extortionists.
The agencies don't just use one data point to identify would-be terrorists; the chances of injected noise giving you - and no other Apple user - the profile of a terrorist are practically non-existent.
In any case, this data would have to be captured by the agencies in transit, because the whole point of differential privacy is that you can't be identified from Apple's data.
Differential Privacy has been developed by academics for years. Most technical experts welcome its adoption by Apple, but of course they look forward to seeing the actual implementation before passing any judgement.
You've not been to the United States, have you?
I really don't think that Snowden thought that Russia was all sweetness and light before he sought asylum there.
>Without the proprietary biggies, the public will then turn to open communication platforms following open communication standards that no country can control.
Sadly I suspect your thinking is wishful: the average user won't bother. For evidence, look at how many people use Facebook Messenger.
Sometimes it can be better to side with the big corporations, since they aren't as easily cowed by governments. Sometimes, that is. I'd sooner trust Apple - since their business model is to empty my pockets for hardware - than I would Facebook, which has both Ayn Rand-ian ideologies around privacy and an advertising-based business.
Edge suspends Flash content on tabs that aren't visible. Chrome doesn't do that by default.
The video tests MS conducted were based around streaming Netflix. I tried to find further info about the tests, but couldn't. However, it wouldn't be worth MS fudging the tests because of the backlash should they be found out. (And in any case, their findings reflect independent results).
Curiously, Netflix is only available in 1080p on Edge and IE on Windows, and on Safari on OSX - all other desktop browsers are 720p, and both HTML 5 and Silverlight are used.
Just to clarify:
If you are driving on a road that is flooded to a foot or two and there are other vehicles using it, don't go fast just because you are in a Chelsea Tractor - the resulting wave has fucked* the engines of smaller diesel cars and vans that would have been just fine had you not shown up.
If you are crossing a river in the back of beyond in a 4X4, then for sure, do what your training and experience tells you is best.
* This was the term our mechanic used - I assume it is technical.
The Reg article points out that "There is no serious suggestion that Elon Musk crashed his Tesla Model S or otherwise accidentally drove it into a body of water.", but doesn't mention the tweet Musk made immediately before, linking to an article about a Kazakh man driving his Tesla through a flooded tunnel.
The car floats, so the depth of the water is not an issue. What is an issue is the speed of the water, since the wheels won't be powering the car very efficiently. The tweet from Musk was in response to a news story - someone in Kazakhstan had driven their Tesla through a flooded tunnel, where the water wasn't flowing very quickly (compared to a river).
It does look like the driver was a pillock though - he was moving quickly enough to create a bow wave that could cause water to enter and damage the engines of other people's cars. Some 4x4 drivers have been known to do the same on flooded roads.
Horses for courses. :)
For the needs you have outlined, there are plenty of machines to choose from in the 'mobile workstation' or 'gaming laptop' categories - so happy days! And hey, better stuff is yet to come, with the promise of external GPUs and fast interconnects that blur the line between internal and external storage (if one's data is valuable, then having it stuck on one device is not the greatest idea anyway).
However, if you are a roving tech blogger or journalist and don't need the power, you might be glad of a lighter 'ultrabook' machine.
I like that there are a few approaches being explored at the moment. Lenovo, MS, Apple, ASUS and others are all trying various form factors. Some people might want one device to do everything and perhaps accept some compromises, others might be happier to carry a couple.
This was an interesting article because it was assessing a device against tasks which weren't its main focus.
Tools for the job... The iPad Pro wasn't designed to replace the MacBook, which would clearly be a better device for writing lots of text. The iPad Pro has it own strengths, but this article was about using the iPad for MacBook-like tasks 'in a pinch'.
In short, it sounds like an iPad Pro might suit you if you mainly work with images and graphics away from your desk, and only occasionally need to write a report or use a spreadsheet.
It wasn't a formal review, as the story was tagged 'Road Test'. It gave one person's view of the iPad Pro for their particular workflow when travelling. It could be a useful accompaniment to the more formal reviews of the iPad that can be found elsewhere on the internet.
The Reg will occasionally have articles tagged 'Hands On Review', which are first impressions.
The proper reviews are tagged 'Review'.
Generally - and feel free to go through past Reg articles to confirm this - the Reg is snarky about Apple when reporting rumours, launch events and the like, but generally gives good reviews of Apple kit itself.
Other reviews of Watch OS 3.0 suggest that it makes things simpler for users, in part by making less use of the 'digital crown' and using touchscreen more - actions that are already familiar to iPhone users.
For my taste, the Apple Watch does too much - but that's me. A small, tough watch (stainless steel and sapphire) with an oh-so-useful rotating bezel does me fine. If something similar with a discreet LED for notifications were made, I might be tempted.
> So when I use Firefox on OSX why does nothing play? Safari is nowhere in sight.
Really? You found it quicker to type a question than to visit the links posted above? Tch.
>so it baffles me why those sites cannot default to that method of delivery.
Apparently the 'iPad' method of delivery is HLS, which doesn't currently allow full iPlayer functionality without more effort from the iPlayer development team. They don't want to make that effort for just one browser, since they are keen to get HTML 5 delivery working across all devices.
The dev team say that OSX Safari is missing something called AVC3, which is required for HTML 5 iPlayer delivery. I don't know, but maybe their assumption is that it wouldn't be that hard for Apple to add it to OSX Safari (since it's in iOS Safari already).
In the meantime, their unofficial advice to OSX users seems to be 'pretend to be an iPad or use Opera 32'.
>So they can show content without flash but for some reason are stubbornly using flash as their default.
The BBC team give their reasons here:
Something to do with OSX Safari not supporting AVC3, and HLS not allowing the full iPlayer functionality. Mac users can use Opera 32 instead to access BBC HTML 5 content, though (and I'm no expert) it would seem more sensible if Apple could tweak Safari.
The BBC is moving away from Flash, having had an HTML5 player in beta since September:
Apple iOS, Windows 10 Mobile and BlackBerry users will get the HTML5 player by default, as will compatible desktop browsers where Adobe Flash is not installed or enabled.
However, the only OS X browser it works on is Opera 32. ( http://www.bbc.co.uk/html5 ). The reason the BBC give is that "Safari on Mac OS X doesn’t support AVC3 via its Media Source Extensions implementation. It does, however, support HLS, and whilst we could offer HLS streams to Mac OS X Safari users (some of you have noticed that you can pretend to be an iPad and you get a working player) we’ve deliberately not enabled it during the trial. " Apparently trying to achieve the full capabilities of iPlayer (HD programmes, Live Rewind etc) in HLS would be too much effort for just one platform, and slow the complete switch away from Flash.
( http://www.bbc.co.uk/blogs/internet/entries/8be5501d-43e7-4bf6-8f1e-e7037980a0f0 )
So, I don't know how this works, or even what AVC3 is, but could it be that tricky for Apple to implement? It doesn't affect me, but can someone make a 'feature request' to Apple? Or do all Mac users watch content through iPads and Apple TVs instead?
Still, it is encouraging that the BBC is already on the road away from Flash.
“Women and cats will do as they please, and men and dogs should relax and get used to the idea.”
- Robert A. Heinlein
Similarly, I've been told that a sportsman like David Beckham can take advantage of some damned tricky physics and kick a ball so that it curves in mid-flight. It's never been suggested to me that he understands the laws of physics that govern the flight of the ball, only that he has a damned good feel for it.
>I am not convinced that a proprietary dongle is any better than a proprietary 'smart TV'.
With respect to the topic of this thread, it is clearly better for a cheap dongle - proprietary or open-source - to be 'bricked' by malware than an expensive TV.
A discussion about Kodi and its sources is a different conversation.
> Apparently you think I live in the UK. I do not ;)
Your browser's address bar reads: Theregister.co.uk...
If you engage in a discussion about consumer rights, fine, but please respect that the UK is the default value.
Also, you have used that icon incorrectly - if you hover your mouse over each icon, you can read a guide to how to use it. For the icon you have used, we'd be expecting references to equations, rockets, or very, very small things at least!
Anyways, no worries, and welcome to the Reg! : )
>What if it connects to the neighbour's network? Or somebody walking past?
Android TVs don't do that! Stop plucking bullshit out of the air, FFS!
There is nothing stopping you from using these TVs as dumb screens - just don't connect them to the fucking network if you don't want to. It. Really. Is. That. Simple.
>But, sadly, that doesn't sound flash enough for marketing so this shite is wheeled out instead.
This 'shite' (iPlayer et al) is useful for many people and adds less than a tenner to the bill of materials of a £400+ television, if that.
Or for viewing in a wider range of lighting conditions: just buy a TV but don't connect it to a network. Easy.
If you want the 'smart' functionality, it can be delegated to a discrete and low cost dongle.
Projectors are good for some circumstances, but not for all.
> there will be nothing you can do
That is evidently incorrect. You can do something, and it is easy: Do not connect your TV to a network. Strewth.
'Reasonable time' is the phrase arrived at by our elected representatives, not the TV vendors. Five years is not the mean time before failure, either. Nothing is stopping you from making a case to Trading Standards if your TV fails after seven years - five years is merely a figure that the vendors are using, and what they offer has no effect on your statutory rights. However, what is 'reasonable' depends upon the product.
If you really have an issue with it, write to your MP.
People don't see themselves as being 'nickel and dimed' when the television they can buy for £500 today is far bigger and of higher resolution than a set the same money would have bought them a few years back. Oh, and don't suggest the general public are 'stupid' - it betrays your ignorance.
>What I really want to know is "How is the user going to fix this problem when they don't even have a PC any more?"
Well, if the product proves to be 'not fit for the purpose for which it was sold', the onus is on the retailer to sort the issue out for the buyer.
If you don't want a connected TV, don't connect it to your network.
Similarly: If you don't want your TV to pick up terrestrial broadcasts, don't plug in an aerial.
For smaller sizes, you could just buy yourself a monitor I guess, but at bigger sizes every TV using the latest panel technology (OLED, Quantum Dot, local dimming etc) will have a 'smart' functionality and a tuner or two. The functionality really doesn't add much to the cost of the TV.
Absolutely, James. And yeah, even if there is some good content on TV (the David Attenborough programme about Bioluminescence on BBC is awe inspiring) it might not be broadcast when I want to watch it - so yeah, streaming is how I watch most of my video.
For sure, it is *nice* to have streaming built into the TV, but the chances are that your PVR, Blu-ray player or games console can do the streaming duties too. If not, a Chromecast or equivalent can be had for next to nothing (compared to a TV) and a phone or tablet makes a good remote control (especially when you need to enter text to search for content). Heck, I've got an old phone in a drawer that happily streams HD video over an HDMI cable.
A large reason that we are seeing 'smart TVs' being sold is that the required circuitry adds very little to the cost of a TV. It's a bit like 3D functionality - it doesn't add to the cost of a TV, because the refresh rates have been made higher for other purposes - so it is included even if the buyer isn't likely to use it.
(Actually, some of the BBC Nature stuff deserves to be bought on Blu-Ray - video of flocks of birds will upset streaming codecs. :))
>The whole thing is hideous and I would give it a couple of years at most before personalised ads start getting shoved down to the viewer and there will be nothing you can do
Er, just use the TV as a dumb screen and use the HDMI inputs for your choice of box, dongle, Blu Ray or computer. If you don't want the TV's smart features, just don't connect it to the internet. Easy. You have several HDMI ports, USB, Composite, and even DisplayPort on some models. A good number of people with Sky or Cable boxes don't even use the TV's built in tuners.
>because you agreed to the Google licence agreement when you bought the telly - yes really - I'm not making this up.
It seems that you are making this up. FFS, it's a Sony TV, and Sony will always have it work with other video sources (that they would like you to buy from them). Unlike their smartphones, Sony don't *need* to run Android on their TVs - their previous TV UIs were fine for select input / change volume / adjust picture - so Google don't have the same leverage to make Sony hobble their TVs even if they wanted to.
Yeah, that's all true, but you don't *have* to use the TV's built-in streaming hardware. You can get the same functionality from a discrete box or dongle in the event of the TV's OS becoming out of date or unsupported.
HD streaming dongles start at around £15, the 4K versions will have dropped in price by the time there is enough 4K content around to be worth bothering with.
>Why the joke icon????
Because unlike a phone, a TV doesn't need network access to perform its primary function - displaying video. All of the 'smart' or 'connected' functionality can be provided by an inexpensive (compared to the TV) box or dongle. Therefore, a TV with an out-of-date OS is still fit for purpose - hence the joke icon.
TV updates that affect the playback of video (adding new HDR formats, for example) can be done by downloading the update on a computer and transferring it to the TV with a USB stick.
Modern TVs come with a 5 year guarantee as standard, in keeping with our statutory right to have it last a 'reasonable' time.
All the Sony TVs have Android built in, and they make some of the best LED TVs, along with Samsung who use their own Tizen OS.
LG, who are the only ones making OLED TVs, use WebOS.
LED sets are brighter, so perhaps more suitable for watching in well lit situations, OLED sets have perfectly black blacks, making them better for watching movies with the lights down.
You won't be able to buy a 'dumb screen' at TV sizes, but nobody is forcing you to plug an ethernet cable into it. By the time 4K content is more widely available, external HDMI 2.0 boxes should be cheaper.
Thank you Monty75, that was my reading of it too. Everything online says "This is a developer preview, and as such is currently case-sensitive only". Compare and contrast with the Reg article:
"The file system is also case-sensitive and that apparently cannot be disabled, which will lead to all sorts of knock-on compatibility issues. Yep, you will have to buy more Apple gear: a new watch to go with your new phone to sync with your new laptop. Apple is always looking after that bottom line."
Talk about adding 2 to 2 and getting 5. The lack of [fact checking] around here is getting beyond a bit daft.
>but its usage is spotty at best - the apple keyboard will pop up for certain passwords,
That sounds like a feature not a bug, if you can't trust the vendor of a 3rd party keyboard.
>no microphone on the new keyboard, I'd forgotten how constraining Apple products are.
Allowing 3rd party developers to use the Siri APIs has only just been announced at this WWDC, so it is possible that 3rd party iOS keyboards will allow voice input soon. Maybe.
>Apple's attitude to third parties – where it dictates terms and expects people to follow them – has not worked out so well in other markets
It seems to have worked out well in the audio peripheral market - the 3rd-party 'Made for iPod/iPhone' headphones and speaker-dock market.
I'm an Android user, and in most stores the selection of iPhone headsets is much wider. They only ever half work with Android phones (because a, Apple is awkward and b, even within individual Android vendors, the implementation of the 3.5mm audio input/output socket varies).
You want change for change's sake?
One of the nice things about OS X (originally called Mac OS X) is that it doesn't force change upon users, like Windows has done over the same period. Ideas from touch-based OSs - such as multi-touch gestures on Apple trackpads - have been added to OS X, but they never stopped the user from doing things the way they already had been. Heck, unlike Office in Windows, Apple still let users use menus, if that is what they want to.
Oh, there was some Mac news that wasn't in their keynote: a new file system called APFS, still in Beta.
(For the record, I mainly use Windows - familiarity breeds contempt, I guess. I have administered a few Macs though, and found them to be pretty civilised. It could be that I haven't used OSX enough to discover any massive annoyances)
All this coverage of this acquisition on The Register, yet I haven't seen anyone make the point that potentially LinkedIn's biggest competitor is the long-established recruitment agency industry. They were making money from their own silos of user/client-provided data long before Facebook et al were on the scene - agencies that would take a percentage of someone's earnings. What value would they add? Why, no more than consult their databases and liaise with employers and employees.
LinkedIn has the potential to disrupt that - if nothing else, it could automate the process of checking references, from the point of view of recruiters.
This isn't my point of view, but one that was given to me in a pub by the head of recruitment for a large company a few years back.
Yeah, I take your point. However, in the rapidly evolving world of IT, the public need to think about things *before* they happen. Analysts play a role in this - even if they have as much insight and foresight as a Wired.com hack, because here we all are, offering argument and counter-argument.
The analysts don't have 20:20 crystal balls, but they usually do offer their reasoning. We obviously can't go by the predictions offered to us by any players in the game, such as Google, Samsung, MS, Apple etc because they want to bend our perception to their ends. The big players do, however, have their own analysts, and I suspect that some of them are very good at what they do, and have more expensively-gained information to study.
tl;dr: 1, Laypeople speculating is healthy
2, We won't read here what the best analysts think
If I was serious about security (i.e. I had need to have clients' data on my phone, which would result in fines for me were my phone to be accessed by a third party) then the saying "If you want to get there, I wouldn't start from here" would seem to apply.
Here's the thing: I can't easily find any information about just how vulnerable - or otherwise - Android (various versions) is to attack, both proof-of-concept and seen-in-the-wild. Perhaps the Reg could put together a sketch of the current Android security landscape?
I don't even know if it is safer to have an older version of Android, but with no extra apps installed, or to have a new version but with dozens of apps from the Google play store.
If I was a doctor or a lawyer, I'd just buy an iPhone and be done with it. If I was a terrorist, a whistle-blower, or a very high level executive or engineer, I'd spend some time and discipline studying operational security before making a decision.
>Google, with its perpetual attention deficit disorder, never sat down and thought properly about an update mechanism for Android.
They didn't have a choice - the way most ARM-based systems were designed doesn't allow for one-ROM-fits-all (Linux distro-style) updating. Google bought Android in, as they were desperate to catch up with the iPhone. That was at the beginning.
In Act 2, silicon was advancing so much that two-year-old phones weren't really worth updating. It's only been the last couple of years that older phones have been good enough to keep using (though of course a new budget, but pretty good, Android phone won't break the bank).
>We need something like linux for phones. Something that users can install on a wide range of hardware and still have something functional.
>>Isn't that what AOSP ROMs like Cyanogenmod are?
Alas, no - those ROMs still need to be compiled beforehand to work on a specific handset.
The issue isn't always with the phone manufacturer, but with the various chipset manufacturers... they don't always get a new Android binary blob over to the phone vendors. Where is their motivation to do so?
Saying that phone vendors don't do updates because they love built-in obsolescence is an art-school level of analysis. You might be right on occasion, but your reasoning is suspect.