Well you're obviously a fan of fake news, because that's clearly where you got some of the 'facts' you've cited.
As one example, the BBC is NOT state owned or run. I'd start by researching this for yourself
GPS (Navstar)'s current official stated worst-case accuracy for the civilian signal is 15m, NOT ~1.5m. Since both values are worst case, it's entirely possible for GPS to achieve 1.5m accuracy some of the time, although the highest-quality GPS receivers, as required by the FAA, have an average accuracy of 2.1m. However, Galileo will achieve single-digit centimetre-level accuracy a large proportion of the time too, so the two aren't really comparable.
Galileo is a 'read only' system PERIOD
Err, NO. Galileo offers two-way communication for a subscription fee. Its intended application is emergency beacons in ships and aircraft.
If you're going to accuse people of being high, it would be a good idea to make sure you are in full possession of the facts first.
Well, to give the benefit of the doubt, I would assume he is talking about the length of sheathing you need to remove not the thickness.
They suggest that the 'top' sites don't use these features because they aren't required, but the truth is that they don't use them because they are still trying to support ancient browsers which predate these specifications. So, as a result, they propose dropping some of the most useful developments that came about to make browsing faster and more secure and to reduce battery drain. Developments which are meant to rid us of the likes of ActiveX, Flash and their ilk. Developments which are meant to reduce page bloat.
SVG images are much smaller and much higher quality, and you only need to serve a single version, not a range of different sizes.
Canvas provides an alternative to Flash (which is way more insecure).
Websockets allow for 'push' to the browser, eliminating the need for polling - reducing bandwidth consumption on mobile devices.
Granted some of the listed features are probably unnecessary but certainly not all of them.
I said pretty much the same thing for years, then I picked up an original Moto G - not quite $50 but still cheap. The fixed battery made it a difficult decision, but I figured at the price it might not be so bad. Still using it to this day, two and a half years later; I easily get a full day's use, and often enough I eke out a couple of days. The point being that battery technology has improved, and over the lifetime of the phone, which for the majority of people will be no more than 3 years, the degradation in battery life won't be an issue.
I would still opt for a version of the S7 with a removable battery if such a thing existed, but I've changed my opinion on the battery issue to the point where I can live with a non-removable one. These days you'll find it nearly impossible to find tablets with a removable battery, all Kindles have a fixed battery, and even my laptops have fixed batteries, yet I don't hear a storm of complaints from people about these devices because the batteries last just fine. As with all things, there will be batteries that fail early - warranty replacement usually takes care of this - and if you are replacing batteries regularly with a modern device then it's a sign that the battery design is flawed, not batteries in general.
Perhaps the Register would do me a personal favour and get an official answer from Samsung UK about why they haven't brought the Silver (or White) versions of the Edge to the UK. Despite all the UK press parroting the availability of a Silver version in every single review and article, it's simply nowhere to be found. Are they having trouble manufacturing the Silver version, or are they simply holding it back for their preferred customers elsewhere in the world?
I've only managed to get a 'no comment' from Samsung, perhaps if the question came from a journalist they might be more inclined to answer.
"This does more sense as a loss leader"
Well this being Microsoft, I expect that once they've hooked enough marks ... err, I mean customers ... they'll announce that they will no longer be supporting the linux version and existing users will have to move to the full MS stack.
Even if it doesn't happen, who would be stupid enough to trust Microsoft on this?
"I have no issue with most of this article, but this paragraph ignores that SQL Server has been available on AWS RDS for years so is somewhat misleading."
Yes, but it's priced accordingly high. The company I work for has an MSSQL instance in AWS - a necessity for one of our data partners - but it's costing a small fortune compared to an identical MySQL instance. If MSSQL on Linux reduces the cost to the end user of Amazon RDS then its popularity would definitely increase.
There is an architecture change - from 32-bit to 64-bit, and 900MHz to 1.2GHz - https://i.imgur.com/KRRd7OQ.jpg
The Galileo roll out can't happen fast enough.
The latter. SUCCESS is a constant and shouldn't be changed, hence the CAPS (all caps for constants is a widely followed convention).
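For anyone wondering about the convention, a minimal Python illustration (the function and the values here are made up purely for the example):

```python
# ALL_CAPS signals a module-level constant: assign once, never rebind.
SUCCESS = 0
FAILURE = 1

def run_task(ok: bool = True) -> int:
    """Return a status code rather than ever reassigning the constants."""
    return SUCCESS if ok else FAILURE
```

Lower-case names are fair game for reassignment; the capitals are the author's promise that the value is fixed.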
Seems like a completely stupid idea if you ask me, those propellers could do serious injury to the birds. While eagles offer some serious advantages in terms of speed, manoeuvrability and targeting ability (eyesight) the risk posed by the drone would have to be significant to even consider deploying the eagles.
You can't shun what you don't know about ...
"During WW1, the government took over all the railways and worked them in to the ground and failed to maintain them."
"by the time WW2 came around, when the government did exactly the same thing - taking them over, running them in to the ground and failing to maintain them properly."
You may, or may not have a point about government ownership, but you rather shoot yourself in the foot by citing these two examples. There was a reason why a lot of things, not just the railways, fell into disrepair during these periods and it had absolutely nothing to do with government ownership - the guys responsible for doing the work of maintaining things were all busy fighting.
So I was genuinely curious whether HTL's claims stack up, and though information was thin on the ground, I found the following document, which details the ISO-standard testing of Verbatim-brand BD-R HTL discs.
The most interesting part is the table at the bottom of page 3. Tests were performed at 25°C and 50% humidity.
95% of Double Layer discs tested under the ISO conditions were rated to last a minimum of 336 years or 550 years depending on the test method used. Triple Layer discs were rated at a minimum of 2672 or 3588 years.
The ISO tests have been performed on M-Disc, here is the certificate:
It shows 95% of M-Discs will last at least 530 years at 22°C and 50% humidity. Which may actually be lower than the Verbatim BD-Rs ... depending on which test method they employed. Notice that the M-Discs were tested at a lower temperature.
Now the only way to be sure would be to see the same ISO test results for M-Disc BD-Rs SL/DL/TL. However, it doesn't seem that the lifespan of good-quality BD-Rs is overstated; it seems the opposite is in fact true: they are understating the lifespan while M-Disc is overstating it.
I can't find anything contradicting the claims for HTL; maybe you can do us all a favour and point us in the direction of research which shows differently? There is obviously some variation in the relative quality of different manufacturers' discs - i.e. buy branded.
The basic premise of any HTL disc is the same as M-Disc's: an inorganic data layer and no separate reflective layer. A lot of commentators put HTL BD-R in the same bucket as M-Disc. After all, what makes the claims for HTL less believable than the claims for M-Disc?
A much cheaper alternative is BD-R HTL - 'only' lasts 100-150 years, but it's a fraction of the cost - £0.33 per 25GB as opposed to £4.40 per 25GB for the m-disc.
In my opinion, BD-R HTL makes more sense for backing up things that no-one will care about in 150 years. Yes, your descendants may be interested in your family photographs in the future, but will they be willing to spend thousands of pounds on one of the few remaining drives capable of reading the format? In no more than two generations whichever format you chose will be obsolete, and your kids or grandkids will have copied the files to the latest archival formats and media - or, just as likely, binned the discs.
1000 years is overkill, especially at 13x the price. For governments, large museums, libraries and archives M-Disc makes sense, but for the average home user, not so much.
"If Deckard was an android, why was he wandering around unchecked while Holden was busy trying to round up the Nexus 6s (before getting blasted?)"
The 'authorities' didn't know about Rachel either; the implication was that the Tyrell Corporation was producing unregistered replicants which were freely roaming around on Earth. Even if we do assume that the authorities knew, what reason would they have to object to sending one disposable machine to kill others, if it avoided risking the life of a human officer?
Don't forget Nexus 6 models were psychopaths, not serial killers. The majority weren't violent*; they just had the capacity for it, and the greatest danger actually came from them knowing that they weren't human and were destined to die 'young'. Given those facts, having replicants who didn't know they were replicants on Earth might have been considered an acceptable risk, especially since so many successful humans are also psychopaths.
All of this is predicated on the belief that Deckard is a replicant. Yes, Scott believes he was, but if he wanted everyone to know this the ending could have been more definitive. The reason the film (director's cut) ends the way it does is because we're supposed to be unsure. The whole film is about the nature of humanity - we're all supposed to have doubts and ultimately wonder whether what we are matters at all, as opposed to who we are. This was the core theme of the book too, although it ended differently, and it's a recurring theme in a lot of Dick's work. He preferred to make his readers think rather than spoon-feed them resolutions; to me, at least, he was much more of a philosopher than a writer.
* If all Nexus 6 models were violent then they wouldn't serve the purposes they were created for - who wants a sex worker that will kill every John or a maintenance tech who sabotages everything?
@Tridac: I think if you re-read what you wrote, reversing the paragraphs you'll have both your answer and eliminate your doubts.
In case you still have any doubts, just Google it: these meters do allow the supply to be turned off remotely; that's the entire point of them.
"We're working on it right now (in true BBC glacial fashion)"
"To use the player, visit BBC iPlayer or iPlayer Radio"
So not BBC News then? That's the only BBC News site I visit daily ...
(That doesn't mean I don't watch BBC programmes, I do, just through the higher quality DVB broadcasts stream on a large 'monitor' which I call a television™ attached to a magical box which records everything I could possibly want.)
> Except the existence of a padlock icon would be a dead giveaway....
Have an upvote and a beer on me.
Yes, this is different from StartCom - the biggest difference is that they won't charge you to revoke or re-issue a certificate. They also automate the process, including renewals.
Don't underestimate the automation either, this includes auto-configuring the server to use a strong setup, no weak default ciphers or protocols etc. The configurations will adapt as new ciphers emerge and old ones are deprecated. Most sites operating with encryption now are still using default configurations which render them insecure since many admins assume that simply having the certificate is enough.
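To give a feel for what a 'strong setup' means in practice, here's a rough sketch using Python's standard `ssl` module - the cipher string is my own illustrative choice and the certificate paths are placeholders, not anything Let's Encrypt actually generates:

```python
import ssl

# Server-side TLS context that refuses legacy protocols and weak ciphers --
# the sort of hardening that automated tooling applies for you.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.minimum_version = ssl.TLSVersion.TLSv1_2   # no SSLv3 / TLS 1.0 / 1.1
context.set_ciphers("ECDHE+AESGCM")                # forward secrecy, AEAD only
# context.load_cert_chain("fullchain.pem", "privkey.pem")  # placeholder paths
```

The point is that simply loading a certificate into a default context leaves the protocol and cipher choices to whatever the library shipped with, which is exactly the trap described above.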
Requests are encrypted - the only information an eavesdropper can obtain is the IP address of the server - not the pages you visit etc.
They only hand out certificates to people who can show possession of the domain associated with the certificate. The issues experienced in the past were with CAs who handed out certificates without checking that you actually controlled the domain in question. I fail to see why all the negativity ...
I'm afraid you don't seem to know how the CA system works. I can get a certificate for theregisster.co.uk from every single one of the major CAs if I possess that domain.
If that causes you concern then you should remove _ALL_ CA root certs from your browser.
In short: yes, if those domains are configured on the server; it doesn't create a wildcard certificate.
" We have a FREE system that is only accurate to a metre or so."
As opposed to GPS, which is only accurate to 8 metres. I'd say an 8-fold increase in accuracy (much greater for educational, research and commercial licensees) is worth the expense, especially when you consider that GPS coverage isn't perfect in northern Europe - hell, I'm in the US right now and even here it's pretty lousy at times.
Your position seems to be that we should put all our eggs in one basket and trust to the goodness of the American government. Which is just incredibly naive and foolish. Stop trying to tie up what is a worthwhile project with the political mess that is the EU, even while I'll campaign for exit from the EU I'll still be flying the banner for Galileo - the project may well be badly managed and servicing the egos of bureaucrats but the end goal is still worthwhile.
One crucial difference, aside from the one about LE being trusted by all browsers, is that LE are shipping a one-click solution that can be integrated by shared hosting provider or run as a standalone application on the server. It generates the keys, requests and downloads the certificate and configures the server for you - meaning you don't need to know how to do these things yourself. That has always been a barrier for many smaller sites, the need to go through a multi-step process including manual configuration changes and LE addresses that.
> No, a free CA will just make easier to obtain a fake certificate for somebody's else site if no vetting procedure is in place before releasing a certificate. It actually make snooping *easier*.
Why don't you go read up on the project before you say anything more? Let's Encrypt's validation/verification procedures are far stricter and more robust than those of every large CA I've dealt with. Additionally, their process is automated, removing the possibility of user error - the system simply won't issue a certificate for a domain for which you cannot prove possession (* though not ownership; this is no different to any other CA).
You're confusing the purpose of Let's Encrypt. The project has nothing to do with increasing trust in certificate authorities.
Which doesn't mean you cannot trust Let's Encrypt, only that if you're looking for a solution to the trust problem then you need to re-invent the whole certificate system. LE is about giving everyone access to certificates free of charge, with no strings, no 'revocation fees' and no limits to the number of certificates you can deploy. It's about removing the barriers to deployment even on the lowliest website and thus bringing about the long overdue age of complete encryption to the internet.
Aside from issuing certificates, the project also comes with a suite of tools which will properly configure your server to use the best possible TLS configuration, which alone makes it extremely valuable. Many servers still offer outdated or incomplete configs which are no longer secure, LE is offering a one-click solution that handles the whole process for you.
There are plenty of other projects and solutions attempting to solve the 'trust' issue, including Public Key Pinning - although you still have to trust the browser and intermediate proxies, and let's face it, if you cannot trust those then no amount of encryption is meaningful.
In the world of security, anything less than an A is never a pass. Incidentally, SSL Labs hand out an A+ for the top grade, which is what everyone should be targeting.
I'm assuming they'll use H.265 for broadcast and streaming, H.264 would be an odd choice. Still, H.265 can't work miracles, if broadcast quality HD using H.264 is 9Mbps (30+ for Blu-ray) then they can't cram 4x the information into an H.265 stream of 13Mbps without seriously degrading the quality. Even Netflix opted for 15Mbps, which is still way too low. For broadcast quality parity you are looking at at least 18Mbps but reasonably we should demand better from 4K broadcasts and not let broadcasters squeeze the image to the point where 4K broadcast/streaming looks like Blu-ray HD.
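As a back-of-envelope check (my assumption here is the common rule of thumb that H.265/HEVC needs roughly half the bitrate of H.264 for equivalent quality):

```python
# Rough bitrate estimate for 4K broadcast, scaled up from 1080p H.264 figures.
hd_h264_mbps = 9        # broadcast-quality 1080p in H.264 (figure from above)
pixel_factor = 4        # 2160p has 4x the pixels of 1080p
hevc_saving = 0.5       # assumed: H.265 needs ~half the bitrate of H.264

uhd_h264_mbps = hd_h264_mbps * pixel_factor   # 36 Mbps if we stayed on H.264
uhd_h265_mbps = uhd_h264_mbps * hevc_saving   # 18 Mbps in H.265

print(uhd_h265_mbps)  # 18.0
```

Which is where the 18Mbps floor comes from: anything below it means accepting less than broadcast-HD quality per pixel.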
Try double that ...
"how the basic (low-precision) Galileo free and open service compares to the current free and open GPS tier?"
The uncorrected free tier guarantees accuracy within 1 metre, compared to GPS, which provides uncorrected accuracy within 15 metres. These are the worst-case scenarios with a good lock. Obviously GPS performs better than that in the real world - a consumer-grade receiver will usually average around 4m at best - but similarly you can expect the average performance of Galileo to be far better too. If the same ratio holds, you could expect an average best of around 30cm from the free service.
The Galileo commercial tier guarantees accuracy to within 1cm. This is marginally better than survey grade GPS augmented with RTK which is within 30cm but normally averages within 'a few centimetres'.
These figures are changing all the time - GPS accuracy is constantly being improved, especially with the use of correction services - and until Galileo is fully operational the claims made for the system are unproven. However, the accuracy of Galileo's free service was enough to scare the US into formally protesting the project and threatening the EU, though probably not for the reasons you think: the US government makes a lot of money from commercial GPS, and Galileo represents a threat to that business.
The EU want a global system that can be used by EU businesses and citizens wherever they are operating in the world. Galileo offers total coverage; Loran doesn't even come close. They also want a system which is more accurate than GPS, not less: eLoran offers accuracy to about 8 metres, Galileo to within an inch.
It's not a unit per se; you pay for capacity, not usage. So 1Mbps of capacity for a whole month is $1.
If you want to work back from that to get the price per MB, the value is $1 divided by the total number of megabytes that _could_ be delivered, which is 1 (Mbps) divided by 8 (bits per byte, to get the number of megabytes you can transfer in one second) multiplied by 2,592,000 (seconds, assuming a 30-day month).
1 / ( (1 / 8) * 2,592,000 ) = 1 / 324,000 = $0.000003086, or $3.086x10^-6, per megabyte
which could be expressed as roughly $3.24 per TB (taking 1TB as 2^20 MB)
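The arithmetic works out as follows, as a quick sanity check (I'm assuming the per-TB figure was derived using a binary TB of 2^20 MB, since a decimal TB gives ~$3.09):

```python
# $1 buys 1 Mbps of capacity for a 30-day month; work back to a per-MB price.
seconds_per_month = 30 * 24 * 3600            # 2,592,000 seconds
mb_per_month = (1 / 8) * seconds_per_month    # 1 Mbps = 1/8 MB/s -> 324,000 MB
price_per_mb = 1 / mb_per_month               # dollars per megabyte

print(f"{price_per_mb:.3e} $/MB")          # ~3.086e-06
print(f"{price_per_mb * 2**20:.2f} $/TB")  # ~3.24 (binary TB)
```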
The EU had (now abolished) Three Pillars, and it has the concept of the Four Freedoms, but I don't think I've ever heard of the Four Pillars, although four would have made more sense, three always seemed to imply the EU was like an unstable bar stool.
As studies have shown, drivers in London are exposed to considerably more fumes and pollution than cyclists primarily because pollutants that enter the vehicle build up in the enclosed space. Other factors include air intakes on vehicles being at the level of the exhaust pipe of the vehicle in front of them and cyclists being exposed to moving air and being higher off the ground.
The ISS is a sealed box; air has been recycled up there for years now. Traces of everything that leaves the human body remain on every surface in the station despite regular cleaning. In some places the actual fabric of the station is being damaged by the high bacteria levels. It's a floating septic tank, and worse, bacteria and fungi behave differently in space, which may lead to more potent mutations. If you wash salad vegetables before consumption on Earth - and you should - then you'd be really stupid not to be doing it on board the ISS.
No it hasn't. Google are seeking to make QUIC the successor to HTTP/2 and are pushing for standardisation. Additionally, they are converting all Android and Chrome apps that use Google services over to QUIC instead of HTTP (1 or 2).
I'm fairly certain I said "men in the middle ... injecting ads" and nothing at all about TLS eliminating all adverts.
I was referring only to ISPs and the like inserting their own adverts into pages, sometimes replacing site operators' adverts (Redmoon, CMA, etc).
"It's equivalent to a clothing retailer sewing GPS trackers into your clothes that anyone could use, without telling you, and expecting the average Joe to know that they need to ask for it to be disabled."
Umm, they are already sewing RFID transponders into clothes, not quite as bad as GPS but it allows people and stores to know what underwear you have on when you pass within range of their readers i.e. when entering and exiting the building.
If all web traffic was encrypted using TLS then ISPs and other men-in-the-middle wouldn't be able to inject headers or adverts.
Where have you been for the last ten years? They haven't suddenly become creepy overnight, their entire business model involves spying on you - there is barely a website in the world now that doesn't use Google hosted content which only exists so that they can track you everywhere you go.
Their official purpose is to allow the government to turn off your power remotely, per household, for short periods in order to prevent demand exceeding supply. The rest of the stated reasons for their existence are just the 'sugar' which is meant to help the medicine go down.
Welcome to the future of unpredictable renewable energy supplies. Free stock tip - invest in candle manufacturing firms.
TLS 1.2 support has been available for a few years now.
Responsible certificate authorities offer free upgrades to SHA-2 signed certificates.
While it's true that a top TLS grading says nothing about the overall security of the server - whether that's HTTP server vulnerabilities, web application flaws or just poorly implemented authentication - it's still a valid indication of the strength of the TLS configuration. Sites should be targeting an A grade irrespective of whether they are otherwise locked down tight. Let's not forget that this is a rapidly changing landscape: SSL 3 (1996), RC4 (1987) and CBC ciphers (1976) were all considered secure a year ago. Admins should endeavour to stay current with their TLS configurations so that they won't get caught out as ancient protocols, key exchange methods and ciphers are broken.
TLS is not just about security of login credentials but security and privacy of data, which is at least as important as protecting against intrusion/exploitation of the server. One without the other is like securely locking the doors on a house made of glass, no-one can get in but they can see everything that you do.
You're breaking the standard if you still offer RC4 in your client, or use RC4 on your server.
All sites should achieve at least an A grade with https://www.ssllabs.com, an A+ grade is the goal. If you get less than an A you're doing something wrong.
Err, what you're describing is not a smart meter, it's a thermostat, entirely different piece of kit.
I know rolling blackouts are EU and government policy, that much is undisputed and has even been reported here on the Register, although the wider public seems to be completely oblivious.
However, I'm avoiding accusing our government of lying to us about the true purpose of smart meters, even though all the evidence points to that conclusion, since I don't wish to invite charges of libel from government lawyers.