I have a tendency to type a word that my brain has lined up for later in the sentence if it sounds similar to the one I'm actually typing.
It is unfortunate that we have a one-sided bias on the articles about this on here, but have you ever heard a lawyer in court?
"My client did not kill Mr Smith as he was in another country. But *IF* he had, there could have been no motive. There was also evidence that he bore no malice to Mr Smith. There were also no weapons found. Even if Mr Smith had the knife that the learned gentleman says was present, there was no way the injuries were consistent, and Mr Smith was also left-handed, and had provably damaged his hand bowling the night before. And even if my client *had* killed Mr Smith, by Mr Smith's own admission he was holding a gun, so it would have been self-defence."
If you want to defeat someone's argument, the true scientist (not claiming Mr Page is one!) will approach things from all angles and cover every point, several times over. "I don't believe climate change is happening, but let's ASSUME it is - look, the ice sheets still don't just disappear as predicted" is perfectly valid and non-hypocritical.
That's what actually makes me read the Page articles - I'm sure what started me off was one on wind-farms where a bunch of people broke down the costs of them and went to EXTREMES like assuming wind-turbines were 100% efficient, and we could plant millions of them for free, and we could blanket the whole of the UK, and we had no cable losses, and we have seven centuries of gale-force winds, and... and... and... still it wasn't practical to deploy them. That's how you do it.
Resting your argument SOLELY on "I don't believe climate change happens" is silly. What you do is hold an opinion on each piece of the puzzle - I don't believe climate change is happening, I don't believe it's man-made, if it is I don't believe we can do much to prevent it that isn't WORSE for humanity, if it happens it won't be as bad as predicted, if the temperature rises we won't see sea-level rises, if the sea-level rises we won't necessarily turn into Waterworld, even if we do the consequences may not be as dire as we think, etc. etc. etc.
Only a fool bases their argument against something on a single belief when there's no unanimously accepted evidence either way, because if they are wrong their whole argument collapses. Whereas if the person above was wrong in, say, JUST that temperature rises won't cause sea-level rises (for instance), the rest of his argument (and beliefs) is still perfectly valid and STILL needs to be countered before you can say he's an idiot.
A personal hate of mine since at least Windows XP, and much more to the fore in the newer versions of Windows.
I actually went out of my way to disable search on Windows 7. I killed the service, replaced the start menu and DID NOT BOTHER to put search options back on. I find it very annoying to be expected to do the equivalent of "Googling" my apps in order to find them, even on a menu designed to do nothing but run apps I have installed. Windows XP's search treated documents the same way, and its "index your disks by default" was also switched off on the first day (and people comment on just how much faster my machines are than their equivalents with their setup).
I actually did need to find a file after doing so. Once so far in, what, three months? And only because I'd put my old hard drive into the new machine and wanted a very ancient file that I knew was on there somewhere (but manually searching 500GB for a tiny setup file was not productive). So I installed Agent Ransack (at the recommendation of the start menu replacement I was using) and it was fabulous. What Windows search should have been: without the all-the-time indexing and the stupid dog, and with so many more options.
Within 1 minute, I had my file. And I haven't needed to use it since, and it had ZERO indexing to help it.
Strangely, on Linux, I actually LIKE slocate - which indexes the whole drive on a schedule - maybe because it does so unobtrusively and (properly) in the background and my Linux machines tend to be on 24/7 so an indexing operation at 4:30 in the morning isn't noticed. But on a desktop or laptop, I don't WANT things indexing, collecting usage patterns, trying to second-guess me, and making me "search" for things. I just don't have a disorganised data storage that makes it necessary. Hell, for years I found having empty "My Music", etc. folders horribly messy and inconvenient and things insist on constantly recreating the damn things.
So my desktop has no search options except Agent Ransack (which is actually linked to Classic Start Menu's, Start... Search... For Files and Folders option), my computer doesn't launch into a disk index because it hasn't done it in a week because I switch it off overnight, my start menu doesn't REQUIRE searches and suggestions because it's well organised, and my desktop has five icons on it. And you know what? I find absolutely everything quicker and easier than everyone else I know, going back further than anyone else has data stored, and my memory is absolutely, 100% ATROCIOUS so it's not like I just rely on my brain to do the work (the exact opposite - I use my computer to be a tool).
And when there is something that I literally haven't needed to use in nearly a decade, which I stored away because I knew it wouldn't be around for ever but would be incredibly useful should the need arise, and which I needed to dig from 500GB of data, a proper, real, necessary search, for vague keywords over the whole of my data takes minutes for the absolute worst-case scenario.
I don't want to "Google" my files. I know where they are because I foldered them and named them properly and I have the ability to sort by type, name, folder, date modified etc. at the click of a button. I don't need to "Google" my programs, because they are all categorised, organised, and named so that I know what they do (WinDirStat? Serviio? Even Agent Ransack? Please, that's not a helpful product name and it would take forever to remember what the damn thing is called, go through the list of software that *almost* match that name and pick the real one).
Can other people work with my system? Hundreds of kids and staff do every day, because I built the computer images and I organised the desktop, and instead of the 400+ icon MESS that was inconsistent across desktops, all the machines now have five or six categories (the sixth is hidden unless you're a teacher!) of software, which have subcategories and relevant icons and identical Start Menu entries (for those who like the keyboard - I just redirect Desktop and Start Menu to the same, read-only, set of icons). From ages 5 to 65, everyone uses it and no-one complains they can't find things. And when a new piece of software goes in, they guess where to go for it 99.9% of the time.
Stop turning my computer into a website. This applies to everything from server-apps (Really? I need to run an IIS instance and load up IE so you can show me the state of the RAID array?) to search bars to fancy desktop effects (Metro etc.). It didn't work for Active Desktop (which I switched off, again, within the first day and never enabled again).
My computer is a personal tool. Start messing with how it works so that idiots can use it, and all the professionals will run a mile.
So not just an unnecessarily accurate metric conversion, but also the wrong unit for the dimension involved anyway. Nice.
P.S. Ignoring the mixture of units, "an eight by four sheet of 20mm MDF" and "three metres of four-by-two", would translate more to an 8km jaunt for a 2.5m by 1.2m slab of 20mm MDF, and three metres of 100x50mm, which are all perfectly manageable units, within the accuracy of wood-cutting (which is generally atrocious anyway), and actually used just as much, if not a lot more than "2 by 4" by anyone in the trade nowadays.
P.P.S. if you really want to get finicky, you'd have to ask for 13/16ths MDF, which is just as silly as stating metric conversions to hundredths of millimetres for a 2.5m bit of wood.
Nothing to do with the fact that a lot of the "dodgy" torrent trackers deliberately insert fabricated IPs into their torrent swarm lists?
If your primary evidence has come from the people who provide the alleged illegal facilities to those who you are trying to sue, you are doomed to fail.
And that's not even touching on the fact that it's NOT accepted in a court of law that an IP alone is good enough to correlate to a particular person.
Although all the fancy laws they make up seem to say you're doomed the second you appear on their lists, in actual fact higher laws, with greater precedent and a higher standard of proof required, mean that these cases all get thrown out (e.g. ACS:Law, etc.) unless you were guilty and admit it. That's not to say you should not admit it if you are, but 16% of those reports were entirely fake. So do you shout at your kids because they *might* have done it, or do you fight the case because the evidence *might* be wrong?
Can't say that I like the Bond franchise "reboot" at all, to be honest, but Q was Q for a reason. He may not have invented those gadgets, you see, he probably didn't understand the new-fangled gadgets very much. But he was the Quartermaster, in charge of the department, herding the young geeks (like the new "Q") to get the results that could be tested to his satisfaction and only rolled out to the field agents when he was happy with them.
He probably could do all their jobs, at a push, but he didn't. He was the old guy in charge of the young guys, tweezering the talent out of them while curbing their enthusiasm before they went too mad or used unreliable technologies. He was the equivalent of the ageing cardigan-wearing scholar in charge of the supercomputer datacenter - maybe he couldn't tweet properly on his first attempt, but he was around while computers still ran off punched tape, could write better code than anything the young'uns could churn out, and was there as an experienced, calm, sensible influence to herd the department away from schoolboy fantasies and towards a product that would save someone's life every time.
Bond knew that. He knew how Q worked. He knew that a Q gadget would never let him down. He knew that Q didn't want him touching stuff, especially stuff that hadn't been approved yet, because he was interfering in the old-guy's projects. And, numerous times, you could see Bond silently thank Q's attention to detail and testing and reliability.
Q *WAS* a Desmond Llewelyn (the character, not the actor - the actor was notoriously inept with technology, as was the actor who played his successor "R"). He was an old guy who spent the afternoon sitting in a back room, maybe smoking a pipe, waiting for that young, keen guy he'd sent back to the drawing board to see that he had been right all along and come back with what Q knew would have to be the correction / workaround / revamp, because Q already knew, just from instinct, that it wouldn't work any other way. He didn't stamp on keenness and new ideas, but he was there as a mentor: he merely guided those enthusiasms, channelled them through a filter of years of experience, and got something even better from them.
You can't mess with Q. A young-geek Q isn't what would exist. Maybe a Q-in-training, or a Q-apprentice, yet to see that the little old man who makes him re-do everything is really teaching him how to be a better Q and who, in years to come, will joke with him in that backroom and see Q's human side come through and joke with him about how they were like the new recruits once.
The Bond reboot has missed the point of quite a lot of the characters and (admittedly) their enlarged selves that have been portrayed in the tackier of Bond movies. You can't mess with Q, though. That's the last straw.
P.S. NEVER have an actor and character had a more profound exit from their starring role than Desmond Llewelyn and Q. Never. Perfectly timed. Perfectly matched. Perfectly humble. Perfect.
So a number plate, for example, would be personal data.
And my explicit (not implicit) consent would be required by each organisation who wants to store that personal data.
Which would kinda blow out of the water those car parks that want to record ANPR lookups, the congestion charging zone, etc.
Not that I think that's a bad thing. But it may be an unintended consequence.
Wonder how much custom hidemyass loses as a result?
Mobile minutes aren't expensive if you're on a contract (I have thousands of minutes left at the end of each month).
I don't have to change my number for it to work.
Not all handsets will let you do VoIP dialling direct as if it were a normal contact without having to switch things on.
It's not so much sending as receiving on your existing mobile without having to do any fancy "switching". I just want the phone I have to have signal where I need, when I need, without having to contend with VoIP settings, extra software, etc. and just having people ring me on the number that they would if I were out and about without having to pay a third-party to forward it for me.
Wireless on = an hour of service (granted, you might be nearer a plug, but still hassle).
GSM on = as long as I like.
And if Virgin offered one, I'd have one tomorrow.
The problem with most large companies is their killer products never get pushed to market properly, and their junk gets foisted on everyone no matter what.
Seriously, T-Mobile, Virgin, whoever runs my damn phone on the Virgin network: gimme a femtocell and I'll pay you £50, plus the privilege of free use of my broadband to do what your back-end radio network should be doing anyway, just for those 100 sq.ft that I call "the ground floor of my house", where I can't get a signal on ANY network whatsoever.
I wouldn't even care about the occasional outage, to be honest, that's just part and parcel of living in a modern world reliant on technology to be 24/7. 99.9% would be more than good enough for me.
The school I work for have bought a suite full of touch-ready machines in preparation for our now-inevitable Windows XP -> Windows 7/8 upgrade.
I am honestly dreading the amount of hassle the cleaning staff will give me at the end of the day, scrubbing off all those children's fingerprints, not to mention the amount we'll spend again with the company we use to soothe the bogey-fearers, who come in once a term and clean the PCs with antibacterial junk (which I personally think is daft, but I'm not paying for it and they don't break my computers).
Not to mention how it will translate to lessons where you can touch on some machines and not others in the school and so on.
Touch is good. Has been for years. In very, very particular situations. Our intranet-showing communal screen in the staffroom is touch so that people can navigate it without me having to supply mice and keyboards. It's basically a locked-down kiosk.
But on the desktops? And expecting it on laptops? And DESIGNING the OS to use it as a core input method? It's silly. It's always going to be second-place for serious work, and tuning the OS towards it AT THE EXPENSE of normal usage is not a bright idea.
Our Windows 8 test image? I installed Classic Shell and got rid of Metro as much as humanly possible.
This site was categorized in: Humor, Video sharing, Photo sharing, Adult Themes, Nudity
It may not be ALL new, but if there's a shred more independent evidence for the findings then, in science circles, that's news. Because it means that there's an increasing chance that the hypothesis is correct.
Science is like RAID. If the data isn't confirmed on/by several independent places stored in/obtained by slightly different ways, then it's not reliable enough to be counted upon.
When working in customer support, always think of the worst possible response you could get before saying something.
Because if the guy on the other end had been German (i.e. moving back home), or Jewish, or was the EU minister for racial relations or whatever, it was never going to end well.
I predict someone only got a slap on the wrist and a harsh word, but it could easily have resulted in a sacking. Maybe that's what the guy was after - I know a few people who work in customer support who like to "go out with a bang" when they leave, even if it ends up costing them their last month's wages.
Surely the downside is that the billing mechanism's primary source of identification is a yellow and black plastic sign that you could knock up at any printer's in about ten minutes, and it's difficult to refute false claims even if you KNOW it wasn't you in that car?
I think that the more weight that we put on number plates being read by technology (petrol stations, congestion charging, etc.), the more technology should be IN the number plate to stop it being faked. Because at the moment it's a bit of plastic read by optical cameras with no bells and whistles beyond a bit of OCR (i.e. you could fool it with a well printed bit of paper).
And is it technically illegal (aside from the obvious obtaining-services-by-deception) to not have a valid number plate while INSIDE a private car park?
Again - <sarcasm> because that's such a natural fraction that even an alien would pick it.... </sarcasm>
Binary - I can count to 1024* just on my ten fingers. Beat that.
*technically 0-1023, but who's counting?
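The finger-counting trick is easy to sketch in a few lines (a toy illustration of my own, not from the original post): treat each finger as one binary digit.

```python
def fingers_to_number(fingers):
    """fingers: ten 0/1 values (finger down/up), most significant first."""
    value = 0
    for f in fingers:
        value = (value << 1) | f  # shift left, append the next bit
    return value

# All ten fingers up is the maximum a pair of hands can show:
print(fingers_to_number([1] * 10))  # 1023
```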
Like the old adage about temperature.
When it's hot, we measure in Fahrenheit ("It's in the 100's today!")
When it's cold, we measure in Celsius ("It's below zero!").
It absolutely blows the mind of foreigners*, which may be enough of a reason to keep it.
*Living with an Italian, before you think I'm xenophobic.
Maybe. If feet had any scientific reference base whatsoever.
Hell, even the metre's reference base is a ridiculous fraction of the size of the Earth, just to be convenient.
By that standard, binary is an infinitely better system.
Addition, multiplication, subtraction become a doddle and you can get up to 1024 on your fingers without even struggling with thumbs and knuckles.
I've always wondered when we'll start teaching kids in binary. So much easier to grasp if you've been born with it.
I think the EU pretty much get this right (after much controversy and prompting): I don't care what you use. Just put the SI next to it.
You can state it in badgers per field as far as I'm concerned, but the INTERNATIONAL STANDARD is SI, so putting that in alleviates any doubt in conversion (US vs UK gallons), lets the majority of people know what's going on, and doesn't stop people saying their beer is in pints (568ml) - the kind of phrase that is deeply inbuilt into the language already.
Aircraft heights? Despite being able to comprehend imperial measurement, 50,000 feet means nothing to me (why use feet for such a huge distance when there are several suitably large equivalent measures in imperial already?). State it in SI too and I can at least work out what that means - some kind of comparison, how long my car would take to drive up to it, etc.
Nobody cares what you measure in. Just put the SI next to it so we can at least have a decent comparison to a standardised unit that's universal (there are countries in the world that won't know what an inch/foot/yard/chain/furlong/mile is, but there aren't any countries in the world that won't know what a metre is - even if that's only by "it's about a yard").
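As a toy illustration of how cheap the "put the SI next to it" rule is (the helper name and unit table are my own, nothing official), the conversion factors below are all exact by definition:

```python
# Hypothetical lookup table: native unit -> (SI-ish unit, exact factor).
TO_SI = {
    "ft": ("m", 0.3048),             # international foot, exact
    "mile": ("km", 1.609344),        # international mile, exact
    "pint (UK)": ("ml", 568.26125),  # imperial pint, exact
}

def with_si(value, unit):
    """Format a quantity with its SI equivalent alongside."""
    si_unit, factor = TO_SI[unit]
    return f"{value:g} {unit} ({value * factor:g} {si_unit})"

print(with_si(50000, "ft"))  # 50000 ft (15240 m)
```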
That's nothing - when I press the Degauss button on my SVGA monitor, it makes an horrendous BOING noise and the picture wobbles for about 5 seconds and all the colour goes funny before returning to normal.
Spot the problem. Because it took me a few goes to notice it.
It's basically a split-second where the screen is powered up.
Much as I love all the Apple-hate, this is really not worthy of an article, let alone such a hyperbolic one. The forum posts even say it's so brief that they had trouble capturing it.
It's like me complaining that I can see a slight black outline race towards the edge of any LCD screen when you first turn it on. It's called the screen coming on. And it doesn't impact on anything, at all, ever, anywhere.
It's better than that. If Obama can win a Peace Prize, it makes you wonder what the criteria are. Propagating wars that you told everyone you'd shut down if you were elected, continuing to imprison people in foreign countries a decade after their arrest and still without fair trial, etc. It's hardly "peaceful", even if you don't count that as the same as being "respectful of human rights".
If nothing else, nominating and awarding what is basically an entire continent just because it didn't OFFICIALLY declare war (but sent millions of troops to invade foreign countries and install a different leadership compatible with the US) is a bit far-fetched.
Because only 1 in 277 attempts of the technique he championed was successful and he said himself that it would never be viable for human cloning.
I'm not anti-clone (which is a bit like being anti-science, in a way), but it's not a terribly useful or reliable technique even years after Dolly.
Humour brought down to stark reality: the reason they didn't clone him is that the sheep that inherited his estate said it wasn't viable science.
I've often thought, network wise, of having some illuminated "thing" or other to indicate various network statistics. It would be nice to just glance and know that everything's running, or be able to tell instantly you open the door that something quite minor that you otherwise may not have noticed has gone wrong.
And each time I think of it, I can't think of a reason that a simple web-page or monitor with green/red boxes on it wouldn't do just as well.
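That simpler option really is only a handful of lines. A hypothetical sketch (the host list, styling and output path are all placeholders of mine): ping some hosts, write a static page of green/red boxes, run it from cron.

```python
import subprocess

# Hypothetical host list and output path - substitute your own kit.
HOSTS = ["192.168.0.1", "192.168.0.10", "fileserver.local"]

def is_up(host):
    """One ping, short timeout; returncode 0 means the host answered."""
    try:
        return subprocess.run(
            ["ping", "-c", "1", "-W", "2", host],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
        ).returncode == 0
    except OSError:  # no ping binary available
        return False

def status_page(hosts, check=is_up):
    """Render one green or red box per host as a static HTML page."""
    boxes = "".join(
        f'<div style="background:{"green" if check(h) else "red"};'
        f'color:white;padding:1em;margin:2px">{h}</div>'
        for h in hosts
    )
    return f"<html><body>{boxes}</body></html>"

# Regenerate every few minutes from cron and point any browser at the file.
with open("/tmp/netstatus.html", "w") as f:
    f.write(status_page(HOSTS))
```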
The child going to the toilet? After the first time, they really don't know where to go? I can't even think of a sensible use that isn't very specific (and so you'd probably just make it yourself or buy something for that purpose) or so generic that you wouldn't bother to buy a particular device to do it.
His biggest market? Executive toy / flashy thing put in the background while someone reads the weather, for about a week. That's about it.
Hell, I saw a kickstarter for something similar - a "programmable flashlight", which seems about the most hideous waste of computing resources I could ever imagine. This is just the same thing, scaled up.
Tablets have their own niche.
Laptops have their own niche.
Desktops have their own niche.
If you failed to keep up with them, I can't say that you've really got your eye on the ball as an IT company. Nobody's going to stop buying desktops to go out and buy tablets, and vice versa.
If anything, the decline is BECAUSE of Windows 8. What's the point in buying a PC now if it would come with Windows 8 in a few months' time and you'll inevitably have to upgrade to that anyway?
And, personally, my workplace are holding off until Windows 8 SP1, so that we can get an idea of just what's wrong and how to fix Windows 8 before we even think about deploying it. We bought new computers in the summer. We bought Windows 7 / Windows 8 compatible ones with touch screens. We slapped our old XP image onto them until we know what we're going to upgrade to and how we'll go about that. And we only did that because we're a school and like to make changes in the summer - the machines they replaced were fully functional and more-than-good-enough for another couple of years still.
I'm not that shocked.
Once the power goes, the board will slowly degrade, yes, but there's nothing to "damage" the electronics as such. If it survives the initial plunge and short-circuit, it'll probably stay down there for quite a while unaffected.
What will happen is that moving parts will seize and the boards will get mucky. Presumably the battery bulge is because the water short-circuited the battery. Everything else? Well, it'll survive until something actually has time to eat the copper tracks under the lacquer on the PCB. Speakers go because their main component is physically made of cardboard, and that degrades in water very quickly.
I've seen computers operating for years with dead animals inside them. I've dealt with cameras, phones and laptops that have been submerged in the muddiest of waters. Pretty much, so long as there's no physical damage to the machine (not just cruft, but something actually breaking / shorting / disintegrating), it'll turn back on once it's dry enough and you power it up. Cameras suffer worse because of their lens movement machinery, but SD cards are nigh-on invincible from the perspective of water. So are USB sticks. It's things with power that can short and burn connections that you need to worry about, but I've even had laptops that have survived major coke / coffee / tea spillage while turned on.
If you have no power, then there's no real risk to the components. There's nothing to short the memory or overwrite and corrupt the data, even if the chip itself is completely submerged. No power = no voltage, and the silicon chips are sealed units. The surrounding water is no different to just putting a connector across all the pins - when there's no power, it won't do anything at all. And everything else is plastic, lacquer-coated PCBs and various contacts. You'll find the edge connectors corrode faster than anything else in there, and they will still take months of submergence to actually wear away to the point where a clean wipe-over won't bring them back.
It's not at all surprising. In fact, I'd be most miffed if my own phone couldn't do just that.
You've made the mistake of thinking that this has anything to do with computing at all. Consider:
Guy rings you up. You don't know who he is, never heard his voice before.
He says he has some photos of you.
He'll send them to you if you want.
Just open your front door and he'll leave them on the mantelpiece.
Then you can open them in your own time.
If you haven't smelled a rat by line #2, you're an idiot.
My mum and dad are actually completely computer-illiterate. They are pensioners and they *can* play games like the Wii with some prompting (mum's actually a Mario-addict from the Gameboy days), but in terms of doing things if it doesn't start as soon as they press a button / put a disk in, they are absolutely baffled. They share a Facebook account that was their first ever online presence, made two years ago - up until then, they had no PC experience whatsoever (mum can type because she used a typewriter in a hospital job 40+ years ago but she still stabs the keyboard too hard), never owned a PC, never been on the net, never had an email account, never even done it through the TV or Wii or anything along those lines. Hell, it took years for them to learn to send a text.
When they get something dodgy (online, offline, on the phone, by text message, by Facebook, by email, by something popping up, or some dodgy bloke knocking at the door), if Dad isn't already shutting the door on their face, they are on the phone to me or my brother. They don't click on emails from strangers (in fact, they get rather annoyed that people they don't know CAN send them email or even Facebook messages), they don't download things, if the window today doesn't look like it did yesterday or something pops up asking permission they phone up or they just switch the computer off.
This isn't the result of intensive training - this is simply experience of what they've heard from others getting scammed, and application of their off-line principles to on-line actions (Who the hell are you and why are you talking to me on Skype / phoning me in the middle of the night?).
It's not an IT skill. It's a life skill. It doesn't matter WHO'S on the other end. If you don't know them, and don't think they have genuine business with you, hang up. Even if they do have genuine business with you, they will contact you another way that you *can* verify them.
But strangers popping up on Skype and asking you to do things for them (like click links etc.)? Come on. This is nothing to do with IT at all. It's common sense, even in pensioners with no IT skills above clicking on Facebook and replying to messages on there by text (not to Facebook, direct to the people in question!) after DECADES of bringing up two IT-literate sons.
Hell, Dad even sent me a message once asking if an email was genuine and actually included in the question he asked were the words "I don't even have an account with that bank!". Guess what, Dad? It's probably a fraud, then. Although not the best in deduction, he checked before he did ANYTHING.
Because that version number would have stopped you accepting a download from a stranger and executing it?
There's something to be said for letting natural selection take its course and getting these people forcibly off the net when these viruses delete their files and/or their ISPs kick them off.
Just how many warning signs do you need? Of course, to them, their antivirus "didn't do its job" - like saying the burglar alarm didn't do its job when you left the doors open, only alarmed the shed, invited the burglar in, then went upstairs to sleep, and would have ignored the bell going off anyway.
Why would you care?
It's not a question of just throwing those TIFFs into a website and hey presto, no matter what this article might state. There's an awful lot of processing, correlation, removing duff/duplicate data, censoring, resolution reduction, lining up, etc.
What will happen is you'll do a day in the air and then it'll take the guys back at the offices a few weeks to smarten that data up for production - so driving the whole thing up there and plugging it in won't be a chore and it'll only take a day or so to copy on a fast network / datacenter local net. You wouldn't bother to try to send it over the airwaves, it's just a waste.
And if you're an operation on the scale of megapixel-images of the entire country, you'll have a datacenter to host them, and you'll have multiple cameras, and multiple storage units and you'll do, say, a month stint in the air with the pilot, changing the storage each day for one of, say, 7 storage units - and copy each completed one off over the next week to your real storage where over the next 6 months your people will spruce it up and eventually publish only about 50% of it (given the amount of unusable footage while the plane turns, isn't lined up, is at too steep an angle, duplicates parts of existing strips, etc.).
You honestly think you'd pay for one of these things, strap it to an aircraft, fly around for 8 hours, and then sit on the ground trying to beam Terabytes home over the local pub wireless or something?
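The arithmetic bears that out. A back-of-the-envelope sketch (all figures assumed for illustration: a sortie yielding 4TB, a gigabit datacenter link versus a generous 20Mbit/s uplink):

```python
def transfer_hours(size_tb, rate_mbps):
    """Hours to move size_tb terabytes at rate_mbps megabits per second."""
    bits = size_tb * 1e12 * 8  # TB -> bits (decimal terabytes)
    return bits / (rate_mbps * 1e6) / 3600

print(round(transfer_hours(4, 1000), 1))  # 8.9 hours on a gigabit LAN
print(round(transfer_hours(4, 20)))       # 444 hours - two and a half weeks - uplinked
```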
What makes you think running a cable through a municipal sewer is any easier?
Permits. Rats chewing (they'll bite through ANYTHING, and have to, to stop their teeth growing through their jaws). Water. Damp. Effluent (which means new workers, new regulations, new equipment, new H&S training and standards, new corrosion problems, etc. etc. etc.). Then you have to get the cable into the sewer (which is no mean feat, though probably easier than digging up a road), and EXIT out of the sewer at enough points for you to - well - dig up the road and go to the premises. Then you'll need repeaters and stations along the route, which will need to go on the pavement (not in the sewer) and take you back to traditional permits and installations anyway.
All of that needs approval from the sewer companies, too, and the local councils, and you can't just shove any amount of things down there - sewers block all the time because of their under-capacity (I just had a thing put through my door telling me Thames Water will charge me £80 a year extra from next year so they can replace Victorian sewers in London - makes you wonder why they haven't factored that into their business contingency plans and normal upgrade plans since, well, the Victorian era).
In terms of overall access, it's no better than just holding a permit to put your own "tube" into the road alongside it. You gain little by doing it, and it's a lot of up-front investment, so the risk isn't really manageable. And you'll have to do all this with a "new" idea, which inevitably means years of trials, problems and complaints - and fighting every council for some new "right of access" to their sewers that they currently don't offer to anyone else.
It's just not worth the hassle.
Because there is no commercial incentive and we have no "public service" company to do it. If it were profitable, it would have been done by now, government project or not.
The problem is that it's not profitable to spend £100,000 on permits, street repairs, cabinets, cable, hourly wages, etc. to get £20 a month from 100 people (if you're lucky and sign a load of people up). Not for years. And then you barely eke out a profit because of the upgrades and support those users demand without paying much more. All the business plans would basically see a pittance for a lot of effort.
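To put that arithmetic in concrete form (a toy calculation using the figures above; real deployments have ongoing costs that make it strictly worse):

```python
# Toy payback calculation with the figures from this comment: £100,000
# build cost, £20/month from 100 subscribers. Ignores support, upgrades
# and churn, all of which push the break-even point further out.
def payback_months(build_cost, monthly_fee, subscribers):
    return build_cost / (monthly_fee * subscribers)

months = payback_months(100_000, 20, 100)
print(months)  # 50.0 months - over four years before you see a penny back
```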
Whereas some roads, or just buying up companies who went bankrupt thinking the pittance would sustain them as in your example, are much more profitable. You know you're going to make money from day one no matter how many sign up.
The % of people affected who would buy broadband from you.
The % of those people who would buy something other than bare-basics broadband.
It just doesn't add up even for a lot of quite "good-looking" streets. You won't make a profit, and if you do it will be at enormous risk.
If we had a state telecoms operator, they would have been told to "just do it", and it would have taken years but it would have happened. As it is, while there's no profit to be made from cabling your in-law's street, nobody will step up and do it - and CERTAINLY not when they can just run ADSL over the existing copper (which you have to compete with).
That's why Virgin push the "fibre" aspect - the speeds, etc. Over ADSL, most people wouldn't ever bother to choose them, so they're willing to dig up pavements, gardens and god-knows-what to install "fibre" for you (which isn't fibre for most of the run either, but at least technically better than ADSL even so).
Some things are just not profitable, for small or large companies, even with state subsidies. It probably costs more in copper or fibre to activate that road than you'll see back in 25 years from its residents, if there's a competing ADSL service. Where there is no competition, you'll still only see a pittance after a few years.
And the fewer people on the street, the further from other streets, the longer the runs, and the more satisfied people would be with just a basic service, the closer you get to making a loss by even bothering.
But high-density inner-city areas that are already well catered for, if you buy up, say, NTL or Videotron assets - easy money. Guess where the focus goes.
Off-topic: the school I work for was just quoted for a leased line. We are literally 20 metres from the BT exchange (hell, I could lob a patch lead out of the window, it's that close) but their ADSL2+ product is so poor that we've given up trying to balance their dodgy connections over multiple ADSL2+ business lines (my solution of building a Linux-based load balancer / failover router gained us three years of leeway, but the modems are so often down or not passing traffic now that it's pointless trying to compensate any more).
The leased line provider undercut Virgin by 20%, and sent us a map of the street cabinet they will run from - it's 12 metres away. The hilarious thing? The cabinet is owned and underlying cable will be supplied by Virgin anyway. They got undercut by their own reseller on a 3-year leased line contract. But we only have that option by either paying a lot of money or being INCREDIBLY close to a huge exchange in the middle of a populated town.
I wouldn't like to think what a leased line costs to some place out in the sticks, for the same reasons as cable and fibre being expensive or unavailable in those sorts of places.
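For what it's worth, the core of a home-brew failover router like the one mentioned above boils down to something like this (a minimal sketch; the gateway addresses and the ping-based probe are assumptions for illustration, not the actual setup):

```python
# Minimal failover sketch: probe each upstream gateway in order of
# preference and route through the first one that answers.
import subprocess

GATEWAYS = ["192.168.1.1", "192.168.2.1"]  # hypothetical ADSL modems

def is_up(gateway, probe=None):
    """True if the gateway responds; `probe` can be injected for testing."""
    if probe is not None:
        return probe(gateway)
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", gateway],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def pick_gateway(gateways, probe=None):
    """Return the first live gateway, or None if they're all dead."""
    for gw in gateways:
        if is_up(gw, probe):
            return gw
    return None
```

In practice the real box would also rewrite the default route (via something like iproute2) whenever the chosen gateway changes - this only shows the decision half.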
Is this such a bad thing?
I have, in front of me, an 8-core, 8GB, 1TB laptop with stupendous graphics ability. It was the cheapest that fit my criteria (which focused on things like having a numpad, having enough USB ports, etc.). And what am I doing with it? I'm browsing the web, sending email, and doing some mundane network admin tasks. Where's all my processor power actually being used most? Games. Outside of that, I'm just drawing pretty boxes in (apparently) extremely inefficient ways. I'm using 3GB of memory with hardly anything running and, although some of that is file cache, that's something that will be unnecessary soon if SSDs make their final leap to affordability.
With the limit on processor speed, people started to take advantage of multi-core. With a limit on that, people jumped onto GPU assistance. With a limit on the power that a certain size of device can draw overall, hopefully we'll go back to some good old-fashioned efficient code. Like not requiring 3GB of memory, dozens of "services", and lots of "frameworks" to draw a couple of 2D apps on a 2D screen (and I don't even have flashy stuff like Aero etc. enabled!).
I program myself, and I actually feel intimidated by the sheer amount of power available to me when I need it. And, yes, I get lazy and think "Ah, it'll be fine on a modern machine" but I think we'll have to go back to some decent programming again.
Of course, what will happen is instruction sets will grow (apparently the AES instructions in my processor let me do 2Gbit/s of encryption compared to 200Mbit/s in software), chips will increase in size, cooling will take precedence, and we'll end up with huge monstrosities that still take 30 seconds to load whatever-version-of-Word-is-around.
It's both hilarious and sad that first-boot startup times and program-first-run times haven't changed since the DOS days (or, on the average person's PC, have significantly lengthened). Hell, I can emulate Windows 3.1 booting quicker than I can boot Windows 7 - and although systems do a lot more now, not much of it actually ends up as end-user-visible change.
Though people would (rightly) shout me down here if I suggested doing it for professional purposes:
Why only one drive?
RAID it, or at the very least mirror it, because apparently the expense and hassle of all that time spent wasn't worth any sort of backup or fail-over.
I'm not suggesting you'd do tape for data that you already have, or for home use (too stupidly expensive and slow), or even off-site backups. But if it took you a month to get all your DVD archive over, it would have taken only a day to slap in another hard drive and mirror it or, more sensibly, two to make a proper RAID.
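To put rough numbers on the mirror-vs-RAID trade-off (a toy calculation; the drive counts and sizes are just examples):

```python
# Usable capacity in TB for the two obvious options. RAID1 mirrors
# everything (one drive's worth of space, any one drive can die);
# RAID5 stripes with parity (lose one drive's capacity, survive one failure).
def usable_tb(drives, drive_tb, level):
    if level == 1:
        return drive_tb
    if level == 5:
        return (drives - 1) * drive_tb
    raise ValueError("only RAID1 and RAID5 sketched here")

mirror = usable_tb(2, 2, 1)  # two 2TB drives mirrored: 2 TB usable
raid5 = usable_tb(4, 2, 5)   # four 2TB drives in RAID5: 6 TB usable
```

Either way, one dead drive no longer costs you the month of ripping.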
Not a backup solution for critical data (can't stress that enough) but more than adequate to protect against drive loss when it's the FORMAT of the data that you spent so long getting right, rather than the actual ability to store that data.
To be honest, even their non-sandboxing "security" was present in the 80s. It's been the first thing you do in any security circle - isolate user mode from kernel mode as much as possible. Everyone else got the hang of it; Microsoft was still running user code in kernel space and vice versa up to Vista at least, and because it was for "compatibility", all doubts were ignored.
That's pretty much why Windows has such a terrible reputation, security-wise, and why viruses exist in the volume they do. Because there was little to no separation between concepts that people were separating from each other back in the 60's. Hell, some things like 3DFX drivers (I think, I can't remember the exact device) basically ran a service in kernel mode that accepted commands from user-space drivers. And you could quite literally DMA any piece of memory ANYWHERE on the machine from user-space (and not even just from privileged users). They let that junk in, at some point, and allowed it to operate.
Tell me why all processes can see the Windows folder? Why they were allowed to write to it for YEARS? Why they were allowed to keep copies of system-wide DLLs in their own folders (and thus create DLL-version hell) and even overwrite system-wide ones with their own version? Why they were allowed to kill other applications? Why they were allowed to tamper with system settings in the registry at all?
On Windows, for years, every program was equal and could do just about anything it liked, including killing off system-registered anti-malware without the user knowing. They didn't want to sacrifice DOS compatibility and, rather than emulate or isolate, they just allowed programs free rein. Only now are they realising the problems associated with that, pushing products to "fix" it, and actually doing something about the security of their systems.
Anyone with half a brain knew they didn't care about the end user security years ago. That's what we were all moaning about. Now that it's come back to bite them and can be used as a sales metric, they suddenly want to fix all that?
Letting OpenVPN in UDP mode traverse the firewall in either direction.
Because it wouldn't let you do it. You had to use TCP mode and/or install a "proper" software firewall on the machine.
Bugged me for YEARS. Windows Firewall was fine as a basic security front but died a death under any non-standard usage.
"* NOTE: If it's a netbook with a decent battery life, the whole process can be done from the pub, if you have a pre-prepared USB stick full of tools and wifi at the pub."
And if ever people wondered why I get paranoid about not plugging into public Wifi, there's your answer.
God, I know for a fact that I just remove stuff like that from machines, even if it comes bundled, on first sight. And when people ask "Is there something cheaper to renew each year?" I go out of my way to put them on better and usually free AV and get rid of whatever they have.
I work in a school. The other week, I got the usual email from my systems listing the domains of websites that people have visited using the school systems that week. We list it by number of visits, so the top one is the most frequent and the bottom one, several thousand entries down, is usually single IPs or CDNs. I spotted an update.mcafee.com URL in there, right at the bottom. God, I nearly had a heart attack. Had to go track the machine down and flush it out. Turned out to be an ancient laptop that someone had found from somewhere.
To be honest, personal software firewalls / AV etc. are more than catered for by freeware. And as I like to point out to people, I've never seen a paid-for AV spot anything that a free one didn't, or clean it up any better (if at all!) - and in fact VirusTotal will show you that, for any given file, just about 70% of the antivirus packages out there will totally miss it.
The difference is, it's a real nightmare to stop Symantec / McAfee software running and get it off the PC when you do need to and it bugs the hell out of you. And, of course, they expect you to pay for it at some point.
I don't think interest in traditional PCs is waning worldwide, and I certainly don't think that anti-malware writers are suffering because of that even if it is. They're suffering because people are slowly realising that these products are nothing but a miner's canary anyway, cost the earth, bug the hell out of you, stick to machines and won't let go, and slow the machine down by ORDERS OF MAGNITUDE.
I own a tiny piece of amber with a little tiny fly in it. I used to have the details of the exact dates, etc. but can't find them now.
I find it wonderful, even if it's nowhere near as magnificent as this one. I actually kept my QX3 just so that I could look at the insect (which is very near an internal crack / impurity in the amber so is difficult to spot from some angles).
It always makes me wonder just how they get caught in it - I mean, did it drop from the tree onto their heads (surely that would squish them slightly), did it ooze around them (and then you'd have expected the spider to let go or be seen to be moving away), or what?
My tiny fly, hell, it could have been dead before it even ended up in there - it's hard to tell. But this one makes amber take on a whole new menace for insects. Future sci-fi plot anyone?
Yep, but almost every English home has only an electric fridge / freezer, and almost no-one has a gas freezer in their house (maybe in something like a caravan or boat). I've seen them, and know of their existence, but to say they are rare is an understatement. Which kind of proves the point above about people rushing out to buy such specialist equipment should blackouts become a regular feature.
And though freezers can last a day or two when kept closed, there's no point having a freezer you can't open if the blackout lasts more than a single day. On day 2 everyone will be going out to find a) a generator for their existing freezer or b) a freezer that doesn't need a generator.
Which is the whole point. Most UK housing is designed to rely absolutely on electricity. Hell, a lot of boilers won't ignite without electrical power either, even if the gas stays on. You can have an all-electric house (quite common, in fact) but I've never seen an all-gas one. And nobody has water storage tanks because their water supply is also pretty reliable. All these things have come about from a reliance on electrical power, in one way or another.
So when the blackouts occur, people won't be swapping their huge SMEG for a dual-fuel fridge. They'll buy a generator and run the whole house with a £200 transfer switch and a visit from an electrician (I just priced it up and found an electrical guy who'd do the whole shebang, labour included - powering a house from a generator via a proper, legal transfer switch - for £400 plus generator cost). So the whole "let's save the environment by cutting back on electrical supply" idea is likely to do the exact opposite if it leads to prolonged blackouts.
Or they could have just built a few decent power stations and forgotten the whole "green energy" initiative nearly thirty years ago and we wouldn't have the problem for several thousand years.
Not really - the inductive load of a fridge motor is the worst possible load to put onto a battery power inverter (which is what a UPS is). A large UPS might cope with it, but small ones will just blow or fail to start the motor at all. There are warnings about it in every UPS manual, if you look.
Motors generally need something meatier, like an engine-powered generator, to cope - and even then you're supposed to stagger their startups to reduce the initial load. They have a very unusual energy-use curve: if you don't stagger them they will blow fuses or just not start up properly at all (can you damage the appliance? I'm not sure), whereas once actually running they'd happily all cooperate without stressing the power source.
A UPS would be okay for things like lighting, small household chargers, computers (obviously), but it won't run your fridge, freezer, washing machine, tumble dryer etc., EVEN IF they're within the stated wattage, because of the motor inside. Basically any copper-coiled electric motor of any significant power is a no-no. Pumps etc. generally come under the same remit, but something small like a fish-tank pump won't worry your average UPS.
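A rough illustration of why the numbers don't work (the power factor and inrush multiple below are ballpark assumptions, not measurements):

```python
# A compressor's start-up (locked-rotor) current is commonly several times
# its running current - 6x is a typical ballpark - and the inductive load
# means the VA demand is higher than the nameplate wattage suggests.
def startup_va(running_watts, power_factor=0.6, inrush_factor=6):
    running_va = running_watts / power_factor
    return running_va * inrush_factor

demand = startup_va(150)  # a modest ~150W fridge compressor
# demand comes out around 1500 VA - far beyond a typical 600 VA desktop
# UPS, even though "150W" looks comfortably within its rating on paper.
```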
I believe he is. I won't be the only person sitting here thinking "Well, if it does happen, I'll just invest in a generator." Inbuilt in that decision is the capability for my own power, for conservation of that power, for choosing the right appliances for that power, and everything else mentioned.
Hell, I know people who have generators that they turn on in the smallest of brownouts to keep the freezer or fish-tank going, so it's hardly a huge step to contemplate once blackouts become common enough to affect daily life. And, when it comes to it, I have a petrol strimmer that delivers 700W of rotational power for hours on end from £5 of petrol, through a rod which I could attach to any electric motor and get sufficient power out (even mains-level, if I use the cheapy inverter from Maplin's that I have in my car and a suitable battery) to do something with if blackouts start becoming commonplace. It might only be a bulb and the laptops, but it's going to make me think more about providing that power than anything else if I get home and have no power for the whole evening.
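As a sanity check on the strimmer claim (the energy density and engine efficiency are assumed ballpark figures, not specs for any particular machine):

```python
# Petrol holds roughly 34.2 MJ per litre; a small two-stroke might turn
# about 15% of that into shaft power. How long can that sustain 700W?
def runtime_hours(litres, output_watts, mj_per_litre=34.2, efficiency=0.15):
    usable_joules = litres * mj_per_litre * 1e6 * efficiency
    return usable_joules / output_watts / 3600

hours = runtime_hours(3.5, 700)  # a fiver's worth of petrol, give or take
# comes out around 7 hours - so "hours on end" holds up, before you lose
# more in the inverter and battery on the way to mains voltage.
```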
I certainly won't be splashing out on solar panels or home wind turbines, though, because the subsidies for those are likely to dry up very quickly in such circumstances because of their inefficiency and I'd be left with the bill and a device that will die before it will reap back my outlay on it.
You have no idea how reliant on electrical power you are until you take away the modern 24/7 supply of it. My first thoughts would be "freezer, fridge, lighting, heating (depending on the time of year), fish-tanks, everything else", but over any extended period of blackouts you'd find my basics taken care of and me there watching TV (even if it's pre-recorded DVDs because the transmitters are down) - and I wouldn't be alone.
When I lived with my parents, during any sort of extended blackout (sometimes scheduled because of local works etc.), our neighbours and my dad would already have been out to the trade shops and have ready in place enough generator power to keep enough going in the house that no food would go off and people could do their household chores. Hell, I can remember him throwing single-sockets over to the neighbours and telling them to plug their lights into it.
Electricity is the one thing you can make on your own, even if you have to improvise. More worryingly, powercuts will lead to a HUGE increase in the price of petrol and of things like generators and two-stroke oil (not to mention the related greenhouse gases that "conservationists" don't want us to emit, but which they will have effectively increased in such situations by not providing nuclear) - and some houses rely on electricity for everything from lighting to heating to storing food to keeping livestock to having a bath.
Your iPhone/laptop/Blu-Ray kinda becomes useless as entertainment in a blackout if you can't charge it and though most people will tolerate one night, after a week or a month of blackouts, they'll do something about it. Which generally involves spending lots of money to become independent of the grid.
The fact you call Metro "modern UI" means you're probably an MS troll. :-)
But above that, the new UI is *precisely* the reason most people are sceptical about the new OS. It's basically Windows 7 with knobs on, but those knobs make it harder to do things like open and close doors. Windows "classic shell" replacement utilities abound at the moment, and did even for Windows 7. I've been asked by my workplace that, should we deploy Windows 7/8 next year, we reinstate the classic shell as much as possible. Now, we're a school, so we get stupendous discounts on MS software if we hassle them enough, so it's not like we wouldn't pay for it some other time anyway. But the fact is that you can't ask a six-year-old to type a program name into a search bar to find their programs all the time when they could have just clicked on an icon. Sure, we can cater for that problem with desktop / Metro icons, but the "modern" UI is no more efficient than the old system for certain use cases.
Including, for example, mine. My Windows 7 upgrade (and months of Windows 8 testing) meant that within the first day, I saved more time by wiping out the Windows 7/8 interface and replacing it with a classic shell from a third party (not even a damn option - only "Microsoft Knows Best") than by anything else. It literally was more cost- and time-effective to remove the UI and replace it with someone else's so I could get on with the work of building the PC so people could test it. IN THE FIRST DAY. Don't get me started on narrow-down search of 1000-odd applications compared to nicely organised folders on the start menu (narrow-down is WONDERFUL when it needs to be used, which isn't then - I have narrow-down instant search of 15 years of email in Opera and it's brilliant). It took myself and my boss TEN MINUTES to work out how to exit a Metro app (without Googling) using only the touch interface on the brand-new touchscreens we bought for Windows 8 RTM.
Same with recent Ubuntu releases. I spent weeks trying to get used to our last Ubuntu deployment, couldn't, and in the end wiped out the Unity interface and all its trimmings and went back to basics. Same reasons, basically. We still haven't picked a distro for the next deployment, but Ubuntu is looking slightly less favourable given the extra hassle we have to go through to let us open programs with one click, close programs, and not have things pop up, slide over, etc. while we're working.
People work in different ways. No-one wants Metro to die an absolute death and be another dead end in the corridors of forward compatibility. All we want is an option to turn the fecking thing off and go back to how we're used to working. We honestly DO NOT CARE how much better life could be if we could all get used to it. The computer is there to obey us as a tool, not tell us how we have to work - and most of us have jobs that DO NOT need to involve learning yet-more computing paradigms that may eventually disappear when we could just do things as we've always done.
I spend enough of my life setting up machines for other people. I don't WANT to spend hours (and in some experiments with Windows 7 MONTHS) "getting to know how to use the tool" when the end result is the same as that which I have always achieved. I just want the damn thing to do what I ask and nothing more, which involves no ripping out decades of established UI metaphors in favour of shiny new interfaces that slow me down and make me remove them.
But the sad fact is that you can cause a lot more damage, inconvenience and "terror" a lot more cheaply by flying someone else's airplane into a tower. It's taken over a decade for the world to right itself after that at enormous military expense and there's still a lot of things that have never returned to how they were before (e.g. airport security procedures for travellers, certain countries' reputations, etc.)
Taking down the data networks of an entire country is no mean feat - especially given the diversity and sheer number of connections that involves cutting (e.g. taking down satellites too). And you're unlikely to affect the military because they have their own independent means of communication and, if you do, well that's an act of war and someone will get blasted back to the 19th Century pretty damn quickly.
That's the problem with Bond villains - they try to scale up before they've actually created any reasonable mayhem in the first place. Fort Knox, Silicon Valley, global media, a satellite that reflects the Sun, it's all too ambitious for a first hearing of their name and they have a shocking tendency to be susceptible to and victim of pretty young women who know their entire plans.
If you wanted to cause chaos today, take out the DNS servers. Smaller target, easier to do, much more impact (and requires little technical know-how to actually take down once you know where they are). Gain yourself a reputation, and THEN threaten things that would have huge, permanent knock-on effects. Hell, you'd probably do more damage to the world by taking out a certain software company than anything else.
They do, they just don't say they do.
I have one of the cheapest 32" Samsung affairs from Amazon (after testing its suitability in my local product demonstration store, I mean Dixons). The box only mentions Freeview, and says it's "HD" (proper, not "HD Ready"), but never the two together. And yet it does Freeview HD perfectly.
This is it if you'd like to see: http://www.amazon.co.uk/Samsung-UE32EH5000-32-inch-Widescreen-Freeview/dp/B007JURCH8
The fact that I never use the HD channels on it is neither here nor there, but it certainly does Freeview HD. The magic incantation to Google is "DVB-T2".
Sorry, but 1080p is nowhere near the envelope edge. It barely touches the stamp.
We've had 1920x1080 monitors for decades now, and larger, more powerful setups are easily cobbled together. Even today's HDMI standards beat that into the ground (even doing stereo 3D at that resolution / refresh) quite easily, with room to manoeuvre.
And if you then take it to logical extremes and treat display data as just any other data (which it is), Gigabit Ethernet and 10GbE have been doing those sorts of bandwidths, with even less latency, over 40-100m runs (not the poxy 1-2m of an HDMI cable without a repeater) for many years now.
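The raw numbers back this up (a quick calculation; the HDMI payload figure is an approximation):

```python
# Uncompressed bandwidth of a video stream, in Gbit/s.
def raw_gbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e9

video = raw_gbps(1920, 1080, 24, 60)  # ~2.99 Gbit/s for 24-bit 1080p60
# For comparison: 10GbE carries 10 Gbit/s and HDMI 1.4's TMDS payload is
# roughly 8.16 Gbit/s - so even uncompressed 1080p60 is nowhere near the
# ceiling of what an ordinary digital link can move.
```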
That said - 4K is a waste of time and OLED has been "coming" for years now. Sure, they'll sell units of both technologies and, eventually, of one with both. But you won't see a repeat of the mass-scale CRT -> LCD, huge box -> convenient wall-mounted panel, analogue -> digital, SD -> HD upgrades that everyone has done (usually in one single leap rather than piecemeal). In fact, it'll probably be a while before you see anything like that again, as much as the manufacturers would like to fabricate glossy things that we "must have" even if we can't see the difference - like the Emperor's New Clothes.
And upscaling is a waste of time - it amounts to "let me blur that image slightly so you don't see the edges of the pixels from the original source".
There's nothing stopping the data getting to the displays at all. There's nothing stopping the resolutions increasing and the colours getting brighter. But there is a huge barrier to actually selling this to people - which is that most won't be able to see it and hype dies off quickly. My dad's 53" TV? You'll be lucky if he's watched anything HD on it in the last 2 years, because he doesn't have HD kit or HD subscriptions. But it was a nice big telly to replace the horrible huge CRT box he had before.
It's not "not ready for prime time" - it's more "technology will work, nobody will buy it".
+1 just for circumcisional and accompanying explanation.
"Excepting that the gates contained various technologies to ensure that atmospheres (gaseous or aqueous) wouldn't pass through the gate."
1) You spend far too much time watching sci-fi, Sheldon.
2) How does the gate know the difference between an atmosphere and you breathing, or between a cup of water or a wet exterior and the sea coming through? For any race, breed or species that happens to use it (what about a slug?). How does it apply force to individual molecules to stop them transiting, after having determined they're part of the organic system that wanted to travel through the gate, compared to - say - some bacteria in the ocean or in your breath?
There's a reason sci-fi glosses over such things - they are impossible to make work simply in practice because of such issues (e.g. "Heisenberg compensators" et al). And if you have two stargates, you have an infinite energy source, which is probably quite capable of blowing up both of them if you do something wrong (e.g. a single particle gets trapped in an inter-gate loop with differing gravity on the two sides, where it gains more and more energy from gravity until it blows the hell out of the gate, space-debris-style).
3) It's a joke about a sci-fi series, something that should NEVER be taken seriously. But apparently you missed that the first time round.