1188 posts • joined 28 Mar 2007
I said from the second I watched the episode:
He should have offered Tom the partnership on the condition that Helen works for them both and organises things. They even said, just minutes before, that Tom worked best and won whenever Helen was organising him on a team. She'd have cost a small percentage of the investment he was putting in, but would have got far more extra productivity out of Tom than that.
Sometimes the answer is staring you in the face. On a previous series they had to advertise a new portable music gadget - one team hit on the idea that you can't carry your jukebox with you, but you can carry the gadget, and then had to come up with a TV advert. And literally MINUTES later they were struggling to push a full-size jukebox up the stairs into the studio in order to film that advert. Stop what you're doing, get the camera, film it - that's your advert: you, trying to push a jukebox everywhere you go instead of just using gadget X. Instead, they made some overly-abstract piece of crap.
Thanks, stopped me having to come up with other excuses not to buy a Microsoft console:
1) Unfulfillable sales-spin of the highest order.
Seriously, stop talking shite. Do you mean it can render in HD? Do you mean it can render X million polygons a second with full shaders etc.? Or do you mean you can put up a pre-rendered image and say "It's as good as Avatar" (not that I've even seen the movie, nor care)? Because what you WANT people to think you mean is vastly different from what you ACTUALLY mean. What you WANT people to think is that this is equivalent to a modern supercomputer churning for months on end - in reality, it'll be underpowered compared to a recent gaming laptop.
2) The enjoyment I get out of a game is almost inversely proportional to the time / effort / processing spent detailing the characters.
Games development is already far too long and far too expensive and I wouldn't touch a game that advertised itself on the basis that it "looked more realistic". It just means they spent more time on the artwork and less on the gameplay. Any idiot can rent Avatar if that's the sort of thing they want to look at - personally, for me, a games console plays games.
3) The AI claims are utter tosh.
What you mean is that the individual pedestrians have a handful of attributes and select one of many pre-selected options as to what to do (run, shoot back, hide, etc.). That's *NOT* AI. It's simple heuristics. And it doesn't matter if you do it with one or a million on-screen characters, it doesn't mean they are doing anything other than the same pre-scripted actions that every game of that sort has. Show me a problem where you can just throw more power at something and get "better" AI and I'll demonstrate that it's not AI at all.
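To make the point concrete, here's a minimal sketch of the kind of "AI" being described - all attribute names and thresholds are invented for illustration, not taken from any real game:

```python
import random

# Hypothetical NPC "AI": a handful of attributes and a fixed menu of
# scripted actions. Nothing here learns or reasons; more NPCs or more
# CPU power just means running the same lookup table more often.
def pick_action(health: int, armed: bool, cover_nearby: bool) -> str:
    if health < 30:
        return "hide" if cover_nearby else "run"
    if armed:
        return "shoot_back"
    return random.choice(["run", "wander"])  # filler behaviour

# Every pedestrian on screen runs the same table:
print(pick_action(health=20, armed=True, cover_nearby=True))   # hide
print(pick_action(health=80, armed=True, cover_nearby=False))  # shoot_back
```

Throw a million of these on screen and you've demonstrated raw rendering and scheduling power, not intelligence - the decision table never changes.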
And in terms of gaming - I've yet to see any significant advance in "AI" players in decades. Hell, most AAA titles still have things getting stuck in walls, being lured to certain death one at a time without the rest noticing, walking into quite obvious traps and literally being manipulated by the player into dying in the way the player wants. "True" AI doesn't make for fun games at all, anyway.
Tried, Trusted, Tested.
Why would anyone even think that slapping a random chip up there would be a good idea? Yeah, for everyday entertainment, or casual use, or non-essential admin work, but for life-critical functions?
Hell, they've *already* got four of the things doing nothing but double-checking each other's calculations, and it's an incredibly hostile environment (think what happens if there's a tiny bit of condensation - you can't just open a window, not to mention flying through varying EM fields, cosmic rays, and potentially unstable power - and tech support is several months away). By comparison, the liftoff was nothing.
I'd much rather they were using these old clunkers - presumably with chips that have 30+ years of errata, billions of hours of real-world testing in commercial and industrial environments, and production lines established for decades - than trying to slap an Intel with an undiscovered FDIV bug up there. Plus the old chips were predictable to an extreme degree (down to the individual cycle). Modern chips are multi-core, have vast caches, unpredictable bus contention and all sorts of other problems. That's not what you want. And 1.4MHz is more than fast enough for doing anything critical; the time for that action to actually have a physical effect vastly outweighs the computation.
Boot-cycle laptops? The only time I ever reboot mine fully is when something crashes really hard (and that's usually followed by days of diagnosis to stop it happening again). Laptops brought these amazing things called "suspend" and "hibernate" to the user's attention, and they're now options on every shutdown dialog on every operating system in common use.
My laptop runs XP too, so the "uptime" of the single session that's been running on it for months at a time is huge and quite impressive by now - XP isn't the most stable of OSes but it's obviously still pretty damn good, without ever seeing a BIOS screen in the meantime (I much prefer suspend because it can last being "off" all day, it's rare that the laptop is off for more than an entire day, and it resumes instantly).
More importantly - why not just have the SSD-caching HDDs that are coming out now, which fit in a single 2.5" drive and would presumably take a lot less power? Or just two HDDs in a RAID setup? Dual drives in a laptop just sounds like the perfect way to increase power draw, cost and component count - i.e. things to potentially break - without getting much benefit for it. Gimme a laptop with two drive bays and I'll stick two SSD-caching HDDs in there, mirroring each other, and get better power consumption, better data security and probably a cheaper build too.
If you read up on evolutionary history, basically everything dies off quite regularly (as in 40-50% of species just disappear) and what doesn't die off evolves into different species to fill the gaps (it's believed that a lot of the "this species is critical to this environment's survival" assertions are a load of crap). The world copes and moves on without that ultra-rare toad which probably hasn't made any contribution to the local environment since it had a population of more than 100,000.
Think about it - if it was endangered or rare, and nobody really noticed apart from the scientists looking for that exact species - how large exactly was the part it played in the local ecology?
The biggest "problem" with species dying off is that they may have stuff we could use. If certain species of animal die off, or kill off some associated plantlife (because it's no longer fertilising it), we may never find out what that plant or animal could do for us - e.g. produce painkillers for humans, help us locate the DNA segment for property X, let us find missing links between species, etc.
Species are created, evolve and die every single day, globally. Unless you can prove that one has a dramatic effect on the local environment, or can really show science something, there's little reason to go to special efforts to keep it (artificially, in captivity, in man-made environments, etc.). Do we mourn the millions of sub-breeds of cow that haven't been chosen for breeding because they don't produce a lot of milk or make manky-tasting beef? Not really. But apparently a frog that nobody's seen in decades is "important". Personally, I think you'd have been better off just leaving it alone rather than acclimatising it to human contact, which is likely to end up killing it (either because it gets too friendly and runs across a road, or because it catches something from us, or because it starts to feed from human waste and thus becomes a dependent pest).
Every single time humans interfere with species distribution and movement (e.g. introducing cane toads to Australia) we make things worse, even when we're supposed to have thought everything through. Now, for instance, most of the pandas in the world are in zoos - an environment they are notoriously inept at breeding in.
Leave the fecking things alone and stop going looking for them. By the time you realise there's only a handful left, they aren't important to the local ecology any more and just become an administrative and financial burden for zero practical gain except for some PETA nuts to go "Aw, we saved him!" until the animals die in captivity.
Well, I'd never heard of the CISSP so I went googling:
After ignoring the paid ad on Google:
"CISSP Courses - Get CISSP Certified In Only 7 Days
Pass First Time Or Train Again Free"
I hit upon this link to a book on computer security which contains pretty much that as a real example of a question on the CISSP:
And given what I do know of MCSE exams (and precisely the reason I avoid them), I can quite believe this to be the case.
So it seems he's not talking all-that-much bullshit after all.
Tor isn't as safe as you think. It really isn't. Even the authors admit it has a lot of weaknesses.
But you're missing the point - monitoring EVERY node on Tor is an incredibly stupid idea to try to "catch" someone or find out new people. If you want to do that, you just tap existing known trouble-makers and they'll lead you straight to it without having to crack Tor, or PKE, or anything else.
Tor is nothing but an anonymised network if you're not really interested in who's on there. If you are interested in someone there, then you already have everything you need to find out who they are by other means (doing your job - espionage). Tor has no end of problems with regard to exit nodes, groups of hostile nodes, etc.
And tell me this: when you installed Tor, did you check the download against the PKE-signed certificate verifying its authenticity, obtained from a completely secure channel - the only one I can think of being direct from the Tor authors (and, obviously, knowing that the Tor authors are trustworthy)? Or did you just click the link and install directly from an unsecured website KNOWN to host a government-opposed secure communications network, without checking whether that request was intercepted in any way?
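For what it's worth, the integrity half of that check is trivial to do yourself. A hedged sketch - the file name and bytes below are placeholders, and in reality you'd want the authors' signature over the digest, not just the digest itself:

```python
import hashlib
import os
import tempfile

def sha256_of(path: str) -> str:
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo on a throwaway file; in practice you'd hash the installer you
# downloaded and compare against a digest published out-of-band by the
# authors over a channel you actually trust.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"pretend installer bytes")
    path = f.name

print(sha256_of(path) == hashlib.sha256(b"pretend installer bytes").hexdigest())  # True
os.unlink(path)
```

The hard part is not the hashing - it's obtaining that reference digest over a channel the attacker can't also tamper with, which is exactly the point being made above.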
Any decent terrorist worth their salt wouldn't be caught by association like this anyway.
Hell, Bin Laden "supposedly" evaded capture for 10 years by just couriering his messages to a cybercafe on USB stick. If you had any brains the actual connection would just be to a huge public hub frequented by everyone (e.g. Skype) and the actual communications would be encrypted with the best thing you can find (not some roll-your-own, or depend-on-company-X, encryption).
PK Encryption was INVENTED by GCHQ precisely to avoid being overheard by hostile foreign militaries. Since then, the process and its implementations have expanded to cover all sorts of theoretical and practical holes, been rewritten from the ground up, and been analysed for weaknesses by every mathematician studying the area - resulting in modern-day open-source encryption products for which *NO* court or individual in the world has ever proven decryption of data without the key, for at least the last couple of decades.
PKE is just something you have to accept won't be monitored or "broken" within a useful amount of time, even by the best militaries. If militaries demand that classified files are encrypted with it, and use it in their weapons guidance and other communications, you can be pretty sure you're not going to just lever it open with some backdoor. Let's call PKE pretty much "unbreakable" in any timeframe you'd need to break it in.
Given that, the only way to "find" a terrorist is to monitor known terrorists and see what they do, who they talk to, what websites they go on. That doesn't need a full-blown, every-person monitoring programme. It just needs you to do some good old-fashioned spying for much greater effect than trying to "crack" their encryption or decode their messages. GCHQ are trying to stay relevant in the modern era - one which they invented - and are failing. They're not "spies" as such, their communications interceptions now rely purely on the mistakes of others rather than their skill, and they are facing rapidly shrinking budgets and usefulness. Saying they just need more money to be able to perform the impossible is their way to continue existing.
All you're doing is a bit of graph theory and probability - badly. It's not going to catch anyone with enough brains to plot something major, and it's *not* going to magically tell you that X is a terrorist without an awful lot of the groundwork you were doing anyway. And it will be outclassed by just tapping the ONE guy you already know is a terrorist and seeing who he communicates with - which would probably be made easier by just slapping a bug on his residence.
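The underlying arithmetic is the classic base-rate problem. With purely illustrative figures (none of these numbers come from any real programme):

```python
# Base-rate arithmetic: even a remarkably accurate classifier drowns in
# false positives when the thing it looks for is vanishingly rare.
# All figures below are assumptions for illustration only.
population = 60_000_000      # roughly the UK
actual_bad = 2_000           # assumed number of genuine suspects
sensitivity = 0.99           # flags 99% of real suspects
false_positive_rate = 0.001  # wrongly flags 0.1% of everyone else

true_hits = actual_bad * sensitivity                      # 1,980
false_hits = (population - actual_bad) * false_positive_rate  # ~60,000
precision = true_hits / (true_hits + false_hits)

print(f"flagged: {true_hits + false_hits:,.0f}, of whom real: {precision:.1%}")
```

Under those assumptions, only around 3% of the people flagged are actually of interest - the other ~60,000 still need exactly the old-fashioned legwork the dragnet was supposed to replace.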
GCHQ were great, once. Now they've realised that they've obsoleted themselves, and are trying to blind people with large figures for the promise of impossible capabilities. Sadly, they'll probably get it.
PvZ is *unlike* that regurgitated junk (DNF) - hence PvZ is good.
The balance in PvZ is annoyingly good and makes you want to play it more. But then, so did Peggle, and Bookworm for me. PopCap knew how to balance a game perfectly, no matter how simple it seemed.
And already, I'm referring to them in the past tense!
Nah, I was playing that for hours when I got it a few months back. Stupidly addictive, unlike most of the regurgitated, steal-an-engine-and-make-it-prettier junk that's out there.
Here's a hint. You're stuck on a desert island with only a PC and a single game installed on it:
Duke Nukem Forever, or Plants Vs Zombies?
I think I know what most people would choose.
Knew I should have bought the PopCap Complete Collection when it was on the Steam sale last week. How else am I going to make sure they don't ruin all those perfectly good games forever?
It's called "supervising your children". Some parents do it in different ways (i.e. physical supervision), some parents do it like this (making sure they aren't in a bad crowd, making sure you'll be made aware of anything that might crop up that might be undesirable), some parents don't give a damn.
It's the digital equivalent of phoning up the parent who's having the sleepover birthday party to "get to know them", or chatting to other parents at the back of the Scout hut for a few sessions to make sure they're alright. It's not that big a problem, and at least it means the parents give a damn, which sadly few do nowadays.
The best way to ensure that people aren't sending your kid rude photos and hurtful messages is to sit next to them and read every line before they do. The next best way (that also gives them the freedom to explore, be responsible and make decisions) is to keep an eye from a careful distance and just make sure they're within the right crowds. That's all this is.
Yeah, that was my immediate first thought. Who are they to say how many people I can have accessing their disks or not?
It's not like they're actually doing anything other than offering disk space over a LAN, and many of their rivals are either open-source or serious kit. A small office isn't going to need more than a couple of disks in an array, and if it does, its existing servers and/or a cheap box from any network supplier will do it. What exactly does the Windows software offer over and above file access? It's very doubtful that it's worth a per-user licence.
Anyone seriously connecting a Windows XP box directly to the Internet is going to have bigger problems than window-scaling anyway. And all that window-scaling stuff has never really been a problem for anyone except possibly CERN. If you're trying to use connections with more than a couple of hundred milliseconds of latency, there's something wrong anyway - something that TCP window scaling can help with only a little (10-20% at best - it's not the miracle worker you claim it to be) but can't "solve". There's also a huge range of other problems that are infinitely more likely to be the cause (the remote server / connection just being that crap, for one).
In the last 15 years, I have never needed to tweak or enable TCP window scaling to download at phenomenal speeds on any operating system. Even if I had, virtually all of the machines I've ever used commercially have had intermediate routers, proxies and caches that would take care of such things on my behalf (the "connection" is only to them, not to the outside server). And even then I'd only save 10-20% of the connection time, if that - not the mythical 5000% you're claiming.
Additionally, if it was really that big a problem, everyone would be madly upgrading to 7, patching their registry, or just using Linux. They're not. I don't know of a single commercial deployment where they routinely increase the TCP window scaling, or have cited it as a reason to upgrade. And to be honest, the difference between XP and Vista/7 in the speed of copying data to/from local network shares VASTLY outweighs anything TCP-window-on-huge-latency-connection wise - what you win in TCP scaling to remote sites you lose (quite literally) 50- to 100-fold on copying files to/from local network shares.
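For anyone who wants to check when scaling actually matters: throughput on a single TCP connection is capped at roughly window size divided by round-trip time (the bandwidth-delay product). A quick sketch, with assumed figures:

```python
# Back-of-envelope bandwidth-delay product: without window scaling, a
# single TCP connection tops out at about window_size / round_trip_time.
def max_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    return (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000

classic_window = 64 * 1024  # 64 KB, the ceiling without window scaling

# On a low-latency link, the default window is rarely the bottleneck:
print(max_throughput_mbps(classic_window, rtt_ms=10))   # ~52 Mbps
# Only on very high-latency paths does the cap really bite:
print(max_throughput_mbps(classic_window, rtt_ms=200))  # ~2.6 Mbps
```

Which is consistent with the point above: if your round trips are short - and for most users going through nearby proxies and caches they are - window scaling buys you very little.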
The 64-bit thing, we agree on. But as yet I haven't hit a single deployment, or even personal use case (*cough* games *cough*), that requires (or would even benefit enough from) more than 4GB - actually 3GB, because my laptop steals 1GB for its nVidia chip and various other bits. When I do, I'll upgrade, but until then it's not the big deal you make it out to be.
This just in
Even better now - http://www.thinq.co.uk/2011/7/12/windows-8-will-run-all-current-pc-hardware/ is a quote from an MS exec that says that anything that runs Windows 7 now will run Windows 8 when it's released later this year.
Next up from Microsoft: How to kill Windows 7 deployments with three simple steps:
1) Announce that XP is supported for another three years.
2) Announce a new OS coming within the year.
3) Announce that any investment in 7 will be the same as an investment in 8 a year later.
If they are to be believed, I'd cancel all my Windows 7 deployment plans now... if I had any.
Why would my company spend vast amounts of money fixing something that ain't broke?
What makes you think that having your systems infected is in any way a preventable thing, or anything more than vaguely related to the last OS security update you installed? If you're doing things properly, it doesn't matter what OS is in use - people can't open attachments containing .EXEs, or even HTML code, that haven't been scanned / sanitised already, and they shouldn't be able to execute programs that haven't been authorised.
Preventing malware infections is a way of life for IT guys, and we deliberately limit things as much as we practically can and spot suspicious activity. In a properly locked-down environment, you have no need to know what OS is running at all.
MS stopped doing most security updates for Windows XP years ago, they only ever do the most serious now (i.e. ones that a huge, public virus exploits and causes millions of infections) so you've been in pretty much the same position for the last few years anyway. The point is that if you properly secure the points of entry, and properly sanitise anything passing through, and properly look for anything that's slipped past, what OS is on a desktop is neither here nor there.
Having the latest OS software updates is a useful tool but it's not the be-all-and-end-all of security. For a start, you're vastly more likely to catch something through an application exploit than anything else. Having the latest OS software updates applied without proper testing is vastly more likely to end in tears, though, especially if you deploy them automatically to every machine. I've had more downtime because of Windows Update than I've ever had through virus infections on the networks I manage.
God damn it
I better get a move on.... only 2 more years before I have to start looking at alternatives and see what's for sale then. Because unless you work in the largest of organisations, a year is a perfectly adequate time to do a serious amount of testing and then a massive one-off upgrade for something like that. And that's if I decide that we actually *need* to move to something else at that point, and that we will move to Windows, and that it will be Windows version X (whatever is best at that point).
Shame, because if you'd sorted out your educational licensing, Microsoft, so that I didn't have to pay annually for something I originally paid for once and owned a perpetual license to, then I'd have been on Windows 7 last year.
In the meantime? 3 years? That should see me into Windows 8 at least, by which time Windows 7 will be cheap and stable and I'll know all its quirks, and then extended support for that will last me until 2020 at least (assuming there isn't another endless deadline-slip like there was for XP). I'll set my calendar to remind me in 2 years to check the end-of-life date again, but that's about it.
But VISA and Mastercard are under no obligation to process a payment from or to anybody. They are merely third-party companies (not "banks") that may or may not extend you a line of credit (one that happens to be usable in high street shops and on Internet sites), and may accept or refuse any individual transaction for just about any reason they like.
Credit cards aren't "currency", they are credit. That's how their business works - you're taking a short-term "loan" out, to pay it back later (end of the month) and it just so happens that the credit on this loan is accepted by lots of shops as a method of payment. It's *credit* that is given voluntarily to a user and can only be used where such credit is recognised. If you're not credit-worthy, you will never get a credit card. Similarly, an "Icelandic Bank Credit Card" is likely to be worthless and accepted nowhere, a bit like American Express.
The problem is not the credit card companies - the problem is that people are so reliant on them that it's somehow shocking when they refuse to deal with you. You want to use their services but they are under no obligation to let you unlike, say, certain banks. And anyone can say "Sorry, I don't take VISA" in the same way they can say "I don't take AmEx" or "I won't take 50 camels in exchange for this product".
VISA and Mastercard are US-based, so they have to follow US law. If they are being stamped upon by the US, they will never do business with you and/or never extend you credit. Stupidly, both of the largest credit cards in the world are US-based, apparently.
They are under no obligation to deal with you. They are also under immense political pressure NOT to deal with you. This isn't *their* fight. And nothing in the world stops you taking cheques, cash, bank transfers etc.
However, the fact that it was invented twice, by two groups, without knowledge of previous work, means that it was probably "obvious to one skilled in the art".
Basically, "encryption" has been around since the Ancient Greeks. The only real step forward was public-key cryptography using products of large primes. Primes have been around since the Ancient Greeks too, and factoring was always known to be an interminably difficult problem to solve, even though it's incredibly simple (in comparison) to create such a problem.
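That asymmetry is easy to demonstrate with the standard textbook toy example - absurdly small primes, utterly insecure, illustration only:

```python
# Toy RSA with tiny textbook primes: multiplying p*q is trivial, but
# recovering p and q from n alone is the hard direction. Real keys use
# primes hundreds of digits long; this is illustration, not cryptography.
p, q = 61, 53
n = p * q                      # 3233, the public modulus
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent (modular inverse, 2753)

message = 65
cipher = pow(message, e, n)    # encrypt with the public key
plain = pow(cipher, d, n)      # decrypt with the private key
print(plain == message)        # True
```

Creating the problem took one multiplication; undoing it without `d` means factoring `n`, which is exactly the "simple to set, interminable to solve" gap the whole scheme rests on.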
By the same token, elliptic curve cryptography is hardly "novel", except in the use of elliptic curves to replace primes and the associated countermeasures and other things that need to be taken into account when using something other than primes. To call that patentable is really pushing your luck, because then it takes seconds to push patents through the system claiming cryptography using just about any mathematical process that's difficult to unravel without the original "key". By the same token, quantum encryption really is a whole new way of doing things but, again, is still mathematically based on knowledge that's been around for decades.
The only patents worth applying for, especially in the EU, are hardware patents. A piece of quantum-encryption kit that uses fibre to encrypt traffic between banks would have qualified as "new" ten years ago even if the algorithm wouldn't; so would a use of encryption to certify (say) that a card belongs to an authorised cardholder via certificate exchange. All patentable - and ALL as hardware, first and most importantly.
Otherwise you get into things like the DVD CSS debacle - where people "can't" describe the algorithm even if it's just a series of mathematical steps that have been known about for centuries.
I had a minor bump with a Jaguar last year. It was low-speed, low-impact, flat-on (so no piercing corners) on a roundabout, with no other cars involved, and both cars drove happily away immediately afterwards and on to their destinations.
Normally, you don't find out how much things like that cost but because the insurance companies were arguing and bringing in solicitors I had a rare insight into the cost:
£5500 for a new bumper for the Jag.
£4000 for a hire car for the owner while the work was done.
Considering my insurance was only £1000 a year, I had a good deal. I repaired my own car privately because I only had third-party insurance. A new bonnet, bumper and headlights came to a grand total of £100, fitting was free (but less than a day's work) - so you can tell the damage was ENORMOUS.... - and hiring a car for the day would have cost me about £100 again with something like those StreetCar deals (even if you weren't already a member, and including comprehensive insurance for the StreetCar). My car passed an MOT a year later without any additional work to correct the repairs or anything. Call it even double or more the price, and it still doesn't come close to what the insurers were paying each other for the claim.
And then you wonder why insurance companies charge so much. Technically, I have to be their customer for 9 more years, accident-free, before they start making a profit on me - and I'm hardly a boy racer, and they renewed my insurance this year for the exact same price even though everyone else's went up by 50% or so last year.
Okay, so let's say that Indian drivers crash more than Scottish ones (or the other way around - I don't really care, and am not trying to be racist).
Obviously, that means we can charge them more, right? Because they are statistically more likely to have a crash, that makes it okay, doesn't it?
If a simple extension of your argument leads to such (illegal) absurdity, then you have to be very careful about thinking through the consequences of your original argument. That's what the courts have done in this case.
"Sagittarians can get cheaper insurance with us!"
So, is insurance blanket cover for possible eventualities, or a person-specific risk analysis of a particular individual's behaviour? It's fast becoming the latter, which basically means that "insurance" is an incorrect term and a waste of time, because you end up basically paying all your own bills for everything you ever incur, rather than having a smaller blanket cover to cope in the event of a one-off rare expense.
The point of insurance originally was that millions of people would pay into a fund that would then pay those unlucky enough to have a problem needing MORE than they could afford in, say, a single year. (Like pensions - everyone pays a bit, but not everyone will make it to retirement age. Except now that's not true either, so we're all just paying for each other's retirement, and moaning whenever someone suggests putting up the pension age.) Not everyone would claim, because not everyone would have the problem, but everyone pays a small token payment to cover it all.
Nowadays insurance is basically "you pay it", over a slightly extended period - if you have a rare accident, your premiums skyrocket no matter what company you go to, to recoup that cost in full as quickly as possible. If it wasn't for the fact that certain insurances are compulsory, I wouldn't have any at all - for most things it actually works out cheaper to just save the same premiums yourself. I'm honest, though; what this does to dishonest people is make them stop having insurance altogether and just pay the fines for that instead.
Take, for example, public liability insurance. I can't be expected to pay someone £1,000,000 compensation overnight because of something stupid that I did, or something totally unforeseen. But the premium payments for such things are *TINY* in comparison because they are so rare, and I can afford that - and so can millions of others who then cover the base costs of the rare accidents. In that situation insurance actually gives the consumer an advantage and a reason to have it. But in car insurance, every single risk category ever available is taken into account so that everyone basically ends up paying their own bills - or, worse, far more than their fair share just because they are male - whether by settling privately, or by paying such HUGE premiums to the insurance companies to cover your costs + their profit that it makes no sense.
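The arithmetic behind that is simple pooling - all figures below are invented for illustration:

```python
# Why public liability cover can be cheap: the expected cost per member
# of the pool is tiny when claims are rare and the pool is large.
# All figures are illustrative assumptions, not real actuarial data.
members = 1_000_000
claim_probability = 1 / 50_000   # one claim per 50,000 members per year
average_payout = 1_000_000       # £1m per claim

expected_claims = members * claim_probability   # 20 claims a year
pool_cost = expected_claims * average_payout    # £20m total
fair_premium = pool_cost / members              # £20 each

print(f"break-even premium: £{fair_premium:.0f} per member per year")
```

That's the blanket-cover model working as intended: a £20 premium covering a £1m risk. The complaint above is that car insurance has abandoned this model by slicing the pool into ever-finer risk categories.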
What's needed in certain cases is a specific insurance that covers a lot more people. A compulsory, blanket insurance payment on your road tax, for example, refused to anyone who has X points on their licence (what happens at that point is a matter of policy - not being able to legally drive sounds good, but more likely they WOULD be charged extortionate amounts to get back on the road, which eventually brings back the same problems). And then you have the problem of uninsured drivers, where your premiums can skyrocket because someone else deliberately ignored a legally-required insurance. That shouldn't be happening.
Insurance of any sort is like a Dixons Extended Warranty. Overpriced, nowhere close to the value of the thing covered, you'll never hope to use it, but if you do you don't want to be charged more for next year just because you have already broken your iPod once this year.
Insurance can be one thing or the other: blanket cover for everyone at an equal, fair share of the total overall costs, or a personalised payment for a specific individual (in which case an accident-free male should never be charged more than an accident-prone female, as currently sometimes happens). At the moment it's trying to be both, so the insurance companies make a larger profit.
More worrying - what happens if someone provides statistics showing that Indian drivers are statistically safer than, say, Scottish ones? Sir Paul Condon-style statements aside, does that mean they're then allowed to charge me based on my nationality? I hope not - and in that case my gender is no different a piece of information. If you can discriminate on one basis, you can discriminate on all of them, so insurance companies will have to provide either blanket cover with no discrimination, or unique, personalised discrimination - and give me a lower premium than any established driver who has had an accident, until I actually HAVE one. They won't like either, of course, but that's business.
Different problem. ADSL on the telecoms side is an incredibly "noisy" way to splice a large number of analogue frequencies into different "digital" channels - it's more an A-to-D problem than a digital one. The transmission is analogue all the way to the local exchange / street cabinet, and only there is it converted to a real digital signal. So you're actually arguing analogue performance again, which is a different matter (HDMI is only EVER digital). Think of ADSL as a set of modems that all use super-high frequencies (standard old-fashioned 56K modems only use AUDIBLE frequencies, so that voice transmission doesn't interfere). ADSL is a bunch of 20+ modems on different high frequencies, and only some of them ever connect at any one time (with splitters to separate out the audible frequencies so you don't get more interference than necessary).
Plus, that RJ11 cable connects to your house phone wiring, shares frequencies with any and all phone lines in your house, connects to cabling to the exchange / cabinet that's 20+ years old, uses only two pairs at most, is subject to all sorts of homebrew extensions and conveniences and - get this - there is no such thing as an "ADSL certified cable" that will give you perfect reception if you use it (mainly because you'd have to plug it in at your local exchange, but also because there's no way you could specify such a thing well enough - and if you did, BT would have to replace every bit of copper in England to meet the standard). RJ11 cables (actually, that's the name of the connector) are just analogue cables. Most of the time you can get away with only a single pair inside them being wired. They aren't rated for any particular frequencies or anything, unlike HDMI, Cat5, etc. They are basically a bit of copper. I promise you, you can run a phone line over mains cable, jumper cables from electronics sets, even old headphone cables - I've done it when testing internal telecoms for my employer. You could do it with a set of car jump-start leads if you wanted. You *can't* necessarily do the same with Ethernet / HDMI, and you certainly couldn't sell it as a "Cat5" or "HDMI" certified cable because it does not meet the spec at all. It's an entirely different matter.
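The "bunch of 20+ modems" description maps onto how ADSL's DMT modulation actually carves up the line. A back-of-envelope sketch - the tone ranges and per-tone limits below are my own rough ADSL1-style figures, not something from the post:

```python
# Rough sketch of ADSL-style DMT channelisation - illustrative figures only.
TONE_SPACING_HZ = 4312.5   # sub-carrier spacing
SYMBOL_RATE = 4000         # DMT symbols per second
MAX_BITS_PER_TONE = 15     # ceiling per sub-carrier; noisy tones carry far fewer

def band_capacity_bps(first_tone, last_tone, bits_per_tone):
    """Upper-bound bitrate for a contiguous block of tones."""
    tones = last_tone - first_tone + 1
    return tones * bits_per_tone * SYMBOL_RATE

# Roughly: upstream uses tones ~6-31, downstream ~33-255 on ADSL over POTS.
upstream_max = band_capacity_bps(6, 31, MAX_BITS_PER_TONE)
downstream_max = band_capacity_bps(33, 255, MAX_BITS_PER_TONE)
print(upstream_max, downstream_max)   # theoretical ceilings; real lines get much less
```

Each "modem" (tone) negotiates its own bit-loading based on the noise at that frequency, which is exactly why line quality, not the cable on your desk, decides your sync speed.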
But HDMI is still inferior to that cable that plugs into the side of your laptop and has done for the last decade at least.
Sorry, HDMI runs at a maximum clock rate of about 340MHz. Inside that, there's a useful data transfer rate of about half that. That brings it into line with Cat5 cable, which costs about £50 for 305m on a one-off purchase (connectors and crimpers included).
Since then, there have been Cat5e (which can handle Gigabit rates), Cat6, and Cat6a (which can handle up to 10Gb Ethernet and potentially more). And those data rates are at *100m* (or 50m, in the case of 10GbE), not the paltry interconnect distances between HDMI devices. How much is Cat6 cable? About £50 for 305m.
So assuming you bought a 100m HDMI cable, assuming it was running at 1920×1200p60, assuming it had only 8 conductors in it (Cat5/6 only has eight wires inside; HDMI has more than twice that for little reason), assuming the quality of the connectors was better than some manually-crimped RJ45 plug, then it would be worth about £60 (£120 if you doubled all the conductors to make up for the missing pins). For a two-metre HDMI cable? I'd expect to pay about 40p/80p - call it pound-store stock by the time you package it. Stick gold-plated (pointless) connectors on it and a bit of quality control and you're into the £5 range. There's zero reason for it to cost any more than that. Ethernet has been surpassing the data rates and requirements of HDMI for years, and in fact if you want to extend HDMI, you're expected to use Cat6 converters/extenders to do it.
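As a sanity check on that arithmetic, here's the raw copper cost implied by the reel price quoted above (treating the £50/305m figure as given; connectors, labour and margin excluded):

```python
# Back-of-envelope cable cost from the reel price quoted above (assumed figures).
CAT6_REEL_PRICE_GBP = 50.0
REEL_LENGTH_M = 305

price_per_metre = CAT6_REEL_PRICE_GBP / REEL_LENGTH_M   # ~16p/m of raw cable
two_metre_copper = 2 * price_per_metre                  # ~33p for a 2m lead
hundred_metre_copper = 100 * price_per_metre            # ~£16.40 for 100m

print(round(price_per_metre, 3), round(two_metre_copper, 2), round(hundred_metre_copper, 2))
```

Even doubling that for extra conductors and decent plugs, a 2m digital cable is a sub-£1 item in parts, which is the whole point.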
There's nothing "special" in even the most expensive HDMI cable. Either something meets the HDMI specifications and can carry the logo, or it doesn't and can't - if it meets the specifications and is undamaged you will get a perfect digital image. There might be quality issues in terms of production (e.g. connectors falling off, conductors not taking kindly to right-angles and kinks, etc.) but in electrical terms HDMI cable is surpassed by the stuff that joins your printer to your wireless router (or whatever).
Even the analogue audio-cable scam was absolute rubbish. Digital signals, there's no excuse. I can't remember the last time I saw a packet re-transmit on my Ethernet statistics on any switch or computer I've managed in the last ten years - because it just doesn't happen when your cables and connectors are in spec. And my Ethernet sockets / plugs get more abuse than the average HDMI port - hell, they get plugged in and unplugged every single day into a dozen different devices and still work flawlessly. Unless you're rolling your chair over them, they work. HDMI is no different or more special just because it's carrying audio/video data. If the cable meets the spec, it meets the spec and can do 1920×1200p60, 4096×2160p24, or whatever the relevant spec revision states. If it doesn't meet the spec, it can't be called an HDMI cable.
There's nothing in the world worth £50 for a tiny piece of leather and a cheapy lamp, especially not against a device that, second-hand, goes for roughly the same price.
My girlfriend keeps her Kindle in a pencil case that we bought from WHSmith. Cost all of £1.99 and fits perfectly, with decent protection, and looks better than most of the "official" covers that I see.
Someone please explain the difference between the sub-£1 leather-bound diary that I'm bought for Christmas each year and this tat-masquerading-as-quality? In manufacturing terms, there's NOTHING complicated about a Kindle cover, and within a year I expect to see them in pound shops, if not before.
Only an idiot would buy this, and I'm disappointed that The Reg didn't see fit to penalise it more in the review just on the basis of price alone. I can fill up my car for less than this cover costs. That's a scary thought.
Let's not forget the Olympic clock.
Or the ticketing controversies that are still going on.
Or some people don't like what happens to the stadium after the event.
And the logos/mascots were a bit of a disaster too (hinted at in the article itself).
And several other little niggles that just don't sit right with a well-planned, well-financed, well-managed, well-thought-through project that it should be for that amount.
I'll be steering clear of Stratford station and all adjoining stations for the entirety of the Olympics. What idiot honestly thought that was the best place to hold it?
And take two days to charge....
Re: Family photos
That part was more about the timing - takes a week to get your photos back, who cares? Takes a week to get to your sales data, that's more of a problem.
Though I have to say, my family rule is "If YOU don't have it in at least two places, it doesn't exist"... doesn't matter if that's on a laptop and on a USB stick, or on a backup server and on a CD somewhere - if they don't have it in at least two places, it doesn't exist, and even the cost of data recovery wouldn't be worth it for a family photo (I've only ever been quoted in the realm of £1000/Gb for professional recovery, and everything else was something I could do myself - i.e. just stitching bits that read okay back together).
Even if I take a copy of their photos for them, they are idiots if that's their only backup (THEY have to have it in at least two places, I will also have it in at least two places, anything less than two is stupid, anything more than two is sensible). When I backup files for work, they go to at least five different "places" (tape, off-site tape, other servers, external backup hosts, etc.), and it's not that hard to do even for hundreds of Gbs. But if a family member comes to me with an accidentally-zeroed USB stick? Unless they had backups in at least two places (in which case, why do they need anything recovered?), or they pay me something approaching professional data-recovery rates plus a generous commission, they can revel in their own stupidity.
YOU need to keep copies of YOUR data in at LEAST TWO places and not rely on anyone else to supply those. Everything else above and beyond that (giving your data to me, or random family member, or random cheapy-backup-place to keep) is optional but recommended and DOES NOT count towards the AT LEAST TWO that you need to keep.
It's amazing that 90% of people will ask you where they can download MP3s within minutes of getting a new computer, and nobody has EVER asked me what would be a decent way to backup (even if they HAVE lost files before) when they do so. Hell, I've never been consulted until after a data loss, when they are shocked that the cheapest-bidder 3.5" USB hard drive they cart to work every day in their bag died or was so full of bad sectors it wasn't storing anything useful, and they lose everything they thought they'd backed up.
Not the only problem
How long will it take to download 1Tb over your Internet connection should it all go wrong? (At 10Mbps, that's around nine days.) And how long did it take you to upload all that in the first place (I'm guessing a LOT longer)?
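The arithmetic behind that estimate, for the record (a decimal terabyte and a 1Mbps upstream are my own assumptions):

```python
# Transfer time for a full restore/upload over a domestic line.
def transfer_days(size_bytes, rate_bits_per_sec):
    return size_bytes * 8 / rate_bits_per_sec / 86400   # 86400 seconds per day

days_down = transfer_days(1e12, 10e6)   # 1TB down at 10Mbps
days_up = transfer_days(1e12, 1e6)      # 1TB up at a typical 1Mbps ADSL upstream
print(round(days_down, 1), round(days_up, 1))   # ~9.3 days down, ~92.6 days up
```

Three months of saturated upstream just to seed the backup is the part the cloud brochures tend not to mention.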
Not that that's a massively important metric in backup, but hell is that inefficient and slow, and it seriously eats into your bandwidth allowance (which in the UK isn't exactly generous). Hell, with some ISP's, you would get kicked off their monthly plan before you got 3% of the way to downloading that data again.
There is more to backup than just safety - convenience is also a factor for anyone who cares about their data, e.g. a business. If you told a business that it would take a week to get their data back, after they'd paid for very expensive daily backups, they'd cut their ties with you - unless it was absolutely their last resort to get that data back.
Family photos - not a big problem. Your tax/sales data - potentially a big problem. Household backup is nowhere close to commercial backup and even in large companies, I'd expect the local tech to have LOTS of on-site and off-site *PHYSICAL* copies. Hell, I make one backup to a drive that sits on top of the computer it backs up - when I *do* need to restore for anything other than complete catastrophic failure of that room, it'll be cheaper and quicker to do so from that than from any of the other professional backup solutions in place. It only takes hours to restore from a local disk, it can take WEEKS to restore from an off-site host. Plus, you have absolutely NO way to verify that the backups are uncorrupted - unless you *want* to do a two-week-long restore for every backup.
Cloud backup is a silly idea, except as one tool in the "If I have half-a-dozen different ways to do this, and use them properly, something will *definitely* restore should I need it" armoury. And personally, for the price you're paying, I could probably rent a server / hosting package cheaper and at least then I could check that things were being stored properly myself.
All well and good as an anecdote - but what OTHER equipment did you interfere with in the process? How do you know? What if 41 of you guys get together and try to use the devices all in range, all at the same time? What if your radios have powers capable of reaching across km rather than 100's of m? What makes you think something running in 2.4GHz cares about what it stomps over compared to something that could, potentially, be sitting right next to an emergency frequency and not have to interfere with it? What if some device comes online and stomps over BOTH channels simultaneously?
And let's be honest - a model flying area is going to be pretty empty in terms of radio frequencies - unlikely to have microwaves (i.e. something not DESIGNED to be a radio but emitting anyway), wifi, bluetooth etc. drowning the place in noise. Hell, I can't pick up my own wifi at the end of my garden - the local park must be relatively dead in comparison.
You're talking about two entirely different concepts. On one hand: picking a pair of relatively empty frequencies from a range deliberately reserved for hobbyist use, to communicate with a low-power device that's nearby (and sod everyone else trying to use those frequencies), where interference costs less than 1/50th of a second of renegotiation on a channel that can theoretically communicate on the order of nanoseconds. Versus: a wideband, hole-spotting, "unlicensed" carrier that could be pushing megawatts or milliwatts, over kilometres or metres, right up against working, reserved frequencies, which *can't* just stomp over anything at all (because even a silent channel, or one displaying static, is pretty indistinguishable from one that is idle and/or encrypted properly), and where *everyone* has the same device in a small area and tries to use it.
So, they migrated from a DEC Alpha to an "upgraded" system. That upgraded system couldn't then cope without further "performance upgrades".
1) Not much of an upgrade.
2) You wasted a lot of your money to get back to *exactly* where you were when you started (if not even further behind than that).
Great advertisement for an IT company, that. We can't handle moving a few million records of a million customers without ballsing it up, even with access to all the computer equipment we sell every day, and months of migration time. And to boot, the new system can't have been doing anything that the old system wasn't capable of anyway (given that the new system slowed to a crawl and died).
And whatever happened to proper testing - rather than ballsing up the back-end of a major company for months trying to fix things? Hell, record all transactions on the old (live) system, play them back on the new (testing) system and see if it copes as well as you hope. When it can take those same queries live and return the same answers every single time, within good time, then you can slowly migrate the client PC's one at a time and not suffer a second of downtime.
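That record-and-replay idea is easy enough to sketch - capture live queries, feed the same ones to the candidate system, and only migrate when the diff comes back empty. The function and data names below are made up for illustration:

```python
# Minimal record-and-replay harness: replay captured queries against both
# systems and collect any answers that differ. query_old / query_new stand
# in for the real back-ends - hypothetical names.

def replay_and_compare(recorded_queries, query_old, query_new):
    mismatches = []
    for q in recorded_queries:
        expected = query_old(q)
        actual = query_new(q)
        if actual != expected:
            mismatches.append((q, expected, actual))
    return mismatches

# Toy example: two lookup tables standing in for the two systems.
old_db = {"cust1": 100, "cust2": 250}
new_db = {"cust1": 100, "cust2": 999}   # deliberate migration bug
diffs = replay_and_compare(["cust1", "cust2"], old_db.get, new_db.get)
print(diffs)   # [('cust2', 250, 999)]
```

Run the real thing against a day's worth of captured traffic and any regression in correctness or response time shows up before a single customer is touched.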
Modern-day IT: Whack the version number up, buy more computers, put a bigger processor in, don't bother to test, end up with less functionality, break lots of stuff, spends months "fixing" it and then call it a success.
Bought a GoDaddy cert recently
Cost me £50 for 5 years - only basic validation obviously, but it's hardly bank-breaking stuff nowadays. Hell, I pay more than that to the company hosting my domain name, and my hosting is another bill of a lot more still.
If you have a need for SSL for securing transactions, the prices for basic certs are a drop in the ocean compared to everything else (e.g. PCIDSS standards, hosting, commission, etc.). If you have a need for SSL just for the encryption, then you can get stupidly cheap certs that fit the bill.
And when it comes to security, I'm not sure "Our website uses a free SSL certificate!" is particularly confidence-inspiring in either the website or the certificate issuer.
Never read Da Vinci Code in my life. Dan Brown is actually a pretty pathetic author by all accounts that I've heard, even from people who *have* read his books (a group which, in my experience, are even less well-informed than most bible-believers). He's famous because he made a book containing outrageous unproven (and often blatantly false) assertions that happened to be condemned by the church. Call it "Life of Brian" syndrome - banning something is the best way to give it free publicity.
So you admit that most of the gospels weren't included because they weren't interesting enough, and that someone sat down and selected them on such criteria? And consistency is incredibly dangerous ground to step on when comparing gospels. So anything not consistent was disposed of, and thus what remains must be true? Logic tells me that the correct answer is that after anything not consistent was disposed of, what remains must be... consistent. And lacking an awful lot of extraneous data which would otherwise degrade the signal-to-noise ratio and trip people's "nonsense meter" much more quickly.
Dating? Try 80AD for a lot of the rejected gospels - a timeframe smack bang in the middle of the "accepted" one's dating (accounting for reasonable error on both counts). St Jerome? Hired by the Pope 400 years after the described events in order to SPECIFICALLY re-translate the four chosen gospels only? No bias there. And we only know the sources that are still around because they are still around. By definition, destroyed ones would have been... destroyed.
You appear to have a blind obedience to the accuracy of a particular set of documents (ironic in itself). Until you can ask yourself why that is, there's really nothing more I can correct (or at least disprove) for you. A scientist starts with "I believe nothing" and then waits for proof of things. Religious people have a tendency of "I believe everything" and then ignore even when they are disproved. One way can give you a feeling of what's true, the other only ever gives opinion.
Wanna drive a Christian into speaking drivel?
Ask them to name the *other* gospels that they reject, and why. Ask them why, out of all the different gospels, only 4 are accepted in the New Testament, when some of the others are a) more accurate, historically, and b) more interesting, but contrary to a lot of the waffle in the "accepted" gospels, and c) contemporary to the accepted gospels in terms of dates. And those are just the ones that we know about, because quite a lot of them were destroyed for "purity" around the time they were written, or afterwards by various churches.
The Bible is a collection of VERY selectively edited highlights of stories and parables gathered over hundreds of years and constantly re-re-re-re-written by various people over the millennia, and then interpreted into whatever fashion is described by the current day while conveniently ignoring many similarly-originated documents that tell much more interesting stories.
Living your life by the Bible is a bit like living your life by a modern copy of Aesop's Fables (the originals of which would pre-date the whole biblical era by several centuries) that you bought in a bookshop, while conveniently ignoring the rest. Which of the 584 original fables is in your copy? How have they changed since the original Greek text? Which are considered politically incorrect nowadays or make no sense (there's one where an apple tree and a pomegranate have an argument before they are interrupted by a bush)? Which ones are actually originally from Aesop at all?
Except I don't know of any Aesop fable that tells people, in no uncertain terms, what they should not eat under any circumstances and yet has about 90% of its "believers" completely ignore that list. Or one where it absolutely categorically states they shouldn't worship graven idols because it's one of the most dangerous of sins, and yet its followers all carry little tiny Jesuses on crosses around with them and in their churches.
The chances that the Wii actually uses the DirectX API's are, I think, incredibly slim. Thus which number Microsoft decides to arbitrarily assign to a group of shader models, etc. is neither here nor there (and DX 11 is being ignored at the moment because it cuts out a large portion of older PC's, for instance - there's a point at which you just don't get any better graphics and just having higher resolutions or more anti-aliasing makes more difference).
If you're writing a game today, you're almost certainly targeting DX 10-level hardware too because otherwise you'll get no end of driver / performance bugs on even slightly sub-par machines (old, laptop, integrated graphics, etc.). So moving to a "DX10 equivalent" machine rather than "DX11 equivalent" is hardly taxing compared to porting an entire program to a whole other platform anyway. If your program only works on DX11, then you have much BIGGER problems when it comes to moving to the Wii, PS3, etc.
And the Wii itself was generations behind when it launched. Didn't seem to hurt its five-year domination of the games console market. And the point is that, at release, it will still be probably the most powerful console available. Whether that lasts for six months or six years makes no difference - that temporary leapfrog on competitors is worth its weight in gold.
More like: Don't file lawsuits on shaky legal bases where you can't even prove that the person you sued is the one who committed the offence you're referring to, and where you basically demand money with menaces (Give us £500 or we'll drag you through the courts) before people even have a chance to defend themselves.
I know someone who received one of those letters and I advised them to ignore it, because there is no way the case could ever be proven and it was *incredibly* unlikely that the person accused had downloaded what they were accused of (or anyone in their household, for that matter). And yet the letters were offering an "easy-way-out" if you admitted the offence and didn't try to defend it, but ONLY if you paid them lots of money (some hundreds of times more than the cost of the thing they were claiming was infringed) first. If you didn't pay, they were going to take you to court and you'd have to defend or hit a default judgement, and there was no guarantee that they'd be around long enough to pay your legal fees in that case (think of the thousands of accused who now won't see a penny of their legal expenses paid because this guy went bankrupt?)
And all on the IP address scraped from a bittorrent tracker, IIRC, which is about as technically and legally reliable as a plumber's estimate.
Not to mention
Not to mention Data Protection, import/export controls, access to data when Internet connections go down, expenditure to ensure they *don't* go down, reliance on a (money-sucking middle-man) external provider, etc. etc. etc. and then a complete loss of all investment in your current infrastructure.
Cloud is an incredibly silly rehash of existing concepts that most places just DON'T need, and in some cases actually don't WANT for good reasons. Why don't businesses move from their own controlled systems that they've bought permanently to something where they have to pay annually and have less control over? I just can't fathom it :-P
In a world where businesses are hitting serious fines and legal implications for having systems that are being hacked all the time, data is being leaked via third-parties (how do you know your cloud provider isn't also serving your competitors, or that it doesn't pass through embargoed countries?) and data is being flung across the Internet without thought, it seems incredibly silly to suggest that we put even more data in the hands of an outside entity that *we* have to pick up responsibility for if they mess up.
As an IT guy, it's my job to generate, protect and manage your data. There's only a few steps of that process that it makes any sense at all to hand to a random lowest-bidder third party without putting an enormous burden on them that they just wouldn't be willing to sign up to. Hopefully in a decade, the word Cloud will seem as outdated and hilarious as the word "E-commerce" or "Push technology" or "The Information Superhighway". Cloud is a marketing fad, and as such is rightly ignored by anyone who would be knowledgeable enough to actually decide on the pros and cons of such things. Hearing it turns me off products, services and service providers. It has its uses but they rarely ever push the "cloud" angle even when they do use it. Pretty much the same as SaaS but there are people being forced into that as well.
Why is no-one using it? Because no-one wants it or they can do better themselves. Same old IT story.
So instead of everyone having a single hard disk that they keep their data on and buy a new one only when they replace their computer, Apple are going to need to buy several disks for everyone, in a highly-redundant configuration, with off-site backups and caching SSD's and content duplication for CDN, etc. plus spares and replacements because they are running 24/7 in order to serve them the same files.
Yeah. That's going to REALLY hurt the hard disk industry if that takes off. What are they thinking?!
(Sarcasm has now left the building).
Putting your data only on the cloud is still a stupid idea no matter who's behind it.
Biggest problem will be price. As it is, Wiimotes are *still* £30-something even after five years of production, not to mention MotionPlus, Nunchuk, etc.
That was always the biggest barrier to > 2 player gaming on the Wii for me, if I'm honest - you only ever get one with the console and you're usually willing to push to a second if you get it "free" with a decent game, but after that there's little incentive to continue because of the price and even the availability in the early years - they could easily hit £50 for a full set for a single player. And that's not even considering the WiiFit which is basically a one-per-household item, if that, and is still prohibitively expensive even now. You should not be paying almost as much as the price of the console in order to get two players on a par in terms of controllers.
At one point people were hacking the Wiimote to be things like presentation remotes, etc. but when you consider the price of the hardware (including a standalone sensor bar in that case), it's actually just cheaper to buy a presentation remote or a cheap interactive-whiteboard-type sensor that sticks to the wall.
I don't doubt that Nintendo's bulk-buying could bring the price down enormously but they didn't really do much to accelerometer prices in terms of the Wiimote's final price (or even that of its knock-off rivals). And they'll pull the old trick of only giving you one controller because "you can just hot-seat". That said, Nintendo tend to know what they are doing and will make lots of money off everything through sheer popularity and business sense. But it's annoying that the "console" price never includes two complete controllers any more, and that they push 4-player or more games when hardly anyone can afford that kind of setup without carting equipment round to other people's houses.
Amazing then, that in couples that try for pregnancy deliberately without any form of aid whatsoever, you're expected to give it at least a year before you even think about seeing your doctor about it, and a couple or more before they think you might have a fertility problem, and a few more before they will actually do anything about it (if you still want kids that badly by then).
It's a statistical game, always, but sperm can slip through solid rubber (yep, you don't need to have any kind of tear), avoid the woman's monthly shedding of the uterine lining, work its way up the tube and STILL fertilise an egg just the same. The chances are low but even with all the precautions in the world, you'll be lucky to achieve 99.5% avoidance.
Having just watched Inside The Human Body, I can also tell you that probably only half-a-dozen or so sperm would get to the egg even *with* perfect conditions and no contraception. That's *why* there's 250,000,000 of the buggers in the first place.
And anyone who relies on JUST a male pill is as silly as someone who relies on JUST a condom, or JUST a female pill, or JUST a diaphragm, or JUST a coil, etc. Without putting all the burden on the woman's body (have you seen the changes that happen when a woman goes on/off the pill?), it would be a cinch to just get the man that you want a serious relationship with (serious enough to discuss contraception) to take a pill too to increase the chances at virtually zero cost.
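The "don't rely on JUST one method" point is simple probability: assuming the methods fail independently, the failure rates multiply. The per-year figures below are purely illustrative, not medical data:

```python
# Independent failure probabilities multiply - illustrative rates only.
def combined_failure(*per_year_failure_rates):
    p = 1.0
    for rate in per_year_failure_rates:
        p *= rate
    return p

one_method = combined_failure(0.005)            # 99.5% effective on its own
two_methods = combined_failure(0.005, 0.005)    # e.g. condom + male pill
print(one_method, two_methods)   # roughly 0.005 vs 0.000025, a 200x reduction
```

Which is why adding a second cheap method on the man's side is such good value, even if neither method is perfect on its own.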
P.S. I'm a man. I have a (planned well in advance) daughter. And would seriously consider taking these if they were free on the NHS.
Start... Run.... cmd...
You mean Start... Run.... cmd.... first
Troll... because the servers were fine. It was a bad CLIENT update that sent things doolally for those users with "Auto-update" turned on. E.g. My Skype was fine all yesterday.
Hold on - you're worried about latency and interactivity speed enough to "justify" internationally-hosted servers but then you use virtual machines on a cloud infrastructure? Eh?
An international ping should never be more than about 100-150ms for anyone who's actually intending to make use of the Internet and it's so tiny an interval that download speed etc. are much more of an issue than how long a command takes until first response - you're not Google and don't need sub-microsecond response times. If you do, then a cloud-hosted VPS is about the worst possible thing to use because there is no guarantee of what the latency will be at all (for a start, you have no idea where the server is actually located in the target country, or what bandwidth is available, or what portion of CPU it gets, or when demand is going to go through the roof on nearby servers etc.)
Surely you would have been better off just getting a handful of smaller dedicated servers in this country? It would have been cheaper, I think, they would have been better specced, you would have more control over them, there would be no implications of foreign / international legal systems, and you wouldn't have any more latency issues than someone abroad pinging a site that doesn't use international servers (e.g. the BBC which is hosted almost entirely in the UK, I believe). And if someone like the BBC doesn't need local servers, it's doubtful that you do.
Also, your bandwidth could easily be sitting on a bog-standard 100MBps line that they have running through most datacentres to supply racks, thus making your bandwidth / spike / response time / billing worries moot. You also won't be getting a bill next month of ten times the price because you hit an infinite loop. The "overall" traffic figures you wanted came out to about 3-4MBps constant - obviously there will be peak periods and spikes but that's not huge by most dedicated-server standards.
It seems to me to be a cloud advert (and although you mention problems with the very first step with some providers, you neatly gloss over them) rather than an article. If you're seriously considering running with 500Mhz and 512Mb, then you are vastly over-estimating the CPU power needed too - a bog-standard, bottom-of-the-line dual-core server could probably run 8 or 9 of those without too much trouble - and that's if you just moved those same virtual server images onto a single dedicated server. If you're paying £250 a month at the moment for a single colo, you should really be upgrading that hardware and/or instead buying 4-5 cheaper, smaller, dedicated servers from a variety of hosts (and then hardware failures don't need to factor into your costs at all).
With proper load-balancing I see no reason why you would need to own a single byte outside of the country you're in. It just seems odd, like every other "justification" for cloud servers that I can imagine.
Working somewhere that uses BT Business Broadband, I don't think we're at risk. The BT router went into "long-term storage" the second it arrived, for offering crap like free wifi to anyone who walks past, free pass to the BT engineers, etc. and yet no capability to simply forward all packets including DHCP.
We had replacement modems on order before the boxes even arrived. Like to see them sniff past the modem that connects only to a Linux gateway that does actually, proper, firewalling, NAT and filtering.
But this is just yet-another-reason not to trust BT equipment. What next? They team up with software companies to snoop your hard drive to see if you're infringing their licenses - all totally "legit" of course. Even speaking as someone whose job involves licensing compliance, that's just totally out of scope of the supply of a broadband line. My MAC addresses are personal, private information and uniquely identify particular items of kit that you have no business knowing. Try that on my networks and see how the lawsuit from my workplace reassures you. You forget that for every user that HAD the device, a thousand users who DIDN'T still had their networks snooped for it. That's not on, no matter how passive or well-intentioned the attempt was.
The unofficial rules of escaping zombies...
1) Don't move. That "safe place" you've been told about 50 times that's several miles (and a zillion zombies) away will be way overrun before you get there, to the surprise of your entire team (despite the fact that millions of others were told to flock there and most of them would have become zombies eventually).
2) If you do move, don't faff about. Although there are some great improvised weapons, nothing quite beats a few lorries moving at high speed all together as a team. Weapons are your backup for when the fuel runs out in every single vehicle.
3) If you can't get away safely, find a huge, big solid wall, find a decent corner with another huge, big solid wall and camp in it. There's always one somewhere. Do not use plasterboard walls.
4) Check the roof, and the floor. Nothing more embarrassing than being dropped on or tunnelled under when you're in an otherwise safe place.
5) Chokepoints. Don't stay in a room with eighteen doors and windows. One. Your escape route will be overrun too when you least expect it so don't bother with an emergency exit - one doorway, let them come at you over the heaps of furniture while you all aim everything you have at them.
6) If ANYONE in your group acts weird, shoot them instantly, and then shoot them again to make sure.
7) Don't worry about the minuscule risk of hitting an innocent civilian or helpful health carrier amongst the horde. They were going to die anyway.
8) Don't rely on the Army, Navy, Air Force, National Guard, Police or anybody else to help. They won't. All they will do is produce an endless supply of heavily-armoured zombies.
9) Be suspicious of every new recruit - put them in a quarantine group basically-forever and don't let them near the group you KNOW isn't infected.
10) Expect someone completely at random to get infected.
If your IT guys can't keep a handful of virtual / dedicated servers online, you don't have an IT department, more a cage full of monkeys.
And you *can* host entirely in-house, except for maybe a backup set of mail servers - that's kinda the point of multiple MX records and redundancy. And if you *can't* manage a couple of virtual / dedicated servers doing email for a domain, then obviously anything "external" (cloud or not) is the way to go. It was called webmail. People who used it for business purposes were kinda looked down on unless it was an in-house webmail.
The point is that if you want to keep your Data Protection Registration, you're almost certainly better off doing it yourself. Seriously, it takes about 10 minutes with Ubuntu, Postfix, Dovecot and a domain name with at least one MX record pointing to that server - though obviously a pro setup would take slightly longer: a single-domain SSL certificate, a couple of lines in the config files to make POP3 / SMTP use TLS, SASL etc. where applicable, a spam filter, a webmail interface and so on.
If you were having problems with this, then your IT department are unsuitable. If they threw you onto webmail, of course it's easier for them, but someone, somewhere is doing the same job they should be doing and making a profit from you too. And don't tell your Data Protection guy that you have absolutely no idea who has access to those stored inboxes.
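For the "couple of lines in config files" mentioned above, here's a rough sketch of the TLS / SASL bits. The certificate paths and mail.example.com hostname are placeholders, not a real setup:

```shell
# /etc/postfix/main.cf - enable STARTTLS on SMTP and hand auth to Dovecot.
# Certificate paths and mail.example.com are placeholders.
smtpd_tls_cert_file = /etc/ssl/certs/mail.example.com.pem
smtpd_tls_key_file = /etc/ssl/private/mail.example.com.key
smtpd_tls_security_level = may     # offer TLS to connecting clients
smtpd_sasl_type = dovecot          # use Dovecot's SASL implementation
smtpd_sasl_path = private/auth
smtpd_sasl_auth_enable = yes

# /etc/dovecot/conf.d/10-ssl.conf - require TLS before POP3/IMAP logins
ssl = required
ssl_cert = </etc/ssl/certs/mail.example.com.pem
ssl_key = </etc/ssl/private/mail.example.com.key
```

That really is most of it - the packages ship with sane defaults, which is why the "10 minutes" claim isn't as daft as it sounds.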
Sounds good, being locked into a single provider.
I'll look into it the same day that the Competition Commission looks into the whole "You can only pay by card using Visa" thing.
Wrong and Wrong
But that doesn't mean the police know that.
There is no problem with taking photos of children (so long as it's "innocent") in a public place or in a venue you own, or even technically at a school play (but the school are free to eject you - however, neither they nor the police can demand that you delete your photos or anything else). People (including officers) who think otherwise are wrong. You just might have to spend an afternoon in your local nick to prove it.
"Meanwhile, Skype's 170 million users will be hooked up to Microsoft's Lync, Outlook and Xbox Live products, as well as other communities, said the software vendor."
Thanks, I was looking for the reason that would stop me using it almost immediately once the rumours were confirmed. I *just* about tolerated the "Windows Live" stuff when my Hotmail account was sold to them, and I haven't used Windows Live for ANYTHING else since.
I'd say - less than 3 months before I'm made to change my Skype account to a Live (or whatever) account, and about the same until the client just bugs you forever about joining communities you have no interest in ever joining.
How does a *web*-focused strategy merge with P2P video-conferencing software?
And I think you need to look up how Azure works, particularly the compatibility and languages and underlying server structure (i.e. buy a few thousand MS Server licenses and we'll let you have your "own" Azure cloud - targeted at Dell, HP, etc.).
I wouldn't call Bing "actively reaching out" so much as "it's there and it'll probably work". By designing something to work for the vast majority of Windows users, the chances are it'll work for the other OS users too without hassle - I think that *ACTIVELY* reaching out is stretching the truth to its absolute limit.
And anyone in the computer industry for more than a decade or so recognises certain patterns in MS's behaviour - patterns that *can* be used to predict future behaviour with a high success rate. It's nothing to do with MS-hate so much as recognising the same things happening over and over and over. Hell, most of us (myself included) are dependent on MS-deployed OSes in order to make the majority of our living.
And past results have shown that just about *EVERYTHING* I've used from MS (especially the stuff that was bought up from others) eventually turns to shite. That's not a comment on how much I hate MS, but a simple observation. Hotmail? Turned to shite. And I was a paying customer for years. They bought professional backup software to stick into Windows, it turned into shite. WebTV. MechWarrior series. Visio. Rare. Lionhead Studios. Sysinternals. DesktopStandard. Multimap (now part of Bing but I haven't used it since around the time MS took over, and I didn't even know it WAS MS until I just read it). Basically everything that has been bought by MS that I used turned to shite within a year or two of acquisition (some almost immediately). And it wasn't that their functionalities were moved into MS and made the relevant MS products "better" - far from it. Most of the time they were quietly left to die, or merged in a crippled way. Hell, a lot of the MS operating system is nothing more than acquired software with its best functionalities destroyed rather than enhanced by being part of the default OS install (ntbackup can't even ATTEMPT a restore from anything other than the EXACT version of ntbackup you used to backup - sometimes down to the hotfix).
If the rumours are true, then Skype users are in for a rough ride, whether or not they hate MS and want to use Linux versions. It's just the way history has always gone with MS. I'm a Skype user - at the moment there's no reason to ditch (mainly because it's only a rumour at this point) but this will make me look for alternatives NOW (which I may or may not end up using) whereas if Company X had taken it over, I probably wouldn't bother until I saw problems.