Re: "Like working at IBM"...
Sad but largely true.
Depends why you play games. There's nothing at all wrong with Go or Chess if you play games for the intellectual challenge above all, but if you want a pleasant social gathering with a group of friends of mixed backgrounds and tastes, where the games aren't necessarily trivial but the company and the personal interactions are also a huge part of the pleasure, neither Go nor chess is going to fit the bill very well. Whereas there are a huge number of very good, mostly commercial games out there that would - but equally that most people have never even heard of (and, sadly, are unlikely to even come across, precisely because of the "board games are just for Christmas with the kids" idea that still seems so prevalent).
The huge temptation with Roborally is to get out lots of boards and then string out a massive course across them all, tucking the checkpoints into fiendishly-hard-to-reach places to boot. Whereas to my mind the BEST games tend to take place when all the checkpoints are easy to get to and the route weaves back and forth across the middle of a mere one or two, fairly-easy-to-navigate boards. That way any lead player is forever having to negotiate his or her way past the rest of the field, who also get the chance to influence the outcome. Oh - and ditch the rule about number of lives; having your robot destroyed is quite enough penalty, and no social game needs a player elimination rule unless it's vital to the game.
If you have an electric grill on your oven, turn it on set to max before you go any further.
Make an ordinary omelette mix by beating a couple of eggs and a splash of milk together with a fork.
Melt a generous knob of butter in an omelette or small frying pan on a medium-high heat, then add and fry a teaspoon or so of curry powder mix in the butter for 30 seconds or so before you add the egg mix (Tikka Masala works well; the pan wants to be hot, but not TOO hot at this point, or the spices will burn rather than fry, and the result will be pretty disgusting).
Add the egg mix, and use the back of the fork or a spatula to move and spread it around enough so that most of the mix gets exposed to the heat and sets, and the bottom of the pan is covered reasonably evenly.
Once the mix is reasonably solid, take it off the ring and nuke it immediately under the hot grill until the top is also set (as David Roberts correctly points out, you get a fluffier, better-cooked omelette that way - although the spices in the mix tend to reduce the rise).
Take from under the heat, cover generously with a strong, tasty Cheddar (sliced or grated), and shove back under the grill until the cheese is melted.
Turn out folded in the traditional "half-moon" shape, and serve immediately. One of the fastest cooked snacks known to man, and downright delicious.
Virgin being late doesn't matter; if they end up providing cheap competition, by contrast, that's incredibly important. The real way to get mankind off the planet - if you'll forgive the purple prose, the true future of space exploration, and possibly even our survival as a species - lies in having lots of competing, self-interested, commercial parties capable of getting into space and making money from being there. Governments won't get us into space to any degree that matters; we've had four decades of watching how THAT one pans out, and they have completely the wrong priorities. But when there's money to be made, and lots of competition looking for new ways to make it - sit back and watch the REAL Space Race begin.
@Robert E Harvey: I suspect that's what the writer was thinking, but if so, he needed to work harder at it. As it was, it was an awful ending - no real explanation, just "magic" and a touch of "happily ever after" that, once again, would have been more in place on CBBC. Oh, and on a planet of 7 billion people, the pivotal individual just "happens" to be a pupil at Clara's school? Well - there was a vague hint of another Missy connection; but if it was meant to be significant, it needed laying on with a much, much bigger trowel.
I don't make a point of going around discussing Doctor Who, but it's come up in conversation a couple of times, and I haven't yet spoken to anyone who claims to be liking the current series. And the plaint isn't Capaldi - it's the abysmal plots. I'd really love to know what the viewing figures look like - because I have a nasty suspicion they're nosediving.
Personally, I think it was a huge mistake to start into the whole Doctor/Clara/Danny triangle without giving Capaldi and the writers a few episodes to find his character. Yes, it probably makes perfect sense when considered in the abstract - Clara struggling to come to terms with the new Doctor - but as a viewer, I find it's not easy to get interested or involved in something like that when I'm still struggling to buy into a new Doctor myself! And it certainly doesn't seem to be helping that both Capaldi and the writers *also* still seem to be struggling to settle on a real character for his Doctor - to date he still seems more like a ragbag of quirks and traits than a rounded character we can empathise with. It doesn't help, either, that we've had some pretty ropey story lines and scripting along the way, and some downright embarrassingly bad ones (the schoolboy science errors in Kill the Moon were frankly appalling scripting). As for the putative "Missy" subplot... it feels utterly tacked on; it certainly hasn't hooked or intrigued me. And if anyone wants to disagree, fine - but as a simple fact this series has been weak enough to completely lose the interest of my wife, who has casually enjoyed watching DW with me since the early '70s.
Just my opinion; YMMV. But this has NOT been, so far, a shining example of DW at its best. Nor am I convinced it stands a great chance of becoming one.
Years in the industry left me jaundiced - I simply don't trust MS to have my best interests at heart. And I don't trust IE. I stopped using it waaayyy back when MS took their blatant management-driven decision to force its roots down into the OS, so that they could claim to the US courts that they couldn't possibly unbundle the two (Win 98? I forget; long ago) - and thereby opened a shed-load of unnecessary security holes that took years to close again. And even though that's ancient history now, and my attitude is in all probability now antediluvian and entirely illogical - I still avoid it like the plague. I only ever fire it up on those rare occasions when I hit a web site that I absolutely MUST use, that is so badly (or parochially) coded that I can't get another browser to work on it - and then I go have a good wash to get rid of the unclean feeling.
It's been said that it takes years to gain trust; it takes minutes to lose it. Never were truer words spoken.
So, indeed, do most people.
Getting to Mars is a publicity stunt on a par with the original moon landings - and likely to be about as long-lasting in terms of any related space program and exploitation of the achievement. It's hard to do, but not as hard as it would have been a couple of decades ago, and I doubt that many serious scientists feel it isn't do-able. But once it's done, the politicians (and, sadly, the public) will lose all interest.
Getting to the moon and actually doing things there, by contrast, has little of the sexy, public appeal of Mars (and even less of the vote-grabbing power) - but it's far more do-able, and the opportunities it presents (both commercially and internationally) are enormous. I strongly suspect that, as far as western nations are concerned, at least, though, it's going to take the involvement of the private sector before it happens; political interests are simply too divided and short-term.
Expect the US to get far more interested if Russia, China et al. start to look as though they might actually achieve something; as others have pointed out, even as a simple platform the moon is potentially unparalleled military "high ground", and as a base for making further, sustained progress it has even more potential.
"If you can see the pixel grid you're sitting too close / your TV is too big for your room!"
No, if you can see the 'pixel grid' you're sitting too close FOR THE RESOLUTION.
If you're viewing something low-res on a high-res screen, it's going to take multiple actual pixels to display "one" pixel from the low-res image. So a distance that is perfectly fine for viewing 4K material will potentially make older stuff look pixelated from the same distance. That's not the fault of the screen, or its size, or its distance, and it's certainly not an argument against 4K per se; it's down to a mismatch between content and display.
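The content/display mismatch above can be made concrete with a little geometry. This is a rough sketch only: it assumes the common rule of thumb that normal visual acuity is about one arcminute, and the screen dimensions and distances in the example are invented for illustration.

```python
import math

def pixel_visible(screen_width_m, horizontal_pixels, distance_m,
                  acuity_arcmin=1.0):
    """Rough check: can a viewer with ~1 arcminute acuity resolve
    individual pixels (or upscaled pixel blocks) at this distance?"""
    pixel_pitch = screen_width_m / horizontal_pixels       # metres per pixel
    angle_arcmin = math.degrees(math.atan2(pixel_pitch, distance_m)) * 60
    return angle_arcmin > acuity_arcmin

# A ~1.2 m wide 4K panel viewed from 2 m: native pixels can't be resolved...
print(pixel_visible(1.2, 3840, 2.0))   # False
# ...but the same panel showing SD material (720 pixels across) at the
# same distance: the upscaled pixel blocks are well within acuity.
print(pixel_visible(1.2, 720, 2.0))    # True
```

Same screen, same distance; only the effective resolution of the content changes - which is exactly why "sitting too close" is a statement about the resolution, not the screen.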
I'm not a big apologist for "new" media tech. Tube technology has gone, plasma screens have more or less died the death, and LCD is cheap to the point of being commodity; the TV hardware industry is clearly at a point right now where it desperately needs to find a "next big thing" to get everyone replacing their sets and revive its business. That, it seems, wasn't HD; it was never going to be 3D; it most certainly wasn't "curved screens" (!). 4K? Perhaps. Maybe. I don't know. I'd happily put a larger screen in the corner of my room, IF the quality of picture was good enough, and *if* it was comfortable to watch. Right now, my screen is at the "right" distance, for its size and resolution, for comfortable viewing - I did quite a bit of work to get those factors right before I bought. So I'm not going to simply boot it out and buy something else that's supposed to be "better" unless it genuinely is - and in MY house, to boot.
@AC: 'I've been in the car and have been stopped for "dangerous acceleration".'
That's perfectly reasonable (if not necessarily well-explained to you). ANYTHING you do whilst driving on the road or a public place can be an offence, if you go about it in a way that could cause problems for other people (including pedestrians). Check the legislation and you'll see that our main road offences were quite deliberately framed with broad-brush terms such as "reasonable consideration", "due care and attention" and "dangerous" - with the interpretation of what constitutes such left, in the final analysis, to the courts and case law. It's an approach that removes any wiggle-room for getting away with blatant infringements on technicalities, whilst leaving room for common sense to prevail. It's also one that has, on the whole and in my own personal opinion at least, worked pretty well. (And it's also one that means that the government's high-publicity "attack" last year of "new" offences for middle-lane hogging, etc., was a wholly-unnecessary PR exercise - everything was already perfectly well-covered by the existing legislation - but that's a different discussion).
The fact that they got ANY result from the control shows that they had a problem of SOME sort. And unless they repeated both experiment and control, each set up again from scratch each time, and the results from each were similar each time (but different between control and experiment), it could simply be that the instrumentation problem was weaker on one occasion than the other.
I'm with others on this. I'd love it to be verified - it would not only be potentially revolutionary, but throw a spanner in the works of current orthodoxy - rarely a bad thing. But frankly this feels a lot like cold fusion did when it was first trumpeted. Sorry - exceptional claims (apparent violation of conservation laws) require exceptional evidence, and I'd say the jury is still decidedly out.
"Now where's the equivalent UK vote on whether we'd like Scotland to stay or go? It's not all about Scotland..."
Well - for my money at least, in that question is the nub of the problem facing the "Yes" campaign (or would be if they were actually having a sensible discussion, rather than just promising that everything will be wonderful). Because, on the one hand, they want the people of Scotland to have the sole vote on whether or not to leave the UK; but on the other hand, they keep trying to pretend that (of course!) Scotland will get to share pretty much everything it currently enjoys as part of it.
Sorry, guys - make your mind up. Those two positions are not compatible. In particular, when you're a member of something, and you decide to cancel your membership, it's almost unheard of for you to continue to enjoy the rights and benefits that membership gave you. If you want to have an equitable share in what the UK has - effectively, to divide the current UK into two parts - then those of us in the rest of the UK have a right to expect a reasonable say in the decision-making process. If you want to make the decision unilaterally (which is what Salmond et al. have opted for), and you end up choosing to walk out, then fine - but don't expect to simply pick and choose what you get to take with you. At that point, it makes very little sense for the rest of the UK to start from any other negotiating position than that it's basically ALL ours until otherwise agreed.
"The SBS would also inherit a proportionate share of the BBC’s commercial ventures, including BBC Worldwide Ltd, and their associated ongoing profits."
My problem is not with the principle of that; it's simply that it's yet another regrettable, jam-today, sweep-the-practicalities-under-the-carpet example of the "Yes" camp's ludicrous pretence that, in the event of a vote for independence, all difficulties will simply melt away and Scotland will get everything it wants.
Sorry, but I was born a cynic; if Scotland votes "Yes", then, sure, there will be cases where an adequate level of independent operation already exists, and in those cases it's quite possible that the division will be fairly amicable. Where that's not the case, though, vested interests (overwhelmingly, south of the border) are likely to fight tooth and nail to either keep their assets intact or to make Scotland pay through the nose for what it wants - anything less makes no business sense. Is that the case with the BBC? I have no idea. Would such an amicable separation extend to existing income streams? I *highly* doubt it.
If the Scots vote for independence, that's their choice - I wouldn't rob them of their right to make that decision for a second. But either way, the idea that some Magic Referendum Fairy will simply wave her magic wand and gift a Scotland that has just chosen to divorce itself from the UK with everything that Salmond et al. want to claim as its "fair share" is, frankly, about as ridiculous as... ...well, as the concept of a Magic Referendum Fairy. She's far, far more likely to wave her wand and walk away with half the contents of Scotland's wallet.
(Oops - meant 170 million, not 17 million. I never spot the worst typos until my edit time's up...)
...is just how hard it is to get your brain around Deep Time and probability. Because in truth the conclusion is about as surprising as being told that water is wet.
What is easy to overlook is just how immense Deep Time is. Set against that, and given that the possibility that the dinosaurs as a whole could be wiped out at any particular point was always small, but also always greater than zero, the probability that they would be wiped out by *something*, *sometime* was always 100%. The only question was how long it would take, and whether whatever it was would wipe out life on the planet entirely. And more to the point, it was always likely to take some form of rare, extreme catastrophe to give the final push - because life is *good* at surviving, and simply works around anything less.
In the end, it took getting on for 170 *million* years for the something to happen; and life here, as a whole, survived. The truly big surprise, if there is one, ought to be not that the dinosaurs eventually died out through "bad luck", but that it took so long for that to happen.
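The "always small, but always greater than zero" point above is just compound probability, and a few lines make it vivid. The per-million-year figure here is purely illustrative (an assumption, not a real estimate); the shape of the result is the point.

```python
# Sketch: even a tiny per-interval chance of a global catastrophe
# becomes a near-certainty over Deep Time.
p_per_myr = 0.01   # ASSUMED chance of an extinction-scale event per million years

for span_myr in (10, 100, 170, 500):
    # Probability of at least one such event somewhere in the span:
    p_at_least_one = 1 - (1 - p_per_myr) ** span_myr
    print(f"{span_myr:>4} Myr: {p_at_least_one:.1%}")
```

Even at one chance in a hundred per million years, by 170 million years the cumulative probability is up above 80%, and it only climbs from there - "when", not "if".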
Back in the early 1970s, a wholly-innocent and unsuspecting player in a play-by-mail game of Diplomacy, the classic game of negotiation and backstabbing, received unwanted, extended attention from the Manchester police on the back of a telegram from another player, reading "Attack on Liverpool agreed".
I remember being at a games con shortly afterwards, when the UK Diplomacy crowd were having great fun chatting and laughing about it. But the lesson has to be that it has always paid to be aware that you're communicating over open channels.
I remain totally unconvinced by all and any discussion of the "demand" for this (and frequently any other) new technology. It seems to me that companies within the media technologies industry have been desperately flailing around for the last few years looking for any new thing they can manage to hype enough to revive their flagging revenue streams. Blu-ray; HD; 3D; 4K; curved screens (yeah, right.... getting a little desperate there, aren't we, guys?) - lots of things they can do, nothing that the market as a whole seems to particularly want or need. And certainly nothing in the way of new tech that the market is going to avidly embrace while the vast percentage of available content doesn't support it.
Had the same thought - even chose the same title. Ah, well, serve me right for not reading down the full thread first.
I can't help wondering whether this has rather more to do with putting Lohan's name out in front of people than with the professed complaint.
I remain massively unconvinced. All the wearable tech to date is clever and fascinating, but it's also firmly stuck at the "Hey, isn't our great new idea cool?" stage as far as saleable products that the market is really likely to want are concerned. Smart watches in particular look like a solution looking for a problem that isn't there to be solved. And the sobering truth is that the last 40 years or so are a positive elephant's graveyard of novel, technically sound products that died stillborn because no-one actually needed them.
*Will* people find that they need smart watches? Well - my money, at least, is agin it. I'd point to the *huge* range of apps already available for smart phones. I therefore find it telling that, despite a *vast* range of ideas to trawl through, so far the companies developing them don't seem able to find a single thing to show their new babies doing that seems remotely calculated to get the average person excited. OK, sure, I understand that no-one expects them to have the killer app at their fingertips - but are you really telling me that everyone thought about everything they already use their smart phone for, and the very best thing they could think of to put onto a small screen on everyone's wrists was social media alerts? Ouch.
It's hard to avoid the conclusion that, however technically clever smart watches might be, as a market it's almost certainly going nowhere.
Or Michael Flanders and Donald Swann (in the days of the Lord Chamberlain's "blue pencil", no less):
The fleet set sail for Rockall,
To free the isle of Rockall
From fear of foreign thrall.
We sped across the planet
To find this lump of granite.
One rather startled gannet;
In fact we found... Rockall.
There are two linked but distinct issues here.
One is whether Uber are breaching the relevant law, the other is whether the law ought to change.
And, for my money, on the first at least, things are quite clear - these apps are most definitely in breach of the intent (at least, and maybe even the letter) of the law as it stands. They're a classic instance of something coming along that the law, possibly, doesn't explicitly forbid, only because the technology in question simply wasn't envisaged when the law was framed. If I were a London black cabbie, I'd be up in arms as well, looking to get the spirit of the law enforced.
The second issue is a perfectly valid one to discuss, and not one I have any great opinion on. But right now, the law is what it is, and its intent is perfectly clear. If that's to change, that should be through proper legal process - not from simple force majeure on the part of interlopers exploiting legal technicalities.
It's a weird experience, Jamie, but I've been there too.
I loved bacon before I went veggie (although for some reason I always found the stench of a ham boiling absolutely appalling, even if I was quite happy to eat the result). And I most certainly didn't stop eating meat because I didn't enjoy it; a good steak was a real treat.
But maybe a few months in, quite suddenly I found myself hyper-sensitive to the smell of animal fats in general - a heavy, greasy, thoroughly unpleasant smell. I could walk into the kitchen two hours after my wife (meat-eater) had been cooking, and still notice it enough to make me feel quite queasy. That eased up a lot after a while - but even now, fourteen years in, it's not entirely gone.
Bacon? Well - I always liked my bacon crisp - if it didn't shatter when you tried to cut it, it wasn't cooked enough. Streaky, therefore. Can't say I ate it often enough to genuinely miss it - but from time to time I've grabbed a packet of one of the veggie simulacrums out there and fried it up. A couple are even half decent imitations. But I always end up with vicious indigestion, which turns out to be a powerful disincentive; haven't done that for quite a while now.
No. The fact that the NSA have now produced an email, when they claimed to have been unable to find ANY such emails, is circumstantial evidence that their claim MAY not have been truthful. But, equally, taken in isolation that evidence is extremely weak, and capable of other, quite innocent construction. Plus there is a huge difference between circumstantial evidence and proof.
(Clearly it IS circumstantial evidence, though; it's specious to argue otherwise. If, hypothetically, the NSA were to produce another such email tomorrow, and further ones at regular intervals, always claiming to have no more, there would come a point at which even the most hardened but honest opponent of Snowden's actions would start to seriously question the truthfulness of the NSA's claim. So each individual, hypothetical such example would have a certain weight of evidence, and enough of those weights would combine to make someone take pause. The question then simply becomes one of how many such examples it would take to make a particular individual question the NSA's version of events. Depending on your viewpoint, it could take many. Or then again, it could take only one. But either way, a single email IS evidence. Just not necessarily strong evidence. As the old joke has it, "We've already established what kind of a girl you are. Now we're just haggling over the price.")
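That "weights combining" intuition is exactly how Bayesian updating works, and a toy sketch shows it. Every number below is invented purely for illustration - the prior and the likelihoods are assumptions, not claims about the actual case.

```python
# Hedged sketch: how individually weak pieces of evidence compound.
def update(prior, p_if_untruthful, p_if_truthful):
    """One Bayes update: posterior odds = prior odds x likelihood ratio."""
    odds = prior / (1 - prior)
    odds *= p_if_untruthful / p_if_truthful
    return odds / (1 + odds)

belief = 0.10                # ASSUMED prior that the claim was untruthful
for n in range(1, 6):        # five hypothetical "surprise" emails
    # Each email assumed twice as likely to surface if the claim was untruthful:
    belief = update(belief, 0.8, 0.4)
    print(f"after email {n}: belief = {belief:.2f}")
```

Each single update shifts belief only modestly - weak evidence - but five of them together take a 10% suspicion to something approaching 80%. How many it takes to move any *particular* person depends, as the post says, entirely on where their prior starts.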
Yes, of course it does. It's not a real person in a real desert - when did you last meet a real person who took successive steps in entirely random directions, even when drunk out of their minds?
It's simply a way of describing an entirely random movement in a way that the reader can relate to.
If I remember rightly, there are at least two interesting (and related) facts about this version of the Drunkard's Walk that don't seem to have been mentioned yet:
1) Given long enough, the drunkard will return to his exact starting point
2) Indeed, given long enough, the drunkard will visit EVERY point in the desert and beyond (in other words, his walk is space-filling).
That despite the fact that his average distance from his starting point grows with time.
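Both facts - guaranteed return to the origin (Pólya's recurrence theorem for one- and two-dimensional lattice walks), alongside an average distance that still grows over time - can be seen in a quick simulation. A sketch, assuming the standard 2-D lattice version of the walk; walker counts and step counts are arbitrary.

```python
import random

# A 2-D lattice drunkard's walk. Pólya's theorem says such a walk returns
# to its starting point (and eventually visits every lattice point) with
# probability 1 - yet the root-mean-square distance after n steps still
# grows like sqrt(n).
def rms_distance(n_steps, n_walkers=2000, seed=42):
    rng = random.Random(seed)
    total_sq = 0
    for _ in range(n_walkers):
        x = y = 0
        for _ in range(n_steps):
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x += dx
            y += dy
        total_sq += x * x + y * y
    return (total_sq / n_walkers) ** 0.5

for n in (100, 400, 1600):
    print(f"{n:>5} steps: RMS distance ~ {rms_distance(n):.1f}"
          f" (sqrt(n) = {n ** 0.5:.1f})")
```

The simulated RMS distance tracks sqrt(n) closely - which is the resolution of the apparent paradox: the drunkard drifts ever further on average while still being certain, eventually, to stagger back through his starting point.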
The craters on the moon ARE impact craters. And they're mostly symmetrical because only near-grazing impacts form craters that AREN'T. That's what happens at cosmic speeds. The sheer energy of the impact blasts material near-equally in all directions long before such trivia as impact direction have any chance to influence things.
The effect was first confirmed back in the early 20th century, using rifle bullets fired into mud, by Daniel Barringer (the man who also showed that Meteor Crater in Arizona was precisely that).
For a more detailed explanation, see, e.g., http://www.scientificamerican.com/article/why-are-impact-craters-al/ . Or www.barringercrater.com has classroom instructions on how to demonstrate the effect.
Well, I up-voted the original on principle - but I can't help wondering whether I was actually right to do so. On what sound legal grounds would the Beeb actually claim exclusive use of the name?
Copyright? No. In the UK at least, copyright does not cover names. (see, e.g., http://www.copyrightservice.co.uk/copyright/p18_copyright_names)
Trademarks? Possible, but probably not. Whilst the Beeb has registered TARDIS as a trademark, trademarks are usually granted for a particular industry sector. "Polo", for example, is a trademark in the UK for both a type of mint and a make of car. Or there's the well-known clash of naming over "Apple", involving the global IT brand and the Beatles' record publisher, of course (resolved by Apple Computers agreeing not to publish records, and Apple Records agreeing not to produce computers).
There are UK laws on "passing off", of course. The BBC is still protected in its use of TARDIS by those. But it would take a severe stretch of the imagination to suggest that a data routing strategy/algorithm might be likely to infringe.
None of the above would in principle stop the Beeb taking someone to court, of course - and that someone might well then give way rather than risk their own assets on the lottery of a court case. It wouldn't alter the legal position one iota - but, sadly, asymmetry at law often produces "unfair" results.
(IANAL, so the above may easily be flawed - but I've worked with them, and copyright, patents and so forth have been relevant to my job at times.)
"Not fun on 3270's without cut'n'paste"? It was simply a different paradigm, that's all. Who needed cut'n'paste when you had ISPF edit? Incredibly powerful - if you grew up with it, you could make it jump through hoops. Far from needing cut and paste, when a GUI interface was all I had, I missed the sheer raw power of Edit's command line. In my later years, I had a rich choice of coding environments; I used GUIs for some languages, such as Java, but 3270 for others (most definitely including COBOL). Horses for courses, and all that.
I started in other languages, and came to COBOL late-ish in my career (supporting CICS development - and I consciously started writing my code in COBOL because, yes, that's what a lot of the customers I talked to were still using). Yes, it had lots of annoying details - and until COBOL 2 came along, some serious linguistic omissions too - but of all the annoyances, the biggest one of the lot for me was that comparatively trivial, and wretched, terminating period. Absolutely mandatory in some places; a syntax error after the self-same statement in others. Slap a conditional around a piece of perfectly good, working code, and suddenly it wouldn't even compile. If I'd had a swear jar in the office, it would have been full about twice a day.
"10 donuts (sic) at 1 pound, 3 shillings and 6 pence" is... £1/3/6d.
Now - if you'd asked about doughnuts at £1/3/6d EACH... I'd say you're definitely shopping in the wrong places. But you'd pay £11/15/0d (10 x £1 = £10. 10 x 3s = 30s = £1/10/0. 10 x 6d = 60d = 5s. Utterly trivial when you're used to actually doing mental arithmetic - even after 42 years.)
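For anyone who never had to do that mental arithmetic, the carries above (12 pence to the shilling, 20 shillings to the pound) can be sketched in a few lines. The function name is mine, not any standard library's.

```python
# Pre-decimal UK currency: 12 pence (d) = 1 shilling (s), 20s = £1,
# so £1 = 240d.
def lsd_multiply(pounds, shillings, pence, factor):
    """Multiply a £/s/d amount by a whole number and re-normalise."""
    total_pence = factor * (pounds * 240 + shillings * 12 + pence)
    pounds, rem = divmod(total_pence, 240)
    shillings, pence = divmod(rem, 12)
    return pounds, shillings, pence

# Ten doughnuts at £1/3/6 each:
print(lsd_multiply(1, 3, 6, 10))   # (11, 15, 0), i.e. £11/15/0
```

Which agrees with the mental version: £10, plus 30s carried up to £1/10/0, plus 60d carried up to 5s.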
And indeed, at the other end of the spectrum - once you get used to not wearing a watch again, it's a relief not to need to.
I own a cheap Casio F-91W digital - cost me well under a tenner, totally reliable, a battery that will last a decade or longer, and does everything I need in a timepiece. I wore a watch every day for 30 years whilst at work, because so much of my life was driven by the clock. Now that's not the case, and the watch has slowly migrated from a permanent home on my wrist to equally permanent lodgings in my trouser pocket - from where it gets taken maybe once every couple of days (when I actually need to know the time, and taking it out's the simplest option). But in truth, I have clocks around the house, in the car, and on my phone; I'm rarely far from sight of one, and I simply don't need an extra one stuck annoyingly and uncomfortably on my wrist. So if I'm going, say, to stick serious computing power there, it better give me something I really value, and that I can't get half as well another way. So far, I simply don't see any application that remotely steps up to the task.
The recent history of technology is awash with novel and inventive ideas that went nowhere because people didn't want them, didn't need them or both. And if there's one lesson to be taken from that, it's that, just because you can make something "clever", that doesn't mean to say that people will buy it. Bet the farm at your own risk.
The theory predicted this; the experiment bears out the prediction. Good science.
>'TM' in the UK *is* a registered Trademark. The 'R' is non-existent here.
As others have said - that's totally back to front.
In the UK the "R" in a circle denotes a registered trademark, and it's an offence to use it incorrectly. It's not illegal to use "TM" to indicate that something is being used as a trademark, but neither does it have any legal significance. (Source: www.ipo.gov.uk)
This is only needed at all because parodies have come under increasing attack in the UK courts in recent years; for a long while I think most people have fairly clearly understood that a parody of something is different from a breach of copyright, and not open to challenge. Now people are trying to erode that, so it needs formally codifying.
Parody is a tradition as old as human culture, and a vital tool for deflating the pompous and the powerful. It most certainly needs to be preserved and protected, and if that needs legislation, so be it. And whilst it would need defining, in real terms I suspect most of us have a fairly clear idea what constitutes parody, and what is simple rip-off. (For example, using someone's recognised song for commercial purposes as in this case, with different words chosen to fit your marketing needs, clearly isn't parody - it's a rip-off and an attempt to avoid copyright fees. Nor, to quote a previous poster's example, would simply changing the cover of a book and selling it otherwise unchanged make the whole thing a parody - although I suspect if it was well done, people would accept the modified *cover* as a parody of the original).
I was a software tester on one of IBM's flagship mainframe products for 20 years. And I can tell you, anyone who knows a way of catching every glitch in complex software before it goes live (yes, even the ones that bring the customer's business-critical systems crashing to a very visible, embarrassing and expensive halt), that is simple and robust enough to be used in practice in widely diverse environments and by development teams using different approaches and tools - AND simple enough to be understood by non-technical management, so that they won't simply throw out what works two years later in favour of the latest "flavour of the month", in order to be perceived to be "managing" - REALLY needs to get themselves some good marketing and legal support, because they're in line to make a LOT of money.
So basically your criticism of 8.1 is that a third party app you supply doesn't play well with it - and that rather than go to 8.1 (and, presumably, the support chain of dependent fixes that will inevitably form behind it) they should stick with your app? You might want to rethink your technical priorities, if only for your customers' sake.
>You can't judge the quality of the broadcast from restored footage recovered from the sludge the BBC archived.
Most definitely true. The original broadcasts (both 405 and 625, and of this and just about everything else that has survived from the early years of TV) were far clearer than anyone would be led to believe from the quality of the copies around today.
"It is of course purely anecdotal". Never were six truer words spoken.
This is purely anecdotal as well. Having spent over three decades working for a well-known IT multinational (the last two in an absolute hive of geeks, namely a development lab of a couple of thousand people), I'd have to say that the rest of that is so far from accurate, in my own experience, that it's almost visible coming back round from the other side. IT folks are a mix of types just like any other group. In my time I saw the inevitable cross-section of types - some of whom would have been short of breath after climbing a flight of stairs - but if anything rather more of the folk I worked with were fit, sporty types than is my experience in the general population. Personally I suspect that the stereotype of the weedy, uncoordinated nerd is often simply a comfortable myth for those who don't find it easy to accept that there are people out there a whole lot brighter than they are - "At least I'm better than them at something...".
True IBM mainframe programmers don't use columns 73-80.
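For anyone too young to have met a punched card: fixed-format mainframe source (HLASM, JCL, old-style COBOL) treats columns 73-80 as a sequence-number/identification area that the compiler ignores, so tooling that compares or hashes source typically strips it first. A minimal sketch in Python - the function name and the sample card are purely illustrative, not from any real product:

```python
# Columns 73-80 of fixed-format mainframe source were historically
# reserved for card sequence numbers; compilers ignore them, so tools
# that diff or checksum source strip them before comparing.

def strip_sequence_area(line: str) -> str:
    """Drop everything beyond column 72 (the sequence-number area)."""
    return line[:72].rstrip()

# An illustrative 80-column "card": statement text padded out to
# column 72, with a sequence number in columns 73-80.
card = "         MVC   OUTREC,INREC".ljust(72) + "00010000"
print(repr(strip_sequence_area(card)))
```

Which is, of course, exactly why true mainframe programmers leave those columns well alone.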
Not to comment on Rifkind's interview per se (which I didn't hear) - but, to be fair to politicians, the Beeb's current batch of journalists seem to positively revel in aggressively asking mind-bogglingly overly-simplistic, often highly slanted questions to which no-one in their right mind would give a simple yes or no. Presumably they think it makes their interviews look "tough", and they go to bed dreaming of the golden moment when someone will actually make the mistake of a straight answer. It's piss-poor journalism of the very worst kind - the best interviewers of the past (Robin Day comes to mind) were fully capable of skewering their victims with incisive, razor-sharp questions in the most polite manner, and then hanging politely but doggedly in there until they either got an answer or it became crystal clear to everyone listening that the interviewee was dodging the issue - but aggression and hoping for a gaffe seems to be all that today's poor dears are capable of.
..at least, not technically. How could they? The internet in its current form basically didn't exist when most of our current batch of security and privacy laws were drawn up, and those laws simply don't cover it. But - and it's a BIG but - GCHQ undoubtedly DID do things that, had they involved older modes of communication that WERE covered by those laws (obvious parallels would be mass tampering with and copying of letters, or tapping and recording every phone line in the country without a warrant) most certainly WOULD have been illegal. And with that in mind, their behaviour was morally dubious, to say the least - obeying the letter of the law whilst clearly breaching its spirit. No, it's not surprising that any intelligence agency would do (at least) everything the law lets it get away with, but we've been here before, had the discussions, and come to conclusions that don't support what they were (and, presumably, still are) doing. And we therefore have every right to be very suspicious of any attempt to avoid, prevent or back-pedal on a significant legal crackdown on the scope of what is permitted without proper and independent oversight.
Hmm. That wasn't how I read the article - the impression I got was that YOU would be fed the RBT you'd chosen, NOT the person you called. Both possibilities are clearly technically possible in principle (and from comments here, the second is already out there) - but, on a re-read, the article isn't clear as to which it's talking about...
Sadly, it's my experience that, in the world of retail sales, "Almost sold out!!!" (spurious exclamation marks mandatory) is trader-speak for, "We have a pile of these we need to shift, we can barely even give them away, and we think you're gullible enough to buy if you think they're popular".
When similar results turn up from many more places, that will be a result worthy of the tag line. Until then, it's more interesting for the fact that it bucks the trend - although that will doubtless not stop the more die-hard climate change sceptics pouncing on it as yet more evidence that they're right, and it's all some big conspiracy that everyone but them is in on.
The BBC carried a very telling interview the other night, with the editor-in-chief of the journal Nature, Dr Philip Campbell. He was talking to Professor Brian Cox about the power and robustness of peer review, for all its flaws, and at this point in the interview about how the scientific consensus on climate change has grown and solidified:
"I would so love to show that climate change isn't happening. I'd love to think that global warming isn't happening in a way that I do actually believe threatens my grand-children's future. But it's so unfortunate, if you like, that we don't seem to be getting papers that show that it's wrong."
Still available on the BBC iPlayer, if anyone missed it and is interested: "Science Britannica, episode 2: Method and Madness".
" It's really only our way of life at risk."
If by that you mean our descendants staying alive, I agree. Otherwise, no. Collectively we're disturbingly like yeast in a fermenting vessel, gobbling up the resources around us as fast as we can and destroying the very environment that keeps us alive. In a worst case scenario (given that we show absolutely no signs of being able to cooperate as a species to even begin to actually stop doing the damage) we'll push one critical system or another past a tipping point, the planet will see a mass extinction bigger than the one we're already causing - including us - and there will be nothing we can do but watch.
"Of those online holdouts, over a third said that they felt the internet was "not relevant" to them, and 32 per cent said that using it was either too hard or that they were concerned about the threat of viruses and identity theft. This latter figure has nearly doubled since Pew's 2010 survey. and shows that the rising tide of online crime is a significant deterrent effect to use."
NO. Back to maths class, guys. What has "nearly doubled" is a percentage of a percentage. Without the previous numbers, it's meaningless.
Depending on which figure you take off that graph as "2010", the number of holdouts back then could have been as high as 26%; now it's only 15%. And assume for the sake of argument that the percentage concerned about viruses then was 17% (so that 32% is "nearly double" the old figure). Then we're talking about 32% of 15% (4.8%) of Americans now, versus 17% of 26% (4.4%) of Americans back in 2010. No great change, and probably well within the margins of statistical error. But that doesn't make as good a story.
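For anyone who wants to check the arithmetic, here's the sum worked through. The 2010 figures are the illustrative ones assumed above (26% offline, 17% of them worried), not Pew's actual raw data - what matters is that the quantity to compare is the share of ALL Americans, i.e. holdouts multiplied by the fraction of holdouts citing security:

```python
# Share of ALL Americans deterred by security worries =
#   (fraction who are offline holdouts) x (fraction of holdouts citing security)
holdouts_2010, worried_2010 = 0.26, 0.17   # assumed-for-argument 2010 figures
holdouts_2013, worried_2013 = 0.15, 0.32   # 15% offline now, 32% of them worried

share_2010 = holdouts_2010 * worried_2010  # ~4.4% of all Americans
share_2013 = holdouts_2013 * worried_2013  # 4.8% of all Americans

print(f"2010: {share_2010:.1%}   now: {share_2013:.1%}")
```

The "nearly doubled" 32% is a percentage of a much smaller base, so the overall share barely moves.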
"Even if it will work unset, you've got either a malignantly flashing display, or the wrong time forever telling you that you weren't clever enough to set a clock."
Microwaves are great for heating (some) things, but rubbish for anything close to proper cooking. So on a manically busy microwave week, we'll use ours for all of 15 minutes (cooking frozen veg, warming tortillas and similar). On a slow week it's lucky if it sees 15 seconds of use to defrost a little butter. So (like lots of other gear in our kitchen that doesn't get much use), it only gets turned on at the socket when it's needed. No blinking; no wrong time. And no guilt.
I have to admit, though, that I still have an indecent number of candles and holders readily accessible. And will do for the foreseeable future. Torches, phones and so forth are great for finding your way around the dark areas of the house, but you want light in whatever room(s) you settle in - and half a dozen well placed candles can make for a very pleasant environment.
Spot on. It's what you get when higher management doesn't want to hear about problems and potential failures, only how well everything is going. Actually having problems is always seen as failure on the part of the people further down the tree, and punished. So middle management en masse basically gets scared to tell anyone above them that something is seriously wrong - and what little status filters back up is a censored and glossily polished imitation of the truth. At the time things need to be fixed, no-one wants to say it's needed; by the time it can't be hidden, it's too late. And when things eventually get so bad that it can't be ignored any longer, the guys at the top - who are wholly responsible for the toxic culture they rule over - blame anyone but themselves, hunt out and punish the "guilty", and start the whole poisonous mess all over again.
It's worth pointing out, to anyone too young to remember, that Microsoft were playing their dirty old "embrace and extinguish" game even back then. MS was heavily involved in writing the first version of OS/2, and there's plenty of evidence to support the suggestion that they took active steps to make sure it failed. After which they launched Win NT, using quite a few good ideas that came out of that period of collaboration. By the time IBM was able to come back with a new version unsullied by MS involvement, the battle for hearts and minds was already over. I have no love of today's IBM - but MS became the Evil Empire in my book right then, and have done nothing in the mean time to change my opinion. I won't give a penny to them that I don't absolutely have to.