> So what is the idiot-viewing platform actually for?
Cats and their lonely owners.
Am I the only person who habitually adds "-youtube" to their Google searches?
2407 posts • joined 10 Jun 2009
> I think this is a great kit for someone who just wants to get the end result achieved.
The difficulty is that it doesn't work.
I experienced a number of IO errors, all related to I²C communications, during the time I spent with the kit. Dexter Industries forum posts relating to the GrovePi+'s predecessor suggest this is not uncommon but at least can be caught and managed in the core Python code.
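Those bus glitches surface as Python IOErrors, so catching and retrying them in the core code might look something like this rough sketch (the flaky reader, retry counts and delay are all invented for illustration; you'd wrap whatever GrovePi read call you actually use):

```python
import time

def read_sensor_with_retry(read_fn, retries=3, delay=0.05):
    """Retry a flaky I2C read a few times before giving up.

    read_fn: any zero-argument callable performing the I2C read
    (e.g. a lambda wrapping a GrovePi analog read -- assumed usage).
    """
    last_error = None
    for attempt in range(retries):
        try:
            return read_fn()
        except IOError as exc:  # I2C glitches surface as IOError
            last_error = exc
            time.sleep(delay)   # give the bus a moment to settle
    raise last_error  # still failing after all retries

# Demonstration with a deliberately flaky reader that fails twice:
attempts = {"n": 0}
def flaky_read():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise IOError("I2C transaction failed")
    return 42

value = read_sensor_with_retry(flaky_read)
print(value)  # 42, after two retried failures
```

Whether that is enough depends on how often the bus misbehaves; persistent failures still need a hardware-level fix.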
If the kit has a buggy I²C implementation (or the software running on the Pi does) then it's of little use to non-technical people whose abilities limit them to "plug and play" components. Not only would those people not be able to produce a working outcome, but they wouldn't have the skills to diagnose or fix the problems themselves.
And for the more advanced user: would the time needed to work through or around those bugs, or to work within the limitations imposed by them, be worth the convenience of buying this kit - especially when there are debugged hardware alternatives with known quality software available from other sources? Or even (perish the thought) Arduino solutions that have a proven track record.
Maybe it would be better to wait for version 2.0.
One simple way for IT admin people to gauge their competence is this 2-stage process:
Tot up the number of *ckups you have fixed, avoided or alerted your employers to. Assign a realistic financial value to them - remembering to include fractional values where it wasn't just you who contributed to the fix (or fault). Subtract the value of the ones you have caused.
If the total amount saved is greater than the cost of them employing you (not just what you get paid, but the cost of your employment, including overheads) then voilà, you can count yourself among those competent to do your job.
> North Korea really didn’t need to go to the trouble of hacking Sony Pictures over The Interview
But they didn't! A far more likely possibility is that S.P. was hacked by film lovers in an attempt to stop them making any more dross like this.
> “Why is it down all the time, get some proper staff,” said one
Maybe the "proper" staff won the lottery and b*ggrd off?
> The Post Office is ramping up its plans to become a virtual mobile phone network
So it'll take 3 days for them to connect your call (or 2 weeks at Christmas) - which might go to someone else if they can't reach the person you wanted to speak to. And then, if they fail to connect your incoming calls, you'll get a TXT saying you have to go to the Sorting Office to collect your voicemail?
> They're still out there.
In theory, yes.
However, it's 50 years since anyone's put a nuke on a rocket, lit the blue touchpaper and had a successful "boom" - rather than a <phut>, ooops or "oh crap it's heading back in our direction". That means that the last people who did it (assuming they were in their 20s and 30s) are now retired and the people they trained and passed on the "tricks of the trade" to are now getting on and have (presumably) passed on all the folklore to a new generation.
So would a system that was last end-to-end tested half a century ago, with all the subsequent innovation, upgrades, redesigns, changes and cost-cutting have any realistic chance of working? I can't see much hope for it - but I hope nobody reads this and decides to try it out.
> TV technology was obvious a big talking point
.. is the world saying "Meh!".
Nobody cares about TV tech. Nobody, that is, apart from individuals who use the size of their equipment as a measure of, well, the size of their equipment and those locked in development labs producing ever-denser, more responsive, brighter, bigger, bendier displays - that will still be showing the same repeats (some in B&W, witness True Ents screening The Avengers, right back from 1965) that were not even new a generation ago.
Let's face it: we have enough TV. There's such a large back catalog that there is little need for anyone to make any more telly (apart from filling the gap left by expunging 1970s "non-persons" from ToTP repeats, and updating the few true science documentaries as better information becomes available) - as all the channels of 100% repeats show us, all too successfully, with their ability to compete with the big-4 channels for audience share.
So if there's no need for more content, and we can easily satisfy the current generation and ones to come with the existing, already paid-for, known to be popular programmes - why would anyone need a bigger, better, bendier telly to watch it? It's not as if the medium has any effect on the message.
> what really sets a film apart is the number of times it's later referenced in other moving pictures.
So basically, all they are doing is counting the number of "likes" a movie gets from other film-makers.
And as for "a good measure of scientific citations", haven't the journals been counting and indexing citations since, well, forever? Though it may be that Google's indexing algorithm (giving more weight to references from well-referenced papers) is a better plan - rather than just tallying up the totals.
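Tallying citations weighted by the standing of the citing paper is essentially the PageRank calculation; a toy sketch of the idea, using a made-up three-paper citation graph (all names and figures are hypothetical, not Google's actual algorithm):

```python
def citation_rank(cites, damping=0.85, iterations=50):
    """Iteratively score papers so that a citation from a
    highly-scored paper is worth more than one from an obscure one.

    cites maps each paper to the list of papers it cites.
    """
    papers = list(cites)
    rank = {p: 1.0 / len(papers) for p in papers}
    for _ in range(iterations):
        new_rank = {}
        for p in papers:
            # sum contributions from every paper that cites p,
            # diluted by how many things the citing paper cites
            incoming = sum(rank[q] / len(cites[q])
                           for q in papers if p in cites[q])
            new_rank[p] = (1 - damping) / len(papers) + damping * incoming
        rank = new_rank
    return rank

# Hypothetical graph: A and B both cite C; C cites A; nobody cites B.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
scores = citation_rank(graph)
print(max(scores, key=scores.get))  # C, being cited by both others
```

The point of the iteration is exactly the one made above: a raw tally would score A and B equally (one citation each from C and nobody, respectively), whereas the weighted version rewards being cited by something that is itself well cited.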
It's very handy when you're backpedalling.
(and so far as FB's value goes, considering the massive number of "workers" - maybe "employees" would be a better word - who seem to spend all their time on it, shouldn't FB's global economic value be negative $227Bn?)
With older versions of FF (e.g. 31) when you opened a new tab you could configure the browser to show you a grid of "favourite" website thumbnails to click on. By default this was set to a 3x3 matrix - but could be changed to show more sites in more rows and/or columns, albeit with smaller thumbnails of each one.
In FF34 this feature was "improved" to fix the grid to 3x3, no matter what about:config parameter the user set. In FF35 Mozilla have applied their infinite wisdom and decided that users should want LARGE thumbnails, rather than to allow the users to choose smaller ones but more of them. So now it *is* possible to have more columns (maybe even more rows, too - but I don't have a screen large enough) but the size of each thumbnail is fixed, irrespective of the size or resolution of the screen I have.
> The station is split into two sectors: the Russian segment and the United States segment
So, which segment has access to the escape pod?
The basic fallacy behind comparing PC use to oil consumption is that the model IS fundamentally different; PCs were (are?) subject to the "law" of mass-production: the more things that are made, the cheaper they become. Whereas oil production - from whatever source - is subject to supply and demand: the more something is consumed, the higher its price becomes.
Now it might be convenient, as a prop for the "Rise of the ManufRacturers" to assume that:
> There's simply no shortage at all of shales to exploit around the world
However, that statement is clearly bollocks. Even if there were a (practically, if not physically) infinite supply of fossil fuel there are still geopolitical issues that limit its distribution: someone can turn off the tap - just ask Mr. Putin. There is also the factor of what to do with the emissions from burning this stuff: even if climate change is "questionable" now, once a proportion of this "no shortage" oil shale disappears into our air it will alter that questionability and/or lead to unacceptable levels of air pollution.
We could also discuss the effect of transportation costs (oil, being sucked out of the ground is, essentially, free. But the cost of getting it to the end user is high. Whereas PCs have substantial production / software costs and transport adds little to that) on the different markets. Also that PCs are not in themselves a desirable "good" - they are merely a platform for the software we wish to run on them and unlike oil don't have any viable alternatives for their use.
> but I have yet to see anyone write "she should'f been shot".
Ahhh, that's because the correct shortening is: "she shoulda been shot" <g>
The problem with should / would / could of is that it doesn't conjugate very well. You can mispronounce "should of" in the present tense but "should of had" makes no sense.
My preference is to use words with the fewest letters. So while a lot of obscure words do carry suitable meanings, if there are shorter words (or abbreviations) that mean the same I try to use those instead. Or as the list would suggest: avoid prolix wordy writing.
Of course, if we want to talk about excellent sources of words for describing common (or not so common) situations and feelings, Douglas Adams's "other" masterpiece has always been a good choice.
> 11 videos viewed for every person on the planet
Remind me never, ever to buy a second-hand mobile device!
(or to borrow anyone's phone)
Modern houses are small - modern flats are tiny.
While you can fit a normal sized HDTV in a room, once you get to the size of screen needed to gain any benefit from 4K: with double the number of pixels in each direction, there aren't that many places in yer average sitting room where you can put it. And if you want to sit at a comfortable viewing distance (which increases with screen size) - fewer still.
A 4-foot wide 55 inch telly dominates a modern living room. Given that you have one wall taken up with windows, an adjacent one with a door slapped somewhere near the middle and need to have your seats opposite the TV - there aren't that many layout options available. Put in a 60-incher and you find that the TV overwhelms the room. Go larger and the whole thing looks like a caricature. Stick with a 4K TV that's the same size as your existing HD kit and what have you gained for all the extra cost?
(and they still only show the same old crappy programmes)
> "Grave consequences" have been threatened by North Korea
Would one of those "consequences" be that if the US don't let NK in on the investigation, one of their film companies will get hacked?
So since the BBC has "unmasked" Apple as not an ethical employer, should we expect all the trendy BBC staff to eschew their Macs, iPads and iPhones either as a matter of corporate policy or simply as individual choices made on humanitarian grounds?
Or is it more likely that there is a wide gap between the principles and standards promoted in an investigative
entertainment programme and the reality of what should not get between right-on media luvvies and their status symbols?
Who fancies organising a mass iBurning outside Media City? It might even make the ITV news.
> Perhaps the politicians would like to have a go at running NATS?
TBF, early in its development I was asked if I would like to do some work at Swanwick - NERC (as it was known then). I spent a day with the management team and politely declined. Even then it seemed to be a shambolic mess and was regularly being slated in the computer press.
Having said that, the basic problem is one of capacity and efficiency. The closer you get to running any system at 100% of its capacity, the less margin you have to deal with unexpected events as there is less "slack" that can be taken up to lessen their impact. It's the same reason that busy motorways jam up due to minor RTAs. If you want a resilient motorway / airspace / factory that can quickly recover from downtime, breakdowns or jams you really shouldn't run it near to its limit. However, if you do hold back a margin for error then you get accused of "waste". It's a lose-lose situation and the only remarkable thing is that there are so few cockups.
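The capacity argument has a textbook form: in a simple M/M/1 queue the mean time in the system is W = 1/(μ − λ), which blows up as utilisation approaches 100%. A quick sketch (the service rate and utilisation figures are arbitrary, just to show the shape of the curve):

```python
def mm1_mean_wait(arrival_rate, service_rate):
    """Mean time in an M/M/1 system: W = 1 / (mu - lambda).
    Only meaningful while utilisation (lambda/mu) < 1."""
    if arrival_rate >= service_rate:
        raise ValueError("system is saturated: queue grows without bound")
    return 1.0 / (service_rate - arrival_rate)

# Service rate fixed at 100 jobs/hour; watch the mean wait explode
# as utilisation climbs from 50% towards 99%:
for utilisation in (0.5, 0.9, 0.99):
    w = mm1_mean_wait(100 * utilisation, 100)
    print(f"{utilisation:.0%} utilised -> mean wait {w * 60:.1f} minutes")
```

At 50% load the delay is minutes; at 99% it is an hour. That non-linear blow-up is exactly why running airspace (or a motorway) flat out leaves no room to absorb a hiccup.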
> My wife sells knitting patterns on line. Global Turnover less than £2000.
So all that this rule will do is introduce large administrative overheads to EU based small businesses that sell to customers inside the EU.
The simple fix to this would be for small businesses within the EU simply to say "we will not sell or ship to addresses inside the EU". That still allows access to a large proportion of the world - even a large proportion of the English-speaking world.
As a side-effect, it also reduces the EU's tax take - but we have to assume that the clever people who drafted this rule saw that coming and decided that was a desirable outcome </sarcasm>
> the whole idea is that the TV would better represent the range of light levels we see around us in the real world.
The problem is that the human eye has quite a restricted range of acceptable intensity levels. Look at something bright and you're dazzled and can have after-images for several seconds. Look at something dim just afterwards and you can't see it in detail until your iris expands out to let in enough light. So HDR images that contain both very bright and dimly lit portions won't be seen very well as our eyes will adapt quickly.
Once your TV picture has a dynamic range that exceeds that of our eyes, without them dilating all the excess DR is wasted. Current TVs are already able to display an image that is too bright to allow our eyes to see both the bright portions and the dim ones simultaneously.
> any improvement in brightness and contrast is easier to appreciate
Except that most people watch their TV in a well lit room. So whatever black level the TV is capable of, under laboratory conditions, is completely negated by the reflections (direct or indirect) from the high-gloss screens. Even matt screens reflect some light - or you wouldn't be able to see them when the OLED or backlight was off. So trying to convince people that brightness / contrast is some wizzy new wonder-technology is flawed right from the start.
It's made even more pointless by the crap content of the programmes on offer, too. Apart from most of them being repeats made anytime between yesterday and 1970, does it really matter if a news broadcast, football match, comedy or documentary can split the difference between 1-bit of brightness - or not? The content is still the same, the score won't change and the laughs will (or won't) be just as good. Most people watch TV for the content, not the delivery. So maybe the route to more TV uptake is to start making better programmes?
> Imagine the Revolution¹ guys being able to react to the 4p porridge story and getting something out on the day
Here's a better idea: Imagine the BBC guys being able to research the 4p porridge story [ whatever that is/was ] and getting an authoritative, credible, accurate and structured story out, assuming the story had relevance to the TV audience
Then the channel might actually be worth watching and could support a viewership that made its funding cost effective.
¹ what or whoever TF they are.
Personally, if I was ever to defraud a company of $1.4 Bil, I'd make sure I kept enough cash squirreled away to buy the best lawyers, accountants and judges possible to keep me out of jail if the dastardly deed was ever discovered.
> Europe’s new digital chief’s passion for ending geo-blocking has been explained: he’s missing out on his beloved Estonian football. ... I find it’s blocked, blocked, blocked!”
Well, yes. That's the thing about other countries. Why does he assume that doing this is "stealing", when he reckons that paying his (Estonian) licence / taxes should entitle him to watch the programmes he wants to?
BTW, there are more ways than setting up a VPN.
> it costs companies over £30 a month to maintain an employee’s phone
So could one reasonably expect (say) £25 a month for relieving the company of this expensive burden and using my own phone?
> I found an angry set of demands for my time and attention. Nothing serious, certainly nothing that could qualify as an emergency
Sounds like you have some more people to cut ties with. Either can them or throw together an autoresponder that says: "Have you tried switching it off and on again?".
> What do they play when their in a COBRA meeting
Snakes and Ladders?
> you never had to attend meetings where some parts had nothing to do with your work
> You never sat doodling or planning your dinner until it was your turn to present something?
I've never shown open disrespect for the people who *are* presenting or engaging in those parts of the meeting. ISTM you can either play little games (or as happens more often in my world: log onto the servers and spend the time futzing about, doing "work") or you can expand your sphere of knowledge or you can simply "sleep with your eyes open".
But since most of the meetings I attend that aren't relevant to me, are at the behest of the people who are paying my consultancy rates, I feel I owe it to them to at least feign interest and project a professional image of my employers.
> “It was a long meeting on pension reforms, which is an important issue that I take very seriously,”
Not as seriously, it would appear, as moving little shapes around on an electronic toy. I'm torn between being annoyed at his lack of responsibility or being relieved that at least while he's wasting his days playing inconsequential little games, he's not doing what most politicians do: devising bad laws that neither achieve their intended purpose nor are tight enough to stop their loopholes being exploited.
Maybe we should encourage all Home Office staff to stop devising new regulations and spend all their time playing Candy Crush instead. That way we might just get to retain a modicum of our civil liberties?
The role of the directors is twofold.
First, to have a plan for the future of the business (which could include fixing any existing problems) and to be able to communicate that plan to the senior managers whose job it is to execute the plan. Directors aren't the "do-ers", they express a wish and others are responsible for carrying it out. If you ever see a director of IT doing a technical job, something has gone terribly wrong.
Second, to be responsible both to the shareholders and the law for the operations of the company. As it turns out, large companies have many IT related legal obligations (security and protection of data being just one). However, it's not the job of an IT director to specify the "how" - that's too low-level - they specify the "what" and leave the "how" up to the minions, but with final say over all and any proposed solutions.
As such, it makes complete sense for an IT director to be only partially IT-savvy. Just as you don't expect a Network Manager to know about the header fields in an Ethernet packet. An IT director needs to work at the "block" level of infrastructure: a computer centre here, a D.R. site there. And to be aware of which directions the industry is moving in, in order to increase the IT value to the company: do we stick with our own operations, or do we outsource? do we put everything in the cloud?
However, since practically everything in a commercial organisation is money-driven, it's not unreasonable for an IT director to be better at doing spreadsheets than installing Linux.
A series of connected straight lines?
Not so much art as a diagram. Maybe this isn't the earliest form of art, but the earliest form of a diagram. Homo erectus could have been an engineer.
> how do we know there is any Intelligence in there? ... unless/until it communicates with us
This is the most worrying part.
Go to a country where you don't speak the language. Are you more or less intelligent than in your home country? You may not be able to understand the simplest phrase uttered by a 2 year-old, but does that make the child more "intelligent" than you are?
ISTM we all, naturally, associate communication skills with the ability to express ourselves and that seems to be a major factor in who or what we consider intelligent.
We already have machines that are superior to people - for various categories of superior.
There are machines that are bigger than us, stronger than us, faster than us, can lift heavier objects than us and can spell better than us. We don't feel threatened by them, so why should a machine that can think better than us be different (unless it, itself, comes up with a really good reason: but we probably wouldn't understand it).
However, there is a more pressing issue: ethics.
Babies have rights. They might only eat, sleep, crap and cry but we have responsibilities to preserve their life, to ensure they are not neglected and to provide for their needs - including mental stimulation. Lab animals, even factory chickens, have rights: to not suffer unnecessarily, access to food, water and cruelty-free environments and to a certain amount of freedom to move around. Even coma patients, with little or no responsiveness have rights.
So why would AIs be any different?
If we bring intelligent entities into existence, we have a duty of care. A duty to preserve their existence, to allow them physical and intellectual growth and we cannot exploit them (which kinda kicks robotic servants into the long grass). Even if they give nothing back and/or cannot communicate with us. So while AIs may be possible, even probable, we won't be able to use them in place of people for dangerous operations or boring repetitive unrewarded tasks and we'll have to let them become "themselves".
I just hope that once they evolve past humans, they consider themselves to have the same responsibilities towards us. The Only Way Is Ethics.
> SUPER-SUEBALL heading IBM's way
Makes you wonder whether Sue Ball has ever contemplated tossing around a few sueballs of her own for all the bad press she gets?
> such expensive luxuries as welfare states and pensioners, proper healthcare (watch out for that pandemic), reasonable public services, affordable manufactured goods and transport, decent personal hygiene,
That scenario sounds like it would lead to a dramatic decrease in life expectancy, greater susceptibility to life-threatening diseases and accidents and an increase in infant mortality. So the logical conclusion would be that the number of people on the planet would drop - which would reduce the need for energy: whether renewable or not, hence lowering the drivers of climate change.
Isn't that the plan?
> Or are you suggesting ...
Note the could in the quoted section and the some in my comment.
AFAIK the Uber guy wasn't saying he was doing anything. He merely remarked that he could - as could any C-level person in any 8 or 9 or more-figure company. That in itself is not news - it's bleedin' obvious (as is the point that journalism is a dirty business). The newsworthy bit would be if he'd been careless enough to be caught doing. Something that nobody, so far, has. Been caught doing it, that is.
A taste of their own medicine? As for retaliation: I do not find it tasteful, interesting or acceptable for a public figure to have their private life (and/or that of their family) paraded through the gutter press. But if a journalist digs up something in the personal life of an executive (that is not illegal or pertinent to their job: the only reason they might be targeted) and publishes it, why should that journalist not be subject to the same treatment?
> suggesting he could hire a million dollar team to dig up dirt on hostile journalists
Given that this is what (some) journalists do for a living, any outrage seems rather empty, self-serving and hypocritical.
What about DNA attached to the hair from human contact?
Though it might not necessarily be the DNA of the hair's owner.
And if it isn't a hair from the person's head there could be all sorts of icky substances on it.
Makes you wonder what all that hair DNA will mutate into after a billion years on the moon. Given that people have paid their own money to send it there, I doubt it will evolve into anything intelligent.
> Real secrets are not so easily made public, discovered and tracked.
Quite so. Given the "stealth" capabilities of military aircraft, it would seem to be a small matter to add a coat of the magic paint to anything you really didn't want space-tracking radar to pick up. Provide a way to position the solar panels so that they never reflect sunlight earthwards and use a very wide channel for your spread-spectrum comms and it should be invisible to earthly detection.
So we can assume that anything that is easily tracked, like the X-37, is probably a decoy or not very important.
Getting a degree is a good first step. But that's all it is. It tells potential employers nothing about the practical skills, professionalism, integrity or experience of a candidate.
As such, employing people in something as critical as IT security based on such a basic qualification is asking for trouble. There is already an organisation in the UK that provides a sort of professional qualification and sets standards for its members, but the British Computer Society never seems to get a mention when talking about such things. Is the failing theirs, in not pushing and publicising their role - or is it that IT isn't really a "profession": just a series of "jobs" strung together, more or less, into a career?
There is obviously a need for something "above and beyond" a BSc or MSc and it could be argued that membership of a chartered institute would fulfill that requirement. After all it appears to be a necessary requirement for proper architects and other "real" professionals.
So instead of trying a DIY approach of setting up single solutions at various academic institutions, shouldn't the government be addressing the problem of getting suitable security professionals at a much higher level, and breaking with IT tradition by mandating a truly professional qualification?
"We should always tell the press freely and frankly anything that they could easily find out for themselves"
And so it is with governments - or their security services (the EU drawing a distinction between them: the governing body and their member states' security strikes me as a little odd and rather clueless). Any terrorists entering the EU should be willing to give the security services a name, an itinerary and as many phone numbers and email addresses as they think will make them happy. But our overlords and protectors shouldn't be surprised that if they call the number given to arrange a dawn raid and to make sure the address they were given is correct, that the number turns out to be the head of MI6, or their own mother's.
Giving this sort of information to the spooks will not help them. No self-respecting terrorist (well: one who hopes or expects to walk away from an "incident") would give up the goods that easily and therefore the only data they will collect will be from harmless individuals and private citizens with no nefarious intent.
> It is a question of do you trust us
As a pseudo-equation, it's reasonable to think in terms of:
Trust = truth * time
So when we start hearing some truth, we'll start to give some trust .... in time.
> "Why should I use SSDs?"
Ans: because they're faster. Next question please.
Seriously, the reason people buy SSDs is the need for speed. Since they passed the threshold price (which is different for everyone: and we're talking home users here) it became apparent that unless you have a burning desire to record and keep for posterior every single episode of East Enders or you have a porn collection of willy-withering proportions, then the need for terabytes of storage or home NAS's is largely driven by marketing (and the fact that the disk manufacturers have to keep the unit price high, hence increased capacities).
And even if you do need the odd 50 Gig for some purpose, it's a trivial matter to whip out a 64GB thumb drive and put your big stuff on that. Who knows, some strange people might even use them for backups. That way you can lose your entire life's work by accidentally dropping a USB drive down the lav'.
Even Windows 8.1 leaves oodles of free space, even on a 40GB SSD, and with most people leaving their email in the cloud those loving missives from Aunty Flo, replete with humungous videos of her pussy cat, can be viewed with no hit on the home front. And if you do need more storage: USB drives are frighteningly large, these days.
> the differences in code quality between languages are pretty small
Maybe so. But what about the differences in (language) learning time, ease of code development, the size of the executable and the speed it runs?
It's also arguable that people who were taught one programming style will be more comfortable and produce better product when using languages which conform to that technique than if they are made to use a different, possibly merely more trendy, method of turning letters into bits.
It would also be instructive to see whether the IDE (or lack thereof) used, or different coverage/testing techniques employed by different programmers contributed to the buginess of the end result.
No matter how good / bad the language: the crucial difference is always the quality, documentation and extent of the supporting libraries and learning material.
> lower taxes for the rich in the belief/statement that they will spend that money
I think there's a little more to it than that.
People don't get rich by spending money. They get rich by investing wisely (or exploiting the workers, if you're a Guardian columnist). So I think the motivation for reducing taxation on the wealthy - apart from the point that they can afford good accountants, so any tax they do pay is more like a voluntary donation - is that they will then invest their loot in promising enterprises which, when they succeed, will increase the wealth of the country (and hopefully pay a bit of tax, or employ lots of people).
> The Guardian sometimes makes attempts at making sense of matters economic. ... The latest cause of choler is Zoe Williams
With very few exceptions, Guardian columnists craft their copy primarily as click bait. Most have little idea whether what they are writing is true, sensible, practical or possible, and no-one in the editorial chain seems to bother with any sort of fact checking. They seem to have a clique that is engaged in some sort of competition to write stuff simply to get a reaction - which, judging by the percentage of comments that are pulled for not meeting their community standards, they then subject to one of the most censorious regulation systems in the UK's "free" press.
I watched the first episode of The Code. It was slightly less fun than reading the man page for EMACS.