Not before time, as it's clear that most people doing mobile-phone or even laptop video calls for TV news, and so on, are relying on crappy wifi, given the choppiness and poor quality of most of what we're seeing. But at least it's encouraging to see that more people now seem to be getting the message that portrait video in a video-conferencing or for-TV context is for knobheads.
78 posts • joined 12 Jan 2010
After 1.5 million days of computer time, SETI@home heads home to probe potential signs of alien civilizations
I've never really understood why SETI, and others of its ilk, pitch their search frequencies in the gigahertz range, when, if we're anything to go by, the sorts of frequencies likely to leak into space in a significant way are those of radio and TV (from long-wave through UHF, etc). There's a theory that as things like analogue TV and nationwide transmitters get turned off, and TV/radio moves more onto fibre, the internet and technologies like DAB for distribution or transmission, where broadcast "cells" are lower-powered but more numerous, we as a civilisation have already passed through the short window where we're blasting high-energy, discoverable signals all over space. Why the assumption that any sufficiently-advanced civilisation will suddenly switch to microwave if it wants to announce its existence? Maybe it's better for generating high-energy bursts, but the odds of looking at the right patch of sky at the right time seem much reduced.
The penultimate paragraph in that article nails it really. Lego is way down the scale when it comes to plastic-based environmental offences (the odd container that falls off a ship is hardly its fault), as the stuff simply doesn't get thrown away. It's handed down, donated to charity shops or re-sold in yard sales, but not binned.
As someone tall enough to have sufficient "bar presence", but who is married to someone who isn't (and as a former barman), I think in principle this is quite a neat idea. However, alarm bells ring at the requirement for internet access, as this would seem superfluous for basic on-site face matching. So what does it need internet access for? Has the state demanded snooping access to the phizogs of drinkers now, to add to its database of possible ne'er-do-wells?
WarGames 8" floppies
Yes, Matthew Broderick's character David Lightman was using 8" floppies in his IMSAI 8080 drives. According to the director's commentary on the DVD I have, the whole setup - even though it was fairly ancient when the film was made - was chosen because the interactions with it, especially the use of an acoustic coupler instead of a plug-in modem, made it far more obvious what was going on than those with a newer computer would. This was also why they came up with the idea for the online interaction with WOPR to be read out by John Wood/Dr. Stephen Falken - text to speech was nowhere near that good in those days, but it significantly increased the dramatic tension.
Making it up?
I thought that Whitman's comment "that is a major role of the CFO and the accounting team, to decide what is the right answer" was a bit odd. Outside the realms of overtly creative accountancy, the rules of finance are fairly deterministic. That ought to mean that given a set of numbers, the answer should always be the same, but instead it comes across as an admission that stuff is being made up somewhere to match a pre-selected version of reality. Surely not?!
Data Protection in photos?
I've long been curious about assertions that images can be covered by data protection legislation when what we see is only ever a run-time interpretation of the actual data stored in the image file. By which I mean, if you were to open up a JPEG in a text editor, there's absolutely nothing in the actual data which could ever be connected to any individual, or indeed anything recognisable at all. So isn't what we see in an image only ever our brain's understanding of this visual representation, and if so - how is that a data-protection issue?
By the way, this doesn't mean that I disagree with the bloke taking on South Wales Police for its blanket face slurping!
Wishful thinking in the UK
yeah, but Japan's Shinkansen/Bullet Train "is a passenger-only operation, untrammelled by compatibility with the existing system, unhindered by road crossings and freed of directly serving any but the largest communities". You also need a good chunk of nice straight track in order to achieve those sorts of speeds, and good luck with trying to do that in the UK unless you want to be bogged down in planning and objections until the end of time. Oh, and you'd have to keep all the old infrastructure as well, to run freight and all those crappy short-haul EMUs.
Re: Ratio for video masochists
I concur - that thing they have to do when showing crummy portrait videos on, for instance, BBC News drives me nuts. You'd think it would be easy enough to make all phones record video in landscape but add metadata to say "this was actually shot in portrait, so display the relevant cropped portrait region if displaying on another phone", keeping the raw video in landscape so it could be used on TV news, or on a phone held in landscape.
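As a rough sketch of the idea above (the function and metadata field names here are invented for illustration - this isn't any real container format): always store the landscape frame, and record a 9:16 crop rectangle that portrait playback can use, while TV or landscape playback simply ignores it.

```python
# Sketch: always record landscape, but carry a portrait crop region as
# metadata so upright phone playback can show just that region.
# All names here are hypothetical, not from any real video format.

def portrait_crop(frame_w, frame_h, centre_x=None):
    """Return (x, y, w, h) of a 9:16 portrait region inside a landscape frame."""
    crop_w = round(frame_h * 9 / 16)     # portrait width for the full height
    if centre_x is None:
        centre_x = frame_w // 2          # default: crop around the centre
    # Clamp so the crop never runs off either edge of the frame
    x = max(0, min(frame_w - crop_w, centre_x - crop_w // 2))
    return (x, 0, crop_w, frame_h)

# Hypothetical per-clip metadata: TV uses the full landscape frame,
# a phone held upright displays only the cropped region.
clip = {
    "width": 1920,
    "height": 1080,
    "shot_in_portrait": True,
    "portrait_region": portrait_crop(1920, 1080),
}
print(clip["portrait_region"])  # (656, 0, 608, 1080)
```

The `centre_x` parameter is there because the interesting subject isn't always dead centre; a camera app could track it per shot.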
How the mighty have fallen
It's sad where Levy has ended up - once a respected author, former chess grand master, co-developer of the multi-championship-winning SciSys chess machines and, perhaps most famously, part of the Elan/Flan/Enterprise story - a microcomputer that would have been ground-breaking if it hadn't been nearly two years late. Enterprise/Elan failed in 1986 with debts of £8 million, so maybe the whole thing, way back in 1985, set the pattern for the present day.
You could go further and say that Engelbart invented the mouse in the same way that Edison invented the lightbulb - i.e. both get credit for things that had either been invented before, or at least contemporaneously, somewhere else. In this case, Engelbart's Joseph Swan was Telefunken, which had already built a ball-based mouse - this being closer to the mice we know today, as Engelbart's was x-y only - a few weeks before the MOAD.
Re: Well it seems that fiber optics companies are going to make a killing in offshore revenue
"all the major companies are currently American (Google, Ebay, Amazon, Facebook, Twitter)"
That's a very Western-centric view of the world - without even thinking about it too much I can replace that list with equivalent Chinese companies that are as big (at least relatively, but sometimes even globally) as the US equivalents: Baidu, Alibaba, Tencent and Sina Weibo.
60% is nebulous as no-one actually knows - it's more-or-less a number plucked out of the air (the assumption being that once you've reached 100% you're immediately going to die?). And, although I could have chosen to say "you'd still be the first people on Mars" it doesn't change the fact that yes - it would be worth the risk on "everyone" - to get even one person onto the surface of another planet, because of how that might change the whole of humanity's outlook - even if it's only temporary - and get people looking up and outwards instead of down and inwards for a change.
I can only imagine where we'd all be if the Polynesians first spreading out over the Pacific, the peoples first colonising the Americas, the Vikings, or the early European explorers had been so recklessly timid and afraid of nebulous risks as we as a species are now. Does it matter if you were to receive 60% of your lifetime's radiation? No, it doesn't - because you'd be the first person on Mars.
Re: I get this completely
"A midi file simply won't contain all this detail, but by programming an AI with raw audio sounds. There is an opportunity for this parts to be learned."
I agree, but I'm not thinking about MIDI (at least not in the play-by-numbers sense). What I mean is - and I say this as a semi-professional pianist - that it must surely be easier to teach AI to play something like a piano like a musician by reading and understanding actual music than it is to try and filter out the useful bits from the huge pile of data that is the raw audio stream of a performance. Even what seems like the difficult stuff, for instance interpretation, is something that most musicians have to learn: when I was nine, I didn't know any of this and just plonked everything out at the same volume, but I learned over time what sounded better.
Looking through the wrong end of the telescope?
Much like a comment above suggests, I don't really get this. Most conventional music has rules, and those rules are fairly easy to quantify: start with any note, use that as your base key. Pick major or minor (or melodic, harmonic, whatever) to give you a scale for that key (the notes to use are rule-based too). Pick the 1st, 3rd and 5th of your current scale to give you a simple chord that works with it; add a 7th or occasional minor 3rds if you want to be bluesy, etc. Now and again, change your scale root to the 4th note, then the 5th of your base key - bingo, you've got a 12-bar-style progression, the basis of much rock and/or roll music. Noodle around with the notes in your key and generate some sort of melody. Add in some alternate progressions for variety or as a middle eight (again, there are established patterns for all of this). Does it sound OK? If not, start again and try other combinations. This is pretty much entirely how I learned to improvise Jazz/Blues piano.
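The rule-based recipe above really is mechanical enough to write down as code. A minimal sketch (all function names are mine, not from any music library): build a major scale from any root, derive triads from scale degrees, and lay the I, IV and V chords out as a standard 12-bar progression.

```python
# Rule-based music, as described above: scale -> triads -> 12-bar progression.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR_STEPS = [2, 2, 1, 2, 2, 2, 1]  # whole/half-step pattern of a major scale

def major_scale(root):
    """Return the seven notes of the major scale starting at `root`."""
    i = NOTES.index(root)
    scale = []
    for step in [0] + MAJOR_STEPS[:-1]:
        i = (i + step) % 12
        scale.append(NOTES[i])
    return scale

def triad(scale, degree):
    """1st, 3rd and 5th of the scale, counting from `degree` (1-based)."""
    return [scale[(degree - 1 + offset) % 7] for offset in (0, 2, 4)]

def twelve_bar(root):
    """Chord roots for a basic 12-bar progression (I I I I IV IV I I V IV I I)."""
    scale = major_scale(root)
    pattern = [1, 1, 1, 1, 4, 4, 1, 1, 5, 4, 1, 1]
    return [scale[d - 1] for d in pattern]

print(major_scale("C"))  # ['C', 'D', 'E', 'F', 'G', 'A', 'B']
print(triad(major_scale("C"), 1))  # ['C', 'E', 'G'] - a C major chord
print(twelve_bar("A"))
```

Obviously this generates "correct" rather than "good" music - the interesting part, as the comment says, is the noodling and the "does it sound OK?" loop.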
To be fair...
I'm definitely no fan of Apple and its revisionist interpretation of history, and especially the frequent claims that it somehow "started the computer revolution" despite only ever launching products that already existed, albeit sometimes improving upon them (although neither the Apple I nor the II was demonstrably better than the competition). But to be fair, the Lisa wasn't so much "horribly wrong" (although it was buggy) as simply "horribly priced" at around £8,000, or about £25,000 in today's money.
"An AI set up to do the same job could also have such a scenario built in."
Which nicely sums up where AI is at the moment - there's still no "intelligence" that can realistically consider situations like this, in the way a human can, outside of its programming. What if someone had even thought about this in advance and added a rule like "do not launch counter-attack if missiles <= 5"? What then if 6 "missiles" had been detected? Until such time as AIs can really play a hundred games of tic-tac-toe and come to the conclusion that "the only winning move is not to play", it's just not safe enough to work in this sort of application.
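The brittleness being described can be shown in a couple of lines (the threshold and names here are made up purely for illustration): a hard-coded rule fails the instant reality lands just outside it.

```python
# Toy illustration of a brittle hand-written rule, as discussed above.
MAX_IGNORABLE = 5  # rule: treat up to 5 detections as a probable sensor glitch

def should_retaliate(detected_missiles):
    """Naive rule-based decision: retaliate only above the hard threshold."""
    return detected_missiles > MAX_IGNORABLE

# 5 spurious detections are (correctly) ignored...
print(should_retaliate(5))  # False
# ...but 6 spurious detections trigger exactly the failure the rule was
# meant to prevent - there's no notion of "does this situation make sense?"
print(should_retaliate(6))  # True
```

A human in the loop can ask why six missiles make no strategic sense; the rule can only compare a number to another number.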
Eliza's author, Joseph Weizenbaum (sometimes credited as the father of AI) had strong views on this, suggesting that a programmer who helped fake bombing data in the Vietnam War was "just following orders" in the same way as Adolf Eichmann, architect of the Holocaust. He said "The frequently-used arguments about the neutrality of computers and the inability of programs to exploit or correct social deficiencies are an attempt to absolve programs from responsibility for the harm they cause, just as bullets are not responsible for the people they kill. [But] that does not absolve the technologist who puts such tools at the disposal of a morally deficient society".
Can technologists see the future more clearly?
I would say that technologists have, at best, a patchy record at future-gazing, with Adam Osborne (of Osborne 1 fame) predicting in 1979 that 50% of jobs would be lost over the following 25 years, or Alvin Toffler suggesting that computers would enhance our mind power. I don't see much of that in the tabloid race to the bottom or a world of uncritically-accepted fake news on Facebook. Even the legendary Dr. Christopher Evans suggested that computers would remove drudgery, increase prosperity (for all, not just a few) and iron out intellectual differences between all people.
Nearly 40 years ago almost exactly the same things were being said about AI and computers in general and their impact on people's livelihoods. Phrases like "jobs holocaust" and "the collapse of work" were common. Expert Systems (like Weizenbaum's Eliza) were going to replace doctors and psychiatrists Real Soon Now. Then there were "Fifth Generation [AI] Computers", abandoned after a decade with millions of pounds, dollars and yen spent.
It's still not happened, and whilst most of the other promises of technology posited at the time have been wildly exceeded in ways pundits of the 1970s wouldn't have thought possible (storage, performance, power, price, portability, graphics, etc, etc), general-purpose AI still seems to be only just out of the starting blocks.
There are definitely areas where AI has massively improved, like language and image processing, but a "universal AI machine" still seems to be a long way off.
Re: The same everywhere...?
Responsive Design is an absolute scam, as the one thing it does little to address is data usage (although it has got a bit better since actual pre-download support for different image resolutions was added). Downloading 4MB of content before deciding which parts of it you want to use completely misses the point, and is exactly why I often use The Reg's mobile variant on my laptop on the train and despaired when the BBC dropped its proper mobile version.
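For what it's worth, I take the "pre-download support for different image resolutions" to mean the HTML `srcset`/`sizes` mechanism - sketched here with made-up filenames - where the browser picks one candidate image before downloading anything, rather than fetching a desktop-sized asset and scaling it down:

```html
<!-- Hypothetical example: the browser chooses one of the three files
     based on viewport width and device pixel density, so a phone never
     downloads the 2560px version at all. -->
<img src="photo-640.jpg"
     srcset="photo-640.jpg 640w,
             photo-1280.jpg 1280w,
             photo-2560.jpg 2560w"
     sizes="(max-width: 600px) 100vw, 50vw"
     alt="Example responsive image">
```

Of course, that only helps with images - the 4MB of scripts and markup still comes down regardless.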
Re: Pandora's box?
The era when anyone made vast amounts of money thanks to the wool trade has long passed, and whilst it's true that wool might naturally fall from sheep, I suspect you would find that harvesting it by picking it out of hedgerows would not be economically viable (or would mean £500 sweaters). Wool is now financially a very minor part of the sheep-rearing business and, like milk and leather, is strictly a by-product of the meat industry.
Re: For the true Sinclair aficionado...
Camputers (builder of the ill-fated Lynx) was also headquartered on Bridge Street. It was related to Acorn in the sense that it was a spin-off from GW Design, a company that had provided some PCB design services for the Acorn Proton/BBC Micro.
There's also Jupiter Cantab - designers of the Ace - but they were way out in Bar Hill, up the A14 (although that probably didn't exist in those days).
Also worth a stop-off would be the Cambridge Science Park to see Cambridge Consultants, which once counted Clive Sinclair and Robert Maxwell as board members and was something of a nexus of early Cambridge micro companies.
Re: Details, details...
Ah yes, of course it's a court - I should have known that, having worked in Cambridge for 11 years, lived there for a bit and only recently stayed at Christ's (when they let their student rooms out during the summer) with its imaginatively-named First, Second and Third Courts!
TomTom just gets driving more than Google or Apple
On a recent longish trip in a friend's van, I was so impressed by just how much better TomTom's phone navigation app was compared to Google's or Apple's (the latter of which delights in absurd instructions like "turn left in to the B one-thousand-and-thirty-four in 600 feet") that I splashed out the £39 or whatever it was for three years as soon as I got back. Proof not just that you get what you pay for, but that TomTom's navigation stuff is clearly thought through by people who actually drive, rather than - apparently - by people who don't.
Re: This is ridiculous it is so out of touch...
Yeah, and I'm not sure what else it says about the target demographic when the example is lager (clearly the cheapo gnat's-piss supermarket own-brand version and the other sort commonly referred to as "wife beater"), and neither of the percentages is representative of most draught real ales, many of which are between 3.7 and 4.2%.
So, we appear to have come full circle: back in 1979, white-collar/technical union head honcho Clive Jenkins stated "the days when fears of unemployment caused by computing could be discounted have definitely vanished", with his union - the ASTMS - going on to predict an extra 3 million unemployed by 1991, attributable to computers and robots. Jenkins went on to call the whole process a "jobs holocaust" and, together with co-author Barrie Sherman, forecast an imminent "leisure society" with most people being unemployed most of the time. Adam Osborne, he of the Osborne 1 computer, was also into predictions of gloom in his book "Running Wild", also published in 1979, in which he predicted 50% of all jobs disappearing with 50 million job losses in the US alone, thanks to technology. Even the Socialist Workers Party pitched in with a book published in 1980 called "Is a Machine After Your Job?".
Maybe we should just be more like Dr. Christopher "Mighty Micro" Evans who was far more sanguine, saying in 1979 (not long before he died at the age of 48) "Like it or not, the technology is going to overwhelm us. So, as for some of the eerie futures that seem possible, I don't think we've got much option. Take the case of machine intelligence. It's going to be just too useful for us not to develop it".
Plus ça change!
Re: why do I stick with Firefox?
Same here really: I think it will be a very dark and sinister day for the internet if Google gets to own (and control) the entire stack, from the entry point (search) through the content (YouTube, etc) to the client used to access it all (Chrome).
Netscape gave us a choice against Microsoft's dominance. It's not just for nostalgia that its progeny Firefox - for all its warts - should be supported and improved to ensure that some choice remains in a Google-dominated world.
The iMac was essentially a copy of the 1984 Ontel C/WP Cortex - same all-in-one/built-in-to-the-monitor design and available in different colours; the iPhone was a lot like a Visitor data device as seen in 1985 TV sci-fi series "V", in the episode "Reflections in Terror" - same rounded corners and everything.
Woz was clearly not a layout artist
I'd heard it said somewhere before that Woz was not considered to have been that brilliant an engineer, and the layout design for the Apple I seems to prove it. The chips are laid out in about the worst way you could think of - it's almost as if getting the longest tracks imaginable between everything was a design objective!
The iPhone was a copy anyway
In the episode entitled "Reflections in Terror" of TV series "V" (1984-1985), one of the Visitors attempts to extract the DNA of Elizabeth the "Star Child" by "accidentally" spiking her with a rose thorn. Prior to this, she checks Elizabeth's identity with a device surprisingly similar to an iPhone, complete with rounded corners, an LCD-like display and touch capability. That looks like prior art to me!
What am I missing?
What I don't get about Bluetooth or other wireless speakers is... don't they still need power? If they need power, don't they need wires? If they need wires then, er, why not just wire them into the source and solve power supply and latency in one go? I suppose I get that some people might prefer to plug in a power brick locally for each remote speaker (with all the wasted stand-by energy that implies) rather than having wires run all around the house, but unless you happen to have pre-wired mains sockets right next to where the speakers want to be, you still have a trailing-wire problem.