103 posts • joined Saturday 12th August 2006 19:51 GMT
Typical edu do-gooder attitude
Again, powerful people with meagre pedagogical insight feel qualified to weigh in on subtle details of curriculum planning.
I teach programming. For every 'more' requested, there is correspondingly less time for something else. There are rarely calls for less of anything. And often that something else (e.g. learning about different number bases in general) forms the foundation of whatever is wanted more of. (I assume hexadecimal is still regarded as useful for 'coding' by the clown, but who can tell?)
Fragmentation of Xcode and its documentation
...makes it hard to develop iOS apps too. Docs (especially tutorials) are rarely in sync with the latest version, which invariably doesn't run on version -2 or sometimes version -1 of the current Mac OS. This is fragmentation too, and it's well within their power to prevent, and yes it hurts developers. Apple's third-party developers suffer from Stockholm syndrome.
Re: This is not new
Yes and no. Documents regarded as routine or ephemera today may turn out to be valuable, or of historical importance, tomorrow. The point is we can't make the same judgements about the importance or value of documents in advance that hindsight would lead us to in future. Original letters by famous people are some of the most valuable artefacts to appear at auction. Even if it's just Karl Marx's laundry bill, someone will pay big money for it.
But MS and others should take some of the blame: their document formats are clearly and simply under-engineered for longevity. Open formats go a long way to solving this problem, but they are eschewed for many other reasons. Potential longevity is not a selling point, unfortunately.
I thought this was exactly the point of mpeg4 (based on the multitrack 'elemental' architecture innovations of Apple's QuickTime, which has become much less interesting since): keep the media in different streams rather than interleaving or multiplexing them, so those streams can be selected for different audiences, devices and bandwidths at reception. The classic case is the scrolling text news ticker visible under so many talking heads. Or even movie credits. It's plain stupidity to compress text with a video codec, and yet it's standard practice today to do exactly that. Insanity! Transmitting text along with video is a solved problem! But that's not enough when the engineering and marketing agendas outweigh those of the designers and content innovators, not to mention the content industry (more correctly called the back-catalogue industry), who just want to rerelease the same old cash cows in new formats rather than let creative people do something like (say) Peter Gabriel's Xplora1 on Blu-ray. (Blu-ray specifies that the player has a JVM, no?)
But mpeg4's non-video profiles never really took off, largely because content creators didn't think in terms other than the existing paradigms for what video could be: one track each of audio and video, maybe a subtitle track if you're lucky. It's pretty poor for so-called 'multi'media. Others have pointed out the dearth of multi-angle content. (Ideal for porn, but even those tricks fell away.) Just as Marshall McLuhan observed: we first use the new media to do the work of the old (cf. artificial horse heads on early cars). Only later do we find out that the new media are good for something different than the old, and that the old media were actually better at some things. (Books are still good.)
Most people are unaware that mpeg4 specifies a profile for 3d models and textures, or for interactivity, for example (these were never implemented). Adobe (or whoever else might have made an authoring tool) had other priorities. Same with the browser guys, caught in the XHTML 2.0 quagmire. So content creators and software firms need to be on board. Therefore it's significant that the BBC is pushing this. I remain skeptical, however.
Gif colour limits
It's not well known, but GIFs *can* contain more than 256 colours. The restriction is that the image be divided into contiguous rectangles (pixel dimensions of your choice), and *these* must each stick to their own (max 256-colour) palette. So the top half of the GIF could be greyscale and the bottom half could be 'vivid' or whatever those funny old names for CLUTs were. You can have as many of these blocks as you want. Presumably you could have 1x1-pixel blocks, each with its own palette, but then it would scarcely be efficient compression. Still, it's a nice hack, not a million miles from the Amiga's ingenious HAM. Unfortunately there are very few tools which support the export of such exotic objects. Browser support is pretty good, however. There's a page on the web which shows off the technique. Can't be arsed to google for it though.
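For the curious, the trick falls straight out of the GIF89a byte layout: each image descriptor in the file may carry its own local colour table, overriding any global one. Here's a minimal Python sketch (stdlib only, illustrative rather than a production exporter) that writes a 4x2 GIF from two 2x2 tiles with *different* 4-entry palettes. The clear-code-before-every-pixel dodge stops the LZW dictionary growing, which sidesteps writing a real LZW encoder at the cost of compression:

```python
import struct

def pack_codes(codes, width):
    """Pack LZW codes into bytes, LSB-first as GIF requires."""
    acc, nbits, out = 0, 0, bytearray()
    for c in codes:
        acc |= c << nbits
        nbits += width
        while nbits >= 8:
            out.append(acc & 0xFF)
            acc >>= 8
            nbits -= 8
    if nbits:
        out.append(acc & 0xFF)
    return bytes(out)

def image_block(left, top, w, h, palette, pixels):
    """One image descriptor with its own 4-entry local colour table.
    Pixels are 'stored uncompressed': a clear code before every literal
    keeps the codes 3 bits wide, so no real LZW encoder is needed."""
    desc = bytes([0x2C]) + struct.pack('<4H', left, top, w, h)
    desc += bytes([0x80 | 0x01])      # local colour table flag, size=1 -> 4 entries
    lct = b''.join(bytes(rgb) for rgb in palette)
    clear, eoi = 4, 5                 # min code size 2 -> clear=4, end-of-info=5
    codes = []
    for p in pixels:
        codes += [clear, p]
    codes.append(eoi)
    data = pack_codes(codes, 3)
    sub = bytes([2])                  # LZW minimum code size
    sub += bytes([len(data)]) + data + b'\x00'
    return desc + lct + sub

# two 2x2 tiles side by side, each with a *different* 4-colour palette
grey  = [(0, 0, 0), (85, 85, 85), (170, 170, 170), (255, 255, 255)]
vivid = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]
gif = (b'GIF89a' + struct.pack('<2H', 4, 2) + bytes([0x00, 0, 0])  # no global table
       + image_block(0, 0, 2, 2, grey,  [0, 1, 2, 3])
       + image_block(2, 0, 2, 2, vivid, [0, 1, 2, 3])
       + b'\x3B')                     # trailer
```

Scale the same idea up to 256-entry tables per block and you get the many-coloured GIFs described above; whether a given viewer composites multiple image blocks correctly is, of course, the viewer's problem.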
What mole and skel said
And we could add: NOT recognising a country when the rest of the international community has done so (beginning with ISO, the UN etc.) is just as much, or more, 'getting involved' in the conflict.
Logitech GUI and drivers
I've had several mice from these guys. The driver software GUI has been consistently underwhelming and stupidly restrictive. Buttons can open apps but not run scripts, for example. I don't think they 'get' how to design configurable software for power users at all. I wouldn't dream of spending that kind of money on their products.
Now, if they reached out a bit to the hacker community, provided an sdk, let the Linux guys play, hooked up to script runtime middleware etc., it would be a different story. But they won't, because of their unfathomable but precious strategy.
This device is overpriced and underpowered, and soon to be obsolete. Why bother?
Ideology getting in the way of ethics? But surely Peggy, you wanted to give democracy the best opportunity? Surprise surprise! Neoliberalism reveals itself to be a poisonous scum floating on the ocean of the struggle for true liberty. TINA? Just another totalitarian game dressed up as liberation. Who pays the rough men who enable us to sleep soundly? And who is 'us'? There is no such thing as society?
I've been an Apple user for years, increasingly disgruntled by their slick-grin shenanigans. I used an HTC Desire Android phone for a while until it bricked itself for no apparent reason. That turned me off somewhat, but it was good while it lasted. Now I'm on iPhone. I never use iTunes on it, rather the 'Music' app, which is a lovely piece of simple engineering and design. But of course I move music onto the device with my Mac version of iTunes, which truly seems to become more stupid and irritating with every iteration. Now there's no way of seeing two playlists at a time, so comparing your collection with what's on the device is out. Stupid. But these are small issues. Apple still make solid products, for the most part, and I haven't given up yet, but I am no fanboi; when their strategy tax gets too costly there's no loyalty left to lose with me, and I will happily switch. I agree fully with the correspondent who pointed out that hating Apple is just as much a fad as loving them. I like the look of the new Nokias, I just wish I had similar confidence in Microsoft.
Re: Genuine question about evolution
Natural selection acts on *populations*, not on individuals.
Keep that in mind and everything falls into place. Please also disregard the stupidly named "Darwin Awards" when trying to understand how evolution works, because that is about individual misadventure. Genetics have very little to do with that.
I was a Director developer, who tried FutureSplash Animator in 1996 and couldn't take it very seriously. I regarded it as a toy. We had afterburner and then shockwave, which could do so much more. Director's scripting language was never highly regarded, but had powerful LISP-like features, a command-line, acceptable OOP, and it compiled to bytecode. You could do great things with it. (Most of the really great Director stuff was overlooked, or made for very small audiences). The early 'action' editors in Flash were a bad joke.
I was even more amazed that a handful of people started using it to make some quite decent casual games, even emulators, synthesisers and some fabulous data visualisation tools (e.g. gapminder). Slowly, steadily, it became more technically powerful, and certainly more up-to-date than its older brother.
I switched to Flash, even began teaching it, and developed a couple of solid medium-size applications with it, plus many small things. I adopted AS3 and grew to like it, but there has never been any doubt that Flash - authoring and playback - has been rotten on Apple's systems for almost 10 years. Crashes, hangs, lousy resource management (memory/cpu) and poor OS-integration have been the norm. Multimedia designers - many of whom are Mac users - have always had a love-hate relationship with Flash, and I believe that now they are ready to move on.
I confess to a certain amount of schadenfreude. I remember the snooty Flash kids, their tool of choice in the ascendant, looking down their noses at Director devs, just as we looked down our noses at the Hypercard and Authorware community. It becomes increasingly obvious: closed multimedia authoring systems are always a dead-end, no matter how de facto 'standard' they may temporarily be. Microsoft's never-popular Silverlight - a potential competitor crippled by neurotic strategy - is another relevant example. In each previous case, there was always an obvious proprietary ship to jump to, but that is not so now, and that is why Flash SWF will linger on, way past its sell-by date.
I know that this is a total non-problem for comp.sci folks, but the audience for the Flash authoring tool is quite different. They have different needs and different expectations when making interactive stuff. And yes, many of them are crap and clueless, but many really want to make good stuff and adopt best practices.
I also see no other software which has vector drawing tools as friendly and intuitive as those found in Flash since its very first versions, and I see no animation tools that can export lightweight vector-based animations (e.g. SVG) for the web. (BTW, the animated GIF exporter of Flash *really* sucks, and animated GIFs are surely not where we want to go in the 21st century.)
Adobe Edge looks promising, but I am wary of Adobe's ability to manage multimedia authoring tool development. Their track record is abysmal. Can they just not screw up, bloat, and hobble their tools with the limitations of their broader strategy for reaching 'internet marketers'? I am sceptical.
And I am training my JS/HTML5 muscles for multimedia teaching and multimedia content production, because the quality difference between 'the men and the boys' is going to be pretty stark.
Quite so. Windows was almost always the OS for people who didn't care which OS they ran. No wonder the users are in no hurry to upgrade.
It's not just a tool
... it's a culture.
I really don't buy this 'just a tool' argument. It's not like a hammer with a wide range of uses, some of them harmful, it's more like a garroting chair. Yes, the latter is 'just a tool', but it rather imposes certain ways of persuading or manipulating both users and ...er... audience.
A decently designed tool subtly or overtly guides use in such a way that the results are usually good, and you have to work hard to get really bad results. (Example - an electric kettle, or even ElReg's forums, which while it's not anywhere near as good as a mediocre nntp client for actual discussion, does an acceptable job of fielding responses from readers, which was doubtless the spec).
Powerpoint (keynote, etc.) might have become more colourful and decorative since the early days, but it's essentially unchanged: It makes hot air go much further, and offers weight and density when real substance is lacking. And this is why people like it.
I fully agree with Mr Longland. Slides are usually not necessary, they usually detract from or muddle the speaker's message, and few speakers, if any, have the skill (and *discipline*) required to use them effectively.
I work as a teacher, and the tendency (and temptation) to use powerpoint is very strong. After almost 10 years of boycotting MS Office entirely, I was briefly persuaded to use ppts by the way they could be repurposed to provide notes for those who missed the lecture, until I realised that my lectures had become mechanical, inflexible and ponderous. (I could even hypothesise that some folks started missing my lectures for this reason.)
After one session when it appeared that all the projectors were producing a blurry image, and where we could have spent the whole time troubleshooting (turned out someone had helpfully 'wiped' the lenses with a non lint-free cloth) - I said "sod the slides, let's get on with the lecture" and I was reminded what a pleasure it is to wander amongst the students making more personal contact, picking up on body language responses and so on. The feedback from the students was terrific. I immediately decided to drop powerpoint - and return to my older teaching style which had served me so well for so long.
Now I am thinking that twitter is probably a better tool - the audience can make their own notes, and share them with each other, making the feed available to whomever afterwards. I can drop in urls if necessary, but I don't even need to use a screen. Will do some testing with this technique next semester.
AC's reminiscence of powerpoint culture in the military is sobering. I remember that story breaking, and I thought "OK, now maybe something will happen" but somehow the culture remains. Well, Microsoft PR might have spun the story, but I think the problem is more that organisations are actually *addicted* to Powerpoint. Remove it, and most of its users will be revealed to be little more than charlatans, and that in turn will make the organisations look bad. So I guess we're stuck. Maybe a few more air crashes will help.
I had already worked out the general details from reading the reg for nearly 10 years, but seeing as I never have to deal with exactly this area of IT (I'm in software), I really appreciate that El Reg takes the time to define and illustrate a rather common kind of occult technology. I assume it's not exhaustive, nor even entirely accurate but who cares! I am definitely better-off than "none the wiser" after reading this article. Another please!
The problem with cherry picking 'hard economic data'
...is that all public spending is put in the column marked "outgoing" and all the taxes are in the column marked "ingoing", which leads to the wrong ideological conclusions about how to balance the budget.
The thing is, spending can also be an investment which will be good for business (e.g. railway infrastructure, education or sewers) in the long term, and refraining from spending on these areas, or allowing them to decline, can actually generate overheads in the medium term. Businesses don't run like this. They understand very well that 'you have to speculate to accumulate'. For some reason, politicians are allowed to ignore this basic fact of business economics.
Trouble is, politicians (like the author of this article) only have to worry about the few years until the next election, and all their policy is subservient to the short term aim of getting re-elected, rather than any long term aim which might be better for the constituency/nation.
... including - yes - some breakfast cereals, but also vegemite, ovaltine, horlicks, certain cough sweets etc. are forbidden in Denmark because the authorities are worried that the food industry will fortify unhealthy products (such as sugary soft drinks or potato crisps/chips) and then sell them as health foods.
In theory it sort of makes sense, if you are a bureaucrat with one eye on the public health budget, but in practice the law hits very wide of the mark. Meanwhile Danes (like the rest of Europe and especially the USA) still guzzle thousands of litres of sugary soda, which enjoys plenty of advertising exposure, even on kids' tv - despite regular reports about how harmful it is. Not to mention the unhealthy amounts of pork, and dearth of vegetables, in the typical Danish diet.
The ban applies to sales, and possibly also wholesale import of all fortified foods. Possession and importation for personal use is not covered by this particular law. The same ban was introduced in Norway some years ago, but now it is legal to sell Marmite again there as long as it is sold as a vitamin supplement, rather than a foodstuff.
I live in Denmark, I love marmite and I think this law is utterly stupid. Don't get me started on the stupid Danish restrictions on over-the-counter medications.
has since the introduction of OSX been the Mac equivalent of .exe (and I think it comes from NeXT before that). Unquestionably Jobsian territory.
I agree with many of the comments above: Especially those who say that Microsoft are in an absurd position after trademarking "Windows" and "Word" - which are single words, and complaining about a pairing of words, which was certainly not on anyone's mind before Apple started selling iPhone software this way.
Before the iPhone, handheld software was invariably called software, or programs, or categorised as utilities or games or whatever. I don't remember running into the use of the term "app" outside the Apple universe until the iPhone hit it big. It may have been used occasionally by developers, but it was certainly a rarity in marketing contexts. It should be possible to make some kind of statistical analysis using the wayback machine to prove this point.
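As a rough sketch of how such an analysis might start: the Wayback Machine exposes a CDX query API that lists snapshots of a URL over a date range, and one could fetch those snapshot pages and count uses of the word 'app' per year. The fetching step is omitted here (it needs the network), and the sample texts are invented purely for illustration:

```python
import re
from collections import Counter
from urllib.parse import urlencode

CDX = "http://web.archive.org/cdx/search/cdx"

def cdx_query(url, frm, to):
    """Build a Wayback Machine CDX API query for snapshots of `url`
    between the years `frm` and `to` (inclusive)."""
    return CDX + "?" + urlencode({
        "url": url, "from": frm, "to": to,
        "output": "json", "fl": "timestamp,original",
    })

def app_counts_by_year(snapshots):
    """Count standalone uses of 'app'/'apps' per year, given a
    {year: page_text} mapping of fetched snapshot pages."""
    word = re.compile(r"\bapps?\b", re.IGNORECASE)
    return Counter({y: len(word.findall(t)) for y, t in snapshots.items()})

# invented sample texts standing in for downloaded snapshot pages
sample = {
    2005: "Download our Palm OS software and utilities here.",
    2009: "Get the app! More apps in the App Store every day.",
}
print(cdx_query("example.com", 2005, 2009))
print(app_counts_by_year(sample))
```

Run over a few handheld-software sites' archives, a rising curve after 2008 would make the point; a flat one would refute it.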
Also fully agree with AC that it shows a real lack of imagination that they can't come up with another name, like Google have. ExeStore? Heheh. Well, we know that MS lacks imagination, but HTC and Nokia should perform better than this.
Very slick piece of PR
Microsoft needs a PR hit, when it has had so many misses lately. I especially like the wording of
"a database of sensitive information that can enable a party to surreptitiously 'track' a user"
- exactly! Apple and Google squirm and wriggle and say "we don't collect this information" but they don't mention that they do do stuff which allows others - an unspecified party, as Microsoft has it - to do exactly that.
If this can put pressure on Apple and Google to tighten up their ship, it will be a good result for everyone.
Roll out the Tiresome Chorus of Assange Bashers
There may be many examples of poor style, or overzealous reactions from Assange and his allies, but these are more than eclipsed by the war crimes and atrocities being done daily in the name of our so-called democratic institutions.
Nobody has mentioned that Wikileaks takes exquisite care to release only the information which will not endanger lives. They may err, or misjudge in some cases, but I have yet to see an example of a civilian or military death which resulted from a leak published after wikileaks released it.
There are (therefore) other more important, and wholly ethical, reasons than making money for Wikileaks to strictly control their 'crown jewels'.
All those that celebrate the principle of 'free' leaks from wikileaks are arguably ethically worse-positioned than wikileaks themselves - unless you can point to some kind of code of practice which would also prevent civilian and military casualties.
Also, when banks and credit card companies *illegally* block sources of income from donations, how else do you expect the organisation to pay the bills?
I am getting really tired of the ad hominem attacks on Assange in the reg forums. Sure he's not an angel, but most of the criticism of him seems to consist of ridiculing his vanity, and his eagerness to secure a decent financial footing for wikileaks and its employees. I will continue to regard these attacks as spiteful, cowardly and tiresome, especially for as long as the war criminals Wikileaks exposes, and those that undermine the validity of democratic values are still considered paragons of virtue, just because they have more extensive PR resources.
There is a book for kids to learn programming, and it uses Python. I forgot the exact title, but it's probably something like "Programming for Kids".
I can also appreciate the value of Processing, because it is so visual and can be treated rather like Logo if you approach it in the right way. Although we can ask ourselves whether the 'curly bracket' languages (the C family) deserve another generation of potential bigots who are all fingers and thumbs the moment they are required to use a different syntax (e.g. Python, the Lisp family, the Pascal family).
We've already seen complaints here about Python's use of white space, as if the C family had nothing equally rotten in its syntax. (I have no preference, but I have yet to see a convincing argument as to why white space is such a liability in Python.)
I haven't looked at Scratch. We could also mention Squeak and other Smalltalk variants; Smalltalk was designed for kids in the first place.
An early cybernetic idea
One of the earliest (1940s) explicitly cybernetic ideas arose in the interchanges between Norbert Wiener and Gregory Bateson, and it went something like this "How would you design a machine which could act like a schizophrenic?"
This (what we would now call) 'reverse engineering' of insanity guided decades of research by Bateson into the nature of schizophrenic communication, leading to the 'double-bind' hypothesis, the application of Russell's theory of logical types to communication theory, and Bateson's conclusions - and demonstrations - that similar patterns drive creativity and evolution itself.
We should have had a clue from the schizophrenics themselves, who invariably have paranoid fantasies about 'machines' or technologies which control their minds (or the minds of everyone else) and make them crazy. The machines are indeed real, but they are made of flesh and blood, laws and rules rather than metal and microelectronics, although the internet has opened up the possibility of software which could generate schizophrenia in its users. (There's an app for that?) You can pick your own examples of software which 'drives you mad'.
What's missing here, then, is not the banal conclusion that the machine was 'acting all crazy', but that it was making wild creative leaps, quite literally 'thinking out of the box', which is something that has eluded AI research for decades. Good stuff.
Business as usual
I read the reg *because* they get their beaks and talons into almost every person/company/technology (although it's clear that individual hacks have their favorites).
Check the masthead slogan - it has always been so.
are unambiguously defined by international, NOT national law. AC is right and you are wrong. The Nuremberg trials established that invading another country without the go-ahead from the security council is the *supreme* war crime. Issues of collateral damage and civilian casualties are precisely covered by the definition of supreme war crime. Look it up.
USA is bound by this law, which trumps national law, like it or not. Only US politicians and mass media conspire to hide or gloss over this fact for those who swallow their propaganda unchewed. The typical defence is that such laws undermine US sovereignty and are therefore worthless, or that the UN is ineffectual (a self-fulfilling prophecy, because the USA regularly undermines its authority and then criticises it for having none).
But the fact remains that the USA *is* a signatory of the international law on war crimes, and is therefore bound by it. A congressional vote is irrelevant - or rather, it somehow could make congress accessories to the supreme war crimes of invading Iraq (and Afghanistan). Ask a lawyer.
And it is the USA which has the largest stockpiles of biological weapons. They also have plenty of other WMDs. Does this mean that any other country has the right to invade them? Are you really saying it's legally and/or morally 'right' to invade a country which has WMDs simply because WMD's are dangerous? What about China, or France, or Israel? I believe AC understands the threat very well indeed, and I imagine that he/she correctly estimated the danger from Saddam's WMDs at zero. You, however, are not prepared to hold the USA and other coalition countries to the same standards you wish to impose on other countries. This is pure hypocrisy.
We now know that Saddam had no WMDs, exactly as the UN weapons inspectors told us at the time. Clearly, they knew their business, but the US media exposed Hans Blix and others to ridicule and character assassination. But the weapons inspectors were *right*. They did their job properly and reported the truth in their findings repeatedly before the war began. Many of us suspected it back then, and we went on the streets to demonstrate about it. Even the CIA knew it, and were forced to provide hokey 'evidence' for Colin Powell's shameful and dishonest speech to the security council. Dozens of CIA employees have since resigned over this heinous travesty of their skilled and difficult work.
And the idea that the Americans 'fell for it' is disingenuous in the extreme:
The *American* CIA spooks and the *American* military knew Saddam had no WMDs, but Bush/Cheney insisted on evidence. The CIA were obliged to turn over whatever junk evidence they had - evidence that they *knew* to be pure fabrication. The CIA repeatedly told the Bush administration that there was nothing but junk evidence about WMDs or any connection between Saddam and Al Qaida, so Bush/Cheney stepped up the waterboarding until they got some 'operationally useful' confessions - i.e. junk evidence from the mouths of enemy combatants, rather than the fictions of CIA spooks.
It is not difficult to find the statements of ex-CIA employees or ex US-military personnel who have gone on record with their grievances about this, but I suspect you are not interested in looking for data which undermines your views, and would rather echo the mountains of propaganda which supports them.
It was junk evidence, commissioned by the Bush/Cheney administration which ordinary Americans were 'taken in' by, Bush/Cheney's own propaganda, cynically and intensively peddled by Murdoch and others.
I find it quite extraordinary that you, or anyone else, attempts to play the WMD card as a post facto justification for the Iraq war, as it has been so thoroughly and extensively debunked, so often, that surely only the most gullible and naive still believe in it. Even Tony Blair has stopped using it as an excuse. It makes you, Mr Gumby, come across as a gullible fool, which can't be entirely true. Of course there are still people who believe the sun goes around the earth, which I also find extraordinary. I am sure they are not complete fools either.
You wrote something sarcastic about letting the facts get in the way.
I ask without sarcasm: How do *you* rationalise ignoring these facts?
Tittle must cuntain bums and/or pricks
At first I was thinking that this kind of mild horseplay was a bit daft. If you want a sex party, then have one, for goodness sake, but I suppose I am not in the target group. It will probably be a hit with anxious teenagers who desperately need an excuse to say things like 'insert this long hard object down your pants'.
In my day we had to make our own entertainment. (Spin the bottle, postman's knock, kiss chase, felch the bulldog etc. All non-electronic and requiring only the most simple equipment).
I do think it could be an interesting new trend, and I'd like to see them make a much kinkier version, using the glam rock aesthetic from games like guitar hero, and of course the game should come with a pack of wipes.
There are some good leads to follow up here. What I am especially interested in is how to generate diagrams from (especially online) data sources e.g. rss or csv feeds. This is touched upon in relation to visio, but only in passing.
Any chance of a rundown of those tools which specifically address this approach?
And shouldn't lower-level code-based visualisation tools like Processing or even HTML-5's Canvas be mentioned?
After all, the most interesting feature (and surely the future) of screen-based (rather than print-based) infographics is allowing the browser/user to manipulate the view, i.e. what used to be called 'interactivity' - not just making histograms or UML prettier and easier to draw.
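For a flavour of the 'diagram straight from a data feed' idea, here's a deliberately minimal, stdlib-only Python sketch (the function name and sample feed are invented) that turns a label,value CSV feed into an SVG bar chart - the sort of output Canvas or Processing would then make manipulable:

```python
import csv
import io

def bar_chart_svg(csv_text, width=320, bar_h=18, gap=4):
    """Render a two-column CSV (label,value) as a minimal SVG bar chart.
    Returns the complete SVG document as a string."""
    rows = [(label, float(value))
            for label, value in csv.reader(io.StringIO(csv_text))]
    top = max(v for _, v in rows)                 # longest bar sets the scale
    height = len(rows) * (bar_h + gap)
    parts = ['<svg xmlns="http://www.w3.org/2000/svg" '
             f'width="{width + 120}" height="{height}">']
    for i, (label, v) in enumerate(rows):
        y = i * (bar_h + gap)
        w = round(width * v / top)                # bar length proportional to value
        parts.append(f'<text x="0" y="{y + bar_h - 4}">{label}</text>')
        parts.append(f'<rect x="110" y="{y}" width="{w}" '
                     f'height="{bar_h}" fill="steelblue"/>')
    parts.append('</svg>')
    return "\n".join(parts)

# invented sample standing in for a downloaded CSV feed
feed = "apples,14\noranges,9\npears,30\n"
svg = bar_chart_svg(feed)
```

Point the same function at a periodically refetched feed and you have a self-updating infographic; the interactivity layer is then a separate (and harder) problem.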
Pendant is as pedant does
A squib is not a broken firework. It's a small (non-recreational) explosive charge, so whether or not it is damp is a significant quality.
Make keywords less legible
Another correspondent complained about the effect this might have on 'skimming'.
The lesson here could be, make the important keywords fuzzy, and do the filler text in clear Arial or Helvetica. Could be an interesting next step in this research.
Consider that 'Simples' campaign, or even 'Think Different' - both of which generated plenty of free PR as people discussed the grammar and/or spelling.
Renowned hypnotherapist Milton Erickson used to use bad grammar or mispronounce specific words deliberately in order to get the listener's unconscious mind to activate and 'work harder'. This would make the hypnotic suggestions more memorable because the subject would make them more his own.
It's the 'store' bit which is innovative, not 'app'
So much talk about 'app'. Yes the word predates the iPhone. Get over it. And as others have pointed out, NeXT/OpenStep/OSX executables have the .app suffix, rather than .exe. which is not insignificant. (I often imagine that .exe is short for "execrable"). Also the Mac has always had an 'Applications' folder, whereas Windows has always had 'Program Files'.
OK, that's that but AFAIK there was no in-device software 'store' before the app store.
Best you could do before iPhone apps was go to one of those woeful and scary-looking handheld-software review sites, with some kind of e-commerce thing bolted on. They always gave the impression of being run by eastern European gangsters and being riddled with malware, and often seemed to offer endless opportunities to carry you off into some kind of link-farm labyrinth.
Making a piece of software which strictly controls the process of buying and installing software which runs on the very device where the software will eventually run, and calling it a 'store' is something innovative.
Granted it's not like discovering gravity, or inventing free market capitalism, but it *is* an innovation. IMNSHO this gives Apple good reason for laughing Microsoft's petty dispute out of court.
As for 'Windows', it's a name which has always made me nauseous because it implies that other OSen don't have windows, or that Microsoft invented the concept, whereas Windows was very much the catch-up windowing GUI tech in the mid 1980s. Rather hilariously, the 'windows' in Windows version 1 could not overlap and could not be dragged around the screen. Nobody today would recognise such a GUI element as a 'window'.
...for a set of headphones with built in mp3 player (and removable storage). (No wires!) But a touch screen interface is a royally stupid idea, as others here have pointed out. I would want to be able to control the whole thing using only tactile feedback (i.e. fingertip control), so I can keep the damn things on and keep my eyes on the road/tv/legs of the girl on the escalator while I skip around my music.
Why is the tech industry obsessed with graphic displays? We have at least 4 other perfectly good senses (or 11 more, if you follow Rudolf Steiner). It's like the ipod you now have to take out of your pocket and look at, in order to operate it.
The inverted crucifix is, first and foremost, the symbol of St. Peter (who elected to be crucified upside down, to show everyone 'ow 'ard he was. Did I say 'ard? I meant pious, obviously.)
The symbol is not uncommon in St Peter's cathedral in Rome. The catholics in this story evidently had too many other dogmatic details to keep in mind.
what's wrong with monochrome
I must join the chorus calling for affordable monochrome e-ink *today*. If the tech is working fine, why are they dragging their feet getting products to market? You can still buy monochrome laser printers, and they are still really useful machines - not to mention cheap.
I suppose this obsession with colour is driven by the fantasies of marketing men and hardware firms looking for fat margins - but really, I only need 2 bits of grey - or even 1 bit of black/white - to read Dickens or Dostoevsky, or indeed 99% of the Project Gutenberg texts, comfortably.
Capitalism fails to deliver again.
...well Scheme really. It just looks a bit like C, and unfortunately 99% of web developers treat it like C, which is why we have such a horrible mess.
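The Scheme underneath the C-style syntax is easy to demonstrate: first-class functions and closures, neither of which C has. A minimal sketch (function names here are my own, purely illustrative):

```javascript
// Closures and first-class functions: the Scheme hiding under the C-like syntax.
function makeCounter() {
  let count = 0;              // captured by the closure below, like Scheme's let-over-lambda
  return function () {
    count += 1;
    return count;
  };
}

const tick = makeCounter();
console.log(tick()); // 1
console.log(tick()); // 2 -- the state lives in the closure, not in a C-style struct

// Functions are values and compose, as in any Lisp
const twice = f => x => f(f(x));
const addOne = x => x + 1;
console.log(twice(addOne)(0)); // 2
```

Treat the language like C - globals, imperative loops, no closures - and you throw away exactly the part that makes it work.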
No HTML5 on MSIE on XP - MS hands over even more market share
Considering how long it takes certain people to upgrade from one MS OS to another,
and considering that
IE9 will be for Vista and Windows7 only (not XP)
and considering that
IE9 will be the only MS browser which supports HTML5,
... it would seem that the mooted 2025 date for HTML5 being fully established is quite accurate.
Tax increases for the rich!
If this is true, there's no need for anyone to feel bad about increased taxes for rich people, right? Score for social democracy!
(I've always been sceptical of rich people's claims that high taxes for rich people will fail to motivate them to exploit poor people efficiently.)
IE6 market share is below 7% today and falling every month. There's nothing to explain.
The Netscape/Sun idea of abstracting away the OS, so that you could do useful work on 'any' computer, was smart, and it was the most significant threat to Microsoft's hegemony in the 1990s - which is why MS were rather obliged to pull all those dirty tricks, which got them in hot water with the antitrust boys and still echo in the EU-enforced browser chooser.
Microsoft must be peeing their pants at the prospect of losing not just one cash cow (MS Office) but both (Windows). Considering that MS loses money on almost every other aspect of their business, it's no wonder that they are now banking on Bing to compete directly with Google at *their* game, and it's no wonder that there are competing factions within the mastodon.
Microsoft has simply grown too big, with various departments producing software which competes with (and is incompatible with) the products of their other departments. We're seeing in-fighting and the legacy of inflexible business models, which indicate that they got addicted to the milk of their own cash cows and forgot that there are other nourishing drinks - with considerably less fat - which might be made available for free.
MS is moving too slowly to keep up with the rest of the industry, which is sad, because I was just beginning to like them. Maybe they would have been more viable today if the DOJ had split them in two.
As for using HTML5 for making the GUI of 'real' (i.e. not web-based) apps, I have to ask 'why not?'. XAML and HTML5 are both markup languages; why should one be better than the other? The processor-intensive work (e.g. video decompression) should/could be handled by appropriate hardware anyway, and we haven't yet seen the impact of GPUs on non-graphical number-crunching tasks like real-time audio filtering. (This positions Apple in a very advantageous spot.)
HTML5 can be optimised with the right tools and be no worse than Qt or any of the other presentation frameworks.
Show me a GUI which does not deceive the user.
Philip K. Dick wrote a short story in the 1960s about a robot which could camouflage itself as a TV set, sneak into people's homes, commit murder, and then leave evidence at the crime scene to frame some innocent human being or other. I forget the title, but it's in 'The Golden Man' collection.
I too go with the 'computers deceive regularly' meme. I subscribe to constructivism, which points out quite scientifically that the evidence of our senses is largely illusory, and any resemblance to reality - whatever that is - is rather coincidental.
Or, to put it another way: Show me a GUI which does not deceive the user, in some important respect.
It IS confusing when a control has multiple functions, and at the very least it puts greater cognitive demands on the user, because the user has to maintain a 'stack' of presses, which is difficult in stressful or cognitively loaded situations. (q.v. "was it six bullets or only five? To tell you the truth, in all the excitement..." - yes, even Dirty Harry is not immune to the problem of 'how many times did I operate the control?')
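To make the point concrete, here is a sketch (entirely hypothetical - not any real device's API) of a single button that cycles through three functions. The 'stack' the user has to keep in their head is exactly the press count:

```javascript
// Hypothetical multi-function button: one control, three meanings.
// The device tracks the state; the user has to reconstruct it from memory.
function makeMultiFunctionButton() {
  const actions = ['play', 'pause', 'next-track'];
  let presses = 0;
  return function press() {
    const action = actions[presses % actions.length];
    presses += 1;
    return action;
  };
}

const press = makeMultiFunctionButton();
console.log(press()); // 'play'
console.log(press()); // 'pause'
console.log(press()); // 'next-track' -- lose count and you skip a track by accident
```

A one-button-per-function design removes that modulo arithmetic from the user's head entirely.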
See my other reply about accessibility. My daughter would have to spend so much time working out the 'code' for the multiple button pushes that she would lose interest totally.
My preference: as few buttons as necessary, but no fewer. One button per function. (I can accept play/pause on the same button, but would prefer that they were separate.)
Physical buttons and Accessibility
2nd generation iPod shuffle remains a great design, with the built-in clip, physical buttons you can operate with your hand in your pocket, and the user's own choice of headphones, so I too am pleased to see the design return.
Having the controls on the headphone cable was just stupid, and a lousy excuse to charge exorbitant prices for mediocre headphones. My guess is that Apple noticed a drop in sales when they switched to the 3rd generation.
In my case, I have a multihandicapped daughter who loves music and loves pressing buttons. She has cortically impaired vision. She has a 2nd Generation iPod shuffle plugged into a cheap-as-chips portable handheld amplifier. We've tried her with touchscreens, but her vision is really not good enough. She needs to FEEL the buttons.
So... now I know what she will be getting for Christmas - especially if Belkin or some such produce a little handheld amp with a neat housing for the new shuffle. The shuffle is cheap enough that we can have several of the things about the house, and she can switch to another one when she wants to hear another playlist.
I often prefer to operate devices in the dark, in my pocket or whatever, without using my eyes. A physical button is still more 'sound' feedback for the fingertips than a short vibration (which affects the whole hand). If they can localise the vibrations on the screen somehow, we might be getting somewhere, but for now...
Touchscreens are an accessibility nightmare for anyone with partial sight or any kind of blindness.
Reproduce it in the lab, then, "boffins"
If it can't be reproduced, it's a junk hypothesis
This is about MSIE I think
This is not just about ads.
Google sees the extended functionality in the browser offered by ActionScript 3 as yet another nail in the coffin for Windows. They know they won't get everyone on board with HTML5 and Ajax, so Flash serves them in the sense that it can help make Windows irrelevant for running browser-based 'cloud' (ugh) applications. (You can do a quite decent asynchronous browser app in Flash; it's just that, generally, people don't.)
For strategic reasons, it's unlikely they will give Silverlight the same special treatment.
What John 186 said
Yes, I am dreaming of a low voltage adaptor with multiple outputs, and where each output can have its own voltage and connector.
I know you can get low voltage adaptors with switchable connectors and a voltage setting control, but that is scarcely a solution to the rather stupid problem of needing one high voltage power socket for every low voltage device. Maybe I just don't understand Ohm's law properly, but it seems like this should be technically possible.
I seem to remember (from my rock band days) that guitar effects pedals (from certain manufacturers) could be chained together so that you only had to plug one of them into the wall.
Maybe I was just stimulating my imagination too much back then, but I know we also had an open standard for digital music around 1983, which - extraordinarily - everyone agreed upon and still uses. Sometimes musicians have nifty ideas decades before the rest of the tech community has even realised which end its arse is pointing.
OK, I wasn't dreaming
The 'Voodoo Lab Pedal Power 2 Plus '
...is designed for guitar effects pedals (so it has a low-hum circuit design), but could be used for any low-power devices which run on voltages that are a multiple of 9. (You can combine two outputs for 18V, etc.) Brilliant!
Good, sober, slightly boring article
Seems to me that Apple has scored a huge PR win with its anti-Flash gesturing. Don't the journalists realise they are deep-throating Cupertino? Apparently they are only thinking about their deadlines. (Question 2: doesn't Adobe have a spin doctor???)
Look, Flash is actually a very good technology, politics aside. Shame Adobe don't have a fecking clue how to leverage it. They are about to hand the crown over to the hairy HTML5 hippies. (I am a hairy hippy myself, but the loose and forgiving HTML5 standard gives me the willies. Hey, I LIKE XHTML 1.0 Strict!!! Validation errors are a GOOD THING.)
I am also a long time Mac user, but I am disgusted by Apple's recent cockiness about Flash, which confirms the worst criticisms made against them over the years. Who's going to help unskew the market? Google? Yeah right. Nowadays it is Microsoft which looks like the good guy. Can't believe I am writing this, but that's IT for you.
Clocks and Timepieces
OK, if the pedants are having a go at the definition of "second", I'd like to point out that a 'clock' is a chronograph which has bells on.
If there are no bells (or whistles, or other alarm or striking mechanism) it is merely a 'timepiece'. No indication of any bells on this new-fangled quantum chronograph.
(My father is a horologist).
Still the best tech...
...for finding out what's on tv right now and later tonight. (Amazing! It's the same device I am watching which tells me what's on in half an hour. Don't even need to open my laptop!).
I had no idea that 'teletext' was a brand name. I thought it was the generic name for the technology behind Ceefax, Oracle etc. (What IS the generic name for it, then?)
I am dismayed that the BBC and others are starting to drop this ugly but information-efficient (1KB per screen!) technology.
A web page is just not usable enough - not only do you have to fire up a different device (even a set-top box is another device!) but also, each channel has its own layout, and differing levels of detail etc. Maybe at one point all TVs will be able to go on the net and download programme listings. This no-brainer feature will certainly come... my guess is it will be in 10 years time.
BTW I am rather fond of the open source Java-based app "TV-Browser", which aggregates dozens of channel programming feeds into a fairly simple GUI. Check it out. If only the RadioTimes would fix the problem that the BBC Entertainment feed is spitting out the BBC Prime feed by mistake. Grrr.
Finally, I would like to say that I was an avid user of the teletext mode on the BBC computer, and lusted after a 'teletext adaptor' - a device which could have impressed the pre-web internet users of the mid 1990s, even though it came out in the mid 1980s.
The gun lobby tend to have a "nothing to hide = nothing to fear" attitude - at least those that have any claims to respectability. This device will certainly split them into two camps. Bravo!
BTW - why does 'Remember me on this computer' never remember my password? (A password which I cannot change to something easy to remember.) Yes, I do have cookies enabled.
conservative reg readers
Amazing how conservative you folks are.
Yes, it looks ugly, but the way I see it, there should be at least as many buttons on a mouse as you have fingers. Anything less is missing a trick.
Nobody is forcing anyone to buy this device, and let's face it, the scroll wheel is the first innovation in mouse design since Engelbart's 1968 original. (Why has nobody thought of making the ubiquitous office swivel chair into an input device?) Engelbart also devised several other 'no-brainers', including the ancestor of Endfield's Microwriter, which have not caught on. Users were able to achieve extremely high typing speeds on those 'finger-chord' keyboards ridiculed by 'Poor Coco' above, who clearly has no idea that they used a mnemonic finger-pattern system, rather like sign language, and not ASCII codes. Once there was a good excuse why such innovations did not catch on (expense and lack of standards), but now we have USB.
BTW did anyone try the Oberon operating system? (Circa 1987). It used an ingenious 'chording' system with its three mouse buttons: Hold down one button while clicking another and you 'copy' the selection to the clipboard, for example. They also used up to four cursors. One for 'source parameter' one for 'target object', one for 'selecting' and one for 'execute'. It might not be the best design, but at least Wirth and co were thinking out of the box.
The mouse and the keyboard are the computer's primary 'sensory organs'. Should innovators not work to increase the bandwidth of their inputs? Increasing resolution of the motion sensors is one thing, but why not have pressure sensitive or velocity sensitive buttons? (256 levels of force? Dynamic Photoshop brush sizes? Wacom do this already and it's very, very cool). Imagine a keyboard where you could press harder for bold text etc.
There are many possibilities, but little real innovation. It doesn't help to have a gaggle of IT 'experts' who dismiss any attempt to design something new without actually trying it in their hands - which is what input devices ultimately stand or fall on.
If the OO mouse were the first of many design iterations, I think we would soon arrive at a really good input device. Yes, it looks absurd, but you have to start somewhere - and I think we are seeing the old cliché: innovators are invariably ridiculed until everyone realises the idea was always brilliant.
(How many here will admit to ridiculing the iPod because it lacked a radio? Well, I remember a vast clamour of voices with exactly that opinion. Yeah, I know... It was a long time ago, I never saw any Jews being mistreated... We didn't know what was going on... I can't really remember.... etc.)
I definitely use the 'home', 'page down' and 'page up' keys at least as much as I use the scroll wheel, probably more. Perhaps some of you minimalists/conservatives would rather have the scroll wheel on the keyboard too? Beside the 'page down' and 'page up' keys would be an 'obvious' place, no?
The real issue is the driver configuration software. Many Logitech mice have 5 or more buttons, but the opportunities for configuring those buttons are ridiculously limited. Kensington mice have superb drivers, with finely tuned acceleration control and a proper macro editor, but they seem to be going in Apple's direction of 'less is more' in their hardware designs.
So... why not TRY the device before leaping to any 'brilliant' and 'witty' conclusions?
Why it matters (to Microsoft especially)
Some folks still don't 'get it'.
If people can do their jobs and run their lives using web based apps like Google Docs and Google Wave, they can do so from *any* operating system which can run the browser which runs the apps.
If it doesn't matter what operating system people use, Microsoft has lost the game.
It's that simple.
If MS Office is to be eaten by web-based apps, rather than by StarOffice, OpenOffice etc., Windows will become irrelevant.
Google is positioned best to win, which should concern anyone who values individual privacy. I don't trust Microsoft either, but at least they haven't been gathering inscrutable browsing data on me for the last 10 years.