58 posts • joined 17 Dec 2007
I found Lil'Racerz to be very buggy, unfortunately. It was fun and had lots of amusing touches (for example you can run over the spectators if you skid off the track and they'll try to run away from you).
I've really enjoyed Ground Effect by the guy who wrote Populous. It's a very chilled out racing game.
Not just Sony
That warehouse was used by one of the largest independent record distributors. A lot of small independent labels lost their stock in that fire and probably are now bankrupt.
John Seddon would have a few things to say about that...
Sad that Hinterlands by William Gibson didn't make the cut. That's one of the best stories about alien-ness I've read.
I'd love to see a movie version of Hinterlands by William Gibson. It's a haunting story, and it could probably be done on quite a small budget.
My laptop has a multitouch (and tablet) screen. It's great to see *anyone* actually integrating the various Linux desktop technologies to take advantage of multitouch. Hopefully multitouch events will be supported in the Firefox and Chromium builds so that multitouch webapps work as expected.
Voice? Don't care. Data? I want it!
On today's smart-phones *anyone* can easily create mobile services. How can Nokia compete with the entire web?
Hardly a new trend...
"...a worrying trend: Microsoft appears to be copying what someone else did rather than having any confidence in themselves and providing any real innovation or design leadership"
Worrying trend? Sounds like business as usual. MS' big successes have come from reproducing other companies' innovations. Their big failures from trying to innovate themselves.
Narrative games are always disappointing
Dumbly following someone else's linear plot just makes for a dull game. A good game creates situations that the *player* wants to tell stories about. There's a huge difference between a story writer and a game designer. Both are fine art forms, but why do story writers think they can just write a story in game-form and it will become a good game?
Google face blurring technology couldn't cope
It got confused between their arses and their faces and blurred the former, not the latter.
Maybe you can't blame Google for that...
What matters is the JVM. Java the language is stagnant, maybe dead
"there really isn't another enterprise grade language of this caliber around"
There are plenty of much better languages but no runtime with the same combination of performance, management support and a wide variety of libraries.
There may not be a next big JVM language. Scala and Clojure are taking over from Java in some organisations. Java will continue to limp along, becoming more and more unwieldy as it shoehorns language features onto an already awkward base. JRuby, Groovy and other languages will have their niches.
For enterprise software, the language to pick will be the one with the best tool support, the one that lets programmers evolve their systems rapidly. And which language that will be basically comes down to JetBrains, unless the Eclipse community can pull something surprising out of their hat. Considering they still haven't got SVN support into the main distribution, I won't hold my breath.
Did Google ever innovate? Their innovation has always been behind the scenes, not in the services they provide. They created a web search algorithm that was better than that of AltaVista and Yahoo, and a home page that had one input box and so was better than that of AltaVista and Yahoo. Then they made a webmail app that was better than that of Hotmail. And bought a company that produced a maps webapp that was better than MapQuest et al. And bought a company that put videos on the net when their own attempt went nowhere. And bought a company that made a spreadsheet webapp that was not as good as DabbleDB. And bought a company that made a wordprocessing webapp that was just plain awful (and therefore familiar to people who used MS Word). And bought a company that made a phone operating system that was better than Symbian (but then were beaten to market by Apple). And then produced Wave, which was an innovative combination of chat and a wordprocessing webapp, the main innovation being that it worked in a browser. And Buzz, which was a microblogging app that was a knock-off of Twitter and screwed over Gmail users. And now they've bought a company that runs a virtual currency service...
Provoking a flame war gets BCS mentioned in news headlines...
I'm sure there's a marketing term for this tactic.
5th August: apart from a few scattered survivors, all of humanity is blind
...and fall prey to shambling carnivorous plants (and Eddie Izzard's overacting).
It works for De Beers
"The idea that restricting supply somehow increases demand is utter bunk fueled by a complete lack of understanding of economics. Every consumer who walks into an Apple store wanting to buy an iPad and walks out without one is a customer who might go across the street and buy a netbook"
It works for De Beers. People could buy other gems or artificial diamonds, but they don't. And, a more appropriate comparison to Apple, it works for fashion houses.
Not just alpha geeks
I've installed Firefox and Chrome on my mac at home (as well as having the default Safari) and my wife uses Chrome by choice every time.
One way to counteract the censorship of .xxx
If everyone with a normal website on a .com, .co.uk, .org or whatever domain changed it to also use .xxx, then it would no longer be a porn-only domain.
Microsoft dreams the dream... and lives it
"On your PC noone would ever dream to move all the standard features around"
Apart from Windows Media Player, Windows Office Communicator, Office Ribbon Bar, Internet Explorer, etc. etc.
Not that Microsoft are alone. iTunes, QuickTime Player, Winamp, etc. etc. all ignore the platform's look & feel (feel being more important) and make the Windows experience a case of random button clicking around mystery-meat UIs.
It makes the Linux GNOME desktop look like a paragon of consistency.
The fact that a language descended from ML is a supported part of VS is great news. Yes, ML code is succinct compared to C#, but it is that way because the language doesn't require you to write loads of unnecessary guff to help the compiler writer; you just concentrate on what you want done.
Must be awkward for frequent travellers
...unless they override the heuristics when the user is logged in and they know the user's location.
Developer productivity features?
What about navigation, code analysis and refactoring? Are they still awkward, slow and unreliable?
Choice? It's all about reliability.
iPhone users don't want choice... they want an unreliable browser that frequently crashes and can take out their entire phone. And Safari does that job perfectly well, thank you.
What a terrible idea.
Why on earth are they continuing with that white elephant?
I bought a netbook because it's cheap and small
I bought an EeePC 900 because I wanted to write on the tube or in cafés (and do a bit of programming) and I didn't want to worry too much about the cost if a thief snatched it off me. The larger and more expensive they become, the less useful they are for that kind of thing. I do miss a touch-screen though, because I like to sketch diagrams and what-not with my laptop's stylus.
An iPad is less tempting for using when out and about because it's more expensive, more fragile, and harder to type on.
Waiting for the TV ad
Buried alive? There's an app for that!
Engineers love non-monetary compensation
It makes Google's engineers (who might otherwise be attracted by better-paying jobs elsewhere) feel that they are working for a morally superior company.
[Where are the icons for the Google CEOs?]
Much better than commons collections
Commons collections is basically unsupported. It does not support generics, for example.
Google collections has excellent support for functional programming idioms. I've used it in production code for some time and found it very useful for cutting out boilerplate code for iterating through, mapping & folding collections.
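To show the kind of boilerplate I mean, here's a hand-rolled sketch of the transform/filter idioms that Google Collections packages up (the `Function` and `Predicate` interfaces mirror its style, but the helper names here are illustrative, not the library's actual API):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the iterate-transform-filter boilerplate that a
// collections utility library centralises in one place.
public class CollectionIdioms {
    interface Function<F, T> { T apply(F input); }
    interface Predicate<T> { boolean apply(T input); }

    // Map each element of a list through a function.
    static <F, T> List<T> transform(List<F> in, Function<F, T> fn) {
        List<T> out = new ArrayList<T>();
        for (F item : in) out.add(fn.apply(item));
        return out;
    }

    // Keep only the elements matching a predicate.
    static <T> List<T> filter(List<T> in, Predicate<T> p) {
        List<T> out = new ArrayList<T>();
        for (T item : in) if (p.apply(item)) out.add(item);
        return out;
    }

    public static void main(String[] args) {
        List<Integer> nums = new ArrayList<Integer>();
        for (int i = 1; i <= 5; i++) nums.add(i);

        List<Integer> evens = filter(nums, new Predicate<Integer>() {
            public boolean apply(Integer n) { return n % 2 == 0; }
        });
        List<String> labels = transform(evens, new Function<Integer, String>() {
            public String apply(Integer n) { return "#" + n; }
        });
        System.out.println(labels); // prints "[#2, #4]"
    }
}
```

Writing these loops once in a library, instead of inline at every call site, is exactly the saving: application code shrinks to the anonymous `Function`/`Predicate` bodies that say what you want done.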
Edgy humour... yawn.
If you can't be funny, be shocking. The refuge of the unfunny attention-seeker.
The Met have offered this service for years.
The Met have offered this service for years, and in machine-readable formats.
Wrong way round...
The Maven repository is a security mess. Eclipse's update mechanism authenticates plugins and updates.
It would be better for Maven to use Eclipse's update mechanism to download and authenticate packages than for Eclipse to download packages from Maven's global repositories.
Remember comic chat?
Now that was a good twist on the chat idea, but like many interesting ideas from Microsoft it disappeared without trace.
I saw one being ridden in London
Soon after they were released I saw someone riding one along the towpath past the canal-side housing estates in Hackney, being chased by a large crowd of jeering, stone-throwing kids. What a beautiful sight.
Java already has a VisualStudio beater
And shouldn't the Maven developers think about fixing the security issues first, before trying to take over the world?
Why does Eclipse care a jot about NetBeans? It's JetBrains they should be chasing. If they spent a little more effort on usability and performance, Eclipse might have a chance of being a real alternative to IntelliJ. As it is, it's only useful for people whose time is worth so little that all the time they would save using IntelliJ would cost less than its asking price.
No way to compete with Android
"This does remove at a stroke the principle advantage of Google's Android: the code being free."
Whether Android is free or not, its biggest advantage over Symbian is that it is much easier to code to. The Symbian APIs make Windows 3.1 look simple and elegant.
@KenBW2: genie and zoom effects in Compiz prove my point
You missed my first comment. The transitions in Compiz don't take into account how you have configured your desktop. If you put the window selector in the top right corner, as I do, Compiz still animates minimising windows downwards to the bottom of the screen. In MacOS, if one moves the dock around the screen, windows still minimise to the true position of the dock icon.
Compiz has copied the feature from MacOS with no attempt to understand how Apple actually designed it and why they designed it like they did, and no attempt to actually provide a useful, usable feature for the user.
@Gerry: I'm a Linux, not OS X, user
I've used Linux as my main desktop for years, so it does matter to me when the desktop experience becomes worse, and Compiz and the wobbly effects definitely make the experience worse, breaking features, showing misleading transitions, etc.
My point was that, unlike at Apple, there's no pressure encouraging OSS programmers to use the graphics power for good and not for evil (or at least, not for showing off).
Shows the difference between Apple and OSS
I hate the wobbly effect!
The genie effect in MacOS X communicates useful information: where the application has been minimised to.
The wobbly effects in Linux desktops convey no useful information at all and sometimes convey misleading information (e.g. minimise to the bottom of the screen when my menu of running apps is at the top). It's completely pointless work that is more about stroking the egos of the graphics programmers than delivering useful features to users.
Remote control for presentations
Bluetooth is widely used to control presentation software. The profile is supported by many phones and Windows. It's a good idea: flip between slides without having to keep going back to the laptop to hit the space bar.
Except... if you have a room full of people with Bluetooth phones, the connection freezes. So it appears to work perfectly when you rehearse before the presentation but fails as soon as you stand up and start talking. Sweet!
Would make a lovely media center PC, except...
No built-in DVD drive :-(
It's not too ugly to put beside the TV, but what would be the point (besides MAME)?
Apple go into markets where they have a chance of making a splash with novel new technology. Nintendo already have that role in the games biz and would eat Apple for lunch.
I reuse code all of the time, the difference being...
...that none of the code I reuse was ever written to be "reusable". It was just written, was useful, got used, got found by me and got used again.
Make useful code easy to find. There's your "reuse strategy".
Anything else is just someone trying to sell you something.
He should actually read what Fred Brooks writes
Saying we should "simplify essential complexity" shows he either hasn't read what Fred Brooks wrote or completely misunderstood it.
The essential complexity is the complexity of the problem domain and, by definition, cannot be simplified.
The accidental complexity can be addressed, but certainly cannot be "killed". Computer programming is never going to be completely straightforward as long as we have to obey the laws of physics and economics.
The tools aren't good enough because the vendors do not know what they are for
UML has no practical purpose.
It cannot be used for real modelling.
If it can be round-tripped it is just another representation of the code, but graphical diagrams cannot scale up as well as text, and we already have excellent tools for working with code (as the article says).
It could be useful to support informal discussions about designs, except that it's not very good for visualising object-oriented software.
How do you measure the productivity of a modelling tool?
Come on Matt. Don't tease us with throwaway statements like "while they're being left behind in terms of productivity, though, modeling tools are making headway with advanced features...". What does productivity mean for a UML modelling tool?
Number of A4 pages per day? Number of lines drawn between boxes per hour? System features deployed to customers per week? Money made from the system per financial year?
Define your terms.
Networking is not the biggest problem for laptop users
I've had no problems with networking when using Ubuntu on my laptop. However, X configuration when plugging in monitors is a complete disaster area. The design is completely illogical, so it's no wonder it's unreliable: testing it must have been impossible.
Wot Google, no devices?
At least Apple has sold a few phones.
UML isn't really modelling... so what is it for?
It's interesting that Anonymous Coward brings up the "programming" vs. "software engineering" distinction.
An engineering model is a simplification of a system (aircraft, bridge, office block) that ignores unimportant aspects of the system so that the engineer can easily and cheaply prove that the system as designed will have certain desired properties (fly, handle expected traffic volumes, not fall over in high winds).
UML cannot be used for creating engineering models. You can't prove interesting runtime characteristics from UML models.
So what is UML good for?
Maybe for visualising ideas on a whiteboard or paper, but it's a piss poor notation for software visualisation.
Maybe as a notation for software blueprints, but if you have enough detail in the diagram to mechanically produce code, then you have code, and you need testing tools as well as refactoring tools to work with it.