17 posts • joined Monday 26th March 2007 17:18 GMT
Watching the video from about a minute in shows not only that the laptop is some flavour of Mac, but also lets you hear the comments from the loony - and they don't exactly sound anti-Apple: quote, "Thank you Steve Jobs. I have a Mac!". There's also some other stuff about evil and words I didn't catch.
Clearly the bloke was a nutter but there was a definite fanboi slant to the madness.
I just hope it isn't catching.
The way I see it...
...is that if the iPhone interface is the replacement for the old iPod one, they've made a huge mistake.
As far as I can tell there's no easy way to use the iPhone UI without looking at it (no tactile feedback), and in many ways it's a two-handed UI as well, in that you can't really drive it with the thumb of the hand that's holding the phone.
Both of these are major downsides compared to the 'traditional' phone and iPod/mp3 player UIs.
OK, the new design might look flashy, and technically it's quite impressive, but it's answering a question no-one is really asking. I wouldn't claim that existing implementations of the phone/mp3 player interfaces are perfect, but at least I can drive the things without having to look at them, and without having to think much about it as muscle memory and touch are more than adequate for using the most common functions blind.
Oh, and Fred Fnord, keep drinking the Kool-Aid, I'm sure it keeps the acolytes at The Church of Steve, Inc. happy. Quite why people who buy Apple stuff can't just see it as just another product like any other I don't know, maybe there's an additive in that shiny plastic?
The easiest option would have been to just not offer this service.
While a full multi-platform product might have been nice, I suspect cost was as big a factor as anything and finding a solution that would suit the vast majority of users, while not having huge implementation costs, probably proved too tempting.
There's also the not insignificant point that the material has to be licensed for transmission on this system, and if the rights holders won't let you use the material on a 'multiplatform' compatible system then there's not much you can do.
Linux and Mac support is always a 'nice to have' feature rather than a solid requirement; I can think of all sorts of products I'd like to see Linux support for, but it just isn't practical or economic to add it - it's cheaper and easier to just use them under Windows, even if this means adding a single Windows box into a sea of Linux based systems.
Alternatively, you could look at it like this: regardless of having a license, the terrestrial broadcast system only supports PAL, even though there are systems out there using things like NTSC. If I'm in the minority who chose to buy an NTSC receiver for some reason it's a bit rich to start demanding a broadcast is produced to suit my system even though the existing system supports 99%+ of the installed user base, and I still have the option of buying a PAL receiver if I really want to make use of the service. No one is actively stopping me using the service, it's my choice of incompatible platform that gets in the way.
Or how about something like Sky? Closed platform, and if you want to access it on a system other than a Sky box, regardless of paying for access, you're SOL.
While I find multiplatform support great, and would love to be able to access this sort of service on my choice of platform, it ultimately has to be accepted that the majority will get support first, and the minority *might* get support at some point in the future.
Hand picked reviewers
Given that the reviews so far are hardly what could be considered independent, I wouldn't exactly say I trust them.
Better to wait for the things to escape into the wild for proper reviews to happen.
Also I'll trust reviews more when they actually have some decent phones to compare to - maybe some of the larger phone review sites might manage that.
Of course, working as a phone sounds like it isn't something the iPhone was really designed to do, more an iPod/email/web machine that happens to have a slightly crappy phone attached, so comparative reviews might not work out too well.
Guess we'll wait and see.
Is there actually demand though?
Given you've been able to buy portable, handheld LCD TVs for years, and yet they never became commonplace in the wild, is there really that much demand from people to waste their mobile battery watching stuff on a tiny screen?
Also, consider that the majority of the time you'd actually use this kind of thing (during the day), there's usually nothing on worth watching. By the time any good stuff is on, you're usually at home (or in the pub) watching on a proper sized screen.
So whatever format wins, is anyone actually going to care?
Re: Big bang?
It's actually high energy particle collisions, which isn't exactly fusion.
Anything with high energies has the potential to make a bang, even if it's just the equipment letting go. Which is the most likely effect of just whacking everything to 11 and seeing what happens, instead of testing carefully first.
There's also the argument from the tin-hat brigade that the outcome of some of the experiments might have unfortunate side effects, not that I believe such things but I thought it a good basis for a joke...
More likely someone has decided that if their experiment is going to cause a Big Bang, it may as well be a F**king Big Bang.
There's usually a reason why you do low energy testing first - it leaves pieces big enough to tell what went wrong.
Skipping the test schedule because you're behind is rarely a clever thing to do, you always miss something and the outcome is usually expensive.
More Apple hype/lies...
Yet again, the iPhone has functionality that is claimed to be new and/or innovative, and yet seems to be anything but!
According to some sources, Apple is claiming the iPhone will be the first mobile device able to play back a 'new' video format - in reality this looks like h.264, and the iPhone would be a long way from being the first mobile device to play it back, including playback of higher resolution/framerate videos, with the option for TV output.
Plus of course YouTube Mobile is already available, so at best they're playing catchup.
And it's not like they can claim better video quality, it depends on the source and there's no way YouTube videos are ever going to look good, particularly streamed to a mobile, particularly one without 3G support.
So far I've seen very little, if anything on the iPhone that is either new or innovative, though there have been obvious features missing. Apart from a flashy interface, it really doesn't look like it has much to offer.
Of course the hype machine doesn't care if it's true or accurate, and the Apple acolytes neither know nor care either.
Why do they bother?!
'The iPhone, launching in the US on 29 June, will incorporate search and mapping technology from Google and allow for calls to be made directly from Google Maps.'
As does my N95, and indeed any other phone that has the Google Maps application installed.
Hardly a killer app if everyone can already get it for free anyway on other mobile platforms.
Except of course that unlike the other platforms, on the iPhone you can't install 3rd party apps!
Link-16/MIDS is actually capable of lots of stuff, the bandwidth is usually more than adequate for all sorts of realtime data exchange. Though you can't stream video, which seems to be the primary point of this.
Anyway, I may have spotted a slight flaw in this concept. While all the data link systems in service transmit at frequencies that make them very much line of sight, using the radar has a further downside.
Conventional data links will use a conventional antenna - not exactly omni-directional, but reasonably capable as long as you aren't pulling wild manoeuvres that end up masking the antenna from the receiving party, e.g. banked turns.
With a combat radar though, both the transmission and reception paths are going to be in a relatively narrow window, forward of the aircraft. Exactly how narrow would depend on the performance and configuration of the antenna array, and how steerable the beam is. Which means you're basically only going to be able to swap data with someone who is not only in front of you, but flying towards you!
OK, with a C2 platform like an AWACS with a fully steerable array maybe they could be flying in another direction other than towards you, but they'd still need to be in front of you.
And of course, in order to use this you're going to have to activate a system which will have more than enough power to show up on ESM systems for miles around. (Aircraft radios are usually a few tens to a couple of hundred watts; the radar will be low double-digit kilowatts.)
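To get a rough feel for why that power gap matters (the wattage figures below are purely illustrative, not any particular system's specs): for a one-way intercept the received power falls off as 1/R², so for a fixed ESM receiver sensitivity the detection range scales with the square root of transmit power.

```python
import math

def detection_range_ratio(p_ref_watts, p_emitter_watts):
    """One-way intercept: received power ~ P / R^2, so for a fixed receiver
    sensitivity the detectable range scales as sqrt(transmit power)."""
    return math.sqrt(p_emitter_watts / p_ref_watts)

# Illustrative ballpark figures only: a 200 W radio vs a 15 kW fighter radar.
ratio = detection_range_ratio(200, 15_000)
print(f"the radar is detectable at roughly {ratio:.1f}x the radio's range")
```

So even before considering the radar's distinctive waveform, simply lighting it up makes you visible to a passive listener at several times the distance a radio transmission would.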
So, given that it isn't really necessary for the majority of data transfer tasks, that the extra functionality isn't the kind of thing pilots will want to use 99.9% of the time, that it's difficult to use in flight because of the flight profile you have to adopt, and that it'll advertise your presence to anyone with basic detection gear, I can see this getting firmly put on the 'interesting, but useless' pile.
Looks like good engineering to me
Why engineer your own controller, which would have a relatively huge price attached due to low production volume and NRE costs, when you can just get something off the shelf which is cheap, ergonomic and easily available?
Absolutely nothing wrong with using cheap COTS bits if they do the job.
Plus you save on training the troops how to use it!
The way it works
The whole sale of 'surplus' RAF Eurofighters is actually a great way of getting a free(-ish) upgrade from Tranche 1 to Tranche 2/3 aircraft. Given the step change in capability between the tranches, it's worth selling on aircraft that aren't immediately needed in order to get replacement aircraft of a better spec at a later date.
In effect you're just swapping delivery slots, trading delay for upgrade.
Not that the Saudi aircraft will be exactly to the standard spec - they get all kinds of toys, but usually not quite as good as the original, e.g. component downgrades, deleted features (some of the secure comms modes), that sort of thing.
Though even if the aircraft were exactly the same, I like to think the pilots still make a difference and the UK is still slightly better on that front.
And the 'Saudi-ised' in-country production doesn't necessarily mean much, usually it just means a large group of Brits putting the kit together in-country. The locals usually couldn't care less about the technology, it's just a way of keeping hold of the cash - if the work is being done in Saudi then at least some of the cash being spent will stay locally. Also it provides for all sorts of legal ways of getting cash to various local VIPs, but that's another story...
Making payments to various people has long been part of the way business is done, especially in the Gulf. Everyone does it, it's all in the open, and there isn't anything unusual about it. Plus it's legal. And let's face it, what's wrong with paying the locals with their own money - they pay you, you give some of the cash back. And they're the ones who have to justify the money they've spent to the rest of the government/family.
Finally, having heard some of the stories from those involved in the original deals, direct payments were just one of the ways certain people could get cash. Others also existed, and all were legal. For example, a Saudi company would provide all the accommodation for contractors, at a cost above the normal market rate. Said company would be owned by a VIP close to the deal. Not exactly clean, but legal enough. And no-one will complain as it can probably be counted as part of the local offset due under the contract.
The Saudi Tornados don't seem to be doing too badly, for all the rude things some people have said. They're keeping them for now, with a sustainment program introducing quite a few upgrades to take the remaining fleet to something like a 'GR4-lite' spec capable of supporting the latest toys. F3 spec may be gone, but the airframes are still useful.
As far as the Lightning II argument goes (don't think you mean Raptor, at $200m each!), they aren't actually particularly cheap unless you fiddle the numbers and hide a lot of the costs. The Eurofighter in unit cost terms actually isn't too bad in terms of cost/capability. In the Raptor/Eurofighter equation, the >3:1 cost ratio between the aircraft means there isn't any argument, Raptor might be slightly better (depending on the tactics employed) but if you can afford a 3:1 force ratio 'slightly better' isn't good enough.
But would it work?
The current system seems designed to target the missile in flight, and probably assumes going up against a delivery system with one warhead per vehicle.
The first problem is that an interceptor is likely to disrupt the missile itself, but not the warhead on top of it. Obviously breaking up fissile material in-flight isn't ideal at the best of times, but even if this isn't being avoided it seems likely that the warhead would still survive the intercept relatively intact.
Historically, weapons accidents have shown that even sophisticated multi-stage triggers can be driven close to causing detonation by a crash involving a warhead (e.g. 8 of 9 arming stages going active). And given that the 'el cheapo' systems used by the countries this is supposed to counter are likely to be built to go off (i.e. be reliable on delivery), rather than to fail safe, it seems likely that an interception would cause a headache to someone under the intercept. Though only for a millisecond or so...
Given a more sophisticated MIRV system, it seems even more likely that the re-entry vehicles would survive (being pretty robust objects) and would hit something, and again the combination of the fall plus the impact(s) could conceivably lead to some sort of detonation, even if only partial. Though the kinetic energy alone would be enough to ruin the day of anyone near the impact site.
The other problem is that the system probably assumes a relatively dumb target, and it's increasingly possible to counter the interceptor systems. And once the re-entry vehicles are released you really have a problem: they aren't hard to track (massive IR signature for one), but interception is incredibly difficult given the small targets and high speeds involved.
Maybe they should have just asked to share the RS-24 test the Russians just did, after all a missile defence system test vs. an ICBM designed to counter a missile defence system would probably have been fairer than trying to kill an old Polaris.
It shouldn't be assumed that you'd only ever have to target (relatively) low tech systems from unfriendly states - after all, espionage or plain old money can get almost anyone the weapons they want, and it also isn't inconceivable that those developing the latest systems might actually want to use them one day.
So if you can't kill the latest and greatest, it doesn't seem worth the effort.
And personally, I feel more at risk from the side effects of the defence system than I do from any ICBM. After all, the idea is to protect the good ol' US - as long as the missile doesn't hit the target it doesn't matter what happens to all the foreigners who get hit by the debris.
That was a rather smug reply, which singularly failed to demonstrate any kind of clue about the situation.
It's not a free service that's down/unreliable, it's the whole outgoing email service. Which is part of the paid service package.
And it's not like there's even an indication it doesn't work - if there was, you'd know to work around it. All that happens is emails go into the system, and never come out, which is by far the worst thing that can happen: unless you routinely CC yourself a copy of all your email you'll never know there's a problem.
Gmail and the others aren't guaranteed reliable - gmail certainly goes down occasionally - and as they're free (and US based) you have no recourse when they go wrong.
But for something you've actually paid for you expect it to work. And for the supplier not to spend days denying the problem exists when it clearly does!
What's the point?
Simplicity, robustness, compatibility and low cost are the criteria that should always - and usually do - define what standards win out.
Case in point - Firewire vs. USB.
Firewire may well be faster and have some technical advantages, but ultimately very few applications actually require the features and performance it offers - chains of disks and video cameras are pretty much it. The connectors aren't complex, but they're not the cheapest design either.
USB provides a cheap, simple interface which is 'good enough' for the majority of purposes it's used for. Implementations of both host and device controllers are easily and cheaply available. And all sorts of devices are available that connect via USB that would never have dreamt of going the Firewire route. The connectors are simple and robust, and come in a variety of formats to suit specific applications - at the lowest level you can even get away with a raw PCB edge connector. It may well not be the best technical solution, but then again the winners usually aren't.
As far as video interfaces are concerned, VGA still has the advantages of using a simple, robust connector format, and being compatible with any signal that can be fed through the cable. There isn't any licensing fee. There aren't 1.0, 2.0 or other flavours, merely whether the equipment being connected can support the signal timings, and you'll always find something that works even hooking old equipment to something brand-new. And the available bandwidth is huge, so no real timing or resolution limits. If ultimate image quality and performance is required, the analog interfaces still win out.
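A quick back-of-the-envelope on that bandwidth point (the 1600x1200 mode and the 25% blanking overhead below are just illustrative assumptions, not figures from any timing standard):

```python
def pixel_clock_mhz(h_pixels, v_pixels, refresh_hz, blanking_overhead=0.25):
    """Rough pixel clock for a display mode: active pixels per frame times
    refresh rate, inflated by horizontal/vertical blanking intervals.
    The 25% overhead is a ballpark guess, not a standardised figure."""
    return h_pixels * v_pixels * refresh_hz * (1 + blanking_overhead) / 1e6

# A 1600x1200 @ 60 Hz mode needs a pixel clock somewhere around:
print(pixel_clock_mhz(1600, 1200, 60))  # 144.0 (MHz)
```

Analog RAMDACs of the era commonly ran at 350 MHz or more, so even demanding modes left plenty of headroom - whereas the early single-link digital interfaces sat much closer to their ceilings.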
The digital interfaces have all suffered from a few common problems, possibly because they're all variations of the same thing: bandwidth is inadequate, crippling the supported resolutions for the majority of implementations. The designs are inevitably proprietary, so there is always a cost involved in implementation. New versions of the interfaces are always being proposed to work around shortcomings - compatibility is an afterthought. And the most recent trend is towards tiny, fragile connectors that offer no real advantage apart from being cosmetically better - DVI is at least a proper 'industrial' type connection which won't kick loose or break off; the others are pathetically flimsy.
The only real advantage of the move to digital interfaces has been the removal of the D/A and A/D converter stages, and those weren't a particular problem anyway. All that seems to have happened is that the move to digital has introduced a lot of problems that didn't exist before, and as the encrypted, consumer-electronics, toytown design trend continues I can't see things particularly improving.
So for now I'll happily admit to staying in the analog domain as much as possible, as the various digital options really don't seem to have much advantage to offer.
Divorced from reality...
The eco-friendly types don't do themselves any favours when they start spouting junk like this.
The amount of roll you use doesn't exactly come very high up the list of environmental threats - it's not like it's made from rainforest hardwoods, just plain, low-grade farmed timber. And it probably is going to end up as fertiliser eventually, so it's a pretty enviro-friendly & carbon neutral lifecycle overall.
Personally I think she should worry more about the drive she makes to the mall in an SUV to get the roll in the first place than how many sheets it takes to wipe your arse.
But in any case, how could you ever possibly get away with only one sheet? Or even the (luxury!) of 2 or 3?
I suspect the answer will only ever be found by re-adjusting to a new state of personal hygiene, or by also becoming one of the enlightened eco-warriors and existing on nothing but air and sunlight...
Surface area vs. volume
I always thought that as a sphere got larger, the surface area per unit volume decreased.
As I see it, it comes down to heat dissipation - a larger skull has less surface area relative to the volume of brain matter within it, which means in a hot environment there would come a point where the brain couldn't be cooled adequately, and this would form a limiting factor.
If the air is cooler then more heat can be dissipated as the thermal gradient vs the head is higher, which means the upper volume limit changes and larger brains/heads become viable.
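A minimal sketch of the geometry and the cooling argument (the sphere is a crude stand-in for a skull, and the heat-transfer coefficient is a purely illustrative number):

```python
import math

def area_to_volume_ratio(radius_m):
    """Sphere: A/V = (4*pi*r^2) / ((4/3)*pi*r^3) = 3/r, so it halves when r doubles."""
    return 3.0 / radius_m

def max_heat_shed_watts(radius_m, h, delta_t_k):
    """Newton's law of cooling: dissipation ~ h * A * deltaT.
    h (in W/m^2/K) is an assumed illustrative coefficient, not a measured value."""
    return h * (4 * math.pi * radius_m ** 2) * delta_t_k

# Doubling the radius halves the surface area available per unit of brain volume...
print(area_to_volume_ratio(1.0), area_to_volume_ratio(2.0))  # 3.0 1.5

# ...while cooler air (a larger deltaT) raises the ceiling on what can be shed.
print(max_heat_shed_watts(0.1, h=10.0, delta_t_k=8.0) >
      max_heat_shed_watts(0.1, h=10.0, delta_t_k=4.0))  # True
```

In other words, heat generated scales with volume (r³) while dissipation scales with area (r²) and the thermal gradient, so for a given air temperature there's a radius beyond which the brain can't shed what it produces.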
Beyond a certain point I would guess that cooling isn't as much of a problem as retaining heat, and this could have other impacts eg. need for more hair - plus it could drive even larger growth just to keep the volume/area relationship low.
This of course is what engineering suggests as the simple & obvious explanation - differing environmental challenges leading to a requirement for a larger brain doesn't seem quite so elegant, but thermodynamics probably isn't a strong point for biologists.