294 posts • joined 28 Apr 2007
Bravo for these tests. I think they are pretty crucial.
I do still worry that the motor will have problems at altitude, simply because there is so little air pressure that it may not be able to get the motor up to operating pressure in time with what small amount of combusting material you have.
The problem is that the rate of burn of the motor is linearly dependent upon the pressure. The nozzle provides a restriction that causes the motor pressure to rise - reaching a steady state when the motor is at full thrust. But if there is almost no pressure to start with, the initial rate of burn may be too low to climb onto the rising pressure curve, and instead simply peter out. Ignition at one atmosphere of pressure is a big step onto this curve. This is why some form of rupturing or pop-off cover for the nozzle has been suggested. Such a cover can hold the pressure of the combustion products of the igniter inside the motor and you get a nice high pressure startup of the motor. Of course you then have the issue of designing a suitable cover.
The point? Well if the test succeeds, you are fine. If it fails, this is almost certainly the solution. What you could consider is a dual trial. Think seriously about the design of a suitable cover, and maybe fly two motors. If the initial test fizzles, set it up to try igniting the second one.
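Out of interest, the feedback loop described above can be sketched numerically. This is a deliberately crude toy model - the constants (burn rate coefficient, exponent, generation, outflow and quench figures) are all made-up assumptions, not the project's real values - using Saint Robert's law, r = a·Pⁿ, for the burn rate:

```python
# Toy model of solid-motor ignition vs ambient pressure (illustrative only;
# all numbers are invented, not taken from the project in question).
# Burn rate follows Saint Robert's law: r = a * P**n.
# Chamber pressure: gas generated by burning minus choked flow out the nozzle.

def simulate(p0, a=1.0, n=0.4, gen=5.0, out=1.0, p_quench=0.3,
             dt=0.001, steps=20000):
    """Return final chamber pressure (atm) starting from ambient pressure p0."""
    p = p0
    burning = True
    for _ in range(steps):
        if p < p_quench:
            burning = False          # flame quenches below the deflagration limit
        r = a * p**n if burning else 0.0
        dp = gen * r - out * p       # gas generation minus nozzle mass flow
        p = max(p + dp * dt, 0.0)
    return p

sea_level = simulate(1.0)    # ignition at one atmosphere: climbs onto the curve
altitude  = simulate(0.01)   # near-vacuum start: never gets going, just fizzles
print(sea_level, altitude)
```

Starting at one atmosphere the model climbs to its steady-state operating pressure; starting near vacuum the burn quenches before the pressure can build - exactly the failure mode a pop-off nozzle cover is meant to avoid.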
"The MPI standard first got turned into Open-MPI by Squyers and Graham in a project that began in 2004 and shipped its first code in 2005."
Whilst Open MPI is a really good thing, it seems disingenuous to totally ignore MPICH, which is another open source MPI that first shipped code in 1992, and is still going.
It was in France.
Anyone who is naive enough to think that any sort of dark network will provide unassailable protection deserves what they get. Like any secure system there are many points of attack, many of them not technical. If a government agency wants to find you, they can muster forces that you can only guess at. Such a touching trust in a technological solution must gladden the hearts of spooks everywhere.
It isn't impossible that Tor is, end to end, an intergovernment security honeypot.
"But there's no need to panic just yet – as far as the scientist can tell the Earth suffers a supervolcano blast roughly every 100,000 years or so and the last one, the Oruanui eruption, blew off just 26,500 years ago"
I think I should introduce you to my friend Andrey. Andrey Markov that is. He had a few words to say about processes like this. Logic such as the above only works in movies.
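For the statistically inclined, a memoryless (Poisson) model makes Andrey's point concrete. A sketch, assuming the 100,000-year mean interval quoted in the article:

```python
import math

# If supervolcano eruptions are (roughly) a Poisson process with a mean
# interval of 100,000 years, the chance of one in the next N years does not
# depend on how long it has been since the last one - the process is
# memoryless. Being "overdue" is a movie concept, not a statistical one.

MEAN_INTERVAL = 100_000  # years (figure quoted in the article)

def p_eruption_within(n_years):
    """P(at least one eruption in the next n_years) for a memoryless process."""
    return 1 - math.exp(-n_years / MEAN_INTERVAL)

# Identical whether the last blast was 26,500 or 99,999 years ago:
print(f"{p_eruption_within(1_000):.4f}")    # next thousand years: ~1%
print(f"{p_eruption_within(100_000):.4f}")  # next hundred thousand years
```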
As to "without warning" you do need to remember you are talking to geologists, and they tend to think in slightly different time scales to the rest of us.
Re: Ziggy the Seer
He probably does. Heck I want an Aston Martin. I doubt he is going to give me one of those either.
There is a difference between: want, need, can afford. It's a bit like Christmas really.
The NBN isn't free - getting the average taxpayer to subsidise your torrenting habits to the tune of a few thousand dollars each is lunacy. If I could personally pay a few thousand and get FTTH on top of an existing FTTN offering I would be very happy. I would love the bandwidth, and I use it for my work. Some of my clients would dearly love serious bandwidth too. I sincerely hope that the new NBN gets a serious business focus, and instead of running fibre out to the bogans so they can stream HD reality TV, we can get the SMEs up to speed. That is where FTTO is needed.
Re: They are there for show
Of course they are "non-executive" - there can only be one executive director - for the NBN that is Ziggy. There is nothing special about this, all boards work this way. The executive director is the person who executes the work. The non-executive directors are there to direct; they do not work for NBNco, and thus are not executives.
Hang on here - the Tracking Preference Expression working group has been working for two years and has 105 members? Just to provide a standard on Do Not Track? It sounds as if the W3C has more problems than a bit of dissent.
Seriously, someone let the politics and vested interests get the better of the process way early in the effort. If this is how the W3C is operating it is already long past its use by date. People pointed at the ITU as the benchmark of paralysis in progress, and as the reason why the internet needed something new. Seems history repeats itself all too quickly.
Coming of age
I remember when it hit too.
There were some really nice things that went on as it spread and was contained. But the thing that most sticks in my mind is the warning message that was sent around in the first day. It contained a couple of very interesting sentiments.
Paraphrasing, as it has been a long time, and I don't have a copy of that message anymore. (Although I would love a copy.)
"We all knew that it was possible to write something like this" "We just didn't think anyone would be dumb enough"
It ended with: "This is bad news."
The bad news was the loss of innocence. This was the moment when the mutual trust ethos died.
Nyquist and Shannon
There is a significant issue with ultrasonic communication that has been touched on earlier, but seems a lot less understood than it needs to be.
I just checked on my Macbook Pro, and as I expected the internal audio (ie that available to the internal mic and speakers) runs at a 44.1kHz (ie CD) sample rate. This places an absolute hard limit of 22.05kHz on the highest frequency it is possible to generate or detect. In fact the need to have a realisable (as opposed to theoretically perfect) anti-alias filter requires a frequency limit that is lower than this, and for all useful purposes means that the audio is limited to 20kHz - ie the top end of a young person's hearing. Whilst Macs and PCs have long supported higher sample rates in the OS and over connections to sound gear, that does not mean the on board basic sound chips do.
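A quick numerical illustration of what happens if you try anyway (a sketch; NumPy assumed available, and the 25kHz carrier is just an example frequency):

```python
import numpy as np

# A tone above the Nyquist frequency cannot be represented at a 44.1kHz
# sample rate - it folds back (aliases) into the audible band instead.
FS = 44_100
nyquist = FS / 2            # 22,050 Hz
f_ultrasonic = 25_000       # an "ultrasonic" carrier someone might hope to use

t = np.arange(FS) / FS      # one second of samples
x = np.sin(2 * np.pi * f_ultrasonic * t)

# Find the dominant frequency actually present in the sampled signal
spectrum = np.abs(np.fft.rfft(x))
f_apparent = np.argmax(spectrum) * FS / len(x)

print(nyquist, f_apparent)  # the 25kHz tone appears at 44100 - 25000 = 19100 Hz
```

So rather than an inaudible channel you get a very audible 19.1kHz squeal - which is also why the pets go wild.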
Any computer trying to converse with ultrasonic sounds would drive any pets in the room wild.
On the other hand, the idea that you could have a clandestine channel to an air gapped machine does have merit. So long as you are prepared to put up with a low bit rate there are quite realisable ways of doing it with the on board audio. Ultrasonics is simply naive. There is more than enough horsepower in a modern machine to use sub-noise techniques that would be robust and essentially undetectable, and thus quite useful here. Again, it is Shannon that shows you how.
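To put a rough number on the Shannon point - the figures below (full audio band, a signal 20dB below the noise floor) are illustrative assumptions, not a real channel design:

```python
import math

# Shannon's capacity theorem: C = B * log2(1 + S/N). Even a signal buried
# well below the noise floor (negative SNR in dB) supports a non-zero bit
# rate, which is the basis of spread-spectrum "sub-noise" signalling.
def capacity_bps(bandwidth_hz, snr_db):
    snr = 10 ** (snr_db / 10)           # convert dB to a power ratio
    return bandwidth_hz * math.log2(1 + snr)

# A hypothetical covert channel hidden 20dB *below* the noise, spread
# across the full 20kHz audio band:
print(round(capacity_bps(20_000, -20)))  # ~287 bits per second
```

Hundreds of bits per second is ample for exfiltrating keys or commands, and nothing in the room hears a thing.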
Precisely the incentive that was supposed to be there and agreed to when the company participated in the standards process in the first place. Turning around after the event, once the standards are set, and demanding extortionate fees, or refusing to license at all, is breaking the entire model of standards.
A company is free not to participate in a standard, in which case the standard will likely be set without their IP. At which point they can try to make what use of their IP they may wish, without it being part of the standard. However, if they want to reap the rewards of a much larger market, as tends to be established by a new standard, they have to play by the rules, and in joining the standard they agree not to gouge.
There is nothing new here, and no change in incentive about joining standards. Just an enforcing of well accepted and usually uncontentious rules.
It's spring here, which means lovely average temperatures. The trouble is, Adelaide in spring uses PWM for the weather. It was 31 here today. 19 tomorrow. Average of 25 - lovely only if you are a statistician or meteorologist.
Re: Separate CV's into 2 piles arbitrarily
I think the fact they are applying for a helpdesk position answers the luck question already.
The lucky ones went in the bin.
It is just possible even the cheap coloured iPhone might boast a ceramic case. Or one with a ceramic coating. Pretty high tech, but vastly nicer than a simple plastic. That might account for the scratch resistance. (Titanium Nitride is technically also ceramic, and is the nice gold colour you see on some drill bits. On the higher priced iPhone that could be worth having, even if it is gold.)
The device in the video had Apple and iPhone branding, and I don't think Apple are exactly pleased if these appear on third party products, even cases for iPhones.
Directly addressed flash?
I doubt this, but it is worth mentioning.
Since Apple do control the hardware and OS, and have a significant hand in the design of the CPU itself, it isn't impossible for them to start exploring less conventional architectures. Nuking the filesystem and replacing it with a persistent object store that is managed by directly addressing its contents would be a great thing to do. That would require 64 bit addressing now. They did have a system that worked a bit like this once - it was called the Newton.
Like I said, I very much doubt it, but I continue to nurture the hope that with the huge ecosystem of hardware and software design now under the Apple banner, they will start to innovate past the current typical architectures.
Baba O'Riley of course owes a clearly acknowledged debt to A Rainbow in Curved Air by Terry Riley, which is echoed in the opening synthesiser riff, and to Meher Baba, who significantly influenced Pete Townshend. It is a deep song, and goes well past "teenage wasteland".
And, I just wasted five minutes of my life I won't get back again, and listened to the new song. The opening is quite clearly Baba O'Riley, it isn't a coincidence, it even has a hint of the Terry Riley atonal synthesiser riff in the background. It is only a few bars, and if anything I would say that it is a homage to The Who more than anything else. Whoever wrote it knew exactly what they were doing. The rest of the song is mostly a ripoff of the 80's big hair band anthemic songs.
What it might do is get a few yoof of today listening to The Who, ones that had never heard them before. That can't be a bad thing.
Just to be clear - the modified boats are AC45 class - and the contest they sail in is NOT the Americas Cup. They do however have a very close relationship with the cup, and were designed, and the series they sail in pushed, by Oracle as a support act for the main game. What is at stake is both the reputation of Oracle Racing, and a much more critical and currently open threat.
Offences under rules 60 of the AC and 69 of the rules of sailing are under consideration. A rule 69 offence is not trivial. It would be heard by the ISAF, and could in principle, involve a ban of the offending people. In the worst case Oracle racing could be handed a multi-year ban on racing. (I don't think anyone realistically expects this, but the possibility is open.) That would implicitly cause them to lose the cup. So, whereas the lead weights are on an AC45, the consequences could cascade to the main game.
The usual questions are open - who knew and when. Prior to the race series in which the weights were found, the boats were sailed by the key Oracle sailors. If they received a ban it would cripple the Oracle team.
No, Oracle capsized with no loss of crew, but essentially total loss of the boat. It was Artemis that broke a beam and broke up, trapping Andrew Simpson underneath.
Won't work. The deed of gift gives the challengers quite enough leeway to drag you through the courts (of New York) and make you run the event in the sea. Indeed the New York courts are the effective custodian of the cup and final arbiter on how it is run.
This little debacle is going to be interesting.
Bitcoin fanatics should realise that this is a two edged issue.
1. Come use Bitcoin, it isn't money, isn't taxable, and isn't subject to any government's law. Oh yeah, because of that, if you get defrauded, or otherwise ripped off, you have no legal protection. You won't see your (not) money again.
2. Come and use Bitcoin. It is safe, because if you get ripped off or defrauded, the perp can go to gaol. Oh, yeah, that same government that sends him to gaol wants to have a word about tax.
There is no government on the planet that does not assume, right now, that it has a say in Bitcoin use. Just because the use is so low level that it isn't worth the effort does not mean Bitcoin is somehow home free. As evidenced by this little fraud case, there is usually no new legislation needed anyway. Money, tax, and fraud have a very very long and inglorious history. Any idea that Bitcoin is somehow brilliantly novel enough to get past this is sadly naive. (It is about on the same level as teenagers who somehow think they invented sex, and no-one was ever doing it before.)
Correct decision, even if the taint lingers
Although the taint of local favouritism will be impossible to extinguish, and if it were Apple attempting to ban Samsung imports many would suspect the same decision would not have been reached, the actual decision is the right one.
There is a tiny spark of sanity here. Any reform of the current lunacy in the patents system is a good thing, and nixing the thermonuclear options is a good start. It sets a good precedent. Next time it might be Samsung battling against Apple banning their products.
The absolute distance doesn't matter - what matters are the changes over time. So long as the unit on Mars doesn't move it doesn't matter if it is on a rock or a mountain. Mars rotates, so the transponder on its surface will be moving at quite a significant rate, as is the earthbound end as the Earth rotates. Lots of fun compensating for all of that. But all manageable, and in the end you get to measure the precise orbital mechanics, and possibly a number of the relativistic influences. Which is probably the main point.
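A back-of-envelope sketch of the rotation speeds involved (equatorial worst case, textbook planetary figures, and an X-band carrier assumed purely for illustration):

```python
import math

# Rough surface speeds of the two ends of the link due to planetary
# rotation - the motion the ranging system has to compensate for.
def equatorial_speed(radius_m, sidereal_day_s):
    return 2 * math.pi * radius_m / sidereal_day_s

v_mars  = equatorial_speed(3_389_500, 88_642)   # Mars: roughly 240 m/s
v_earth = equatorial_speed(6_378_137, 86_164)   # Earth: roughly 465 m/s

# Worst-case Doppler shift on a hypothetical X-band (8.4GHz) carrier,
# with both motions adding along the line of sight:
doppler_hz = 8.4e9 * (v_mars + v_earth) / 3.0e8
print(round(v_mars), round(v_earth), round(doppler_hz))
```

Call it the better part of 20kHz of Doppler swing from rotation alone, before the orbital velocities even enter the picture - all of which is routinely handled, as it is in deep space tracking today.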
The whole shebang is not that far off how GPS works now.
Re: IT angle?
Upvote from me. Exactly the point. Seems half of the above commentators have trouble with reading comprehension. This isn't exactly helped by the bizarre way the article is written.
And what constitutes electronics? You have a battery and a switch, which is already electrical. Some sort of arbitrary dislike of semiconductors? The item most likely to fail is the battery, followed by mechanical devices freezing.
Going to end badly
I can't see that this is a good idea for either Dell or MS. MS will get a seat on the board, which is enough to have real influence. History has shown that companies where MS gets serious influence make bad decisions on the strength of MS's promises, and get shafted later when those decisions turn out badly. MS have a very bad reputation for keeping faith.
MS can't offer anything to Dell that makes this attractive. Early access to technology for MS isn't exactly going to be of earth shattering importance, MS have done very little new or innovative for a very long time. But such access has the threat of stifling innovation and business agility at Dell. The downside for MS is that they are again getting into the position of starting to compete with their customers. A cosy relationship with Dell will almost certainly add to the perception of all the other PC manufacturers that, whilst they can't do much about the need to purchase Windows for PCs, they probably want to actively consider every possible option for any other product they make. And right now that means Android. Maybe MS thinks that they simply can't fight Android in the mass market and will be content with only a couple of big manufacturers (Dell and Nokia). This puts MS on a trajectory to buy these two and to compete with Apple head to head in this space. This could end as badly as iPod versus Zune, and if I were a betting man, that is where I would put my money.
And this was where? The National Enquirer? World Weekly News?
NASA isn't a person, it doesn't make decisions like this, mostly because it can't. It is a huge, bureaucratic, risk averse, political machine, with 18,000 employees, and up to 300,000 including contractors. (Which would include contractors like United Space Alliance - who were largely responsible for shuttle operations.) You don't keep the lid on a cynical, strange, and stupid idea like this in a structure like NASA.
The rule of conspiracies applies. Never ascribe to conspiracy that which can adequately be explained by incompetence. NASA had more than enough managerial incompetence to cover disaster conspiracies many times over. Sad truth is that they simply didn't have a clue there was a problem. There had been foam strikes before, they had already made a decision long before to degrade the issue to one that was not flight critical, and they thought that given they had seen it all before, and got away with it, things would be no different this time. So they went home for the weekend.
Re: Not just the foam strike
Not that I know of.
Re: In reality nothing could have been done
The investigation panel showed how, if the damage had been taken seriously, it would have been possible to put an astronaut in a position to see the damage and to assess it. There would have been time to do this. At this point there would have been an unequivocal need for drastic action. They suggested that stuffing the hole with a selection of on-board materials and changing the entry profile may have been enough to save the crew, if not the orbiter.
Whilst there is some truth that NASA is very politically directed, they would know that loss of the orbiter would inevitably lead to a congressional investigation where every email, phone call, and every tiny bit of physical evidence and documentation would have been worked through. Once the foam struck the die was cast. They were going to lose the shuttle programme if they lost the orbiter. Senior managers would have known this. In part, where NASA failed is that senior management didn't know there was even the slightest hint of a problem. The internal culture simply didn't allow for there to be one.
Not just the foam strike
There were a great many lessons in the Columbia disaster. Whilst el Reg provides a nice write up of the basic reason, taking time to look at why it could happen, as well as what happened, would be worthwhile.
The investigation uncovered a huge number of flaws in management of the shuttle programme. It wasn't just that NASA lost a second shuttle that set in motion the retirement of the fleet, but that NASA manifestly was not able to show that it was up to the task of managing the programme. It was clear that NASA would never be able to get the shuttle programme past losing one in every 50 flights. Some of this stemmed from inherent defects in the shuttle's design, many of which were inflicted on NASA due to the politics and budget cuts in the 70's, but a great deal from issues in NASA's internal culture.
Mission rules required that the ground control team provided constant oversight of the mission. Yet there was so little concern about the state of play that the mission controller gave the team the weekend off. Both violating mission rules, and evidencing the total lack of interest in the foam strike.
Whilst the foam strike was always the prime suspect in the loss of the orbiter, there were other very serious engineering flaws uncovered. The investigation spent some time specifically looking at NASA's processes, and specifically criticised its "broken safety culture." The external tank manufacturer had been so tightened up financially that the position of manager of a particular part of manufacture, and the position of safety and quality control for the same part, were occupied by the same person. Yet no-one seemed to realise the fundamental conflict and inevitable loss of safety this would bring. Ultimately NASA was shown to have not learnt any lessons from the loss of the Challenger. The same hubris, and culture of "we got away with it last time", that doomed that craft also doomed Columbia. The issue of foam strike was degraded from a flight critical one - where under the original rules for the orbiter this was a non-negotiable flaw that would have led to instant grounding of the fleet until resolved. It was let slide to the point that it was considered a regular "problem" that they would ultimately sort out, and not considered serious enough to impact flight. An identical mindset to the one they had for the SRB O-ring seals that doomed Challenger.
The report on the disaster is worth reading from cover to cover. Whilst there is a nice story of forensic engineering, the real story is in the surrounding culture, and the question of just how and why it was allowed to happen.
"I’d shut it down and give the money back to the shareholders."
Half way there.
Re: She can promise anything
"possibly the only thing that would get her re-elected is....actually I can't think of anything."
I can. He is called Tony Abbott. We have the absurd situation where the single biggest electoral asset that Julia has is Tony, and his biggest asset is Julia. Both are probably scared stupid that the other party will have a leadership spill. Perhaps the politicians we get we deserve, but what desperate mortal sin are we guilty of to deserve this mob of idiots?
Re: You'd think there would be a vegetation free zone around this expensive sensitive equipment
If you look at the satellite pics there is a reasonable amount of space around most of the instruments, all except the AAT which does get a bit close to the trees. However a lot of the support buildings do not have much of a gap, and it seems from the news that they have lost a lot of these buildings.
I tend to support the original sentiment - there wasn't enough clear ground. The site is right on the top of the hill, so a clear area that doesn't bring the fire front right to the doorstep of the facility is possible, and for the main cluster of instruments this was both done and it worked. It is the fire front that does the damage. There is always the risk that blown embers will ignite a building, but the telescope facilities are mostly metal domes, and won't catch fire easily. A full force fire front however will melt them in place. In this respect distance is the only hope. It is quite possible that it was a simple ember that took out the support buildings anyway. It is a very common way to lose a building, and will sometimes take out a building well after the fire front has passed.
I have both been to Siding Spring, and experienced Oz bushfires first hand. When I visited it was very different; I have a pic of the UK Schmidt telescope dome enveloped in cloud.
The Microsoft guys negotiating the deal really don't care in the slightest about the level of discount that can be calculated against a per seat price. It was a given that they were going to sell an all of department license. The only question was what the maximum amount of money they could extract from the DoD was. That number was probably not too hard to discover. Then all they do is work on convincing the DoD to hand it over.
The DoD's job is to muddy the waters and convince MS that the DoD really have much less money to spend, and get MS to latch onto a goal price that is actually lower than it is. Given the number of ex-DoD consultants that MS could engage to help, I suspect the whip hand is actually Microsoft's, and not the DoD's. But it is always good to let the loser save face. A press release from the DoD making themselves look good is a small price for MS to pay for extracting that last 100 million from the DoD.
Predictable but important
If this was a few years ago and the university announced laptops running Windows all round, there would have been hardly an eyeblink. For better or worse, the default platform for tablets is iOS. Perhaps hard for the hardcore Apple fanboi and anti-fanboi equally to stomach, but Apple/iOS is the Microsoft/Windows of the tablet world. TCO of a single OS, single hardware platform, plus the existing tools for content creation (nobody mentioned iBooks Author, yet it is certain to be a key part of the case for iPad) is going to make a very compelling case for rolling out iPad. The real questions are much much harder than deciding to go with iPad.
There is a very clear tsunami rolling across the oceans of higher education right now. Most universities know it is coming, but I doubt anyone actually understands what will really happen, or what the right answer is. But the traditional university teaching modes of lectures, tutorials and practicals, plus exams, are obsolete. What nobody knows is what the right replacement is. Access to very high quality teaching material from the likes of MIT, free on the internet, plus access to a wealth of other information that previously would have required hours a day in the library clearly outpaces the current model. But we should be able to do vastly better than this. Whether this means universities cynically reducing costs whilst maintaining a bare minimum education standard, or driving towards real improvements in outcomes and maintaining the current funding, that is a political matter. But not pushing for change is derelict.
I do however suspect that the UWS rollout is probably ill conceived. Content creation is not going to happen overnight. Indeed I would consider that there should have been a two year lead time for the training of lecturers in content creation, and time to actually create the content, review it, rework it, and only then roll it out to the first wave of students in the third year of the programme. Expecting the academics to be fully embracing the tools in time for a first semester delivery to the students is going to yield nothing more than PowerPoint slides of last year's lectures available on-line. Something that will provide exactly zero improvement on the current regime. Freed of the need to actually listen in lectures students will spend the hours idly viewing Facebook and messaging their friends across the lecture theatre. It can, and should be, much better; but I bet it won't be.
If it takes a 17 point improvement to win the category, it would seem that Ballmer could win it four times in a row and still not make the top ranking for actually doing a good job, rather than just a better one. That really is starting from a low base.
Re: Total bollocks from el Reg
Sadly true. Seven thumbs down and counting. One assumes that the down-voters also lack the technical ability to understand what I wrote. I have come to the conclusion that there is a core group that will down-vote any comment that does not actually slam Apple, and even comments that are neutral to Apple will attract their down-vote. The stream of comments to this article suggests that many see it as simply a forum for Apple bashing and nothing more, which rather reinforces this view. It is becoming no better than YouTube comments.
Total bollocks from el Reg
Seriously, we get two articles on this patent in one day on el Reg, and it appears that in neither case have the authors actually bothered to read the patent, or if they have, they lack the technical competence to understand it. What we do get is the now very tired Apple bashing fest that is fast making technical commentary from el Reg on anything to do with Apple essentially worthless. This is sad, there was a time where el Reg was actually worth reading for such commentary. It no longer is.
1. The patent does not attempt to patent near field charging. Got that? Really it doesn't. The title alone should be a giveaway: "Wireless power utilization in a local computing environment" Note the bit about "utilization." It is a patent on how to use wireless power in an innovative manner.
2. The innovative bit about the patent is the re-radiating of power from one device to another, and a protocol for controlling this. Go down to the claims section and have a look. The claims are where the actual meat of what is patented is. The stuff earlier is explanation, it isn't what is claimed for patent cover. Indeed the rules of patents require that you cover any earlier contributing technology. If you see something in a patent that you have heard before, it is there, not because of some nefarious attempt to re-patent existing technology, but due to a requirement to place the new work in the context of what has gone before. Not doing this can cause the patent application to fail. Note that you can't be expected to cite provisional applications from competitors - they are secret until the patent is approved.
Seriously, this article is so bad it should be deleted. It is an embarrassment to both el Reg and the author.
There are a couple of fundamental errors in the design and assumptions here. Sadly they pretty much nullify what has been done.
There are three sources of heat loss - radiation, conduction, convection. The design and tests have not addressed these correctly.
Radiative losses are independent of atmosphere - they remain essentially identical in a vacuum or at sea level (for those wavelengths that the atmosphere is transparent to - which are those that matter here.) Heat loss due to radiation won't be noticeably less at altitude.
Heat losses by conduction through air are independent of pressure until the mean free path is longer than the distance between objects. For the dimensions and pressures of this project you can assume that conduction remains about the same. Use of an aerogel insulator would help significantly here.
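The mean free path claim is easy to check with kinetic theory - a sketch, using a rough effective molecular diameter for air and a cold-stratosphere temperature as assumptions:

```python
import math

# Mean free path of air molecules: conduction through a gas only starts to
# drop off once this exceeds the gap between the surfaces. Even at 1% of
# sea-level pressure it is still only microns - far below mm-scale gaps.
K_B = 1.380649e-23   # Boltzmann constant, J/K
D   = 3.7e-10        # rough effective diameter of an air molecule, m

def mean_free_path(pressure_pa, temp_k=230.0):
    """Kinetic-theory mean free path: kT / (sqrt(2) * pi * d^2 * P)."""
    return K_B * temp_k / (math.sqrt(2) * math.pi * D**2 * pressure_pa)

sea_level = mean_free_path(101_325)         # tens of nanometres
altitude  = mean_free_path(101_325 * 0.01)  # a few microns - still << mm gaps
print(sea_level, altitude)
```

So for any gap you could actually build into this airframe, conductive loss through the air is essentially unchanged at altitude.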
Convection might matter. Even at 0.01 of sea level pressure, the air can move, and thus can convey heat from the motor to the body of the aircraft and thus to the outside. However the vastly lower pressure reduces the heat capacity of the air equivalently, so the energy moved reduces considerably. You may need to consider how to prevent convection cells of air forming. Making sure the cells of air are small (where small is a few mm) is the way to do this. Aerogel is good here too.
The critical one is radiation. Space blanket only provides useful insulation against radiative losses. It does this in two ways: it reflects radiation back to the source, and being a highly reflective material, it radiates heat very badly, and so does not lose heat by itself radiating energy. In order to work it must not be in contact with anything - it must have a clear space around it. Sandwiching it between two layers nullifies its entire function.
To use space blanket you must wrap the outside of the assembly very loosely - possibly in more than one layer, with a minimum of contact points between the rocket motor and the blanket, and if more than one layer minimal contact points between the layers. If you want an example of how it is done, look at a picture of the Apollo Lunar lander's legs. Indeed check out any picture of spacecraft and observe how the blanket is arrayed. Multiple loose layers of blanket will trap small cells of air, and thus also effect a reduction of convection.
As mentioned above, you won't know how well the system performs until you test it properly, and this means into the baro-chamber and packed with dry ice for an extended soak. It is worth applying a bit of basic physics here too. You know the energy drain of the heater - power = volts times amps. You can work out the thermal capacity of the motor - (so many grams of aluminium, so many grams of propellant - or use a surrogate of similar known material) thus you know how the temperature of the motor should rise with time when the heater is energised. You can compare the observed temperature with the ideal case, and work out the thermal losses. You may discover that a carefully insulated motor will not require a heater, or if it does, you can work out the minimum heater current required, and appropriately size the power source.
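As a worked sketch of that budget - every number below is an illustrative assumption, not the project's real masses or heater spec:

```python
# Back-of-envelope heater budget for the motor, as described above.
# All figures here are invented for illustration.

volts, amps = 12.0, 0.5
power_w = volts * amps                # power = volts times amps = 6W

# Thermal capacity: so many grams of aluminium casing, so many of propellant
m_alu, c_alu = 0.150, 897.0           # kg, J/(kg*K) - aluminium
m_prop, c_prop = 0.100, 1200.0        # kg, J/(kg*K) - surrogate figure
heat_capacity = m_alu * c_alu + m_prop * c_prop   # J/K

# Ideal (lossless) warming rate; comparing this against the observed rate
# in the baro-chamber gives you the thermal losses of the insulation.
rate_k_per_min = power_w / heat_capacity * 60
print(power_w, rate_k_per_min)
```

If the observed warming rate is well below the ideal one, the difference is your insulation leak, and it sizes the minimum heater (or tells you no heater is needed at all).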
A layer of aerogel then a couple of loose layers of space blanket and I very much doubt a heater would be needed. Insulating the system batteries would similarly benefit - enough that with internal losses naturally heating the batteries you may obviate any thermal problems in the batteries.
Missing the point
The idea that Apple must find a Steve clone, and that even an arsehole clone is better than no clone is fundamentally flawed. Steve is gone. There is nothing that can be done to change that. All great companies have great leadership, but the nature of that leadership changes fundamentally with changes in the leader. An attempt to maintain the past by emulating a few external attributes of the now gone leader is no better than a cargo cult. Building replicas of planes out of straw does not make the real planes arrive, and being an arsehole does not a visionary make.
Apple's senior management have a seriously difficult task ahead of them. There will never be another Steve. One reason Steve got away with being who he was, was that he was one of the planet's most wealthy people. He didn't need to do the job. He wasn't a career executive coveting the CEO position and its pay-cheque. He founded the company, and still owned a goodly slice of it. The only answer is to recognise this. Apple can't succeed by trying to emulate Steve's management. They do need to take serious note of what was good that he brought to Apple, try to distil it, and ensure it remains in the company DNA, and then find managers who recognise what it is that makes Apple Apple, and who will continue it. The huge dangers are that they become paralysed, or succumb to ego-driven infighting. Guiding the company down this path is Tim Cook's job. A successful Apple will not be the same Apple as when Steve ran it. It may be better, but it can't ever be the same, and attempts to keep it so will doom it.
Re: Is it just me?
The sticking point seems to be this:
"Adkins had been seconded by Dockwise from another company, Cadenza Management, which was actually his employer"
You can be certain that Cadenza Management had had Adkins sign the usual email clause with them. But that isn't the same as him signing with Fairstar Heavy Transport, even though he was working as their CEO. So Dockwise sue Cadenza Management to get access to one of Cadenza's employees' emails. That gets pretty weird. If you are a contractor for a company, they don't automatically get access to your email account.
Could be interesting
What Apple could, and IMHO should, do is explore much more interesting possibilities in processor design. The x86 chip is fine as far as it goes, and ARM is all well and good, and nice for low power, but neither offers anything more than the most boring and basic functionality. Computer architecture has gone backwards for decades. Right now, the raw speed of an individual CPU is no longer the prime issue. Ever since Apple bought PA Semi I have wondered if they might do something really interesting - where interesting involves taking some advanced architecture ideas and running with them. The one that I would love to see: tagged memory. Adding tags to memory can provide hardware differentiation of addresses and data - instant pointer management, and with it a major step towards secure systems. Also add a full/empty tag, which provides for intrinsic synchronisation in memory, and with it support for fine-grained concurrency. These are not new ideas - look back to the Tera MTA for one example. But you could go a very long way back to see lots of additions that can provide for secure systems and parallel code support.
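The full/empty tag idea can be sketched in software. This is a minimal illustration in the spirit of the Tera MTA's synchronising loads and stores - the class and method names are my own invention, not any real ISA:

```python
# Sketch of a memory word carrying a full/empty tag: a store waits for
# EMPTY then sets FULL; a load waits for FULL then sets EMPTY. In
# hardware this gives fine-grained producer/consumer synchronisation
# with no explicit locks in the program.
import threading

class TaggedWord:
    """A single memory word with a full/empty synchronisation tag."""
    def __init__(self):
        self._value = None
        self._full = False
        self._cond = threading.Condition()

    def write_ef(self, value):
        """Store: wait for EMPTY, write the value, set FULL."""
        with self._cond:
            self._cond.wait_for(lambda: not self._full)
            self._value = value
            self._full = True
            self._cond.notify_all()

    def read_fe(self):
        """Load: wait for FULL, read the value, set EMPTY."""
        with self._cond:
            self._cond.wait_for(lambda: self._full)
            self._full = False
            self._cond.notify_all()
            return self._value

word = TaggedWord()
results = []
consumer = threading.Thread(target=lambda: results.append(word.read_fe()))
consumer.start()          # blocks: the word is still EMPTY
word.write_ef(42)         # consumer unblocks the moment the tag goes FULL
consumer.join()
print(results)            # [42]
```

In hardware the tag check costs nothing extra per access, which is what makes the scheme attractive for fine-grained parallelism.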
It would be a brave move, but if you look across the industry, the only company in a position to make a break from the ossified architectures we currently use is Apple. Worrying about Windows compatibility just repeats the mistakes that keep things bogged down. Linux is so conservative in its internals (no bad thing, it is just important to understand this) that it won't be able to aid any such progress.
What goes around
Water/liquid cooling always used to add quite significantly to the cost - almost doubling the cost of a machine. Cray used to do the T3E in both water- and air-cooled versions for this reason. The water-cooled version packed a lot more processors into the same cabinet footprint, and was the only way to get really big configurations. At the same time Thinking Machines made a big thing about the cost effectiveness of their air-cooled designs. Considering the prevalence of water cooling for gaming machines, at least some of the components should be pretty cheap now. But anything custom or low volume is going to be a problem.
The Cray 1 was also water cooled, but with conventional cold plates. The Cray 2 was famous for being liquid-immersion (in Freon) cooled. You had to drain the Freon out of the cabinet to perform any work on the machine. Cray's factory dumped the excess heat into a large pond outside the building.
Not really unusual
I have seen similar things happen with other service providers. The root cause was a cancelled credit card which triggered an automatic flagging of the account as being used fraudulently. The card was actually legit, but had been cancelled for other reasons - however it was pretty easy to see why the provider would put two and two together and assume that the card had been stolen and later cancelled by the rightful owner. The next step was less sensible. They then looked in their database and decided that some other accounts that were apparently linked (by IP address) were also therefore fraudulent, and cancelled the lot. Took ages to sort out.
If it is something like this, Amazon will not be forthcoming with an explanation, as it might reveal something about their internal fraud detection policies. Even if the rules are stupid, they won't reveal them.
Re: and Apple? - how about an edit botton????
The usual mechanism is to disable edits as soon as anyone replies to the post, or after a short timeout. It isn't as if this is a new problem or hasn't been solved many times before.
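The usual rule can be sketched in a few lines. The 5-minute grace window below is an assumption for illustration, not any particular forum's actual setting:

```python
# Sketch of the standard edit-lock rule: a post becomes uneditable
# once anyone replies to it, or once a short grace period expires.
import time

EDIT_WINDOW_SECS = 5 * 60  # assumed grace period

def can_edit(posted_at, reply_count, now=None):
    now = time.time() if now is None else now
    if reply_count > 0:
        return False                      # a reply freezes the post
    return (now - posted_at) < EDIT_WINDOW_SECS

t0 = 1_000_000.0
print(can_edit(t0, reply_count=0, now=t0 + 60))    # True: fresh, no replies
print(can_edit(t0, reply_count=1, now=t0 + 60))    # False: someone replied
print(can_edit(t0, reply_count=0, now=t0 + 600))   # False: window expired
```

Freezing on first reply is the important half: it stops an edit from retroactively making the replies look nonsensical.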
bUtton that was.
Re: and Apple? - how about an edit botton????
Owning such a company isn't something we would associate with Microsoft generally, but they do, and their use of the technology for their mapping application does bring into contrast the differing attitudes of our big technical companies.
IBM famously runs leading-edge research, and even boasts the odd Nobel prize. Google isn't so focussed, but isn't averse to bleeding-edge and oddball efforts, including hardware, and spends pretty big. Microsoft spends pretty big too, although it seems to have an inbuilt ability to ignore most home-grown good stuff (rather like Xerox). Which leaves us with Apple - a point that rankles with this self-confessed fanboi (and owner of many an iProduct). Apple underspends on research, spending much less than the industry average, and vastly less than the preceding companies. One might guess this is why they have been manifestly unable to get their heads around new, previously non-core product technologies - like maps.
Microsoft and Apple are well known for buying in technology, MS more than anyone. Which works fine if it is software, and the technology is thus of the same flavour as your core capabilities. But it takes something more if you are building a major new capability. This is what is worrying about Apple's Maps. Until we see the same sort of innovation as we see with Google and MS - actually doing research and pushing the edge in more than just software - we are not going to see a product capable of competing.
Not useful for a head up display
Whenever a display technology that can layer on glass is mentioned, the idea that it could be used for a head up display comes soon after. But it won't work. The key part of a head up display isn't the display technology - it is the optics that allow the display to appear in your field of view in focus. Try driving down the road whilst your eyes are focussed on a streak of grime on the windscreen. The road is out of focus. You can't focus on a display laid over the windscreen and also drive. A head up display uses a set of lenses to focus the image of the display device at infinity, so it appears in focus whilst your eyes focus on the road.
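The focus-at-infinity point follows directly from the thin-lens equation, 1/f = 1/d_o + 1/d_i: place the display at the lens's focal plane and the virtual image forms at infinity. The focal length and display distances below are illustrative numbers only:

```python
# Why a HUD needs collimating optics, via the thin-lens equation.
# 1/f = 1/d_o + 1/d_i  =>  d_i = 1 / (1/f - 1/d_o)
# When the display sits at the focal plane (d_o = f), the image
# distance diverges to infinity, so the eye can keep the road and
# the symbology in focus simultaneously.

def image_distance(f_mm, d_obj_mm):
    inv = 1.0 / f_mm - 1.0 / d_obj_mm
    return float('inf') if inv == 0 else 1.0 / inv

f = 100.0  # focal length of the collimating lens, mm (assumed)
for d in (200.0, 150.0, 110.0, 100.0):
    print(f"display at {d:5.1f} mm -> image at {image_distance(f, d):.1f} mm")
# As d_o approaches f the image distance grows without bound;
# a display glued to the windscreen has no such optics, which is
# why you cannot focus on it and the road at the same time.
```
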
Re: The problem is ALWAYS the PEOPLE.
I have the feeling that Apple's map problems have nothing to do with software, and everything to do with data curation. If Google have 7,000 people working on maps, you can bet that 95% of them are doing nothing but staring at GIS applications and hand-tweaking the map data. Sadly this isn't work fit for human beings, and will pay about as well. The best-quality software on the planet won't replace low-resolution satellite images with high-resolution photographs taken from airplanes, or magically fix out-of-date business information. The only way to sort this out takes lots of effort, money, and time.
See my post above. I thought this too for a while, but a little digging and you find that "iPod Out" is not the analog audio output. It is the special iPod emulation mode iPhones have. I'm reasonably sure that iPod Out requires video out - this is how it displays a virtual iPod on a car's touch screen. (Which a car that supports iPod Out has - iPod Out having been developed in conjunction with BMW.) Apple really should be much more clear on their web page.
Maybe much more complex
It looks a bit more complex than Apple have actually said, and Apple have not been smart in dispelling the confusion.
There is a lot of talk about the loss of line-level audio out, and the Apple web page says that iPod Out is not supported on the adaptors, leading many to assume they mean no line-level audio out. Which it seems isn't the case. iPod Out is a mode where an iPhone or iPod Touch emulates an iPod in a manner that allows really nice integration into car audio systems: it actually displays an iPod control screen, complete with album art, on the in-car system. It is this that doesn't work. Assuming they are actually supporting line-level audio out, the adaptors at least include a DAC, so it isn't just a connector. The adaptor probably contains more than this too.
The iDevices have never supported S/PDIF, and I very much doubt they will start now. Those docks that do support it have licensed a special USB chip from Apple that allows access to the internal digital audio stream. I doubt Apple will be giving up that control.
I suspect we are going to see some later technical descriptions about the Lightning interface, but Apple have let slip a few things, and a look at some of the issues with USB makes these make more sense. Apple say it is an 8-signal interface. Which is already interesting. USB 2 uses two signals (D+ and D-) and USB 3 adds four more (SuperSpeed TX +/- and RX +/-). The remaining two signals may be Apple simply keeping the old serial interface, or they may have done something much more interesting: the Lightning interface may not be USB at all, and the adaptor may contain a USB interface chip as well as a DAC.
The plug is double sided, and I think everyone has assumed that because it can be inserted either way up, although it has 16 physical pins, they are simply 8 electrical pins duplicated. This may not be true. If the socket has only 8 pins, sure, but if the socket has 16 pins we may see some slick use of differential signalling and symmetry allowing four pairs of differential signalling pins, plus power, ground, and maybe power output for accessories. Apple have explicitly said 8 signal pins - so the question of where power and ground come from needs answering anyway.
Apple will want to future proof this for some time, so a range of things are possible. In a decade's time our expectations of what can be done on the connection interface, and indeed what we expect from our smart pocket device may be significantly more extended than we imagine now. Indeed, have a look at Thunderbolt. Cut out the two low speed signalling lines and a few redundant ground pins and it would fit. Who knows? The name is tantalising.
Yes, I expect that the vast majority of commentators on this will make the mistake of claiming that the uncertainty principle is in doubt, and totally miss what has actually been claimed.
From the first linked article:
"It is often assumed that Heisenberg's uncertainty principle applies to both the intrinsic uncertainty that a quantum system must possess, as well as to measurements. These results show that this is not the case and demonstrate the degree of precision that can be achieved with weak-measurement techniques."
The experiment addresses the phrase: " as well as to measurements." The intrinsic uncertainty remains.