10 posts • joined Tuesday 18th December 2007 19:07 GMT
Re: Didn't think I'd ever say this - but why bother with lasers?
augmenting part of your visual area is still "augmented reality glasses"...
Thanks for the interesting article
Yes... I don't doubt that the centre of gravity still falls as expected (since the top can fall faster than free fall, and the bottom of the spring slower than free fall...)
I'm starting to think that (at the instant after release) each "ring" is still holding up the ring below, i.e. that each small component in the chain is effectively still in equilibrium: each bundle of springy atoms is still holding up the next bundle of springy atoms, and the really slow propagation inherent to the slinky means that the release of tension takes ages to travel down.
After all, if you're skydiving, you can still push and pull on objects, regardless of the fact that you're not anchored to anything.
Re: @John Latham / xyz Seems overcomplicated
It's not the same, no.
if it's sat on a table horizontally and stretched, then it will spring towards its centre again when released.
If it's hanging, then the bottom rings are in equilibrium between the "up" pull and gravity. If you turned off gravity, it would spring up from the bottom, much like when sat on the table.

But they're not switching off gravity, they're releasing the top. So you might expect that at the instant the top is released, the bottom (which still has the same pull of gravity downward, but, you'd think, less force pulling it up to balance that gravity) should start to fall immediately.

I'm not pretending I understand this fully, but it intuitively makes sense to me that the top rings, which were always being pulled by gravity and have now lost the balancing force provided by being tethered, would spring down at a faster rate than the bottom rings, which fall under gravity alone...
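One rough way to check this intuition is to simulate the slinky as a chain of point masses joined by ideal springs and then let go of the top. This is only a sketch: the ring count, ring mass and spring constant below are made-up numbers, and the springs are given zero natural length, but it shows the bottom staying put while the loss of tension propagates down.

```python
# Sketch: a hanging slinky as N point masses joined by ideal springs
# (zero natural length). All parameters are illustrative guesses.
N = 50        # number of "rings"
m = 0.005     # mass per ring, kg (assumed)
k = 50.0      # spring constant per link, N/m (assumed)
g = 9.81      # gravity, m/s^2
dt = 1e-5     # time step, s

# Equilibrium positions while hanging: the spring above ring i carries
# the weight of the (N - i) rings from i downwards, so it stretches by
# (N - i) * m * g / k. y is measured upwards, top ring at 0.
y = [0.0]
for i in range(1, N):
    y.append(y[-1] - (N - i) * m * g / k)
y0 = y[:]               # remember the starting positions
v = [0.0] * N

# Release the top and integrate for 20 ms (semi-implicit Euler).
for step in range(2000):
    a = []
    for i in range(N):
        f = -m * g                        # gravity
        if i > 0:                         # stretched spring above pulls up
            f += k * (y[i - 1] - y[i])
        if i < N - 1:                     # stretched spring below pulls down
            f -= k * (y[i] - y[i + 1])
        a.append(f / m)
    for i in range(N):
        v[i] += a[i] * dt
        y[i] += v[i] * dt

print(f"top fell    {y0[0] - y[0]:.4f} m")
print(f"bottom fell {y0[-1] - y[-1]:.6f} m")
```

In this toy model the top falls orders of magnitude further than the bottom during the first 20 ms (the bottom barely moves at all until the wave reaches it), while the centre of mass of the whole chain falls at exactly g throughout, consistent with both intuitions above.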
After being initially impressed that Microsoft had produced a functional, attractive AND sleek operating system, we stumbled upon annoyance after annoyance...
Old versions of certain programs not working, certain devices lacking drivers, some Vista drivers working, some not... Some new and critical programs crashing mysteriously, security utilities going haywire and refusing to work with the built-in security...

Then, within the space of a week, both our Win7 boxes developed serious problems: one stopped booting, and no recovery/repair seemed to help it. The other generates garbled error codes, particularly when we try to shut it down (we have to hard power it off). We wanted to like it, but we've gotten sick of making excuses for it, and "oh, it must not work right in Win7" became an almost daily utterance. Back to XP again, then...
am I missing something?
I fail to share the author's enthusiasm; this seems like a really silly idea.

Why would you want a display to show a different image on each side? I can only think of trade shows as an example of where this would be useful, and that's not a huge market. Plus, two LCD displays back to back work just fine, really. It would have to be cheaper (or at least similarly priced) to be worthwhile. A shared display for clamshells? Okay, but still, big deal?
...when has Intel (or AMD, or anyone for that matter) turned down an opportunity to embarrass the opposition? Either it's a great excuse to gloss over the Penryns being slightly later than hoped, or Intel feels that the PR gains are outweighed by the financial benefits of being able to sell 'existing inventory' at a higher premium than anticipated. One thing to bear in mind is that the new chips will command a considerable premium: two months of *not* selling pricey Penryn chips has to be a sizable hit on projections? If Jan was the projection in the first place?
...because bringing a *machine* up from idle would take even longer than bringing a disk up from idle. The article mentions "sub-millisecond" idles, which basically means highly erratic loads, where the machine is either doing nothing or flat-out processing while handling jobs taking as little as tens to hundreds of microseconds each. Sounds like database queries and on-demand web service apps to me. They want to sleep components for the sub-millisecond gaps between these queries, to save power.
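As a back-of-the-envelope illustration of why this pays off (every figure below is an invented assumption, not a measurement of any real component), sleeping a component through even a sub-millisecond gap saves energy as long as the gap comfortably exceeds the wake-up latency:

```python
# Toy energy budget for one request/gap cycle on a bursty workload.
# All numbers are illustrative assumptions, not real measurements.
active_w = 10.0    # component power while servicing a request (W)
idle_w   = 4.0     # power while idle but awake (W)
sleep_w  = 0.2     # power in the low-power sleep state (W)
wake_s   = 50e-6   # wake-up latency, paid at full power (s)

job_s = 100e-6     # each query takes ~100 us
gap_s = 900e-6     # followed by a ~900 us idle gap

# Without component sleep: busy during the job, idle-awake in the gap.
e_no_sleep = active_w * job_s + idle_w * gap_s

# With sleep: pay the wake-up cost at full power, then sleep the rest
# of the gap. Only worthwhile when gap_s is well above wake_s.
e_sleep = active_w * (job_s + wake_s) + sleep_w * (gap_s - wake_s)

saving = 1 - e_sleep / e_no_sleep
print(f"energy per cycle: {e_no_sleep * 1e6:.0f} uJ -> {e_sleep * 1e6:.0f} uJ")
print(f"saving: {saving:.0%}")
```

With these made-up figures the gap dominates the energy bill, so even a sleep state with a 50 us wake penalty cuts energy per cycle by well over half; shrink the gap towards the wake latency and the saving evaporates, which is presumably why sub-millisecond entry/exit times matter.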
- Review Samsung Galaxy Note 8: Proof the pen is mightier?
- Nuke plants to rely on PDP-11 code UNTIL 2050!
- Spin doctors brazenly fiddle with tiny bits in front of the neighbours
- Game Theory Out with a bang: The Last of Us lets PS3 exit with head held high
- Flash flaw potentially makes every webcam or laptop a PEEPHOLE