54 posts • joined Wednesday 15th July 2009 13:06 GMT
Re: A bigger hammer?
Fancy crushing some rocket fuel? I'm not sure I do.
However, the idea of casting the igniters into a block of something fuelly, pressing into place, and then popping a smidge of silicone outside that seems rather appealing. Then the silicone seal is against something solid, and there's no pressure differential across it until you press "go!".
You've not seen a VGA cable, right?
Re: They haven't copied
Not in the slightest.
Also, thank you for triggering a major nostalgia attack. I even have the disks. Now, where can I get a floppy drive?
Destroying TV availability for better mobile internet? Bring it on!
Hurrah for the paranoia industry
I read the article as saying blocks had been found and proven, but not spoofs, which are indeed far more difficult. So let's all fear spoof attacks!
It's not someone's funding review time by any chance?
Didn't we have a long discussion after a post about Iran's alleged drone theft in which it was suggested that GPS spoofing is actually quite difficult? Is it, or isn't it?
There we are then, thanks.
If it's omitted on a modern drone it's probably on cost grounds, but this sounds like it was an expensive sort of drone...
It's harder than that, sad to say. Many of the proposed attacks involve a slow drift. You would need to compare position information with airspeed information and then work out whether it is inconsistent with realistic wind speeds. Possible, but not bulletproof.
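To show what I mean, here's a back-of-envelope version of that cross-check; the threshold and the local-frame coordinates are plucked from the air, not tuned for any real airframe:

```python
import math

def spoof_plausibility(gps_positions, airspeed_mps, dt_s, max_wind_mps=30.0):
    """Flag GPS fixes whose implied ground speed can't be explained by
    airspeed plus a realistic wind. Positions are (x, y) metres in a
    local frame, one fix every dt_s seconds. Thresholds illustrative."""
    alerts = []
    for i in range(1, len(gps_positions)):
        dx = gps_positions[i][0] - gps_positions[i - 1][0]
        dy = gps_positions[i][1] - gps_positions[i - 1][1]
        ground_speed = math.hypot(dx, dy) / dt_s
        # Wind can add or subtract at most max_wind_mps from airspeed,
        # so a ground speed outside that band is suspicious.
        if abs(ground_speed - airspeed_mps) > max_wind_mps:
            alerts.append(i)
    return alerts
```

Note the catch: a spoofer applying a slow drift stays comfortably inside the "realistic wind" band, so this catches only the crude attacks.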
You can definitely keep track of your position through inertial measurement.
It's just a LOT more expensive and difficult to do than GPS, and errors accumulate unless you have something to regularly check against... like GPS. (Until recently it was also bulky and heavy, although I wouldn't bet on that still being true).
As a result it tends to only get used where absolutely indispensable (like on a submarine - GPS reception down there isn't so good). I wouldn't bet against the existence of drones that have this technology - but whether it's involved in the real story behind this particular episode or not, I have no idea.
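The drift problem is easy to demonstrate with a toy model: even a tiny constant accelerometer bias integrates twice into a position error that grows roughly with time squared, and an occasional absolute fix (GPS) is what keeps it bounded. The bias figure below is made up for illustration:

```python
def drift_error(bias_mps2, dt_s, steps, gps_fix_every=None):
    """Worst position error from integrating a constant accelerometer
    bias, optionally reset every gps_fix_every steps by an absolute fix.
    Purely illustrative numbers, not a real INS error model."""
    vel_err = 0.0
    pos_err = 0.0
    worst = 0.0
    for i in range(1, steps + 1):
        vel_err += bias_mps2 * dt_s       # bias integrates into velocity error
        pos_err += vel_err * dt_s         # ...which integrates into position error
        worst = max(worst, abs(pos_err))
        if gps_fix_every and i % gps_fix_every == 0:
            pos_err = 0.0                 # GPS gives absolute position...
            vel_err = 0.0                 # ...and (roughly) velocity
    return worst
```

With a 0.01 m/s² bias you're nearly two kilometres out after ten minutes unfixed, but under twenty metres if you get a fix every minute. Hence "errors accumulate unless you have something to regularly check against".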
Sight, and in a similar spectrum - this is not a dubious assumption.
Sight has evolved several independent times on Earth because it's too useful not to. Plenty of species on earth have different sensory ranges than humans but they're all in the same basic region - not just for evolutionary reasons but for physics ones. If there is life on other planets the biology and evolution will be different but physics should be the same.
Frequencies of the electromagnetic spectrum that are significantly higher (mid-UV and above) are energetic enough to affect a much larger proportion of molecules much more rapidly than visible light (i.e. cause damage). Too much of them and it's not likely there will be anything around to have eyes at all. Too little and it's not worth evolving the ability to sense that portion of the spectrum.
Frequencies that are significantly lower (mid infrared and below) are sufficiently un-energetic that they are hard to sense, let alone image accurately.
We also know that there are a lot of stars out there putting out light in the visible spectrum. Plenty in the non-visible spectra too, but the above reasons make that less desirable.
However, I agree that wasted light is likely to be a brief phase for technology reasons.
On a purely technical level this is simple. Carry usenet, don't carry usenet binaries. This is easy to automatically identify.
On a practical level it won't change anything (too many other distribution channels and there always will be), and on a moral level it would take a very good argument to persuade me this is the right approach, but technically no problem.
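"Easy to automatically identify" because binaries on usenet announce themselves: yEnc and uuencode have distinctive header lines, and anything else tends to be long runs of base64-ish text. A rough sketch (patterns and thresholds are my guesses, not a production filter):

```python
import re

# Heuristics for spotting binary posts: yEnc headers, uuencode "begin"
# lines, or many consecutive base64-looking lines.
YENC = re.compile(r"^=ybegin ", re.MULTILINE)
UUENCODE = re.compile(r"^begin \d{3} ", re.MULTILINE)
BASE64ISH = re.compile(r"^[A-Za-z0-9+/]{60,}={0,2}$", re.MULTILINE)

def looks_like_binary(article_body, min_base64_lines=20):
    """True if the article body looks like an encoded binary posting."""
    if YENC.search(article_body) or UUENCODE.search(article_body):
        return True
    return len(BASE64ISH.findall(article_body)) >= min_base64_lines
```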
No Apple rip-offery detected
Oh come off it. Apple don't even MAKE a 7" tablet, despite it being an absolutely perfect size for a tablet. It's far from a ripoff, it's a downright superior form factor with much better ergonomics.
I have a 7" Galaxy Tab classic and I get a lot of comments along the lines of "wow, it's like an ipad but it's small enough that you're actually carrying it - my ipad never leaves the sofa, I should have got one of those instead!". I'm glad to see Samsung think that the 7" size is worth keeping going; when mine dies or becomes hopelessly obsolete, it will get replaced with something the same size (assuming Apple haven't succeeded in banning every tablet under the sun by then).
IT != computer science
Most "IT" degrees don't seem to be very useful. They add a lot of noise to the signal here - Yet Another Java Course often just indicates that someone was told that an IT qualification would automagically lead to a good career (poor naive sod).
A Computer Science degree should (SHOULD) be different. It's not about how to code, it's about how to think, devise algorithms, understand the business of abstraction. There are plenty of non-degreed coders who are very good at wrangling giant masses of braces and semicolons but shy away from (or just fail at) anything that looks like maths or devising new abstract algorithms to tackle new problems. There are plenty of CS-degreed types who know the theory but can't code, but plenty more who can do both. (If only a majority of code were written by these people!)
So the survey isn't really measuring anything useful (surprise, surprise).
People with no degree who have taught themselves BOTH sides are like gold dust, but there are just not many around and they're not always easy to identify. (Plus there's no guarantee that these remarkable talents correlate with other abilities that make someone successful in the workplace, useful on a project, or a bearable colleague.)
Wrong standard of comparison
This isn't like a desktop computer or even a smartphone/tablet - it's more like the embedded controllers in mp3 players, stereos, washing machines... this is a high frequency by the standards of these things; many run at 16MHz (including current Arduinos) or 20MHz. They are generally used to interface with physical hardware and sensors, which simply aren't feeding data fast enough to keep a gigahertz processor busy.
In any case, once you start working at high frequencies, circuit design gets rather trickier. These things are designed for enthusiasts to plug into breadboarded prototypes, which are not designed for high-frequency signals. They also need to be able to tolerate input voltages much higher than delicate high-speed computer processors.
Cheer up mate
Given how much of this is closer to electronic engineering than computer science, I don't think you're being fair on yourself there!
I'm astonished by how smart some of these guys are, though.
The piston launcher is intriguing - and this sounds like a great plan to sort out the problem of getting to a stable flight condition.
Also sounds like very impressive building on the truss front.... I'm looking forward to pictures!
Happy with the laws of Newton, it's the laws of Sod I'm worried about
The rail exerts a force on the rocket to keep it on a curved path.
Maybe I've got my sines and cosines the wrong way round, but it seems to me that for rail angles less than a quarter pi, the larger part (the cosine component) of the opposing force exerted on the rail (on an axis through the wheel hub) pulls down against the balloon, while the smaller part (the sine component) generates a moment that rotates the (top-anchored) wheel in the direction of launch - as if the rocket were "dragging the rail around with it" (a separate effect from the friction term, which I'm hoping I can keep nice and low). That moment moves the wheel centre away from the vertical line through the anchor point, so the wheel's weight (and the cosine part of the force) partly counteracts it.
The actual launch angle will be slightly higher than the set angle.
On a balloon, laterally there's not a lot of force if the thrust is nice and symmetric; there is no crosswind (we're drifting at jolly close to wind speed). On a ground level test the wheel definitely needs anchoring against crosswinds rotating it about the vertical axis, but I don't think this would invalidate the test results.
It's quite probably not right for LOHAN but I'm confident enough in my analysis to try it on a smaller scale (admittedly standing behind a wall)
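For anyone who wants to check my sines and cosines: here's the decomposition as I mean it, measuring the angle from vertical, and for anything under a quarter pi the balloon-axis pull does indeed dominate the turning component. Numbers purely illustrative:

```python
import math

def rail_reaction(force_n, angle_from_vertical_rad):
    """Split the reaction force on the curved rail into a component
    pulling down against the balloon (cosine) and a component that
    tries to turn the top-anchored wheel (sine). Illustrative sketch
    of the decomposition in the post above, not a full force model."""
    pull_down = force_n * math.cos(angle_from_vertical_rad)
    turning = force_n * math.sin(angle_from_vertical_rad)
    return pull_down, turning
```

At 30 degrees a 100 N reaction gives roughly 86.6 N of pull against the balloon versus 50 N trying to drag the wheel round, and the pull component stays larger all the way up to 45 degrees.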
I'd better do the sums again
It will certainly drag the rail a bit... but unmanageably? As you're only using the bottom part of the arc, most of the force is in the vertical plane and pulls against the balloon (which can stand it); the remainder will try to turn the wheel about its axis but it's a comparatively small force and hasn't long to do it in.
Hmm, I must test this! For science!
Easier than you think
Hang on a moment. We're talking about a trigger used before flight here, not something that will move control surfaces.
Not that there aren't loads of people doing the latter now - google ardupilot. One of the great miracles of the modern age is that these things are not only cheap, they are very easy to use.
Arc shaped launch rail
One big giant balloon.
Suspend a big hoop below it, hanging vertically, with a few threads of fishing line like bicycle wheel spokes to keep it circular.
Fit launch rail from the bottom of the hoop curving up one side 'til it reaches the desired angle, then sticking off for a little bit (the last bit will need a rigid brace)
Rocketplane sits at the bottom, no need for any kind of counterweight. Light the motors and it accelerates along a curved path along the rail to the right angle, missing the balloon, then flies off at a tangent to the circle on the intended path.
Shouldn't weigh as much as a long rigid pole with a counterweight (oo-err?) as it's smaller and shouldn't need to be too hefty to be rigid enough (as it's efficiently braced)
Also means you have a bit of speed before being totally untethered so aerodynamic stability will work better.
Or an accelerometer
Better yet, tiny accelerometers are quite cheap now and can detect what direction "down" is quite reliably (this is how phones/tablets determine orientation flips).
Not so great if the plane is swinging, or once you light the engine, but neither is the mercury switch - and it is easier to detect swinging from an accelerometer than it is from the mercury. Plus, we can easily log a peak acceleration figure :-D
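To sketch what I'm imagining (axis conventions, sample format and the swing threshold are all my guesses): average the readings to estimate "down", flag swinging when the magnitude wanders away from 1 g, and log the peak while you're at it.

```python
import math

G = 9.81  # m/s^2

def analyse_accel(samples):
    """Given (x, y, z) accelerometer samples in m/s^2: estimate the
    'down' unit vector from the mean reading, flag swinging when any
    sample's magnitude strays more than 20% from 1 g, and record the
    peak magnitude. Threshold is a guess, not a tuned value."""
    n = len(samples)
    mean = [sum(s[i] for s in samples) / n for i in range(3)]
    mag = math.sqrt(sum(c * c for c in mean))
    down = [c / mag for c in mean]  # unit vector toward gravity
    swinging = any(abs(math.sqrt(x*x + y*y + z*z) - G) > 0.2 * G
                   for x, y, z in samples)
    peak = max(math.sqrt(x*x + y*y + z*z) for x, y, z in samples)
    return down, swinging, peak
```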
As the icon says: proceed with this nonsense at flank speed!
Your design is definitely not as unstable as the classic pendulum rockets - the wings, and consequently having the centre of mass ahead of the centre of aerodynamic pressure (I doubt you could avoid that with this design even if you wanted to!) will make this stable once it gets going - so it might well be OK. The start is the tricky bit, combining low speed with thin air... I'm not sure I'm qualified to predict what would happen there but it sounds like a job for very conservative design if ever I heard one.
Steering the whole engine unit, though, sounds problematic to me - will moving the entire engine mass relative to the entire fuselage mass be able to respond fast enough? Isn't this why big rockets usually rotate small subsidiary thrusters rather than the main engine?
I'm really looking forward to this... whatever happens I'm sure it will be fun to watch!
And or Or?
Isn't it only if you're going over 1,000 knots, too? The statement in that document is "AND".
Wouldn't be surprised if many chipsets use an "OR" limitation though, and the altitude information on civilian GPS does not work too well up there.
Time to start packing gyros, accelerometers, barometers, and compasses?
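The difference between the two readings is worth spelling out, because only one of them grounds a balloon payload. A sketch of both (limit values as commonly quoted; function names are mine):

```python
KNOT_LIMIT = 1000        # knots (~515 m/s)
ALT_LIMIT_M = 18_000     # metres (~59,000 ft)

def cocom_and(speed_knots, altitude_m):
    """The letter of the document: disable only when BOTH limits are
    exceeded - a balloon at 30 km but walking pace is fine."""
    return speed_knots > KNOT_LIMIT and altitude_m > ALT_LIMIT_M

def cocom_or(speed_knots, altitude_m):
    """What many chipsets reportedly implement: disable when EITHER
    limit is exceeded - which is exactly what kills high-altitude
    balloon tracking."""
    return speed_knots > KNOT_LIMIT or altitude_m > ALT_LIMIT_M
```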
And are you absolutely certain that nose-mounted rockets aren't the old http://en.wikipedia.org/wiki/Pendulum_rocket_fallacy in a new form?
Physics are fine - what about mass production?
Tolerances on that air gap are very tight, considering it separates two substantial masses - this seems to rely on a degree of precision manufacturing not widespread in consumer devices. Get that even a bit wrong, or just have too high a variation, and these will be difficult to trust.
It gets even harder when you scale it up to bigger things like air conditioners.
Definitely worth trying to fix
Absolutely right about the solder joint; but from my very limited experience of iffy CRTs, the next things to check are resistors and capacitors - all cheap, thank goodness! - before you need to start worrying about anything complex or difficult.
Just make damn sure the caps are empty before you start work.
Fuss about Android updates is ridiculous; 7" is the right size (for a tablet!)
Galaxy Tab is fine as-is. I honestly don't care if Android 4.0 Chocolate Gateau or 5.8 Messiah are ever released for it; it was brilliant on the day I bought it and won't suddenly stop being brilliant when Google release a small collection of interface upgrades - which is all most versions are. Hell, I still have my Android 1.5 G1 phone and it's a perfectly decent device. The improvements are minor.
The 7" size beats the hell out of 10"ers for portability and, funnily enough, ability to type on it (as it's narrow enough to thumb-type in portrait mode). Not to mention replacing a phone AND a netbook, and fitting in a jacket pocket.
Vanilla frozen yoghurt for me please
Hang on, isn't the Toshiba Folio 100 tablet a Tegra 2? The one that was in Currys and PC World for about two weeks before they realised the Toshiba interface kludged on top of Android was made entirely of fail? I tried one, seemed powerful enough, shame about the horrid interface.
There's rumoured to be another Tegra tablet just around the corner, the "Advent Vega", but I'll believe it when I see it.
(The big secret is - existing plain Android (or "vanilla frozen yoghurt") is absolutely fine on a 10" tablet. Manufacturers just want theirs to look different so they don't become commodity hardware. I don't think they'll succeed unless they do a much better job than they do right now)
Count me as one more for "grotesquely excessive overreaction"
A kid, idiot school techs - this is a slap on the wrist offence, not a matter for charges.
Overreaction is, of course, business as usual for both the law and school authorities.
I thought half the point of this was that it was distributed? So the creators shouldn't *have* to have a giant pile of servers.
Pity the pre-alpha is poor quality, but I think it's a bit too early to call whether it's broken-and-unfixable or just broken.
Fit with Apple's nanny policies?
How will they reconcile the sex, drugs and violence that GTA customers expect with the strict limits on sex, drugs and violence that St Jobs seems to feel are needed to "protect" Apple customers? I foresee disappointment.
Cores vs GPU
Karakal - know any games that use six cores? Or even really tax four? The 5770 is fast but a 5850 would be faster; dropping to a quad core processor (I can't believe I'm saying that) would free up money spent on unused cores (while still matching the current general "high end" core number so having at least one spare for the future) and save money for bigger graphics you can benefit from right now.
I'm not much of a gamer but I have enough of an interest in making PCs that go like stink to know that, above a certain level, you have to have something of a plan for what it needs to be best at. This spec would be a fine basis for a development PC for the office (with maybe a couple of modest tweaks) but I doubt my boss or colleagues would be thrilled by masses of fans and a bright red glowing gamer case.
What's it for?
Nice processor but
- graphics card is too small for gamers but rather large for non-gamers
- processor is too large for games, and non-gamer users are unlikely to want all those blingy lights
- large amount of poorly designed cooling sounds rather undesirable in a PC you'd only buy if you had very processor-heavy work to do
Seems like a powerful box that doesn't fit an obvious niche.
A year ago I couldn't have justified the cost of a Core i7 desktop; this year it offered the most bang for my buck. On the other hand, my almost 18-month-old netbook is indistinguishable from my wife's 2-month-old netbook apart from a few gig of hard disk and all the scratches... and cost about the same. So surely a major effect here is that, unlike desktops or big laptops, there's not much reason to upgrade a netbook?
AC: it's the same chip as the GTX480 games card that is claimed to burn a suspiciously similar 250W but actually gobbles 320W; with a similar clock and more memory, it's not going to eat less. Your comparison is made even trickier because the Xeons can do a lot more than a graphics processor; the abilities of these things are pretty narrow. Most proposed GPU-assisted high performance computers need a lot of standard CPUs to keep the GPUs busy. It's pretty nifty, but it's no miracle. The ATI equivalents get more raw flops per watt - but do even less, so they need more CPU support. Same tradeoff (if you ignore coding difficulties, which you can't). Ye cannae change the laws of physics, Captain!
nVidia will continue to rule GPGPU for a while because they have a (fairly) widely known and (kinda) not too difficult language, but neither that nor this card will help them break out of fairly narrow applications where the performance per watt advantage creeps above marginal.
Graham: The transputer was a brilliant piece of engineering at totally the wrong time. It's like inventing the Ferrari in 1800; far more improvement in travel speeds would come from improving the roads before it was worth having. NOW we need it back, I agree... although I'm not sure I agree with your assessment of computer science; it's full of great ideas for parallelism in an ideal world but hasn't got much chance of meshing with the existing mess most people have to deal with...
Just clarify one thing...
Are these the same nvidia chips that are widely slated for being too hot and PCIe-bustingly power-hungry in consumer graphics cards? And, remind me, are heat and power performance more or less important in massive HPC applications?
I don't understand
Now hang on a moment Bart - why on earth would you want to watch any of that stuff in the first place, even free? So why would you have a "media services" bill at all?
Count me among the completely unaffected.