50 posts • joined 22 Aug 2008
Let's ask him to personally locate where Philae is sitting on comet 67P, and move the lander out into the sunlight so its batteries can recharge. Someone will be back along shortly to pick him up for the return trip to Earth.
*Wally checks left wrist*
"Ah, 3:00 PM. Time to kick some asses."
*Wally yawns, stretches, checks right wrist*
"3:07 PM. I suppose I'll take some names now."
... comes the iWatch U.P.!
"Never before, in the history of mankind, have we had the opportunity to monitor our data with this level of precision," said Tim Cook, CEO of Apple. "And by 'we' I of course mean 'Apple' and by 'our data' I mean 'all of the telemetry on our customers' bodily functions we can possibly record.'"
I thought the same thing. The reference in the article makes it sound like they only make the obscure batteries shown here, but they've been all over Newegg in dozens of gadget categories for years. I've been hesitant to buy their stuff solely for lack of brand recognition, but it's frequently highly rated. I'm sure I own more than a few of their products, like hard drive enclosures or memory card readers, that just work and never need a second thought.
I tend to think that the definition of "troll" has changed a bit since the 90s. A troll used to be a skilled manipulator of a forum's emotional hot buttons. They could come in feigning ignorance and drop an obvious loaded question: "So, why hasn't Google ever made an iPad? Is it because they can't do it as well as Apple so they don't bother?" Then they just sit back and wait for the flames to start rolling in. A good troll knew exactly what got a particular newsgroup or forum riled up, and would start arguments by using ignorance to suggest the local sacred cow was anything but.
These days, it seems anyone with an obnoxious attitude or an unpopular opinion is considered a troll. I've even seen posters labeled "troll" simply for stating a perfectly defensible, mainstream opinion that just wasn't shared by the next obnoxious internet loudmouth posting directly afterward.
It's been many years since anything has so much as brought a tear to my eye. I read Pale Blue Dot over a decade ago. Seeing these pictures, and reading Dr. Sagan's words again, reminded me that they were among the most moving and awe-inspiring experiences of my life. If we could get the whole human race to stop their bullshit for two minutes and really contemplate all of the ramifications of these pictures, we'd make a giant leap as a species.
Can Ordnance Survey use these photos to turn the LHC into Minecraft now too?
DEVELOPERS! DEVELOPERS! DEVELOPERS! DEVELOPERS! DEVELOPERS! DEVELOPERS! DEVELOPERS! DEVELOPERS! DEVELOPERS! DEVELOPERS!
Silence will fall when the question is asked.
The artificial singularity that powers the Googleplex creates a non-negligible effect on spacetime around Mountain View. The graph shows the time that their servers perceive. Since the singularity slows down time, it took an extra 15 minutes after the event began in the outside world before Google's servers registered it. The stalwart team of boffins at Vulture Central merely corrected Google Coordinated Universal Time to regular Pacific Daylight Time.
I'm not following something here. Reports on this story say they used a Beagleboard inside a regular iPhone charger. I have a Blackberry charger that's a 1.25" cube. My Nexus 7 charger is 1"x2"x2". As small as single board computers are, they aren't going to fit in either of those. A Beagleboard is 3" square. Are there iPhone chargers big enough to fit a Beagleboard that an iPhone user would believe is an Apple OEM charger? Or would the hacker have to explain the oversized brick to his target to get by any suspicions?
I didn't know frozen green beans were a digital storage medium. If they thaw out, do they lose their data?
Oh . . . the bar code. Gotcha.
Sounds good to me. I'm not involved in its development, but I'd be flattered if it was named after me.
"Oh, is that all. Driven by ?"
SOON! SOON THE WORLD WILL RUN ON INTERNAL COMBUSTION!
AND THEY WILL COME TO ME FOR ITS SECRETS!
I thought most Androids ran on ARM chips.
I held out a lot of hope for smartbooks when I first heard of them. I assumed they'd ship with Android and I'd just replace it with Linux. I waited and waited for them, only to see most of them canceled after they were shown at CES around 2010. I knew of the AC100 but don't think I ever found anywhere to actually buy one. I was hoping that smartbooks would become popular enough that most of the typical laptop manufacturers would offer a version. Unfortunately the iPad happened and everyone turned their attention to tablets. Hopefully Chromebooks will pick up where the smartbooks left off, and eventually someone will accidentally put a Windows-style keyboard on one. Then I'll have exactly what I thought I'd have three years ago.
A laptop with a proper keyboard (i.e. all the keys - Home, End, Delete, PgUp, PgDown, F1-12), no rotating equipment, no air intakes or exhausts, minimum 1280x800 resolution screen around 10-12", 8 hr battery life, at least 32GB of storage (NAND, SSD, I don't care), ability to run full FOSS applications (LibreOffice, GIMP, Firefox, Amarok, VLC) and no locked bootloader BS so I can install whatever OS strikes my fancy. I'd be willing to pay up to about $500 for it as a light travel and living room machine.
ARM Chromebooks are close, although most have gelded keyboards geared toward Google's idea of computing. The OS choice is also typically limited to Chrome OS and Ubuntu; at least, if you want to swap in a Linux distro, Ubuntu is usually what has images readily available. A Slackware ARM port for the Samsung Chromebook is in the works, but the keyboard would still be a problem. If Debian or Mint were easily installable I'd live with those too.
An Android tablet with an external keyboard is also close on the hardware side, but it doesn't have all the keys, assembling a suite of basic "apps" without being nickeled and dimed or having my privacy invaded is annoying, and printing is a minefield.
My current machine in this role is a Lenovo X120e running Win7. It's got everything I want except the screen resolution, heat management, and battery life. I can't set it on the couch or my lap without blocking the bottom air intakes. It also has a 1366x768 screen, which is cramped. Win7 with all the FOSS applications I like is a good combination now.
I'd prefer an x86 processor since the choices of Linux distros and software are a lot broader. So even if Intel can't get their latest Atom down to the mW draw necessary for a phone, if they can at least come up with something that doesn't require a heat sink and fan, I'd be happy with it in a laptop. As mentioned above, x86 doesn't have all the driver issues that ARM SoCs have. But I'd like a silent machine that doesn't blow 140°F air on my leg when I actually set my "laptop" on top of my lap.
Can the user wear a shirt while wearing said glasses, or are those two technologies somehow incompatible?
Icon for "I'll have to take my shirt off first."
Build an FTL ship and launch it directly at the cold spot. Collect data for several million years along the way. It's obviously a signal from before the Big Bang, because I heard about that on a documentary on the SyFy channel a couple of years ago. Robert Carlisle presented it, I believe.
". . .people have now had a taste of the vastly streamlined and simplistic interfaces of mobile apps."
Let me fix that for you: "people have now had a taste of the featureless, unfinished, barely functional, unfit for purpose interfaces of mobile apps."
I recently attempted an experiment on my Nexus 7 tablet to see if it would be suitable for corporate email. Having pecked out 4-5 sentences, a single mis-swype of my palm resulted in the entire text buffer being selected and replaced with a space. At which point I discovered Android has no undo function, to add to the long list of missing text-processing concepts that have been with us for decades, such as cursor keys and the "Home", "End", and "Delete" keys.
While searching for the location of the non-existent undo function on Android, I came across a post by one brilliant gentleman who suggested that Android doesn't need one, because it's really not that difficult to just retype things. An opinion he no doubt cultivated because he's never assembled more than 140 characters at a time, or ever expressed anything in words of any importance whatsoever. But his attitude was one of righteous indignation that anyone would question why Google continues to refuse to include "undo."
I fear the future of computing will be a race to "streamline" interfaces not to make them easier to use, but simply to strip out feature after feature to avoid confusing the least capable users among us. But, hey, that makes it easier to call the code finished, right?
If we had telescopes (ground-based, no less) capable of resolving the image presented in this article from 450 MILLION light years away, we wouldn't be fumbling around trying to find extrasolar planets. We'd be inspecting what ET is having for dinner on every inhabited planet in the galaxy. As has already been mentioned, the combination of the lack of an "artist's rendition" notice and the incorrect distance measurement serves to give an entirely misleading impression of this finding.
Lovecraft's already been covered, so I'll knock out the rest of them to save everyone the trouble:
Are you sure the crew isn't Norwegian? How many dogs do they have? Are they all accounted for? Because that would be just like John Carpenter's movie The Thing. It was also about digging things out of the Antarctic ice. Spoiler alert - The Thing was a shapeshifting alien.
And for the two people that saw it - what if they find some ancient city/hunting ground/temple used by an interstellar hunting club to grow sentient biological weapons to hunt for sport? Because under the ice in Antarctica is also the most logical place for said hunting club to locate their lodge.
I'll let some other Stargate SG-1 fan talk about the chair.
An abacus? We had to do our calculations with rings of giant stones, and it took the machine 31,557,600 seconds for one clock cycle!
An unlocked iPhone doesn't operate on the 1700 MHz UMTS band, so it will be limited to EDGE on T-Mobile. I'm not clear what the incentive would be to pay for an unlimited data plan for 236 kbit/s. Does this mean Apple will start making T-Mobile versions of their phones with crippled chipsets like Samsung does?
Because in the Latin alphabet, Jehovah starts with an 'i'.
For some reason, I was operating under the delusion that this organization was something completely different.
The best part was when my brain read that back to me in Zapp's voice.
The soundtrack in my brain immediately began playing "Thus Spoke Zarathustra" the instant I opened this article.
Where's the Pavlov's Dog icon? Oh well, Paris will do.
if Lewis had written it. He works the particle-molestation desk though, not the CPU-molestation desk.
This article had absolutely nothing to do with Bachman-Turner-Overdrive.
...is that George Lucas designed them all, so you get Desert Planet, and Water Planet, and Lava Planet, and Ice Planet, and Forest Moon, and Jungle Moon, and City Planet, and Grasslands Planet...
Alien, 'cause he kinda reminds me of Greedo. You know, the Greedy Alien.
"or permitting smaller robots to get about inside buildings or other cluttered environments"
Which would of course be the best place for a shiny automaton to train its plasma rifle on the IR signature of a cowering humanoid.
That sure didn't take long.
8 bits/channel × 3 channels (RGB) = 24-bit color. He was talking about bits per channel. Better to have 36- or 48-bit color (12 or 16 bits/channel, respectively).
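The arithmetic above can be sketched in a few lines of Python (the function name is my own, purely for illustration):

```python
# Overall color depth = bits per channel x number of channels.
# RGB images have 3 channels, so 8 bits/channel gives 24-bit color.
def total_color_depth(bits_per_channel, channels=3):
    """Return the overall color depth in bits for an image."""
    return bits_per_channel * channels

print(total_color_depth(8))   # 24-bit color
print(total_color_depth(12))  # 36-bit color
print(total_color_depth(16))  # 48-bit color
```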
I was all excited to get a 4G until that douchebag at Gizmodo had to go and ruin the surprise. I'll just wait until the 5Gs come out now.
will it blend?
No, don't you see? The disaster has already occurred! Will occur! Will have already occurred, but not yet!
Some time next week they'll crank up the particle-punisher to such extreme levels that spacetime itself will be rent asunder. The distortion wave will travel in both directions through time, so we're already seeing it. Apparently one of the first effects is that December now occurs in February.
There was a GI Joe vehicle from the mid-80s called the Sky Hawk that looked almost identical to this thing. Even as an 8 year old, I wasn't exactly sure how such an aircraft was supposed to fly. The idea of "center of mass" probably never entered the toy designers' heads.
Black helicopter, because, well, COBRA had black helicopters.
In the US, stories like this usually end with a comment about "it is not yet clear if school authorities will press charges." Granted, it sounds like those responsible are long gone from the school. But defacing school grounds/public property, anti-social behavior, embarrassing authority figures, etc. can't be allowed to go unchecked. Where's the promise of an investigation?
I love reaching a particular sentence in a Reg article where I think "this must be a Lewis Page story." I then scroll back up to check the byline, and 95% of the time I'm correct.
The "Lewis Point" in this story was of course the aforementioned "ravening, self-mothered pseudohydrozoan immortal Dr Who jellyfish clone vampire blobomination horror-swarm." Excellent.
No one wants to comment on a missile requiring "penetration aids". This was a Friday article, right? Am I on the wrong website?
If you need an explanation of my choice of icon, you are definitely on the wrong website.
This sounds a lot like what Mac users went through in the transition from OS 9 to OS X. OS X was a radical ground-up rewrite that broke backwards compatibility with OS 9 apps, but in exchange it shed a lot of legacy cruft and allowed Apple to push the OS beyond what they could do with OS 9. A compatibility mode was provided for OS 9 apps, and it was a small hassle to put up with in exchange for all the shiny goodness of OS X.
I thought one of Microsoft's excuses du jour for why Windows is so monstrous and bloated, and takes years to update, is that they have to maintain backwards compatibility with 15 years' worth of legacy apps. So here we have an ugly, half-implemented hack just to keep perfectly good (and possibly very recent) XP apps going, with none of the benefits of a ground-up rewrite of Windows to clean the skeletons out of the closets.
Which is it, Microsoft? What do your customers gain in the bargain for losing backwards compatibility?
I recognized those curves immediately as the game in Soylent Green. I had always assumed that thing was a prop created solely for the movie set. It had that look of "this is what the future will be like" from the perspective of the early 70s. Pretty funny that those were actually built. The fiberglass and the metal flake paint make me think of a sidecar or a custom trike from that era.
Oh, but you see audio copy protection is not impossible. The music pigopolists just need to pull off the same massive scam that the MPAA et al got away with for HD video. Yes, HD-Audio! You'll have to throw away your old, poorly performing low-resolution analog speakers and replace the entire signal chain with HD-Audio compatible equipment. The audio stream is encrypted on the disc, encrypted in the player, and not decoded until a nanosecond before the drivers fire in the speakers. To make the system work, you'll need an RIAA/IFPI approved HD-Audio signal cable (of which there will be four iterations before they settle on one). But not to worry, Sony will be happy to sell you each of those iterations for the low, low price of $50 for every three feet of speaker wire. Just wait until the next generation game consoles come out, they'll include two different incompatible competing standards of HD-Audio for you to bet on. And it will all be worth it for the lushest, most immersive simulations of real sound you'll ever hear.
Oh, you must be new here. See Lewis or Lester to get your Martian Translation Matrix installed. The insights you'll receive from our resident Martian friend will astound you. He's really quite harmless, and after a while he actually starts making quite a bit of sense.
She's been on me about buying the fourth wallscreen for the viewing room. Says it will make her soaps more immersive. Apparently she's gotten the new script in and would really like that fourth screen before the new season starts.
Oops, got to go - Captain's calling me in. Got another customer that needs to be liberated from his books.
Mine's the glossy black one with the salamander on it.
Michael C - "On a Mac, I'd have none of these concerns. Any malware would likely not work at all. Any pop-ups targeting Windows this way would be VERY obvois indeed, and any malware i might inadvertantly download would not run."
madra - "it's one of those articles were us mac & linux users can have a good belly laugh at you sad windoze drones and your pestilent OS - we havenae had one for a while!"
What I find interesting about this particular form of malware is that the methods they are employing don't necessarily rely on any particular flaws in Windows' code. They are employing flaws in the computer literacy of Windows' users. It seems that the payload they are trying to deliver is a constant harassment of the user to pay for some software via a dodgy website with their credit card. If 95% of computer users were using Ubuntu, I could still see this same scam working.
All that said, this article hammered home some thoughts I've had recently. All the talk about "is Linux ready for the desktop?" or "what Linux needs to do to become the dominant OS" starts with the assumption that that would be good for the existing users of Linux. I think I've decided that I'm happy with all the idiots staying on Windows. There's the old saying that Linux is only secure because it's not popular, and no one writes malware for it. I've never agreed with that - Linux (and OS X) are more secure than Windows by design. But there's another saying that system security is only as good as the hardware's physical security. When the user sitting in front of the box is the box's worst enemy, it doesn't matter what OS it's running.
Yes, the hardware support and compatibility would be great if 95% of the world ran Linux. But all the Russian gangsters would then start writing this kind of crap for it and I'd still be cleaning it off of my family's machines - hopefully before they'd given their credit card details to a website in Botswana. The social engineering aspects would be harder to pull off than your typical ActiveX exploit, but they'd still have enough success to make it worth their effort.