31 posts • joined Thursday 19th April 2007 17:36 GMT
Ted Nelson wrote on a similar theme, in his 1970s book "Computer Lib." He denounced what he dubbed "Cybercrud," the tendency to use computers and cybernetics as a catch-all excuse ("sorry, the computer is down") or as an impenetrable shield to prevent the analysis of weak theories.
It's amazing how little has changed in 35 years. Or not.
Flash eats laptop batteries
It's true: Flash is so inefficient that it burns CPU cycles excessively. I know because I have a laptop with an old battery that only lasts about 20 minutes. If I hit a page with Flash, even if the Flash isn't playing yet (like a YouTube page), the battery dies within a minute. It just eats my batteries.
Brilliant Business Model
This is a work of absolute genius, it demonstrates the new business model for the Internet age. Forget actual product development, just sell the concept and the empty package. It's pure profit.
Back around 1976, I took some early 8080A microcomputer programming classes, and I'll never forget what happened at the first day's lecture. The professor described how he was a former engineer for Univac, and how, as a hobby, he still maintained the last working Univac II system in active use. It was used by a local company to calculate payroll, not a terribly complex task by modern standards, although it was a huge breakthrough back in the late 1950s. Of course he was baiting the students, and one of them asked the obligatory question: "why would the company keep such a dinosaur in production, when you could do the same job on one of these new microcomputers?" The professor thundered back, "because the old computer is already PAID FOR!"
I thought about that incident a lot, especially in 1980 when I sold that company a small microcomputer with payroll software.
I kept wondering why I wasn't being offered the Safari 3.2 update, then I realized I installed the Safari 4 beta from the Developer program. The 4.0b has been completely stable and worked fine with SIMBL plugins like PithHelmet (even if it doesn't officially support that version). I wondered if 3.2 > 4.0b, so I uninstalled the 4beta with Apple's official uninstaller, and Software Update then offered the 3.2 update. And then my problems started.
Now I am completely unable to open a new page or link in a tab; it causes 3.2 to crash immediately. This is a dealbreaker. This isn't a problem with accessories like SIMBL; I use nothing but PithHelmet, and it works fine with even the 4.0 beta (after editing the .plist to accept higher than authorized versions, so it doesn't auto-disable). But even with PithHelmet disabled, Safari 3.2 just does not work. Now, even worse, my reinstalled 4.0b halts intermittently with a warning that no new page can be opened, asking if I want to quit immediately and force-close all open browser windows. Even if I say no, I get a second warning that some system resources seem to be missing and I should reinstall Safari. This is untenable; a proper uninstaller shouldn't leave resources missing, especially if I just go back and reinstall. Perhaps the 4.0b installer only works over earlier versions, not 3.2. If so, Apple should release an updated beta installer that works over 3.2.
Well, jeez. It now looks like I'm in a mess, due perhaps to incomplete uninstalls or incomplete reinstalls. I've tried reinstalling the 4.0b but the problem persists. If I can't get this working, I'll be forced to do an Archive and Install of the whole OS.
The True Reason
I found an analyst's remark, which I can still quote from memory, that explains the true reason for the iPhone's poor battery life:
"Battery life sucks because YOU CAN'T PUT THE DAMN THING DOWN!"
I thought that was particularly perceptive.
Some people will believe anything
I am embarrassed that I know how this hoax got into circulation. It was from some stupid joke on a stupid TV show (that I will not name, to save myself further embarrassment).
One of the characters is on a movie set and he says to some stoners on the crew that he's going back to his trailer for some Chocolate Quik (note: not "quick," Quik is a US brand name). The stoners think he's using some new slang for going back for a toke of weed. So that becomes the running joke, they hang around the set during breaks chatting about how much they like Chocolate Quik. Then one day he says "oh man I just bought some Strawberry Quik!" So the stoners have got to see this new primo product, they go back to his trailer and find out it really is just milk and powdered flavoring.
It appears that the procedures described by Ms. Bryson are illegal under US voting rules. The rules prohibit giving voters receipts that show which candidate they voted for. This is to prevent vote-buying: if you had a receipt, you could take it to your political boss, show him you voted for his candidate, and collect your payoff.
Furthermore, it would be illegal to put both a voter's ID and the candidate he voted for on a receipt; that would violate the secret-ballot rules (in addition to aiding vote-buying).
For details on the legalities of voting systems, I always point to CompSci Professor Doug Jones' authoritative website on voting systems:
In particular, I recommend his essay "A Brief Illustrated History of Voting."
I am continually astonished when these modern voting systems don't follow the simple, basic procedures that have been well established for hundreds of years, as described in Prof. Jones' history paper.
Yeah, I remember selling these IBM units, back in the day. IIRC the IBM Portable came out a little after the Compaq Portable shipped. I had some customers who were IBM-only shops; they lusted for the Compaq but couldn't get authorization to buy one. So when this IBM shipped, we sold a few... but only a rare few.
One of the problems with this unit was the floppies, most dealerships wanted to do the normal "buildup" process, so they'd buy the base units with 1 floppy drive, expecting to add a generic second drive. But those persnickety IBM-only customers often objected when we'd ship them a built-up machine with an added second floppy disk drive, but without the IBM logo on the 2nd drive bezel.
These Compaq and IBM portables were never designed to have hard drives installed, they were dual-floppy units at best. But a lot of the massive size of the machines was for card slots, which nobody really used. Then some smart guy had an idea for a "Hard Card," a hard drive and controller on one card, just pop it in and you've got an IBM XT equivalent. Oh man I sold tons of those Hard Cards. Anyway, I suspect this IBM portable might have this type of add-in, as there really isn't any other way that I know of to put a hard drive in this CPU.
I remember when that Commodore unit first shipped, it reminded me of the Otrona Attache, it's a similar form factor that came out a couple of years earlier.
This is pretty much the minimal cabinet you can squeeze a 5" CRT and two floppies into.
With the SSD option, the MacBook Air is now a geek lust object. Too bad you can't afford it, but oh if you could, you'd be the envy of every geek. But you can just afford the sucky iPod-style drive. Oh well.
This is what makes products lust objects. Jobs and Ive know it.
Don Mitchell: Oh yeah, you triggered an old Cray memory. Yes indeed, Apple did buy a Cray to design their computers. I remember attending a lecture by Apple's Advanced Technology Group about this. They said they were using the Cray during the day to design custom chips, but at night, they turned the Cray over to the ATG for experimentation.
The ATG lecturer said that soon desktop computers would have the power of a Cray, and they intended to find out what you could do with all that power in a single-user computer. Then they made startling predictions (for 1985) about what a Cray-class personal computer would be like: a CPU running at over a gigaflop, a gig of RAM, a gigabyte of disk storage, a 1-megapixel 24-bit color display, 1Mbps networking, etc. Those features all sounded astonishing because a system like that cost millions of bucks. But if they had to spend millions just to run the experiments that would show where the future of computing would go, they had to do it.
Alas the ATG was disbanded shortly after Jobs returned, but their influence is still felt widely throughout Apple and the entire industry. And their predictions of the future of personal computing came true even faster than they expected.
Ah, that's beautiful. I never saw a Cray 1, but I did get a good close-up look at a Cray X-MP sometime around 1985. An animator friend was doing some work at Digital Productions and said I should come over and take a tour. They had just finished production on "The Last Starfighter" and were working on a film for artist Matt Mullican. I got a good look at some convoluted animation software by Symbolics, and listened to their technicians griping that no matter how powerful a Cray they had, the animators would invariably choose the most computationally intensive effects, so all that CPU power was useless: their current productions were still taking 4 hours to render one frame, just as it had taken 4 hours in all their work in years past, before they bought the Cray. The 4hr/frame scenes I saw rendering would run in realtime today on modern graphics cards, but back then, this sort of graphics rendering seemed like a damn miracle.
Anyway, on my tour around the office, I saw a framed computer graphic on the wall. It caught my eye because I'd seen it before, in one of my first computer graphics textbooks. It was a pen-plotter artwork of outlines of little birds, swirling in a pattern that looked like a magnetic field; it looked like it was produced with early-1970s technology. As I paused to admire the print, someone walked by and noticed me noticing it. I mentioned that I'd seen it in my textbook and had always admired it, and asked where he got it. He smiled and said, "I made it myself." I suddenly realized: I'm talking to John Whitney Jr.
The Estimator's Rule
There's an old recursive rule called The Estimator's Rule: "it takes more time than estimated, even after taking The Estimator's Rule into account."
But in a way it's true; it's actually an interesting problem in computer science. Computers are notoriously bad at giving accurate estimates of program runtime. If you run a program to estimate runtime, that estimation program takes CPU cycles away from the program you're monitoring. You could add another level to take into account the effect of the monitoring program, but that's another CPU hit. Repeat ad infinitum. You never converge on an accurate estimate; each added level of monitoring pushes you further away from it.
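The regress can be sketched with a toy model (the runtime and overhead figures here are invented for illustration, not measurements): assume each level of monitoring adds a fixed fractional CPU overhead on top of everything beneath it, so the original estimate drifts further from the actual runtime with every level you stack on.

```python
# Toy model of the monitoring regress: the base program takes BASE seconds,
# and each monitoring level inflates everything beneath it by OVERHEAD.
# Both numbers are assumptions chosen for the demonstration.

BASE = 100.0      # unmonitored runtime, in seconds (hypothetical)
OVERHEAD = 0.05   # CPU cost of one monitoring level (assumed 5%)

def runtime_with_monitors(levels: int) -> float:
    """Actual runtime once `levels` monitors are stacked on the program."""
    return BASE * (1.0 + OVERHEAD) ** levels

for levels in range(5):
    actual = runtime_with_monitors(levels)
    print(f"{levels} monitor level(s): actual runtime {actual:.2f}s, "
          f"original estimate off by {actual - BASE:.2f}s")
```

Because the overhead compounds, the gap between estimate and reality grows geometrically with each level, which is the divergence described above.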
Well, fortunately, ballpark estimates are usually good enough. I notice similar behavior copying files in MacOS X: the initial time estimates can be insane, but they quickly settle down to a vaguely accurate ETA once it starts tracking how fast the transfer is actually moving.
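That settling-down behavior is easy to reproduce: smooth the observed transfer rate with an exponential moving average and compute the ETA from the smoothed rate. A minimal sketch (the class name, smoothing factor, and byte counts are all made up for illustration; this is not how MacOS X actually implements it):

```python
class EtaEstimator:
    """ETA from a smoothed transfer rate (exponential moving average)."""

    def __init__(self, total_bytes: float, alpha: float = 0.3):
        self.total = total_bytes
        self.alpha = alpha      # smoothing factor (assumed value)
        self.rate = None        # smoothed bytes/second, None until seeded
        self.done = 0.0

    def update(self, bytes_this_tick: float, seconds: float) -> float:
        """Feed one progress sample; return estimated seconds remaining."""
        self.done += bytes_this_tick
        sample = bytes_this_tick / seconds
        # The first sample seeds the average; later samples are blended in.
        self.rate = sample if self.rate is None else (
            self.alpha * sample + (1 - self.alpha) * self.rate)
        remaining = max(self.total - self.done, 0.0)
        return remaining / self.rate if self.rate > 0 else float("inf")

# A slow first second gives a wild ETA; a steadier sample settles it down.
eta = EtaEstimator(total_bytes=100e6)
print(eta.update(1e6, 1.0))    # early estimate, from one slow sample
print(eta.update(10e6, 1.0))   # later estimate, already far more sane
```

The insane initial estimates come from the first one or two samples dominating; once enough samples are blended in, the smoothed rate tracks the real transfer speed.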
Excellent job of muckraking.
Great scoop you dug up there. I particularly loved the quote from the Wikimedia attorney, in denial that they knew about her arrest records, "We have, in our records, no evidence of any such thing."
Guess what? Now they do. I looked up a couple of arrest records online at the Pinellas County Sheriff's Office (one DUI arrest and one DUI plus hit-and-run) and added them to the Wikinews story:
This will be a test of Wikimedia. The links lead to incontrovertible facts. Will they stand, or will the links be removed under some lame pretext like NPOV?
I noticed that upside down iPhone and I wondered how long it would take to appear on the web.
BTW, that show "Two and a Half Men" is really good. The writer almost redeems himself for his prior crap "Dharma & Greg." No, I take it back, nothing can redeem someone for inflicting that show on the public.
Good math = bad physics?
It is at times like this that I recall one of my favorite math/physics quotes. I forget who said it; I think it was Stanislaw Ulam: "insofar as mathematics accurately describes reality, it ceases to be interesting."
And that's the problem with crap like this. It's interesting math but it's useless as physics.
Yeah, I noticed the iPhone immediately, and was annoyed at the fakery. What really stood out was that the phone didn't blank the screen when he held it up to his ear. You can see it in the photo accompanying this article: the phone is lighting up the guy's face. In actual use, the iPhone has a proximity sensor, so the screen blanks when you put it to your ear.
Wow, that's a blast of nostalgia. I remember writing accounting software for the PET. We wrote in UCSD Pascal on Apple II computers, it was easy to port to other UCSD platforms. We wrote once and recompiled for use on the PET, Ohio Scientific, and a few other obscure platforms (none of them over 2MHz, ha). The Apple version was the only software that ever shipped.
But really now, you should have asked if you could open up the PET and taken a pic of the insides. IIRC it was mostly a big, empty plastic box with a little motherboard at the bottom. The top part was hinged at the back, you raised it up like the hood of a car, it even had a little rod to prop up the top.
TeeCee is on the right track with the "iWipe." I keep my iPhone in my pocket next to a folded microfiber cloth, the kind you use to clean eyeglasses. Most of the time, the screen is rubbing against the cloth and cleaning itself. I carry the cloth to clean my glasses anyway, works great on the iPhone screen too.
I've been there... sorta.
These things happen. I worked at a graphics bureau, and we got an expensive new ($500k) imagesetter. The crate was impressive; it had an uncrating procedure that took 30 minutes just to release the shipping locks. Then the idiot installers got it out of the crate, rolled it on its own casters down the dock, and bounced it a mere 2 inches over a concrete curb. The imagesetter was ruined; the light-tight interior got bent and developed light leaks we could never track down. The company dispatched engineers, who worked for days but couldn't fix it. They ended up replacing the whole machine and writing off the old one as damaged in transit. We were lucky the damage occurred out on the dock, before we took delivery, or we could have been screwed.
Ah, I am heartened to see that people are still traumatized by the old Nassi-Shneiderman charts that were all the rage when I was in college in the mid-1970s. We had one textbook that translated all of Wirth's Pascal flowcharts into NS charts. It was really clever: a well-designed NS chart had a 1-to-1 correspondence to Pascal code, so the chart was unambiguously convertible into code.
This all worked fine in theory; then I got my first professional programming job. We were writing accounting software for the Apple /// in UCSD Pascal (oh, what a dead end), and the designers spent all day plowing through algorithms in Knuth and Wirth books, cranking out NS charts for us lowly coders to implement. But alas, the designers were not coders, and their charts rarely worked right. This resulted in continual bickering between coders and designers: as mere coders, we were not privileged to alter the designers' structures unless we could crank out our own NS charts representing our proposed code, and the designers all agreed that our changes were correct. Sheesh!
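For anyone who never met an NS chart, the 1-to-1 mapping works roughly like this. Below is a crude ASCII rendition of a tiny chart (the condition and actions are invented for illustration), followed by the code it mechanically converts to, here in Python rather than the Pascal we used:

```python
# NS chart (ASCII sketch, invented example):
# +------------------------------+
# | read balance                 |
# +------------------------------+
# |       balance < 0 ?          |
# |   yes /            \ no      |
# +---------------+--------------+
# | charge fee    | add interest |
# +---------------+--------------+
# Every box is one statement; a split box is an if/else. The notation has
# no way to express an unstructured jump, so the translation is mechanical.

def post_account(balance: float) -> float:
    """Mechanical translation of the chart above (toy example)."""
    if balance < 0:
        balance -= 25.0        # charge fee
    else:
        balance *= 1.01        # add interest
    return balance

print(post_account(-100.0))
print(post_account(100.0))
```

That rigidity is the whole point: because the chart admits only structured constructs, coder and designer could argue about a chart knowing exactly what code it implied.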
This scam has been all over Craig's List for the past 2 or 3 months; it's usually an ad for purebred bulldogs, pugs, or other expensive breeds. It seems to have dropped off lately; the CL people seem to have caught on to the scam and mercilessly cancel the messages (animal sales are prohibited on CL anyway). So the scammers have moved on to other venues in search of new victims.
Macintouch.com leads today with a link to this very interesting linux listserv message:
The message claims to be from LMH, who identifies himself as David Maynor, and says that Infosec Sellout is Jon Ramsey of Secureworks.
The message looks more like disinformation and self-aggrandizing hype to me.
Von Clausewitz observed "War is the Province of Friction, wherein the simplest things become difficult." On a complex, friction-filled battlefield, even the simplest weapon systems break down in unpredictable ways. High technology is unlikely to change this problem. In the Fog of War, high tech weapons are the least likely to perform as expected.