103 posts • joined 1 Jun 2007
Hi, I agree with Paul that a different term is needed. Before reading this article, all I knew about Big Data was that vendors of Big Storage were talking about it, so by association I assumed that Big Storage implied Big Data.
After reading this article, it seems that we need to break the assumed link between Big Storage and Big Data. You could have Big Storage for telephone call records and still be in the world of SQL "small data" (as defined in the article: easy-to-understand, simple relationships). "NoSQL" isn't a good replacement for "Big Data", because it seems to be mostly concerned about scalability and not a fundamental difference in the quality of data. So we obviously need a new term to describe what is A New Level Of Complex Data That Can't Be Easily Generated From Set-Theoretic Operations On Atomic Data.
Can anybody come up with a better acronym than ANLOCDTCBEGFSTOOAD? Perhaps that would be best, because ANLOCDTCBEGFSTOOAD belongs to me, it's mine and I own it.
Does this mean that the Erlang language doesn't work well?
Or is it just bad application coding?
It's called Computer Science, and...
...Science is not about man's control over nature, but about man's control over man; C.S. Lewis said that (or something reasonably close to it). It is therefore to be expected that jumps in science would go along with jumps in control.
As long as there is IT unemployment, there will be thousands of us willing to help build the gilded cage. (Theater is also profitable, as long as you have the right investors.)
It goes back farther than Vietnam
American troops first encountered Jihad warriors in the Philippines after the Spanish-American War. The Moro juramentados were the terror of our soldiers because they would pre-emptively apply tourniquets to their limbs and wade into a group of our soldiers, dispensing death and destruction while insensible to limb injury.
Amazon's Wetware Mechanical Turk
Want a few pennies for doing something on the web? Amazon is flogging Artificial Artificial Intelligence at:
I don't know whether this Amazon is THE Amazon, but it certainly shows how CAPTCHA workarounds can be organized on a massive scale.
question about data traces
I'm worried about the wear-levelling features that I've heard are now embedded at the flash controller level out of your normal reach. Does that mean that traces of data on a region of flash that has been remapped to spread out the write cycles will remain and be somehow available for forensic examination? Or do the controllers zero out flash after it has been copied and remapped?
disappeared into Compaq, later part of HP?
That's only accurate in the sense of corporate visibility. Tandem had already quietly appeared and then disappeared into the banking and stock trading infrastructure of the world. For example: if you gave ATM network owners a choice between having their water or their Tandem systems turned off, there would be a run on Porta Potties.
Scaling to serve read-only webpages is a piece of cake since there is no need for session management. Scaling to serve sessions is harder, and gave rise to the whole J2EE stack and other expedients. Scaling to ensure financial transaction integrity and continuous service is another matter altogether, since you can't solve the problem by just throwing machinery at it: this is why the secret sauce has always been the Tandem OS and its transaction monitor as much as the changing hardware generations underneath them.
Although I'm only a Tandem and not a Stratus programmer, my understanding is that Stratus at least started out with a more hardware-oriented fault-tolerance approach than Tandem. But Tandem started out as a shared-nothing system: no single hardware failure could take the system (and therefore the application) down. As such, Tandem was solidly grounded in both software and hardware fault-tolerance.
Although the backupability and network mobility of virtual machines has tantalizing implications for near-continuous availability, I'm not aware of any VM-based system that guarantees no lost or duplicate transactions. Until one arrives, ATM and stock-market operators would rather fight than switch.
maybe not more creative...
...but they sure have a lot more to deal with:
"What we used to call systems integration is just system administration nowadays"
The wider palette of paradigms certainly provides greater scope for creativity than the "FORTRAN, Assembler or COBOL" trinity of bygone days. (Whether the average programmer takes full advantage of it is another question.)
looks like the subjunctive has already bitten the dust...
This article starts out with "is proposing English adopts a phonetic approach".
I thought it had to be something like "is proposing that English adopt a phonetic approach", or perhaps "is proposing that English be spelled more phonetically".
P.S. A less-than-phonetic spelling standard means that the same idea can be read anywhere regardless of local pronunciation practice. You say tomaytoes, I say tomahtoes, we spell tomatoes. If it's not broken, don't fix it.
heartbeat pressure waves travelling through elastic arteries...
...are analogous to electrical pulses travelling through transmission lines with capacitance: waveforms will deform.
Blood vessels will constrict to different degrees at different times (analogous to changing the capacitance in an electric line), so the agreement between the two heartbeat waveforms can be expected to fluctuate over time. Whether the signal degradation would be enough to affect the timing scheme is open to question. I just hope they're testing over a wide range of constriction.
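To make the transmission-line analogy concrete, here's a toy sketch (my own invented numbers, nothing to do with the actual research): treat an artery segment as a first-order RC low-pass filter, where a more compliant vessel plays the role of a bigger capacitance. The sharper the pulse and the more compliant the vessel, the more the waveform smears and the later its peak arrives.

```python
import math

# Toy model: artery segment as a first-order RC low-pass filter.
# alpha plays the role of 1/(RC): smaller alpha = more compliant vessel.

def rc_filter(pulse, alpha):
    """First-order low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1])."""
    y, out = 0.0, []
    for x in pulse:
        y += alpha * (x - y)
        out.append(y)
    return out

# A sharp heartbeat-like pressure pulse centred at sample 50.
pulse = [math.exp(-((n - 50) / 5.0) ** 2) for n in range(200)]

stiff = rc_filter(pulse, alpha=0.5)        # stiff vessel: little smearing
compliant = rc_filter(pulse, alpha=0.05)   # compliant vessel: heavy smearing

peak_stiff = stiff.index(max(stiff))
peak_compliant = compliant.index(max(compliant))
print(peak_stiff, peak_compliant)  # the compliant vessel's peak arrives later
```

Whether the shift is large enough to break a heartbeat-based timing scheme depends on the real physiology, which is exactly why testing over a wide range of constriction matters.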
Give us a fag, mate?
They're below the level of assembler, and even assembler is below the level at which I program and could hope to understand what is going on.
I'd like to see some certification method and accountability for what is going into the writable control store, not just a line saying "applying microcode updates" flying by whenever I start up my computer.
There's nothing wrong with sensitivity and compassion per se.
Although I hesitate to wade into this sea of raging linguistic testosterone, duty compels.
The ridiculousness of the situation really lies in a government or any other corporate body preaching sensitivity and compassion. These are, and always have been, attributes of individuals and not of groups. Governments only seem to have them when good individuals succeed in manifesting them in the course of their official duties.
(Religions are also groups, but membership in them is usually voluntary and preaching is expected in any case.)
You can't legislate sensitivity any more than you can legislate a head for numbers or singing ability. Such initiatives are superfluous when good intentions are present. Their apparent success in other cases will only be a veneer of patronizing hypocrisy.
Bottom line: right sentiments, but utter cluelessness as to an appropriate venue.
comp sci for an IT tech job?
That's the real tragedy. I thought that a comp sci degree was so you could go out and automate serious stuff from scratch, not just chase the latest web programming paradigm.
I think that we're in a period of degeneration through focus on details and flourishes, not the basics, kind of like the Baroque decadence.
Whether or not a Classic period will follow is anyone's guess, but my guess is that the Smalltalk crowd may figure prominently in another episode of the Eternal Return featuring Squeaky-clean code.
(With apologies to aManfromMars)
How to keep email off a webserver?
All I can think of is putting the Internet-facing mail server in the DMZ and storing the emails downstream, behind an internal firewall.
Any other suggestions?
not all time-related errors can be exposed by setting the clock forward
The first Patriot missile batteries deployed during Desert Unpleasantness Part I had a timing error that was only exposed if the system was left on for a sufficient length of time, allowing decimal-to-binary fraction rounding errors to accumulate through repeated addition. This had at least two consequences:
1) The missile performed perfectly according to its lights, and went a number of meters to one side of the target instead of hitting it.
2) The magnificent explosions hailed by the media as Scud interceptions were really Patriot self-destructs to avoid mischief on ground impact.
The problem was later solved by a software update.
In this particular case, code inspection plus numerical analysis might have reasonably been expected to reveal the problem.
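For the curious, the arithmetic behind that drift is easy to replay. Per the published post-mortems, 0.1 s was stored as a chopped binary fraction with 23 bits after the binary point, losing roughly 9.5e-8 s per tick; the sketch below accumulates that loss over a 100-hour uptime (the Scud velocity is the commonly quoted approximate figure, not anything I can vouch for personally):

```python
import math

# Replaying the published Patriot drift arithmetic: 0.1 s stored as a
# chopped binary fraction, with the chopping error accumulating every tick.
FRACTION_BITS = 23                   # bits kept after the binary point
stored_tenth = math.floor(0.1 * 2**FRACTION_BITS) / 2**FRACTION_BITS
error_per_tick = 0.1 - stored_tenth  # ~9.5e-8 s lost per 0.1 s tick

hours_up = 100                       # battery left running ~100 hours
ticks = hours_up * 3600 * 10         # the clock ticked in tenths of a second
clock_error = ticks * error_per_tick # ~0.34 s of accumulated drift

scud_speed = 1676.0                  # approximate Scud velocity, m/s
miss = clock_error * scud_speed      # tracking gate off by over half a km
print(f"drift: {clock_error:.4f} s -> range-gate error: {miss:.0f} m")
```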
Ion drive... Disney... 1957 or thereabouts...
Seen on the Disneyland show as well as in theatrical release.
Everything was written by somebody, so question everything
Even the Encyclopedia Britannica is a product of practically faceless authors. Get used to it.
Ceci n'est pas un bijou
Already been done, 'nuff said.
why on earth would you use flash for swap?
Even with wear-levelling, flash memory is really only appropriate for few-write, many-read use as already noted. Why on earth would anybody (let alone a technologically sophisticated company like Microsoft) recommend its use for virtual memory swapping?
I was, frankly, baffled when I first heard about this, and remain so to this day.
manned space program benefits
I grew up watching Commando Cody, Jet Jackson, the Mercury and Gemini flights, Fireball XL-5, Star Trek, and the Apollo missions. Although all of that stirred and inspired me, rational discussion of manned space program benefits needs to be based on something other than nostalgia.
My impression is that "space program technology" is mostly engineering, not basic science, and that most of it was based on military prior art, not NASA pioneering from scratch. The basic science and technology had already been worked out in military programs from the V-2 to the Redstone and Atlas boosters, as well as commercial efforts like the AM transistor radio: Frankie Valli wailing "Sherry" over the first shirt-pocket transistor radios did more to promote the miniaturization of electronics than Sputnik ever did.
As for computers, the space program has lagged, rather than led the way. This is entirely appropriate, since you want reliable rather than flashy technology in outer space.
Although there were some basic science discoveries attributable to non-military space programs (the van Allen radiation belt, Hubble imaging, cosmic ray measurements etc.), most of this could arguably have been accomplished without ever putting anybody in orbit.
It seems, therefore, that the manned space program was vastly oversold in terms of its tangible benefits to the taxpaying public except for TANG, the convenient orange-flavored sugar beverage beloved of our heroic astronauts. (The fact that "tangible" begins with TANG will not be lost on conspiracy theorists.)
The real motives for the manned space program probably lie in secret military missions piggybacked on an ostensibly public program and convenient excuses to funnel ever more dollars into the military-industrial complex.
If some people enjoy the show while their pockets are picked, I suppose that's more pleasant than my depressing alternative. But I also believe that the truth will make you free, rendering truth all that more precious in an increasingly unfree world.
Is anybody using OpenBSD on it?
If so, what is your experience?
Let me see...
The Force will provide everything you need (to code a mashup whose critical parts are provided at the sole pleasure of another company that can withdraw them at any time).
Have I understood correctly?
Douglas Adams (of blessed memory) already predicted this
I believe it was on the planet Frogstar: long-bearded passengers strapped into seats and sedated via IV, resuscitated for a minute or two every few years to hear the recorded announcement that the flight would be leaving momentarily...
It's the professors, not the school
I don't believe that there are good or bad universities, just teachers that are better or worse. Unless you're a classic self-starter (e.g. Albert Einstein, Bertrand Russell, Richard Feynman), the real purpose of universities is to increase the chances that you will bump into someone dedicated to the life of the mind who will by example show how much entertainment and utility hide in the outer convolutions. It is by their interest and enthusiasm that great teachers communicate a love for their subject. Sometimes you take away a shared appreciation of that subject, sometimes you just take away the desire to find something that moves you as deeply. In either case you are fortunate. Many grab the sheepskin and run, never having had the experience which in any case can never be guaranteed.
(Reminds me of an old joke: "You may not know the right answers, but at least you'll learn how to ask the right professors...")
If you go back to the origins of the university, you'll find that they were student associations that paid standout lecturers and standup philosophers to entertain them, much as fanboys gravitated to Plato in the grove of the Academy. Later on came the transformation of the university into the institution of primary importance and of the professor into a mere university employee (with few exceptions). In the origins of the university lie the original reason for and the true utility of university education.
I smell a rat
Why is your government suddenly so forthcoming about laptops gone missing? Methinks there may be a hidden agenda to bludgeon the public into acceptance of the perception that privacy is impossible.
If I hadn't been tweaking old COBOL and TAL code for decades in response to changing business conditions, I'd be inclined to agree. However, I ***have*** been looking at old Tandem programs, understanding enough to modify them, retest and send them back into battle: we're not talking orphan objects here. The complexity usually comes more from years of accumulated real world adjustments than spaghetti coding. I will grant that there must be pathological cases in which people won't go near a critical module for fear of breaking the system, but I don't see those because I don't get hired to work on them.
Would I rather rewrite the old Tandem apps completely? Yes. Would I choose different languages? Yes. Would my clients consider the total cost and associated risk justified? Almost never.
But speculating on variations in application code quality misses the point, which is: the Tandem architecture provides scalability, data integrity and uptime that other systems simply can't touch. When you consider that there has been only one major architectural change in the system since the late 70's (K-series to S-series) apart from the normal Moore's Law upgrades, that's pretty amazing.
Does rehosting on Itanium make a difference? Not to me as a legacy maintenance programmer. Neither did the move from proprietary CISC CPUs to MIPS RISC, or the failed attempt to move from MIPS RISC to the DEC Alpha of blessed memory. Since Pathway and SQL were put into place in the early eighties, Tandems have always programmed the same for the most part. System managers and the people who pay for the hardware care about the hardware changes in order to achieve their required Transactions Per Second capacity but are quite happy that they have never had to suffer through the software equivalent of a forklift upgrade. Tweaks and minor conversions, yes, but never a "bet your business" kind of upgrade.
Of course, I do other stuff on the side as well as on my own time, and I'm sure that truck drivers like to drive sports cars now and then as well.
As far as I know, the Tandem is the only long-lived commercial system designed from the get-go for NonStop operation, data integrity and extreme linear scalability (no SMP "knees" in the power versus CPUs curve). The architecture reflects these baked-in requirements, and the OS and middleware take advantage of them. Once you're used to this kind of engineering, almost everything else looks like improvisation.
Re: you're obviously paying an awful lot
There are various reasons why the NonStop systems are beloved of large financial institutions, among which:
1) You never had to (or have to) take them down once a month or so to reclaim memory leaked by the OS.
2) They are the gold standard for transactional integrity so you never had to tell anybody "Sorry about your missing 10,000-share trade, we don't seem to have any record of it that I can access at the moment."
3) Programs written in the 80's are still earning their keep. Since Tandems have traditionally been back-office systems, software investments get amortized over ridiculously-long lifetimes because they don't get rewritten every four or five years as a result of language wars.
4) Tandems were designed to make their owners happy, not their programmers: the exact reverse of Unix. As a Tandem programmer since 1979, I know whereof I speak: some of the Tandem utilities and the native command shell are downright clunky compared to Unix. I also recall reading an early Unix paper bragging that a development machine had stayed up for a whole week. Tandems usually stay up until a new OS rev has to be cold-loaded (months, sometimes years). ATM networks love this.
5) I still maintain venerable COBOL transaction-processing code that is not only well-amortized, but when plugged into the Tandem architecture provides NonStop, ACID database updates with absurdly scalable performance that nobody else can touch. On the multi-node Tandem architecture, systems don't failover to each other; transactions fail, get backed out and backup processes distributed across various processors pick up the slack.
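Not Tandem code, obviously, but the shape of that fail/back-out/retry pattern can be sketched in a few lines (everything here is a made-up illustration, not how the transaction monitor actually works):

```python
def transfer(accounts, src, dst, amount, crash=False):
    """All-or-nothing update: mutate a private copy, commit only on success."""
    work = dict(accounts)               # private working copy
    work[src] -= amount
    if crash:                           # simulated processor failure mid-flight
        raise RuntimeError("primary process died")
    work[dst] += amount
    accounts.clear()
    accounts.update(work)               # commit the whole result at once

accounts = {"alice": 100, "bob": 0}
try:
    transfer(accounts, "alice", "bob", 40, crash=True)
except RuntimeError:
    pass                                # transaction backed out: nothing committed
assert accounts == {"alice": 100, "bob": 0}

transfer(accounts, "alice", "bob", 40)  # "backup process" retries and succeeds
print(accounts)                         # {'alice': 60, 'bob': 40}
```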
Although COBOL isn't my favorite language, it's the end result that matters: mission-critical enterprise processing reality, not just the hype, and rock-solid since at least the mid-eighties. Sometimes I get to write Tandem stuff in C, but for the most part I indulge in shell scripting, Tcl/Tk, Python etc. at home. But I digress...
Getting back to the putative subject of this comment (paying an awful lot), the bottom line is that Tandem provides great TCO and a solid, scalable base for mission-critical enterprise systems. As for the alternative, you can duct-tape ten million pigeons together but it will never quite replace the 747.
"What fool put carpet on the wall?"
(he said, failing to notice that he had fallen down.)
It all depends on your requirements...
...otherwise you're arguing in a vacuum.
Just look at the history of the Web. Starting out with static web pages pulled right out of a filesystem, it's no surprise that webmasters didn't immediately upgrade to transactionally-atomic, normalized relational databases. MySQL started out as a workalike of mSQL, a much less ambitious file manager with an SQL query-language layer spread over it, and was there at the right time with something good enough to power the adolescence of the Web.
As many websites matured into order entry frontends and customer service portals, business requirements started to make themselves heard. MySQL has been adding features to meet them, and the enterprise DB vendors that had the features for years have been trying to raise web developers' expectations and lure them to real industrial strength platforms, the kind of infrastructure that runs banks and stock exchanges.
Do you need ACID properties for a blog? No. Can you currently afford ACID properties for search engine infrastructure? Perhaps not. They would be nice to have, certainly, and will eventually be taken for granted just like rural electrification and indoor plumbing. In the meantime, economics, response time and other requirements will require tradeoffs given the current state of the art.
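For anyone who has only ever served pages out of a filesystem, here's what the "A" in ACID buys you, sketched with SQLite since it ships with Python (the table and the simulated failure are invented for illustration): either both halves of a transfer happen, or neither does.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
db.executemany("INSERT INTO accounts VALUES (?, ?)",
               [("alice", 100), ("bob", 0)])
db.commit()

try:
    with db:  # one transaction: commits on success, rolls back on exception
        db.execute(
            "UPDATE accounts SET balance = balance - 40 WHERE name = 'alice'")
        # ...crash before the matching credit ever runs:
        raise RuntimeError("power failure between debit and credit")
except RuntimeError:
    pass

# Atomicity: the half-finished debit was rolled back, no money vanished.
print(dict(db.execute("SELECT name, balance FROM accounts")))
```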
I repeat: it all depends on your requirements.
beware of proprietary encryption
I always buy dumb USB sticks and do encryption on them from my PC. I distrust the U3 standard used to flog "secure" USB because it's just another example of Someone Else's Software that I don't control.
I currently use TrueCrypt, originally a Windows product but now available through a Linux GUI as well. There are also the free PGP and GPG encryption solutions. There's also a nice LockNote utility from Steganos, but it's limited to Windows.
No matter how carefully you secure your USB media, there are still data trace issues on the PC: when you decrypt, traces can be left in the OS swap space and/or temporary files which even when deleted can leave recoverable data in a filesystem.
There's also the issue of wear-levelling on the Flash-based USB drives which may be a problem if the USB drives even momentarily contain cleartext sensitive material, but I don't know how serious the exposure is for lack of technical information.
My current solution is to keep a small harddisk partition to stage cleartext files on, and then wipe or shred the entire staging partition when I'm done with the files.
Of course, all this is whistling in the dark since my personal data is most likely to leak during a bulk theft from a vendor or financial institution, but I'm a bit of a techie and it's interesting to learn about.
Wear-levelling means even more data traces
Wear-levelling means that secure delete by shredding a file may not really shred all of the file all the time. I love the convenience of USB flash drives, but am very worried about the security implications of the technology.
Hollywood hates children
OK, maybe a little overstated, but there is a lot of truth in it.
Ever since Buffalo Bob and Howdy Doody gave kids the Hostess Cupcake Hard Sell by lingering over cream-filled centers and chocolatey goodness, children have been relentlessly singled out, isolated and manipulated by both the advertising and entertainment industries.
One part of the dynamic is to turn the poor kids into miniature acquisitive adults by constantly reinforcing the equation "possessions = happiness", truncating their childhoods.
Another part of the dynamic is to blatantly sexualize children, leading to enormities like the JonBenét Ramsey case. I'm sure that this is responsible for a large part of the kiddie porn industry, and that it contributes to hazy thinking that children and adolescents are legitimate objects of sexual fantasy and worse. The industry take seems to be that if it helps move lipstick and designer label clothes, too bad.
This is the real problem. The use of entrapment has serious legal issues, but it's just a sideshow compared to the abolition of childhood. Deal with the cultural issue, stop manufacturing predators and the need, or perceived need for pre-emptive entrapment will become much less of a problem.
I take vigorous exception to...
...the notion that upfront analysis delivers software later and costs more.
My experience is that front-loading the analysis phase and debugging pseudocode instead of real code is much more productive than immediately slinging code and trying to retrofit changing specifications into it. Retrofitting a clarified spec into existing code forces hokey workarounds because few managers really believe the XP gospel about throwing the first one or two systems away. Calling it refactoring doesn't make the result any prettier unless you can throw away code in the process, but just try it!
That said, what I found very valuable in the XP literature was the incremental, iterative approach presented as a solution to a problem, i.e. risk management. Risk management is the strategic issue, and the choice between "prototype iteration until done" versus "spec it out and debug the spec first" is really a tactical decision that depends on conditions in the field.
If the client only has a vague idea of what is desired and what is possible, then throwaway prototypes that throw up forms on a screen that people can jab their fingers at are the way to go. Oftentimes, however, you are interfacing two systems whose detailed data formats and semantics must be nailed down before you can really begin to talk about the problem. One size doesn't fit all tactically, but the strategic concept of risk reduction is always applicable.
There is valuable content in the XP gospel, but it may not survive the slogan wars. It has penetrated some of the collective unconscious though, and can be expected to appear fairly soon with a new coat of paint as the paradigm du jour unless O'Reilly runs out of animals.
Growing up in the Sputnik era...
...of George Pal rocketship sagas and Walt Disney's Tomorrowland space-travel edutainment hosted by genial old Wernher von Braun, we American kids were exposed over and over again to pictures of Jodrell Bank whenever somebody wanted to underscore that we were living in a time of great scientific curiosity directed outward, away from the Earth.
The image still conjures up some of that excitement and wonder, just as the Northrop Flying Wing and Vulcan move some of us in ways that the more capable Stealth Bomber never will.
My guess is that if budgetary concerns don't kill it, the lawyers will. Can you say "attractive nuisance"?
re-use has been around for years...
...in the form of glue languages.
The previous shell example is fine for stdin/stdout text manipulation (and let's not forget awk!), but Python and some other languages are more powerful while preserving the glue mentality.
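As an example of the glue mentality surviving the move to Python: the classic shell word-frequency pipeline collapses into a few lines without ever leaving the language (the text and counts here are just a toy):

```python
import re
from collections import Counter

# Shell equivalent: tr -cs 'a-z' '\n' < file | sort | uniq -c | sort -rn | head -2
text = "the cat sat on the mat while the dog watched the cat"
words = re.findall(r"[a-z]+", text)          # split into lowercase words
top_two = Counter(words).most_common(2)      # count and rank in one step
print(top_two)  # [('the', 4), ('cat', 2)]
```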
In the words of Nikolai Ivanovich Lobachevsky (as reported by Tom Lehrer):
Let no one else's work evade your eyes!
Don't forget why the Good Lord gave you eyes,
So please don't shade your eyes,
And plagiarize! plagiarize! plagiarize!
(Though please remember to be calling it "research".)
As long as better software gets produced cheaper and faster without violating patents, go for it. That's why the Good Lord created glue languages.
I don't want to know how sausage is made...
...so I plan to consult only official information sources.
(Please don't complicate my life, just update me on the enemy du jour and the bling de rigueur.)
watch how fast this gets outsourced...
...and to whom
another one from "Duck Dodgers in the 24½th Century"
Porky: "Eager young Space Cadet reporting for duty, Sir!"
Not up to the quality I have come to expect of The Register...
...but, as with Wikipedia and a number of other sites I visit, entertaining and/or informative a fair amount of the time is good enough for me.
Out of curiosity, I just ambled over to Wikipedia to peruse their entry on The Register, which gets a much fairer shake than Wikipedia does here.
And once again, why pillory the somewhat open editorial apparatus of Wikipedia when no transparency or accountability is demanded of the "professional" news media, let alone the ability to contribute content? What antidote is there to concentration of media ownership other than wikis and blogs? And is it not to be expected that some wikis will bulk larger than others in the scheme of things?
Could one sparrow bear a coconut from Africa...
to England? Or two, if one grasped the front of the coconut and the other the nether extremity? And were the latter the case, would it not then lie within the realm of possibility that a plastique-bearing coconut could also be lifted by such sparrows? And had they been crossed with homing pigeons and trained to return to a target of choice, would not said ornithococological weapon constitute the gravest threat to freedom-loving peoples in general and Trafalgar Square in particular?
This grade-school obsession with the manners and modes of ingenious and gruesome (though highly improbable) death is highly entertaining, but let's also remember that if we just erect a cordon sanitaire around the Middle East (to protect them from us, not us from them) and then buy oil from the last man (or woman) standing, we will be able to do so in peace, comfort and security.
But that's not really the point, is it?
It's later than you think
Re another comment:
"Mainframes make so much more sense than the hundreds of smaller servers we see with all their installation, integration and huge heat/space problems."
It seems to me that the sloshing of computing power back to centralized mainframes will favor the consolidation of governmental power.
It's not too hard to imagine a world in which PCs will be confiscated (or simply no longer be sold) and replaced by state-approved thin clients in an effort to remove "weapons-grade" encryption capability from the grubby hands of hoi polloi.
It's my considered opinion that most of us IT folks have been busy building the gilded cage that everyone will eventually inhabit. Regardless of our personal scruples, most of us need the work and a job's a job.
There was never any need for the Boys from Brazil. The will to power (like the poor) will always be with us though the balance between it and the traditionally irrepressible anarchy of human behavior is swinging towards fascism, notably facilitated by the technology we develop and implement.
* Traceability of physical location by commercial activity (credit and debit card purchases)?
* Traceability of physical location by voice communications?
* Traceability of physical location by Internet communications?
* Transparency and data mining of free email?
* Voluntary and even cheery disclosure of social network connectivity over and above what can be deduced from email?
* Concentration of the media facilitating cultural and historical amnesia?
Way to go!
a similar experience in hindsight
Back in the eighties, PG&E (the local electric utility) came by to lop some branches off a large tree in my backyard that were growing too close to their high-tension lines.
They offered to remove the entire tree to end the problem forever, and I went along with it because I liked the idea of not having to rake leaves each year, having more sun for my plants, and not having to pay for the tree removal.
The day they came to "wreck the tree", my neighbor came over and was rather upset about it.
Looking back at
1) a now-unobstructed view of lovely high-tension wires,
2) the fact that I annoyed a neighbor without really intending to, and
3) the destruction of a good-looking tree that had been there for probably 30 years or more,
I wish I had left well enough alone and let PG&E just trim it now and then.
When I read this report of someone who wrecked a neighbor's tree or trees to satisfy a personal desire to "go solar", I think that something worse has happened: the cult of self-assertiveness has won out over getting along with your neighbors.
From my childhood up to now, I've noticed that it's increasingly common to live somewhere for years without even getting to know your neighbors, let alone cultivating good relations. Many of us increasingly assumed that we would move up and out, and so "neighborhood roots" were an antiquated concept. (William Whyte's "The Organization Man" describes this phenomenon, though in the context of the 1950's executive lifestyle.)
This has combined with the geographic dispersal of families to produce a level of social atomization such that our operative concept of "neighbors" is largely reduced to "coworkers". I'm all for getting along with people at the office and try to rub along happily with everyone else there but still, something important has been lost if my solar panels are more important than my neighbors.
Paris Hilton, because of the wistful yet lovely, questioning feeling conveyed by the gold-toned graphic.
The Eternal Return
With all due respect to Unices and the Linuces, their lineage is dominated by mainframe thinking: a single, highly valuable CPU being divided up amongst multiple tasks to obtain maximum utilization of a scarce resource.
The price of a CPU plummeted, but few people have shaken off the old assumption that CPUs are still best exploited as shrunken, multiuser, multitasking mainframes with all the baggage that this implies. Symmetric MultiProcessing hardware has been accommodated to some degree by improvisation on the monolithic OS designs, but multicore architecture will look a lot more like shrunken PCs on a network, suitably simplified for on-die networking etc., but with increasing amounts of private, per-core RAM that SMP doesn't really address.
Trying to take advantage of massively multicored hardware while dragging single-processor and SMP baggage along will necessarily produce its share of backward-looking monsters and things indigest. The recent claim by Stonebraker et al. that 10X to 20X improvements in database performance may be had is based on the assumption of a (gasp!) single-threaded database application running in a dedicated core with gobs of dedicated RAM. This bolshevik approach to the application-versus-kernel threading debate assumes that we will soon be living in a one-thread-per-core world, at least in terms of application design. Although some "housekeeping chore" cores may multithread at the application or the kernel level, new designs like the H-Store will throw out almost a half-century of mainframe-think and seize an entire processor for themselves without inconveniencing other applications on the die. (Just think of all the context-switching overhead that will suddenly disappear.)
What will the software infrastructure for these dedicated cores look like? My guess is that the winning software architecture will be microkernels churning away in their individual cores, loosely coupled with each other via a message-passing system. Although this approach exacts a price in terms of message-passing overhead, it more than repays it in parallelism and scalability unobtainable with the monolithic and SMP approaches.
I've seen the future, and it works: the Tandem K-Series, designed in the 70's, pioneered the massive commercial application of loosely-coupled message-passing kernels. For about two decades, the K-Series (and later the S-Series) processed most of the world's credit and debit card transactions, and powered major stock exchanges as well. (For all I know, they still may.)
Just as CPU architecture evolved from simple to complex instruction sets and fell back to reduced instruction sets in response to hardware evolution, the time approaches when OS and application architectures may experience a similar return to their roots as well.
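To make the loosely-coupled picture concrete, here is a minimal sketch in Go (my choice of language, not anything from the Tandem or multicore designs discussed above): each "core" is modeled as a single-threaded worker loop that owns its private state and talks to peers only through messages, never shared memory. All the names here are mine and purely illustrative.

```go
package main

import "fmt"

// Message is the only thing that crosses "core" boundaries;
// there is no shared mutable state between workers.
type Message struct {
	From    int
	Payload int
}

// worker models one core: a single-threaded loop with private,
// per-core state. No locks are needed because nothing is shared.
func worker(id int, in <-chan Message, out chan<- Message) {
	local := 0 // private, per-core "RAM"
	for msg := range in {
		local += msg.Payload // all state changes are strictly local
		out <- Message{From: id, Payload: local}
	}
	close(out)
}

func main() {
	in := make(chan Message)
	out := make(chan Message)
	go worker(1, in, out)

	// Send a few requests and read back the replies.
	go func() {
		for i := 1; i <= 3; i++ {
			in <- Message{From: 0, Payload: i}
		}
		close(in)
	}()
	for reply := range out {
		fmt.Println("worker", reply.From, "running total:", reply.Payload)
	}
}
```

The price paid is message-passing overhead on every interaction; the payoff, as argued above, is that workers scale out without any of the lock contention or cache-line ping-pong that shared-memory SMP designs fight against.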
This is really an article about hardware evolution...
...and how it invalidates the assumptions underlying earlier architectures.
Stonebraker is pitching the idea of "one thread, one core" and seeing how far he can go with it. (We all know how far the "one man, one computer" idea went...) If it takes the market a few years to provide enough cores per server to make it really take off, what of it?
It's practically a foregone conclusion that each core will have its own large, dedicated RAM to sidestep the symmetric multiprocessing shared-RAM access bottleneck.
Nothing new under the sun, at least nothing that would surprise a Tandem programmer. The Tandem designers figured out back in the seventies that loosely-coupled, independent processors were the best way to meet their self-imposed fault tolerance and linear scalability requirements. Although multicores on a single die won't be quite as fault tolerant due to physical and electronic proximity, they open up vistas of parallelism and linear scalability unachievable with SMP. The cores will probably use a single-threaded, message-passing microkernel instead of a more general-purpose, multi-tasking message-passing microkernel like Tandem did, and I think Stonebraker has assumed as much.
With continuing obscene increases in RAM, a single core could be tasked with handling standalone, in-memory manipulations of a subset of your tables, the centerpiece of Stonebraker's argument and H-Store design.
The late Jim Gray gets a lot of credit for bringing the sacred fire of true database religion from UC Berkeley to Tandem, where Stonebraker's H-Store has already been prefigured in the Tandem Disk Process. The Disk Process handles SQL queries meeting certain simplicity criteria for tables on the disk that it manages. The substitution of RAM for disk is already a commonplace, so where the H-Store breaks new ground is in giving each H-Store its own processor. The infrastructure will need just enough kernel to manage memory and message passing (the H-Store application being single threaded) with the result that it will fit into a multicore core buffed out with enough RAM.
This is why most of the paper seems to me like a respec'ing of the Tandem Disk Process for multicores, with a rethinking and reallocation of responsibility for fault tolerance, not an abandonment of the same. Recovery becomes more coarse-grained simply because the computing elements are so much more powerful than they used to be.
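A toy sketch of that "one single-threaded owner per partition" idea, again in Go (names and structure are mine, not Tandem's or Stonebraker's): one goroutine exclusively owns an in-memory table, and all reads and writes are serialized through its request queue, so no locking is ever needed.

```go
package main

import "fmt"

// Req is a request to the partition owner; the reply
// comes back on the caller-supplied Reply channel.
type Req struct {
	Op    string // "put" or "get"
	Key   string
	Val   string
	Reply chan string
}

// partition models the Disk Process / H-Store idea: a single-threaded
// owner of one table partition, holding all its data in private memory.
// Requests arrive serialized on one queue, so there is no locking and
// no context-switching between competing threads.
func partition(reqs <-chan Req) {
	table := make(map[string]string) // in-memory table, owned exclusively
	for r := range reqs {
		switch r.Op {
		case "put":
			table[r.Key] = r.Val
			r.Reply <- "ok"
		case "get":
			r.Reply <- table[r.Key]
		}
	}
}

func main() {
	reqs := make(chan Req)
	go partition(reqs)

	reply := make(chan string)
	reqs <- Req{Op: "put", Key: "acct:42", Val: "100", Reply: reply}
	fmt.Println(<-reply) // ok
	reqs <- Req{Op: "get", Key: "acct:42", Reply: reply}
	fmt.Println(<-reply) // 100
	close(reqs)
}
```

Give each such partition its own core and its own RAM and you have, in miniature, the shape of the argument above.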
patents are based on an obsolete paradigm of innovation and creativity
Copyrights and patents are based on the idea that individuals must be rewarded for innovation because otherwise they would slack off instead of benefiting society.
First of all, this just doesn't hold nowadays, since employees routinely sign over rights to all inventions in exchange for a health plan and the illusion of security. It's all too common for employees' real innovation to be stifled by their organizations, yet the patents end up belonging to the companies anyway. The rise of the great research organizations in the forties largely eclipsed lone geniuses like Tesla, Steinmetz and Edison, though even in the day of the lone inventor, Tesla had to cut a deal with Westinghouse, and Steinmetz with General Electric. Only the rare creator with business smarts, like Henry Ford, kept control.
Secondly, real artists produce great art because it is almost as necessary for them as drawing breath. This also goes for great programmers and scientists.
For example: Richard Feynman said that he was able to buy a nice beach house with the check from the Nobel endowment, but it can't have influenced his love for physics one way or the other. Einstein and the other Nobels? Almost certainly the same.
As for Sir Paul, he probably would have flooded the world with "silly love songs" even if it only provided him and his family with a roof over their heads and enough food to keep body and soul together.
Although I'm not a great programmer (just a journeyman), I dread the day my employers find out I would still do this for less even if I could drive a truck for more.
Bottom line: Innovation comes from innovators, not committees and organizations. Oppenheimer's reward for giving us the fission bomb was being excluded from the fusion project. How much money did Engelbart make off the mouse+GUI, or Nelson from hypertext? Ultimately, the innovators usually get left out in the cold because they are better creators than team players. The argument that patents are necessary to foster innovation is just a pious falsehood nurtured by corporate patent trolls.
Want to see innovation? Reduce patents to ten years or less. I want my flying car and jet pack before I'm too old to enjoy them.
"Bring outcher dead..."
(protesting:) "I'm actually feeling quite better..."
Nostradamus fans should start searching for other pearls of prophecy in the vast Python oeuvre. ("Surround everyone, everywhere!" is my favorite; if you don't see it happening, you're just not paying attention.)
Half a league, half a league, half a league onward...
Into the Valley of Death plunged the bug-hungry.
So Linux has a bug; I guess that proves it's inferior.
As to closed source versus open source, some of the arguments sound like "what you don't know (without a disassembler) can't hurt you". In other circles this is known as "security through obscurity" and generally avoided on principle.
In my personal experience I've seen a lot more BSODs than kernel panics, but your mileage may vary.
I'm also noticing that CentOS 5 seems a lot more solid than Fedora 8 (well, duh). So it's probably fairer to compare boring, stable, old, patched releases of Linux to boring, stable, old, patched releases of Windows.
I think it's perfectly fine to say "I like Linux/Windows because..." without feeling compelled to add "because Windows/Linux is a cartful of nightsoil, that none abideth the stench thereof."
(If I do, however, it's because the Devil made me do it...)
Open the pod bay doors, HAL...
Three cheers for certifications...
...and the people that love them.
Certifications and degrees are really only worth what the student put into them, no more and no less. They may occasionally appear to have economic value out of proportion to a certain individual's effort expended in obtaining them, but eventually the disparity becomes evident and the truth will out.
That said, I don't understand why all the fuss over someone earning a certification, teenager or not. The article did not allege that "any certified person is more qualified than all non-certified people" yet many posters went through the roof as if that had been the case.
I think there is a definite place for certifications. I consider them at the very least a bona fide declaration of interest in a subject, and use them to hint at the fact that I'm not a one-trick pony. Although my stock-in-trade is mainframe programming, I went on a certification binge a few years ago and racked up 5 CompTIAs (A+, Server+, Network+, i-Net+ and e-Biz+), a CIW Associate and 3 WebSpheres (Application Server 5.0, Portal Server 5.0 and WSAD Associate Developer) to complement my HP Mission Critical Developer mainframe title. I'm out about $1000 in books and exam fees for the CompTIAs and CIW Associate but spent no money on the WebSpheres: I got the travel expenses, bootcamps and exams for free by being in the right place at the right time. Except for the WebSpheres, I figured "Hey, I know most of this stuff" so collecting the certs was largely a pleasurable experience except for having to temporarily memorize IRQs, DMAs and detailed technical characteristics of early SCSI standards.
Although my CompTIAs and CIW Associate credentials are the most basic possible, they do show that I've had my hand in micros and web technology besides slinging C and COBOL for a living. If I crack a computer case at work and someone looks at me cross-eyed, I can cite my basic A+ and Server+ technical credentials ("It's alright, I'm a doctor..."). If Desktop Support is about to do something questionable to my PC, I'll begin a sentence with "I know I'm only an A+, but...". This usually gets a laugh because most of them are Microsoft Certified Professionals or higher.
Bottom line: credentials of various types document that I have skills in these other areas that don't appear in the recitation of work experience.
And that's not all: I'm planning on going on another tear and certifying in Linux, PHP and MySQL. No boot camps, just books, playing around with the technologies, sweating a few exams and then feeling good about passing.
And before taking any pot-shots, just ask yourself if you would feel comfortable in the office of a doctor with no diplomas on the wall.
double agent PCs
Let's assume you're an "evildoer" and have somehow noticed that your PC has been "turned". (Maybe an antivirus program tipped you off, or IP traffic analysis or whatever.)
At that point your PC effectively becomes a double-agent because you can choose your in-the-clear web-surfing habits and email transmissions to actively disinform your trackers. Advantage: evildoers.
Then, you cloak your real correspondence in open source crypto or even steganography, being careful to frequently boot clean off a live CD and encrypt (or frequently zap) the swap partition and check for hardware keystroke collectors. Use SSH or VPN for extra points.
There's nothing here that isn't well-known and documented, and it's morally neutral because it works just as well for evildoers as it does against them, computing resources being equal.
Of course, the escalating war of measures and countermeasures is great for business, which is why we'll see it go on in the foreseeable future. The search for pork goes on...