Re: Linux?
Oh, yeah (long night, I'm tired), they STILL DO, in case it wasn't obvious, the OCO aberrations notwithstanding.
And before UNIX, RSX-11, and VMS, there were mainframes with OSes in assembler, customer-developed patches and enhancements, and tools tapes put together by customers and vendors. Indeed, one of the earliest user groups (if not the very first) is named SHARE--it's not an acronym, it's what the participants DO.
It has been for decades--it just turned 40, as a matter of fact, so it's pretty well shaken out. It's homogenized as it only runs on one architecture, but that architecture does everything (although localized KVM processing is usually a superior approach from all angles). So I'd say they have their work cut out for them unless the potential customers maintain their long-held animosity toward the proven product lines. In other words, it will all come down to marketing again.
...the longest chain of dominos the species has set up that executed flawlessly? Even the Odyssey link they dreamed up near the end that required a new twist worked out. Heck, the weather cooperated, too. They definitely stuck that landing. Oh, and that descent shot with the parachute, too.
How do we take the global financial systems away from the blokes responsible and give the contract to JPL?
M$ themselves have shown us all how. It is quite simple: refuse to ratify any such NDA clause in potential agreements. They have demonstrated they most emphatically do not want those 235 patents identified to the world. I believe this requirement will even trump filing a lawsuit against your less-than-Fortune-1000 firm. Just tell everyone the Boss is rather scantily clad from your point of view. Because he is.
The GPL is all about not restricting the source code. If you modify it and distribute the modified binaries, you must also distribute the modified source under the same GPL you got it under. However, "appliance" computers need not distribute the source code used to compile the binaries that run inside them. This is believed to be within the letter of the GPL, and apparently the FSF does not dispute that PoV. But most certainly the spirit of the GPL is, ah, compromised. Even IBM is guilty with the HMC and SE managerial appliances that control zSeries CECs. They are so unauditable the customer doesn't even know what packages are onboard. The boxes are fully networked, and only the IBM Support Center can perform software maintenance upon them. The customers must take IBM's word that the appliances are completely secure and that nobody on their support team can subvert their computational integrity--there is no way to audit them. Yet those appliances have the connectivity to peruse, and possibly interfere with, exactly what is going on in the customers' LPARs, without the customers' technical staff having any way to determine whether that processing has been compromised.
So leave libre out of this opinion piece, please.
"But when people gain a proper perspective of the approach to technology he has championed he will be nominated for a nobel prize."
Unless the Bill Gates of our planet buy up enough influence to quietly outlaw free software while everybody is busy kicking the champion, especially while he's down. Then MS will finally rule the Net and everybody will live happily ever after.
But it cannot be if you're in a rush to mashup other people's code that hasn't even been prototyped yet (but don't worry, it WILL be secure--trust us on that).
As a sidebar, I offer http://www.networkworld.com/news/2012/052512-cloud-security-gartner-259627.html as more proof that Gartner isn't always wrong.
"On the one hand, we have an operating system with over 20-ish years of history and development behind it, that consists, essentially, of a kernel derived initially from MINIX, onto which a ton of services, tools and applications have been piled on. These were cherry-picked from the likes of BSD and its peers over a period of 10-20 years, without much effort put into making them play nice with each other."
Here's your missing detail. GNU/Linux scales. All the way down (wristwatch and probably the coming nanocomputers) and all the way up (bare IBM mainframe big iron? Sure, if you really want to, but it would make so much more sense to run tens of thousands of zLinux virtual instances under the original and still industrial-strength virtualization OS, now badged z/VM, because z/VM is superior to KVM). Microsoft operating systems don't scale.
I do feel a need to point out Linus used the POSIX standard to develop his kernel and the differences between kernel 1.0 and 3.0 are what you would expect after two decades of serious development. 'nix rules the networks by virtue of merit, plain and simple.
I always thought it was a little odd that IBM abandoned its 8 parallel copper data channels (the Bus and Tag cables, each 2+ cm in diameter) for a single fiber channel, even though the fiber was superior in every way. So they are finally coming back to parallel throughput at fiber speeds. I wonder if skew variance is still a problem.
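As a back-of-the-envelope illustration of why lane-to-lane skew matters on a parallel link, here is a minimal sketch; the 8 Gb/s lane rate and 1 ns skew figures are round-number assumptions for the arithmetic, not IBM specifications.

```python
# Illustrative only: express lane-to-lane skew in unit intervals
# (bit times) to see how much of the bit budget it consumes.
# Lane rate and skew values below are assumed for round numbers.

def skew_in_bit_times(lane_rate_gbps: float, skew_ps: float) -> float:
    """Return lane-to-lane skew expressed in bit times."""
    bit_time_ps = 1000.0 / lane_rate_gbps  # one bit time, in picoseconds
    return skew_ps / bit_time_ps

# At 8 Gb/s per lane, one bit time is 125 ps, so 1 ns (1000 ps) of
# skew spans 8 whole bit times -- the receiver must deskew lanes or
# the parallel word is scrambled.
print(skew_in_bit_times(8.0, 1000.0))  # -> 8.0
```

The point of the arithmetic: the faster each lane runs, the smaller the bit time, so the same physical skew costs proportionally more bit times.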
Just ask Captain Jack Sparrow. The character has a lot of confidence in his ability to observe and correctly perceive what he observes. Most people don't say, "That's interesting..." when they observe something completely different. They dismiss it as impossible and immediately drop it from their memories, because following up on why they probably hallucinated the observation might force them to deal with conclusions they would rather avoid.
if you follow ZH, somewhat less if you believe Daniel, Saul of Tarsus, and John had credible inside information, and statistically impossible if you believe it is credible and the return of The Messiah occurs while you're still converting O2 into CO2.
Or did you mean all together now?
is just plain simplification, all of which is a rather lossy algorithm. That's why the earlier comment about scientific writing vis-à-vis El Reg's Pulitzer candidates (</:-^ in case it wasn't obvious>) is germane to this discussion. In scientific writing, simplifications must be clearly labeled as such, with subsequent restatement without the simplification or, if impractical, clear pointers to an appropriate restatement. Simplification is a teaching aid, nothing more, IMHO.
needs to really work on their critical thinking skills. Some need to acquire the skills first, of course. The final exam reveals whether you have learned the skills and the underlying raison d'être by evaluating how well you have personally integrated what you learned. It is a test that never ends. Here's a good syllabus (graduate level--you should deem this required, IMHO): http://www.cct.umb.edu/601-09F.pdf
All the data is (still) not in. Science never knows absolutes, and people who claim it does are at best ignorant. All science can tell us is what it has concluded from all the supposedly objective data accumulated thus far and correctly evaluated, producing theories, hypotheses, experiments, and results leading to redefinitions of what fresh data is now more important and deserving of resource allocation for its capture; rinse, repeat. Remember "junk DNA" if you need a contemporary example.
This fundamental of science used to go without saying, but it needs to be shouted from the rooftops these days (thank you, public education).
"As Woz used to say 'Never trust a computer you can't throw out of a window' when talking about IBM."
To which mainframers tend to respond, "Never trust a computer you can pick up." And they're generally thinking in terms of Reliability, Availability, and Security. After all, physical access is often the key to a compromise. At least Apple products aren't a joke when it comes to security.
It will be interesting to see if the government chooses to tackle Apple.
"Perhaps Mr. Stallman doesn't fully appreciate that the second you attempt to argue with idiots you become one."
This is possibly true if your definition of an idiot is someone who cannot open his mind a crack to reconsider one of his conclusions about the nature of reality, even when someone with excellent credentials suggests reconsideration might be in his better interest. The problem is that the non-idiot must determine, first, whether he is dealing with such an idiot and, second, accept that any non-idiot in need of the same reassessment is extremely unlikely, now or in the future, to encounter this particular instance of the argument.
RMS is not an idiot and cares more about reaching the non-idiots in need of reassessment than conforming to social norms to the max so people will "like" him more. Indeed, that nonconformance can and actually has helped the message get more ink for decades. The fact the article we are discussing was published shows his eccentricity still gets the message out. He clearly reasoned long ago he must sacrifice likability as needed to get the message out. I presume he has inferred anyone who needs to like him before they can open their minds a crack is unlikely to make the reassessment anyway.
Meanwhile he endures the continuing marginalization efforts of so many who do not want him to succeed for whatever reasons. The ultimate marginalization tactic is the favorite in RMS' case (because he makes it so easy): persuade those he would attempt to reach that he is a lunatic and "obviously" nothing he says is worthy of consideration.
I don't understand technical folks who grasp what libre source is all about yet apparently distance themselves further from the standard-bearer whenever he uses another opportunity to attempt to get the world to listen and understand. Are such technical folks unable to resist the MSM push to deify Jobs, and so agree with the MSM that RMS has gone (further?) off the deep end? I think it more likely that, in reality, they don't get what libre source is all about after all, or why it is so globally important.
On the other hand, RMS seems to take no issue with vendors of "appliances" (including IBM's mainframe master control components, the HMC and SE) who maintain they need not distribute the libre source running on their appliances. If that legal loophole is not properly addressed soon, then every gizmo that can run libre-source-licensed code will become an "appliance," all the way up to IBM zSeries Parallel Sysplex clusters running z[GNU]Linux. Then MS will win its campaign to force OEMs to force their customers to run only MS OSes. Lastly, libre source will become illegal under DMCA enhancements subsequently propagated to every sovereign nation.
But maybe enough people can make it plain to the rest of the planet why this would be a Really Bad Thing.
Nah. Up with Jobs. Down with RMS. Meh.
I submit no organization's working component is more likely to terminate that organization's very existence than its aggregate IT component. Any CxO who doesn't understand that overarching risk should not hold the position because, sooner or later, the organization will be bitten hard as a consequence of underfunding the crucial areas of IT, with security at the top of the list, followed by the loyalty and competency of the key IT employees. Competent IT employees will never buy Brand X merely because nobody ever got fired for doing so--to decide on such a basis demonstrates incompetency. Competent IT employees do their homework and carefully weigh the major big-picture risks of all possible choices.
I expect a lot of incompetent CxOs are going to be exposed as these cloud security lapses multiply.
Dang. Seriously late to the party again. 65 comments and here I am.
Great article, especially if you didn't know any of this before reading it.
This is mostly about history. Today you can order a z114 for less than a hundred grand (USD), though a working environment costs at least an order of magnitude more. What will you get for that vis-à-vis other platforms? In short, your money's worth.
z/VM can virtualize itself, as could VM/370.
Those z cores can run 24/7 at 100% and PR/SM and CP enable that (yes, I'm ignoring spin cycles).
My point? If you are in a position to check this out for the benefit of your employer and you choose not to because some college professor told you the mainframe is dead, you are not doing your job very well at all. That's it, plain and simple.
We still have a long way to go before the global network envisioned by John Brunner in "The Shockwave Rider" is functioning and able to work no matter how any government might be motivated and empowered to control it. The key is complete mobility, as every member of Ender's teams understood--when dealing with serious opposition, never stand still. Services must be provided by a mesh of thousands of systems, each always on the move, physically and via connections, always popping in and out of addressability and always changing addresses; not dependent on centrally controlled infrastructure that is not essential for all network transactions. In other words, the only way to censor the service is to interrupt all networking.
Here I am, typing a comment on the Web site of a for-profit news provider, considering an article it published involving journalistic business models. This seems somewhat surreal to me.
I must contend journalism is not your basic business. Maybe when troubadours went from town to town singing the news and learning the news, entertainer first and vendor of objective, unbiased reporting (as best anyone could determine) second, it wasn't so special. But as communications technologies evolved and news that was local elsewhere became local news everywhere, the stakes began to rise with the technology. I believe the global majority opinion remains that honest news reporting is more important than entertainment. This planet is now finding it more difficult to discern the vendors of honest reporting. The quality of the basic product appears to be watering down everywhere.
News providers seem to think their business models are the problem. I would suggest the quality of their journalistic ethics lies at the heart of the loss of eyeballs. Walter Cronkite was the single most trusted public figure of his time according to many well-respected polls. Where are his peers today? Lord knows we need them--reporters, editors, anchors who are driven to present real news even if Powers That Be object for reasons not serving the public welfare, even if there is potential to hurt ratings. The quality of news reporting was marketable once upon a time. Can it not become so again? None of us have the time to vet everything we must consider. We must have sources we can trust to be ethical, especially when real-time events become very personal.
Fit that into your business model.
I think the main point to take from this is the need to answer an important question: is it smarter for the cloud customer to outsource all IT infrastructural expertise? Should they have no one involved whose annual review hinges on ensuring that some or all aspects of the customer's IT provisioning are not only providing the necessary services, but also adequately minimizing all risks inherent in such infrastructure? Is business continuity guaranteed? Best of all, are trends or even innovations being evaluated for the potential to enable competitive advantage? How much do you trust your cloud provider(s) to put the interests of your business ahead of the interests of their business(es) when incongruence exists? Indeed, how can you be sure you will even recognize such incongruence? How willing are you to bet your business on one or more cloud providers?
A common approach to preventing the general acceptance of truth that would be an impediment to an entity's objectives is to marginalize that truth. The easiest way to do that is to persuade everyone that the truth-speaker is insane. If that statement is accepted then most people will not invest any time considering the merit of what the truth-speaker is saying. The easiest way of doing that is to simply render the truth-speaker insane (if you have the means). This also pays a dividend by demotivating other potential truth-speakers. I am not saying this is what has happened in this case, only that it cannot be ruled out to the best of my understanding to date. IMHO, of course--YMMV.
DOS, OS/MVT, and CP-67 with CMS all ran multitasking environments--that's almost the same concept as multiprogramming: multiple programs loaded in main storage that are dispatchable serially using a single processor. OS used a task-management architecture centered on Task Control Blocks (TCBs), and programs could spawn subtasks that would also compete for the processor. UNIX calls these "processes." This was all S/360--no virtual storage. The first multiprocessing (aka SMP) was developed on modified Model 67s (said to be tightly coupled) and became generally available and supported on the non-virtual-storage operating systems.
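For readers more at home on UNIX, the subtask model maps loosely onto process creation: the parent creates an independently dispatched unit of work, then waits on its completion code. A minimal Python sketch of that analogy follows; the names and structure are illustrative, not OS/360 APIs.

```python
# Loose modern analogue of spawning a subtask: the parent creates a
# child process that is dispatched independently and competes for the
# CPU, then waits for its completion code. Illustrative only.
from multiprocessing import Process

def subtask(n: int) -> None:
    # The "subtask" performs its own unit of work.
    print(f"subtask computed {sum(range(n))}")

if __name__ == "__main__":
    child = Process(target=subtask, args=(10,))
    child.start()  # roughly: create a new dispatchable unit (ATTACH)
    child.join()   # roughly: wait for the subtask to complete (WAIT)
    print("parent sees completion code", child.exitcode)
```

The analogy is imperfect by design: OS/360 subtasks shared the parent's address space (there was no virtual storage yet), whereas UNIX-style processes get their own.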
Underneath all the OS access methods you found confusing was the EXCP access method (EXecute Channel Program), which gave the programmer the ability to code the channel programs processed by the I/O hardware. Serious database products being developed by ISVs all used it to achieve maximum efficiency. Take the BBN IMP 1822 ARPANET connection hardware most customers of the TCP/IP stack I mentioned in my previous comment provisioned. The TCP/IP stack's driver program stacked five separate read channel programs via multiple EXCPs, so that when the active channel program ended and generated an interrupt, the I/O Supervisor's interrupt handler immediately issued the SIOF instruction for the next queued channel program before notifying the program that issued the EXCPs of the completion. None of the TCP/IP stack's code required supervisor state (kernel mode).
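The queuing discipline described above--keep several reads stacked so a completion interrupt immediately starts the next one before notifying anyone--can be sketched in miniature. This is a toy model of the queue behavior only; EXCP and SIOF are hardware-level operations with no direct Python analogue, and the class and method names here are invented for illustration.

```python
# Toy model of stacked read channel programs: completion of one
# request immediately starts the next queued request *before* the
# issuer is notified, so the device never sits idle between reads.
from collections import deque

class ChannelQueue:
    def __init__(self, depth: int):
        # Pre-queue `depth` read requests, as the driver stacked
        # five EXCPs up front.
        self.pending = deque(range(depth))
        self.started = []

    def interrupt(self) -> str:
        """Completion interrupt: start the next queued request
        first (the 'SIOF'), then notify the issuer."""
        if self.pending:
            self.started.append(self.pending.popleft())
        return "notify issuer"

q = ChannelQueue(depth=5)
for _ in range(3):
    q.interrupt()
print(q.started)  # -> [0, 1, 2]
```

The design point being modeled: because the restart happens inside the interrupt handler, the latency of notifying (and redispatching) the issuing program never leaves the device without work.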
What else? Oh, the original S/360 PSW had an ASCII/EBCDIC mode bit that was eventually repurposed because customers did not use it. The "nasty" CKD hardware enabled the offloading of a lot of CPU cycles into the various hardware devices so architected, reducing elapsed time and enabling more non-I/O-related processing by the CPUs.
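To make the ASCII/EBCDIC distinction concrete, here is a small sketch using Python's built-in codecs. Code page 037 is assumed as the EBCDIC variant (the common US/Canada one); S/360 installations actually used several.

```python
# The same characters encoded in ASCII vs EBCDIC (code page 037,
# assumed here as a representative EBCDIC variant).
text = "A1"
ascii_bytes = text.encode("ascii")   # 'A' -> 0x41, '1' -> 0x31
ebcdic_bytes = text.encode("cp037")  # 'A' -> 0xC1, '1' -> 0xF1
print(ascii_bytes.hex(), ebcdic_bytes.hex())  # -> 4131 c1f1
```

Note how even the digit-vs-letter byte ranges differ between the two encodings, which is why a hardware mode bit for character-set interpretation seemed worth architecting at the time.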
You are missing the main point in your final paragraph. Yes, not all the technologies you mentioned were pioneered by IBM Corporation or even on mainframes by IBM customers and third-party vendors. But today's Z boxes still do all that and more (although it is still more cost-effective to offload most of the rendering processing to the smarter terminals we enjoy today).
Compare the skill sets of typical high school graduates in the East vis-à-vis the West, in addition to their motivation. Also note that since 10% of India's high school students are in honors programs, India has more honors high school students than the USA has high school students of ANY academic standing or associated motivation.