Two uncalled for "boffin" usages in a day?
I'm starting to believe that boffin should be reserved for egg-heads that do something physical. If it doesn't move, whirr, bubble or explode, it's not proper boffinry.
You know it's bad when two top programmers at different conferences in a two-week period say we're in the dark ages of software development. Speaking at MIT's Emerging Technologies conference in Cambridge last week, Charles Simonyi - creator of Microsoft Office, friend of Bill Gates (and Martha Stewart - thanks ZDNet's Dan …
Fogcat, you're just wrong, and so is the Register.
A boffin is a scientist of some description, normally a Doctor (PhD) in their chosen field and a leading light in the academic circles where they operate.
A tinkerer would be an engineer: people like Torvalds.
What they're looking for here is the equivalent business-person-level slang, which I believe is PHB.
"PHBs: Dark Times for application development"
Given that the current crop of "business" "analysts" cannot type a grammatical, correctly spelled sentence, Simonyi's hope is the thinnest tissue of lies possible, and he knows it. His startup must be an excuse for rich folks like Gates, Ballmer and Buffett to lose money as a tax break.
What the guys said at the Judge Institute is true up to a point, but if you seriously want to up the ante when it comes to coding, then keep an eye out for Clarion 7 and Clarion.NET from www.SoftVelocity.com. As my company is part of the Alpha team testing the new releases from SV, I can say there are going to be some serious productivity gains with the latest version, which has been some seven years in the making. Not forgetting that this technology is nearly 20 years old now, having evolved from DOS, and MS even made an unsuccessful attempt to purchase the company to get hold of it. Every other software IDE and language is seriously lagging compared to Clarion, and it's probably the best-kept industry secret!... well, was!
The heart of business needs is where business development rules. The CFO doesn't care if the language is Java or Visual Basic or Cobol. CFOs just want their pain to stop. And they like platform and language debates almost as much as they like losing money.
Tweakers can play with software, but business will always drive markets. Business forges on. Business will get what it wants -- eventually.
So he's not so far off and if software developers had any foresight, they'd be studying business and finance and accounting 101 along with their "patterns and practices." They'd be solving business' problems rather than aggravating them with debate and platform wars.
And if architects and developers won't condescend to the trivialities of business, business will overtake the trivialities of coding. The finance department and marketing will be "writing" code and the programmers will be back to flippin' burgers.
Be careful whom you snub.
Tools may be in the Dark Ages, but programming practices are in the Stone Age. Everywhere I go I see programmers "doing their thing" in their own way. Is there an industry-wide accepted coding practice? Nope. Is there a coding methodology taught as soon as someone first touches an IDE? Nope.
Today, in an industry that works with arguably the most technologically advanced products that Humanity has ever made, the methods used for creating code range from "quick and dirty hack" to "convoluted, committee-managed, process-based humongous monolith".
Everyone has his own manner of programming, his own manner of commenting and his own manner of documenting his code (including NOT documenting it). And each and every one of us has been moulded by whatever mentor we had in our youth - be it a human being, or an Internet chat room. There is no consistency between programmers. The only similarities you find are when both have enough experience in their trade to know how to avoid the most common pitfalls in programming and user management. But that is only because they have both hit their heads against the same wall - not because someone taught them about it in a consistent manner.
Oh, and I do know about this or that methodology available on some web page somewhere. The issue I have is that it is not taught along with the basics at school, and knowledge of it is not mandatory when one undertakes the responsibility of coding a business project. Once upon a time, being a software developer was a job for engineers. They learned this new subject along the way, shared their experience with their peers and ended up writing the first programming manuals.
That time is long gone. Today, a 14-year-old can buy himself a web site and start banging out PHP code however he likes, gradually rooting himself in disastrous coding habits that he will bring into his professional life and will take years to unlearn, if he ever manages to do so.
I don't see this changing any time soon either. So bring on the tools of the New Era. Bring on the Renaissance! After all, it can hardly be worse than the standards & practices we have now.
I don't feel worried about being replaced as a software developer by a business analyst and some magic tool to help them to write programs.
Software is composed of very simple components, which are easy to understand in isolation - anyone with modest intelligence and a bit of training can do it. Lots of people write Excel macros to do clever things. The hard bit is trying to put millions of instructions into some coherent, maintainable shape. That is what the experienced software developer provides to a business, and that is what no tool that I can imagine could provide to someone without the knowledge and experience of complex software development.
I doubt whether laser eye surgeons will be worried about DIY kits sold in Boots any time soon either, for that matter. I believe that the future of technology is in allowing specialists to become more productive through progressively better tools, not in creating snake-oil tools that supposedly allow people to be productive without needing to have any understanding of what they are doing.
I'm sure that some specialised areas will be served by configurable standard tools, but I'm sure that there will be plenty of new problems to be solved for a long time to come.
There cannot be greater truth in what Pascal wrote.
Unfortunately, the environment that created the problem out of ignorance cannot be expected to discover the solution by random chance. The schools that crank out programmers and created the current problem do not know how to correct the error.
Perhaps the greatest failure of software development is the failure of the software developer to learn something important before learning software development. (I know that sounds circular.) Therefore, the aspiring software developer must leave software development and learn something else first.
Before learning to code, the aspiring software coder should learn the "patterns and practices" of physics, engineering, chemistry, medicine, business, finance, economics, banking, international distribution, manufacturing... something! These are hard subjects. They are among the hardest. But when the coding apprentice begins the long application-development journey -- already armed with a profession -- there exists knowledge on which to build.
These disciplines are more than areas of study. They are cognitive patterns. Learn chemistry and one must learn to think like a chemist. Learn engineering and one must learn to think like an engineer. Learn business and one must learn to think like a captain of industry. Project designers, architects, and coders will not succeed -- fully succeed -- in projects unless they understand the environment in which the problem exists; what the buzz-centric call "the problem domain."
Engineers who program do so like engineers. They work within engineering solution patterns that have existed for millennia. Physicists who program do so like physicists. They solve problems with well-understood patterns and definable criteria. Financial managers who program do so like financial managers. And so on.
In other words, real professionals solve problems in real and known patterns that have been used to solve problems for very long times. They do not suffer from the "crash-and-burn" school of programming. While their understanding of the internals of some framework may be lacking (compared to the 13-year-old hacker), their solutions will stand and their problems will be solved.
If the programming major learned a defensible and workable "pattern and practice" for coding, their major would change to engineering or math or physics or economics. After completion of the "core" curriculum, the professional may elect to return to the computer "sciences." (... no more science than is stamp collecting...)
"Computer Science" colleges crank out human assembly lines from which garbage or gemstones may emerge. Give the human assembly line a precise set of rules and the Computer Science major will properly produce. Give the human assembly line little or no direction and don't be surprised if the Computer Science major produces little or no discernable pattern and nothing usable.
Q: Whence come the precise instructions for the human assembly line?
A: Project designers and architects who were already professionals in their fields before they began leading the development project.
Software coders and experts are constantly amazed at the "next thing" in software. It always seems to take them by surprise. Functions. Objects. Relations. Interfaces. Contracts. Frameworks. (and on and on) But these things were in use, known, and well understood in other professions long before they migrated over to the lesser "science." They were not surprises to the mathematician or the physicist.
Coding and development will eventually become secondary skills. In the present day, the arrogant teenagers among us (no matter what age) have business cowed into believing that it cannot do without them. This, too, will pass. And those professionals in their own "problem domain" will begin to solve their own problems.
This was the genesis of the present computer age: when the small researcher and small businessman began coding on the IBM PC and its clones (or the TRS-80) and solving their own problems -- however clumsily. The industry was ignited. The pyre burns low, but that is not because our demi-deities of coding superiority have rescued business and industry. It is because they haven't.
Business and industry, using their own knowledge of their own problems, will assimilate the knowledge of the application developer and begin to solve their own problems -- again.
And the patterns and practices that are lacking in today's programmer will exist as the patterns and practices that were learned by the solution architect in business, engineering, medical, accounting, banking, or chemistry class.
Whenever a person from the business world comments on software development, developers (those who'll listen) tend to create another layer of abstraction designed exclusively to keep those people distracted while they get on with the important job of coding; nothing usually changes down on the shop floor.
I think it demonstrates a lack of understanding of how the software development process works. Making software is essentially a creative process, and trying to over-formalise such a thing so that it fits within a smart little business methodology box results in the creative element being totally removed. The by-product of this is that it becomes exceptionally dull to develop even the most complicated of things. It could turn programming into a drag-and-drop affair, linking boxes, with little input from the programmer.
There is a need for better methods to develop software, but their direction needs to be controlled and decided by the code monkeys themselves; otherwise you risk frightening some of the brightest and most creative among them away from the industry with an ever-growing pile of design diagrams and documentation.
"Simonyi was quick to point out his friend is leaving behind his day-to-day duties at the company to focus on the Gates Foundation, solving bigger problems than software development"
How very bizarre that he doesn't realise that they are both inextricably linked.
A most definite case of "Not seeing the wood for the trees".
""Software programming for multicore is hot, and some of you have the opportunity to be the next Microsoft if you can solve these multicore issues," Agarwal said."..... and aint that the gospel truth.
Who would bet against it being developed Virtually Online, Registering ITs Prior Art Patent credentials in Simple Text Shared?
And I wonder when El Reg will realise what is going on and what is being floated, intentionally, out of the Bletchley Archives, Turing Module?
Simonyi's race has already been run and won. IT is now just a case of collecting the Danegeld reward which reveals the code which will automatically, by selfish Third Party Proxy Default, completely destroy Systems in Markets..... and just so as not to be ambiguous, that does mean any System in any Market.
AIMagic Spell from Blithe Blighty Boffins? Or a Rampant, Rutting Stag of AIdDriver Program? And is there any difference?
I am primarily an engineer, so software to me is as intuitive as any other method of implementing a logical machine. It's easy. Except that if you take me out of context into, say, an applications environment, my productivity flounders. What's the problem?
The answer is simple. Programming is just an expression of problem solving. The real money is in knowing how to solve the problem, not how to crank out code. Non-programmers and even some less enlightened programmers don't understand this; they like their application templates because they can make things work with them, but all they're doing is driving the car that someone else designed for them. If they need to step outside the boundaries of their tool they're, not to put too fine a point on it, screwed.
Business people like to reduce creative processes to rote because it allows interchangeability ("flexibility") of staff. Large companies spend a lot of time and effort on processes; they're supposed to increase quality, but it all boils down to trying to de-skill skilled workers. (Skilled workers have power, and management doesn't like that because it makes them uncomfortable.)
I avoid the bloatware. I have my own problems. It's amazing how easy it is to run off the edge of your knowledge or capabilities once you start messing around inside a computer -- sometimes really simple tasks are impossible, intuitive algorithms don't scale, and really Mickey Mouse stuff is just off-the-scale difficult. It's what makes life worth living.
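The "intuitive algorithms don't scale" point is easy to see with a toy example (mine, not the commenter's): the obvious way to check a list for duplicates compares every pair (quadratic), while keeping a set of seen items is linear. Both are correct; only one survives contact with a big input.

```python
import time

def has_duplicates_naive(items):
    # The "intuitive" approach: compare every pair. O(n^2) comparisons.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_scalable(items):
    # Track what we've already seen in a set. O(n) on average.
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

# A duplicate-free list is the worst case for both versions.
data = list(range(5000))

start = time.perf_counter()
has_duplicates_naive(data)
naive_time = time.perf_counter() - start

start = time.perf_counter()
has_duplicates_scalable(data)
scalable_time = time.perf_counter() - start

print(f"naive: {naive_time:.4f}s, scalable: {scalable_time:.4f}s")
```

At 5,000 elements the naive version already does about 12.5 million comparisons; grow the input tenfold and it does a hundred times the work, while the set-based version grows only tenfold.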
Biting the hand that feeds IT © 1998–2019