Yes, but …
… how is มาลัย (which means "Garland of Flowers" in Thai)?
When I joined QA nearly eight years ago I did so in a time of wonderfully ordered roles and responsibilities. It was a world of web developers, designers, application programmers and database administrators. Each sat in their own little area, worrying only about their little part of the puzzle, with clear definitions of …
For a while. Then 'stuff' changes and we'll need a new big bang to start all over again.
It's like adding extensions to a house. Eventually, you run out of ground, but need the extra floor. The foundations aren't deep enough, so you either move or knock it down and start again.
Ha! A more accurate analogy would be building the second, third and fourth floor anyway and papering over any subsequent cracks (or calling them 'features') until it all falls over... then you leave the rubble where it lies and start building again.
It's quite an old concept. Your "T" developers will typically not be as good in one of their "deep" skill-set areas as a dedicated specialist in that particular area (unless it is a very small area), purely because of the time it takes to fully master an area (especially when things are ever changing, so an area must frequently be revisited to keep up to date).
Specialists can be useful, e.g. I'm sure TalkTalk could have done with more PCI-DSS and pen-test specialists.
Right you are. I think my Bullshit Bingo was full halfway through.
This article is written by the kind of guy who does brilliant presentations, charges a hefty fee, then leaves us poor "old school" plebs to clean up after he's gone and nothing works.
"A T-shaped developer has one or more deep skill-sets of knowledge complemented with broad generalist knowledge across an entire solution."
The modern IT world has two pressures. To "industrialise" the work processes and to have staff with constantly maintained accreditations. What they don't want are staff with wide skills who are hard to replace.
In the 1970s it was not unusual for someone to have a generalist knowledge of many areas of computing - and a deep knowledge of at least one area.
That gradually disappeared as people found that gaining the necessary broad experience was time consuming - and their career progress and rewards were much better if they specialised. Those who had the wide grounding and interests became a rarity - tolerated as non-conformists who would save the day when the specialists were all saying "nothing wrong here".
There is a necessary balance needed. Too many "Jack of all trades" and you get shallow skills. Too many specialists and you get system-wide problems.
Specialists are relatively easy and quick to train. People with a competent wide knowledge that can be applied intelligently have to be grown - and it is a slow process that starts even before they enter the work force. Most people are content to be "appliance operators" who don't have the urge to look beyond their role's interface. A small number of others want to understand "why".
There is also a potentially disastrous group who are the status-seeking "chancers". With padded CVs they have cultivated a management pleasing "can do" over-optimism.
The "T-shaped generation" is mostly retired or getting close to it. Growing a new one will require many years of technical education of youngsters that won't come on stream for a long while yet.
> People with a competent wide knowledge that can be applied intelligently have to be grown - and it is a slow process that starts even before they enter the work force.
Yes, in my case it was model trains, which I still do when work is not challenging enough. If the trains are getting love, it's a signal to change jobs...
A bit of a buzz concept.
When specialities were separate(ish), this often led to interaction via defined APIs (as teams would work to an API, whether they were the API producer or consumer), so e.g. a UI could easily be ported to a new backend as long as appropriate APIs were written by the team working on the new backend.
With an agile / multi-skilled approach, SoC (separation of concerns) needs to be continually hammered into people's mindsets because, without team separation, SoC no longer happens by default.
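A toy sketch of the point about teams working to a defined API (all names here are invented for illustration, not from any real project): the UI layer depends only on the interface, so a new backend team can slot in a replacement without touching the UI.

```python
from abc import ABC, abstractmethod

# Hypothetical API contract that both backend teams implement.
class UserStore(ABC):
    @abstractmethod
    def get_name(self, user_id: int) -> str: ...

# The old backend team's implementation.
class LegacyStore(UserStore):
    def get_name(self, user_id: int) -> str:
        return f"legacy-user-{user_id}"

# The new backend team's implementation, written to the same API.
class NewStore(UserStore):
    def get_name(self, user_id: int) -> str:
        return f"new-user-{user_id}"

def render_greeting(store: UserStore, user_id: int) -> str:
    # The UI never knows (or cares) which backend it is talking to,
    # so porting it is a one-line change at the call site.
    return f"Hello, {store.get_name(user_id)}!"
```

Without team separation enforcing a boundary like `UserStore`, it is easy for UI code to reach straight into backend internals, which is the default-SoC erosion the comment describes.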
There is a story in one of Tom Peters's "Excellence" books about a new missile project. Three companies were working to bring together their separate developments. The critical interface was specified to be sufficiently loose that there was room to adjust the mating at the integration stage.
When Object Programming started to be mandatory it was found that projects could seriously fail if they didn't fully understand the application first. The structure of objects worked well for development - but required major redevelopment if the overall model proved to be incorrect in the real world use of the final application.
As that's the way I, and most of my fellow South African devs, have always worked.
Although it should be more of a V than a T, i.e. it really helps to know a good deal about topics closely related to one's speciality. e.g. in my case it's the intersection of real-time and high-performance computing, but I can and will do GUI and database work if that helps get the product delivered.
It does make working in a multi-disciplinary team good fun, making problem-solving a collective exercise rather than a scrap about whose side of the wall has the root cause.
>This venerable model heralded the age of web and app development, but it also contained the seeds of its own destruction, creating a world of silos, isolated and closed knowledge
A world of layered software, where there were APIs and you could interact with software from a variety of vendors via standards.
But that's all old hat. Now one vendor owns the whole stack, from search box to data store, or you've gone for BIG DATA with its massively expanding storage and compute requirements. Stuff that we used to hold in ACT! now requires a server with 192G RAM, several tiers of flash storage and an Oracle licence. On the upside, if we ever open an office in Tokyo, we can replicate it via the Cloud so it can be just as slow there as it is here.
Maybe that is actually the most efficient way to deal with it, but it makes me a little sad.
I sure don't push 'em through waterfalls. I prefer to throw the whole bottle in upstream, and let the neck float over with the rest. I'd ask a ninja to show me how to use the terminal on his favorite OS, but I'd be afraid he'd take offense and kill me painfully.
Also, I ask the assembled commentariat: has anyone else seen SCALA rendered in all capitals?
Almost all of these new devs work in the world of open source – the closed shop shrink-wrapped products of the Microsoft and Adobe heyday draws to a close – and public open source solutions, software and services rise to replace them. The developers of today cut their teeth on Linux and OS X and they use languages like SCALA, Python and Ruby, instead of .NET and Java.
Sorry, but that is purely wishful thinking.
Python enjoyed its own little dot com cycle all in one language - boom, bust. If you want to work for a mom & pop scale start up, use whatever you like (hell, use everything - I know I did when I worked for one), but as soon as you want to work for an enterprise scale business, you will be using .NET or Java.
But I did enjoy his columns, because it's good to read things you disagree with, makes you think.
Please tell me this isn't the replacement, because I spent more time teasing the meaning out from the buzzwords than working out what the article was about. (Turns out you can replace the article with the headline and get all the same information near enough).
I rarely read past the headlines these days as they have a tendency to tell you all you need to know. Make them as cryptic and weird as they used to be (can't remember exactly when I saw a change but I think it was not long after the moderatrix left in 2011) and you will get more eyeballs, probably.
Agile: Still misused in the majority of companies that 'practice' it.
Scala: A nice way to identify developers who chase the shiny. Tooling still weak, maintenance hard.
Java 8: Late, but not late enough to let Scala mature.
Lean: We want more results with less effort. Or pay. Or organisation.
Dev Ops: Vital now that we have an ecosystem that resists attempts to achieve stability.
T-shaped: You may want it, but HR will just apply the buzzword checklist at random and screen out the guys who can help.
Agile - "we can't be bothered to document anything"
Scala - "we're too groovy to write anything anyone will ever actually use"
Java - "the consultants said it was good"
Lean - "we'll be offshoring next year"
DevOps - "why pay for separate ops when you can have devs do everything?"
Cynical you - not cynical enough. Yet.
Why is it so impossible to be honest about agile? Is it the religious overtones of having a manifesto that make it a belief structure rather than a development paradigm?
"Agile allows us to create efficient metrics, openness and accountability." Really?
"Having red hair allows us to create efficient metrics, openness and accountability."
Well, it doesn't completely prevent us, but it certainly does not help us.
In a small outfit, where there is only one team and the task is basically mono-skilled then your agile teams can work as described but do not, IMHO, actually promote it. The real advantage of agile over waterfall is the frequent reality check of demoing to the end-user and the resulting feedback. This comes at the cost of not designing a solution before developing it.
Once the job requires more than one team, the SM is the only channel of communication out of the team, so your openness goes out the window. Metrics that only become meaningful once the team has done half a dozen sprints with no changes in skills, tasks and personnel are not, IMHO, exactly stable, useful or predictive. And 'accountability' that is based on an SM's ability to guilt less productive team members, with no external checks or balances, does not sound like the traditional ideal of productive teamwork.
Additionally, the active dis-incentive to up-skilling caused by metrics that punish activity that is not instantly productive and the lack of career progression caused by hiding talented people behind scrum masters does not encourage job-satisfaction or a willingness to go the extra mile. Agile has a place in software development, but it is not the be all and end all that it is often made out to be.
Agile works well when there is a single online system where the goalposts are always moving and mistakes are not costly.
Waterfall is still the best for well-specced mission-critical systems or ones that cannot easily be upgraded.
And in between, where most systems fall, the best methodology is a blend: for example use of prototypes (agile mode) to nail down requirements for a waterfall phase which is rapid and productive due to the elimination of unknowns. For each application there is an optimum point of change frequency and scope where customers are willing to wait a certain amount of time for changes in exchange for greater stability, predictable release cycle, and the ability to schedule end-user training to avoid loss of productivity due to end-users having to work things out for themselves.
From what I've seen and experienced with agile, its most common use is as a blame management tool rather than a productivity enhancement.
For example, Government projects always insist on agile, yet are usually composed of 3-4-5 outsourced suppliers. Agile CANNOT work with a team composed of outsourced suppliers, each with their own SLA requirements!!!
"The developers of today cut their teeth on Linux and OS X and they use languages like SCALA, Python and Ruby, instead of .NET and Java."
a) Only some
c) Relevance of the article depends on whether you're developing web applications, Android apps, iOS, OS X, Windows or Linux clients, or embedded code (which is very different for a 50c 8-bit CPU versus a full-fat OS in a TV, router or set-top box).
The big issue as always is poor design and implementation. You can't test in quality. Crap untyped Scripting languages make spotting implementation mistakes hard.
My impression is that "pretty" has been on the increase over the last 10 years, while quality, usability and privacy are down. We don't seem to have progressed in security, judging by the vulnerability mailing lists I follow for all the popular OSes, web client applications and web CMS server apps.
Poor specifications, true, but poor (or undocumented) customer requirements are where my World of Pain projects have all begun. Do up a sticky-taped demo (or worse, a set of slides!), show it to the customer, get them to sign, and yet never once record what their expectations of the final product are? I mean, how could it fail?
Basically, all the Agile or Lean in the world won't help if nobody can agree what the outputs are supposed to achieve... Devs muddle through at an extraordinarily slow rate, because they spend the bulk of their time trying to second-guess what the customer might have wanted (the spec, if it exists, is vague, because nobody asked the customer what they wanted). In the end, the Final Acceptance testers end up specifying the product, and it spends eternity in Bugfix Feature Creep as the customer gets angrier and angrier. Repeat until a good fast-food restaurant job comes up...
As for Python, it can do reasonably well as a language for "proper" systems, but only if you employ extensive automated code-coverage and unit-testing to ensure that the program is semantically correct (many of those stupid bugs that are found for you by the compiler in a statically-typed language won't get found until runtime in a dynamic one). You need a lot of discipline to make Python fit in a quality-driven process, but a need for discipline isn't what draws novice programmers to the language.
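A toy illustration of that point (invented function, not from any real codebase): the type mistake below is exactly the kind a compiler would reject in a statically typed language, but in Python it only surfaces when the faulty line actually executes, which is why code coverage and unit tests have to do the compiler's job.

```python
def total_price(prices):
    # Intended to take a list of numbers. Nothing stops a caller
    # passing strings read straight from a CSV, though.
    return sum(prices)

# With numbers this works fine.
ok = total_price([2, 3])

# With strings it blows up -- but only at runtime, and only if a test
# (or an unlucky user) actually exercises this path.
try:
    total_price(["9.99", "5.00"])
    failed = False
except TypeError:
    failed = True
```

A unit test asserting on both paths catches the bug before it ships; without that test, the error lies dormant until the string-bearing code path runs in production.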
[anon only because that of which I speak is drawn from quite immediate experience, and my usual handle is my real name.. ;) ]
> Crap untyped Scripting languages make spotting implementation mistakes hard.
Yes. But nobody is using dependently typed programming languages, or even simply typed functional ones (Standard ML, for example). And then you get weirdos complaining about garbage collection like it's the 1970s.
This industry is still drawing pentagrams around dead chickens.
Bring on the bullshit bingo cards for everyone.
T-shaped developers? Seriously? Well then, let's tackle some more letters... We need X-shaped sysadmins who keep it all together. Then we have O-shaped management (living in a bubble), until they go somehow Q-shaped (somebody burst their bubble), at which point some chairs in the office go h-shaped (bulging until the "rogue" employee falls off).
Surely this article went pear-shaped. (Not a letter, I know, but for good measure, since buzzwords and crazy analogies need to think outside of the box -or cubicle- of course.)
Seems the author had a few too many waterfalls through bottlenecks at the local pub...
Years ago we were just APs (analyst-programmers for the youngsters who've never heard the term). We didn't have a methodology, we just talked to the users at different levels in the business to find out what they wanted & did it.
As the team varied in size from 1 to 4 we didn't have separate sysadmins or DBAs, we just did the lot. If you have to administer what you write what you write tends to avoid admin problems.
I can't remember what shape I was then but whatever it wasn't the shape I am now - things have filled out & sagged a bit.
If this is going to be a regular Monday slot I must remember to avoid it.
It's funny how, by the time these methodologies have acquired buzzword labels, they've degenerated into useless dogmatic bullshit. That was 5-15 years ago in all these cases. And everyone with talent knows to run like hell.
It was a good laugh, but more than one like this a week and I'll assume El Reg has sold out.
My favourite part about working like a Ninja is the uncertainty of whether I really existed in Tokugawa Era Japan at all.
I mean, I might've been born only 30 or so years ago... or that could just be what I _want_ you to think.
Which is really useful come pay review time.
"You can't prove I wasn't working on this project about a century and a bit ago, so give me a pay rise."
> Ninja is the uncertainty of whether I really existed in Tokugawa Era Japan at all
Yeah right. As if deniable black-ops "dirty tricks" operators would not have been a money-making full-time skill in a period of internecine warlord management. Especially one full of idiots who have their heads turned sideways due to some perverse cultural norms about honour.
> except that Dave Walker is not a web hack.
The following is said without making assumptions about the author.
This article makes me ... angry. It's the kind of language I hear from people whom I, erm, do not appreciate a lot. People who do a lot of talking but no actual _work_ of any kind. People who argue that enough monkeys will cut it. This may well be unfair to the author, as I can only give my impression.
Un-angried now thanks to the comments.
Obviously. No-one experienced at tech journalism or blogging would attempt to foist such a deluge of bovine byproduct on the commentariat here and expect anything other than universal derision. This isn't a "safe space" for PowerPoint "ninjas" used to communicating with PHBs, CIOs and other non-technical execs.
I suggest ElReg form a Kaizen team to organize an agile scrum and skills sprints so Mr Walker can focus on his core competencies in bringing about editorial transformation achieving readership buy in towards seizing the low hanging fruit of holistic and full stack quality perspectives.
First, the golden age of the developer only existed in very large and usually dedicated software companies. Most of us that have worked in medium to small companies have been developer, SysAdmin, Database Admin and whatever else had to be done to get a project up and running. At one company I was the EDI department. Not just a member of the department, the entire department. That made vacations difficult.
Second, Agile is just a bunch of jargon applied to the way most Linux and Unix programmers work anyway to try to get Windows devs to be more productive. Unfortunately, it's now moved into the workplace, so we have Agile environments where you don't get an assigned desk or cube, you have to reserve your work space for when you plan to be in the office. It's now wonderful sitting in a strange area surrounded by marketing types yelling into their cells as I try to debug code someone wrote years ago. You can always make a bad idea worse if you apply it to something that it has nothing to do with.
There has always been transdisciplinary development and siloed development.
Some people will follow a trajectory that goes from transdisc to silo, so what. The other way too, so what.
The real question: where do you want to be? I suggest that large organisations run by managers who're there "because they can't" may be prone to silos. They may also be prone to extinction in the face of the others.
If you're in the wrong place get off your behind and get into the right place, for you.
Agile handcuffs are coming to your company soon, because management loves the idea. Developers have made too much money and had too much control, but now that's over with our new management tool called Agile! Agile = Sweatshop.
I've seen nothing but train wrecks with companies using Agile. Agile is fine for small projects like a website, but it's a disaster for large complex products. The word quality is certainly missing.
I'm an agile Certified Scrum Master and Certified Product Owner.
Biting the hand that feeds IT © 1998–2019