Ribboned for your pleasure
Sounds just like Microsoft Office 2007 and the Ribbon. Cue lots of angry users.
Apple has massively upset its Mac productivity app users by dumbing down Pages, Numbers and Keynote to match fondleslab versions. When the OS X Mavericks upgrade came online, an upgrade of the iWork suite to v4.3 of the Pages (word processing), Numbers (spreadsheet) and Keynote (presentation) applications followed. After the …
"Sounds just like Microsoft Office 2007 and the Ribbon. Cue lots of angry users."
The ribbon only changed the interface. It did NOT remove functionality like the Apple iWork downgrade did.
Angry users for both office suites, yes, but most PC users never bothered to actually look at the "ribbon" or the "button"; they just complained because it was something new to learn. Mac users just had their software degraded to laughable proportions.
Admittedly MS never explained any of that, but I can do anything in Orefice 2007 that I could in Orefice 2003 without significant difficulty.
Not only did they not get rid of functionality, but the old menu shortcuts are still there. I still use Alt+E, I, S to Paste Special, even though the option is just one click away on the ribbon. Old habits die hard.
Worst downgrade ever: when MS removed VBA from Mac Excel. argh.
Except use the spell checker, it seems.
Apple are probably taking the same approach as they took with Final Cut Pro, which resulted in many angry users on release (but has now matured into a very good next-gen pro video editor with better workflows). Much more so than Microsoft, they are prepared to anger users to effect a big shift in architecture to where they want to be. They did it when they shifted to OS X, when they moved from PowerPC to Intel, when they reduced features moving from the abysmal MobileMe to iCloud, and when they cut some features from iTunes in the recent version 11 release.
The new iWork is a complete rebuild incorporating a new file format. My guess is these are the reasons for the missing functionality:
1. The new file format will have been re-architected to use Core Data and optimised to make each atomic element within the file as independent and self-contained as possible. Mobile devices and flash drives work better with a different file architecture than apps built for PCs with a large amount of DRAM. Additionally, I expect the internal file architecture has been migrated to be database (Core Data) based, and will support the newly improved Core Data iCloud synchronisation (which, prior to iOS 7, was also a disaster: document synchronisation was working fine, but database synchronisation was in effect broken. Core Data has had a lot of work done on it for this year's iOS 7 release and now works as originally intended).
Core Data synchronisation will rely on the document being broken into easily addressable atomic elements which can be independently retrieved and edited, to support collaborative working via iCloud and the web interface. Changing the internal storage format to the extent Apple have will probably have meant a pretty much ground-up re-write was required. I expect the shortcomings of Core Data synchronisation over the past couple of years have been a major drag on the release, and are why this version of iWork has been so long in the making.
2. AppleScript was probably removed because running a client-side scripting engine alongside collaborative editing synchronisation means background processes can mutate the data structures the scripting engine (which, without a substantial re-write, must always run re-entrant with the main thread) is iterating over. Collaborative synchronisation then results either in many, many edge cases that have to be overcome to keep the model simple, or in additional programming APIs for dealing with background threads mutating data (which would require a style of programming not well suited to users who are only used to writing in scripting languages). I expect AppleScript will only be re-introduced if they can sort out all the edge cases, or (more likely in my book) by having the AppleScript engine script directly through a web API (which will probably be mostly hidden from the end user/script author).
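The hazard of a background process mutating a structure a script is walking is easy to demonstrate in any language; here is a tiny generic illustration (nothing to do with Apple's actual engine):

```python
# A "script" iterating over document elements while a "background sync"
# mutates the same collection: CPython detects the concurrent mutation
# and aborts the iteration with a RuntimeError.
elements = {"p1": "Hello", "p2": "World"}

try:
    for key in elements:              # the script walking the document
        if key == "p1":
            elements["p3"] = "Synced" # the sync engine inserts an element
except RuntimeError as e:
    print("iteration broken by concurrent mutation:", e)
```

Handling this cleanly means either locking, snapshotting, or exposing a mutation-aware API to script authors, which is exactly the kind of work the comment above speculates Apple deferred.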
3. Apple were clearly having severe problems getting iOS 7 out the door and, by all accounts, took resources off Mavericks and (I expect) iWork to ensure they got the release out. The iOS 7 beta was far more buggy than previous iOS releases. They clearly had a lot to do, and I was surprised they managed to fix the number of deep-seated problems still present in the penultimate (pre-GM) release. There is plenty of evidence resources were taken from Mavericks: just look at the mix of new-style icons for Maps, Pages, Numbers and iBooks alongside the old-style icons for Safari, Calendar and Contacts. Calendar in particular is a strange mix of the flattest iOS graphic style with the old-style icon. It looks to me like they set a strict cut-off deadline and avoided changing anything that didn't make the cut. I expect these inconsistencies will be resolved in future updates. If resources were removed from the project in the lead-up to July, it would have had an impact on the number of features that made the final cut.
Having said all of the above. My daughter who is at Uni, is using the new Pages and much prefers it because of the convenience of the iCloud data storage giving her access via mobile. Essay writing is probably a pretty typical style of use case for iWorks and for such non-demanding work it has a few advantages over the previous release and just generally feels nicer. She can also share her essays with me and I can edit them in the web interface to fix any typos I see while I'm talking with her on FaceTime which is nice and works very well (this feature is actually very good. The web interface is IMO a fair bit nicer than that for Google docs - though admittedly I haven't used that recently so don't know if it has improved much in the meantime). The interface is overall less cluttered, easier to use and provides a better basis for incrementally adding in the missing features and new features on top, but for sure it has a way to go with some of the more advanced features.
"Essay writing is probably a pretty typical style of use case for iWorks"
"Essay writing is probably a pretty typical use for iWorks"?
Bit of a traditionalist, myself. Remember the days when there was a language called English taught, and learned, in schools.
@WhoaWhoa. Your criticism is unwarranted, though admittedly I should have capitalised Use Case. A Use Case is a term of art in software engineering: a workflow defined from the user's perspective, without reference to software or architecture limitations. It is entirely correct, and actually says something quite specific, to say "essay writing is probably a pretty typical style of Use Case for iWorks", because by saying that I am indicating Apple will have been focussing on a set of common Use Cases covering such user needs as "write an essay", "write a report" or "write a letter", and not more esoteric needs such as "do a mail-merge". Use Cases contain a main flow, which is the most user-friendly "do what I want" flow, and can also include alternative or branched flows to deal with exceptions or the unexpected. Indeed my username is a reference to the main flow, which in software analyst nomenclature is referred to as the Success Case: the flow which best and most directly serves the user's needs.
'My daughter who is at Uni, is using the new Pages and much prefers it because of the convenience of the iCloud data storage giving her access via mobile. Essay writing is probably a pretty typical style of use case for iWorks and for such non-demanding work '
I think the key phrase there is 'non-demanding'. Pages is a toy, albeit a pretty one. As soon as you need proper editing features or handling of references and bibliographies, it is completely out of its depth.
Somehow Apple just managed to make it worse.
After the technicolor dog's breakfast that is the new iOS 7 GUI, it's hard to believe Apple even bothers to test stuff before shipping.
Er no, the old and new Pages integrate fully with Reuters/Thomson Scientific EndNotes software, about the most sophisticated citation and Bibliography management solution imaginable.
I find that use-cases aren't a good starting point for UI design: because they're ultimately describing functions of the software, they can only explain what has to be done, but not why, and they give very little guidance on appropriate ways to achieve it.
Normally, I start with a set of user scenarios - each describes a typical user (including relevant attributes such as computer literacy, education, fluency in whatever language the software displays its text in, and co-ordination), and a transcript of their interaction with the software. Each of those interactions would be equivalent to a traditional use-case, but by combining them into a narrative, you make it easier for the developer implementing the system to understand what the intent behind each use-case is.
Also, by humanising the user in each scenario, you remind the implementor of the UI that not all users are like them.
As for iWork, I gave up on Pages as unable to handle even the simplest kind of technical document (you have to resort to hacks to use 1, 1.1, 1.1.1 style section headings, and they have a habit of disappearing across a save), and anything I did in Numbers wouldn't interoperate well with Excel or OpenOffice sheets. Keynote is a good slide-maker, but experience has taught me that a good slide presentation is horribly time-consuming to create, and a bad slide presentation is a waste of everyone's time, so I tend to avoid producing them for talks.
"She can also share her essays with me and I can edit them in the web interface to fix any typos I see while I'm talking with her on FaceTime which is nice and works very well..."
Known as plagiarism, in the trade. Unless, of course, she credits your input to her essays.
Oh, and just because your daughter might say, "Uni", instead of university, it doesn't make you young and trendy to copy her, any more than letting her see you dancing would.
But, Apple... desire for trendiness image... all seems to fit.
"A Use Case is..."
I expect that many readers know exactly what a use case is. However, unless you have access to and can reference the specific analysis documents you suppose exist as a retrospective "justification", the fact remains that dragging superfluous technical jargon into a sentence which is better served by plain English does not improve the style, content, or credibility of your discourse.
"Er no, the old and new Pages integrate fully with Reuters/Thomson Scientific EndNotes software, about the most sophisticated citation and Bibliography management solution imaginable."
That speaks volumes about your powers of imagination.
You're obviously clued in. Plagiarism is taking someone else's work(s) and presenting it as your own or taking credit for it without acknowledging the original creator / artist. I think what is being referred to is more commonly known as proof reading or editing.
@WhoaWhoa. Tip: When you next get bent out of shape, try keeping the urge to type a nasty reply in check. It's only a hobbyist tech discussion and there's no need for it.
@SuccessCase, if your daughter is in "uni" then she probably is becoming dumber as we speak. Dumb is the new smart. Just ask Apple, Google, MS, ad nauseam. Of course a new generation of humans taught to be dumb might actually bring about the next human evolution and eventual extinction when the machine stops. So it's all roses for the extinctionists. Happy deaths to all.
(PS. Not WhoaWhoa typing this. Just another cranky punk ass.)
It's generally quicker, and more comfortable for your hands, to use a four-key shortcut than to move your hand to the mouse, navigate to the correct location (it is a very small square and needs precision), click it without accidentally moving the mouse, and finally move your hand back to the keyboard.
I still prefer the old Office 2003 interface to the new "fluent" user interface of Office 2007-2013. It depends on what you are comfortable with. Office 2013 isn't bad with a toolbar of pull-down menus added. iWork shouldn't have made the same mistake Microsoft did, forcing the changes on people with the iOS-ization of OS X. Hopefully Apple will fix the changes, or people will continue to use iWork '09. Mavericks so far seems to get good reviews. I hope the future of the Mac OS isn't to turn it into an expensive accessory for an iPad. Balancing the cross-platform versions makes sense, but the Mac version shouldn't lose functionality to match the iOS version.
"Oh, and just because your daughter might say, "Uni", instead of university, it doesn't make you young and trendy to copy her, any more than letting her see you dancing would."
I'm American so I don't know the nuances of when English people say "uni" vs. "university" but I've met a fair number of English people of all ages and they ALL say "uni" and I never got the sense from any of them that they were trying to sound trendy by doing so?!
Proof-reading/3rd party editing, however minor and even when done by your mum, is strictly forbidden by some universities. The daughter needs to check the university's policies on plagiarism and malpractice. By the way, no matter how easy it is to use iCloud storage, if it fails for any reason then the daughter is not going to get sympathy from the university. The university will provide its own secure storage and if a student chooses not to use that facility but to rely on iCloud, Dropbox etc instead, then the student will be held culpable if data loss leads to missing an assignment deadline.
No, it's not. It's jargon. Speaking as a software engineer of some 30 years vintage, only managers use phrases like this. It's hardcore buzzword bingo. You've been busted, me bucko!
1. Look up the meaning of plagiarism. It doesn't apply to what I wrote.
2. I'm sorry, that is simply nonsense. Link to a single university policy document where what you say is the case (most of them have such documentation public-facing and searchable from Google). Almost every college advises the complete opposite and RECOMMENDS having someone else proof-read your essays (including Harvard and Yale). That is GOOD PRACTICE, and in almost every walk of life there is recognition that the writer is the worst person to do the proof-reading.
"By the way, no matter how easy it is to use iCloud storage, if it fails for any reason then the daughter is not going to get sympathy from the university" You mean, if your hard disk drive goes down, or your homework gets eaten by the dog, your essay gets no mark? Blow me down; now you mention it, each college I attended when I was younger also had a 1:1 correspondence between getting essays marked and actually handing them in. Those sneaky lecturers seem to be getting a bit tough on our poor little students, don't they?
You have, of course, just given a very good reason for using iCloud storage. Your document is cached locally, plus available in the cloud; plus, like all modern OS X documents, it contains every prior revision in case the latest version gets messed up by an "undetected until it's too late" bad cut-and-paste job or some other such snafu; plus my daughter has Time Machine back-ups; plus she is working on the document just as she would any other, and can still upload it to her university online store as she can with any other document. So really, what is your point?
Actually I got your point. It's "find a way to take a side-swipe at that nasty commenter who has a different opinion to me and suggest he is in some way unethical." Poor. Very poor.
"It's hardcore buzzword bingo. You've been busted, me bucko!"
I suppose RUP, Scrum and RAD are also just buzzword acronyms, minor things no real software engineer worth his salt has heard of or refers to. And my 25 years experience running large scale multi-company, international, software integration projects as an engineer and as a manager where 3/4 of the projects I have ever worked on have had requirements expressed through Use Cases, disqualify me from knowing what I'm talking about. OK.
Apple sucks on this, but they are certainly not the only suckers in town ... these are just some sample MS cockups:
Windows 8 - exactly the same bullshit: treating a desktop like a tablet.
Or specific MS Office bullshit:
OK, grab Office 2003 PowerPoint and move a callout around (by clicking in the bubble): notice how the callout sticks to the position it is pointing to. This is great. Resize the callout and it still points to the object it is supposed to ... now try the same in Office 2010 ... yeah, the callout no longer points at the same position when you resize or move it. This drives people nuts.
Word formatting is getting worse with each new release. Especially when you are working with lists - how often do you copy-paste shit around to get a proper list in Word 2010 ? Thank god they added the same "feature" to powerpoint.
Excel still cannot open CSV files, you have to import them ... or do a "sed 's/,/\t/g'" ... because MS somehow thinks that comma-separated values means tab-separated values ... ;-) Ok, this has not changed, but still sucks golf balls through garden hoses...
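One caveat on the sed one-liner above: it mangles any CSV field that itself contains a comma inside quotes. A quick illustrative sketch (nothing Excel-specific) using Python's csv module handles quoted fields correctly when converting to tab-separated:

```python
import csv
import io

def csv_to_tsv(text: str) -> str:
    """Convert CSV text to TSV, honouring quoted fields that contain commas."""
    out = io.StringIO()
    reader = csv.reader(io.StringIO(text))
    writer = csv.writer(out, delimiter="\t", lineterminator="\n")
    for row in reader:
        writer.writerow(row)
    return out.getvalue()

sample = 'name,notes\nAlice,"likes commas, a lot"\n'
print(csv_to_tsv(sample))   # the quoted comma survives; fields are tab-separated
```

The sed version is fine for data you know contains no embedded commas; for anything else, a real CSV parser saves grief.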
The Mac versions of Office no longer support VBA, because the blokes who wrote the unmaintainable innards of the VBA runtime on Mac had already retired. The Mac version cannot correctly display Windows Office documents, and vice versa. MS cannot even get Office right across Mac and Windows ... ouch.
Thank god I use OpenOffice ... one document displays exactly the same across the following OSes:
Linux, FreeBSD (all BSDs, in fact), Windows, Mac OS, Solaris, AIX, HP-UX. Export to PDF is lightning fast and comes bundled, and there's native SVG support. MS Office sucks, period.
'I'm American so I don't know the nuances of when English people say "uni" vs. "university"'
Of course the Oxford English Dictionary does include the word Uni separately, so perhaps that American knows British English better than certain obnoxious posters on this thread. Certainly better than those who think using Uni to shorten University is done to look trendy but for some reason shortening enough to 'nuff isn't.
[To Kristian Walsh]
"Normally, I start with a set of user scenarios - each describes a typical user (including relevant attributes such as computer literacy, education, fluency in whatever language the software displays its text in, and co-ordination), "
That only leaves you with a product that only the users you know about can make work. I typically start with what needs to be done and successively, iteratively, work on the UI bits that could be misunderstood by a dolt. That way, one doesn't lose sight of the basic facts that underpin the business model, or swamp them with a UI so clever the job can't be done any more.
I could cite an example jointly engineered for a government application of some great importance and high visibility by two of the most celebrated companies in the IT world, an application that took a fairly involved but essentially simple job done on old greenscreen terminals and made something so flexible no-one could use it to do anything at all, but won't because I signed a document saying I would keep certain things secret.
There is no special science to making software for other people to use. You use the same process that is used to make can openers or socket sets or cars, (although I can detect a whiff of "new graduate engineer enhancement" in the design process of the car these days, when people can get into a vehicle and not be able to figure out how to get it to go despite decades of the design needs being quite definitely pinned down).
You look at the job that needs doing, you look at the things people can do and you make the last fit the first as best as you can. Naturally, sometimes you can't get a perfect fit for every case in one tool. That's why you can buy left handed scissors. But you can make the tool work for the vast majority by not falling in love with the build process itself.
It's not plagiarism to get help with typos.
ROFL. At least, do they pay you for this?
The only logical explanation for this downgrade is to make it more usable for the common iTards.
Strewth, bloke writes a great article free of charge and is majorly downvoted for it. What's that about?
This is The Register. When articles are published about Apple that have a negative headline, all the Apple haters pile in to have a dig, and no matter how rational a comment may be, if they think it is even slightly Apple-sympathetic, they will down-vote it. I've learned the measure of success on here isn't down-votes, but whether a comment that carries a large number of down-votes has any rational (non-trollish) replies engaging with its argument. When there are only emotional responses and ad hominem attacks, that is actually a pretty good indication the comment has had impact. If there were good arguments against it, you can be sure the down-voters would be making them.
Down in the South Pacific it is often called Varsity.
Never went there myself.
Is proofreading really forbidden? Given that it is axiomatic that you can't proof read your own copy I find that astoiunding. Of course there are people who think they can proof their own copy, but they are worng (see what I did there? Did I make any other non-deliberate mistakes - probably). Do lecturers enjoy reading essays with lots of typos?
...lighten up, Mr Success. No one said you didn't know your job. Don't let a bit of light joshing from a self-confessed grammar nazi and 'campaigner for plain English' penetrate your armour so easily...
"Re: Ribboned for your pleasure
You're obviously clued in. Plagiarism is taking someone else's work(s) and presenting it as your own or taking credit for it without acknowledging the original creator / artist. I think what is being referred to is more commonly known as proof reading or editing."
Plagiarism and editing (by another person) are not mutually exclusive.
Plagiarism is passing off another person's work as your own. If another person edits your work, making changes, and you do not credit that other person's changes you are passing off their work as your own. There are contexts in which that's normal and expected, but submitting work for assessment in an academic context is not one, unless it is explicitly permitted.
And once the mental "adjustment" has been made to allow a parent, for example, to help little Johnny or Jane to (they hope) get a better grade than they could have done by themselves, it becomes a small step to suggest a "tiny" extra example or a "tiny" rearrangement of words or ideas. And if that tiny change is really just part of the editing, then a slightly bigger change would "help", and would be OK, wouldn't it?
It used to be the case that pupils and students were expected to learn to do their own proof-reading and make their own spelling and punctuation corrections. It is more common now in some contexts (in the UK at least), especially in schools, to look the other way, because so much focus is put on grading schools by often-meaningless (but easy to measure) criteria designed by desk-dwelling civil servants and administrators, criteria that put huge pressure on schools to increase their pupils' grades.
Looking around, at the "leaders" of industry and commerce, the political "leaders", the "getting away with what you can" approach is the example set almost everywhere, almost all the time. So little surprise if many parents become complicit in this, especially with their own little Johnny or Jane.
But there are also examples a-plenty where getting someone else to do the work you need to do for yourself does not wash. In fields in which real ability rises to the top and no excuses can substitute, people work by themselves; parents, coaches and teachers do all that they can to encourage Johnny or Jenny to do that work, but they do not do it for them. A sports trainer and an athlete know that a press-up done by the coach does not develop the athlete's fitness. Likewise with scales and musicians, or potential winners of the Fields Medal and PhD supervisors. The teachers might, and do, demonstrate and explain, but they do not do the learner's work for them. Of course in such fields it is quite obvious that a trainer's press-up does not improve an athlete's fitness, or a teacher's scale a musician's skill. But when assessment is viewed primarily for the sake of the resulting grade, rather than as a tool for improving ability, there is a temptation for a parent or teacher to "inflate" the grade by their own direct contribution, knowing that they will probably get away with it. This is short-sighted and counterproductive, since the son / daughter / student is not improving in the way that the athlete doing their own press-ups or the singer practising their own scales is improving. Worse, they are being trained to get away with someone else doing their work for them when they can: a habit that can lead to a lifetime of not developing as well as they might have.
If a university explicitly states that a student may ask a parent to proof read and correct work, that is one matter. Many universities have regulations that state that work submitted by a student for assessment must be all their own. Often, even, students are required to sign a statement to this effect. Not that long ago such a statement would have been superfluous as everyone would have understood that to be so, but such have been the shifts in "acceptable" standards of integrity that explicit statements are now seen to be necessary.
Once, politicians caught with their hand in the till would have immediately resigned, without prompting, for they knew they had violated a standard that was inviolable. Now government organisations feel it's OK to side-step their own laws and spy on the very citizens to whom they are, in theory, accountable. When acceptable standards shift, the consequences can be subtle and wide-reaching as the "getting away with it" approach spreads into other fields.
So, having a little additional one-to-one session with a university-attending son or daughter to show them how to use a dictionary and how to construct sentences or punctuate is one thing. Editing their work whilst on the 'phone is trading a short-term mark gain for a long-term disservice which diminishes their independence, integrity and honesty. It is the parent's choice how they wish to influence their offspring, but making even an ever-so-slight choice towards honesty or dishonesty, independence or dependence, has a cumulative effect over the years, for which the parent might feel growing pride, or a tiny bit of sadness at the slightly tarnishing, character-diminishing effect of reduced integrity.
"And my 25 years experience running large scale multi-company, international, software integration projects as an engineer and as a manager where 3/4 of the projects I have ever worked on have had requirements expressed through Use Cases, disqualify me from knowing what I'm talking about. OK."
The problem was not distinguishing between a software design document, in which context use cases might be entirely appropriate jargon, and a discussion in a forum about your personal view about the suitability of a word processor for an essay writer. The essay writer might use a word processor, but they would not, knowingly, apply a "style of use case" to their essay.
I refrained from asking what you meant by a "style" of use case, though that flipping between the formal and informal use of language was, um, "interesting".
There are times to step back from ingrained management speak and renew an acquaintance with the English language. A forum discussion can provide good opportunities. A technical forum might, even, be a particularly good place to use English and jargon when each is appropriate.
"Certainly better than those who think using Uni to shorten University is done to look trendy but for some reason shortening enough to 'nuff isn't."
Glad you spotted the irony.
Or did you?
"It's not plagiarism to get help with typos."
If it's just a typo, help isn't needed, just a bit of effort to re-read and correct your own typing mistakes. If another person made changes that you wouldn't have made and you pass those changes off as your own work by not attributing them, that is an example of plagiarism, although you might choose to view it as a trivial one.
If there were no problem, the son or daughter would presumably have no reservations about putting, "daddy helped correct my typos" at the bottom of the work...?
"Do lecturers enjoy reading essays with lots of typos?"
No. And that is the sort of reason why so many years of education are focussed on teaching children to read and write properly. However university is a place where high levels of that competence should be taken for granted. Unfortunately, there are many universities where that is no longer the case. That does, though, make it easier for the better ones to shine as a result of not compromising their expectations of students.
A parent correcting their university son's or daughter's typos is an indictment of the offspring's education and self-sufficiency. I know a fair few students who would refuse to accept such parental hand-holding in an area in which they felt they had already passed their rite of passage; and parents who would, similarly, no more offer such help than offer to wipe their grown up offspring's bottom. I know other parents / children who would act differently.
"Re: ...disqualify me from knowing what I'm talking about. OK,
...lighten up, Mr Success. No one said you didn't know your job. Don't let a bit of light joshing from a self-confessed grammar nazi and 'campaigner for plain English' penetrate your armour so easily..."
:-) :-) :-)
The Register's enduring appeal is that there are many participants who "get it". And those that don't contribute every bit as much, in their own way.
Have a pint. Here's to slurd spech.
I did not say that proofreading constitutes plagiarism. I said that some universities consider it to be malpractice and that students should check the policy. Here is an example of such a policy: http://www.leeds.ac.uk/AAandR/cpff.htm. I did not say that anyone was being unethical; what I said was that students need to check the policies at their own universities before asking anyone to proofread their work.
Universities have procedures for students to apply for deadline extensions or for mitigating circumstances to be taken into consideration if they are unable to complete assessed work. However, loss or failure of personal IT equipment, including cloud storage, is unlikely to be considered grounds for an extension or a mitigating circumstances appeal. Universities provide facilities such as secure file storage and remote access for students. If these fail and a student loses work then the university is obliged to take this into account. If the student keeps all her work on her personal laptop, portable drives or personal cloud storage then the university will take the view that the student does not deserve special consideration, because s/he could (and should) have used the university facilities. Your daughter should check the policies on extensions/special circumstances and on cloud storage at her own university.
"Given that it is axiomatic that you can't proof read your own copy I find that astoiunding."
You can. Writers do it all the time. Even school children are taught to proof read their own work.
"Of course there are people who think they can proof their own copy, but they are worng"
They're not. See above. Or consult a dictionary, or even Wikipedia.
"(see what I did there?"
If that had been an accidental mistake you'd have demonstrated that it's easy to miss mistakes when proof-reading your own work, especially if you do it straight after writing it. But you left it intact, so if it had been a mistake originally your proof-reading had worked, and you chose to leave the mistake in to make a point.
However, "astoiunding" highlighted the difficulty better.
This feels like matching parentheses in Lisp. Deservedly so. Time for a coffee. (And to lighten up a bit. ;-) ).
"When articles are published about Apple that have a negative headline, all the Apple haters pile in to have a dig, and no matter how rational a comment may be, if they think it is even slightly Apple-sympathetic, they will down-vote it."
When companies portray themselves as supreme examples of "style" over substance, parade their corporate egos and whip up their acolytes to parade their individual egos similarly, comments here often reflect a disdain for the approach.
Yes, there is pointed humour, too, sometimes. And certain types of corporate behaviour tend to invite such attention.
"perhaps that American knows British English"
Or perhaps they know American?
"Time for a coffee. (And to lighten up a bit. ;-) )"
So when you want to lighten-up you grab yourself a coffee? It's making sense now...
"Glad you spotted the irony.
Or did you?"
We all probably would have if we thought you were smart enough to have meant it ironically. Unfortunately your bizarre rantings on this thread have shown otherwise...
Another 30+ year veteran here.
The developers in the office implemented an order entry page. Now me, I would have expected to see a total value at the bottom, so one knew how much one had spent on the order - preferably updated as each line was entered. Apparently there was no "Use Case" for this "feature", so there is no total.
This was my first confrontation with the phrase "Use Case", and that phrase is now batting zero for one.
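For what it's worth, the running total the use cases apparently never captured is a one-liner anyway (the order-line structure here is invented for illustration):

```python
# Hypothetical order lines: (description, unit_price, quantity)
order_lines = [("widget", 2.50, 4), ("gizmo", 9.99, 1)]

# Recompute as each line is entered; trivially cheap at order-entry scale.
total = sum(price * qty for _, price, qty in order_lines)
print(f"Order total: {total:.2f}")
```

Which rather supports the point: the cost of the feature was negligible, so its absence is down to nobody writing it into the requirements, not to any engineering difficulty.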
'I'm American so I don't know the nuances of when English people say "uni" vs. "university"'
I have to apologise. That was a bit below the belt.
I think the recent US two-fingered salute to the rest of the world regarding spying has tainted my reaction to things American, although I recognise that many US citizens are equally appalled by increasing disdain of politicians and spies for the very population they are employed to serve.
"We all probably would have if we thought you were smart enough to have meant it ironically. Unfortunately your bizarre rantings on this thread have shown otherwise..."
In contrast, you present carefully argued lines of reasoning.