1115 posts • joined 10 Apr 2007
Re: Dark matter
The boiling water freezing faster has been explained, or at least explained to the point that there are few arguments about it.
We're still utterly knackered on the first problem though. Tens of thousands of hours of diligent research and we're still no closer to understanding.
Just a thought... if this type of memory is used to replace RAM in a system then just how the hell are we going to get an errant OS working again? The main fallback situation of IT support is still "turn it off, wait, turn it back on" (i.e. restart)
We'll be knackered unless there's a hardware, not software controlled, function to clear the contents of memory and to start again from a clean slate! :)
...is there a song along the lines of "I so lonely" interred in there as well?
Re: Keep in mind the human brain runs at <15 Hz.
There's no such thing as analogue. At some point the measurement is discrete; it's just that the resolution is very fine. e.g. you can't have half a photon or atom. OK, so you can, but things get a bit interesting at that point and we're generally interested in stable structures.
The human brain is a massively parallel pattern matching machine - emulating this in a procedural computational environment is never going to be optimal.
Who's pushing this exactly?
...as in, where's the money? Who benefits financially from BYOD?
It's not the business - only an utter twat of a bean counter will fail to see that the relatively small saving on hardware is massively outweighed by the additional costs of, usually, new networking kit, management systems and disparate system support. The business will still have to provide the software it needs the users to use, and police the licensing of it - or is part of this scheme to force employees to also purchase the full and latest versions of Microsoft Office, Microsoft Project, Microsoft Visio, Adobe Acrobat, <x> Antivirus, plus whatever more specialist software may be required?
Users don't benefit either... Oddly, despite the rhetoric about users preferring their own systems for their employer's work, given the choice of spending hard cash on the hardware and software without corporate discounts, they'll be looking at up-front costs of at least £1000 - and given typical MS Office prices, more like £1500. OK, the business could purchase some of the software, but there are extensive licensing issues involved - MS's licences state that the software can only be installed on systems owned by the business. And what about when the employee leaves? Should the business buy back the system, or should it take control of the system (which could have interesting legal issues) and wipe it thoroughly before the ex-employee leaves the premises? What about the users who don't want to spend their own money to do their job? Talk of enhanced benefits and other perks is just that - talk; realistically the majority of employers will not give employees anything extra to compensate them for using their own systems... that's yet a further cost.
Employees are paid to do a job and are usually given the tools to do it. There are exceptions to employees being given the tools, but they are few and far between. Computers in the workplace are *tools*; they are not there for the employee's personal pleasure. Would a guy on a production line be expected to buy his own socket and screwdriver set so he can assemble components - and would his choice of tools actually work properly and not cause production problems, possibly very expensive ones?
What about when an employee's own system breaks? Who replaces it, and who's at fault? Did a cleaner knock it when cleaning the office, did the system overheat due to a clogged fan, did the user's choice of AV software fail to stop an exploit hitting the system? Who's going to pay for these to be fixed? How about when a user's system contracts some nasty malware that a corporate system would have prevented from running, but on their own system took hold and exploited and infected 50 other systems in the business? Who pays for the clean-up? If it's no longer the business's problem, since a user's system caused it, should the BYOD user be forced to pay up for their own lack of computer security knowledge?
So who's driving this and is likely to benefit?
The PC manufacturers? A consumer starting as an employee is more likely to buy a new system than press an older one into service.
The network kit suppliers? New kit will be required to replace the existing, perfectly serviceable kit, in order to segregate and protect systems from each other... if you haven't looked into it properly yet: without adequate hard networking policies in place, each of these devices will be able to trample across the entire network. The solution is to effectively segregate every end point into its own VLAN and carefully marshal access to the required internal resources, and to protect the device from, and protect, the wider Internet (nobody needs their Internet connection blacklisted for spam because of personal systems sending junk mail). Pretending that Windows Server can manage this is putting the gate in the wrong place... it's too late by the time a device happens to have been authenticated and checked on the domain.
The management software suppliers? They have a new market to push, and they have the tools and a desire to maximise profits by selling management systems that may just about work, then selling ongoing support and updates. Nobody in their right mind is going to think that a single purchase of this software will do the job; devices and software change all the time, therefore the management systems have to be updated on an ongoing basis as well.
What this issue does bring to the fore, though, is a good discussion about keeping system selection and software use alive and dynamic, rather than a monolithic one-size-fits-all approach to IT. A more agile IT provision, where the appropriate tools can be reviewed regularly or, in some cases, on demand, fits the modern business environment much better than introducing expensive costs to satisfy a few buzzword-toting sales reps.
"yeah if you can remember all the bloody commands and options.. and lets face it... how many of us really do that?"
Careful, that's close to blasphemy to the die hard command line or nothing zealots.
I can never remember the bloody commands and options either. While there used to nominally be standards for the structure of these, many apps have just done what they feel like. The result is having to get the help up for the app, inevitably scrolling back through 3 pages of advanced options I'll rarely care about to get to the basic options, then trying to remember to type the bastard things in the right order as the text has (typically) scrolled up the page and out of view. I know I can use multiple terminal windows - I do it all the time - but spawning extra terminal sessions just to look at help feels excessive. An extra terminal window to look at command line parameters requires, of course, some form of windowing user interface... the shame of it!
Alternatively I can right click on the .tar.gz file and select "Extract here" and the appropriate commands will be fired off to decompress and unpack the file into the appropriate directory structure. It'll even give me a progress bar so I can see how far along the task is. Yes, if I did the process multiple times a day I'd remember the command line I'd need, but I have better things to do than that especially when somebody has already written a labour saving tool to do the job... The computer's job is to assist me, not to give me more things to remember.
Alpha radiation light?
"In any case, spacecraft don’t really concern themselves with lightweight stuff like alpha radiation."
Hardly... it's the heaviest (most massive) radiation by far, and the most dangerous to humans. Luckily our skin - a handy layer of dead or soon-to-be-dead cells, coated on the outside in oils and bacteria and for the most part topped off with clothing - is impervious to alpha radiation. Ingest a source of alpha radiation and you'll be in trouble though.
Re: No interest..
It's hard to know whether it's aversion or apathy... the UI is ugly, crap and uninviting, the prices of the devices are poor compared to other "similar" (in the eyes of consumers) devices, and to top it off, they have a nasty habit of looking like just another line of new laptop-sort-of-tablet things on a shelf.
Consumers browsing in a store like that tend to want to look at what they perceive to be consumer items, not laptop-sort-of-tablet things with ugly interfaces. In other words, the target marketing direction is all over the place. Again. Consumers don't generally get excited by just another fecking Operating System... they'll tend to get much more excited by real tangible devices that just happen to run that Operating System. Does the Apple marketing fluff go on about the latest version of iOS or does it ignore that and sell the gadget? (That might happen to have new features as a result of the latest iOS, but that's not the core of the marketing message)
Re: Supermarket shelves
If it's similar to the products I've worked with, the device power and data comms run through a convenient track rail on the edge of the shelf. LCD displays need continuous power anyway, so may as well shove data along at the same time and there are plenty of off the shelf protocol implementations for single wire comms involving multiple devices on a single data bus. The smarter ones even have full two way comms so the target device can acknowledge receipt and operation of the data received and a heartbeat function is usually implemented as well. Both of these should be almost required for remote price display functions in a supermarket.
A plastic or e-ink display could bring the price down and produce a more legible display for the end user - i.e. shoppers with a wide range of eyesight.
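The two-way comms with acknowledgement and checking described above can be sketched as a simple framed protocol. This is a hypothetical sketch only - the frame layout, addresses and commands here are my own invention, not those of any real shelf-edge product:

```python
# Hypothetical frame layout for a single-bus shelf display:
# [address][command][payload length][payload][checksum]
def frame(addr, cmd, payload):
    body = bytes([addr, cmd, len(payload)]) + payload
    checksum = sum(body) & 0xFF          # simple additive checksum
    return body + bytes([checksum])

def parse(msg):
    body, checksum = msg[:-1], msg[-1]
    if (sum(body) & 0xFF) != checksum:
        return None                      # corrupt frame: target sends no ack
    addr, cmd, length = body[0], body[1], body[2]
    return addr, cmd, body[3:3 + length]

# e.g. a "set price" command to the display at address 0x12
msg = frame(0x12, 0x01, b"9.99")
print(parse(msg))                        # (18, 1, b'9.99')
```

In a real system the target would echo an ack frame on successful parse, and a periodic heartbeat frame would use the same layout.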
True, but in the end if he's responsible for it then he should control the actions of his teams, not let them produce UIs like TIFKAM. There's a certain amount of delegation, then there's common sense.
It makes you wonder if Julie Larson-Green is blind, colour blind, or has tentacles instead of hands and arms. Probably deaf as well, as she can't have failed to hear the complaints. To be fair, it may be that Julie Larson-Green was just putting in place what she was told from above and does have a clue about UI design. Unfortunately, given the management and working practices at Microsoft, there's no way she could criticise anything coming from above.
So he oversaw the worst UI MS have foisted on users since MS Bob and miraculously he leaves just after it sinks, erm, launches.
The underlying OS is good enough, a moderate improvement on Win 7. The UI shell on top is spectacular in its ignorance of UI practices, general ugliness (subjective, but commonly held) and inappropriateness for anything other than a small touch screen.
So far I haven't heard anything other than complaints of disgust from users who have had it foisted on them. Naturally, the complainers are going to be louder than the rest, and there's always a distrust of "different", but this is markedly worse than the step from XP to Vista/Win 7.
Re: Could herald TV's renaissance
Many networks don't give a shit about the indicated advert points and just run adverts, cutting automatically into whatever happens to be on screen at the time. Not only is this exceedingly annoying and tends to ruin whatever is being shown, it's even more frustrating when 20 seconds after the advert break a pre-set advert break transition comes in (i.e. a scene change with a pause - many shows have them).
Buyers of consumer HP kit, on the other hand – the kind sold through big-box retailers – should plan on getting used to Windows 8
In other words, plan on getting confused as fuck while glaring at an ugly, idiotic user interface that with a lot of random pokes and swipes you may just about get to work on a small touch screen device but is utterly retarded on anything else.
So far none of the feedback I've had from people I know who've been lumbered with this POS is in any way positive. It's a shame really; the underlying parts of Win 8 seem to be better, it's the half-finished, schizophrenic user interface that's been shoved on top that's the problem.
Re: A lesson to us all
backup > NULL
Re: It is our duty to donate :-)
Analogue stick docking? Pah! Unless you could dock at full speed using a digital joystick then you were nothing more than Mostly Harmless.
Hard experience showed that it was often safer to dock manually than wait for the docking computer to scrape every face of the space station or attempt to dock with the rear opposite face of the station... where there was no docking bay. The grinding noise (well, naff shield damage noise) as it tried to shove the ship in at 90 degrees to the orientation of the docking bay slot was also far too familiar a sound.
Re: "sumptuous graphics"
The algorithm isn't line based; it's a fairly simple polygon winding check. It's for this reason that there were no concave polygon structures - i.e. no sticky-out bits or engine nodules on separate spars of the model.
The smarts in the code then decide which polygons (given shared vertices) are visible from the winding, normalise the vertex list (to ensure a line shared between polygons isn't drawn multiple times) and then draw the lines between the visible vertices. The polygon filling and this algorithm were beyond the processing capabilities of many of the 8-bit systems...
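A minimal sketch of the sort of winding check described: once a polygon's vertices are projected to 2D, the sign of its signed area tells you its winding, and hence whether the face points towards or away from the viewer. The function names and the convention (counter-clockwise = visible) are my own choices for illustration, not the original game's code:

```python
# Signed area of a projected polygon via the shoelace formula:
# positive for counter-clockwise winding, negative for clockwise.
def signed_area(points):
    area = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        area += (x1 * y2) - (x2 * y1)
    return area / 2.0

def is_visible(points):
    # Convention here: counter-clockwise faces the viewer and is drawn.
    return signed_area(points) > 0

front = [(0, 0), (1, 0), (1, 1)]            # counter-clockwise: drawn
back = list(reversed(front))                # clockwise: culled
print(is_visible(front), is_visible(back))  # True False
```

The shoelace formula only behaves like this for convex (non-self-intersecting, non-concave) polygons, which is consistent with the "no sticky-out bits" restriction above.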
"what do people program in that customers with older OS/Browsers can still use?"
How about industry standard HTML? It's worked for years (with the exception of ****ing idiots who insist on putting in IE-only "features") and is the underpinning of the additional features available in HTML5. A good HTML5 website (application) will still be usable in browsers that do not support all of the new features; a bad one will crap itself and be unusable... and if the application is still usable but not quite optimal, that is much better than not usable at all. In any case, having compatibility like this allows the application to support the required accessibility laws and guidelines.
because customers are starting to switch to Hyper-V, the Windows hypervisor-based virtualisation system which comes bundled in with Windows Server 2012, rather than paying extra for VMware.
This is probably one of the most important points... I haven't tried the latest version of Hyper-V, but certainly against previous versions VMware was markedly superior. However, when it comes to a price comparison, free (and convenient) versus separately paid-for are very different choices, and while Hyper-V costs do stack up as soon as you start expanding your requirements, it's that initial hurdle that can be hard to beat. MS is just front-loading convenience to get users used to Hyper-V as a form of lock-in for later, and it's something VMware would find hard to counter in a similar fashion without losing a large proportion of their income.
So why a whelk particularly?
Because the use of butterflies and chaos is so last century?
Re: Volume Licensing
You're right - that is what I really meant rather than "comes with" - it's more "only supports" as you noted. This is particularly true for laptops that tend to be less supported for older OSes.
So... the ongoing legacy of lazy, short-sighted, incompetent(*) developers cobbling together applications that only run on specific versions is holding Enterprises back. Who'd have thought that?
* to be fair, many will have had this foisted on them by much more incompetent PHBs who had no interest in long term support, applicability, security, stability or anything other than that month's statistics.
Next summer will mark 3 years for most older desktops which, given corporate desktop replacement schemes, means the systems are due to be replaced even if they still work fine for the job at hand. These new systems are likely to come with Windows 7, so increasing its market penetration shouldn't be that unrealistic; however, "70%" will likely depend on what the marketeers decide to classify as "Enterprise"...
I think they're really running out of ideas and can't do anything original. It also looks amazingly like Blizzard are trying to expand their player base to the under-8s... the whole damn thing is pretty much Kung Fu Panda meets Pokémon! You can't do that without intentionally targeting young children.
While it was revolutionary when it came out (in some ways, not others), the game is very, very repetitive... a.k.a. grinding for those who are used to MMO terminology. While this expansion has added a few new twists, the majority of the PvE game is "Collect/Kill X of Y" or "Take X to Y". Instances, bosses and PvP do add a lot more interest to a game as they force human to human interaction which is what makes these games fun, but there's still not a lot there.
It will be nice to see what, if anything, Blizzard can come up with next - but it'll have to be very clever and well thought out. The intricate dynamics of an online economy and repetitive, or accessible, game play that covers both casual players and addicts alike is not something that is easy to get right, or even to get not utterly-broken-and-easily-exploitable which is the usual starting point!
Re: 3 for good data? Really?
I made the mistake of switching to 3 (for stupid reasons, due to Carphone Warehouse and T-Mobile being cretins)... and haven't had a worse data network for years. And that's saying something, as I had the "pleasure" of using some of the earliest mobile Internet PC Cards and USB dongles.
For family reasons I frequently spend time in Cornwall. 3 and any data coverage down there? Forget it. The useless POS network might have reasonable voice coverage, but as they decided to only supply 3G connections and, where those aren't available, back this up with precisely nothing, a good chunk of the South West of the country is left in a black hole. Not just those areas either - I'm often about the rest of the country and finding fresh no-Internet zones, usually wherever I'm staying...
I really need to go and slap Carphone Warehouse for being useless, and 3 for providing a service that is not fit for purpose... which should be enough to cancel the contract.
Not likely. This is the 21st century and phone users treat them very differently to computers - even if you tell them that the phone in their pocket has as much computing power as a desktop computer of only a few years ago.
A smartphone is a commodity device - it needs to effectively manage itself. This means automatic over-the-air updates, preferably over Wi-Fi so as not to blow mobile data limits, though if the carriers got properly involved these could even be excluded from mobile data limits. Various carriers have implemented custom features for specific phone manufacturers, so this isn't out of the question by a long shot. The upside for a carrier is that despite the additional bandwidth, which frankly doesn't cost them a lot, they will likely have less malware and fewer problems on their network, keeping it cleaner generally - and better-performing phones are more likely to retain users' loyalty to the carrier, because the majority of users tend to associate carrier and phone performance together.
However, these updates must be diff-based, not enormous downloads of the entire OS for every update... as soon as the updates get big and require frequent restarts, users will try to find ways to avoid them, as they'll see them as a meaningless bind.
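A rough illustration of why delta updates matter, using Python's difflib on a toy "firmware" of text lines. (A real OTA system would use a binary delta format such as bsdiff rather than a line diff; this is just to show the size difference.)

```python
import difflib

# Toy "firmware image": 1000 lines, of which one update changes a single line.
old = ["config line %d\n" % i for i in range(1000)]
new = list(old)
new[500] = "config line 500 (patched)\n"

# The delta describing the change is a handful of lines,
# versus shipping the full 1000-line image again.
delta = list(difflib.unified_diff(old, new, lineterm="\n"))
print("full image: %d lines, delta: %d lines" % (len(new), len(delta)))
```

The same principle scales to OS images: a delta proportional to what changed, instead of a download proportional to the whole OS.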
...and all this was in an age where optimisation meant changing the program so it ran faster, not buying newer, faster hardware. Newer versions of software also tended to come with software optimisations and new features, not just bloat that ran slower on the same hardware but ran roughly the same speed on much faster new hardware.
It was an art form back then, tweaking your system, usually to make it faster - for example, to boot in less than 5 seconds. I remember getting a system to the desktop (graphical shell) in under 2 seconds; and this was a fully usable desktop, not the sham from Windows XP onwards where the desktop may display relatively quickly but you still have to wait another 30 seconds for it to be usable. I don't especially miss the tweaking of systems to make them run faster, and I definitely don't miss the IRQ table and memory allocation juggling that came with DOS, but considering the sheer processing power available on even a basic PC, it's disappointing just how slow they are.
Yes, yes, all very well.
But was she good looking? This is all that matters when it comes to IT and girls. Especially on a Friday.
Re: We have all been there...
I think it was XKCD that said something like "there is no force in the universe that can stop a geek desperate for an internet connection."
Re: Poor choice of materials?
Really not sure what you mean...
If you're dumb and stick your phone in your pocket along with your keys or anything equally hard and pointy, then yes you will, before long, do significant damage to it.
This is the same with iPhones, iPods and other mobiles as well. I've seen iPhones that are 3 years old and largely immaculate, I've also seen iPhone 5s that are already looking worn and knackered. It depends on how you look after them.
I have an S3 and I frequently have it in my back pocket and forget and sit on the sofa - no damage or cracks at all. I have it in a front pocket at all other times, with no nonsense like a screen cover or case and, still, no damage at all. I've dropped it a few times, usually onto carpet or wooden floors from a metre or so and have yet to damage it. Not intending to test the drop onto concrete though...
Not strictly true, because they'd still pay wages and this is money entering the local economy.
Those pints don't come free you know...
I felt that initially, but even though it's largely only platform based, I've found it to be quite useful.
Emails and messages come through and are sent at stations, allowing time to read and reply before the next station.
When standing at a station, it's often useful to look up the conflicting excuses or omissions on the delays.
Often the time at a station is also enough to quickly load up a web page or document if I feel the need to read something else.
A real shame, what he managed with the limited resources available compared to many of the contemporary games at the time was staggering. Many modern games lack the depth of these.
I remember being gobsmacked by Lords of Midnight when I first played it.
Let's raise a drink to him.
Re: It keeps me awake at night, it does...
Having swum with them and seen how they act... very true, they certainly do have an intriguing sense of mischief. Sometimes it borders on the painful but, considering their sheer strength, they're generally very careful with what they do.
Not sure why they'd need any help when it comes to killing frogmen, though, considering that a dolphin tactic for killing sharks is simply to swim very hard and fast at them and hit them in the side - either breaking the back or causing enough harm that the shark, however hungry, will leave very fast if still able... and they use team tactics to make this kind of attack possible.
"gnarly"... now that's a word that I've not heard for a while. May have to drop it in randomly from now on... :)
...it's not hard (at least not for me), but it's not taught well either, and while there are quite a lot of very good tools for it, the mainstream ones, such as MS Visual Studio, do not provide the level of support required to really do a good job. If you've used the Intel Parallel Studio tools, you'll see just how good the tools can (or, if polished, should) be - they're not perfect, but anybody who's attempted to debug parallel problems (or just multi-process/threaded apps) in plain old Visual Studio will appreciate the difference.
For years the mainstream of Windows/PC development has been single-thread algorithms running on faster and faster processors; the transition to a mindset where you can distribute load is quite a shift for the majority of developers who are used to A > B > C > D code and nothing else. There will always be a need for that kind of code, even in massively parallel systems; it's just not the only way to do it. Just the switch to A > [B|B|B|B|B|B|B|B] > C > D is enough to cause many developers to run for the hills.
But then I bought a dual processor (real processors, not cores) AMD board as soon as it was available and started playing with that years ago. I also vividly remember (even more years ago) rather upsetting a Uni lecturer who taught both neural networks and concurrent programming courses, by implying that he was a bit of a plank for not using double buffering in his neural net simulations: for a moderate increase in storage requirements (not really a problem even then), the solution removed almost all of the coding stupidity and algorithm-induced problems in his network simulations.
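The A > [B|B|B|...] > C > D fan-out above can be sketched with Python's thread pool; the stage functions here are placeholders of my own, just to show the shape of the change:

```python
from concurrent.futures import ThreadPoolExecutor

def a(x): return x + 1       # sequential first stage
def b(x): return x * 2       # the stage that can be fanned out
def c(xs): return sum(xs)    # gathers the parallel results
def d(x): return x - 1       # sequential final stage

# Sequential shape: A > B > C > D
# Parallel shape:   A > [B|B|B|B] > C > D
def pipeline(items):
    staged = [a(i) for i in items]
    with ThreadPoolExecutor(max_workers=4) as pool:
        fanned = list(pool.map(b, staged))   # B runs concurrently
    return d(c(fanned))

print(pipeline([1, 2, 3, 4]))
```

The point is that only the independent B stage is distributed; A, C and D stay sequential, which is exactly the "not the only way to do it" shift described above.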
While I appreciate the sentiment about the insult to movies, telling people to piss off home is a bit wrong.
Many of those you might want to go home would consider their home to be quite close to, or next to, mine and are likely to identify themselves as English people who happen also to be Muslims. This response does vary between generations, but those born here (as were their parents) tend to consider themselves English of X descent - not really different from those who consider themselves, for example, English of Irish descent. Being bloody-minded and intolerant is not a singularly Muslim fault; practitioners of pretty much every established religion can be labelled this way at some point. It's just unfortunate how something as pointless and trivial as a poor movie can be thrust into the limelight and used as a totem to gather around and cause trouble - if it had just been ignored it would have been forgotten, as it probably should have been in the first place.
Now can we please have some more protests about those dastardly Pythons and their heretical movie, The Life of Brian?
Why buy one?
"If you’re an Amazon customer already, there’s no reason not to buy one, unless you really feel the need to be able to buy e-books from multiple vendors."
Am I the only person who uses their mobile to read e-books?
If you find the text size too small (I don't), then you can increase the text size. OK, so you have fewer words on a page and have to change page more often, but that's not so bad on a phone display...
The display updates much faster than an e-ink display, and you don't have the whole-page display purge every 8 pages or so. The resolution of a phone is easily high enough on most models to not be hard on the eye, and the greater colour range and graduation between shades makes anti-aliasing more effective. Being able to change colours is nice as well, as white-on-black or vice versa can be hard on the eye. Where e-ink really wins is low power usage and daylight readability, though it fails a bit in the dark, requiring additional lighting - but that isn't a particularly serious engineering challenge, and even for devices without it you can get cases with lights embedded in them.
Only one device to carry around - this is one of the biggest advantages there is. I don't carry around a separate MP3 player either, so why carry around a separate e-book reader? The most comical sight I've seen, all too often, is somebody playing music on an iPod, reading a Kindle and then occasionally pulling out their mobile to check text messages.
Choice of multiple stores and reading apps - want Kindle, B&N or other e-book readers (such as the quite handy FBReader) all on one device? No problem.
In my mind, the outstanding battery life of e-ink readers and their daylight readability are the only positive points - and both are either not an issue or can be worked around using mobile phones as e-book readers.
Re: Then divide by the number you first thought of
OK, so what is division? Division is asking how many of one number would be needed to make up another. For example, how many of these slivers of cake make up this whole cake?
The scale you measure in is mathematically meaningless.
For example, draw a circle and draw a couple of lines from the centre to two points near each other on the edge (creating a sector, sometimes called a slice in a pie chart). You can calculate how many of these sectors make up the whole circle by dividing the circle's circumference by the length of the arc describing the sector (the arc length is the distance along the circumference between the two points). It doesn't matter if you describe the circumference of the circle as 1 metre, 3.3 linguine or 1000 mm, as long as you measure the arc in the same units. In this case, if the circumference was 1 metre and the arc was 0.125 metres, then you are dividing by a number less than 1 - how is this technically impossible? If instead you recorded the measurements as 1000 mm and 125 mm, the end result would be the same.
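The arithmetic above, spelled out - the units cancel, so the answer is the same whatever scale you measure in:

```python
# Same circle measured at two scales: dividing by a number less than 1
# is no different from dividing by the equivalent number of millimetres.
sectors_in_metres = 1.0 / 0.125          # circumference 1 m, arc 0.125 m
sectors_in_mm = 1000 / 125               # circumference 1000 mm, arc 125 mm
print(sectors_in_metres, sectors_in_mm)  # 8.0 8.0
```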
Re: I'm with apple on this one
No, I don't think most people would care, let alone mind. However it would be one less thing for Apple's marketing droids to shout about, and that is much more important.
I'm with apple on this one
I'm with Apple on this one... they admitted that there are shortcomings in short camera lens and sensor assemblies, due to side light hitting the sensor, and gave relatively simple workarounds to help users avoid it. Yes, it's probably worse on the iPhone 5 because the sensor is better and the package is shorter, letting more incidental side light in and picking it up better.
The most viable fix is probably to lengthen the overall lens and sensor assembly but the result of this will make the camera thicker and there's no way that they'll be keen to do that as there are currently relatively few marketing-droid fluff points that can be flung around about the latest iPhones as it is. As a result they need to stick with the points they do have and not lose them... i.e. light, and thin - something the iPhone 5 does very well.
This is quite different to the cock up of having poor (laminate) insulation over the external antenna which allowed sweaty palmed people (or just those who don't have dry hands) to ground the antenna to the case. Blocking radio signals with body parts (such as the water in hands) is common to all mobiles, but does depend on where the actual antenna is. Mobile phone antenna tech has come on a long way since the earlier phones that sprouted 6" external antennas, but there's only so much they can do.
Re: Now if they made it two lawyers...
Why stop at two?
Why would they allow such broad patents to be established? I mean, I could understand perhaps a broad patent precursor.
It's not the US Patent Office's job to actually do any work, such as examining patents. It's their job to take a relatively small amount of money for each one, stamp it, fling it in their files and produce statistics that idiots (politicians) can read to prove (in the way that statistics can be read to get the desired result regardless of the original data) just how innovative US companies are.
It's the role of the US courts to argue the toss about what is valid or not. This is a win-win-lose situation: a win for the USPTO as their income is high and KPIs are met; a win for the lawyers as they print money arguing over paperwork; but a loss for everybody else as entire industries are stuck in an expensive quagmire of potential litigation rather than doing anything useful or productive.
Re: What about the rights of
I think you'll find that it is still illegal in your country. America says so.
Re: Oh no
It's not hard to run 3rd-party browsers. Chrome installs into user space rather than Program Files (not a good idea - it merges sodding data and apps back together - but this is the Windows way), and there are quite a few options for running apps without installing them, "Portable Apps" being one of the best known.
Yep - it does seem that way, but then to be honest nothing much "new" or "interesting" was expected with this phone despite all the hyperbole.
It would be nice if iOS got a refresh though; it's feeling very clunky and unloved, and while there have been some incremental improvements, there haven't been many of them. Keeping it similar does aid continued adoption, rather than the "argh... where did they move that to this time?" feel that can come across otherwise.
I know a few people have been putting off new phones for the iPhone 5, so it will be interesting to see how the sales (not shipments) go. It's beyond time Apple did something interesting rather than evolutionary with their phones. The displays are nice, the camera sounds good, Siri will doubtless continue to be a useless waste of space, and making nicely presented and designed phones thinner and thinner, only to enclose them in ugly-as-sin cases, just seems an ongoing exercise in futility.
Re: HTML5 development
we have to accept that it's a constantly moving target but also that this is not as bad as it sounds because degradability is built-in. The HTML 5 syntax alone is a huge leap forward.
If only this degradability theory were true. It's not. You cannot use the (useful) new tags, because if you do, the presentation (styling) will be omitted by the vast number of useless browsers out there. As a result, you have to wrap the new tags in old tags (mostly div and span) and apply the styling to those - at which point, why bother with the new tags?
The HTML 5 syntax isn't a huge leap forward - in some ways it's a headless rush backwards towards the "golden era" of mismatched tags. Take the example of browsers having to second-guess what a page's markup really means... for example, is "<p>foo<p>bar" two separate paragraphs without closures, or one paragraph embedded within another without closures? The gobsmacking idiocy of this backwards step must have the developers of the various web browsers clawing their eyes out.
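To illustrate the point, here's a minimal sketch using Python's standard-library html.parser (the class name TagLogger is mine, just for this demo). The tokenizer faithfully reports what the markup actually contains: two start tags and no end tags at all, so deciding where each paragraph ends is left entirely to the browser's tree-construction rules.

```python
from html.parser import HTMLParser

class TagLogger(HTMLParser):
    """Record tokenizer events to show what the browser is actually given."""

    def __init__(self):
        super().__init__()
        self.events = []

    def handle_starttag(self, tag, attrs):
        self.events.append(("start", tag))

    def handle_endtag(self, tag):
        self.events.append(("end", tag))

    def handle_data(self, data):
        self.events.append(("data", data))

logger = TagLogger()
logger.feed("<p>foo<p>bar")
logger.close()
print(logger.events)
# [('start', 'p'), ('data', 'foo'), ('start', 'p'), ('data', 'bar')]
# Note: no ('end', 'p') events ever appear in the token stream.
```

The markup itself never says where either paragraph closes; a browser only gets two paragraphs (rather than one nested in the other) because the tree-building rules impose that interpretation on top of the token stream.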
I'd love HTML 5 to take off, but in reality it's not much more than an exercise in frustration - and that's before you get involved in the quasi-religious ideals of some of the actual writers of the standards and their "lalalalala (hands over ears) - not listening" approach to even exceedingly well-reasoned and well-presented feedback.
There is one valid point in here though... the irrational desire of some UI monkeys to slavishly mimic real-world interface elements on a computer screen. Here I'm thinking of those idiotic on-screen re-creations of TV remote controls used to drive video applications, the almost-impossible-to-use twist-knob controls that infest far too many audio manipulation apps, and the daft LED-segment-style numeric (or worse, alphanumeric) displays.
Ah... and there's the key design element: hiding interactive elements all over the screen and letting the user randomly thumb the interface until something happens. Because it's true, most users will get frustrated and eventually find some of these "charms" (or whatever BS term they'll be next week), and maybe even remember where they are after a few uses. At least some of them will be found anyway; most likely users will find one that does something and consider it their lot.
<sarcasm>Sounds like intuitive user interface design perfection to me.</sarcasm>
IBM (Lotus Notes) lost it when IBM decided that they no longer gave a shit about anything other than large enterprise customers. From that point on it ceased to be relevant.
This leaves the more commonly used Microsoft alternatives of SharePoint and Exchange. SharePoint is indescribably awful in most ways - the best it manages is adequacy at basic functionality - and Exchange is stunningly resource-intensive and can sometimes make what ought to be basic maintenance a real PITA. I'll happily admit that the management of Exchange has improved in recent versions, despite MS's insistence on focusing on Lync and their rather obscure take on "Unified Communications".
Re: All part of Apples master plan
It is also very good business practice to diversify your suppliers:
a) A range of suppliers (should) keep prices lower or more stable and quality high
b) If a supplier goes under, you still have others
c) Supply problems at one supplier are less business critical
Basically, it's about avoiding a "single point of failure".
When I did my exams at school it was the block system... therefore I couldn't do both English and Computer Science(*) at A-Level - which rather upset my English teacher, but I wasn't too worried at the time. I couldn't do Computer Science along with Art (GCSE or A-Level) either, which was more of an upset. There are always going to be some scheduling conflicts, but sometimes the rationalisation between the blocks is gobsmacking in its weirdness.
(*) And yes, it really was Computer Science. We started off learning logic diagrams and made our way up from there. No "how to apply a shitty effect to a PowerPoint slide" classes for us - probably partly due to PowerPoint not existing at the time, but also due to having a very good teacher who did care.