45 posts • joined 6 Jul 2006
The phrase "can't give it away" comes to mind
Windows 8 on a desktop is unusable. As is 8.1
This is like Zune. Exuberant denial until termination.
At least he is consistent...
Consistently disappointing, moronic, wasteful, idiotic. Seriously, either you make a privatised 'retail' model with government-owned infrastructure - or you don't. Telstra is just a mess. An aggressive one at that... but my biggest problem with Conroy is the drug-induced fantasy land he lives in.

Fibre to the node??? What boob thought that was a good idea? The reason wireless broadband is in the decrepit state it is in around the world, and specifically in Australia, is that carriers are oversubscribing to get 'coverage' in nothing more than a statistical sense. In reality though, WiMax and 5G will bring 'close to fibre' performance without the associated costs, and meanwhile Australia will still be fuckarsing about digging up everyone's streets whilst the rest of the world laughs at them and overtakes them (once again) whilst their incompetent government bollocks them once again... Same story with the insane filter thing. Millions upon millions wasted and for nought.

The problem is not only Conroy mind you, it is that both major political parties seem to insist on putting muppets into the communications portfolio and have done for the last 20+ years. What do you expect to see other than what you have just observed!?
I need ribbons in my UI like I need a second sphincter on my ass
Seriously, Microsoft has lost its way. They are being smashed by Apple, Ubuntu and - well - pretty much everyone for better UI. Ribbon is icing on the cake. Seriously, I will just go back to using Geos on my Commodore 128. It's a nicer experience.
There's a Linux penguin, a fanboi, and a Windows user in a plane, and the pilot says, "we're carrying too much weight, one of you has got to go", and the fanboi says, "well, I can't jump because it's against the terms of the Apple Developer License Agreement... Anyway, it's that fucking Windows user that has all the bloatware - get him to jump!"
APPLE: Have a roast chicken dinner. Did I mention you can only lick it?
Talk about a tease. Yes, it's 'technically' a Commodore 64 emulator, but until it's running our favorite CREST and CAMELOT demos in full no-sideborder multiplexed sprite glory, it's just 'licking the chicken'.
M2400 may be more robust
M2400 is basically the same setup with a higher-end mobile Quadro graphics unit and a third mouse button - but it also has a mag-alloy lid so should be more robust than the Lat'. The price is not too far from the E6400 either. I got the M2400 since it apparently is the only 14" which can drive a 30" external display... mind you - still needs a docking bay to do it!
I think the quality of the story matched the quality of the lappie
Seems to be a theme when it comes to Gears.
I wrote something on this a year ago when the universe spontaneously imploded upon the release of Gears. There was so much misinterpretation of it that my eyes watered and my rectum clenched.
Lo and behold - just as my tensed muscles were starting to relax Google went and released Chrome...
It underscores some of the points I was getting at over a year ago...
.NET got a little big ... kind of like KingKong's scrotum...
I am disturbed that there is so much misinformation and confusion about the .Net framework - oh, and it's not because Microsoft made it easy, OH NO! All the same, it is a problem.
1. .Net got big because Microsoft were arrogant - no other reason. Java had separated their 'server side' from their 'client side' from day one - J2EE vs J2SE is a foundation piece of the Java strategy. Microsoft thought they would be different and fly in the face of common sense.
Not to mention that Sun was not trying to cram two thumbs up everybody's arse by making a 'framework' which does absolutely everything - even what everybody else is already doing on that framework. What do you think nHibernate is? It's very much like the Entity Framework, but *sigh* again, it does not have the little "copyright Microsoft" on it and therefore must be re-engineered by MS, made redundant and absorbed into the framework. Why? No good reason at all.
The problem is, well, eventually you end up with a server-side framework for CLIENTS... oh, so now we get the client profile from the .Net guys. Nice of you to autodetect I am not a fucking server and douche me with 150-something megs of useless binaries... morons! Why not just separate the two to begin with?
You start to see the point that the Java standards had? You can put server frameworks on top of J2SE, or you can go the heavyweight J2EE - but either way, you are doing what is actually necessary and what a normal human would do.
By doing this and then slapping Silverlight on as another 'add-on', Microsoft has actually engineered itself into a place where it is more convenient and seamless to use Java on the desktop and in the browser than it is to use .Net on its native platform. Go figure...
2. The .Net 'versioning' issue noted above is another moronic maneuver by the .Net team. 1.0, 1.1 and 2.0 were completely distinct installations which could co-exist side by side - then 3.x became layers of functionality with 'green bits' and 'red bits'. The only 'red bits' ended up being the puckered rectums of developers trying to keep up with a tidal wave of bits and pieces of new APIs on top of the 'framework' (it gets quotes by now because any reasonable person would call it a VM with about 50-odd disparate APIs sitting on top).
So short-story long - you are meant to go out and buy a new development environment every time you want to move up the .Net stack. If you don't like it, go program in Java.
To provide some context - I am a .Net EVANGELIST! Oh yeah, we love it and use it every day. For server-side applications .Net is really fantastic. Try deploying client-side applications with it (which we do too) and you are up for a whole other world of pain. Having to make sure you have a framework installed which is STILL TO THIS DAY marked as optional by the only automated distribution channel which might give you a chance of ubiquity - Windows Update - hell, it feels like Java circa 2000. Just plain old crazy.
All of this cannot help but make me feel like it's time to buy a Mac...
2133 is barely in the budget category
Not a bad thing - mind you - but the 2133 sports pretty much everything required to make it a full ultraportable notebook sans a decent CPU.
If there were a (slightly) pimped version of the 2133 with an SSD, a max 4GB RAM option and a C2D T8xxx or T9xxx it would be a killer in the ultraportable space and still be much more affordable than Sony, Lenovo or any other players - a completely disruptive strategy.
Meanwhile, this is actually the problem with this notebook - it's TOO GOOD for the market which it targets. HP realize this part of the equation and will now manufacture a 2033 (or 2132 or some such variation) out of recycled garbage and the like.
Meanwhile - HP are still sitting on a cult-maker of a machine with the 2133, akin to the TC1100 series of tablets which STILL TODAY has a cult following of loyal users (actually, HP can thank Compaq for that - but all the same...). So expect interesting things once the 2133's baby brother is released, because it will probably mean a change for the next rev of the 2133 as well.
Please pick a UI to kill... KDE or GNOME.
Once the dichotomy of UIs is dealt with in the Linux world - more importantly, the Ubuntu world - the resources dedicated to UI can focus on progress, not parallel duplication/variation of a huge slab of end-user functionality.
KDE in my mind just adds to end-user confusion and dilutes the focus of Linux as a powerful desktop alternative to the commercial players which exist today.
I think the future for Linux and broad adoption on desktops is: 1. focus on the end user - that means simplicity, fewer choices and fewer options - sure, add in an options panel with 5000 options, but only after a user clicks 'advanced settings'; 2. figure out a financial model for the enterprise to feel safe - I personally think it's micro-payments, but others would disagree. Either way, by making something free you do not always succeed in getting it adopted; that is the illusion - you need to prove that there is an inherent value and relationship between the enterprise and the 'vendor'. RHEL figured it out to a large degree - a model needs to be found on the desktop which works.
Is this ZDNet?
Protocol buffers are just silly - you have XML when you need a self-describing document with a schema (pick a style) and JSON when you don't. If you need performance, stream it - I mean, this guy (at Google) is comparing a heavyweight DOM parser with his PB implementation! If you need a smaller size down the wire, GZIP it (or RLE for speed) and then just serialize that if you're trying to do some binary transfer!
The whole point of XML was that it was text based. XML is used as a first-class citizen in several frameworks for creating data access classes, and even as an intermediary when creating data access models. JSON is a first-class citizen on the client side, so why create a new bleeding format? Answer: who cares? A file format is a file format is a file format. If you're sitting that far on the bleeding edge you're bound to get your bollocks sliced off sooner or later - so if you are doing any kind of software development which makes money, this API is probably not for you - for the next 24 months at least.
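To put the 'just compress the text' point in concrete terms, here's a rough Java sketch (the payload is invented and this is an illustration, not a benchmark):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class WireSize {
    // Compress a UTF-8 text payload with GZIP before it hits the wire.
    public static byte[] gzip(String text) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write(text.getBytes(StandardCharsets.UTF_8));
        }
        return buf.toByteArray();
    }

    // Decompress on the receiving end and get the identical text back.
    public static String gunzip(byte[] data) throws Exception {
        try (GZIPInputStream gz =
                 new GZIPInputStream(new ByteArrayInputStream(data))) {
            return new String(gz.readAllBytes(), StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws Exception {
        // A repetitive JSON document - the kind of payload PB gets benchmarked on.
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < 200; i++)
            sb.append("{\"id\":").append(i).append(",\"name\":\"widget\"},");
        sb.setLength(sb.length() - 1);
        String json = sb.append("]").toString();

        byte[] packed = gzip(json);
        System.out.println(json.length() + " chars -> " + packed.length + " bytes");
    }
}
```

Plain old java.util.zip, no new wire format, and the text stays human-readable at both ends of the pipe.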
The great leap...
Due to the scale of the Microsoft business there has clearly been some crazy outcome whereby management by committee somehow found that the MSDN camp and the Raymond Chen camp had a brain orgy, and the bastard son was 32-bit Vista.
The reality is that Vista 32 has some fundamental drawbacks and honestly, there is no reason for it to exist - in fact, its very existence is why many users claim Vista 'sucks'. The key problem is that Vista 32 still supports a 20-year-old driver model and - worse - one which has no external quality control.
The approach Microsoft took with the 64-bit platform was to make it clean and performant for the enterprise - even with XP 64-bit. The problem is that this alone would have been a compelling argument for the enterprise to move to Vista, if the whole thing were not so convoluted.
Microsoft have done their users a disservice by masking the reality of the shift to 64 bits - not just because it is 64-bit, but because of the things it represents in the way of quality control over sloppy third-party vendors.
XP should have stayed 32 bit, and Vista only 64. Make it a big deal to move to the new operating system and make damned sure that the strategy for rigid control over the quality of drivers is paramount in your marketing effort.
I just cannot fathom how Microsoft got their strategy so wrong.
I would happily pay that money...
it's good value when you look at the practical uses for it:
Not only is it useful for destroying the data of non-paying clients when all else fails, but it also serves as a handy spot to mount their testicles for extra encouragement.
...my rectum is now officially clenched...
I see several contributors are clearly past the point of clinically insane. Just know that mommy still loves you no matter what color laptop you have...
Apple is a funny beasty - hardware and OS from one vendor. Once upon a time it was the norm - now it's unique. So when and if you compare, please try and match apples with apples, oranges with oranges et al. For starters, try comparing Windows-focussed hardware manufacturers with Apple for just laptops. Then go poke around between Vista and OSX; I'm not going to even get stuck into that...
Guess what, it's true that Apple had many build quality issues with notebooks before Jobs turned the company around; wow, I spent a lot of time as an IT manager sending those things back - but Apple have largely focussed on QA like most other manufacturers over the past decade. I mean, I saw some BLOODY DREADFUL Compaq notebooks, and now HP/Compaq have much, much better QA. To be honest, it's got jack-all to do with being a Mac and a hell of a lot to do with being a hardware manufacturer.
Now that VMWare Fusion is out and guys can be using Visual Studio or other 'Windows only' things inside OSX, the operating system is actually not that important. It's not about Mac or PC; it's more about build quality, performance and security - both of hardware and software.
This is not a Mac vs Windows rant, this is a 'right tool for the job' discussion - should you run Solaris on a desktop? Ubuntu (love the new visuals!)? Vista 64? OSX? Of course there are still issues in that only Macs will run OSX - but virtual machines have levelled the playing field and users are now in a position to use the right tool for the job. Mind you, I am still trying to figure out what that job is for desktop Solaris... anyways...
I personally have liked HP tablets for a long, long time - I think the TC1100 was really inspired as far as design and function go. I am REALLY looking forward to seeing what Apple bring to the table with a subnotebook form factor and multi-touch. It sounds like a complete paradigm shift - and for many it will be a new tool to fill a void which currently exists.
Like I said: it's about the "right tool for the job".
All my love for the new year,
More followup at Smoothspan
Worthwhile reading the whole diatribe - it takes a while to get where Bob is coming from but eventually it starts to make a little sense. It doesn't really change my POV on using Google stats to make a point, but it does clarify the whole thing.
Thank you for pointing out misinformation
Thank you for your response to the SmoothSpan article. It's nice to see such a level-headed response.
The one thing which both your response and the original article failed to note is that Java is not equivalent to .NET - that is, .NET is a framework and Java is a language. A more reasonable comparison would be to compare JVMs to .NET, or Java directly with C# - which are so linguistically close that any halfwit developer can concurrently develop across the platforms.
Further to this point, both the JVM (Sun and other flavours) and the .NET framework (including Mono et al) support multiple languages - not just C# or Java. In fact, Grasshopper supports C# and VB.Net as languages which run on the JVM, whilst IronPython runs Python on the .NET framework! You can get Ruby compilers for both the JVM and the .NET framework... It's a non-issue and purely a matter of preference as to syntactic sugar and verbosity more than anything else.
Choosing a platform is more about integration strategies and third-party vendor preferences/requirements than language choice! Being informed and choosing the right tools for the job will have a better chance of bringing success every time than being a childish fanboy - that just makes for a diminished view of the world. Take a look at the development lifecycle of the jUnit and nUnit projects for example - there was a cross-pollination of ideas which came back to the jUnit framework after the nUnit framework was rewritten at version 2 to use method attributes (akin to Java annotations) instead of the older jUnit naming-convention strategy. As soon as the JVM in Java 5 supported annotations, this concept moved back into the jUnit version 4 codebase.
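The naming-convention-to-annotations shift is easy to sketch. Here's a self-contained Java toy (the @Test annotation below is a local stand-in for the real jUnit one, and the suite classes are invented) showing both discovery styles side by side:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

public class DiscoveryDemo {
    // Local stand-in for jUnit 4's @Test / nUnit 2's [Test] attribute.
    @Retention(RetentionPolicy.RUNTIME)
    public @interface Test {}

    // jUnit 3.x style suite: tests identified by the "test" name prefix.
    public static class OldStyleSuite {
        public void testAddition() {}
        public void helperMethod() {}
    }

    // jUnit 4 / nUnit 2 style suite: tests identified by annotation.
    public static class NewStyleSuite {
        @Test public void additionWorks() {}
        public void helperMethod() {}
    }

    // Discovery by naming convention.
    public static List<String> discoverByName(Class<?> c) {
        List<String> found = new ArrayList<>();
        for (Method m : c.getDeclaredMethods())
            if (m.getName().startsWith("test")) found.add(m.getName());
        return found;
    }

    // Discovery by annotation - names are free to describe behaviour.
    public static List<String> discoverByAnnotation(Class<?> c) {
        List<String> found = new ArrayList<>();
        for (Method m : c.getDeclaredMethods())
            if (m.isAnnotationPresent(Test.class)) found.add(m.getName());
        return found;
    }

    public static void main(String[] args) {
        System.out.println(discoverByName(OldStyleSuite.class));
        System.out.println(discoverByAnnotation(NewStyleSuite.class));
    }
}
```

Same reflection machinery either way - the annotation version just frees the method name to say what the test means, which is exactly the idea that cross-pollinated between the two frameworks.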
The sweeping statement about "loss of community" was just as absurd: the .Net framework is much less mature than Java in terms of age, yet its penetration in sheer developer uptake has been staggering if you compare it to the historic uptake of Java at a similar age. He also failed to give ANY stat to back up that statement about community.
Further to this, you're right - his stats gathering overall is lazy - and it's just a single dimension of stats, so they can be manipulated whichever way you want... maybe Google is being pounded by Java developers because of any number of other factors - complexity of deployment maybe? I know how much time we have spent configuring a good-sized J2EE application to run correctly. Certainly not as trivial as deployment of a .Net app garden - yet, of course, it's a different tool for a different job.
Finally, the stats about starting up companies on .Net are just plain old nuts - how many VCs did he interview to come up with that?
Sounds like uninformed opinion to me...
FOR SALE: One set of testicles. No longer required, Good condition. Make an offer...
Obviously you missed this ad from the product development guys at Microsoft. It didn't raise more than a couple of bucks, as the shrivelled raisins had not been of much use prior to the Vista launch - but Mickey-soft got a square kegging when they chose to make an operating system with more permutations than Paris Hilton's bedfellows. 32-bit Home Premium to 64-bit Ultimate and everything in between (over 46 versions if you include upgrades, OEMs and 'Media Player legally compliant' versions).
All said and done, they should probably have left XP for the 32s and just focussed on Vista for the 64s, drawing a firm line in the sand with hardware and software - driving better code and a desire for new-generation hardware. Of course, that would require a fresh new set of pants-potatoes which they clearly were unable to muster within the product development group - or in marketing, for that matter. Imagine how much progress would have been made if Mickey had only concentrated on x64 for Vista and just done a couple more SPs for XP. Clearly a lot less egg on faces all 'round, I would expect.
Foxit rocks the spot...
I have to also say that a while ago I too got sick of the rectum-clenching startup times and the quagmire of plugins which Reader 8 brought to the table. I mean - really, the latest Adobe Reader has extensions to display 3D models! Can anyone say DXF?! Please pick what the hell you want to be doing and stick with it.
Foxit has been standard fare for all our users for a while and I can happily say that I have never made a move with less regret. It's lean, fast and does what's advertised.
Bunch of twats...
The Australian government has shown its head is firmly planted up its proverbial rectum once again. There has been so much talk of centralised filtering whilst Optus is already providing end-user filtering software free with their broadband packages.
How long will it take for these prats to get that ISP-level filtering is idiotic because it's decentralised censorship? There is no way it can be implemented, enforced or managed - either by the ISP or the government. Complete rubbish. A waste of our money to even investigate, and it clearly demonstrated the stupidity of this government AND the opposition, who were both party to this proposal. They should all change their name to Richard Cranium.
Can't wait to see this bloody virus in the form of government censorship software which gets released. It will have a security breach within a month and expose every Australian family to something far worse...
Oh, did I mention this thing called OpenDNS? Costs $0 and solves about 90% of the problem with porn - of course, you could always spend $4,500,000 for me to tell you that, like the Australian government did in this "investigation" (read: JUNKET).
Second Life will dwarf the web in ten years
Philip neglected to mention a couple of prerequisites for his statement, firstly that his estimate of ten years had a precision of +/- 10,000 years and secondly it was based on the premise that humans would migrate to using masturbation as a form of currency within this timeframe.
RE: Since I'm the only one
"I'm waiting for Activesync support so I can be connected with my work but I really don't miss emails informing me how dumb some of my co-workers are all day long. I suspect the next drop of OS X will correct some of these issues."
I'm afraid the next drop of OSX will not yet be capable of terminating your fellow employees.
Apparently there is an alternate handset which does just the trick...
Apple have successfully created "Buzz-Inversion"
It looks as if Apple has actually managed to do something nobody else has ever done. No, I'm not talking about the hardware guys - I'm talking about the marketing department.
Apple have found the limit of buzz marketing and discovered the phenomenon of "Buzz Inversion": where the buzz around an item is so great that it eventually serves to demerit the very thing the buzz is about.
The ultimate effect is that the buzz is inverted, and those sucked into buying the device with the hope of expectations fulfilled are chastised as halfwits crippled by their own vanity.
Apple have made a device whose buzz is so dense with hyperbole, and so filled with expectations fuelled by media saturation, that it is fundamentally beyond the actual device's capacity to deliver the experience - and so they have created a social stigma.
On the topic of Buzz Inversion
If I needed any more weight to my argument, read this juicy headline from Sydney Morning Herald:
It is just rectum-clenchingly obnoxious to own an Apple phone.
Sorry, cannot bring myself to use its trademarked name anymore either.
Some worthwhile Google reading regarding GDocs
After one month with Google Apps
Pay attention to the bit about Docs. I think making a bold statement about the GoogleOS is a little bit premature, but when you consider what they are doing with Gears, there is certainly a theme emerging...
Once again I am ashamed to be human.
Register schizophrenia issues...
Ok, I have to admit I have high hopes for this publication (El Reg), but when it comes to reviews I am left feeling that the editor of Reg Hardware suffers from a serious case of schizophrenia.
How is it that in the same publication there can be ten-page, in-depth reviews of graphics cards that delve to the level of comparing shader-operation performance - so involved that the reader comes away feeling they have an intimate knowledge of the product and its comparative performance - AND THEN THERE ARE ARTICLES LIKE THIS???
This is not a review. This is not an article. This is not even an opinion column. It is a montage of opinion columns pretending to be a review, and that is deceitful.
There is no structure to the 'review' criteria, no reason/motivation for the ratings, nothing of substance. What happened? Did DELL, ACER, ALIENWARE, MESH and APPLE chip in together to get an 'advertainment' spot on El Reg? That's certainly what it looks like. I will tell you what this ISN'T: journalism.
Ed. Be ashamed, be very ashamed.
My crack pipe overfloweth...
Oh dear, a serious case of my crack pipe overfloweth. Let's just think for a moment about this business model:
Intelligent Weapons cannot really patent the patch solution, as that would mean it gets its arse sued into oblivion by MS for reverse engineering its products and breaking licensing terms. They can really only patent the flaw.
Patenting a flaw is not going to work, since the patent office will not allow patents on illegal activities and by law they are unenforceable. Even if IW managed to get some patents through, and even if MS doesn't get them on the reverse engineering thing, MS just needs to prove that IW is attempting to patent flaws which are the mechanism for illegal activity, and IW will vanish into the vapour from whence it came.
Sounds like a good business, I'm in...
I thought I was a nerd...
...until now. You guys rock.
I Wonder Why My Battery is Flat...
...oh, that's right - this poxy touch screen on the front wasted all the juice before I had a chance to use the thing. Much like the phone...
The long road...
Mr Jewel is not wrong on a lot of points here, but there is a big picture that seems to be getting missed in all of this.
Microsoft are a law unto themselves. They operate in a way which thinks on a somewhat different plane of existence. A slow, cumbersome and compatibility-laden plane of existence. For the armies of developers and smart thinkers they have, they move very, very, very slowly. But with reason.
You don't get to the top because your technology is the best. An easy choice resonates with consumers - and I promise you most consumers are not ASM geeks.
What Apple did in three years, Microsoft will do in ten. Firstly, as some keen eyes have pointed out - they do not control the hardware, and this issue of compatibility is a sticky one. If you ever want to be reminded of the insanity that is Windows compatibility, have a read of Raymond Chen's blog: http://blogs.msdn.com/oldnewthing/ That should sort you out quick smart...
Now, the reason I say that MS is on a different plane of existence is that they believe compatibility is more important than nearly anything else, and this drives their approach to everything. That is why Vista is just lipstick on the pig from many perspectives - except one.
The one reason why Vista really is different is that .Net is a first-class citizen in many respects, especially when it comes to WPF. Why is this important? Well, let's think about the best way to maintain compatibility with hundreds of thousands of applications, with quintillions of lines of code invested in them, whilst trying to move to a new operating system.
Answer: Change the platform first and then move the operating system. How is this possible? The compilers are now targeting the .Net framework as a first class citizen. Managed C++ gets the same treatment as C# and VB.Net.
This means that in a few years (a blink of the Microsoft eye), all code will be targeting the framework and not the underlying operating system's native APIs. Once everything is 'managed', the legacy APIs can be dropped for better-designed systems much faster, without having to worry about compatibility with crazy people patching the kernel and using undocumented hooks - we end up with something more powerful and secure than Apple's Objective-C due to its managed nature and - most importantly - much more MAINTAINABLE.
Jack Tramiel would turn in his grave...
A Commodore machine that doesn't cost $595!
Well, I think that given today's market, this is certainly a machine worthy of the brand and its hordes of fans. I love the aesthetic of the case and the idea of skinning them. I think the guys have gone for the right niche to at least make a go of it - better than Gateway, ESCOM or Tulip!
The only things missing are decent displays, keyboards and mice which continue the theme and branding style.
Well done lads! Give AlienWare a run for their money...
Old News Wrong News
I have to admit, it took me a little while to completely get the (non-)problem myself. But this is certainly old news. Steve Gibson from GRC.com commented on this functionality literally months ago.
I had noted the impact on applications which were not installers but looked like installers (due to their filenames) in my blog - which is a rather lame situation by any measure...
I too thought that the opposite may happen, regarding elevation of privilege through filename heuristics, but alas that is not the problem. It is simply a switch for the UAC when a manifest is absent.
...the real problem is that after the installer for my non-Vista aware application runs (thanks to the UAC heuristics on the setup filename) - UAC may not ask about privilege elevation if my application doesn't have a manifest and doesn't have 'setup' in the filename.
Therefore, whilst the bloody installer may well work thanks to the little filename hack, the application itself may silently fail, or fail to trap errors which previously could not occur.
This opens up a whole dimension of potential issues for which there is not really a workaround - potential denial of service security flaws and the like in applications which were previously stable.
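For reference, the manifest entry at the heart of all this is tiny. A minimal embedded application manifest along these lines (a standard Vista-era trustInfo fragment; the exact level value - asInvoker, highestAvailable or requireAdministrator - is up to the application) tells UAC explicitly what to do, so the filename heuristic never enters the picture:

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <!-- asInvoker = run with the caller's token, no elevation prompt -->
        <requestedExecutionLevel level="asInvoker" uiAccess="false" />
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>
```

Embed that as a resource (or ship it as AppName.exe.manifest alongside the binary) and the application is Vista-aware as far as UAC is concerned.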
Obviously not a connoisseur of the Vindaloo
The next day's output is certainly not a "three squares" job. Let me tell you...
THIS IS RECTUM CLENCHING
Ok, I know there are like a zillion comments on this article, but I also felt compelled to donate my point of view. Actually, my rectum clenched so hard it was in a spasm which would not release until I started typing a reply, simply because this article annoyed me so much. Actually, it wasn't even the article - it was the comments too. BUT NOT FOR THE REASONS YOU THINK.
Whilst the fanboy twitter for Apple and Linux and other miscellaneous vendors - and of course Dell - made me wince, it all danced around the two real problems which exist in the hardware industry right now:
1. Components are made like complete shite and have mortality rates which would make any other consumer sector expect to be sued into oblivion, but the economic pressure and economies of scale which exist in this industry have forced it to evolve into this crazy situation.
2. The service model for outsourced hardware support is so infinitely screwed up - with huge competition and such vast demand on the (low-cost) service - that complete imbeciles are in the field blaming users for problems caused by the crap quality of components (and quality-bereft drivers) which the actual builders of the systems have bugger-all control over.
Dell simply exist in the midst of the problem and, I am sure, perpetuate it by selecting low-cost components to keep the cost of manufacture down. Which is exactly what most other manufacturers do. Apple has found a sort of middle ground by massively limiting options and extending the retooling lifecycle to reduce cost, rather than using completely crud components - but just because the exterior is swank doesn't mean they aren't using suppliers who have production problems of their own and quality gaps like hell, just like everyone else - Intel included.
The bottom line is this: with the competitive price pressure in the PC industry as it is, if you buy anything more than a bloody desk calculator, expect it to screw up or not work properly. And when you ask someone to fix it (since you paid for your extended warranty), expect to be asked to diagnose the problem yourself - and it will be your fault anyway.
Sounds like those 10,000 people should initiate a class action against the lawyer that represented to them they have a case.
Even the path to hell is paved with good intentions...
I understand the motivation: try to simplify what is a very, very complex knowledge domain for the broader developer community. And no - this is not SOA by another name; this is trying to combine the concepts of more advanced frameworks like Spring for Java, with extensive fundamental and enterprise design patterns (a la GoF), with the .Net framework foundations within the Visual Studio IDE.
The net result they are trying to achieve is to centralise the design process to be more guided and visual, based on best practices of design patterns and enterprise patterns; in addition, instilling development methodologies such as Scrum (or the Microsoft flavour thereof).
Whilst I do not believe they are wrong or misguided in trying to add the tier above 'code' in the software engineering domain to the IDE (it is an 'integrated' environment after all) the problem which undoubtedly ensues is that the tools will mandate HOW code must be written to operate within the templates or wizard driven components of the system. The need for intelligence of the architect is somewhat stripped away in the pursuit of improved efficiency. This can easily lead to poorly designed systems which still attempt use best practises.
Combining this approach with the new Visual Studio tools for the Workflow Foundation means that the line between architect and business analyst starts to blur and in effect the objective would be to get them all using the same tool set (VSTS) to have a central point for analysis and architecture to converge and distil down to code level tasks for the rest of the dev team.
What am I doing here?
The poor little UX1 sits dazed and confused in neverland without a real use. The latest Microsoft spin on the UMPC is just all wrong - it's sheer novelty. Who in their right mind is going to use a slide-out keyboard or an on-screen keyboard in that crazy fashion?
Now don't get me wrong, this little UX1 baby is some funky engineering. It's truly amazing, unfortunately just misdirected from a usability standpoint - which started with Microsoft dictating the crazy-ass form factor and then Sony engineers doing what they could with the 'two thumb' requirement.
The reality of 'two mode' or 'three mode' computers, when you really examine users in the field, is more like 'finger-actions', 'scribble/notebook' and 'compute'. Where the iPhone wows people is very much with this focus on addressing REAL users with pragmatic usability solutions for one of those modes - not just a novelty. I understand the UMPC is not an iPhone, and that the operating system would need to keep up with the form factor (which the UMPC extensions for Vista DO NOT), but you start to see how two thumbs is just not going to cut it.
The present UMPC incarnation is just not addressing the use of the form factor, and the issues the reviewer has had with the UX1 are testament to this. It really seems as if the design of the whole platform just doesn't know where it's going yet, and until the UMPC gets some clear direction in delineating the 'three-mode' computer experience - so that it is compelling enough to fill a DESPERATE NICHE which is craving simplicity, mobility and usability - it will be destined to sit somewhere in the middle of a product range, dazed and confused.
Just Give Me A Reason
I would like to offer a reverse perspective. I am a .Net developer, I have many apps that I need to develop and maintain all using the .net framework.
Now, if I want to use a beautiful machine with a decent operating system - namely a Mac - and still develop the .Net applications I am already maintaining, then I have a chance of doing at least some of my development work natively on a Mac using Eclipse, instead of being stuck in Boot Camp or Parallels all the time. So Mono and GTK for Cocoa are all good things... Please just give me a reason to switch to a Mac!
Also, you missed the point that a LOT of .net development is in the space of server side / web applications. Currently there are two approaches to making a .net app portable to an XServe or Unix server - either use Mono or Grasshopper.
Mono for web apps is a real and viable alternative to being stuck on Wintel servers using IIS. In fact that was the focus of compatibility efforts in the early days of Mono if I am not mistaken.
Grasshopper from Mainsoft is the alternative to Mono for the server side; it is a completely different approach and essentially turns .net apps into J2EE apps through the compilation process. This technology is aimed SQUARELY at server side development.
Either approach opens the path for hardware and OS platform independence whilst still using a great language and development platform.
Conclusion: Especially if you have an investment in skills or IP this is a path which can easily and effectively help free you from being chained to a specific hardware or OS vendor.
Now, there are four fundamental laws of computing for end users, which we are all driven by because they reflect the fundamentals of how we are driven as humans. I bet you're wondering what the hell they could be -
1. I paid money for my computer. I own it AND I want it to work.
2. I paid money for my Software. I own it AND I want it to work.
3. I don't like change, and when I do need to change I like it visual and simple.
4. I don't like choice. Tell me what I need to do what I need to do.
Now, for all the people reading this who are surely saying "what bollocks" - sit back and think about the mean of user sentiment. Think about it across your organization - not the tech savvy or the trend setters, but the average punters, who are the ones spending most of the money around the globe.
The fundamental premise of this article is that rule number 2 is being violated in very subtle ways. BUT since it has been that way for as long as most people can remember, it is maintained in a state of stasis caused by rule 3.
Vista violates many of these rules in new ways, and it will be up to the final acceptance of the implementation to determine whether it really is the longest suicide note in history. With respect to the DRM issues which have been raised, I fundamentally lose control over facets of my hardware, and as a simple end user it stops working as advertised depending on the kind of content I put in. BUT rules 2 and 3 are somewhat intact - so again, it makes it past the post as far as the general public goes.
My sense of discontent has been growing - as it has for many others - with what Vista represents in terms of rule 1 violations. But let's look at what MS is doing right - even if it is a user illusion - to see why they dominate the market:
* Whilst they have always added new features between version steps, they have always kept rule 1 sacred. They make stupidly obese bloatware, but you can generally turn it off to some extent.
* Rule 2 is GOLDEN for Microsoft. They even make kernel patches in new versions of the OS to maintain functionality in legacy applications which are popular at the time - have a read of http://blogs.msdn.com/oldnewthing/ for some context; Raymond Chen sheds light on the hows and whys. VERY interesting reading.
* Rule 3: well, each version step has added different lipstick to the pig. When things finally went beyond the visual trappings to a general consensus that what's under the hood makes a difference (see the state of the world before XP Service Pack 2), MS responded by paying attention to security. Now Vista is trying to balance both: the established yet enhanced user experience feel of Aero, alongside the full security model changes.
* Rule 4: well, I don't need to say much on this front - but if you ever wondered why MS seemingly does everything, it's not because it's good at it - it's because they realize it will sell, thanks to the power of the brand and the pull of rule 4 in the minds of many average users.
If you look at the four rules, you can also see why different vendors are successful in some areas and not in others. Look at Linux - RedHat has always had trouble getting traction because rule 2 is void - so there is no perceived value - and rule 4 is massively violated. Then look at the traction of Ubuntu thanks to the simplicity it introduced to the idea of a distro.
OSX had a great transition from OS9 because rule 2 was not violated. OSX PPC to OSX Intel was the same deal. But then OSX also violated rules 2 AND 3 by bringing out too many paid-upgrade versions of the OS, which caused many new apps to fail compatibility on older versions - but it was all OSX.
You start to see the pattern?
RE: So does this mean...
>>Our astronauts will soon be able to get good dim sum in space?
...only if you like 'em in very small fragments with an after-taste of H-6.
-Joe / pixolut.com
DUDE! Its just a UI framework...
Okaaay, step away from this guy very slowly. No sudden moves. Just maintain eye contact and go nice and easy.
Have you ever heard of MVC? Model View Controller? It's a fundamental design pattern which most developers strive for when designing interactive systems - it keeps the design clean because it separates the data model from the user interface and the business logic.
I say STRIVE because business logic is a hazy line to delineate when you design a user interface. If the interface is truly interactive, then any app that does more than hello world is probably using aspects of business logic tied into the user interface when responding to user events. How well you separate these aspects of model, view and controller determines how cleanly engineered the design is, and how reusable and maintainable the code will be.
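As a minimal sketch of that separation (all class names here are hypothetical, and Java is purely for illustration): the Model holds the data, the View only renders, and the Controller carries the business logic between them.

```java
import java.util.ArrayList;
import java.util.List;

// Model: owns the data, knows nothing about rendering or events.
class Model {
    private final List<String> items = new ArrayList<>();
    void add(String item) { items.add(item); }
    List<String> items() { return List.copyOf(items); }
}

// View: turns model data into a presentation, holds no business logic.
class View {
    String render(List<String> items) {
        return "Items: " + String.join(", ", items);
    }
}

// Controller: responds to user events; the business rule
// ("ignore blank input") lives here, not in the View.
class Controller {
    private final Model model;
    private final View view;
    Controller(Model model, View view) { this.model = model; this.view = view; }
    String onUserAdds(String item) {
        if (!item.isBlank()) model.add(item);
        return view.render(model.items());
    }
}

public class MvcDemo {
    public static void main(String[] args) {
        Controller c = new Controller(new Model(), new View());
        System.out.println(c.onUserAdds("alpha"));
        System.out.println(c.onUserAdds("beta"));
    }
}
```

The point of the separation: you could swap the `View` for an HTML renderer (or an AJAX response) without touching `Model` or `Controller`.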
OK; so the first problem I have: this dude is clearly not getting the point of the *evolution* of the web application being coined Web2. To my chagrin it has been blessed with a version step, but in reality AJAX is just lipstick on the pig of the Document Object Model. AJAX is simply the enabler - it could just as easily be Avalon! The point is that the web is moving from a static, stateless form to an event-driven UI model (towards a true MVC!). But dude, Web2 != AJAX.
This Web2 hype is about the social IMPACT an event-driven UI can have when dispersed amongst zillions of users who can share stuff with each other, and what that can do. That's pretty cool when you think about it.
This social 'revolution' doesn't imply that Web2 is a solution for distributed systems - it's not. Go and play with those SOA crackheads who are busy wrapping APIs around their COBOL server, which was the mutt's nuts circa 1968, and who are still trying to justify what they paid for the damned thing. But I digress...
O'really is not an innovator, he's a phrase coiner - but he always has been.
A lot of investors and fund managers are plain old stupid; they don't get it but need somewhere to stash money - and that's why recessions happen... It's not limited to the internet - go have a look at biotech, pharma, jeez, anything that requires half a brain to understand the true concept behind the investment.
Now quit your whining and get back to writing that Atlas-enabled, Google Earth-Mashuped, MySpace page.
BOD: Batch On Demand
This whole event-driven thing is pretty much on the mark, methinks. But the critical issue with real time is that it often imposes huge overhead on the system when it's not entirely necessary. It's like the reverse of true batch - that is, it can actually reduce the overall efficiency of the system, slowing down legitimate users (like customers) doing legitimate things (like buying stuff!) while the system fritters away cycles generating a real-time report.
One of the great things about batch on demand for BI on massive datasets is that it can give you snapshots at regular intervals (ie: daily) of very detailed and complex information. Then, if there is an emergency or some other intense need for quasi real-time data, reports can be invoked on demand. Whilst the data is not truly real time, this does give the business a level of flexibility whilst attempting to provide the best allocation of processing and infrastructure resources.
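That snapshot-plus-on-demand idea can be sketched in a few lines (hypothetical names, Java purely for illustration): serve the cached snapshot by default, and rerun the expensive batch only when the snapshot is stale or a refresh is explicitly forced.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.function.Supplier;

// Sketch of a "batch on demand" report cache.
// The Supplier represents the expensive batch job.
class ReportCache<T> {
    private final Supplier<T> expensiveReport; // the batch run
    private final Duration maxAge;             // e.g. daily snapshots
    private T snapshot;
    private Instant taken = Instant.MIN;       // "never run yet"

    ReportCache(Supplier<T> expensiveReport, Duration maxAge) {
        this.expensiveReport = expensiveReport;
        this.maxAge = maxAge;
    }

    // Normal callers pass false and get the snapshot for free;
    // an emergency caller passes true and pays for a fresh batch run.
    synchronized T get(boolean forceRefresh) {
        boolean stale = Instant.now().isAfter(taken.plus(maxAge));
        if (forceRefresh || snapshot == null || stale) {
            snapshot = expensiveReport.get();
            taken = Instant.now();
        }
        return snapshot;
    }
}
```

Usage would be something like `cache.get(false)` from the daily dashboard and `cache.get(true)` from the "I need it NOW" button - the resource trade-off stays out of the report logic itself.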
The real issue is how you architect either your real-time or batch process. Generally, complex reports across massive datasets take either 1. arseloads of time to run or 2. arseloads of processing power to happen quickly. I have seen lots of strategies like partial caching (blended data warehousing mixed with real-time data sources) and loads of other approaches - but what's the easiest way to manage this resource consumption issue whilst maintaining simple-to-understand business logic in your reports? (Remember, reports are generally painful-to-read code at the best of times.)
At the end of the day, the hardware manufacturers have actually created something of a renaissance in batch processing through their focus on multiple cores. Why? What's the easiest way to think about threading? Um, batch!
Whilst there has always been the option to use multithreading in enterprise applications, both languages and hardware have now made multiple threads very attractive and easy to use. The beauty and power of multithreading with batch is that it separates the processing strategy from the code.
We can set a thread's priority high enough to redline a processor - or lower it so much that it takes the slops when the rest of the system is busy. Whatever we choose, the strategy gives two clear benefits: 1. it separates efficiency issues from code; 2. it allows real-time interaction with the processing pipeline through a management interface (ie: JMX or MOM) to adjust how much processing power the batch process is taking.
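A rough sketch of that strategy (names are illustrative only): the batch job runs on its own thread, and because `Thread.setPriority` can be called while the thread is running, a JMX-style management hook could dial the priority up or down without touching the batch code itself.

```java
// Sketch: a batch worker on its own thread. Priority is only a hint
// to the scheduler (MIN_PRIORITY..MAX_PRIORITY) and can be changed
// at runtime, e.g. from a management interface.
public class BatchWorker {
    public static Thread startBatch(Runnable job, int priority) {
        Thread t = new Thread(job, "batch-worker");
        t.setPriority(priority); // low priority -> takes the slops
        t.setDaemon(true);       // don't hold up JVM shutdown
        t.start();
        return t;
    }

    public static void main(String[] args) throws InterruptedException {
        Thread t = startBatch(() -> {
            long sum = 0;
            for (long i = 0; i < 1_000_000L; i++) sum += i;
            System.out.println("batch done: " + sum);
        }, Thread.MIN_PRIORITY);
        // The management hook could raise the priority mid-run:
        t.setPriority(Thread.NORM_PRIORITY);
        t.join();
    }
}
```

Note the efficiency knob (the priority) lives entirely outside the batch job's logic - which is benefit 1 above in miniature.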
Even though I have concentrated on batch for reporting, there can be similar benefits for other data interchange requirements. Of course there are also many more complexities too - this is where a real time SOA shines!
SOA = Lipstick on Pig
Some of the big motivating factors for moving away from the old lamps come from an inherent desire of developers to get closer to a new technology's capabilities throughout a system's aspects (yes, I am referring to the interest in AOP, even if I do not wish to actually use it) - and that's a big deal. Adding layers of indirection and wrappers around legacy technology to make it work with the rest of the business is the issue at hand, and the harder it is to do this, the bigger the financial pressure to move away from what would otherwise be a perfectly acceptable tool for the job.
Of course this was the promise of SOA, Service Oriented Architecture: wrap legacy systems and allow fluid interop with new business platforms and user interface technologies. Of course, we are just shuffling complexity from one place to another - instead of managing a core technology migration, you manage a zillion interfaces and service clients, all of which do different things. This is not to say the approach has not been successful in many cases, but it certainly is not the painless nirvana marketers would have us believe.
The really big deal about SOA and now virtualization is that it is thinking about the business not just the technology. There are some vast investments in technology and platforms and wrapping them up through services, virtualization or both is a good way of managing the shifting targets of performance, scalability and COST whilst allowing newer platforms to be a little more agile in their approach (Service orientation can be a good thing).
So what does this mean for the Old Lamps? Well, for sure it means there is some longevity in the original technology, and the irony is that there must be skilled people out there to maintain it. What do they do when they stop maintaining those old systems running on a virtualized VAX-11?
Write grid batch processors I'm sure...