For those who have built their careers coding in Java, a recent Forrester report should give them hope that Java will continue to put food on the table for a long time. Sort of. While Forrester analysts Jeffrey Hammond and John Rymer insist Java has "a long future at the core of enterprise computing architectures," they're less …
Is it just me, or is portalitis back? From 0% for 10 years since the days of the dot bomb to 10% in one year. It will be interesting to see the same stats for 2011.
how the mighty are fallen
COBOL *is* mighty!
Like the Force or Dark Matter, it pervades the Universe and controls your bank account.
COBOL is still being used
My mom is a retired COBOL programmer and she came out of retirement to make a BIG chunk of change doing Y2K remediation on COBOL applications; COBOL is still deeply embedded in mainframe financial computing.
That was 11 years ago; COBOL was still in use 11 years ago.
(Okay, there was a Little Y2K of you-morons-did-it-AGAIN when a lot of devices confused 2010 and 2016.)
Now, what does "mint" in the article mean? An arcane bit of jargon? My guess is the word "limit"; do I win anything?
Still there, yes...
I can attest that even in the recent past (2008), COBOL was pretty much alive in the bank's mainframe. In fact, the CICS stuff is still being maintained and getting added functionality.
"But this is only likely to move more mobile development off Java to unencumbered platforms"
and less power-hungry ones too, with any luck
Java already is the new Cobol
Remember when Java was first launched? It was a cool OO language with a nice set of core libraries and GUI controls, and a VM that allowed it to run on any platform. It was perfect for 'applets', which is what it was designed for.
Instead we now use it for server-side web applications where we don't need a VM and portability - it would actually be better to compile to native code and get better performance. We don't use any of the GUI controls and, best of all, the imperative, shared-state OO design is exactly the wrong model for highly concurrent web apps. This would be tolerable if Java were a productive development language, but the lack of any meta-programming facilities and higher-order constructs makes it probably the least productive language in use today, and it's barely moved on in 15 years. I would argue that Cobol is actually probably more productive for enterprise apps.
Well acquired, Oracle. Grabbed yourself a bargain there!
Soo.... you start your post with 'Remember when Java was first launched...' and don't appear to have looked closely at the technology since then.
The reason it's being used in everything from high volume, highly transactional environments to mobile handsets is that both the language and virtual machine have proven to be remarkably flexible. Java has evolved, and in the last few years has kept up with many of the upstarts whilst building on a very robust foundation.
Java has flaws but it's not as bad as you say at all
When I fire up Eclipse, NetBeans or SQL Developer I'm using Java. When I use my phone I'm using apps written with Java (even if the VM is Dalvik). When I open up Gmail I'm using an app that was written with Java.
On the server side, well... it's pretty much everywhere because it's robust, platform neutral and comes in a variety of implementations for virtually every OS.
So yes it's very flexible language and it is suitable for client side apps too. At the end of the day Java the language is an expressive OO language with a useful set of core libraries.
I'm not sure what you mean by "meta programming facilities or high order constructs". You can annotate classes, you can define patterns and behaviours in XML or via autowiring, the language offers introspection and dependency injection, you have libraries that prefer convention over configuration, and you have dynamic scripting libraries that work with the JVM and Java classes. Lots of choices there.
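For anyone wondering what the annotations-plus-introspection combination looks like in practice, here's a minimal sketch in plain Java. The @Audited annotation is hypothetical, made up purely for the example; the reflection calls themselves are standard java.lang.reflect.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

// Hypothetical marker annotation, invented for this example.
@Retention(RetentionPolicy.RUNTIME)
@interface Audited {}

public class ReflectDemo {
    @Audited
    public void transfer() {}

    public void helper() {}

    public static void main(String[] args) {
        // Introspect at runtime: find the methods carrying the annotation.
        for (Method m : ReflectDemo.class.getDeclaredMethods()) {
            if (m.isAnnotationPresent(Audited.class)) {
                System.out.println(m.getName() + " is audited");
            }
        }
    }
}
```

This is the mechanism frameworks like Spring build on: scan classes for annotations, then wire behaviour around the annotated members.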
I do think Java the language is getting pretty mouldy, the glacial pace of development is frustrating, and Oracle's control freakery is provocative and self-destructive. If someone branched Java to produce a Java++ (i.e. virtually 100% backwards compatible, with extensions) I think it would be welcomed and supported. People are fed up and angry with Oracle. I think a branch would be the shot in the arm the platform needs.
"Soo.... you start your post with 'Remember when Java was first launched...' and don't appear to have looked closely at the technology since then."
Maybe he doesn't want the nightmares to come back.
"The reason it's being used in everything from high volume, highly transactional environments to mobile handsets is that both the language and virtual machine have proven to be remarkably flexible. Java has evolved, and in the last few years has kept up with many of the upstarts whilst building on a very robust foundation."
Java was originally developed for embedded systems, albeit with a fairly archaic take on what a virtual machine instruction set should look like, was re-targeted to the applet scene and GUI development, and managed to meet a particular need at the time, even though existing technologies (such as Tcl/Tk) typically delivered a better user experience with less effort for a lot of applications.
And sure, as Java's reputation in the GUI arena faded in the face of better cross-platform toolkits, it became a piece of server-side technology that eventually delivered reasonable performance and portability, delivering stuff like built-in automatic garbage collection to an audience that might have stuck with C++ for everything otherwise.
But none of that means that Java is the only peg that fits into the hole. Had Bell Labs made their technologies (Plan 9, Inferno) free and open from the start (rather than coming back in new forms like Google Go), or had Ericsson pushed Erlang harder into the mainstream, people might have gone with those. And the dynamically-typed languages might get a lot of criticism from the "pedal to the metal" crowd, but they do the job very well, too. Indeed, Java's eventual performance boost came from prior work on the Self language - another technology that could have taken Java's place had Sun chosen to back that horse.
HyperRadioProActive IT is not rocket science, it is much more complicated than that ....
..... but it is quite easy if you keep it simple for the stupid, and have exercise of base control for root privileges to I/O Supply.
"But over time I expect Oracle's failure to engage client-side enterprise developers to have a significant, deleterious effect on the company's ability to make money. Developers are the new kingmakers, and particularly mobile developers.
Oracle won't be a comfortable position if it owns the data center but cedes client-side application engineering to someone else. Microsoft showed long ago that owning the end-user drives a lot of infrastructure decisions."
Quite so, Matt. Having a store of information [owning the data center] and not having the developers/metadatabase analysts in this case, to turn toxic old phorm information into dynamic new build intelligence, is a carbon copy in a parallel plane [and/or if you are heavily into Virtual Reality Fields and the ARGenre], of the present QE fiascos, with the trillions of fast instant cash being pumped into the money system, which is the owning-the-data-center element, doing absolutely nothing to create a bulwark against catastrophic collapse of the system because there is nobody to use the flash cash/toxic old phorm information to build a SMARTer Intelligence System for SMARTer IntelAIgent Systems.*
In the absence of Prime Radical and Fundamentally Extereme Progress, is Rapid Decay towards Impotent Collapse, Guaranteed at an Exponential Rate of Decline with Domino Effect Active Defaults heading the .... well, it is nothing less than a postmodern extraordinary rendering of the Charge of the Light Brigade into the valley of Death, is it not.
* Easily solved though with the engagement of client-side enterprise developers into making kings and obscene amounts of money for spending to generate economies and industry, ......and "This is a global Applications Program and is designed specifically to regenerate Economies and Societies ...The dDutch Initiative" is an early seed that flowers power.
As a contractor who has made a career of developing Java - from Swing tools to online banking, via bio-tech research and embedded apps - Oracle's behaviour is making for fascinating viewing.
If I were a shareholder in Oracle, I'd be most concerned that rather than buying an asset and leveraging it as far as possible, the company appears to be entrenching itself and alienating the wider market. As the author says, it's probably a position they can hold for some time with little ill effect, but companies don't grow by defending a corner.
If I were an aggressive company with some involvement in this space, I'd be looking at the tools and communities that have kept Java relevant and working hard to jump start the next generation of projects that would engage developers and deliver results. The melting pot that has surrounded Java has not only kept it interesting to IT professionals, but has also provided incubation to ideas that have migrated into the higher profile 'headline projects', such as Spring and Eclipse.
Anyway, I'm just going to finish reading that article about Google's fancy build tool...
From the bar chart presented, 49% of developers are doing desktop/client-server apps. I would have thought that a competent Java desktop developer could handle developing Android apps.
So a career in Java is very secure: you have all the server-side stuff, and many can make the change to Android if there is money to be made.
The trouble is, I don't think there is anywhere near as much money available to develop mobile apps compared to server/desktop apps. So ignore the hype about the mobile platform and stick to the less glamorous but better paying markets?
Still time for Oracle to get a clue
Oracle / Sun lost the mobile market because Java ME is obsolete for phones and set top boxes and has been for years. It's a subset of Java 1.3, FFS. No concurrency, no generics, no autoboxing, no unsynchronized collections. Using it is like a trip to the year 2000. Small devices have become so powerful that the limitations of Java ME are a nuisance. Add to that the licensing costs for this obsolete tech and it's no wonder devs moved on to other things.
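To put that in concrete terms, here's a sketch of everyday Java SE 5 code that simply won't compile under Java ME's 1.3-era subset: generics, autoboxing and java.util.concurrent all post-date it.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class Se5Features {
    public static void main(String[] args) throws InterruptedException {
        // Generics and autoboxing: both missing from Java ME.
        List<Integer> counts = new ArrayList<Integer>();
        counts.add(42);          // int auto-boxed to Integer
        int n = counts.get(0);   // and unboxed back

        // java.util.concurrent: also absent from the ME libraries.
        ExecutorService pool = Executors.newSingleThreadExecutor();
        pool.execute(new Runnable() {
            public void run() {
                System.out.println("ran on a pool thread");
            }
        });
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println("n = " + n);
    }
}
```

On ME you'd be back to untyped Vectors, manual wrapper-class boxing, and raw Threads, which is exactly the "trip to the year 2000" being described.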
For example, I found developing STBs with Skelmir (a cleanroom Java which is basically analogous to Java 5) to be vastly more pleasant than Java ME. There is also of course Android + Dalvik, which is better again and, more importantly, free of any licensing fees.
The only way I can see Oracle salvaging any shred of the mobile or embedded market is by porting Java FX to Android / Dalvik. This is entirely feasible to do and would be a good fit for the platform. Of course I doubt Google & Oracle see eye to eye at the moment, but perhaps Google could be persuaded to support JavaFX as part of a settlement.
Maybe Oracle doesn't care and shouldn't
Given that the bulk of important open source software is paid for by government or industry maybe Oracle doesn't see any reason to worry about annoying the open source community.
For a while I've been thinking that maybe open source has become a serious distraction to a lot of companies that just isn't worth the grief. Maybe Oracle decided that the Hudson community was just a bunch of whiney little b#tches that didn't add anything to its bottom line.
To add a bit of weight to my argument.
1. Apple is very closed source and doing nicely
2. Android is very open source and seems to benefit Google mainly. A freetard customer base that expects free applications for their phones doesn't help developers pay the bills.
3. A lot of effort has been wasted open sourcing Symbian with little real benefit.
The point is that maybe open source has become a mantra when in fact the merits of open versus closed source need to be examined carefully.
The problem here is
1. Apple isn't all closed source at all. Darwin is the BSD-derived kernel and foundation that iOS and OS X are both based on. Other significant open source parts would be LLVM and Clang, Apple-sponsored but open-sourced compiler tech that is also heavily used by Apple.
2. Android benefits every vendor who wants a full-fledged OS and VM for nothing. Java ME costs money, and so do clean room impls like Skelmir. It's a compelling reason to use Android in many embedded cases where even a $10 licence fee is too much money. Google benefits too, of course, because certified hardware will invariably ship with Google's apps, which are how they derive revenues.
3. Open sourcing something does not guarantee success. Pragmatism, or lack thereof, can play a huge part in the success of a project. Look at GNU Hurd vs Linux as a classic example of how this plays out.
Anyway Oracle are insane to alienate open source developers whether they are volunteers or employed. The Apache foundation, Spring (VMWare), Hibernate / JBoss (Red Hat), Eclipse (IBM) are all critical contributors to the success of Java. If you develop for Java then the chances are you use a combination of tools and libs from all of the above.
If you piss off these projects, or deny them input into the platform, they'll simply begin to think about taking their business elsewhere.
The article is flawed.
Please remember this is Forrester's research which isn't always accurate and at times has been way off the mark.
First, consider that Google and Oracle are in a massively important lawsuit over the mobile space. Even Apache walked away from that one ...
Second. While Apache dropped their open source efforts, that does not mean that Apache will drop any and all projects that are written in Java. Look at the disruptive big data project Hadoop: it's all Java (JRuby, Jython, Java) with a little bit of C/C++ streaming tossed in.
Third. Mobile apps are not developed in a vacuum. The majority of the apps tie in to a website or back end elsewhere. So if not Java, what? Objective-C? C/C++? The point is that you end up with a GUI language for the front end, then something else for the back end. Is this bad? No. But it means that the developer has to master two languages, or you fork into specialties like one for the front end, one for the back end (which is what you already have...). Even in Java, there is sub-specialization.
Take those three things and you're not going to see Java disappear any time soon.
For the record... I started programming young and my first languages were 6502 assembler and Basic(s). (Then 8080A and 6800 assembler.) In college, Fortran-77, Cobol, Pascal, 68000 assembler, C, C++ in that order. Later Objective-C and Java. (Not including scripting languages...)
Toss in some Small Talk and I think that rounds it out.
The reason I mention this is that if you look at the languages, Java, C/C++ and Objective-C stand out. Apple still has Objective-C (NeXT baby!), C/C++ covers embedded systems and financial systems, and Java the bulk of the rest of the world, including some financial systems. Sure, I bypassed C# because I don't work in the mickeysoft world.
But my point is that until you have a disruptive replacement that would surpass Java, you're not going to see it usurped. The authors of the article don't suggest one; they only say the trend is for Java to be replaced in the mobile market in favor of something else. Without suggesting anything, that's a bit hollow on the analysis front. To say Android is not based on Java is prejudicial and wishful thinking because the Oracle lawsuit has not been concluded.
Run, Forrester, run!
- Implying that HTML5 is a language
- Implying that "client-side representation" is somehow important in teaching
Some people still use Pascal to teach programming -- not that I'd do that, but I agree that *concepts* are more important than *tools*, at least for teaching programming.
As a side note, I love it when I get people's CVs which mention HTML as a programming language... I have to resist the urge to ask them to implement a sorting algorithm in HTML.
Also missing is a disruptive factor...
Prior to the late '80s you had the following:
Machine Code (micro processors and mainframe)
C (developed in '72 by some insanely smart people)
smattering of specialized languages.
(Yes, I keep forgetting about LISP and SmallTalk among some others...)
Then in '84 you had the start of C++, where B.S. wrote his extension to C. The OO concepts went mainstream and you had C++, Objective-C, and a few years later Java.
Since that time, what disruptive technology changes do we have that would cause yet another new language to be born?
The latest disruptive trend that I've seen is in Big Data (Hadoop/HBase) w a nod to NoSQL in general. That's all back end.
What's the disruptive force driving a change on the mobile front end?
You want a prediction? You'll see Google found to be a monopoly, using its dominance in the Chocolate Factory's ad/search biz to push into other markets in ways it could not were it not a monopoly (a la Microsoft and, earlier, IBM).
So where's the disruptive factor that will force others away from Java?
BTW, it's the handset platform developer that bears the cost of Java mobile... not the app developer. So even there Forrester's predictions are flawed.
Switch to what?
This is preposterous Web-2.0-ism.
HTML 5 isn't even a programming language. It's a markup language. Useful to know, but hardly the foundation for a computing degree.
"How do I show the output from my program, sir?"
"You use the DOM to find or create HTML elements and then you set their value."
"No System.out.println, then?"
Cloudbees and Hudson?
Cloudbees looks to have swapped to Jenkins from Hudson, just like everyone else without an oracle.com email address. They haven't quite finished the search-and-replace on their website yet though.
Ruby on Rails is not a language
Ruby = Language
RoR = Web Framework written in and for ruby
C'mon reg hacks, it ain't that hard.
Still use Java
I still use Java and have done for nearly 15 years. In that time I've led the development of some very large scale projects including one that securely supports 20,000 concurrent users. I am aware of, and regularly use, the developments since then like Spring and annotations but I would argue that Java was never good for web applications and it hasn't improved any in that time. Sure some frameworks have come along making life a little easier but the core language hasn't really advanced.
Look around at the popularity of RoR, Python and, increasingly, functional languages like Scala, Clojure, Haskell etc. The reason is that they are all more productive and better suited to web development than Java is. Sun/Oracle could have added features along the way to address this, i.e. Java could have been more like Scala by now. Look at C# and LINQ for an example of how Java could have evolved.
The core problem is that Java is not extensible, so you can't build higher abstractions or DSLs with it - everything is low level, which hurts productivity. Instead you are forced to use XML, which is clunky, unsafe and verbose, to try and capture abstractions. To make it extensible you need higher-order functions or macros. HOFs can't efficiently be supported on the JVM without closures and tail call optimisation, which Sun promised us in JDK 7 but Oracle has delayed indefinitely.
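To illustrate the boilerplate being complained about: in pre-closures Java (the JDK 6 era this was written in), a "higher-order function" has to be faked with a one-method interface and an anonymous inner class. A minimal sketch, with made-up names (Func, map):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class HofDemo {
    // Faking a function type: a one-method interface...
    interface Func<A, B> { B apply(A a); }

    // ...and a "map" that takes one as an argument.
    static <A, B> List<B> map(List<A> xs, Func<A, B> f) {
        List<B> out = new ArrayList<B>();
        for (A x : xs) out.add(f.apply(x));
        return out;
    }

    public static void main(String[] args) {
        List<Integer> squares = map(Arrays.asList(1, 2, 3),
            // Anonymous inner class: the verbose stand-in for a closure.
            new Func<Integer, Integer>() {
                public Integer apply(Integer x) { return x * x; }
            });
        System.out.println(squares); // [1, 4, 9]
    }
}
```

Six lines of ceremony for what a language with closures expresses in one, which is the productivity gap the post is pointing at.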
This old study found that even C is more productive than Java: http://www.flownet.com/gat/papers/lisp-java.pdf and I would tend to agree.
Java has been left to rot and now it's a festering heap of unproductive, expensive bloatware, IMHO. I'll be using Haskell where I can get away with it. OO was a nice paradigm for fat client GUI apps but it's not the only game in town and it doesn't scale for concurrent apps.