Like Kevin Spacey?
Poorly written code is leaving banks at greater risk of attack and poorly prepared for big changes in the financial sector due to come into effect early next year. CAST, an organisation that reviews the quality of code for businesses, recently reviewed over 278 million lines of code and revealed that out of 1,388 applications, …
As a developer I deal with legacy code bases, and to be fair any of them can be terrible or good. It is not down to the language that is used but the developer.
Unless we are talking about PHP, in which case it is the language's fault.
While this is true, with .NET at least you have a good tool set, good guides, etc.
PHP, which I can develop on my phone, is not normally taught or learnt with best practices in mind.
At the end of the day all languages are susceptible to this, but in a commercial environment I see much more devastating code from PHP and its variants than I have from .NET.
I'm sure if you look pre .NET at VB5 / VB6 code you'll find the same horror show.
No, you won't, if for no other reason than most code of that era wasn't exposed to the internet in the first place. But that isn't the only reason. A quick glance at the VB3 standard library, even before "Option Explicit", reveals a concept largely unknown in the Personal Home Page world - consistency.
You would be better off going after PowerBuilder or some other dodgy old 4GL, and you would still be wrong. PHP is uniquely bad. Nothing else comes close or has come close in the last thirty years.
PHP is worse than VB as a language, I'll definitely grant you. But the code written (while usually not internet-exposed) would have been plenty of awful. I don't think VB was really that bad - the "problem" was that it was very easy to make GUI apps, which attracted incompetent people to use it.
PHP is *the* worst real (ie: not brainfuck) language I've ever seen, but that doesn't make it impossible to build perfectly usable things with it.
"PHP is worse than VB as a language, I'll definitely grant you. But the code written (while usually not internet exposed) would have been plenty of awful."
How did PHP address the same problem? Add new APIs and libraries while retaining the old ones. You retain all the horrific old junk with an extra side order of confusion. As a result, what gets used tends to be whatever Google first trawls from StackOverflow. No other language has ever started with such broken fundamentals and then fought so heroically to prevent them being repaired.
You never coded in ASP (or "classic" as it's also known) then?
That being said having a language that doesn't bother with security that much means you learn and develop your own toolset pretty quickly.
(For the record I dev'd object validation and database abstraction as a sticky plaster for any ASP sites I ever worked with though it's been a while since I left said sites alone.)
Anon to save the guilty.
Even then, if you know what you're doing with PHP there's no reason it can't be secure. It has plenty of problems, but none are insurmountable.
99% of the problems with PHP are down to people learning it because it's perceived as easy and following godawful online tutorials. If other languages had the same problem with people learning bad practices from people who have no business teaching others, they'd have the same problems.
(dev who happens to use PHP exclusively in current job)
After 20+ years in programming for desktop, web and mobile, I would have to disagree: 90% of the time, poorly written code is down to the manager who invariably wants the moon on a stick, delivered yesterday, and is unwilling to put time aside for rebuilds of legacy applications, refactoring, maintenance, etc.
"Fast, cheap, right - pick the two ways you want us to do this"...
One of my favorite rants - when you make it so easy to get something going at all, even if the developer doesn't really understand the nitty-gritty details - i.e. make it possible for monkeys to write code - you get monkey code.
I don't let the devs off for a manager with unrealistic expectations either. Grow some spine, man. How do you think they get to have those expectations - no one stood up to them.
How about not "coding at the tube"? How about designing AT ALL? If partway through you find the original design won't work - then do the redesign, not some horrible bodge due to ego. It's often faster that way anyway.
It really does come down to humans. Tech solutions (languages that are "easy") to human problems never work out. Ever.
Banks have been slower than other sectors in adopting modern coding tech, partly because of the need to support legacy apps written in Cobol but also because of complex coding environments.
They were wheeling COBOL guys out of retirement homes for Y2K. What are they depending on this time? Exhumation orders?
Banks don't need to continue to support creaky old legacy systems, what they need to do is devise a strategy (OK so far), on-shore or in-house development (hang on a minute!) and increase dev budgets significantly to pay for it (BURN THE HERETIC!!).
I watched the same movie 20 years ago. This is just a franchise reboot. I expect the story to be remade again with a contemporary cast around 2040.
Yes, code that works is indeed a source of great confusion for developers "educated" in more recent times.
The reason that COBOL code hasn't been re-written is predominantly because it works and it performs.
Of course, lack of "investment" is also a factor but that is as much a symptom as a cause.
Q: Would you tear down your perfectly good house in order to build another one exactly the same just using some more modern, less well proven materials and practices ?
Even if you don't undertake a wholesale rebuild and instead gradually renovate using "Encapsulate and Strangulate" type transitions/transformations, even these represent significant investment to fix something that in all key respects bar one (skills availability) is not broken.
But even the skills problem is bogus. Yes, there is a relative shortage of COBOL developers, but that is not because COBOL is hard, it's just unattractive to modern developers. There are ways to make it attractive. Aside from training (something that for some reason seems to have become a dirty word in this industry), COBOL is still being actively developed and variants exist with all the modern tooling that modern devs love.
And of course, anyone willing to enter that world is likely to be very well rewarded.
“A greater density of security weaknesses presents more opportunities for malicious actors to find vulnerabilities to exploit for unauthorised entry into systems,”
It signifies something else. Someone not checking their inputs. Half of the "vulnerability assessment" checks are nothing but checks for the presence of input validation and sanitization.
So "insecure" code as discovered by a tool like this is actually also UNSTABLE code.
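The point about input checks can be sketched minimally. This is a hedged illustration, not any particular scanner's rule set: the 8-digit account-reference format and class names are invented, but whitelisting what *is* allowed (rather than blacklisting what isn't) is exactly the kind of validation these tools look for.

```java
import java.util.regex.Pattern;

public class InputCheck {
    // Hypothetical field format for illustration: exactly 8 digits.
    private static final Pattern ACCOUNT_REF = Pattern.compile("^[0-9]{8}$");

    // Whitelist validation: reject anything that is not a known-good shape,
    // including null - unchecked nulls are the "unstable" half of the problem.
    static boolean isValidAccountRef(String input) {
        return input != null && ACCOUNT_REF.matcher(input).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidAccountRef("12345678"));          // true
        System.out.println(isValidAccountRef("1234'; DROP TABLE")); // false
        System.out.println(isValidAccountRef(null));                // false
    }
}
```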
As it has just been mentioned in another thread, I had a quick look at CHIEF. At least, the Wikipedia entry.
Looks as though it is still running on the ICL (now Fujitsu) VME mainframe OS and I would guess there is a lot of COBOL in there.
Allegedly scheduled in 2010 for a ground up rewrite using more modern languages, technology, COTS packages, yada yada yada....
So far at least one abandoned attempt and at least two tendering exercises. Target is also to match(!) current features, capacity and response times. Noting that the reason that CHIEF is currently in the firing line is that the expected increase in capacity required after Brexit is not sustainable by the current system.
My personal biased view is that such a system is not replaceable by any project which does not recognise the enormous resources which went into the original project for design, testing and deployment. Think whole office floors of COBOL programmers and system analysts, complete duplicates of the live system for testing, seriously complex test plans. This is from the era when the hardware and OS cost an eyewatering amount of money so the equally eyewatering cost of the software development seemed proportionate.
I doubt a rack of 2U servers and a "fail early fail often" approach is going to work in this case, and a realistic budget will get a "How fucking much? Piss off!!!" response.
As far as I can see the bank legacy systems are in much the same position.
Plus (as with other government driven software applications) if you take a snapshot then reimplement from the ground up by the time you have finished 2-3 years later the original system has changed almost beyond recognition. So it can't be replaced until the new system catches up. Rinse and repeat.
I think the only viable way to replace any legacy system is for developers to become domain experts. Which is a big problem for organizations, because domain experts are expensive (as opposed to code monkeys), because some developers are happy with the coding part alone, and finally because there is frequently an "old guard" which does not like sharing the secrets (or even sees secrecy as a security imperative). The definition of "domain expert" is someone who can understand what the "old thing" is doing and why, and build the mental model necessary to design its replacement.
Those core account systems are not the problem, nor is COBOL the problem. The security problem exists in all the layers of code added to allow access from phone or internet. No one accesses a COBOL-based banking system directly (do you see a CICS screen on your phone?). When security fails in one of those user-friendly interface layers, what is presented to the back-end account system looks like a valid user. The legacy system is not your security problem. Replacing it is an impossible task if what you will say to a business person is: give me $250 million and 5 years and I will give you exactly what you have now, except newer. What is the business value in that? You will never get that funding, so those old systems will remain in place until the hardware is no longer manufactured - and that may be never. Better to spend a fraction of that $250MM properly securing (or rebuilding securely) those "front-end" layers.
There are two or three things which matter. The actual running program, doing what it was supposed to do, is just one of them; in the short term it is (probably) the most important, and it is right to keep it in mind. The other two are knowledge of what the system does, held in the heads of the people supporting it, and availability of support for the underlying platform. Both are problematic for any legacy system, because people will eventually retire and vendors will stop providing support (or ramp its cost so much that it becomes a huge drag). Hence the necessity for long-term planning to replace any such legacy platform, which IMO should start with spreading the knowledge of what the system does (i.e. educating the next "generation" of domain experts). This does not mean the same thing as outright planning a big-bang replacement with a new, flashy and very expensive system.
The problem with old systems is that they work. And work, and then work some more. Until they stop working or prove to be insufficient performance-wise (whatever metrics of "performance" you use). It is right and appropriate to be prepared for such eventuality.
" The other two are problematic for any legacy system because people will eventually retire and vendors will stop providing support"
The cheaper option will probably be to train new people to take over. And, who knows, those people will eventually know enough to direct the re-write. When a piece of software is at the core of your business it's false economy not to take care of it and that includes spending on people.
I'm immensely suspicious of this.
I've been writing code since Fortran was in capitals, and my experience suggests that the "managed" languages (like the .NET languages and Java) are significantly less problematic - no buffer overflows (unless you use P/Invoke or JNI) and higher immunity to things like SQL injection (if you use the platform features like EF).
I have, however, been on the "other end" of automated software analysis which does tend to throw up a higher proportion of issues with the managed language environments - the ability to do more static checking means that there are more "errors", though they are usually things like "should you have sealed Class X?" rather than actual potential runtime problems.
There's a big difference between "we sell a software tool that says most of the errors are here" and those errors actually being significant, or being comparable with errors elsewhere.
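To illustrate the SQL injection point a few posts up, here is a toy sketch: the table and column names are invented and there is no real database behind it, but it shows why concatenating user input into SQL is the classic hole, with the parameterised JDBC alternative noted in comments.

```java
public class InjectionDemo {
    // Naive query building: attacker-controlled input becomes SQL syntax.
    static String naiveQuery(String user) {
        return "SELECT * FROM accounts WHERE owner = '" + user + "'";
    }

    public static void main(String[] args) {
        String attack = "x' OR '1'='1";
        // The WHERE predicate is now always true - every row comes back.
        System.out.println(naiveQuery(attack));

        // With a parameterised statement (java.sql.PreparedStatement):
        //   PreparedStatement ps = conn.prepareStatement(
        //       "SELECT * FROM accounts WHERE owner = ?");
        //   ps.setString(1, user);
        // the input is bound as data and never parsed as SQL.
    }
}
```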
I'm also very suspicious of this. Generally speaking, with any sort of automated code analysis, I would be very wary of making comparisons across languages, especially those that come with their own runtime such as Java or .NET. Differences in how the analysis is done would make any comparison irrelevant.
> "managed" languages (like the .NET lanaguages and Java)
> are significantly less problematic - no buffer overflows
> (unless you use P/Invoke or JNI)
That's not my experience with Java. I am a consumer in my current role. When using Java-based applications, Buffer-Overflows-R-Us. Some of the applications I support use WebSphere on Windows/PC architecture. I have to carefully use only certain Java versions to avoid the otherwise inevitable buffer overflow.
forget ".Not" and maybe Java, too.
Do it in 'C'. Translate those COBOL programs directly into C programs [it CAN be done].
Then stop hiring crappy "developers" who don't understand what a compiler is, and have COMPETENT people write the front-ends in a lingo that doesn't SUCK.
(the one thing about COBOL that made it work well for business is the structure definitions that are inherent in the language. Since 'C' can do this too, should be NO problem porting any COBOL program into an equivalent C program)
Also - C compilers exist for just about every platform these days. No need to lock yourself into Micro-shaft's FAIL aka ".Not" and "C-pound".
a search on "cobol to c converter" returns MANY relevant hits.
"the one thing about COBOL that made it work well for business is the structure definitions that are inherent in the language."
I suggest having a look at how Real Klingon programmers do COBOL:
PSD2 and Open Banking are complete nonsense. Financial organisations are being forced to implement a solution to a problem that doesn't exist. I'd be interested to know where the responsibility lies if an account is hacked through a third party app. The third party app provider or the bank holding the account? At least with the present situation a bank holds the account and provides the access mechanism, it's all *their* fault.
"Financial organisations are being forced to implement
a solution to a problem "
This has already started complaints. One of the banks (Lloyd's I think) has started amending Ts&Cs to accommodate. The Radio4 Moneybox audience are already asking how they can opt out.
And I'm with them. There needs to be a serious "no means no, just no" option. I don't want my bank having any chance of coming up with "but they said you'd given your permission" type excuses.
err - it would be more accurate if it said "customers prioritise user experience over security".
Security usually involves more effort than simply bashing a card against a reader, but that's what the proletariat seems to demand. The additional hassle of a PIN, a token, or even a random check fills Twitter with crackling vituperation.
As for the "it's for compatibility with legacy code in COBOL" BS:
Bo**cks. That code dates from a time when machine time was ruinously expensive (and a machine running at 100s of KIPS was close to being viewed as a supercomputer).
Consequently a lot of time was spent "desk checking" before committing even to a compile, let alone a run test. That's why a lot of Y2K COBOL code worked just fine following audit.
Here's the thing with legacy systems. They were developed when machine time was expensive and staff time cheap.
Now it's the other way round (and the developers of Unix could see this trend from 45 years ago).
When you work out how much developer staff time was spent on those old systems (100s of man years, not months) and factor in today's hourly rates you think "WTF. I can't afford that." Hence the "If it ain't broke don't fix it" mentality in banking/government/telecomms.
Which is why a small number of niche firms make a good living building tools (and using them) to chomp through MB of legacy code and refactor and/or detect coding weaknesses.
Exactly, nothing wrong with things written in COBOL that have been working for 40 odd years without some wet-behind the ears consultant tosser sticking his expensive nose in and fucking it all up.
I don't work in the financial sector, so this is an honest (if naive) question. When software is developed for systems considered to have a safety impact, there are processes and tools. I'm thinking of SIL, SW01, and I'm sure the nuclear industry has something. I'm not daft enough to assume that these systems and processes (and languages - Ada?) prevent people writing bad code, but they form part of a safety-case process which gives some assurance regarding safety. An airport can't install a new display system without development and interoperability assurance and a safety case that's signed off by the CAA; is there no equivalent in the financial world?
(BTW - I'm not suggesting going back and making old code meet new standards - I'm interested in what happens for new SW development in the financial world)
> Applications between five and 10 years old have the
> greatest potential for security flaws
Yep, that's because new applications are not written in COBOL! Give me a break, could people please get off this "COBOL is evil" and go to the domain experts.... show someone in Accounting what the code is doing in COBOL, after a half-hour you have them nodding their head and understanding. Other languages, such as C++, are so obtuse that I've heard a rumor that programmers love it because even their boss doesn't know what they're doing. Forget about a pencil-head ever understanding what the code is doing.
I think I've read at least 15 articles this year regarding this.
Amazingly, this article doesn't provide any real references or links.
Most of all, there is nothing new or unique.
Also not shocking: there's no background on exactly which systems in the financial industry still use low-level language development, and no perspective on how much of the industry has upgraded to systems developed with managed code.
Perhaps an article should be written about the development updates, changes, etc. around financial services. I won't hold my breath... too many lazy column writers.
Companies tend to prioritise user experience at the expense of cybersecurity.
Companies prioritise cost over user experience, security and even just producing decent code. There are very few businesses (especially financial ones) that give a damn about user experience.
I used to work with various banks hooking up systems. The first time I worked with one I was absolutely surprised by the complete lack of technical know how at the bank itself. Most banks outsource their core systems to just a couple companies (Perot Systems being one). Those companies slap a couple logos, change a color scheme and *presto* you have Generic Bank A's customer facing system. There absolutely isn't much concern for the end user experience.
Instead, the priority is on making the system as cheaply as possible so the systems provider/integrator can make as much money as possible. The bank itself, unless it's willing to deal with writing its own, is at the integrator's mercy.
That said, developers are absolutely the problem. It's essentially the wild west out there. Existing certifications are useless in letting a non-tech person know whether a tech person actually knows how to write efficient, secure code. Universities/colleges still don't seem to have a handle on how to teach people to program, so having that degree isn't a decent guide either. Until education and certification are solved, the problem isn't going away.
I truly believe we are at the point where programmers should be licensed. The testing should be akin to the testing Engineers and Lawyers go through. Even if we differentiate between various types like financial, health, etc. I'd even be ok with the web weenies not being licensed - unless they deal with PII.
You can do the structure definitions in C; you can't do the high precision decimal arithmetic that COBOL does without using a library for all your important calculations. Good luck maintaining your code after you've used an automatic converter to turn it into C. There are certain things that COBOL really does do better than other programming languages.
A much more sensible solution is to pick a modern COBOL compiler and runtime system that lets you put your code on a Linux or Windows server, and also gives you the option of compiling to .NET or the JVM if you want straightforward interoperation with your front end in C# or Java. And I'm not just saying that because I work at Micro Focus. Don't get so hung up on the language; you can incrementally refactor old COBOL into better-structured code, and that's always going to be a simpler and lower-risk process than converting the whole lot into something else or throwing it away and starting again.
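To illustrate the decimal-arithmetic point, here is a small sketch (in Java, since the post above mentions compiling COBOL to the JVM): binary doubles cannot represent 0.10 exactly, which is why money needs decimal types like COBOL's fixed-point fields or Java's `BigDecimal`.

```java
import java.math.BigDecimal;

public class DecimalDemo {
    public static void main(String[] args) {
        // Binary floating point: 0.10 and 0.20 have no exact representation,
        // so the sum is not exactly 0.30.
        double d = 0.10 + 0.20;
        System.out.println(d == 0.30); // false
        System.out.println(d);         // 0.30000000000000004

        // Decimal arithmetic, as a COBOL PIC 9(7)V99 field gives you natively.
        BigDecimal b = new BigDecimal("0.10").add(new BigDecimal("0.20"));
        System.out.println(b.compareTo(new BigDecimal("0.30")) == 0); // true
    }
}
```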
You've duplicated the functionality you already had in A.N.Other language.
BTW I'm assuming you had the source code to do this. There are programs running for which the source code is long gone. If you don't then you've only duplicated the visible functionality. There may be rarely used functions that are not visible, or invoked indirectly when certain tasks are carried out. You won't realize they are missing till something happens that needs them.
Biting the hand that feeds IT © 1998–2019