Let me be the first one to
on second thought, no.
You can crash the latest version of Google Chrome with a simple tiny URL. Just rolling your mouse over it in a page, launching it from another app such as an email client, or pasting it into the address bar, will kill either that tab or the whole browser. It's perfect for pranking friends by sending it to them in emails and …
Saddest thing was Opera going over to Chrome
I think that wasn't so sad since Blink (Webkit) is not a bad web engine. The sad thing was that Opera 15 was just Chrome with a new skin and all the nice features removed. (kb shortcuts, popup menu etc)
I couldn't see a reason to hang on to Opera since the old version 12 was going to be EOL'd soon and I didn't like Chrome.
Tried this with Chromium 37.0.2062.120 (281580) (64-bit), the out-of-date version supplied with Ubuntu 12.04, and had no problems. Guess this bug was introduced since then?
Same test for Firefox on this machine (40.0.3), no problems.
Same test for older Opera (12.16), also no problems. Tried new Opera, but it lacked most of the good features of the old one (the "turbo" proxy server is its only benefit), so went back. If I need more up-to-date support I have Firefox, or I can fire up VMs with other choices.
The first thing every function I've ever written, in every program suite I've created, did was test every gozinta for compliance with its contract. Furthermore, all gozoutas were also examined to see whether they obeyed the contract. Error codes were generated, fully documented and matching the documentation(!), and returned to the caller as per that contract. Then the error was handled properly, without crashing the program or the OS. This is called 'Playing Well With Others' and sometimes 'Sharing.'
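That gozinta/gozouta discipline can be sketched like this (a minimal Python illustration with made-up names and error codes, not anyone's actual code): every input is validated against the contract on entry, the result is validated before it leaves, and failure is reported via a documented code rather than a crash.

```python
# Hypothetical, documented error codes returned to the caller.
OK, ERR_RANGE = 0, 1

def scale_reading(raw, gain):
    """Contract: raw is a number in [0, 1023]; gain > 0.
    Returns (code, value); value is meaningful only when code == OK."""
    # Gozintas: check every input against the contract.
    if not (0 <= raw <= 1023):
        return ERR_RANGE, None
    if gain <= 0:
        return ERR_RANGE, None
    value = raw * gain
    # Gozoutas: check the result obeys the contract before returning it.
    if value < 0:
        return ERR_RANGE, None
    return OK, value

# Bad input is reported to the caller, never silently propagated.
print(scale_reading(512, 2.0))   # (OK, 1024.0)
print(scale_reading(-1, 2.0))    # (ERR_RANGE, None)
```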
I did this in Fortran (multiple flavors), C, C++, more freaking flavors of assembler than I care to recall, and dBase 2, 3, 3+, 4, Clipper... The list is extremely long, as I never used the same language twice in a row aside from classes at university. Professionally. Since 1975, when I was 14. Let me repeat that for the numerically challenged: Fourteen.
The only reason to bash your head roundly with this (lovely English phrase, that) is that I had to figure it out on my own! I didn't do fucking Kindergarten! So please, please, please think about what you are doing. Working with multiple coders, you should have this down pat. That's why we have contracts in the first place. Wikipedia or the OED.
WTF? [Yeah, I keep harping on this, but damn! Downvotes expected.]
"Since 1975 when I was 14. Let me repeat that for the numerically challenged: Fourteen."
Ah, when I was 16. And I KNOW many others here predate me by a lot. So don't assume that the Reg commentards are all wet-behind-the-ears coders, no no! Some of the minds on this board are so ancient that bats fly out of them.
One of those upvotes is mine. I should have addressed it: Dear Google Chrome Coder... We have so many weapons now, I just don't fucking get the problem. "Back in the day" I was using formal verification, tons of statistical sampling on problems oriented that way, even with smaller samples (t-tests for expected normal distros, plus testing that the distribution actually was normal), predicate calculus, yada yada. As I said, from the very first job. If I got that one wrong, well, there were going to be a ton of Congress-critters demanding to know why those millions of dollars in savings didn't happen. [Ain't naming the agency as it'd create a shit-storm. 14 y.o. kid.]
Backstop: examine the problem along multiple axes/dimensions/schools/disciplines. Test, test, test. I am not perfect, nor some programming God. What I am is demanding, 'cause the solutions had budgets starting in the seven figures and in one case running into 10 figures. I certainly didn't expect that last one. And as I've posted before, lives were involved. A dozen or more lives, sometimes an entire facility holding thousands. Mass explosions wiping out a chunk of a major city (population in the millions)... well, I don't think "oh, he just made a null-pointer error, anyone can do that" will have good results at a Court Martial. A Jury of Your Peers doesn't mean they're going to cut you any slack.
Whatever. Apologies to my fellow commentards for poor addressing.
In 1975 I was, er, past 30. Years before, I had mail-ordered my first transistor to add to my crystal set. Ex-Army valves were cheap, but no good on batteries. Analogue computing was easiest to implement; just graduate the pots with a Rapidograph.
Yah. I learnt this while developing data recovery software. In that scenario invalid inputs are not just possible, they are expected.
Unfortunately you can't stick contractual tests everywhere; that has performance consequences that may outweigh the risks. The trick is to know where your gateways are so that you can place your gatekeepers.
I'd suggest that they would have been better encapsulating the URI in an object(*) and passing that around instead of passing a string. That way it's obvious where you put the gatekeepers - in the object.
(*)Even if your language isn't object oriented you can still employ the isolation technique and define specific interfaces to underlying data.
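The suggestion above can be sketched roughly like this (an illustrative Python sketch, not Chrome's actual code; all names are made up): the URL is validated once, at construction, so every downstream consumer holds an object whose invariants are already guaranteed.

```python
from urllib.parse import urlsplit, unquote

class ValidatedUrl:
    """A gatekeeper object: constructing it either yields a URL whose
    invariants hold, or raises. Consumers never re-parse raw strings."""
    def __init__(self, raw: str):
        parts = urlsplit(raw)
        if parts.scheme not in ("http", "https"):
            raise ValueError("unsupported scheme: %r" % parts.scheme)
        if not parts.netloc:
            raise ValueError("missing host")
        # Decode exactly once, here, and reject degenerate results --
        # the sort of input behind the crash under discussion.
        if "\x00" in unquote(parts.path):
            raise ValueError("NUL byte in decoded path")
        self.raw = raw
        self.host = parts.netloc
        self.path = parts.path  # kept in its encoded form

u = ValidatedUrl("http://theregister.co.uk/some/page")
print(u.host)                               # theregister.co.uk
# ValidatedUrl("http://theregister.co.uk/%00")  would raise ValueError
```

Even in a non-OO language the same effect comes from a `make_url()` function that is the only code allowed to produce the underlying struct.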
> Dear Coders - Rules You Learned in Kindergarten
Which is all well and good, as long as you work on safety critical stuff, as in that scenario someone wants to pay for it. The vast majority of consumer software is "free" or at least "very cheap", and the consequences of a failure are low (you don't kill people), so no one wants to pay 2-4 x the cost of development to find the bugs.
It isn't a coder problem - it's a management problem. Coders like to eat.
I once worked for a company that believed in exhaustive formal unit testing.
I was the maverick who questioned the utility thereof. And I kept finding elementary bugs in that code. Furthermore, fixing them was often hard, as it would cause the unit tests to need fixing too, and that was more red tape than the code itself. A typical example would be an off-by-one index error that caused an incorrect numerical interpolation. Something that could matter a lot when the software controls satellite orbits.
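The kind of slip described is easy to make and easy to miss; a minimal sketch (invented numbers, nothing to do with any real satellite code): a linear interpolation whose segment index is off by one still returns a plausible-looking value.

```python
import bisect

xs = [0.0, 1.0, 2.0]
ys = [0.0, 10.0, 40.0]

def interp(x, off_by_one=False):
    """Linear interpolation over (xs, ys) for x inside the table range."""
    i = bisect.bisect_right(xs, x) - 1   # correct left endpoint of the segment
    if off_by_one:
        i -= 1                           # the classic index slip
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])

print(interp(1.5))                   # 25.0 -- correct
print(interp(1.5, off_by_one=True))  # 15.0 -- wrong, but not obviously so
```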
The problem was the human factor. When the hard part of the job is not hacking it but getting it through the tests, that becomes the programmers' goal in life. They no longer see the wood for the trees (or bugs for the tests, if you prefer). And when the tests are more work than the code, they are also more complex, more error-prone.
Like red tape everywhere, the process had strangled itself. And when a two-minute fix requires several days work on the red tape, it drives programmers away, too.
Yet more advice that tells you attack those windmills in the hazy distance, eh?
Only rank beginners attempt that kind of stuff.
Jesus Christ, use assertions and programming-by-contract but don't bother going overboard on unit testing. Bail the fuck out if you must and if you don't know what is going on, reset the system (thus being "anti-fragile", maybe?). It's even done in high-assurance software, fail-safe etc..
See also: WORSE IS BETTER
Richard P. Gabriel suggests that a key advantage of Unix was that it embodied a design philosophy he termed "worse is better", in which simplicity of both the interface and the implementation are more important than any other attributes of the system—including correctness, consistency, and completeness. Gabriel argues that this design style has key evolutionary advantages, though he questions the quality of some results.
Since 1975 when I was 14.
So ... in 1975. At 14. When computers with 32KByte RAM were sitting in cellars and remote access could not be had and when maybe someone in the street had a telephone. (1972 had just seen the birth of PROLOG, btw). When you wouldn't be able to comprehend the stuff coming in large boxes of binders holding pages upon pages of requirements ... you have "tested every gozintas for compliance with its contract and all gozoutas were also examined whether they obeyed the contract"?
Sadly, what you basically describe is 'Defensive Programming'. This should be taught but it is just not cool, sexy and all that crap.
It takes time to learn, but once you get it, using it for everything you code really takes no more time than all that sloppy coding that seems so prevalent these days.
Thankfully, the current dev environment I work in makes this easy. Others are a PITA until you develop your own set of functions and procedures.
Finally, I've lost count of all those XML Schemas I've seen that have everything as xs:string even when the values are dates, numbers and booleans. If you properly crafted the XSD you could quickly validate both content and value. But no, this is all too hard and unsexy/uncool.
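For what it's worth, the fix being complained about is tiny. An illustrative fragment (made-up element names) showing proper datatypes instead of xs:string, so a schema validator rejects bad content before application code ever sees it:

```xml
<!-- Illustrative only: declaring real XSD datatypes means "2015-13-45",
     "-3" or "maybe" are rejected by the validator, not by your code. -->
<xs:element name="order">
  <xs:complexType>
    <xs:sequence>
      <xs:element name="placed"   type="xs:date"/>
      <xs:element name="quantity" type="xs:positiveInteger"/>
      <xs:element name="express"  type="xs:boolean"/>
    </xs:sequence>
  </xs:complexType>
</xs:element>
```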
Reasonable advice, but now sadly impractical. Software is too heavily layered, with layers being written by different people in different organisations, and no-one uses return codes any more - too cumbersome. It's all event driven, error conditions by exceptions, and throw in some concurrency and epic locking for good measure too.
So the flow of control within a complex product running against complex inputs is nearly unknowable. And remember: modern browsers are complex: a VM, a database, a fully customisable UI and more. It would be hard to test even with no external connections; add the Internet and it becomes impossible to test fully.
We probably need entirely new languages (not based on C in any way; the lessons it taught aren't good any more) to enforce very detailed contracts by design, far more than the current 'n must be a positive integer' style we have now.
I remember a time back in 1967 when GE was trying to write a time sharing system at Dartmouth based on their successful 1964 system. The most hated GE programmer was someone who wrote a module that checked its arguments. If they were not valid he declared a system fault. Most of the system faults were caused by his modules but he was never caught with a mistake in his code. It seems that all of the other coders just calculated with bad parameters and passed on the results.
P.S. Their system eventually failed and the Dartmouth Time Sharing system was written by students.
P.P.S. Yes, I'm ancient - been coding for 55 years.
Well, Damn! Mr jack. If you aren't a living, breathing miracle. That near impossible entity: The Perfect Programmer! Not once in all those trillions of functions you must have written over so many years did you ever make a mistake. Not one durn bug or error, not one overlooked fragment of obscure logic, not one missed gotcha. I'm in awe. I bet ya's worth a million in prizes. Now I have a role model to look up to. I shall keep on striving for that elusive perfection. One of these days I will write a non trivial program without a single bug lurking in the Shadows and you, Jack, are my inspiration. You're better even than the Google Gods themselves.
Zero bugs detected in twenty-five years or more of use. No explosions or radiation damage to personnel or equipment to date. Not a single life lost. Hmmm... last time I looked, Coronado, CA is still there, and they are definitely still using my software. That's the major item. Financially, millions per year saved, and that's seriously (orders of magnitude) conservative.
Google Gods, well....
I really couldn't care less who you emulate except that errors cost people frustration, time, money, and occasionally lives. Ask Airbus about their little glitch. Someday an EULA ain't going to protect ya'll. I didn't have that fig leaf.
What I think he means is that he has been granted enough time and resources to fully test and debug his code before releasing it into the wild. He doesn't write bug-free code; the bugs have been found before release. It's a fortunate position for him to be in, and I've never yet experienced that luxury.
I'm not sure of the point of the post either. But I'll take a catastrophic browser crash over a critical nuke event. It looks like the relative investment protecting us from either has worked out for the best.
Living reasonably close, fallout-wise, to Coronado, CA, I'm wondering what software system is apparently the sole thing preventing one or more of the 160* W-80 nuclear warheads from doing what comes unnaturally, while also saving untold millions in cost. I do hope system integrity does not rely on further maintenance of a PDP-11/94 running Trusted Xenix.
I'm assuming the author wasn't referring to a possibly more hazardous source of radioactive material on Coronado Island, which is all related to medical devices. Off the naval base, the place is quite gray-haired.
* A 2002 figure from the Natural Resources Defense Council
Zero bugs... Did you just work on the comments or something?
Admittedly I'm rarely sober, but I can't type in two lines of code with anything less than three serious bugs, and I write software for planes and heart monitors. We get the intern to write the unit tests, or we did back in the 90s when we had one. Now the compiler pretty much catches all of it, and the passengers and patients the rest.
It's getting on for 20 years since I studied it at uni, but there are methodologies for development and testing that you simply have to follow when working on critical or embedded systems. These allow you to consider race conditions and unexpected input values as well as simple coding mistakes. Once you go live, patching is incredibly difficult, so yeah, I imagine it is possible for someone to have gone years without a bug being found in their code once it had gone live.
I'm sure quite a number of bugs get found during unit testing, then more once your code gets to interact with others in system testing. As was highlighted, a bug in the production environment could be catastrophic - be it nuclear meltdown or planes dropping out of the sky!
It does happen. I wrote a fully functional editor in Turbo Pascal that never crashed - and had tens of thousands of users. It took a couple of years to get it there but I did get there.
Anyone remember EasyEdit II in DOS? It got a good review in many magazines (including Byte).
It's definitely possible to write the 'perfect' piece of software (for a given definition of perfect). It all comes down to how much cash and how much time Management is willing to give you to check and test.
There are mathematical techniques you can use to analyse your code to check for logical bugs, and as has been mentioned, if performance is not the bottleneck, then you check ALL of your inputs and outputs to ensure they fulfil your contract, and never, EVER, return anything that you haven't documented in full.
Of course, in the business world that most of us operate in, these niceties are the first thing to go to the wall in preference for getting the software out the door and into the hands of the great unwashed.
Remember the old Project Manager's adage: Quality, Cost, Time ... pick any two.
I mean, the URL will be sent verbatim to the server just as it's entered, and stored everywhere as-is. There is no reason to turn %20 into a space or anything, let alone to do so multiple times. There may be reasons to do the opposite, for example on forms, but unescaping a string should never be done by the user agent.
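The danger of "doing this multiple times" is easy to show with Python's urllib (a sketch of the decoding behaviour, not of Chrome's internals): decoding the crash URL's path once yields the harmless-looking string "%00", and decoding that again produces an actual NUL byte, exactly the kind of value downstream string-handling code never expects.

```python
from urllib.parse import unquote

path = "%%300"            # the path component of the crash URL
once = unquote(path)      # the stray '%' passes through, '%30' becomes '0'
twice = unquote(once)     # decoding a second time manufactures a NUL byte

print(repr(once))         # '%00'
print(repr(twice))        # '\x00'
```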
Seriously, never, ever assume that any data passed is going to be valid. In security sensitive code like this, the data should always be treated with some suspicion and be validated at the beginning and end of every function that handles it.
I would assume that the URL parsing function would strip the protocol from the beginning, which would leave you with [null] after URL decoding, and thus an empty string. The calling function should have noticed that it received an empty string, ignored it, and continued on its merry way to the next URL it detects.
curl passes the string on without modification. wget, however does this to it:
$ wget 'http://theregister.co.uk/%%300'
--2015-09-21 12:52:09-- http://theregister.co.uk/%25%300
Resolving localhost (localhost)... ::1, 127.0.0.1
Connecting to localhost (localhost)|::1|:3128... connected.
Proxy request sent, awaiting response... 301 Moved Permanently
Location: http://www.theregister.co.uk/%2500 [following]
--2015-09-21 12:52:09-- http://www.theregister.co.uk/%2500
Reusing existing connection to localhost:3128.
Proxy request sent, awaiting response... 404 Not Found
2015-09-21 12:52:09 ERROR 404: Not Found.
Sincere apologies if this experiment makes anyone's browser crash...