Freedom is the freedom to say 2+2=4. If that is granted, all else follows.
Welcome, Reg readers, to this week's Who, Me?, in which we gather round to share in another person's painful memories of technical cockups. This week we meet "Fred", who wrote in to tell us about a time, about 15 years ago, when a rather basic test was seen a little more widely than planned. At the time, Fred was working for a …
Don't think so - the article claims the company was bought out shortly afterwards. 3 was started by Hutchison Whampoa - and is still owned by them.
So, basically Cellnet (Telefonica), Orange (Mannesmann / France Telecom) or one2one (Deutsche Telekom)...
Tell the PHB that technology is our friend? Christ no! I've been there, with PHBs told just that, who believed it to the hilt and decreed that "we must be digitised" - without bothering to add that it should actually make sense, be done well and not be crap.
Almost everything turned out crap (though you wouldn't believe that from the self-congratulatory emails and in-house Faecebook-lookalike page announcements. Talk about the Emperor's New Clothes).
Not as well as some languages, it doesn't:
$ perl -e 'print 2 + 2;print "\n";' # prints 4
$ php -r 'print 2 + 2;print "\n";' # prints 4
$ perl -e 'print 2 + "2";print "\n";' # prints 4
$ php -r 'print 2 + "2";print "\n";' # prints 4
$ perl -e 'print "2" + 2;print "\n";' # prints 4
$ php -r 'print "2" + 2;print "\n";' # prints 4
$ perl -e 'print "2" + "2";print "\n";' # prints 4
$ php -r 'print "2" + "2";print "\n";' # prints 4
PHP treats 0, 0.0, '0', '', false and null as false.
PHP treats 1, 0.1, '0.1', 'true', true and 'false' as true.
Using PHP as a yardstick of handling various types is a terrible idea.
IMO, "2" + "2" should throw an error in any language and JS should have a more sensible string concatenation operator.
We have intval() and parseInt() for a reason.
But strings and numbers are not necessarily incompatible types. You can always convert a number to a string in order to concatenate it with another string, and you can sometimes convert a string to a number -- depending what characters it contains -- in order to add it to a number. There is no reason, besides mean-spiritedness on the part of the programming language designer, for any modern programming language not to do type conversion automatically if it would not cause an error.
The real problem is the misuse of a single operator to perform two distinct operations.
It's not perfectly reasonable because an interpreted language should be smart enough to be able to convert a string that looks sufficiently like a number to a number and then add it to a number; or convert a number to a string and then concatenate it with a string. Perl manages all this just fine (see examples above).
But what is a suitably obvious number when displayed as a string?
They're all obviously numbers or text strings depending on context. Context which is beyond your programming language, other than if it's wrapped in quote marks or properly typed.
That still need not be a problem, though:
You're missing the point.
In a loosely typed language (like JS, PHP, etc...) how does the computer know if the programmer wants to concatenate or sum two values? It's easy if both are quoted or both are unquoted decimals. If they differ, the computer has absolutely no idea what the programmer intended, so it has to make a best guess.
Hence 2+2=4 and "2"+"2"="22" (including in your original incorrect comment) but varying results between languages for "2"+2 and 2+"2".
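For the record, here is what JavaScript does with those four combinations — a quick sketch you can paste into Node:

```javascript
// JavaScript's + does arithmetic only when BOTH operands are numbers;
// if either side is a string, it concatenates.
console.log(2 + 2);     // 4    (both numbers: arithmetic)
console.log("2" + "2"); // "22" (both strings: concatenation)
console.log("2" + 2);   // "22" (the number is coerced to a string)
console.log(2 + "2");   // "22" (same again; order makes no difference)
```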
"In a loosely typed language (like JS, PHP, etc...) how does the computer know if the programmer wants to concatenate or sum two values?"
Easy ..... if you use an addition operator between the operands, it should try to add their numeric manifestations; but if you use a concatenation operator, it should try to concatenate their string manifestations.
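JavaScript has no separate concatenation operator, but the programmer can still say which operation they mean by converting explicitly — a minimal sketch:

```javascript
const a = "2"; // a string that happens to look like a number
const b = 2;

// Force arithmetic: convert both operands to numbers first...
console.log(Number(a) + Number(b)); // 4

// ...or force concatenation: convert both operands to strings first.
console.log(String(a) + String(b)); // "22"
```

Either way, the ambiguity the comment describes disappears because the intent is written down.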
Well no, it doesn't.
They behaved completely differently.
In the end it turned out that if the numerical stuff was buried in a conditional, Firefox took it as a number but IE took it as a string.
It took me hours to find a way to get IE to treat it as a number.
I HATE weakly typed or untyped languages almost as much as I hate Pascal.
Possibly most of the internet is now free of servers displaying today's date as January 14th, 19119. (It's a Millennium Bug thing. Ask your legacy team.)
http://snapahead.freeservers.com/ reports that as of that date "I don't have many mp3z". Right-click to see page source produces a message "I don't want U to steel from me!"
It's like tasting a madeleine cake.
And why do you suppose someone would prefix the year (which, coming from a Unix timestamp, has had 1900 subtracted from it) with "19" -- which was bound to break come the year 2000 -- as opposed to the obvious and future-proof method of just adding 1900 to it? Might this possibly have had anything to do with the relative difficulty of adding numbers versus concatenating strings?
Because for about 28 years people were used to computers returning a 2-digit year.
When Y2k came about, some people expected 99 to become 00 and others expected it to become 100.
That is the root of many of the foreseen problems. It also stems from not having much memory/storage, so optimising down the number of digits stored.
My small company just got bought by a much larger company. The amount of spam coming from inside the larger company is prodigious. The nonstop (largely inapplicable) training, cheerleading, phishing awareness testing and automated compliance emails are enough to drive one to drink. One wonders how we have been able to flourish without all that "stuff" for so long...
Anon, because one can't be too careful...
This is why test/development environments and production environments should never be the same network.
People make mistakes. People pick the wrong boxes. People forget to tell everyone when updates are made. People fuck up. If you put some kind of protection between the people and production, you reduce the risk of this type of thing.
And yet so many businesses, organisations, banks and government departments are moving back to everything on the same network or in the same cloud. Expect to see a lot more "whoopsies" in the press...
"Identical access control on development and production systems? What could possibly go wrong?"
"This is why test/development environments and production environments should never be the same network."
It's quite possible and sometimes desirable for developers to be connected to multiple environments at one time, e.g. DEV and QA and/or UAT. It's possible and sometimes desirable to have multiple instances of the same application open at the same time, each pointing to a different environment and accessed using different credentials.
Of course best practice requires checking which environment I'm in before I run any script or execute any code, but sometimes it doesn't happen. What helps me is a development environment background that changes colour depending on which server environment it's connected to. And even that sometimes fails.
The only way to be completely sure is for devs simply not to have access to Prod, and to have change management implement stuff in Prod... and that brings its own host of issues.
"If you can find a book of four-figure mathematical tables somewhere, and someone who still knows how to use them, try to multiply 2 * 2 using logs."
Oh wow, that's a blast from the past. I'm of the age (in the UK at least) where we learned with books of log tables but ended up using calculators by exam time. Log table books were still provided, though; not everyone had calculators. And you still had to show your method/working out, so the calculator was little more than a tool to remove the drudgery of the arithmetic and act as a look-up table for Log/Sin/Cos/Tan.
I'm old enough to remember log tables and slide rules for my maths exams.
Nice thing about log tables were the pages of equations at the front of them, and the loads of space for putting any other cribs you wanted to remember. Try putting all that useful info on the back of a calculator.
I had all of the equations needed, and often some example sums, in my TI-83 graphing calculator when finishing high school. We couldn't be forbidden from using that, as it was an allowed aid by official rules :) Barely used it though, as it turned out I usually just did my homework and actually learned stuff. The programming cable paid for itself and some snacks though. Not all students bothered.
That's not really approximation, just inefficient representation.
0.9999..... == 1.0
Ergo 3.9999...... == 4
'Proof' (probably much over-simplified, and possibly not really a 'proof' at all): if there are infinite 9s after the decimal point of a value that you wish to increase to the next whole number, then you need infinite 0s before the 1 to be added; but if there are infinite 0s, then you can never reach the point where you can place the 1. If you cannot express the difference, then there is no (meaningful) difference, other than the representation.
By extension I suppose n.0000.....N is always also equal to n.0
Mines the one with reciprocal ∞ in the pocket.
Specifically, when you're doing mathematics with infinite decimal places then the difference between 4.0 and 3.99999999... is 0.00000000... which obviously is 0. So 4.0 and 3.99999999... are the same number written two different ways.
Personally I distrust this infinite stuff, but I'm comfortable imagining someone starting to write out the digits of the number, and never stopping. But there's no question of getting to the end; there isn't an end.
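The standard textbook version of the argument above, for anyone who wants it tighter, avoids the infinities by subtracting the number from ten times itself:

```
\begin{aligned}
x       &= 0.999\ldots \\
10x     &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots = 9 \\
9x      &= 9 \quad\Rightarrow\quad x = 1
\end{aligned}
```

Multiplying by 10 just shifts the decimal point, and the infinite tails of 9s cancel exactly, so no "end" of the expansion is ever needed.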
While mathematically speaking 3.999999.... == 4, using a computer to perform the comparison will indicate a difference. (After all, computers can't do infinite-precision floating point...) So, if your code contains if (x==4) and x is 3.99999999999, the result is false, not true. It's always best to round to a couple of decimal places beyond the smallest expected difference (if your values run from 3.5 to 4.5, four decimal places is plenty) and THEN do the comparison.
The funniest I ever saw was a routine that just consisted of a set of printf statements
"You shouldn't be able to get here."
"Well I did!"
"I don't know. You wrote the code. Where should I have been?"
And so it went on. I can't remember the rest but it was hilarious.
Years ago I put some "debug" code in a project I was working on. When I went on 2 weeks leave I told my project manager not to demonstrate the project to the end users as it was unstable and might not work.
When I got back from leave the PM called me into her office and chewed me out. It turns out she had, against my instructions, demonstrated the incomplete project to the users and it had hit my debug code. It was testing a nested if statement and I was unsure of some of the conditions (the specs were a little vague). When it hit the debug code it displayed on the screen "OK Turkey, how the hell did you get here?". Different branches of the If statements gave different messages so I could determine what was happening.
Mind you, the PM never again demonstrated any of my projects unless I was there or I had said it was ready for testing.
Story told to me by a grizzled engineer (so not my "Who, Me?") long ago... a team of developers were setting up a control system for newly installed electronic roadway signs, the kind that span overhead across multiple lanes of highway. Not knowing what the inaugural message should be, from their enclave in city hall they opted for "GO [local sports team]!!!". Cue a huge, hours-long backup and a visit from a VERY irate city manager...
Decades ago I was reviewing somebody's code which had (something like) this:
switch (intvar & 1) {
    case 0: /* do something */ break;
    case 1: /* do something else */ break;
    otherwise: printf("Something went wrong");
}
It took a lot of patient explaining to get across why the 'otherwise' clause would never be executed, and years later I still think it didn't sink in.
I'm in the habit of handling all possible values in a CASE statement ..... and then having an ELSE as well.
This gets populated with a variant of a Hex error message, such as the famous "Out of cheese" error.
The idea being that if the error ever activates then someone will come and let me know that it happened. Normally with either a big smile or a look of confusion on their face.
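That defensive pattern, sketched in JavaScript — the function name and error message are made up for illustration:

```javascript
function lowBit(intvar) {
  switch (intvar & 1) {  // can only ever be 0 or 1...
    case 0: return "even";
    case 1: return "odd";
    default:             // ...so this "can't happen" branch exists purely
                         // to make an impossible state loudly visible.
      throw new Error("Out of cheese error. Redo from start.");
  }
}

console.log(lowBit(6)); // "even"
console.log(lowBit(7)); // "odd"
```

Unlike the 'otherwise:' label in the anecdote above, a real default branch is reachable in principle, so if the surrounding logic ever changes, the error fires instead of silently falling through.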
Years ago, a friend wrote a program in Pascal that had as its first three lines:
tmp := TRUE;
TRUE := FALSE;
FALSE := tmp;
I don't know if it did what he expected but it was a bit scary. The rest of the program did some very cool stuff that shouldn't have been possible on the minimal OS on top of a batch system (old CDC Cyber 175).
Biting the hand that feeds IT © 1998–2019