Reminds me of Blind Faith by Ben Elton.
416 posts • joined 4 Sep 2009
That's pretty much what I was thinking. I saw "compact, fixed-length encoding" and my bullshit-o-meter hit a 9.0 and I thought "You mean a big in-memory array, you twat!".
I'm a grey-beard and reserve the right to be a miserable twat in an office full of 20 something graduates that don't know what machine code is.
Ooh! Did somebody say Forth?
Rather cynically, I see true cross-platform .Net as an attack on the multi-platform blockbuster that is Java, and therefore an attack on Oracle.
The fact that SQL Server can now run on Linux is probably not keeping Larry awake yet, but it could be in a little while...
...but I think you missed the big whopper:
"among other things"
AMONG OTHER THINGS? WTF is that supposed to mean? How should one interpret that one in a court?
Wouldn't have happened if the firmware was written in Java. It would have crashed, sure, but the buffer overrun would have been caught by the JVM.
C and C++ are great, and certainly the best choice if performance is a major factor, but the freedom that comes with C and C++ requires responsible coding, and a devotion to quality checking. I'd argue that in a webcam server app, performance is not the major factor. As long as it can stream the video in real time, anything else is kind of superfluous. A higher-level, strongly typed language with dynamic run-time checking might be the better option for developing software of this nature. As I say, it wouldn't stop the buffer overrun, but it would catch it, and Java Embedded can be set to reset/reboot a unit if the watchdog isn't fed regularly.
Heck - even C and C++ would have been fine if the quality hadn't failed at two or more layers: the initial development layer (don't they have shop rules about this stuff?) and the quality/testing/review layer.
There's really no excuse for this in 2016. We have the tools to prevent this, and we have the knowledge of other people's mistakes. What some people appear to lack is pure good old fashioned common sense.
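The run-time-checking point above can be shown in a few lines. Java is the language named in the comment, but the principle is identical in any memory-safe language; here is a minimal Python sketch (all names invented for illustration) of an out-of-bounds write being caught as an exception instead of silently corrupting memory:

```python
# Minimal sketch of the run-time-checking point above. In Java the same
# bug raises ArrayIndexOutOfBoundsException; in Python it raises
# IndexError. Either way, a supervisor can catch it and reset cleanly
# instead of the device wandering off into corrupted memory.

def copy_frame(dest, src):
    """Naive copy with no length check -- the classic overrun bug."""
    for i, byte in enumerate(src):
        dest[i] = byte  # raises IndexError if src is longer than dest

buffer = [0] * 4
oversized_input = [1, 2, 3, 4, 5, 6]

recovered = False
try:
    copy_frame(buffer, oversized_input)
except IndexError:
    # In C this write would have scribbled over adjacent memory; here
    # the runtime contains the fault and we can log it and reboot.
    recovered = True

print(recovered)  # True: the fault was contained, not silent
```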
Of the Gov. to rate my pr0n for me. Saves me having to do it ;-)
I don't pretend to understand the low-level technical concepts of what they are wanting to do here, but I get the gist, and it does lead me to wonder: Microsoft has had a very good CPU-independent program execution engine for at least 16 years: the Common Language Runtime. I continue to be surprised that they have only ever considered it a platform to run applications on. Had Microsoft invested in making parts of the operating system *itself* run on the CLR (or maybe some special version of it), we'd already be years into developing CPU-agnostic applications that could run effortlessly regardless of the underlying CPU. ARM or Intel. As application consumers we simply wouldn't care. Sure, some CPUs would be better than others, but CPU manufacturers would have developed new devices specifically to target the environment, maybe running the CLR instruction set (or a subset) as native on-silicon instructions.
I think Microsoft are 10 years behind where they should/could be on this issue. Instead, they're faffing around changing the user interface (flat GUI, I'm looking at you) with each operating system release, polishing the same turd over and over.
I think the obsession with always-connected, mobile computing has pushed progress (not necessarily innovation, but certainly progress) back significantly.
Reading the article, the last paragraph reads like it's been inserted either as an afterthought, or by a different writer.
Windows 7 is *excellent*. Why would I want to change it? I have Windows 10 on my work PC, and, well, it's okay I guess, but the flat user interface just leaves me asking "why?", and continually moving things around in the OS (how many times has the freaking control panel been revamped over the years? Stop mucking about, MS, FFS) requires me to puzzle-solve, instead of getting my work done.
No persuasive reason to upgrade to 10, sorry. ESPECIALLY if you have an oldish machine. My personal laptop is a 32-bit Toshiba Tecra M5 with 4GB RAM and a 256GB SSD. I bought it in 2005 IIRC. It runs Win 7 beautifully.
It also runs Linux Mint beautifully (dual boot), which will probably become my home-use OS at some point in the future, as it's getting easier to install applications in Linux (still a bit of a ball ache though, compared to windows) and there's a great range of free software available that does everything your average home user needs and a whole lot more.
I'm still waiting for printer manufacturers to develop printer drivers for Linux!
I know nothing about ISPs so this is a genuine (probably naïve) question:
Don't ISPs analyse their traffic in some way? I mean, is there not some analytics that goes "Hmmm this IP address is suddenly sending a metric fuck-ton of pings/http gets/DNS lookups per minute, which is not regular for this user. Looks like he's (probably unwittingly) contributing to a DDoS. Cut him off until he phones us"?
Or is that illegal or something because it would mean inspecting the users data? If that's the case, just get GCHQ to do it.
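What the question above describes is basically per-source rate anomaly detection. A toy sketch of the idea, with all thresholds and numbers invented for illustration (real ISP analytics work from sampled flow records like NetFlow/sFlow at vastly larger scale, not per-customer counters like this):

```python
# Toy per-IP traffic anomaly check, sketching the idea in the comment
# above: flag a customer whose request rate suddenly dwarfs their own
# recent baseline. Thresholds and data are invented for illustration.
from collections import deque

def is_anomalous(history, current, factor=10, min_samples=5):
    """Flag a source if its current requests/minute is `factor` times
    its recent average -- e.g. unwittingly contributing to a DDoS."""
    if len(history) < min_samples:
        return False  # not enough baseline to judge
    baseline = sum(history) / len(history)
    return current > baseline * factor

# A home user normally doing ~20 DNS lookups a minute...
recent = deque([18, 22, 19, 21, 20], maxlen=60)

print(is_anomalous(recent, 25))    # ordinary variation: False
print(is_anomalous(recent, 5000))  # sudden metric fuck-ton: True
```

The awkward part, as the comment suspects, isn't the maths; it's whether inspecting the traffic to build the baseline is lawful in the first place.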
I think iPhone fatigue, and smart-phone fatigue just about sums it up. The release cycle of new hardware is far more frequent than the requirement for the average person to upgrade his/her hardware. The whole thing is propped up by operators pushing "free" upgrades to their customers.
Samsung are releasing new models, what, every year?
I'm still using my Galaxy S4 FFS! There's nothing wrong with it.
I don't understand it. I must be too old.
I did register. Downloaded the official Twitter app to my phone. Couldn't work out the user interface at all. If a UI requires puzzle solving then it's shit.
So this is Android now on the long tail?
I won't bother installing Android Studio then... :-/
This guy is a dangerous idiot who likes the sound of his own voice.
"Now that's Windows10 installed, now, while it's downloading updates i'll just go and throw my perfectly servicable and fully functional printer in the bin, and go and buy another one."
The above sentence is not echoing throughout the living rooms of the land. If the HP execs thought it was ever going to, then they're a particularly rare type of stupid and have no business (ha!) being in the positions that they are in, earning the money that they do. They're as dumb as a box of rocks.
If DARPA / BD are building that as a "rescue bot" then I've got some prime beach-front holiday homes in Fukushima to sell you.
Sorry about that.
Putting aside the political issues between the USA and Russia (which are inevitably going to spill over to Microsoft if it wants to do business in Russia), they took a perfectly good, if not excellent, operating system and wrecked it. They wobbled with Windows Vista but managed to get firmly back on track with Windows 7, which remains a great OS IMO.
Then they fucked the whole thing up with Windows 8, and doubled down with Windows 10.
They did it to themselves.
I currently run Win7, which will be my last MS OS. I'm dual booting with Linux Mint. It's taking a lot of getting used to, but it seems to do pretty much everything I need/want to do.
Forth has been doing it the exact way he says is terrible for 40 years. When writing an application on an embedded Forth system, the application and the kernel are at the same level. There's no protection whatsoever. You're free to f**k up with your poorly written software in any way you want.
Forth has been used in countless space experiments on the shuttle and other space systems for decades. IIRC 10 of the 12 CPUs on the Philae lander and orbiter were Forth CPUs. It's also been used in most of the world's observatories (controlling radio telescopes) for years.
Forth is an amplifier. Badly written code shows up real fast as badly written code. However, you *can* write code right on the hardware and it can work just fine. It just takes discipline and good procedures and management. It can be done. It has been done.
All that said, he has a point. These walls between OS and application software are necessary, because software *is* buggy, and software does crash. OSes are buggy too. Part of the problem is simply down to the complexity of modern OS and application software. When a Swing library in a Java program is rendering its window on the screen and painting its buttons, putting text in a text box etc, how many levels of abstraction are there between it and the graphics hardware? A thousand? Two thousand?
If we want more reliable software, we have to write simpler software.
Forth, which is still around and still used, takes all that away. It is simple enough that (as in my case) the entire workings of the Forth kernel can be understood and held in the head of one person (I should know; I wrote my own Forth system) and, by extension, the applications written in it, too.
To be fair, the applications written in Forth are vastly simpler than those written on contemporary PCs. We tend to write on the metal, in deeply embedded or industrial control environments, where software can be much simpler, and the only code in memory is the code that *you* put there, because it is specifically needed for something that you understand. PCs have the entire kitchen sink in memory and any one of them could go wrong.
Your boss was a twat of the highest order. And a crap coder, too! You're better out of it.
I have a couple of cracking anecdotes to share, including a 'men in black' type moment that happened in Singapore, but I don't know where to send them.
I reckon you'd need a SERIOUS change of underwear if you happened to be working in there when that went off!
would find this funny...
"The agency sends 180,000 letters and emails, and dispenses $290m, every day. System 204's doing that handily, but has reached the point at which meaningful changes are becoming tricky."
You know what, if it's still ticking along, I'd be tempted to leave the bloody thing exactly where it is. I don't know the system, whether it runs on old antiquated hardware etc, but I'd probably be thinking in terms of upgrading the hardware platform that the current system runs on (perhaps virtualisation, or emulation of the hardware), but try and leave the software itself alone. It's from the 80s, so is it an old VAX system or something?
As usual, Wikipedia has the answer:
Model 204 is a database management system for IBM and compatible mainframe computers, "born" 13 October 1965 and first deployed in 1972. It incorporates a programming language and an environment for application development. Implemented in assembly language for IBM System/360 and its successors, M204 can deal with very large databases and transaction loads of 1000 TPS.
I would have thought IBM would have a migration path for old IBM System/360 stuff. Are they not the undisputed kings of backward compatibility? This would be an interesting project.
I can just imagine the utter chaos that a new system (from the ground up) would bring, though. In fact, I don't even want to think about it!
Edit: There is emulation available: (from the wikipedia page):
Database Programmer's Toolkit is a freeware PC-based emulation.
Haven't turned up any links though.
...and then fire everyone? Just to make a point.
Shame we can't buy laptops that have no OS at all installed on them...
When a company issues a takedown notice, they should pay £1000 into escrow. If the takedown is deemed legitimate, they get their £1000 back. If illegitimate, the £1000 gets paid to the offended party.
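The incentive in that scheme is simple enough to sketch. A toy Python version, entirely hypothetical (names, balances and the ruling mechanism are all invented; it's the payoff structure that matters):

```python
# Toy sketch of the escrow idea above: the notice issuer deposits £1000;
# the deposit is refunded if the takedown is upheld, otherwise paid to
# the party whose content was wrongly taken down. Purely hypothetical --
# this illustrates the incentive, not any real system.

DEPOSIT = 1000

def settle(notice_legitimate, issuer_balance, target_balance):
    """Return (issuer_balance, target_balance) after the ruling,
    assuming the issuer has already paid DEPOSIT into escrow."""
    if notice_legitimate:
        return issuer_balance + DEPOSIT, target_balance  # refunded
    return issuer_balance, target_balance + DEPOSIT      # forfeited

print(settle(True, 0, 0))   # (1000, 0) -- legitimate takedown
print(settle(False, 0, 0))  # (0, 1000) -- bogus takedown costs £1000
```

Speculative takedowns become a £1000 bet per notice, which is the whole point.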
Synclavier. Bring back the Synclavier. What a machine that was...
...if everyone just simply stopped using the internet?
You know, if everyone just said "fuck it" and went back to effectively much simpler times and means of communication?
What would happen to Twitter, Facebook, and all those other "social media" spy networks? How would the government get their intelligence then? If we stopped emailing each other...?
I know it's a dream, but I wonder what the governments of the world would do then - if the networks just went quiet? They'd likely shit themselves, terrified by the notion that they don't have a feel for what ordinary people, and ordinary families are doing in the privacy of their homes.
I don't even know where to begin. It looks dumbed down like the BBC news website, and absolutely devoid of any character whatsoever. I can't even right click a link to open it in a new tab FFS. It's like something out of the 90's.
So el reg, it's been great, but I think it's time to move on.
Ah, thanks for the clarification. Yes, that does paint a rather different picture. Thanks.
@Otto - there's so much wrong with your post that I can assume you're trolling, or a fanboi, or both.
Assuming the information in the article to be accurate, you have (conveniently) completely ignored the fact that Apple released a software update that deliberately broke a competitors software. Not only is it illegal, it is completely untenable, and is demonstrative of the absolute lack of ethical and moral fibre that is endemic at Apple.
I'm glad I've never given Apple a penny of my money.
I think they're really missing the point of the Speccy with the exclusion of a keyboard. It's true that as kids we all played millions of games on the rubber-keyed wonder, but a lot of us also *programmed* the bloody thing, both in Sinclair BASIC and Z80 machine code. It's actually the reason why a good chunk of us are employed in IT today, and the reason why we're here on this website reading the article.

The inclusion of a keyboard would be an excellent opportunity to use it for programming, and, more specifically, would make it an excellent computer to sit our children in front of to teach *them* programming. The problem with PCs and the like is that it's difficult to capture a child's imagination with respect to actual programming, since the things are so complicated. A kid in front of a Speccy could pick it up in minutes. Show them how a FOR/NEXT loop works and then show them moving an asterisk across the screen. Show them INKEY$ and how to read the keyboard and move the asterisk around. Show them UDGs, PAPER colours, INK colours and they'd be totally hooked. Since it has an SD card slot, they wouldn't have to slum it with cassette players like we did as kids.
This *could* have been a Pi beater (in terms of engaging the young in programming), but they aimed low and went for nothing more than a nostalgia games machine that will only have limited appeal to those that owned one as kids.
They missed it IMO.
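To show the scale of that first lesson: the original would of course be a few lines of Sinclair BASIC with FOR/NEXT and PRINT AT, but even a rough Python analogue (invented for illustration) of "move an asterisk across a 32-column screen" fits in a handful of lines:

```python
# A rough modern analogue of the first Speccy lesson described above:
# a FOR/NEXT-style loop moving an asterisk across a 32-column "screen".
# (The real lesson would be Sinclair BASIC on the hardware itself;
# this just shows how little code it takes.)

WIDTH = 32  # the Spectrum's 32-column text screen

def frame(position):
    """One 'screen line' with the asterisk at the given column."""
    return " " * position + "*" + " " * (WIDTH - position - 1)

for x in range(WIDTH):  # FOR x = 0 TO 31 ... NEXT x
    line = frame(x)
    # On the real thing you'd PRINT AT 0,x;"*" and watch it move;
    # here we just build each frame as a string.

print(frame(0))               # asterisk at the left edge
print(frame(10).index("*"))   # 10: it really has moved along
```

Five minutes of that, then hand them the keyboard-reading step, and you've got them.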
Where's the bloody keys?
We need Q, A, O, P and SPACE for a start.
Also, we need J (for LOAD) and Symbol Shift, and P (for "") and of course ENTER!
(cyan/red alternating border... ahhh... sigh...)
It costs £30. Live with it. Most folk would spend £30 on a Friday night family pig-out down at Domino's pizza and not bat an eyelid, and it's all going down the toilet a few hours later. Suddenly when their super-cheap TV dongle isn't absolutely perfect it's the end of the world.
Plod asked Vodafone for phone records pertaining to one individual, a journalist. What they received was a big list of lots of journalists' phone records.
How does/did Vodafone know the occupations of their customers to the extent that they can say "Dear plod, here's a list of ALL of our journalists' phone calls"? Upon what criteria did they collate the list? There's a lot more to this than meets the eye. It would suggest that their database is more granular than one would otherwise assume. Just how much information are they storing?
I'm not really worried about the government reading my soppy WhatsApp messages to 'er indoors, they can just ask me if they want to see them! However, it's cool that messages are no longer travelling through the air in clear text, if only to stop opportunist criminals in airport lounges and the like. I know very, very little about encryption, but I'm led to believe that security measures such as WEP encryption on Wi-Fi networks are easily breakable, so it would be easy to sit in an airport lounge and sniff up all manner of data. Email is probably the worst offender, being based on ancient clear-text protocols.
Isn't MI6 supposed to be concerned with protecting its citizens from threats that are external to the UK, like ISIS and stuff?
If some perv has logged onto your IP camera and is watching your kids sleep, it's hardly a job for MI6 is it? Disgusting as it is. It's a job for the police.
WOLF. THERE'S A WOLF!
* IBM... Long slow decline
* Sun... Long slow decline
* Microsoft... Beginning a long slow decline (IMO) - fighting hard to stave it off (to their credit) but a long slow decline nevertheless
* EDS... Long slow decline
Now, I think, we're beginning to see the same happen to Oracle. It'll take a long time I think, I mean, they're bringing in 8 BILLION a quarter. That's seriously impressive. However, ultimately, I think Open Source databases that are "good enough" are going to win out. Companies are beginning to seriously evaluate alternatives, being offended by lock-in style selling practices and punishingly expensive rates. I think they've had enough. There's a whiff of software revolution in the air.
Never 'eard of it. The 80's 8-bitters were my thing. Playing games on a PC just didn't feel the same to me! Give me a Speccy, or a BBC B, or a C64 or a... well, just about anything except a PC. That said, there have been some awesome PC games. I'm particularly into anything driving related. F1, NFS etc.
Game on :-)
Open Software often sounds like communism to me. Nobody owns it. Therefore, nobody gets paid for it. Therefore it's not of any (real) value to anyone at all. It's communism in software.
It would be funny if someone offered to buy *his* business for a few million. It's funny how communists suddenly turn into capitalists when a big fucking cheque is waved in their face!
"Where rival ARM's shift from 32-bit ARMv7 to 64-bit ARMv8-a involved rewriting chunks of its instruction set and forcing some low-level engineers to learn a new assembly language, MIPS64 is basically MIPS32 with instructions for using 64-bit-wide data, and it runs MIPS32 code without a mode switch."
I instantly thought of the Data General Eagle and came across all warm and fuzzy.
I worked in Tashkent for a while. A very nice city and nice people. There are no taxis. You just put out your hand and someone will stop and pick you up. Maybe they're on their way to work, or on their way home. Maybe they're just nipping out to do some shopping. As long as your destination isn't too far out of their way, they'll take you for an agreed price.
Sounds terrifying - but worked each and every time for me in Tashkent and Bukhara!
The Uzbeks have the ultimate Taxi business model - no taxis. No middle-men whatsoever.
North Korea do the same things as the USA and the UK and <insert country of choice here>.
The web is indeed..... shit.
My online footprint is tiny. Maybe five or six websites. The rest is just shit. What makes me sad, though, is what it has done to society. It really has turned people (particularly the younger generation) into attention-seeking cabbages that appear simply unable to function without their umbilical connection to the web, and the ego boost that their online "friends" appear to give them.

I swear, a friend of my wife who recently went on holiday to Spain spent hours and hours and hours posting pictures of her holiday - and I'm not talking about beautiful panoramic vistas, valleys, mesas and whatnot - no, I'm talking (literally) about pictures of bottles of wine, plates of food, empty plates of food, sun loungers, you name it. It's all "look at me, look at me". I'm lucky enough to own a Porsche Cayenne. She actually took a selfie of herself leaning against my car and posted it on Faceslap. What's wrong with people?

It seems that instead of people instinctively going and doing something (whatever that something may be - visiting a library or whatever), they now go and do things with a view to impressing an imaginary "audience" that is constantly (not) following them around and constantly (not) hanging on every word and funny ha-ha post that they make. And their so-called "friends" are doing it too, all trying to one-up one another and boost their own collective egos, since they can't *really* face up to the fact that they lead a rather ordinary, humdrum life. They want to be "celebrities".
A friend of mine posted the other day that he was "chilling with a glass of wine and his Gibson guitar". Why? Around 15 people "liked" it. Why would he do that? Before Facecunt we didn't call all our friends on their mobiles and tell them we were just about to go and get a Chinese takeaway, and we didn't take a picture of the empty takeaway cartons and dirty plates and post them in the mail to them. We didn't even *email* them to our friends. Because... why would we? But get a Facetwat account and you instantly turn into a fucking attention-grabbing twat.
"It's another example of the fascinating two-caste system at Wikipedia: the workers who put in long hours writing and maintaining the content are paid nothing, while the wealthy administrators at the Foundation devise schemes to spend the cash - to burn through the millions of dollars the charity raises every year."
What you just described is communism. All is not well in the Wiki Duma, comrades.
This is getting into Boffinry territory, and nae a white lab-coat to be seen! Well done!
In the UK these two particular routers must easily have the lion's share of the market. Were they tested? If not, why not? I appreciate they are UK-only, but there must be millions of 'em out there. If they're ownable, we (and the manufacturers) should know.