* Posts by btrower

555 posts • joined 9 Nov 2011

AMD demos 'Berlin' Opteron, world's first heterogeneous system architecture server chip

btrower
Silver badge

Crossed fingers, but...

I am an AMD supporter from way back, but recent times have been pretty grim. Unless something dramatic happens I will be switching to Intel CPUs on my own workstations or maybe even something like ARM.

What I would like to see is a Lego-style system that allows natural additions of whatever you need be it CPU, GPU, RAM, DISK, etc.

In the 21st century we should not be stuck upgrading entire systems or scrapping perfectly good portions of systems because one portion of the system is not up to snuff.

We keep bashing our heads against artificial architectural limitations because of the single moronic meme "That should be enough. I can't see why we would need more." I understand that for practical reasons it is difficult to produce 6 bit vs 4 bit, 8 bit vs 6 bit, 16 bit vs 8 bit, 32 bit vs 16 bit, 64 bit vs 32 bit, etc, but why are architects unable to see the pattern there? The fact that a chip designer just flat out cannot imagine why I would want a 16K bit register should not be my problem. Their lack of imagination should not be permanently baked into the architecture. Perhaps the implementations need to be crippled due to practical considerations, but the architectures and the APIs should not be dragged down too.

Yes, there are timing and EM considerations, heat dissipation problems, fundamental limitations due to the speed of light, etc, etc. We may *never* be able to physically build some systems but our architectural designs should still not be assuming failure in advance.

I have seen people argue strenuously in favor of the GoF 'singleton' as a valid pattern rather than a corrupt extension of global variables. It is a bad pattern from which significant evil flows: it breaks scope by definition and hard-wires a specific cardinality (typically 1) that creates havoc in future designs. Examples: mouse, keyboard, window, screen, desktop, CPU, thread of execution, directory, disk, etc. All of those have been baked so deeply into architectures that the breakage continues to this day. If something is architecturally designed with a perfectly artificial limit due to the lack of imagination of the architect, it will eventually break.

We should have something akin to an architectural cloud whereby implementation and architecture are deeply separate such that scaling out to address spaces in the trillions of Yottabytes and well beyond is no problem.

There has been in the past a frighteningly moronic argument that we won't ever need addressing beyond a certain point because it exceeds the number of all the particles in the universe. That made sense to too many people for my comfort. That which we can specify is effectively without bound. If your math microscope is powerful enough you can count more points than the number of particles in the universe between 0 and 1, 0 and 0.1, 0 and .000000000001 ... carve it as fine as you please and there are always more points there. There is a simple counting argument for things like this: if all the particles in the universe number X and I wish to specify X+1, I need an address space larger than X. The argument can be repeated ad infinitum. We don't have a rule that the counting numbers stop at a googolplex, because that would not make sense. All the artificial limits in the computing architecture universe make no more sense than specifying a particular 'top number' for counting beyond which you have to stop.

</rant>


Dropbox defends fantastically badly timed Condoleezza Rice appointment

btrower
Silver badge

Re: Alternatives -- there are none

Owncloud *is* preferable in that it can be entirely owner controlled. However, the security advisories show that it is not locked down enough to give me confidence.

http://owncloud.org/about/security/advisories/

I do not consider myself to be a true security expert, but it has been more than twenty years since I designed and wrote a secure dial access system for a Canadian bank, including the underlying encryption system. It protected billions of dollars of assets, passed a rigorous test and audit regime and logged thousands of man-years of use without a breach. [For security reasons the Assembler code at the mainframe host end was coded by production staff using my spec (development people cannot access production systems, ever).] Related code written by me is used in all sorts of secure applications across the world. Security concerns are a part of my nearly twenty-year-old research project on 'data packaging'. For the most part, I at least understand the security advisories...

[Saying I am not an expert is not false modesty. Security is a very deep subject that even full time people obviously struggle with. Peter Gutmann is a security expert. I am no Peter Gutmann.]

It is my opinion that to be secure a system must be distributed, with multiple independent loci of control, such that even the owner doing the encryption cannot access any type of raw storage. No single agency should be able to shut the system down or inspect or influence it in any meaningful way. The very weakest link in such a system should be the originating node, beyond which even colluding trustees cannot determine where data came from or where it is stored, let alone get into the system.

As of now, even for secure local storage, it looks to me as if a disrespect for entropy requirements renders just about everything vulnerable to attack by a well armed adversary like the NSA.

Encryption can be made provably secure in theory. In practice security is a probability only and current methods render the probability of remaining secure laughably small.

People who do not take an active interest in this area can be forgiven for thinking I am wrong. People who *do* take an active interest know that I am right and this is not news to them. Our current systems are essentially vulnerable by design. There are just too many unnecessarily low barriers for this to be accident or incompetence. As I say, I do not consider myself a security expert as such. If I am able to point out multiple long-standing flaws in our systems, you can be assured that true experts know this well.

Digital security is still security. You would never design a bank vault with a single combination and then trust any single person with the combination. Our current systems are equivalent to letting the same individual design the vault, hold all the combinations and keep keys to the bank, then leaving him alone over a long weekend when you know he has a criminal record, specifically for robbing vaults, and is part of a crime family that specializes in bank jobs.

If you do not consider the situation described in the above paragraph secure, then you do not consider any part of the Internet secure because the analogy in the above situation applies in every detail.

This puts me in mind of an old joke told to me by a coworker at the bank mentioned above:

How do people in Hollywood say "Fuck You"?

"Trust Me".

btrower
Silver badge

Like a sick joke

The best you can hope for is that this move is incompetent. There is no way to rationalize this move. I don't have anything of value on Dropbox but if I did I would be moving it pronto.


Russian deputy PM: 'We are coming to the Moon FOREVER'

btrower
Silver badge

The race is on!

Things are not what they were in the 1960s. A concerted effort to occupy real-estate on the moon could definitely be a go.

Who owns it? Is it like the Ocean or is it fair game for the first one to plant a flag?

I think Canada should get in on this. Build a rail-gun the length of the prairies and just keep sending up cargo and robots to terraform the joint with maple trees. Think of the solar farms you could put up there. Nine billion acres at a thousand bucks an acre could start to add up to real money.


FTC gets judicial thumbs-up to SUE firms over data breaches

btrower
Silver badge

It's a stumper

Hate the notion of the state extending its power even a shred more. On the other hand, we need slap-downs like this to get people to take this stuff seriously. I have *three times* had money swiped through compromises caused by the negligence of financial institutions. One was fixed in a few days, one took a year to get my money back and one is pending my filing a lawsuit to get my money back.

Companies do this stuff because they offload risk to their customers. Looks like the jig may be up for some of them.


NSA denies it knew about and USED Heartbleed encryption flaw for TWO YEARS

btrower
Silver badge

Time to fess up

There is a programmer out there who made the first commit with the bug in it. Who is it? They need to come forward and state for the record whether or not it was done under orders.

As much as I favor the tinfoil hat, my instinct is that this is just a bug. Even so, the NSA absolutely has dirt all over its hands when it comes to the state of network security.

Regardless of their role in Heartbleed I am quite convinced that there are a large number of 'law enforcement' types that belong behind bars.

I am not sure how you shut down a military industrial complex backed by years of half-trillion dollar budgets and sitting on weapons that can destroy the world, but maybe we should try before they start weaponizing graphene.


Facebook: US feds probed over 18,700 accounts in six months

btrower
Silver badge
Joke

Decoys

This is pure disinformation. A copy of all of Facebook and its history going back to the beginning is resident on government servers under a mountain somewhere. This is just a bunch of smoke to make it look like they are only accessing the data of this tiny fraction of users. They probably sprayed coke -- liquid *and* powder -- out of their noses when they were cooking this announcement up.


Not just websites hit by OpenSSL's Heartbleed – PCs, phones and more under threat

btrower
Silver badge

Re: The real bug

@MacroRodent:

Correct. Upvote for you.

I have been programming for decades. I could easily mess up a bit of code tomorrow. In fact, I think I shall.

btrower
Silver badge

Re: Even worse than I thought

Re: "Key length is irrelevant to this, if the key is in memory then it is possible to grab it."

If I understand how the vulnerability works, a 1,048,576-byte key would be very difficult to obtain with this exploit. One of the RSA inventors spoke about using objects on the order of a terabyte at one point, precisely because a large object hobbles certain types of attacks:

"I want the secret of the Coca-Cola company not to be kept in a tiny file of 1KB, which can be exfiltrated easily by an APT," Shamir said. "I want that file to be 1TB, which can not be exfiltrated."

Your statement illustrates why our security is so tragically broken. For end-to-end security to be sound, *both* this type of security hole *and* the keys have to be dealt with. Trivial one-byte keys present a barrier so low that this exploit is not even needed. Non-trivial multi-megabyte keys raise the key barrier high enough that it is not a profitable point of attack.

One high barrier does not make the whole thing secure. But one low barrier can make the whole thing insecure.

To be secure, all the barriers have to be strong enough to render attack infeasible. With IPv4 in place, you can scan entire subnets by brute force looking for a vulnerable IP address. With IPv6 that is significantly more difficult and, if configured correctly, effectively impossible.

Security depends upon a lot of different things, nearly all of which are in a poor state of repair in our systems. Key lengths are but one of those many things.

There is no sensible reason to build our systems with key lengths constantly on the edge of vulnerability. Any older backup of some dire secret that is hiding behind DES is trivial to crack.

I might be wrong, in which case, using a longer key length has no impact on security. On the other hand, you might be wrong in which case a shorter key length needlessly renders the system insecure. Which of those bits of advice gives better assurance of security?

The current network has fundamental security issues at nearly every design point. This was a whopper of a security hole and we may not see its like again, but it is not the last breach we will see.

btrower
Silver badge

Even worse than I thought

This is surely the worst security hole I have ever seen. However, I tested all my net-facing servers and they were all OK. I just tested the copy of curl I use all the time. Complete breakdown. It coughs up the private keys.

Now it occurs to me that a black hat might very well have exploited the vulnerability and then hacked into the system and patched it so that it is no longer vulnerable. It is a time-honored practice with malware to gain control and then make the target invulnerable to attack by anyone but them.

I have seen this coming for a long time so I don't have anything particularly valuable to steal and I have long been prepared for the day when I had to change passwords and keys everywhere and finally lock things down properly.

Despite the fact that I was expecting this I am surprised at its scale and pissed at the work it is going to take to clean it up.

After the cleanup? Well, not everybody will clean up so attackers still have a good chance to hop from a compromised system to one that is not (yet) compromised.

Assuming you find your way around the above, what you are left with is the same shaky structure you had before -- a colander with a single hole plugged.

We need to have a much better understanding of these things in the technical community.

We should have long since demanded an end to IPv4. IPv6 is such a crappy alternative that it is understandable people have dragged their feet, but as lousy as it is, it is entirely preferable to IPv4.

Security experts should have better explained the fact that even though a 128 bit key is invulnerable in theory, it is entirely vulnerable in practice and longer keys are better.

Public Key Cryptography is fine in principle and I would trust it if I could somehow verify the implementation was an honest one. The current case is an example of it falling down. It is not the first and will not be the last. The implementations are messy and poor. The PRNGs used to generate keys have time and again proven faulty.

The network is so fragile with respect to security that nobody in the know with something serious to protect will connect it to the network.

Is our hardware compromised? Probably.

Can a state agency like the NSA mount side-channel attacks successfully? There can hardly be any doubt.

We cannot be sure of protection against the Military Industrial Complex. They control the factories that make our equipment, the infrastructure, law enforcement and the administration of 'justice'.

We *can* secure ourselves against most other attacks, but it requires much more than we have put in place. Non-technical people would have to take a lot of courses to fully understand the issues, but software developers should be able to understand this with a little digging *and* they should know about it anyway as a matter of course.

Given the truly horrible state of security, you have to wonder. Is this really that mysterious to everybody?


Anatomy of OpenSSL's Heartbleed: Just four bytes trigger horror bug

btrower
Silver badge

None so blind as those who will not see

@Richard Pennington 1:

Those tools are torture to use, but they work. Problem is, they work. When people see the monstrous cascade of errors coming out of these things they head for the hills.

Splint is not one of the tools above, but even splint sends out showers of warning messages on old code. I use it toward the end of development of stuff I need to be confident is clean. I don't usually bother with older code. For reasons that elude me, some programmers aggressively defend the most dreadful practices even when directly confronted with the consequences.

Here is something that shows how extreme programmer denial can get:

For years now, people have been opening tickets about a horrendous security flaw in FileZilla, and they keep getting closed. The exception is one persistent report, which I re-opened four years ago and which keeps getting re-opened. Anybody with casual access to the file system is able to get FTP passwords with a single command line. They don't have to know anything beyond that. Reason: passwords are stored in the clear in an easily found and inspected place.

Do this on a windows machine with FileZilla on it and it will cheerily show you a list of hosts and their FTP passwords:

find "Pass" "%APPDATA%\FileZilla\sitemanager.xml"

Here is the programmer's take on it in his own words:

http://trac.filezilla-project.org/ticket/5530#comment:6

status changed from new to closed

priority changed from critical to normal

resolution set to rejected

Whether passwords are stored encrypted or in plaintext makes no difference in security.

----------

The above is a critical design flaw that routinely gives up FTP passwords to hackers. The developer is adamant that this design is beyond reproach and will never change.

----------

I confess that I have been programming for more than 30 years and have been fairly involved in one aspect or another with security for about half that time. However, the programming errors and the security issues surely cannot be that hard to understand. Perhaps the solution to the above security flaw is not easy. However, the fact that the single command line above reveals dozens of server passwords just can't be that hard to understand. If you look at the other comments on the ticket above you will see that it is *only* the programmer who seems immune to understanding.

btrower
Silver badge

Re: There are a lot of comments here...

@Andrew Commons

Upvote for you. Professional testers (not weekend beta testers) are the unsung heroes of software development. Programmers have tunnel-vision (as they should) and are terrible at reviewing and validating their own code. Testing is a difficult, dreary and thankless job, but on large projects it can make the difference between success and failure.

btrower
Silver badge

Re: So what happened to the coder

Re: "the coder would've normally been put on leave at the very least."

For a moment there I thought you were going to say 'put to sleep'.

Almost the entirety of the source code universe is a total mess. A bug like this should be impossible in burned-in, mission-critical code. Unfortunately, a lot of evil habits are actually cultivated by design. The programmers don't know any better and they are immune to reasoning about it.

The C language is the language of memory corruption. It is a 'high' low-level language designed to build operating systems and compilers. Code that does not do bounds checking is faster. In some cases the speed differentials are astonishing due to the hierarchy of storage from the CPU registers through the L1, L2 and L3 caches to ordinary RAM. Stuff that stays within the L1 cache operates at lightning speed -- about 200 times as fast as a reference to RAM.

It is fair game for called code to have expectations. If you pass a memory reference, you don't want every nested call to check its validity. In some cases, called code could not do bounds checking because the extent of the bound is only known back up the call chain somewhere.

When you look at the sources, you find that these bugs are usually in code that has other tell-tale flaws as well. Older code contains errors, even code by good programmers. Dennis Ritchie was a god, but not all of his code examples are keepers and he was guilty of some faulty practices.

The worst stuff is stuff that was written or maintained by talented journeyman programmers whose short-term memory was at a maximum and whose coding experience led them to believe that clever code was a good thing that showed off their talent. Duff's Device is an extreme optimization that *may* be a necessary evil at times (I doubt it even pays off these days), but when not necessary it simply devolves to being evil. I know of at least one nascent small C compiler that fails to generate correct code when it encounters Duff's Device. A beginner or a journeyman will blame the compiler when it fails. Someone more experienced will blame the coder. Every time you do something fanciful in code you take a chance that a maintainer will have to debug and recode it. An experienced, literate and conscientious programmer should not be doing this.
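For reference, Duff's Device interleaves a `switch` with a `do`/`while` so the copy loop is unrolled eight ways and the leftover bytes are handled by jumping into the middle of the loop body. This sketch copies into a buffer rather than Duff's original memory-mapped output port:

```c
#include <stddef.h>

/* Duff's Device: the switch jumps into the middle of the unrolled loop
 * body to handle count % 8 leftover bytes; the do/while then takes over
 * for the remaining full groups of eight. Legal C, hard on maintainers. */
static void duff_copy(char *to, const char *from, size_t count) {
    if (count == 0)
        return;                 /* the classic form assumes count > 0 */
    size_t n = (count + 7) / 8;
    switch (count % 8) {
    case 0: do { *to++ = *from++;
    case 7:      *to++ = *from++;
    case 6:      *to++ = *from++;
    case 5:      *to++ = *from++;
    case 4:      *to++ = *from++;
    case 3:      *to++ = *from++;
    case 2:      *to++ = *from++;
    case 1:      *to++ = *from++;
            } while (--n > 0);
    }
}
```

A plain `memcpy` says the same thing in one line and lets the compiler do the unrolling, which is exactly the point about cleverness made above.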

Beyond a failure in coding, this current situation demonstrates something that I have often commented upon in these forums. Our system is insecure and it is essentially insecure by design. Given the enormous resources spent on systems using this SSL code, does it make any sense at all that it suffers from such a mundane flaw? It does if you realize that security is not that important to the people holding the purse-strings and calling the shots.

This is about the most serious security bug I have ever seen. Cleanup is going to be a real bitch. Repairing and replacing the code is the least of the work effort. Prediction: Most of the systems that had this issue will not have passwords changed and keys replaced. If a significant number of systems were actually compromised, we will be living with the consequences of this for a long time.

Despite the severity of this bug, it pales in comparison to the inherent systemic insecurity of the Internet. There is no question in my mind that people in the know are protecting important things with air gaps, enormous keys created with very good sources of entropy, decoy layers, etc. That is to say, nobody who understands this stuff could possibly have any faith in the current security systems as actually deployed.

It is very hard to look at the current state of the Internet, particularly things like the failed introduction of IPv6 and not think that people with influence who understand security have encouraged this situation precisely because they wish it to remain vulnerable to them.


Spy-happy Condoleezza Rice joins Dropbox board as privacy adviser

btrower
Silver badge

Ballsy

Wow, the people secretly funding Dropbox have got quite the cojones. They are just coming flat out and saying they own your ass.

The days of buying us dinner beforehand are long gone.


Ancient Earth asteroid strike that dwarfed dinosaur killer still felt today

btrower
Silver badge

Re: Fascinating

You misunderstood me. I was saying the exact same thing as you. There is a small probability in your day-to-day life that you will suddenly die, but you get life insurance if you have a family because, however remote (seeming) the chance, the consequences of not having insurance are too severe to bear.

In terms of probability, I expect it is improbable that a planet killer will strike in my lifetime, but it is not far off even money that a nasty one will.

The risk is huge. In fact, the risk is entirely unbearable. It would take a hell of an impact to destroy life, but not nearly as severe an impact to destroy civilization and likely our species. There is a finite chance of extinction and it would not be that expensive to give us a chance to at least escape extinction.

As for the probability that we have an 'event', well ... if it happens the probability is 1. We don't understand what creates the risk well enough to put a good number on it. Our long quiet period may be aberrant and could end at any time. My reckoning of the probability of a very bad strike could just be an optimistic hunch.

We know there are rocks in space and we know that we have been hit by rocks from space and we can be fairly certain we will be hit again. Some of those hits can be very bad. Beyond that we don't know nearly enough to have much confidence in any estimate.

My point stands that we spend all sorts of money on other things. The marginal final 20 billion dollars of the U.S. military budget probably does less good where it is than it would do funding a project to give us early warning of Armageddon.

Note: Don't have the military do this. They will spend most of the money figuring out how to weaponize the phenomenon.

btrower
Silver badge

Fascinating

I find this stuff evilly amusing for some reason. It is spectacular!

It is weird that we are spending so much money on the bogus war on terror and on attempting to change the climate when something like this actually poses a realistic threat. The only reason there is not huge concern about such an event is that people are bad with figures.

The thing that caused this was big, but not big enough to see at a great distance. A similar thing could be on its way right now. We have a good probability of seeing a pretty spectacular collision at any time; not enough to wipe out life, but surely enough to cause a nasty earthquake and tsunami. If a rock a few meters wide hit dead-center in New York you would hear about it.

We have known of this danger for a long time. We spend more searching the skies these days than we used to but in my opinion it is not nearly enough. We should be looking as deep into as much sky as we can to get early warning and we should be working on some type of strategy to avoid species extinction if we can't dodge a big one.

Would it kill us to take ten percent of the money spent chasing the fictional Climate Change bogeyman and/or the near-mythical terrorist danger, or perhaps the perfectly insane military budgets, and spend it on something that is actually a genuine threat of global proportions? It may well kill us not to.


Who wants a ROBO-BUTLER? Google and pals do – and they've just put $2m towards it

btrower
Silver badge

Bots ahoy!

Boy, if you thought that denial of service attack bots on the Internet were a problem, you ain't seen nothing yet.

The recent security breach of ... um ... the whole world should make you a bit worried. Will they allocate budget to put in some sort of working hack-proof safety mechanisms? You bet! It will be too small and they will be giving it to 'CrossYerFingersCo' security systems (division of Diebold).

The current melt-down of the Internet's SSL infrastructure happened on our 'state of the art' systems. Clearly the state of the art is not what it should be.

I have no idea what may or may not be under non-disclosure, but let me just say that IBM, for instance, has had much more solid security stuff since at least the 1980s. It is not that ways to secure things are unknown.

In fairness, until fairly recent times our systems had such limited capabilities that getting them working at all was something of a trick. Most of our current infrastructure was never designed to be secure. It was designed to be 'open' and easy to use. Clearly, in the 21st Century when we are about to send semi-autonomous network attached robots out into the physical world we need to be a bit more fastidious about securing our systems.

Murder or innocent software bug? No way to tell. The robot claims it's innocent and the other robots circling menacingly about suggest we take that as our final answer...


Come to Oz for sun, surf, ratting on co-workers and surveillance

btrower
Silver badge

Re: I hate that quote

Thumbs up for you. I am mystified by the down votes. This is a hugely amusing post. It's funny because it's true.

I personally like the quote in question, but that does not invalidate your entertaining post.

btrower
Silver badge

The Terrorists are the ones in charge

re: "backed warrantless data collection in the name of defending us from terrorists"

The people breaking the law by spying on people without justification are the real terrorists here. They are destroying the civil liberties of the entire world. Terrorists as they define them might blow up a few cars. I can tell you that I have *never* had a fear of the people they are supposedly protecting us from, but I definitely have a fear of them.

There is no technical impediment to making it possible to retroactively inspect every single transaction that occurs anywhere while at the same time making it impossible for these weasels to inspect things without valid permission from somebody trustworthy.

We record everything under a multiple key system and distribute the master keys to entities that are actually trustworthy. The more important the data the more keys required to inspect it. The bank, an insurance company, the local neighborhood association, the ACLU and local and federal law enforcement would all divulge keys necessary to trace and pinpoint a child predator or an actual terrorist about to trigger an atomic bomb. They would not all divulge keys for an illegal fishing expedition nor should they.
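A sketch of the simplest multiple-key mechanism, n-of-n XOR splitting: every trustee gets one random-looking share, and the master key reappears only when all of them cooperate. This is illustrative only; a real deployment would use a proper k-of-n scheme such as Shamir's secret sharing, and a real CSPRNG rather than `rand`:

```c
#include <stdlib.h>
#include <string.h>

#define KEY_LEN  32   /* bytes in the master key */
#define TRUSTEES 5    /* number of shares handed out */

/* Split: the first TRUSTEES-1 shares are random; the last is chosen so
 * that XOR-ing all shares together reproduces the key. */
static void split_key(const unsigned char *key,
                      unsigned char shares[TRUSTEES][KEY_LEN]) {
    for (int i = 0; i < KEY_LEN; i++) {
        unsigned char acc = key[i];
        for (int t = 0; t < TRUSTEES - 1; t++) {
            shares[t][i] = (unsigned char)(rand() & 0xff); /* demo only: use a CSPRNG */
            acc ^= shares[t][i];
        }
        shares[TRUSTEES - 1][i] = acc;
    }
}

/* Join: XOR all shares together. With any single share withheld, the
 * result is indistinguishable from random noise. */
static void join_key(unsigned char shares[TRUSTEES][KEY_LEN],
                     unsigned char *key_out) {
    memset(key_out, 0, KEY_LEN);
    for (int t = 0; t < TRUSTEES; t++)
        for (int i = 0; i < KEY_LEN; i++)
            key_out[i] ^= shares[t][i];
}
```

The design property being argued for falls out directly: no subset of trustees short of all of them learns anything about the key.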

The people in charge who are saying this current surveillance regime is necessary are either technologically illiterate or morally corrupt; likely both. In either case, they are the very last ones that should be in charge of privacy. They don't understand it, they don't like it and they don't want it. Give them jobs where they can't do so much harm.


Siri set to rival Windows beauty Cortana after Apple eats Novauris

btrower
Silver badge

Re: Speech recognition vs providing information

http://download.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2

Current revisions only, no talk or user pages. (This is probably the one you want. The size of the 13 February 2014 dump is approximately 9.85 GB compressed, 44 GB uncompressed).

You can already get phones with more than enough space to hold Wikipedia and it will not be long until that is fairly commonplace -- less than five years.

Just for the LOLs, maybe somebody should upload an iOS usable version to the App Store.

btrower
Silver badge

A little concerned.

I asked Siri if she was about to be replaced and she replied "no comment". Maybe I am imagining it, but she sounded sort of wistful.


PC market's bleeding slows thanks to XP phase-out

btrower
Silver badge

Tablets do not replace PCs

We have three tablets in my household. We use them, but not all that much and certainly not for any kind of work. Everyone here has a laptop and everyone is either on that laptop or on a full-blown PC every single day. They are also on their phones about the same amount as the PCs.

Somebody is going to make a killing selling thin, light, capable PC notebooks. They can replace a desktop for most people. Tablets can't replace anything. Phones are something new. Getting a phone does not mean you don't need a PC and having a PC does not mean you don't need a smartphone. Two of these three things you actually need if you are to participate fully in our culture. The tablet is not one of them.

It is hard to predict things, especially the future, but here is what I think would be a desktop PC killer:

Notebook PC:

- Less than 2 pounds

- SSD

- 16GB RAM

- 4 core processor 2.5GHz/3.5GHz turbo

- Graphics to drive four monitors

- 3 HDMI ports

- Built in multi-port USB3 hubs

- The usual network connectivity

- NOT Windows 8.whatever

- Decent usable keyboard -- I have never seen one on a notebook

- storage spaces into which can go flash drives, small mouse and power adapter

- fast charge battery

- 16 inch 1920x1080 screen

Option -- available companion monitors that match and fit together with the notebook screen

Out of box experience should look like this:

Hey, it booted already and is asking for my password. <Enters password> Hey, it is actually connected to the Internet on its own using a complimentary wifi access point. What the ... there is an unobtrusive welcome icon on the task bar. Hey, a free open source office suite pre-installed. The music player has a library with music in it and a complimentary account on a music service as well as internet radio stations ready to go. The Email works already on a complimentary account. My welcome Email tells me how to hook in my other accounts if I wish and how to access my complimentary online backup storage. Well, I'm a bit disappointed that it is not already set up, but the welcome mail tells me that if I reply with the subject line "Send a phone" it will set up a phone number and internet phone for me for free. The free internet fax service is a head-scratcher. Is anybody actually using it? Whatever, it's free. Now that's a browser: simple, fast, capable and opened to a page that allows me to choose from a set of sensible home pages and set them up with a single click. A privacy button allows me to surf anonymously, fake cookies, ignore web bugs, etc. I am amazed that there is a proper archive tool, a non-invasive open source PDF reader and a free store that allows me to filter for only open source applications and install them without having to sign in or answer invasive questions about myself.

I am amazed to find that everything I need is installed already but there is no crapware anywhere in sight. The vendor supplied control panel allows one touch backups, security scans, warranty status, help that works, a link to online chat support and no gratuitous crap.

The 'booted already' is because it has been completely set up and runs and was just sleeping and came out of sleep when you opened the case.

The three HDMI ports are so that three big monitors can be used instead of the modest built-in monitor.

Cost: For me, complimentary because I invented it. Retail: $499. Special introductory offer at $399.

5
4

Who's up for yet another software-defined net protocol? Cisco wants to see some hands

btrower
Silver badge

How about ethical standards?

If they really want to help they can get everybody on board a network architecture and switching protocol that makes it impossible for attackers like the NSA to easily record who is talking to whom.

Cisco is one of the old players whose business model is predicated on the economics of scarcity. They should have been leading a massive build out of bandwidth orders of magnitude larger than it is today. Instead, they concentrated on profit margins and reducing competition to the exclusion of all else.

Cisco is not now and never was about making the network better. It has always been, just like the Telcos, about finding ways to make the network pay for them. If that meant arcane, insecure, limited bandwidth networks then so be it.

Cisco has people with the smarts to know that there are a number of critical problems not addressed by our networks. We are already attaching more devices to an essentially IPv4 network than an IPv4 network can address. That has a variety of implications that do not bode well for the already rising Internet of Things. Nobody should be able to hijack DNS or send SPAM. Nobody should be able to inspect data as it traverses the network. Nobody should be able to hack into homes and hijack appliances. Had the power players involved been focused on providing the best network they knew how, we would not be mired in a failed multi-decade transition to the mess that is IPv6. That should have been fixed a decade ago and IANA should be irrelevant. Instead, the use of IPv4 addresses costs one to five dollars a month.

All of the players involved in dealing with networking have proven to be dreadful custodians who routinely put their own interests ahead of anything else. We don't need new standards. We need new standard bearers.

0
0

Amazon sets FIRE to your living room in bid to shake up TV streaming

btrower
Silver badge

Amazon's response ...

Amazon's response to competition from Apple, NetFlix, Hulu, etc:

"Kill it with fire"

5
0

Greenpeace reveals WORLD'S FILTHIEST CLOUDS – and the cleanest may shock you

btrower
Silver badge

Shut the right one down

I am a bit of a tree hugger. I watched clear-cutting ruin the scenery when growing up in B.C. Later in University I realized that the damage to the ecosystem was permanent in many ways and the landscape as I knew it would never return.

I am old enough that I remember a world literally much greener than it is now. I think it is important that we understand what constitutes good stewardship of both the environment and our society as we continue to advance. The key here, though, is understanding.

I was once a supporter of Greenpeace, but they have become entirely net negative. Whether by design or ignorance they foster illiterate nonsense thinking that somehow environmental change and species extinction are intrinsically bad and should be opposed at all costs. They are working diligently to send our post-industrial society back to the stone age. Thanks to the efforts of Greenpeace and fellow travelers, Doctors, Lawyers, Scientists, Judges and other highly trained and expensive human resources waste their time literally sorting through garbage. This activity has prevented sane and sensible industrial-scale treatment to recycle materials. How many advances have not yet been made because the productive capacity that would have made them was fussing with garbage or mindlessly protesting fundamental facts of life like evolution through natural selection?

It is ironic that they are so focused on particulars of energy and ignoring the big picture. With abundant cheap energy we can do whatever we wish in terms of limiting our environmental footprints, expanding out into the solar system, automating ever more things, etc. Greenpeace would like to hobble and harass the marketplace without any real understanding and whenever they get their way the results are predictably disastrous.

Greenpeace has their shoulder firmly to the wheel with respect to the increasingly irrelevant Climate Alarm raised by the second-raters in 'Climate Science' and kept alive by spineless politicians who seek to have power rather than exercise leadership. I am not wrong about the fact that *more* CO2 is a *good* thing, but even if I were, with abundant cheap energy we could sequester CO2 at will easily enough.

Mankind is dealing with a limited biosphere. Living things increase in numbers exponentially until they encounter some limiting factor. We have, through a variety of mechanisms, been able to dodge Malthusian catastrophe, but the only way to avoid crashing into the boundaries is to lift the boundaries. One way or another the human population will continue to rise until it reaches an external limit.

Environmental stewardship is necessary, but it is not sufficient. We need to respect the math of population growth. Our very best hope for the future is finding a way to get abundant cheap energy. A strategy focused solely on reducing energy usage ultimately results in a reduced standard of living and stagnation or reversal of scientific advancement. Already in Canada we are suffering the ill effects of energy constraints and rising costs due to the pursuit of so-called 'green' energy sources like windmills.

If I had the power, I would shut down Greenpeace tomorrow. It would make the world a better place.

31
10

Ubuntu N-ONE: 'Storage war' with Dropbox et al annihilates cloud service

btrower
Silver badge

Knew it

I signed up for this, but I kept putting off any sort of commitment because I just had a bad feeling.

Ubuntu started out well, but has lost their way. Shutting down that service was probably the best thing to do now, but a better thing would have been to stick to their knitting in the first place.

Ubuntu broke faith with the advertising debacle. There was no way to defend a noxious default setting clearly contrary to user intentions. Since then I have looked at everything with a jaundiced eye and began searching in earnest for an alternative. I am not quite ready to cook my own distribution from Debian directly, but I am getting there.

I have an Ubuntu server here and have kept it up since about version 6, through the LTS versions 8.04, 10.04 and 12.04. I am not committed to upgrading to 14.04 LTS yet. I am installing Linux Mint as a workstation OS this month and may take the time to convert the server as well.

As I say, they started well, but it seems the people directing Ubuntu have become insular and detached from their community.

3
1

Is this photo PROOF a Windows 7 Start Menu is coming back?

btrower
Silver badge

Still hate the tiles and the window decorations

It is better lipstick, but there is still a pig underneath it.

I am not sure that the guts of Windows 8 are all that bad. Were they to offer an inexpensive version of Windows 8 professional (non-crippleware) at something like $29.00 they would likely sell a few to me just so I could keep my legacy stuff alive while switching to Linux.

The current course just confirms that users have been screaming about what they want, that Microsoft has heard them, and that it has chosen to compromise rather than simply give people what they asked for.

Lots of us would have settled for a less buggy version of 64 bit Windows XP and skipped the smoking Vista disaster, the bland Windows 7 recovery and the subsequent plunge into hell with Windows 8.

I am still a techie at heart and I like interesting new stuff. I am not quite an early adopter, but ahead of most. None of Windows Vista, Windows 7 or Windows 8 were all that interesting, most of what was new was irritating and the same old bugs were joined by new ones.

Microsoft has poisoned the well so badly that even old gadget junkies like me just want to get on with their work.

30
9

Intel details four new 'enthusiast' processors for Haswell, Broadwell

btrower
Silver badge

Worrisome...

I have been an AMD stalwart for a long time, but recently I have been frustrated by the lack of even a convincing roadmap, let alone competitive CPUs.

I was just looking the other day at perhaps swapping over to a multi-socket Opteron rig to at least get some more cores and RAM into the picture, but the Opterons are thin on the ground and terrible bang for the buck.

I am due to upgrade and will likely be switching to Intel after all these years.

I am mystified by AMD's silence. I am assuming they have essentially abandoned high-end CPUs and are doubling down on graphics. If so, it's a sad way to end.

Maybe AMD could at least consolidate server/workstation chips so that enthusiasts could put together a four socket 64 core rig with more than 256 GB of RAM.

Whilst I am griping ... whether it makes sense on paper or not, multi-threading definitely increases performance. AMD seems to have adopted the idea, but in a clumsy way with modules/cores; it looks and sounds awkward and does not appear to leverage silicon the way Intel's hyperthreading does.

Sigh. I am hoping AMD fights back, but thus far it seems ... worrisome.

3
0

Schneier: NSA snooping tactics will be copied by criminals in 3 to 5 years

btrower
Silver badge

Security is possible (ish)

What is currently in place is a shambles in terms of technical merit. I am not sure why the ones who know better are so quiet. Some have vested interests in keeping the status quo, but plenty do not.

We need a few honest people to break ranks and tell the truth. Flaws and fixes do not take a rarefied understanding of cryptography. I may be naive, but I think that we will see the 'good guys' come out of the woodwork, and finally start telling it like it is.

The ones we have trusted with all this stuff have proven untrustworthy. The solution to this is one they do not like -- to distribute control and trust so it cannot be 'gamed'. If we build it right, their improper control of cyberspace will vanish.

10
1

Private pain: Dell layoff bloodbath to hit over 15,000 staffers – insiders

btrower
Silver badge

Re: Opportunity here

If your going-in position is to assume failure, that is likely what you will find.

Dell started in a dorm room and kick-started his company with less than $500K in capital. According to Wikipedia (not always the best source) he grossed more than $70 million in his first year.

Dell currently has revenues better than $500K per employee. The new people come from Dell. If they do this with 5,000 employees then they will have revenues of more than $2 billion. If they eke out a profit of 4% of revenues -- $20K per employee -- they can give everybody their initial investment back at the end of the year.

I am not saying to deliberately pick dog products whose market is shrinking and then blindly pour $50 million into it. I am talking about concentrating on the few products that are earners and growing. Dell has enormous overheads. You do not. Dell has to somehow support the money-losers until they are no longer losing money. Dell has to service debt. You do not.

Pareto's law rules everywhere and it will rule at Dell. Not every single thing this multi-billion dollar corporation is selling is a money losing dog. That would be statistically bizarre and Mr. Dell would be smart enough to walk away from something like that. Out of 15,000 employees I am sure there are 5,000 who between them have some clue as to what is producing the most profits at Dell right now. I am saying pick the few products producing good profits and go to the vendors you already dealt with and ask them for prices and terms to kick-start your business. They might do it on the QT to avoid annoying Dell, but you can bet one of them will do it.

If 5,000 people joined forces to do something they already know how to do and they started with modest overheads and were able to get some traction, would they agree to build sweat equity in their own company? I think so.

It depends upon who was kicked to the curb, but I have seen a lot of these bloodletting exercises and they cut plenty of great people. There is no reason to think this is any different. Out of 15,000 going, there are surely 5,000 people fit to join forces and carve a niche for themselves.

Re: 'I doubt anyone wants to "stick it" to Dell'

I have no idea how people feel about Dell generally. I expect that faithful employees canned on short notice may feel a bit let down. It would be nice for some people if they could show that they were a keeper.

Re: "and Lenovo will eat their dogfood shortly anyway."

Well if they are vulnerable to one competitor, why not another that has intimate knowledge of their business?

I would think that the main worry would be that Dell would attempt to enforce no-compete clauses and otherwise tie the new company up in court. If they do, that is a tacit admission that, in the opinion of the people who are most likely to know, the new company is a genuine threat.

0
0
btrower
Silver badge

Opportunity here

The PC Market is changing, but it is not dead.

If laid-off Dell employees pledged $10K+ each and 5,000 of the best formed a company, they could put together $50 million. Focus on one niche that they collectively have seen growing. They could approach suppliers with an offer to pay half up front, half post-sale for inventory. Approach bankers to fund the outlay for inventory, secured by the inventory itself. Do all the setup on the QT as much as possible so that when you pull the trigger a press announcement makes a splash. Initially limit this to people working at home and plow money into a good server infrastructure and a website that sucks less than Dell's current one.

Get everyone to take a modest hourly wage with a set-aside for dividends once the enterprise gains traction.

I have no idea how everyone feels about Dell right now. If both Suppliers and axed Employees want to stick it to Dell, this would be a window to do it in.

If they concentrate on the highest margin stuff, undercut a little, and give the kind of service you get from people whose income directly depends on it, they could leave Dell teetering on the brink. Dell would be attempting to staunch bleeding in losing lines, losing sales in winning lines and competing against a vigorous new competitor with zero debt, with a highly motivated workforce out to prove something. Dell, would be juggling enormous debt and attempting to execute with an entirely demoralized workforce wondering when the axe will drop on them.

Dell would probably attempt to stop people from forming a competing company based on non-compete stuff signed by the employees they axed. However, the common-law right to earn a living trumps that sort of thing. Unless Dell pensions them off at full salary for the 3-5 year non-compete period, they would not be able to force employees from doing something commonplace like opening a store.

Not sure if this is like other scenarios I have seen, but if it is, lots of the people axed were the good ones pulling the freight and lots of the people kept are brown-nosing corporate weasels whose skills are limited to stabbing co-workers in the back and climbing the corporate ladder.

Once the dust settles, the new company could pick the bones of the bankrupt remains of Dell. Take back the name 'Dell' if it has any currency and name the remainder -- wait for it -- Hell.

I honestly have no idea if Michael Dell is hero or villain. Just painting a picture that might be a pleasant daydream for former Dell employees hitting the bricks.

1
0

Twitter avoids IP face-off with Big Blue, will buy 900 IBM patents

btrower
Silver badge

Easy

This is how Criminal Cartels work. This one has managed to get 'our' representatives to write the law so that it is legal (only for Cartel members) to run a protection racket. To keep the profits up, the police *we* pay for do the collections when things get ugly.

These days, the Mafia is legal and writes the laws. Lemonade Stands are illegal and the little girls who used to run them are sold into sexual slavery.

The worse you are, the better it gets. War criminals who murder children get the Nobel Peace Prize.

Sure, it was ever thus, and at least the Emperor of the United States is not having people who displease him drawn and quartered. In fact, dispensing with trials or even charges takes the worry out of it for the victim. They never know it is coming, so are not troubled with all sorts of last minute rushing to make sure wills are in order, children provided for, etc.

What makes the new tyranny worse than any other in history is that technology makes it ubiquitous, relentless and irresistible. You cannot escape it even for a moment. Any breach, no matter how unwitting or innocent is recorded and filed and can be used any time you get out of line. Should you or your kind fall out of favor, you might find yourself imprisoned.

Torture was happening recently if it is not still going on. Remember, too, that drawing and quartering *did* happen. We know people as a species are willing to go that far. Who knows what mischief subsequent Emperors will get up to in the future?

This is like the old joke:

Man to perfectly respectable woman: "Would you sleep with me for $1 billion?"

She hesitates, thinks of all the things she could do ... "Well ... I guess for a billion dollars I *would* sleep with you."

Man says: "How about a hundred dollars?"

Woman draws herself up, outraged: "What do you take me for -- a common prostitute?"

Man says: "We've already established what you are. We're just haggling about the price."

We have already established that we will give up our freedom and live in Tyranny. We have and we do. We are now just haggling over the extent.

Unless you are purposefully acting against this, you are supporting this with your political franchise (vote) and your money (taxes) and your inaction makes them bolder every day.

These things never end well and the longer they are left to grow the worse the ending.

3
0

Verizon's transparency report shows more than 320,000 US data slurping orders

btrower
Silver badge

At last, probable cause

Hundreds of thousands of orders to spy on people? There is no way there was legitimate probable cause anywhere there. That means you have probable cause to start getting search warrants to take a look into the people and departments doing all this dragnet surveillance.

There *are* laws governing this and they flow from the Constitution of the United States. It is time for ordinary people to start convening Grand Juries.

0
0

Facebook will LOSE 80% of its users by 2017 – epidemiological study

btrower
Silver badge

Facebook is plenty strong

Facebook has messed up, IMO, plenty of stuff. However, its grasp on how to build a social network is demonstrably good. They do not have any realistic competitor in a space that is clearly important enough to engage a significant percentage of the world's population.

The inability of the mighty Google with its massive resources to make a dent should give you some indication of the resilience of Facebook.

Twitter is not a one-to-one competitor that worries Facebook. If they were, Facebook could easily mount a similar platform and crush them.

Facebook is where it is because of particular network effects that do not apply the same way elsewhere. The network effects in play with Facebook are exceedingly strong.

Facebook has built out an enormous infrastructure that allows them to not only service the current demands of Facebook users but future demands of those users and other users besides. Should they decide to mount a competitor to Twitter, Vine, SnapChat, Pinterest, Tumblr or whatever, they can do so almost overnight.

Given Facebook's resources they should, if they don't have one already, have a 'skunkworks' where they build and test competitors to all those things.

Facebook's big opportunity, which they have not capitalized on yet, is to refine their marketing apparatus such that people are only presented with advertisements for stuff they would be in the market for anyway, to police offers to make sure they are good ones and not a waste of their users' time, and to put in place a payment and delivery infrastructure similar to Amazon's so that, for instance, around dinner time, someone who is likely to order a pizza anyway is presented with a good offer for one for delivery and can order it with a couple of clicks.

Facebook needed network growth more than anything else, so they played fast and loose with privacy and data. They are now reaching a point where they should be putting together a model that entirely locks down privacy and data such that except for possibly public postings of pictures, nobody including Facebook themselves, can gain access to private data.

With the correct privacy controls in place, Facebook can attract back followers for whom privacy is an issue. They can create mechanisms that would allow 'cliques' to erect their own barriers and admit their own members, etc.

The network with the critical mass is the only asset that matters. Everything else can be done once you have that. Competing against the network with the critical mass is virtually impossible, no matter the resources you bring to bear. As of now, Facebook owns that network and the only way they will lose it is if they shoot themselves in the foot -- repeatedly. They have not done so yet and appear to be in no danger of doing so any time soon.

Back in 2012, Forbes published an article saying that Facebook (around its IPO) was *no way* worth $75 billion. I wrote an article in response saying that if I had the $75 billion I would put it down in a heartbeat. Had I been allowed the opportunity, that would be worth nearly twice that today. Not a bad return on such an enormous amount of capital. Facebook is valued around $130B+ right now and I still think it will increase in value over the next few years. The network phenomenon that they have in play is quite unlike anything we have seen before.

3
3

Top patent troll sues US regulators for interfering with its business

btrower
Silver badge

Go after them personally

When does behavior become so far out of line that the individuals must be called to account? I think this is pretty close if it is not there.

Get the names of the actual individuals pressing these bogus claims and press for judgments against them personally both for financial penalties and criminal sanctions.

We make it a no-risk proposition for these people to pursue these odious practices. Add a little fair risk and see how enthusiastic they are.

26
0

Dell will AXE up to ONE IN THREE workers in its US & EMEA sales teams

btrower
Silver badge

Dell could...

Dell could quickly increase the number of boxes it shipped by simply offering sane prices without all the tomfoolery on their website. I went the other day to price out a big workstation and it was hopeless both in terms of attempting to configure the system and in terms of price. I can assemble the thing myself for less than half of what they charge.

They have gone to a hopelessly high-margin model that is great if, like Apple, you can keep it going. However, Dell does not offer anything that I value that is not a straight commodity item and they charge through the nose for it.

I will give them this: For a client that had a service agreement and a Dell only infrastructure you could get pretty good service over the phone from people who knew what they were talking about. I would be willing to bet that those people are the first in line for pink slips.

2
1

New FCC headman brandishes net neutrality carrot and stick

btrower
Silver badge

Do any of these guys believe their own bullshit?

We should be scrambling to produce the highest capacity most fault tolerant low-latency bandwidth we can manage everywhere and putting in place laws to protect that infrastructure *and* the traffic that runs on it.

I have no doubt that the people in charge are pretty clueless, but it is willful cluelessness. Until we have people with a real desire to do the right thing *and* a clue, we are screwed.

Secure, low-latency, high-bandwidth access to the Internet backbone should be ubiquitous to the point that everything keeps a live connection at all times. In Canada, a former leader in telecommunications, we have all sorts of places where internet connections are measured in kilobits. That is way wrong.

The more you know about networks and security the more you realize that the entire industry, especially 'watch-dogs' charged with oversight, is rotten to the very core.

EM Spectrum and physical rights-of-way were either improperly obtained or have already been paid for by the public many times over. It should all be seized back and configured properly as an essential service. The entirety of the telecommunications community is predicated on a model that *requires* scarcity and they have done their damnedest to throttle and cripple bandwidth at every turn. They are the poorest of custodians for what is now an essential service and they need to be removed.

There is no reason we can't have 1Gbps bandwidth just about everywhere and 10-100Gbps and beyond within easy reach in metropolitan areas.

0
0

Take off, nuke 'em from orbit: Kill patent trolls NOW, says FTC bigwig

btrower
Silver badge

Fix the bottom

Jail time should slow those Trolls down a little.

Bottom feeding should not be so attractive. Instead of serving up million dollar extortion payments we should be serving up juicy amounts of jail time instead.

2
0

Anatomy of a 22-year-old X Window bug: Get root with newly uncovered flaw

btrower
Silver badge

Re: Systematic Fix: Type Safe Programming Language

@Frank Gerlach #2:

You make good points, particularly when you say "stop using plain C++ arrays, pointers and quite a few nasty C/C++ 'tricks'".

Re: "I wonder how you would make Array Access typesafe in C"

It can be done. Don't use naked pointers where it could be an issue. Make an array type that carries its own length and other bookkeeping alongside the data. One of the big issues with C, from my point of view, is that its maintainers are content to have a souped-up assembly language rather than a real modern language. I have been looking into how one might fix that myself. I think it is possible.

Re: "properly optimizing compilers can safely remove lots of bounds checking, as overruns are very often known at the beginning of a loop"

I emphatically agree with this, even in a broader sense. A programming language like Modula or Oberon (or Ada) that gives the compiler lots of information allows the compiler to do all sorts of compile time optimizations. In theory, a language that gives the right type of information to a compiler with sophisticated optimization should be able to beat a human at optimizing the overall code. I have, for many years now, adopted the stance that the compiler gets the first crack at optimization and only if it fails and the optimization is needed do I step in to hand-code.
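A toy illustration of that point (the function name is invented): when the loop bounds are known up front, one check before the loop makes the per-element checks provably redundant, which is exactly the redundancy an optimizing compiler can eliminate.

```c
#include <assert.h>
#include <stddef.h>

/* Sum buf[first .. first+count-1]. One up-front bounds check replaces
   a check on every iteration; the subtraction form also avoids
   overflow in first + count. */
static long sum_range(const int *buf, size_t buf_len,
                      size_t first, size_t count) {
    assert(count <= buf_len && first <= buf_len - count);
    long total = 0;
    for (size_t i = first; i < first + count; i++)
        total += buf[i];            /* no per-iteration check needed */
    return total;
}
```

A compiler given the same information (as Modula- or Ada-style array types provide) can perform this hoisting automatically.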

Re: "do we need to use a portable assembly language for anything "efficient", just because we need it for a very small part of the operating system ?"

No. However, it seems to me that C is tantalizingly close to a good general purpose replacement for everything including Assemblers and Scripting. C, on its own, is perfectly capable of supporting very elaborate abstractions. To fix its few problems, it seems to me that it just needs a few tiny nudges: better rationalize and integrate the pre-processor, provide support for objects, bring functions up to first-class objects, refine the syntax to allow nesting of functions, and rip out a bunch of ancient assumptions such as the special treatment of source code in disk files. The compiler should worry about finding and integrating called functions, and the default language run-time should default to values that would allow simple scripts to be written directly. Here is what the famous 'Hello World' program should look like in its entirety:

"hello, world"

The proof of utility of the C-language family (including C++, C#, Java, ECMAScript, Objective C, golang, etc) is its ubiquity. The proof that it requires refinement is evident in the proliferation of the C-like languages. Each aims to correct perceived deficiencies in C. I think that a good re-think of the language could allow us to toss C++, C#, Java, Objective C, golang, etc. I have a hunch that done right it would actually decrease the size of the base language.

Re: "Even the vast majority of kernel code does not need to be type-unsafe. Both MPE and Oberon prove this."

Don't get me wrong, I am a big believer in the utility of type safety and other facilities that protect programmers from themselves. Most programmers do not belong in charge of a C compiler. I gave up on Oberon for various reasons, but was a Pascal and Modula 2 programmer in the early 1980s even before I learned C. I preferred Modula to C for many reasons. However, at the end of the day, for a programmer that can handle a C compiler, Pascal and Modula (and from the looks of it Oberon) are like programming with mittens on. Part of their strength in abstracting the machine is also part of their weakness.

Good teams can make a pretty good go at modest programs using nearly any language. Bad teams can make a mess of modest programs using nearly any language. We have a lot of extremely well trained and practiced people out there including project managers, build specialists, designers, analysts and coders. We could accomplish a great deal just by funding the good ones and getting the bad ones out of their way. Unfortunately, the people deciding on which teams get assembled are generally pretty hopeless.

We do not reward technical success very well. One of the very best projects I worked on was with Sybase to produce a replacement billing system for a large leasing company. It was a team of maybe a hundred or so people and virtually every one was a seasoned professional near the top of their game. The project was delivered beyond spec, below budget and slightly ahead of time. That team was eventually disbanded because even though they got better results for (much) less money, the sales process above them could not compete with the bigger more ruthless players.

For projects like the above, there is not a lot of repeat business if you do your job well. The job is done and systems like that stay in place under low maintenance for many years.

Love it or hate it, C continues to exert influence because it gets the job done. Warts and all, it is often better than the alternatives. I honestly think that C can be cured of some of its ills, but things like C++, Java or C# are not that cure.

0
0
btrower
Silver badge

Re: Systematic Fix: Type Safe Programming Language

Frank Gerlach #2:

You can make C typesafe by creating types and methods that are safe.

At the level in question, C is being used as an alternative to Assembly Language, not as an alternative to Ada.

Although there is not much of a performance hit for a check on the size of a target buffer vs the data being written, you will see performance issues when these things are nested very deep and the checks are done over and over again at each level. Some called functions require the caller to provide safe arguments:

strncpy(s1, s2, n);

The above ensures that no more than n bytes are written. It will not and should not validate that s1 is a valid target or that s2 is a valid source (note, too, that strncpy() will not NUL-terminate s1 when s2 is n bytes or longer).

Management hires people who do not have the skills to do the job properly and then rushes them so that they cannot even test to find their mistakes. Software development, by its very nature, is something of a research problem. It requires 'enough' time, and 'enough' is hard to pin down.

Management demands project plans whose specificity virtually guarantees failure, and the larger the project and the more it is micro-managed, the more likely it is to fail.

We have a massive over-abundance of code in all languages, but are painfully short on solid working code even for basic things.

7
0
btrower
Silver badge

Re: When any C/C++ code includes "goto" you know it sucks...

You are correct. In fact, any unconditional jump in code is a potential source of bugs. In large bodies of code that potential is always realized. Examples:

goto

break

continue

setjmp/longjmp

return

Some usage cannot be avoided in C because the language lacks structured alternatives. When they can't be avoided (say, in a switch() statement), they should be used carefully, to accomplish the necessary block exit and nothing else.

For some reason we still see arguments in favor of 'goto'. Don't use it. It is not necessary and is a rich source of very nasty and difficult bugs. The only reason 'goto' exists in the C language today is for backward compatibility so that old code will not break. New code should never use goto. There is no practical exception to that.

Any argument from efficiency for stuff like this is a failed argument. It is conceivable that you actually do have a compiler that cannot optimize that well. However, since all non-trivial bodies of code have bugs, you are likely not yet at an optimization stage of development. The only optimizations that should be done are those required to meet some real-time constraint that cannot be met otherwise. Maybe a device driver writer can bend a rule to overcome a problem with a device, but even then it is iffy.

A few simple 'good hygiene' habits ward off a lot of bugs, especially in older code:

Always make things explicit. Do not do stuff like this:

if (x)
    dothis();

Do this instead:

if (x) {
    dothis();
}

When using resources you, the caller, are expected to release, start with the allocate/release pair and deal with failure then and there:

a = allocate(n);
if (a == NULL) {
    errmsg("Cannot allocate 'a'");
} else {
    /* Code on successful allocate goes here */
    deallocate(a);
}

Note that any unconditional jump out of the part that deallocates will leave a memory leak.

Eliminating bugs and getting correct behavior is job one. 'Efficiency' should only be a consideration when optimizing and optimizing should be mostly done by the compiler and should only be done by hand if it is actually needed.

I see a lot of code that looks like this:

if (cannot(x)) return;
if (cannot(y)) return;
etc.

It will inevitably develop bugs as time goes on and some of those bugs will be very hard to find and fix.

Code blocks should have one single point of entry and one single point of exit. They should generally do one thing well and that is it. They should not have side effects and they should not depend upon side-effects. They should not depend on things that don't properly belong in their scope (no globals or 'relative globals'). In practice, my observation has been this: lots of smaller functions nested are easier to debug than fewer/flatter larger functions. Code so that a scoping block may be turned into its own function. For instance, even inside a function, declare variables within the narrowest scope possible.

High level short-lived code done in languages with things like garbage collection can be forgiven a variety of bad habits. Low level long-lived code upon which many layers of client code depends needs to be coded with strict good habits.

Good habits alone will not yield bug free code that is non-trivial. On top of good coding habits, you need to unit test with a goal of 100% coverage. You will find building in good unit testing much easier if you follow the rules of thumb above.

You should assume that an item that has not been tested will fail.

Chances are good that a modern compiler will be better at optimizing code than you are. Help it do its job by writing clear simple code in small well characterized chunks with no unstructured flow of control to confuse things.

2
1
btrower
Silver badge

I have looked

I have looked at a lot of the code on various Linux systems. It is riddled with bugs even now. Just getting lots of this stuff to compile is an adventure.

Code should not cause compilers to issue dozens of warnings, and it certainly should not die on actual errors. I have never had the persistence to completely clean up any of that code on a non-trivial system. The main cause is programmers with six to ten years of bad habits under their belts who are just good enough to create havoc.

On the upside, there *are* a lot of good programmers out there who have good habits and some are tackling some of the basic code. There is hope yet.

8
8

Infosec experts boycott RSA conflab over alleged 'secret' NSA contract

btrower
Silver badge

Trust nothing

Anybody doing a security design these days should address all aspects of security as if every single thing from the silicon up is compromised. Only fairly widespread joint custody can provide a reasonable sense of security.

Even with extremely strong security, we still need legislation to curb abuses. No entity, certainly not an ethically challenged one like the NSA, should be able to make any legal use of ill-gotten information.

We have gone so far over the line that it seems even our experts have lost the plot.

4
0

Beauty firm Avon sticks spike heel into $125m SAP-based sales project

btrower
Silver badge

Re: Typical

@Bronek Kozicki:

Funny. Upvote for you.

0
0
btrower
Silver badge

Re: Typical

Re: "Pity that in the last paragraph you mixed a few unrelated concepts and confused others."

I must have communicated badly. The executive summary is this:

Programming language and programming paradigm are germane. When you start with a language that is not known for producing the world's working code, you start with a problem right off the bat. One of the issues I have with this is that a lot of the people producing code in these languages do not properly know how to program.

Most of the world's software is not written in HLLs like SAP's ABAP or whatever they are calling it these days. The people who know what they are doing enough to produce the actual working code we are all running use things like the languages I mentioned. Much of the world's code doing the 'heavy lifting' is old code, written in old languages by old programmers.*** To the extent that a lot of the modern systems work, they are relying upon older code in many places along the path from concept to a running implementation. The guys who built the stuff that works are retiring. Meantime, instead of properly advancing languages and training literate programmers to use them, we are squandering our resources and a generation of programmers on stuff like the SAP systems under discussion. The problem is, at the end of the day they do not produce real hard-core working systems we all actually use.

Re: "In the end, everything runs in assembly language (machine code, to be pedantically accurate), be it compiled or interpreted. Object orientation is more a matter of style than language. Yes, there are languages that make it easier, but C code can be object oriented just as you can write procedural C++"

In the long run, we are all dead. Ultimately everything is running in microcode on the chip. Unless we are building a chip, that level of abstraction is not where we work. Assembly language as a programming abstraction is not equivalent to machine code. It compiles to machine code, but it is not machine code, it is Assembly language. Similarly, languages which target C as an intermediate on the way to compilation (the old C++ did this), are not equivalent to C. They translate to C, but it is not C, it is C++. You can create object oriented code in C, native machine op codes or even lower for that matter, but at the level of C you are not dealing with an object oriented language. I am not crazy about any of the language alternatives, but some languages naturally support object oriented design and C is not one of them.

*** Things like the following are generally not written in the allegedly 'better' new languages, but rather the allegedly inefficient old languages: Operating systems, language compilers, database engines and tools, spreadsheets, word processors, drawing and document creation tools, typesetting, presentation software, browsers, search engines, Email systems, networking systems, web servers, virtualization systems, multimedia systems, security systems, archiving tools, translators and proofing tools, multi-tier architecture infrastructure, GUIs, expert systems, speech recognition and synthesis, etc. They generally trace back to someone scratching an itch and even though the research may have started with other tools and languages such as spreadsheet macros, Algol derivatives or scripting languages, eventually the hard-core production stuff we all use every day is written in languages like Assembler, C/C++, etc. Application stuff running on big iron was written in languages like 360 Assembler and COBOL.

5
1
btrower
Silver badge

Typical

This is typical of a bunch of the big vendors whose money goes into marketing, lobbying, lawyering and forms of bribery rather than the software they claim to have.

Like others here, I have worked on my share of big projects using stuff like this and the only reason any of them succeed is people like me spending long nights building work-arounds. In my case, a lot of it on my own dime just to maintain my own self respect.

The people at these places definitely know how to get success if you measure that success by their own growth and increasing wealth. Not so much for their customers. They are very good at balancing their take so they don't kill the host organism, but sometimes their calibration is off and the host company dies anyway.

Fortunately for them, a dead host is only a mere nuisance since the executives that acted as vectors into the host move to a new host and engage all over again. Ironically, this is often based on their (vendor issued) 'award-winning success' with the project that killed their former employer.

Most of the world's working systems are running on Assembly code variants, C, COBOL, Fortran and other old languages used by old programmers. God help us when they all retire. [Note that compiling what is essentially partially crippled C code with a C++ compiler does not make it C++ code, let alone 'object oriented'. Presenting screen-scraped results from CICS does not mean your application is written in the scraping language. That is, it is not going to replace the underlying COBOL code doing the actual heavy lifting. Shout out to Ada. Not sure how much code is written in this language, but a lot of important mission-critical code is written in it and it is (despite being a bit clunky) a sane and sensible language.]

10
0

Wait, that's no moon 21.5-inch monitor, it's an all-in-one LG Chromebase PC

btrower
Silver badge

I like it!

Not the device, the reaction of Reg readers. I want to like this, but like other readers here, I have issues with it. My main objection to such devices is that they unnecessarily link different things making things like upgrades and repairs difficult or impossible when they should be easy.

What I would like to see is movement toward a small set of converged high bandwidth interconnects, and fasteners that would allow components that are naturally separable to be purchased separately and snapped together. Monitor, keyboard, mouse, CPU, backup disks, etc do not belong permanently bolted together. A broken mouse wheel should not require replacement.

I recently pulled apart an LCD monitor and found it was dead due to a single failed capacitor. Taking it apart was unduly messy, and the way it is built I have to track down a replacement capacitor, solder it in place and re-assemble, when at the very least I should have been able to snap off the electronics module and replace it.

1
0

Harvard kid, 20, emailed uni bomb threat via Tor to avoid final exam, says FBI

btrower
Silver badge

Re: This is why Tor was/is never going to work

@Andy Prough:

I agree with you. I had higher hopes for Tor, but we need a much more comprehensive approach to secure communications. I am not sure why everyone cannot see the need for it, but we need pervasive encryption everywhere. Making point to point communication untraceable should be a fundamental part of infrastructure. Rather than aiding and abetting rogue regimes attempting to illegally monitor and control speech, network infrastructure companies like Cisco should be working diligently to ensure that intercepting communications is as difficult as they can make it.

The state is an adversary in any rational threat model. Creating a system that actually aids adversaries is poor security design indeed.

5
8
btrower
Silver badge

Need to stop more than Tor

Lots of things were involved in this incident. No doubt they used electricity during this time and were aided and abetted by food and shelter. The fact that they were not actively under surveillance was a contributing cause. I think if we do a little digging we will discover that he got tools and ideas from the Internet and somehow was able to make sense of them; perhaps by leveraging early terrorist training in reading and writing. No doubt we could have at least slowed them down by ensuring they did not have access to plumbing and sanitary facilities.

Many things are implicated in any activity. The emphasis on Tor is an attempt to make a case against personal privacy, not because it is peculiarly enabling of crime and of no value otherwise, but because it threatens the entrenched power structure.

Most people are not going to get it, but the Commentardia should be easily able to make sense of this and should be communicating this to their fellows. Stopping private communications is net detrimental just like pulling down any other necessary infrastructure; perhaps more so.

7
3

China turns screws on Bitcoin with third party payments ban

btrower
Silver badge

It's an error. Fix it.

What this means is that the BitCoin system has a bug or design error that makes it vulnerable to this type of attack. If it is to be a proper part of the Internet, it needs to be able to route around such attacks.

This is an argument in favor of Crypto Currency, not against it. The fact that they have control over other forms of exchange and don't over this is a bad thing for them but a good thing for us. Intervening like this is an admission that they can't control Crypto Currency.

I have some technical misgivings about BitCoin in particular and I think it should be replaced rather than upgraded. However, I think we truly need a strong assault-proof Crypto-Currency.

3
1
