Re: Andrew Richards
SVV said it for me - the lack of strong data typing to catch mistakes in data use is the single biggest thing by far. Fine and less effort to write for a 20 line shell script, pants for anything complex.
Anything that allows fast and usable code cross-platform to be developed without resorting to flaky and/or proprietary systems like ActiveX, Java applets, or NaCl stuff is to be praised.
Hopefully the MS implementation will remain "standard" and thus be fully cross-platform (browser, OS, and CPU) and future web developers will look at using this best (OK, fastest) sub-set for writing stuff.
The FPTP system is basically broken if you have more than 2 candidates per seat, and even then a tad doubtful with only 2. Some sort of AV/PR system is going to give you a more balanced seating.
However, the biggest problem is not how we vote for the devious, thieving two-faced bastards, but that so many of them are useless at their jobs and do little more than knee-jerk to get voted in again. Until we deal with who stands for election, and what skills they ought to have (you know, like having had a REAL job for some time and not been a career politician) then nothing will really get better.
As for Scotland, 50% voted SNP but they got 95% of the seats which is not exactly representative. Still, the only glimmer of justice is UKIP got more votes than the SNP but only 1 seat...
Usually the biggest error made in predicting RAID failures is the presumption of uncorrelated faults. Most of us know from bitter experience that faults are much more likely to happen in a strongly correlated manner due to:
1) Manufacturing defects (or buggy firmware) that impact on a lot of disks, and you have all from the same batch...
2) A stress event prompting the failure, such as power cycling after years of up-time, or an overheating event due to fan failure, etc, that is common to most/all of the HDD in the RAID array.
So you should start by assuming HDD faults of around 5% per year and do the maths from that, not from claimed BER figures.
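To put rough numbers on that, here is a minimal sketch of the arithmetic, assuming (as above) a ~5% annual failure rate per drive and an illustrative 10-drive array with a 3-day rebuild window. All figures are assumptions for illustration, not vendor data, and the independence assumption is exactly what correlated faults break:

```python
afr = 0.05          # assumed annual failure rate per drive (not the quoted BER)
n_disks = 10        # drives in the array (illustrative)
rebuild_days = 3    # assumed rebuild window for a big drive

# Chance that at least one drive fails in a year (independence assumed):
p_first = 1 - (1 - afr) ** n_disks

# Chance one of the remaining drives fails during the rebuild,
# naively scaling the annual rate down to the rebuild window:
p_during_rebuild = 1 - (1 - afr * rebuild_days / 365) ** (n_disks - 1)

print(f"P(some drive fails in a year)   ~ {p_first:.1%}")
print(f"P(second failure during rebuild) ~ {p_during_rebuild:.2%}")
# Correlated faults (bad batch, shared stress event) push the second
# figure well above this naive estimate, which is the point above.
```

Even this naive sum gives roughly a 40% chance per year of starting a rebuild on a 10-drive array; the correlated cases only make the follow-on failure more likely.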
And you don't see the problem in losing/corrupting a chunk of your data without knowing what file it was?
First point has already been made - you just can't do all-flash for a lot of cost & space requirements.
Second point, as most folk will find out sooner or later: HDDs don't suffer from simple random bit errors; they almost always come as big clusters at a time, and are generally much more common than the quoted BER figures would suggest.
Worse still is that most file systems don't tell you if something is corrupted, so if you do get a rebuild error on sector 1214735999 then how do you know which file to restore? Yes it is possible to work that out but it is a major PITA to do so. Furthermore, you can have errors that are not from disk surface read flaws, such as the odd firmware bug in HDDs, controller cards, etc. So you really want something that protects against all sorts of underlying errors if you have big volumes of data (or really important stuff). Enter ZFS or GPFS as your friend - they have file system checksums built in. And if it matters make sure the system has ECC memory so you don't get errors in cached data being written to disk!
The multiple day rebuild times are not such a problem in some ways just so long as another HDD doesn't fail during it. So if you have any biggish array you should start by using double parity. It is much better to have 8+2 in a stripe than 2*(4+1) in terms of protection against double errors, etc.
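The 8+2 versus 2*(4+1) claim is easy to check by counting cases, assuming any two of the ten drives fail together (illustrative, as above):

```python
# Why 8+2 beats 2*(4+1) against simultaneous double failures.
from math import comb

total_pairs = comb(10, 2)        # 45 ways to pick the two failed drives

# 8+2 (two parity drives covering one stripe): any 2 failures survive.
raid6_fatal = 0

# 2*(4+1): data is lost iff both failures land in the same 5-drive group.
raid5x2_fatal = 2 * comb(5, 2)   # 2 groups * 10 same-group pairs = 20

print(f"8+2     loses data in {raid6_fatal}/{total_pairs} double-failure cases")
print(f"2*(4+1) loses data in {raid5x2_fatal}/{total_pairs} "
      f"(~{raid5x2_fatal / total_pairs:.0%}) of them")
```

So roughly 44% of double failures kill the split RAID5 pair, while the double-parity stripe shrugs off all of them (at the same 20% capacity overhead).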
Finally if you have an array make sure you regularly scrub it - most RAID systems support this (hardware cards, Linux's MD system, ZFS, etc) and it forces the controller to read all sectors of all discs periodically so errors can be detected and probably corrected before you have a HDD fail completely (they will do the parity checks and attempt a re-write, probably forcing a sector reallocation on the flaky HDD). For consumer HDD do it every week or two; for enterprise you can probably get away with once a month.
One day? How about the Boston Marathon bombing? That had the so-called PATRIOT act in place, shit they even had warnings from Russia that these guys were trouble, and what did it prevent?
I don't see how they can tell (yet?) which key was pressed, but they might be able to find out your password's length and so target brute-force on a subset of users with short-ish passwords.
Well thank $DEITY that people realised this and sent them the best answer possible - not buying a shitty locked-in product. One hopes this will be a lesson, perhaps not of Ratner-esque proportions mind you, for other businesses to take heed of.
That part is, to me, fair enough.
What was not fair was it was in effect a fixed fee, and not a progressive taxation based on overall income (including any benefits, etc).
Well, if Ukip have their way there will be
no only undesirables remaining...
Fixed it for you...
But I guess there is a big difference between "local" attacks, where the person has to gain some sort of physical access, and the risks from a remote hack being used.
While there probably are very few bad/mad enough to do this in total in the world, the risk of it being done is much higher if the perpetrator need not travel or physically risk being caught. To me that is the real issue with the whole IoT craze, not that someone who gets on my LAN can do something stupid/bad, but that suddenly any twerp anywhere in the world can take a shot at things because the devices are being exposed to the WAN, without adequate security or patching, for whatever reason the designer thought cool.
You are, of course, perfectly right.
Sadly you are also in the minority as developers go, in particular if you have XP-era (or older) software that you need to run. Even a lot of MS's older stuff flouted their own "good practice" guides!
Interesting development. But for now I will stick to a handful of VMs with XP and the strange win32 stuff I can't get on other platforms.
Come on now, they never said they would catch smart terrorists or criminals.
This is about citizens who are disliked by those in power, sorry, about catching the ones trying to set fire to their underpants. Probably after they have failed, but see - we have emails to prove they have proper explosive pants!
Ah yes, illustrates the importance of recording such meetings, completely off the record of course, on a phone. An Android most probably...
Are the patents for FAT32 not expired now?
After all it's been 20 years since Win95 came out with long file name support. Sure it sucks as a file system, but I doubt you need a license for that anymore. Not true for exFAT of course, as it is a recent one...
The answers to your points are:
1) Yes, realistically you need newer machines to have a decent chance of running a VM. Think of at least 4GB RAM and support for virtualisation (AMD A8 ought to be fine).
1.1) What the VM buys you is you don't need to have drivers for the new hardware for an old OS (currently a w2k or XP issue).
1.2) You can also (sometimes!) migrate a working machine into a VM image and thus save the process of installing the OS, patching, installing applications, getting license keys, setting stuff up, etc. Downside is you don't then clear out years of crud.
2) Most software that is currently performing OK on a 5 year old machine will be fine in a VM, and you can get some video acceleration support for the VM as well (depends on OS/video driver/etc).
Obviously you won't get "bare metal" performance but often the convenience beats that except for really high performance tasks, gaming, etc.
3) USB dongles are not usually a problem, you can selectively connect USB devices through the host to the VM, but you might find the occasional thing that won't work.
However, all change has a cost (time, software and hardware, sometimes all 3) and eventually you need to attend to it. Better to do it before the excrement hits the HVAC attachment so you don't find big problems that take ages to work around.
If all you need is XP/7 application support, and not special hardware, then running Windows in a VM is a good solution.
OK, for the typical end user it's a little more training/understanding of the whole "computer in a computer" arrangement, but it allows you to totally decouple the application+OS you depend upon from the hardware you have. You also can lock it down so web/email is from the host, and the VM has only the internet access it really needs (which could be zero). Finally, as a lot of malware now avoids running in a VM to evade analysis, and you are probably not exposed so much, you can drop a lot of crappy AV software and rely on other methods of recovering from an infestation (as AV is pretty shit generally at that job).
For myself I have XP and 7 VMs for CAD software, Office, etc, and use Linux for my host machine. No need to rent, no need for cloud unless I want it, no need to sign up to a MS account, etc.
What? Did I miss Twitter having an actual use?
I think most SSD support a "secure erase" instruction that wipes the device. They would have to prove you did it (harder to prove if the wipe software was on the SSD when it wiped) but that way there is nothing encrypted to be forced in to decrypting (or trying to prove that random data is in fact random data, for example, as I have from the Numerical Recipes CD). Might also be useful if your device is stolen/confiscated for espionage (industrial or nation state) reasons.
What is a bit sad is the fact this discussion is taking place. That people feel enough of a threat of 'data' being used/abused to convict them when in the past you generally had to be shown to have physically done something and/or have corroborating evidence from others.
The issue has nothing to do with disk space (usually) but everything to do with the mindset of your typical large system IT department where if it can't be locked down by AD policies, it ain't going on their machines.
It is not that daft a rule, as typically they want to be able to control trust certificates and proxy settings, etc, as well as controlling what sort of plugins are permitted.
If Mozilla really do want to be relevant and get a bigger share of the corporate world they ought to make their web browser and email clients much easier to administer remotely using Windows practice and ideally something for Mac/Linux as well.
Stop copying the dumbed down Chrome UI and its policy of changing stuff every month or two, as that just pisses off people who have to manage and train non-technical staff.
So if no one has checked the person requesting the certificate, how can you trust it? How do you know it was really issued to the site it now vouches for?
That is the underlying problem of the whole https system: the certificates are only as secure as the logical-OR of all 600+ authorities who can issue them, and some (or their governments) I would not trust as far as I can comfortably spit out a rat...
Hence we then have the "certificate pinning" that sort of works on some browsers & sites. And we have Chrome basically ignoring certificate revocation completely (speed matters! WTF do you care if it's dodgy?)
"customers will be willing to download a free alternative"
Try telling that to the Gov, NHS, etc, who have their balls in IE's vice...
Would a better approach not be to have a system where all of the stuff they access on-line is included in the submission the examiner gets?
That way you can mark how well they "used" google and check for simply asking the question and/or using google to go to a 'mechanical Turk' site for a solution.
OK, would make marking a bit more tedious, but maybe the knowledge that their search operations are assessed would make for a more focused approach.
If MS has really done the decent thing and put a bullet in IE's multiple mutant heads, and developed a new standard-compliant browser that is up with the rest of them, I applaud them.
But why only Win10?
I mean all of the major browsers like Chrome, Firefox and the also-ran Opera (sadly now a skinned Chrome engine) manage to support various versions of Windows and also Mac & Linux. Why can't MS do this?
Of course if their business model did not consist of screwing every last cent out of whoring its users' data from advertiser to advertiser, maybe this would not be a problem?
Yes, it is free, and no I would not pay for it. Unless it really did offer privacy and respected laws outside of the USA.
Why? Really, if anyone is sick enough to want to use a radio-controlled bomb there are plenty of other RC devices out there that don't need a mobile phone network. Or use a timer. Or do without a radio link at all and, as demonstrated by those very 7 July tube bombings, push the button and blow yourself up as well.
One strongly suspects there is much more to this reluctance than just how some bomber could manipulate the network kill-switch.
In 5 years will we be reading "After a couple of years,
Microsoft Google moved the goalposts again. IBM Microsoft couldn’t keep up and threw in the towel" I wonder?
I don't like Google's behaviour with many things, but it is hard to feel much sympathy here.
"your child's child likely will have no need of handwriting"
So they can't sign things and thus we all have to be corralled in to a biometric future like cattle, all suitably tagged and compliant.
They may not be "safety critical" but they sure are business-critical as shown today.
Also I doubt the cost of having software for two OS is anywhere near double the cost of one, but we will have to wait and see if it was an OS problem killing the connections or an app problem. Either way, it is a timely reminder of just how much companies depend on IT systems working.
"Problem = solved."
Not really. While the "Professional Starter 5 Server" looks as if it provides your cloudy store & share, it still leaves open the whole issue of how you secure access to your own server to host it (assuming that you have the need for enough data to make them hosting it uneconomical or too slow, so you want only some data synced but lots more on-demand).
Also you might have software on a home/work machine you need to run remotely (maybe it's tied to a MAC address or whatever for licensing). That was why the issue of choosing & configuring a router/VPN was mentioned, as it could drastically reduce the chances of others having a pop at your server, etc.
"safe solution would be to have your own NAS box somewhere on the network"
Yes - except most home & small-office products are shockingly shit when it comes to security.
Maybe Trevor Pott has some advice from his much greater experience than me, but personally I won't put any of my machines on the world-accessible network as I don't trust them much. My own Linux PC which I can SSH in to also has a 2nd software firewall (behind my el-cheapo router) that only allows my work's sub-net to even try a log-in.
It might make a useful article: how to choose & set up a router and NAS + a few machines so you can VPN in and access your data or desktop with tolerable risk?
It's not just the Americans, though they seem to be the worst offenders these days given the open attitude of "USA courts can enforce USA laws in other countries".
It is about anyone out there who wants to get a hold of your data: be it spy agency in your own country or another, business competitors, jilted spouse, nosey employee at your hosting provider, whatever.
As for deliberate weaknesses, that is far easier to do in a closed source implementation (to leak the key as claimed for Crypto AG devices) than in a standard (where you hope that the breaking effort is much less than obvious brute-force due to some knowledge you have about it). Which is why the only standards you should consider are ones that have been publicly analysed by the international community (e.g. AES) and not ones where the creation was done in secret (e.g. Dual EC).
The only way that is trustworthy is to have your own encryption.
That way if anyone has a legal reason to access your data they have to come directly to you with a court order. You then only have to respond to courts that have legal authority over you, not over your ISP or over your cloud provider, etc.
Just to add that SpiderOak claim to provide a drop-box like file sync/share with "zero knowledge" of the data stored on their servers. Of course, just so long as you don't create a share link for web access as that needs your key to be transferred.
This is how it should be!
The only reservation I have is I don't think it has been independently audited and even if the source was available to me, I doubt I could audit it myself.
Yes, look at BT here in the UK.
They outsourced email to Yahoo and the buggers changed settings from time to time without it being updated on BT's help pages, and their useless hell desk had no clue either :(
I mean WTF are they doing changing an email server's settings without informing the users. You know, maybe by emailing them in advance?
If I am kind, then it is simple incompetence in not knowing the POP/IMAP settings at any point in time. If cynical, then it's because they want people to use the web-mail interface where they can serve up adverts.
Encryption works if you use the "cloud" for data storage, say as an off-site back-up. And it is only trustworthy if you have control over exactly what software is doing it (and realistically that means a well regarded open source system) and you are the only one holding the key.
Where it all falls down is if you are using the "cloud" as a computing-on-demand service, or for document sharing and web-based editing, because then it has to be decrypted on the servers of the host, so they have access to your key.
Sure, the data at rest (i.e. stored on disk) may be encrypted, but they could snapshot the running VM or whatever and then poke through its memory for the key.
Really if you are concerned about privacy then run everything on a local machine, with multiple layers of firewall/VPN style protection depending on who/where access is needed, and only use an off-site provider to keep encrypted backups. That you encrypt before they move off-site.
Yes, fines should be large and enforced otherwise bugger-all will change.
How said companies choose to respond is up to them. It would be better for free software, and probably cheaper for them, to cooperate in making specifications fully public; it would also help build trust that nothing dodgy was added. But sense seems to be a rare thing these days.
Even if not going so far, it is time that suppliers were punished financially for failing to freely patch bugs in a timely manner for, say, 5 years after the software/product was last sold.
I don't see the logic here, if they are using phones to simultaneously trigger bombs then by time you know about it all said bombs have gone off. And if your aim is to detonate other bombs a bit later, you have timers and/or the ability to notice the network has gone dead for that.
The only situation where it would make any sense, and probably it is the reason for them wanting the document kept secret, is for demonstrations and similar where you would not want the organisers to be able to re-route a march, etc. And then it starts to look rather undemocratic.
Doh, me being stupid again! Why would they presume the people should have any say in their government's actions?
The problem comes down to two simple issues:
1) People want new & shiny & cheap.
2) No one gets punished for shit software.
Put them together and you see what IoT is bringing. As we can't stop people buying cheap tat, the only other real option[*] is to start making suppliers liable for shit security.
We know you can never be perfectly secure, but "shit" means things like known insecure protocols, no enforcement of password changes, no patching, ignoring vulnerability reports for more than 30 days, etc. That sort of thing ought to be punishable by more or less unlimited fines depending on how much lacking in diligence is found.
[*] Of course we could pay lots to mitigate other people's shit, but that is a lost battle if the projected numbers of IoT are true. Making the "polluter pay" is a better idea IMHO.
They are allowing for the Spinal Tap Hacking Crew.
I think he meant the first consumer-facing system. They ran in parallel with 95/98/ME and were intended for serious applications (proper 32-bit programs, multi-user, etc).
Sadly in the push to make consumer & professional lines converge and be fast enough for gaming, compatible with older badly written software (some of it MS' of course!), etc, a lot of dumb decisions were made w.r.t. security, etc.
"That affects the price of everything"
Yes, but it also pays for better standards of health, hygiene and public safety. Where would you rather live, a poor-to-middle region of the UK, or poor-to-middle of Indonesia?
(nothing against Indonesia as such, but its your example)
AFAIK the reason for taxation is we all want to live in a safe and prosperous environment.
That in turn means we need protection from those who would steal our sheep and rape our wife (or steal the wife and rape the sheep, same principle). Outside of our nation (a somewhat arbitrary boundary, usually resulting from hundreds of years of bloodshed and the odd natural boundary) it means we need some armed services and intelligence agencies, and inside that boundary we need the police and legal system. All has to be paid for.
We also want things to be generally clean and safe, so we need things like sanitation and refuse disposal, health care, some standards and enforcement of employment law, etc. For long term prosperity we also need education so those who are able can do well in employment, not just those lucky enough to be born to those who value and can afford to pay for it. Bitter experience has shown that most people are lazy and will try to avoid "public spirited" support, very much so if it costs them money, so we also have to find a way of making sure it is paid for. So we have taxation.
"why not negative income tax for individuals"
Is that not one key aspect of the welfare state? To provide support for those who can't otherwise afford food, shelter, etc? While it might be popular in certain political circles to class them as spongers and time-wasters, and I dare say there is a proportion who are like that, the reality is a lot of folk will find themselves out of work at some point in their lives for any one of a number of reasons. Without support they could well end up as 'unemployable tramps' and never get a 2nd chance. Even if you are totally self-interested you should still want some welfare state, as poor and hungry people may decide to take your property, and maybe your life as well, since they have little to lose.
I am not saying current governments are optimum, but it is a hell of a lot better than the pre-taxation days.
One thing that has been touched upon in these comments, rather than in the article, is the issue of how easy it is to avoid showing a profit in order to avoid taxation.
That is the main beef of the "man in the street" when it comes to corporate tax: not that it is, say, less than the standard rate of income tax, but that on massive turnover international businesses (and some UK-based ones) somehow magic it away via shell companies, curious accounting practices, etc, and are only seen to make a pittance in "profit" to tax, when we know (or at least suspect) someone, somewhere, has made a fortune.
Now there may well be some truth in the idea that taxing people directly, be it as consumer, worker or shareholder, is simpler and reflects who ultimately pays anyway. But for a lot of the public, a system that taxes turnover or related activity would be seen as fairer, as there would not be huge sums of money going abroad without tax being paid to support the local government and population.
Indeed. As the apocryphal survey found out: 90% of men masturbate and 10% are liars.
Shame that she felt she could not come back to the job. He should really have just given her a stiff talking-to so she could come clean and not be interfering with the company's download jobs, those that ought to have been in-hand at that time of the evening.
At least if you are using a synced-to-the-cloud system and the supplier goes off line one day forever, you still have a local copy of your data.