26 posts • joined Tuesday 8th May 2007 08:29 GMT
Re: Only if it plugs into USB
> > No they're not, you can prevent users from setting PINs to 1234.
> Our IT overlords have not done so. One of the many things I'd change
> if we weren't owned by a company so big it takes them a week to even
> look at an urgent problem.
0000, 1111, 2222, 3333, ...
0123, 1234, 2345, 3456, ...
are quite bad too... But note that if you eliminate every bad PIN, you also shrink the keyspace, and you'll lose entropy.
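Out of curiosity, the entropy cost is tiny. Here's a quick back-of-the-envelope sketch (my own toy blacklist of repeated digits and ascending runs, not any bank's actual list):

```python
import math

total_pins = 10_000                      # 0000-9999

# Hypothetical blacklist: repeated digits (0000 ... 9999) and
# ascending runs (0123 ... 6789).
repeats = 10
ascending = 7
banned = repeats + ascending

remaining = total_pins - banned
full_entropy = math.log2(total_pins)     # entropy of the full keyspace
reduced_entropy = math.log2(remaining)   # entropy after banning bad PINs

print(f"full keyspace:   {full_entropy:.4f} bits")
print(f"after blacklist: {reduced_entropy:.4f} bits")
print(f"entropy lost:    {full_entropy - reduced_entropy:.4f} bits")
```

Banning 17 obvious PINs costs well under a hundredth of a bit, while removing the guesses an attacker would try first.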
@Skrrp, "We are still selling standard Mifare cards to customers who are quite happy and don't report problems with attacks many years after the cracking method became public knowledge."
Your customers aren't reporting problems because they don't know they are being attacked! People ride for free on the London Underground due to the vulnerabilities in Oyster cards. Do your customers know that you are selling them obsolete kit? Are you advising them of the risks? You really should be!
@Anonymous Coward, "Why don't people use 10,000 bit keys?"
Because the computation would be rather slow...
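To put a rough number on "rather slow": RSA-style private-key operations cost roughly cubic time in the key length, so a 10,000-bit key is on the order of a hundred times slower than a 2,048-bit one. A toy benchmark of modular exponentiation (random odd moduli, not real key generation) shows the trend:

```python
import random
import time

def time_modexp(bits, trials=3):
    """Average time for one modular exponentiation at the given bit length."""
    random.seed(42)
    total = 0.0
    for _ in range(trials):
        # Random odd, full-length modulus; exponent sized like a private key.
        n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        base = random.getrandbits(bits) % n
        exp = random.getrandbits(bits)
        start = time.perf_counter()
        pow(base, exp, n)
        total += time.perf_counter() - start
    return total / trials

for bits in (1024, 2048, 10_000):
    print(f"{bits:>6}-bit modexp: {time_modexp(bits) * 1000:.2f} ms")
```

Every handshake and signature pays that cost, which is why key sizes are chosen to match the threat, not maximised.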
Cryptography doesn't solve problems.
I didn't realise losing unencrypted data was still newsworthy.
What is news is lost encrypted data secured by a weak password.
Cryptography doesn't solve problems.
BS wrote: "Is the author suggesting that all engineers & scientists are equal"
Paul 4 wrote: I hope not since to be a "science" graduate takes three years, to be an engineer takes about 7 (3 for your BEng, 1-2 for your MEng and then 2-3 for your chartered status). I know because I'm an engineering drop-out.
I consider myself a scientist, not an engineer. But I have an MEng (which includes a BEng), which took four years of study. Chartered status can be applied for without any additional study as such; you merely have to have demonstrated various skills in your day job.
All engineers & scientists are equal...
Is it just me, or does the opening paragraph not make sense....
``University admissions statistics reveal that more students than ever before in Blighty have enrolled on courses in science and engineering this year."
So far, so good. But then
``Unfortunately this progress has been achieved at a grim cost, as far larger numbers of young people have as usual chosen to study law, business, management, psychology - and computer science."
This seems to imply that the author prefers science and engineering over "law, business, management, psychology"; but then the author adds ``computer science," which is both a scientific and engineering discipline.
Is the author suggesting that all engineers & scientists are equal, but some are more equal than others?
``attackers have figured out how to crack the captcha Facebook uses [sic] to ensure profiles are created by humans, rather than computer scripts that automate the process so it can be carried out thousands of times."
Attackers break captchas? Why not simply write a great porn site and require captchas-per-movie?
Did Schneier really ``issue a refresher on the secure creation of passwords"? When I read the article last week I didn't feel that Schneier was advocating these guidelines. I felt that Schneier had read them and found them impractical, as suggested by his remark ``I'll bet -- no one follows [the said guidelines]." Moreover, he goes on to say that he ``regularly breaks seven of the rules." Surely a leading security expert would not advocate the use of something he refuses to follow himself? Furthermore, Schneier is the author of a product http://www.schneier.com/passsafe.html that clearly violates the advice against ``putting [passwords] into a file on your computer." On this basis I consider Dan Goodin's claim that ``security guru Bruce Schneier issued a refresher on the secure creation of passwords, just last week" to be deeply flawed!
Who watches the watchman?
``Sequoia and manufacturers of other brands of e-voting machines frequently discount vulnerability research into their products by pointing out that the underlying source code is closely guarded."
How many employees do Sequoia have?
(If you're a Sequoia spokesperson and require me to elaborate further then you clearly do not understand the problem!)
"The more uncertain users feel about typing passwords, the more likely they are to (a) ... and/or (b) copy-paste passwords from a file on their computer. Both behaviors lead to a true loss of security," he said.
Copy-and-pasting from a file on your computer leads to greater security! Clearly Nielsen has not heard of MyPasswordSafe.
Big Brother blocks content
Google have blocked the guy in the original link, but you can still see him:
No need for encryption
For those suggesting that encryption is the answer, I think you should elaborate. Physical security via ``two successive manned security doors" is sufficient to prevent unauthorised access to data. Whoever stole the data either had the ability to access it, or the physical security policy wasn't working!
There is, however, a risk that should be considered: if physical security permits theft, then the data is accessible. Note that this risk only arises if physical security is violated. If that is thought to be a credible threat, then encryption should be used; theft of any computing equipment should be considered, and hence all hard drives should be encrypted.
``I saw a Diebold cash machine in Slovakia earlier this year, I needed some cash, but I decided to wait until I found one who's ability to count wasn't in question."
You should have withdrawn a large sum. No bank could possibly contend that Diebold can count money correctly.
On a tangent.
Nationwide now require me to use their card reader. One problem -- I don't have any cards! I don't use them, so why would I want one? I certainly don't want one for the sole purpose of using a card reader.
``Your set of requirements for e-voting is impossible to meet, specifically: Receipt-freeness and Individual verifiability"
This seems quite bizarre, but cryptography can help us here. Okamoto has developed one such scheme which claims to provide both receipt-freeness and individual verifiability: http://citeseer.ist.psu.edu/129743.html
Chevallier-Mames, Fouque, Pointcheval, Stern & Traore have claimed that [in their standard model] the following properties cannot be met simultaneously:
* Universal verifiability and privacy
* Universal verifiability and receipt-freeness
(It may of course be possible to achieve these properties if different assumptions are made. I haven't looked at what they define as the "standard model," hence I cannot offer judgement as to whether such a model is realistic.) See http://www.di.ens.fr/~fouque/pub/wote06.pdf
For further academic research you may be interested in the following survey papers:
Further links can be found from Helger Lipmaa's voting page:
I've heard about the voting machine in Holland. Rop showed me how he could eavesdrop on all the votes cast using a cheap bit of kit! See:
@A J Stiles, I like it ;-)
@Daniel B: Even with folded ballots some information is leaked - how many ballots have been cast. The question is then: does this matter?
``As for the topic: pure e-voting machines can't be constructed which meet the current UK requirements for an election."
Well change the law! We don't even get privacy with the current system.
(Every ballot has a serial number which is linked to your name)
On a more serious note (@James Dunmore)
Developing a secure electronic voting protocol is hard enough. In fact, eliciting the requirements is hard enough! Here's one set of properties which an electronic voting protocol should satisfy:
* Privacy: the way in which a voter cast her vote is not revealed to anybody.
* Receipt-freeness: the voter is unable to prove that she voted in a particular way.
* Fairness: no partial tally of results may be obtained until the official count.
* Eligibility: only authorised voters may vote and at most once.
* Universal verifiability: anybody can check that the published tally really is the sum of the votes.
* Individual verifiability: a voter can verify that her vote was really counted.
That list isn't complete, you could for example add:
* Invisible abstention: the voter herself should be the only person who knows whether she participated.
Numerous other properties have been discussed in the literature. It is not known whether an e-voting protocol can even achieve all these properties simultaneously.
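To give a flavour of how cryptography gets even *one* of these properties, here's a toy split-tally sketch (my own illustrative construction, not Okamoto's scheme or anything from the literature): each vote is split into two random additive shares mod a prime, held by two independent tallying authorities. Neither authority alone learns any vote -- a weak form of privacy -- yet the sum of the two published share-sums yields the correct tally.

```python
import random

P = 2_147_483_647  # a prime far larger than any realistic tally

def split_vote(vote):
    """Split a 0/1 vote into two additive shares mod P.

    Each share alone is uniformly random, revealing nothing about the vote."""
    share_a = random.randrange(P)
    share_b = (vote - share_a) % P
    return share_a, share_b

def tally(votes):
    box_a, box_b = [], []
    for v in votes:
        a, b = split_vote(v)
        box_a.append(a)   # authority A only ever sees box_a
        box_b.append(b)   # authority B only ever sees box_b
    # Each authority publishes only the sum of its shares.
    return (sum(box_a) + sum(box_b)) % P

votes = [1, 0, 1, 1, 0, 1]        # four "yes", two "no"
print("yes votes:", tally(votes)) # → 4
```

Note how much is still missing: nothing here stops a voter proving how she voted (receipt-freeness), lets anyone check the published tally (universal verifiability), or survives the two authorities colluding. Real protocols need far heavier machinery for each property.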
I have yet to even consider actual implementation...
If you want evidence as to why it's not easy, take a look at what the guys at Princeton have been up to.
``WHY - when you go to a polling both, isn't there an electronic console - press the buttons, job done, vote counted"
There's no need for me to write a response, this cartoon sums it all up:
(Maybe Paris would believe it is secure...)
``are transparent ballot boxes. As it stands, voters have no way of knowing if the box they're putting their ballot into has been pre-stuffed."
I'm unsure as to how a transparent ballot box gives you an advantage. The first voter of the day is of course assured that the ballot box hasn't been stuffed at that point. The officials will also be satisfied (but don't they check the box before using it? I would hope so!) However, subsequent voters have no idea. Unless of course we all turn up early and watch all day. Note that this is also a DoS attack, since only so many people can fit into a polling station.
Furthermore, could a transparent ballot box be considered to violate fairness? Let me first define fairness as ``no partial tally of results may be obtained until the official count." Can leaking an approximate number of ballots which have been cast in a particular polling station violate this property?
Does TheRegister believe in proper journalism?
"However, in certain unusual circumstances a savvy attacker can lift the keys from computer memory. "
Presumably the article is referring to the recent work at Princeton. That attack is simply infeasible in this instance: the laptop's main memory would have faded, and thus the key would not have been recoverable. Furthermore, since the data was held on floppy disk, there is nothing to suggest that the floppy disk's encryption key ever resided in the laptop's main memory (the attack relies upon the assumption that the disk had at some stage been used in that laptop and, moreover, that it had been used recently).
"As to the TPM, that is giving the computer to the manufacturer or ISP and them letting you use it. It only works if YOU THE OWNER are in control of TPM. But that doesn't help MS and it doesn't give ISP's power, so it's not going to happen."
Sorry, can you explain your point?
The technology is there
``Most important, can we design a sandbox PC, which does what most of us want (visiting Facebook, managing photographs and videos and music, searching the web for news and chat) but which can only run other software when recognised by the ISP which provides our web link?"
There is no need to `_design_ a sandbox [sic] PC.' The technology to achieve your goal already exists, to some extent, in the form of Trusted Computing. A Trusted Platform Module (TPM) can attest to the system's state, and an ISP can then decide whether that state is suitable. The problem, of course, is that too many states exist. If, however, an ISP subscription included a laptop (or PC) which the ISP maintained remotely (as part of its service package), then the state would be known and this solution would be viable.
It wasn't his fault....
Personally I believe there *could* be a perfectly valid reason why the gentleman *accidentally* contacted his wife. Facebook allows you to give it access to your MSN profile in order to (spam/)email all of your `friends.' If the man's wife was listed in his MSN profile, then Facebook would have emailed her.
This is also the hardest (ever) to remain logged into!!