Your consent 'almost always' needed when firms use your data to profile you

Organisations "almost always" require individuals' "free, specific, informed and unambiguous 'opt-in' consent" in order to make use of personal data they have previously collected in Big Data projects that involve analysing or predicting the "personal preferences, behaviour and attitudes of individual customers", an EU privacy …

COMMENTS


free, specific, informed and unambiguous 'opt-in' consent

So that'll be a "By use of this service, you consent to..." line in the EULA small-print then.


Re: free, specific, informed and unambiguous 'opt-in' consent

Ah, you beat me to it.


Oh FFS

"almost always" require individuals' "free, specific, informed and unambiguous 'opt-in'"

Then it's not free, specific, informed or unambiguous. So what was the point of that working party again?


Re: Oh FFS

Yes it is... just not in absolutely all cases.


Re: Oh FFS

Then how do you know whether you should opt in or not?


ha ha ha ha ha ha ha ha ha

Still haven't stopped laughing. A watchdog with no teeth barking at a lorry driving past the gate.

At least I'm more cheerful than before reading the article. The only downer is we pay for these morons to waste time making up pointless reports that will never be relevant.


Meaningless...

... unless it is 'almost always' backed by enforcement.

In the UK, there is no enforcement, ipso facto the law & the soothing words of WP29 don't protect you.

Sadly.


Dumb bureaucrats

There's at least one fundamental flaw in these recommendations:

...underlying "logic of the decision-making (algorithm) that led to the development of the profile"...

That's fine if the logic is as simple as "if salary > 50,000, grant the loan".

However, if you have an evolutionary (or density-based, or almost any other form of intelligent mining post-dating the 1980s) algorithm, then the algorithm itself is effectively changing based on the other data received. In order to tell you why you are considered not worthy of a loan, I need to tell you all the personal details of all the other loan candidates, with their credit history, locational information and everything else that factored into the decision, because that's what makes the algorithm work. There's also quite possibly a randomness element, which means that somebody else with the same history as you might actually be granted the loan, because the system is risking a bad loan to effectively test its internal assumptions.
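To make the contrast concrete, here is a minimal Python sketch (hypothetical data and function names, not any real lender's system) of why a simple threshold rule can be disclosed on its own, while a neighbour-based decision cannot be explained without exposing the other applicants' records:

# A threshold rule: the "logic" can be disclosed without touching anyone else's data.
def threshold_decision(salary):
    return salary > 50_000  # grant the loan if salary > 50,000

# A data-driven rule: approve if the applicant's nearest neighbours (by salary and
# credit score) mostly repaid their loans. Explaining why this applicant was refused
# means revealing which past applicants were "similar" and whether they defaulted.
def neighbour_decision(applicant, past_applicants, k=3):
    def distance(a, b):
        return abs(a["salary"] - b["salary"]) + 100 * abs(a["credit_score"] - b["credit_score"])

    neighbours = sorted(past_applicants, key=lambda p: distance(applicant, p))[:k]
    repaid = sum(1 for p in neighbours if p["repaid"])
    return repaid > k // 2  # grant only if a majority of similar past applicants repaid

past_applicants = [
    {"salary": 48_000, "credit_score": 610, "repaid": False},
    {"salary": 52_000, "credit_score": 700, "repaid": True},
    {"salary": 47_000, "credit_score": 590, "repaid": False},
    {"salary": 60_000, "credit_score": 720, "repaid": True},
]
applicant = {"salary": 49_000, "credit_score": 640}

print(threshold_decision(applicant["salary"]))         # the reason is a single number
print(neighbour_decision(applicant, past_applicants))  # the reason is everyone else's data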

I don't blame the EU; it's just typical of lawyers/politicians/civil servants who assume that because it only takes a few seconds on a website, the underlying technology is as simple as their assumptions.

Anonymous Coward

Re: Dumb bureaucrats

Risk a bad loan to test an assumption... how obvious, but I never thought of that. I learn something every day.
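For what it's worth, the idea described above is close to what the bandit literature calls epsilon-greedy exploration. A toy Python sketch (purely illustrative, not any real lender's code):

import random

# Toy epsilon-greedy sketch: mostly follow the model's prediction, but with a small
# probability approve an applicant anyway, so the system gathers evidence about the
# kind of applicant it would normally refuse.
def decide(predicted_default_risk, epsilon=0.05, threshold=0.2):
    if random.random() < epsilon:
        return True  # occasional "exploration" loan to test the model's assumptions
    return predicted_default_risk < threshold  # the usual "exploitation" decision

# Two applicants with identical risk scores can therefore get different answers.
print(decide(0.35), decide(0.35))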


I agree that strong enforcement is critical

I agree that strong enforcement is critical. I like the statements that "those firms would still require to insure that the information is kept confidential and secure" and that the "EU's Charter of Fundamental Rights indicates a hardening of attitude". I also like the statement "Expect to spend time looking at your purposes and consents much more closely before you embark on your next big data project", since I believe that the big data security crisis is just around the corner:

1. I think a big data security crisis is likely to occur very soon and few organizations have the ability to deal with it.

2. We have little knowledge about data loss or theft in big data environments.

3. I imagine it is happening today but has not been disclosed to the public.

What do you think?

Ulf Mattsson, CTO Protegrity

