Politically linked deepfake LinkedIn profile sparks spy fears, Apple cooks up AI transfer tech, and more

Here's your latest dose of machine-learning news beyond what we've already published.

Beware, deepfake-generated accounts on LinkedIn

An expert in cybersecurity and policy in Russia found a suspicious-looking LinkedIn profile containing what looks like a deepfake-generated profile photo. Said fake profile described a 30- …

  1. sorry, what?
    Facepalm

    Was this an "AI" generated article...

    Enticing people to click through to the LinkedIn profiles so the owner(s) of the profiles can find out who in tech is naive enough to access those profiles and therefore give away their identities (assuming they are on LinkedIn, which you need to be to view the profiles from what I can see)?

    1. Anonymous Coward

      Re: Was this an "AI" generated article...

      The article itself rather confirms it - English is evidently not well known to this Register journalist.

  2. STOP_FORTH
    Thumb Down

    Careful now

    Down with this sort of thing.

    Also, I'm Brian, and so is my wife.

  3. Christoph

    Why are you using facial recognition on our own citizens

    Yet again, it's terrible that this intrusive technology is used on US citizens and violates their rights - but it's perfectly OK to use on everyone else, because they are not real people and don't have any rights that can be violated.

    1. Michael Wojcik Silver badge

      Re: Why are you using facial recognition on our own citizens

      There are plenty of people in the US, and likely at least a few in Congress, who believe facial recognition shouldn't be used on anyone, at least as a form of mass surveillance.

      But you don't achieve anything by picking only the battles you can't win. There's some chance that Congress can squash this particular program, in part by playing the "against citizens" card. If they do, that 1) makes it easier to subsequently limit or cancel similar programs, including those against non-citizens; and 2) makes it much more expensive for CBP and TSA to run these programs at all, because they'd have to sort travelers into citizens and non-citizens before applying FR, and the returns would be much lower.

  4. DCFusor
    Unhappy

    Article linked-in link leads to wrong profile

    You get a gal claiming to be a journalist, not even the same name - and you don't even have to click the link to see that: just hover and look at your status bar.

    Either this author or someone else is playing a game here...

    I had seen the "real" fake profile a couple of days ago when it was initially reported. This article itself becomes a scam of some sort.

  5. mr-slappy

    LinkedIn Scammers

    I get a lot of LinkedIn crap (I only use it when changing jobs tbh) and about a year ago I received a request from a young attractive blonde woman who I didn't know.

    Nothing particularly unusual there, but her job was as a Geography teacher at my kids' secondary school. I'm also a school governor (elsewhere) so I thought maybe she was a real person who was a staff governor. But when I checked with my kids, they said there was no-one of that name who worked at the school and she didn't appear on the school's website.

    It took LinkedIn a good six months to remove her from the site, and bizarrely, I saw that some of my more gullible work colleagues (definitely not connected with the school) had actually connected with her.

    I am still trying to work out how the scammers knew which school my kids go to. (They're not connected with me on LinkedIn because, well, they have better things to do with their time)

    1. Michael Wojcik Silver badge

      Re: LinkedIn Scammers

      Personally, I don't accept connections on any social media network unless 1) I've met the purported individual IRL, 2) I can confirm with reasonable probability that the account belongs to that person, and 3) I have some reason for accepting the connection, such as learning of family events.

  6. Irongut

    "Men are given fake long hair, for example,"

    So Snapchat are sexist. Big surprise. Men can have long hair too, you know.

    1. Michael Wojcik Silver badge

      You train an ML system with a corpus labeled using stereotypes, and it will implement those stereotypes. If a random sampling of images of the populace shows a correlation between long hair and being female, it's perfectly reasonable for a filter like this - which after all is just a toy - to treat long hair as a feminine attribute.

      Coding a rule that long hair is feminine into a classifier (or a generator trained by such a classifier) would be sexist. A classifier learning that correlation on its own is imprecise - and certainly someone might refuse to use it for that reason - but it's not particularly useful to label it "sexist". Using such a classifier (or generator) in certain applications could certainly be a sexist decision.
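
A minimal sketch of the point made in that last comment - a classifier trained on labels that encode a stereotype will reproduce it as a learned correlation, without any hand-coded rule. The toy dataset, feature names, and use of scikit-learn below are illustrative assumptions, not Snapchat's actual filter.

# Toy example: the labels deliberately encode the stereotype "long hair -> female".
# Every feature and number here is invented purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Features: [hair_length_cm, wears_glasses]; labels: 1 = "female", 0 = "male"
X = np.array([
    [40, 0], [35, 1], [45, 0], [30, 1],   # long hair, labelled female
    [5, 0], [8, 1], [3, 0], [10, 1],      # short hair, labelled male
])
y = np.array([1, 1, 1, 1, 0, 0, 0, 0])

clf = LogisticRegression().fit(X, y)

# A long-haired example is classified "female" purely from the learned
# correlation - the stereotype came from the training labels, not from code.
print(clf.predict([[38, 0]]))   # -> [1]
print(clf.coef_)                # hair length carries almost all of the weight

The same dynamic is why a gender-swap filter trained on such a corpus hands men long hair: the correlation sits in the labelled data, and the model simply reproduces it.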
