PCPro-Computing in the Real World Printed from www.pcpro.co.uk

Posted on March 6th, 2013 by Steve Cassidy

The wall that knows whether you’re a criminal

Dermalog face identification

It’s pretty common to end up wandering around CeBIT in a daze. The size of the show (there are bus routes inside the showground), the hubbub of languages, and the constant obstructions caused by gawping nerds. It was while semi-hypnotised, then, and irritated by a crowd behind me and a crowd in front of me, that I got my wake-up call, taking the form of the system of which I took the very bad picture above.

It’s not a cartoon-face pre-processor: it claims to be an automatic face recognition and fraud-prediction system. It was on the stand of German identity-management firm Dermalog, though I confess I was jostled so much by gurning techies eager to get a picture of their picture on this screen that I didn’t manage to verify how complete the development is.

Dermalog was pretty proud of it, though. The screen beside it showed German Chancellor Angela Merkel’s visit to their stand the day before; she was apparently unfazed by the irony of a politician taking a long, hard look at a system devoted to the detection of fraudsters by their facial expression alone.

The floating text beside that torrent of faces (which you probably can’t read) is what shows the ability of the system. Having spotted a face, the system tags it with a gender, an age estimate, and a mood. Charmingly, it recognised me as a male but then pegged me as “60, +/- 15 years, happy”, which is technically correct but painfully unflattering. The idea of the system is that fraudsters will very likely not be happy, and may otherwise be characterised by predominating flags that help to narrow down their intentions.
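To make concrete what the system apparently emits per face, here is a minimal sketch of that tagging record and the “unhappy equals suspicious” rule the vendor describes. Everything here is hypothetical illustration — the names (`FaceTag`, `age_range`, `raises_flag`) are mine, not Dermalog’s API — populated with the verdict the machine handed me:

```python
from dataclasses import dataclass

@dataclass
class FaceTag:
    """One record per detected face, as described above: gender, age +/- margin, mood."""
    gender: str        # e.g. "male"
    age_estimate: int  # midpoint of the estimated range, in years
    age_margin: int    # the +/- part of the estimate
    mood: str          # e.g. "happy", "unhappy"

def age_range(tag: FaceTag) -> tuple[int, int]:
    """The full span an estimate covers: 60 +/- 15 is anyone from 45 to 75."""
    return (tag.age_estimate - tag.age_margin, tag.age_estimate + tag.age_margin)

def raises_flag(tag: FaceTag) -> bool:
    """The vendor's premise: an unhappy mood is a flag narrowing down intent."""
    return tag.mood == "unhappy"

# The verdict the system gave me: "male, 60, +/- 15 years, happy"
author = FaceTag(gender="male", age_estimate=60, age_margin=15, mood="happy")
```

Note how much the rule leans on one string: as the next paragraph argues, collapsing a mood onto that axis, with a thirty-year age window, is the whole problem.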

I would say that it’s not really ready for primetime, based on my results. If the face were a good enough indicator of mood then it should have tagged me as “freaked out on business technological ennui”, not simply “happy”, and no police force would accept a description of someone as “aged between 45 and 75” – that’s the gap between Daniel Craig and Jack Nicholson.

I suppose this kind of party-trick technology is trapped in a demo Catch-22: if it’s any good at detecting fraudulent intent from a face, then the vendors have to engage the talents of a known criminal and persuade them to make like they’re about to pick someone’s pocket (a pocket which, at CeBIT, would contain mostly freebie USB keys) so that they get a “live alert” out of the system.

I’m not sure which aspect of this worries me more: that the time-to-acquire for a new face drifting into the webcam field was down at the sub-second level, or that the value of the system is touted as being a predictor of behaviours, a better detector of true intent than all the Columbos, Crackers, Wexfords and Marples rolled together.

“The innocent have nothing to fear” is all well and good, while prediction takes a back seat to actual criminal acts, but “he looks just the sort” is the favoured excuse for all manner of presumptuous, and baseless, totalitarianism.

Posted in: Real World Computing

7 Responses to “The wall that knows whether you’re a criminal”

  1. George Says:
    March 6th, 2013 at 4:11 pm

    This technology needs to be blended with MIT’s video enhancement technology which can analyze video footage to determine the health and other findings of the subject being filmed:
    http://youtu.be/3rWycBEHn3s

     
  2. Cellar Says:
    March 6th, 2013 at 4:25 pm

    “The innocent have nothing to fear” has also been pretty convincingly disproven by a rather long string of colourful historical characters.

    The innocent *ought* to have nothing to fear, in theory, but in practice usually get the worst of it.

     
  3. Tibs Says:
    March 6th, 2013 at 6:38 pm

    I don’t think behavioural technology based on facial expression alone will go far.
    There are too many edge cases which would be difficult to test for. Previous criminals who have been hired to steal may wear a different expression than they would if they hadn’t been hired (with no chance of being convicted for the crime, they will be in a different state of mind and could show a different expression).

    ‘The innocent have nothing to fear’ except those trying to convict them. I once had advice from a law student who said never to talk to the police without a lawyer, as their job is to convict criminals for the crimes they commit, and any mistake will come back to haunt you in the courtroom (if you told them you thought you were wearing a red cap, but in court you recalled it as blue, the police can testify against you, calling your story into question based on that mistake). There’s also the fact that you can’t say anything that will help your case to an officer (unless you can back it up with hard evidence, and even then you can still trip up to a degree).

     
  4. Surefire Says:
    March 7th, 2013 at 10:14 am

    “the constant obstructions caused by gawping nerds”

    ROFLMAO!

    So here we have a wannabe gawping nerd foiled and irritated because there are other gawping nerds in his way.

     
  5. Steve Cassidy Says:
    March 7th, 2013 at 10:54 am

    Well spotted, Tibs; this is the same type of problem as the one that subtly trains drug-sniffing dogs to “show” on black suspects, because they can read the body language of their handlers and all they want is the Scooby-snack.

    I assume the whole “unhappy” possibility is taken from CCTV footage of known frauds, though quite often systems like this one depend on *3d* imagery to measure facial features, muscle tensions and the like.

    Surefire: You write that like you think I’m not aware of it. Clearly from my words (and the verdict of the machine…) I am partway between Daniel Craig and Jack Nicholson…

     
  6. Darren Says:
    March 7th, 2013 at 4:19 pm

    This reminds me of an episode of Red Dwarf I was watching the other week called Justice (S4E3), where a computer decides if anyone has committed a crime. Rimmer is committed because in his own mind he was responsible for the nuclear disaster that wiped out the rest of the crew; the computer couldn’t tell that Rimmer was actually blaming himself for what happened, when really it was nothing to do with him. Some people do this even though they haven’t really committed any crime; they just blame themselves for not doing something that could have prevented it. So I hope this justice wall takes that into account.

     
  7. technogeist Says:
    March 7th, 2013 at 8:27 pm

    I wonder what it would make of people who have Asperger’s or some other socially awkward disposition.

    What the world doesn’t need is this crap.

     