Surveillance technology

by Chris Bertram on July 25, 2005

The BBC showed “a programme the other day”:http://www.blackjackscience.com/bbc/BBC%20-%20Science%20&%20Nature%20-%20Horizon%20-%20transcript.htm about the history of card counting in blackjack and how the casinos eventually defeated the card counters using facial recognition technology. Having traced suspected card counters to MIT, Griffin Investigations, the agency employed by the casinos, then fed the faces from the MIT yearbooks into their databases. When a face appeared in a casino and the software matched it to a suspect, that person was shown the door. The relevant bit of the transcript:

NARRATOR: It was then that Beverley noticed something unusual. Many of the big winners had given addresses from around the same area, Boston. Then she noticed something else, most of her suspects played only at weekends, and they were all around college age. Beverley made the connection. Could these card counting team members be students at M.I.T.? To find out Beverley checked the M.I.T. student year books.

BEVERLEY GRIFFIN: And lo and behold there they were. Looking all scholarly and serious and not at all like a card counter.

NARRATOR: The M.I.T. yearbooks viewed like a rogue’s gallery of team counters. Beverley now realised she was up against some of the smartest minds in America. So the casinos began to develop facial recognition technology, for quick and accurate identification of team play suspects. The basis for the database were the M.I.T. yearbooks. From the moment a suspected counter entered a casino they could be monitored by the hundreds of cameras on the casino floor. Snapshots could then be downloaded for computer analysis.

TRAVIS MILLER: Each time he moves I try to see which shot is going to be the best for him, that we can use to match him up further down the road. This would be the perfect shot, he’s directly in the centre of the photo, all we see is his face, he’s looking straight ahead in to the shot.

NARRATOR: Facial recognition software analysed the relative position of over eighty coordinates on a suspect’s face. As individual as a fingerprint, this information could be run through the Griffin database of suspected card counters, and an identification made.
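The matching step the narrator describes amounts to nearest-neighbour search over vectors of facial landmark coordinates. A toy sketch of the idea (the four-coordinate vectors, suspect names, and threshold below are invented for illustration; the real system used over eighty coordinates):

```python
import math

# Each face is reduced to a vector of landmark coordinates; a snapshot
# matches whichever enrolled suspect is closest, if close enough.
def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(snapshot, database, threshold=0.5):
    """Return the closest enrolled suspect, or None if nothing is close."""
    name, vec = min(database.items(), key=lambda kv: distance(snapshot, kv[1]))
    return name if distance(snapshot, vec) <= threshold else None

# Hypothetical enrolled suspects (all numbers made up for illustration).
suspects = {
    "counter_a": (0.31, 0.58, 0.44, 0.72),
    "counter_b": (0.29, 0.61, 0.40, 0.69),
}
print(best_match((0.30, 0.59, 0.43, 0.71), suspects))  # counter_a
```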

I’m guessing that if casinos can do this with MIT students then states and security agencies could certainly employ the same technology to keep anyone photographed at a “Hizb ut-Tahrir”:http://en.wikipedia.org/wiki/Hizb_ut-Tahrir meeting (or similar) off the London Underground or Heathrow Airport. As soon as a match appeared, they could be stopped.

I hasten to state that the civil liberties implications of any such system are horrendous. But my interest here is in whether it would be technologically feasible. Could it work for a large system? How many false positives and false negatives would there be? Any answers?

{ 28 comments }

1

Anders Widebrant 07.25.05 at 5:09 am

Bruce Schneier thinks that large-scale facial scanning is bound to fail, unless the system is extremely good:

“Suppose this magically effective face-recognition software is 99.99 percent accurate. That is, if someone is a terrorist, there is a 99.99 percent chance that the software indicates “terrorist,” and if someone is not a terrorist, there is a 99.99 percent chance that the software indicates “non-terrorist.” Assume that one in ten million flyers, on average, is a terrorist. Is the software any good?

“No. The software will generate 1000 false alarms for every one real terrorist. And every false alarm still means that all the security people go through all of their security procedures. Because the population of non-terrorists is so much larger than the number of terrorists, the test is useless. This result is counterintuitive and surprising, but it is correct. The false alarms in this kind of system render it mostly useless. It’s “The Boy Who Cried Wolf” increased 1000-fold.

“I say mostly useless, because it would have some positive effect. Once in a while, the system would correctly finger a frequent-flyer terrorist. But it’s a system that has enormous costs: money to install, manpower to run, inconvenience to the millions of people incorrectly identified, successful lawsuits by some of those people, and a continued erosion of our civil liberties. And all the false alarms will inevitably lead those managing the system to distrust its results, leading to sloppiness and potentially costly mistakes. Ubiquitous harvesting of biometrics might sound like a good idea, but I just don’t think it’s worth it. ”

http://www.schneier.com/crypto-gram-0109a.html
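Schneier’s arithmetic can be checked directly. A quick sketch, using only the numbers from the quote above:

```python
def screening_outcomes(n_people, base_rate, sensitivity, specificity):
    """Expected true and false positives for a mass screening test."""
    n_targets = n_people * base_rate                 # actual targets in the crowd
    n_innocent = n_people - n_targets
    true_positives = n_targets * sensitivity         # targets correctly flagged
    false_positives = n_innocent * (1 - specificity) # innocents wrongly flagged
    return true_positives, false_positives

# Schneier's numbers: 99.99% accurate both ways, 1 terrorist per 10 million.
tp, fp = screening_outcomes(
    n_people=10_000_000, base_rate=1e-7,
    sensitivity=0.9999, specificity=0.9999,
)
print(tp, fp)  # ~1 true hit against ~1000 false alarms
```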

2

bad Jim 07.25.05 at 5:51 am

Scanning everybody who enters a casino, and more particularly anyone sitting down at a blackjack table, has some chance of being useful, especially if the number of targets is limited.

Scanning a large unselected crowd for a large number of possible targets tends to submerge the signal in the noise. With enough cameras and enough computers, universal facial tracking might have a chance if we never subverted it with makeup, jewelry, eyewear and facial hair, which we do already for other reasons.

3

Chris Bertram 07.25.05 at 5:59 am

Thanks guys, I think you’ve answered my question.

4

Jimmy Doyle 07.25.05 at 5:59 am

Given that Dsquared thought it uncomfortably close to a “witchhunt” to agitate for the removal of a member of Hizb ut-Tahrir from his position as a journalist at the Guardian, I’d hate to see what he thinks about keeping such a person off the tube.

5

sien 07.25.05 at 6:04 am

The trials have been even worse than the accuracy that Schneier posits.

6

dsquared 07.25.05 at 7:32 am

I must say that I don’t understand why Chris specifically referred to HuT here, since neither we nor the US State Department regard them as a terrorist organisation (and in fact, British HuT has condemned the London bombings). I’d agree that someone who had been to HuT meetings would be more likely to end up as a terrorist than someone who preferred the Rotary Club, but I would hope that our intelligence would be somewhat less broadbrush than just going through the card of HuT members. This would be more or less equivalent to, during the Cold War, banning RCP members from taking buses.

7

Chris Bertram 07.25.05 at 7:44 am

Banning RCP members from buses strikes me as an admirable idea …

But seriously, I was just looking for a specimen Islamist organization, and I might just as well have mentioned Al Marjaroun, or whatever they are, but I wasn’t sure how to spell them (then or now).

And just to reiterate, I wasn’t advocating this but just wondering whether it was feasible. I think respondents have explained that it isn’t.

8

nofundy 07.25.05 at 7:52 am

How would one know which faces to enter into the reference database? Use the MIT yearbook again? Point being, it’s much easier to identify card counters than possible suicide bombers, since the bombers only do it once.

9

asg 07.25.05 at 8:21 am

I believe most modern facial recognition software focuses on the elements of the face that nothing, not even surgery, can change. For example, there is nothing at all that can change how far apart your eyes are from one another, or widen the bony bridge of your nose, or widen your jawbone.

All of these things in turn generate mathematical ratios which are not unique to individuals but which can eliminate people from a list of suspects. This does nothing to address comment #1 but the concerns in comment #2, particularly about masking facial characteristics, can be answered by technology.

10

jim 07.25.05 at 8:23 am

The numbers the first commenter quoted have nothing to do with terrorism, of course. They apply to any relatively insensitive test used for mass screening to identify a rare condition. They also applied to the casino screening. No-one uses a screening test in isolation. What matters is the protocol that’s used following a screening (probably false) positive.

For medical screening tests, like, say, mammograms, the doctor tells you that there’s something suspicious they’d like to check out more, just to rule out anything that might be a problem. The word “cancer” is never used. A more sensitive, more invasive, more expensive test, used only on the screening positives, is then done. If that’s positive, then perhaps a really sensitive, really expensive test is done to confirm it.

Similarly, the casino, when the software indicated a match, had a member of the security/surveillance staff, familiar with the faces of all the banned gamblers, come over and eyeball the CCTV image. If that person said, “Yes, looks like him,” someone on the floor, familiar with the faces of all the banned gamblers, would approach the person. If he’s sure the person is, in fact, banned, he escorts him out. If either of these people decided that the person on the CCTV or the person in the flesh on the floor wasn’t actually the banned gambler, no contact would be made. No-one would be any the wiser. The BBC didn’t show the staff ruling out false positives, but BBC documentaries are not scholarly papers: they omit the dull stuff.

The problem with the casino’s protocol is it doesn’t scale. There are only a few banned gamblers. It’s perfectly possible to ensure that someone familiar with all of them can be on duty in the booth and on the floor at all times. Once the number of people that you need to recognize gets too large, the procedure breaks.

The real reason we don’t use facial recognition technology to screen at points of entry is we don’t have a protocol into which it would fit.
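Jim’s protocol point can be put in numbers. Each confirmatory stage treats the previous stage’s positive predictive value as its base rate. The sketch below uses Schneier’s 99.99% screen from comment #1 plus an assumed 99%-accurate human check (that second figure is an invention for illustration):

```python
def positive_predictive_value(base_rate, sensitivity, specificity):
    """P(actually a target | flagged), via Bayes' rule."""
    tp = base_rate * sensitivity
    fp = (1 - base_rate) * (1 - specificity)
    return tp / (tp + fp)

# Stage 1: automated screen, 99.99% accurate, 1-in-10-million base rate.
ppv1 = positive_predictive_value(1e-7, 0.9999, 0.9999)

# Stage 2: a human confirmer, assumed 99% accurate, applied only to
# stage-1 positives -- so stage 1's PPV becomes stage 2's base rate.
ppv2 = positive_predictive_value(ppv1, 0.99, 0.99)

print(f"{ppv1:.4%}  {ppv2:.2%}")  # roughly 0.1% after stage 1, ~9% after stage 2
```

Even after two stages, most people flagged are still innocent; the confirmatory stage cuts the false-alarm ratio enormously, but it relies on humans who can recognize everyone on the list, which is exactly the part that doesn’t scale.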

11

Joshua W. Burton 07.25.05 at 8:29 am

12

Joshua W. Burton 07.25.05 at 8:32 am

Sorry, didn’t mean to hit post with mismatched tags.

Another concern about broad use of biometrics is the repudiation problem. After a successful identity theft, how do you ask for a new face?

Phil Agre has a brilliant old piece on why ubiquitous face recognition would be a Bad Idea even if it worked. And then there’s the hoodie problem: even without automated face recognition, London cameras are driving dress-code countermeasures, which in turn are driving a moral panic and further repression.

13

roger 07.25.05 at 9:20 am

I think your example points to what the CIA should be doing: setting up yearbook companies in Pakistan. 50% off for madrassas! Act now!

14

Matt Austern 07.25.05 at 9:48 am

Possibly that excerpt is misleading (I’d imagine casinos aren’t going to give detailed and true information about their security procedures), but if it’s taken seriously, what it means is that the casinos have decided to ban anyone who attends MIT, anyone who used to attend MIT, and anyone who looks like anyone who attended MIT.

This is a tremendous false positive rate. The fraction of MIT students who are card counters is small. Most MIT students, surprisingly, can’t even pick locks or scale walls.

Accepting a tremendously high false positive rate may or may not be acceptable for the casinos. If this excerpt is to be believed, they’ve made a business decision that it is. It probably is less acceptable for law enforcement.

15

Barry 07.25.05 at 10:05 am

Matt, they probably figured that excluding students (and grad students) from certain schools is not a big loss. If they’re smart, they ascertain IDs for matches and check those against lists of rich people. Those students who have the DiLithium Amex cards (courtesy of rich parents) could be ID’d before contact is made. That way, the casino could avoid booting out people who have real money to lose.

16

paul 07.25.05 at 10:20 am

The false-positive rate for the casinos, given the other circumstances, may not be that high — of the MIT students and grads who don’t card-count, it might well be that not too many go to the casinos, and not that many of those sit down at a blackjack table and start engaging in sophisticated play.

As Jim points out, it’s all about the protocol that the system is embedded in. And given the recent tragic killing in the London Tube, we seem an awfully long way from a good protocol even without scaling problems.

17

jacob 07.25.05 at 10:53 am

Two minor and perhaps obvious points: One is that the consequences of a false positive in the casino are far less than the consequences of a false positive for policing. Falsely identifying an MIT student or alum as a card counter results, at worst, in that person being denied the ability to gamble that evening–hardly the end of the world. Falsely identifying someone as a terrorist leads at best to reduced mobility (which leads to a potential inability to make a living) and political repression–and at worst, it can lead to death, as Jean Charles de Menezes discovered last week.

The second point is that the casinos almost certainly violated copyright laws in using the facebook the way they did. Most modern university facebooks I’ve seen say they are only for private, personal, or university use, and specifically ban commercial use, which the casinos’ use falls under.

18

msw 07.25.05 at 11:10 am

I was an integrator for a large facial recognition project, adding a third-party facial recognition vendor’s product to our customer’s security system. My experience matched the ones reported by sien’s link – it just doesn’t work. And we had absolute control over the initial template capture – we could specify the exact lighting conditions, camera vendors, capture angles, head positions, etc – we weren’t trying to match against yearbook photos. Still, the error rate was so high that the system was worthless. The guy in charge of security had a stuffed doll with thick plastic novelty glasses that he kept on his desk – it matched his facial template better than his own face did.

This was one vendor’s product, but the industry is more incestuous than you might think – a lot of vendors are just licensing some other vendor’s technology. And I haven’t heard many success stories.

I’m a little suspicious of the casino’s claims.

19

Harry 07.25.05 at 11:10 am

Wait — what is this RCP? I thought they disbanded, and became libertarian pro-Serbs and started writing columns for The Times. Or is that in bizarro universe?

20

jim in austin 07.25.05 at 11:52 am

Be on the lookout for subway passengers wearing red rubber clown noses…

21

kevin 07.25.05 at 12:00 pm

“For example, there is nothing at all that can change how far apart your eyes are from one another, or widen the bony bridge of your nose, or widen your jawbone.”

Not even stage makeup? Mind you, I have no idea how easily seen through stage makeup would be on someone up close, but I would think that it would be possible to change the appearance of some features enough to fool these kinds of remote systems.

22

jane adams 07.25.05 at 12:47 pm

Actually I believe the technology has been employed at some sports events for some criminal searches. Presuming skilled and experienced security officers are on hand to double check and confront suspects it *might* be useful. But lots of things might be useful.

But at this time, besides the trade-off of civil liberties, there is the trade-off of resources: do you put uniformed and plain-clothes security (and good ones do remember lots of pictures) out on the floor or behind cameras? Probably both. But there is really a lot to be said for skilled people looking and feeling, sensing and yes even smelling, gently probing with stares and possibly questions the people coming by.

23

Steve 07.25.05 at 1:22 pm

“I hasten to state that the civil liberties implications of any such system are horrendous.”

Why do you believe this? How does it differ (in civil liberties terms) from the (US) method of putting Wanted posters in post offices? Or mugshots/witness sketches being broadcast on the news? Is it because it is far more effective (though the posts here suggest that it is not very effective at all)? Faster? More people looked at? Something else?

Steve

24

Barry 07.25.05 at 4:10 pm

Probably because such systems have the potential to track a very large number of people almost as well as if each subject had a large team of followers. And can store, retrieve and match movements of many people in a very short period of time.

25

John Quiggin 07.25.05 at 6:55 pm

Totally off the main topic, I’ve never really understood why casinos persist in offering games that a skilled player can win without cheating. In the case of blackjack, all that’s required to stop cardcounting is to change/shuffle decks more often.

No doubt they attract a few would-be card counters who overestimate their skills, but I would have thought that would be outweighed by negative publicity, of which the story above is a modest example, not to mention the losses to undetected cardcounters.
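John’s point about shuffling can be made concrete. Hi-Lo is assumed here as the counting system (neither the post nor the programme names the one the MIT team used): the counter’s bet signal is the running count divided by decks still unseen, so the signal can only get large deep into the shoe, and shuffling early keeps it near zero:

```python
# Hi-Lo values: low cards leaving the deck favour the player (+1),
# high cards favour the house (-1), middle cards are neutral.
HI_LO = {r: +1 for r in ("2", "3", "4", "5", "6")}
HI_LO.update({r: 0 for r in ("7", "8", "9")})
HI_LO.update({r: -1 for r in ("10", "J", "Q", "K", "A")})

def true_count(cards_seen, decks_in_shoe):
    """The counter's bet signal: running count per deck not yet dealt."""
    running = sum(HI_LO[c] for c in cards_seen)
    decks_left = decks_in_shoe - len(cards_seen) / 52
    return running / decks_left

# The same favourable run of low cards, at two depths of a six-deck shoe:
favourable = ["2", "3", "4", "5", "6"] * 4            # running count +20
fresh = true_count(favourable, decks_in_shoe=6)       # shoe nearly full
deep = true_count(favourable + ["7"] * 200, 6)        # shoe nearly exhausted
# `deep` is several times `fresh`: the edge concentrates late in the shoe,
# so shuffling early and often is enough to remove it.
```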

26

Tom T. 07.25.05 at 7:17 pm

The false-positive objection does not strike me as enough in itself to disqualify this system from use at airport security, since the current system is set up to generate essentially 100% false positives anyway. Certainly, the facial-recognition software could not replace security checkpoints, but it conceivably could be used to refine the selection of travelers who get singled out for extra searching. I don’t know how such selections are currently made, whether at random or based on factors like one-way cash-purchased tickets, but it strikes me as at least feasible to incorporate facial recognition. Obviously, there would be a host of practical questions, as others have pointed out. What would it cost? How would the database be populated? Etc.

27

Chris Bertram 07.26.05 at 2:16 am

The programme dealt with the issue John Q asks about (you can check the transcript). Shuffling as a countermeasure led to a decline in the number of players. The programme claimed this was because shuffling took time and the players got bored. I’d have thought a more plausible explanation was that players knew that the odds were being shifted against them and declined to play. The casino wants to keep those players playing, provided they aren’t _actually_ sophisticated enough to exploit their advantage.

28

bad Jim 07.26.05 at 3:04 am

Biometrics are great for positive identification, for validating that someone is who they claim to be, because the initial and subsequent observations can be performed under controlled conditions.

It seems entirely likely that the popularity of (and concern over) hoodies in the U.K. is related to the ubiquity of cameras there. (Signs in London commonly say something like “Privacy Notice: camera in operation” which is roughly backwards; when you see the notice your privacy is history.)

As for the ability to disguise facial traits: dark glasses effectively prevent the observation of any features of one’s eyes, their color, shape or spacing, and somewhat obscure the nose as well. Large, fluffy beards, not currently in fashion, alter the appearance of one’s jaw and chin, which is probably why I first grew mine.

Comments on this entry are closed.