By     9/17/11

As if the world of high-tech detection hardware hasn't gotten bizarre enough, a new machine being deployed at airports in the UK goes beyond traditional metal detection and drug sniffing, right into the heart of emotions – and makes a judgment based on what you are feeling at the time.  If technologies such as these catch on, we may have to do more than just be careful about what we think – we will have to be careful about what we feel.

The "emotion detector" is a long-range sensor designed to be deployed at times not otherwise defined by legislation.  And even if it begins as a temporary measure, it may become a more permanent fixture as the public grows acclimated to it.  This has been the case with surveillance equipment since the first security cameras.  And it's difficult to imagine how different a world we would live in if, everywhere cameras have proliferated, we instead had emotion detectors.

But while we have heard the perspective of civil rights groups regarding detection systems that invade our very thoughts, there's another, possibly even more disturbing side to this story that remains largely unspoken.

As with the previously mentioned wonders of surveillance technology, this technology too will find other uses.  As the system becomes more effective, cheaper, and more widely distributed, it will eventually – like so many other technological achievements – fall into less than benevolent hands.  If companies hoping to prevent tragedy find it useful, it may prove equally useful to those hoping to cause one.

And while the creators and distributors of the technology most likely wish for it to be used only benevolently, these tools could just as well be used by criminals under false pretenses.  And if the idea of a handheld emotion detector seems far-fetched, just look at the great strides being made in this field for smartphones.  A South Korean company called Blue Finger, Inc. has already developed an application that reads facial features to assess emotions.  While the claim may be ambitious now, as the arc of technological achievement progresses we may find in time that most people carry a potential remote mood tracker in their hands.
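To make the idea concrete, here is a toy sketch – not any real product's code – of the kind of mapping such an app performs: a few facial measurements in, a mood label out.  Real applications would run trained models over camera frames; the measurements, thresholds, and labels below are entirely hypothetical illustrations.

```python
# Conceptual sketch only. Real emotion-recognition apps extract facial
# landmarks from camera frames and feed them to trained classifiers;
# here, two made-up measurements stand in for that pipeline.

def classify_mood(mouth_curvature: float, brow_raise: float) -> str:
    """Toy rule-based mapping from facial measurements to a mood label.

    mouth_curvature: positive = upturned mouth, negative = downturned.
    brow_raise: how far the brows are lifted, 0.0 to 1.0.
    All thresholds are illustrative, not taken from any real system.
    """
    if mouth_curvature > 0.3:
        return "happy"
    if brow_raise > 0.5:
        return "surprised"
    if mouth_curvature < -0.3:
        return "distressed"
    return "neutral"

# A reading of an upturned mouth with relaxed brows:
print(classify_mood(0.5, 0.1))
```

The unsettling part is not the toy logic but the direction of information flow: the subject never volunteers the input.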

Everything from selecting marks for a mugging to gauging the vulnerability of store personnel before a robbery becomes a potential avenue for criminals.

Does this make the technology bad?  In the end it may be no more good or bad than a hammer.  But it is a manipulative sort of technology – and that's one factor that doesn't sit well with most critics.  If we wished for people to know what we felt, we would simply say it all the time.  The fact of the matter is, no matter who is on the other end of the camera, we simply like the privacy of our own minds.