Tuesday, January 26, 2010

Racist Camera? Does Nikon CoolPix Fail on Asian Eyes?

By Lizz Carroll - Jan 26, 2010


Last year, when Joz Wang, a Taiwanese-American strategy consultant, decided to test out a new Nikon CoolPix S630 digital camera with her family members, she continually received an error message on the camera's display screen: "Did someone blink?" After several tries with the same results, the Wangs tried taking a shot with their eyes open extra-wide. There was no error message after this attempt, they say.
Wang, who has a blog called jozjozjoz.com, posted one of the "blinking" pictures on her page with the title, "Racist Camera! No, I did not blink... I'm just Asian!" Bloggers and people on Twitter soon picked up on Wang's controversial post.
But does the gadget really discriminate against Asian features or is this just a case of a technology glitch?
According to TIME, which publicized the story, Nikon says it's working to improve the accuracy of the blink-warning function on its CoolPix cameras. Nikon has not yet returned DiversityInc's calls and e-mails for comment.
Opinions are mixed in the blogosphere.
Keith, a commenter on Wang's site, said: "You would think that Nikon, being a Japanese company, would have designed this with Asian eyes in mind? You'd think."
When Wang's post was picked up by a blog called Sociological Images, one commenter, Elizabeth, said, "I just got back from vacation with a friend who has this camera (we are three white women) and after every photo, it asked us, 'Did someone blink?' It became a running joke because the sensor asked this question whether or not there was a person (or blinking person) in the shot."
Another commenter on the popular photo site Flickr cited the same issue, saying, "Yeah, my girlfriend (who is white) has one of those cameras and it's constantly asking if someone blinked."
But one user, Orchid 64, on the site Digg, points out the possibility of a technology issue: "Nikon is a Japanese company. I doubt they're racist against other Asians when designing their software…Someone just didn't do a very good job programming it to suggest someone blinked. This is hysterical overreaction to the poor results of pixel examination software and the resulting suggestion."
Nikon, like other tech companies, uses a form of facial-recognition software to make its customers' photo-taking experience more convenient. HP encountered a similar problem when the web camera on its MediaSmart laptop was called "racist" because it recognized white faces but not Black faaces. Wait, correcting: it recognized white faces but not Black faces. When two coworkers in a Texas store—one Black and the other white—discovered this, they posted a video on YouTube titled "HP computers are racist." The video quickly went viral.
In the case of HP's web camera, the company said the problem was caused by a lack of proper lighting, an explanation Consumer Reports tested to prove the point.
According to research by TIME, while face-detection software is based on math, the science isn't always exact: "The principle behind face detection is relatively simple, even if the math involved can be complex. Most people have two eyes, eyebrows, a nose and lips - and an algorithm can be trained to look for those common features, or more specifically, their shadows. (For instance, when you take a normal image and heighten the contrast, eye sockets can look like two dark circles.) But even if face detection seems pretty straightforward, the execution isn't always smooth."
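The contrast-heightening step TIME describes can be illustrated with a short sketch: linearly stretching a grayscale image's contrast so that darker regions, like eye sockets, become near-black blobs a detector can look for. Everything here (the pixel values, the cutoffs, the toy image) is invented for illustration; it is not Nikon's or anyone's actual algorithm.

```python
def heighten_contrast(pixels, low=60, high=180):
    """Linearly stretch grayscale values from [low, high] to [0, 255],
    clipping everything outside that band. The cutoffs are illustrative."""
    stretched = []
    for row in pixels:
        new_row = []
        for p in row:
            if p <= low:
                new_row.append(0)
            elif p >= high:
                new_row.append(255)
            else:
                new_row.append(round((p - low) * 255 / (high - low)))
        stretched.append(new_row)
    return stretched

# A toy 3x3 "face patch": mid-gray skin (120) around one darker
# eye-socket pixel (70).
patch = [
    [120, 120, 120],
    [120,  70, 120],
    [120, 120, 120],
]
result = heighten_contrast(patch)
# After stretching, the socket pixel sits much further from the skin
# values than before -- the kind of dark-circle contrast a feature
# detector can key on.
```

The point of the sketch is only that contrast stretching widens the gap between the socket and the surrounding skin; a real detector would then search the stretched image for paired dark blobs in roughly eye-like positions.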


TIME also tested two of Sony's latest Cyber-shot models with face detection (the DSC-TX1 and DSC-WX1) and found they also had a tendency to ignore people with dark complexions.
In Wang's case, the Nikon camera may have been programmed to detect an eye area of a certain number of pixels, and her narrower eye shape did not fit the "equation." Instead, the software "decided" that her eyes were closing, hence the "blink" message.
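The kind of pixel-count heuristic described above can be sketched in a few lines: count the dark (iris and pupil) pixels in a detected eye region, and flag a blink when the count falls below a threshold. The threshold, cutoff, and toy eye regions below are all assumptions made up for the example; the actual Nikon algorithm is not public.

```python
BLINK_THRESHOLD = 4  # minimum dark pixels for an "open" eye (assumed value)

def looks_blinked(eye_region, dark_cutoff=80):
    """Return True when the eye region has too few dark pixels --
    the case where this heuristic would ask 'Did someone blink?'"""
    dark = sum(1 for row in eye_region for p in row if p < dark_cutoff)
    return dark < BLINK_THRESHOLD

# A wide-open eye: a tall block of dark iris pixels (60) on skin (200).
open_eye = [
    [200, 60, 60, 200],
    [200, 60, 60, 200],
    [200, 60, 60, 200],
]
# A narrower eye shape: the same iris, but only one row of it visible.
narrow_eye = [
    [200, 200, 200, 200],
    [200, 60, 60, 200],
    [200, 200, 200, 200],
]

print(looks_blinked(open_eye))    # False: 6 dark pixels clears the threshold
print(looks_blinked(narrow_eye))  # True: 2 dark pixels is misread as a blink
```

The sketch makes the failure mode concrete: a fixed pixel-count threshold tuned on one eye shape will systematically misclassify narrower open eyes as blinks, which is exactly the behavior Wang reported.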
While the solutions may be found in the programming of facial-recognition software, the larger question is: Why are technology companies failing to test their products out on a greater mix of people, with varying facial structure and complexions?