"How Normal Am I?": This website rates your appearance - and more
Am I beautiful? Am I normal? An interactive website promises answers: the webcam feed is analyzed by artificial intelligence. You learn a lot in the process - not just about yourself.
The English-language website "How Normal Am I?" shows visitors what modern face recognition algorithms can do. In the interactive show, users are asked to activate their smartphone camera or webcam and complete a few small tasks. Meanwhile, the software analyzes their facial features: it assigns a score for attractiveness, for example, or tries to estimate the user's age or body mass index.
You can try it out for yourself here. A good command of English is an advantage.
Don't worry: the analysis runs locally in the browser; no data is sent to other servers. The site's source code is publicly available, which speaks for its trustworthiness.
This is not about beauty
In truth, the website is not about rating users' attractiveness. Rather, it is an art and education project funded by the European Union that aims to inform people about the use of facial recognition software. In a playful way, users learn where the limits and risks of the technology lie, how easily the algorithms can be tricked - and that they are nonetheless in widespread use.
The artist and privacy activist Tijmen Schep built the website using various tracking algorithms, each serving a specific purpose. While the user holds their face in front of the webcam, Schep explains how the software works and which companies or government institutions use similar technologies - and why.
Behavior is analyzed too
Often the goal is to rate the user in some way. On social media, for example, algorithms favor images of attractive people, which therefore have a greater chance of being seen by a wide audience. Non-white people are also disadvantaged by facial recognition software, as most programs are trained and tested predominantly on light-skinned faces.
But users' behavior is assessed as well. There are algorithms that recognize emotions or analyze how attentive someone is. The advertising industry, for example, investigates which online content users look at particularly intently.
With his project, Schep wants to draw attention to the long-term social consequences of such automated scoring systems. People who differ even slightly from the average could be systematically disadvantaged.
In addition, people come to feel constantly watched and judged, everywhere. The more surveillance by intelligent camera systems finds its way into our lives, the more individuals could feel compelled to behave in a certain way - namely, as "normally" and inconspicuously as possible. Ultimately, protecting privacy is also about the right to be yourself and to be different from others, Schep says at the end of the interactive lecture.
More on the topics
- Web pages
- European Union
- Network policy
- Artificial intelligence