
Breakthroughs

The gap between the current method of diagnosis and the future we envision with the ETM is the use of:

- Research that correlates every part of the DSM-V, individually and collectively, with the machine-discernible symptoms involving body language, physical activity, facial configurations, and verbal activity, so that these symptoms can be translated into the emotional status of the individual.

- A machine-accessible DSM-V, forming an extensive digital database of symptoms, diagnostic tools, and action protocols (see the sketch following this list).

- An Artificial Intelligence Engine that improves the speed of decision-making, so that the entire recorded database does not have to be sifted through on every query.

- Training of the Artificial Intelligence Engine until the efficacy of its decisions is ethically and medically consistent with that of a trained psychologist.

- The ability of the Artificial Intelligence Engine to provide Action Protocols that are executable by a trained person, such as a school nurse, an EMT, or a law enforcement professional, who is not a certified mental health professional.
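
As a concrete illustration of the second and third points above, the following Python sketch shows one way a machine-accessible symptom database could be paired with an inverted index so that a query touches only the relevant records instead of the whole database. Every condition name, symptom cue, threshold, and protocol below is an invented placeholder for demonstration, not clinical content from the DSM-V.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Condition:
        name: str                # placeholder label, not a DSM-V diagnosis
        symptoms: frozenset      # machine-discernible cues, e.g. "flat_affect"
        min_matches: int         # how many cues must co-occur to flag this entry
        action_protocol: str     # steps executable by a trained non-specialist

    DATABASE = [
        Condition("example_condition_a",
                  frozenset({"flat_affect", "slowed_speech", "reduced_movement"}),
                  min_matches=2,
                  action_protocol="Stay with the individual; contact a clinician."),
        Condition("example_condition_b",
                  frozenset({"rapid_speech", "agitated_movement"}),
                  min_matches=2,
                  action_protocol="De-escalate calmly; call the local crisis line."),
    ]

    # Inverted index from each cue to the conditions that list it, so a query
    # consults only the entries that share at least one observed cue.
    INDEX = defaultdict(list)
    for cond in DATABASE:
        for cue in cond.symptoms:
            INDEX[cue].append(cond)

    def match(observed):
        """Return (condition, protocol) pairs whose match threshold is met."""
        candidates = {c.name: c for cue in observed for c in INDEX.get(cue, [])}
        return [(c.name, c.action_protocol)
                for c in candidates.values()
                if len(c.symptoms & observed) >= c.min_matches]

    print(match({"flat_affect", "slowed_speech"}))
    # -> [('example_condition_a', 'Stay with the individual; contact a clinician.')]

In a full ETM, the index and thresholds would be built and tuned by mental health experts; the point here is only the data structure, not the content.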
Our prototype provides a simple demonstration based on a very limited set of conditions and symptoms consistent with the DSM-V. We do not claim to be psychologists in the set-up of our model database, nor do we claim to be AI experts able to exploit its maximum current capability. In our vision of the ETM, the courses of action to take would be developed after extensive research and work by many mental health and technology experts.
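
To make the prototype's scope concrete, here is a minimal sketch, under the same illustrative assumptions as above, of the translation step from raw detector scores (facial configuration, speech rate, movement) to the discrete symptom cues the matcher consumes. None of these signal names or cut-offs are validated measures; they stand in for research that has yet to be done.

    # Illustrative thresholds mapping continuous detector readings to cues;
    # the signal names and cut-off values are assumptions for demonstration only.
    THRESHOLDS = {
        "flat_affect":       ("expression_variance", lambda v: v < 0.2),
        "slowed_speech":     ("words_per_minute",    lambda v: v < 90),
        "rapid_speech":      ("words_per_minute",    lambda v: v > 180),
        "agitated_movement": ("motion_energy",       lambda v: v > 0.8),
    }

    def to_symptoms(signals):
        """Map raw sensor readings to the discrete cue vocabulary used above."""
        return {cue for cue, (key, test) in THRESHOLDS.items()
                if key in signals and test(signals[key])}

    # One observation window from a hypothetical video/audio pipeline.
    observed = to_symptoms({"expression_variance": 0.1, "words_per_minute": 80})
    print(observed)  # flat_affect and slowed_speech

The output of this step feeds directly into the matcher sketched earlier, which is the entire pipeline our limited prototype demonstrates.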

