05-22-2019, 07:57 AM
(05-21-2019, 02:27 PM)Keralin Wrote: There's something to be said for face-to-face human interaction when it comes to therapy. AI may be better at logging information to track patterns (that's my interest in it) and predict future behaviors but an AI can't pick up on non-verbal cues and interpret them in a meaningful way - at least not yet.
hello Keralin,
yeah, very true, although software is now under development that uses your device's camera to track what part of the screen you're looking at and for how long. That data is then correlated with what's on the screen, so... well, you see what I'm getting at: Google's mission statement of "organising all the information in the world" apparently also includes trying to understand the mess inside my head, and I seriously hope they fail.
I've also seen something (a documentary, I think) about progress in machine learning for interpreting facial expressions and non-verbal communication... so where you say "at least not yet", I'd rather say "kinda soon", which is, as Dragon put it in the previous post, SCARY.
Hope you're doing ok today! The last bit about your experience with (wo)man-machine interaction gave me my first laugh of the day