The data we share with companies online has become a hot-button issue, but new technologies could soon be scanning us as we go about our day.
That’s the claim made by a neuroscientist, who believes that devices such as cameras in the real world will start gathering unprecedented levels of information about us.
Our bodies give off various signals that can be scanned and analysed by advanced computer systems, revealing everything from our current mood to our overall health.
In a similar way to wearable gadgets already available, future devices could be set up throughout public spaces to harvest this valuable bio-data.
Because the cameras and other devices would be part of our surrounding environment, there would be no way for us to opt out or ditch the technology, and new regulations will be needed, she warns.
The claims were made during a presentation given by Dolby Labs’ chief scientist Poppy Crum, who has spent the past few years studying people’s reactions as they watch films, at the Ted 2018 conference in Vancouver.
Using thermal imaging cameras, ‘mind-reading’ electroencephalogram (EEG) caps, heart rate monitors and skin response sensors, she can watch how volunteers’ bodies and minds respond to what they watch on screen.
And it’s a small step to imagine these techniques making the move to the real world in the near future.
Speaking at the Ted conference, the BBC reports, she said: ‘We like to believe we have cognitive control over what someone else knows, sees, understands about our own internal states – our emotions, insecurities, bluffs or trials and tribulations.
‘But technologies can already distinguish a real smile from a fake one.
‘The dynamics of our thermal signature give away changes in how hard our brains are working, how engaged or excited we might be in the conversation we are having, and even whether we’re reacting to an image of fire as if it were real.
‘We can actually see people giving off heat on their cheeks just looking at a picture of flame.’
Professor Crum, who is also a researcher at Stanford University, demonstrated to the audience how some of these techniques can already be taken out of the lab.
Using carbon dioxide monitors, she displayed fear levels among conference attendees in a live data visualisation.
Explaining the experiment, she added: ‘There are tubes in the theatre – lower to the ground since CO2 is heavier than air.
‘They’re connected to a machine that lets us measure with high precision the continuous differential concentration of carbon dioxide.
‘It takes about 20 to 30 seconds for the CO2 from your reactions to reach the machine.’
Of course, we shouldn’t forget that back in 2011, Darpa announced it was moving forward in earnest with a program to endow cameras with ‘visual intelligence’. That’s the ability to process information from visual cues, contextualise its significance, and learn what other visual data is necessary to answer some pre-existing question. Visual-intelligence algorithms are already out there. They can read license plates in traffic or recognise faces (in limited, brightly-lit circumstances). But the programs are still relatively dumb; they simply help collate data that analysts have to go through. Darpa’s program, called Mind’s Eye, seeks to get humans out of the picture. If it works, it could change the world of surveillance overnight.
Mr Americana, Overpasses News Desk
April 15th, 2018