Facebook aims for AI to better understand human beings in day-to-day life, advancing with first-person data

On Thursday, Facebook announced Ego4D, a new long-term project that seeks to improve the ability of Artificial Intelligence (AI) systems to perceive human beings in everyday situations, especially through first-person data.

The Facebook-led project takes on the challenge of getting AI to understand and interact with the world from an egocentric perspective, the same way people do. Most current algorithms, however, such as those for image recognition, are trained on third-person videos, which makes this difficult.

“Next-generation artificial intelligence systems will need to learn from an entirely different kind of data: videos that show the world from the center of the action, rather than from the sidelines,” said Facebook research lead Kristen Grauman, as quoted in a company statement.

This research will help lay the essential groundwork for AI-based virtual assistants, augmented-reality glasses and robots to assist people in everyday situations, such as finding lost keys, following a cooking recipe, learning to play the drums or recalling a precise moment of the day in hologram form.

The company has established a partnership with 13 universities and research centers across nine countries. These institutions have collected more than 2,200 hours of first-person video of everyday situations, recorded by 700 participants going about their daily routines.

This audiovisual dataset will significantly expand the data available to the scientific community: measured in hours of stored footage, it is 20 times larger than other comparable resource banks.

Facebook AI, in collaboration with this consortium and with Facebook Reality Labs Research (FRL Research), has developed five benchmarks focused on these first-person experiences, intended to drive real-world applications for future personal assistants.

These pillars are episodic memory, forecasting, hand-object manipulation, audiovisual recall and social interaction, all of them areas in which AI currently struggles to replicate the first-person perspective of a human being.

Facebook hopes that development around these five pillars will enable AI to interact with people not only in the real world but also in the metaverse, of which augmented and virtual reality will form a part.

“Not only will AI start to understand the world around it better, it could one day be personalized at an individual level; it could recognize your favorite coffee mug or guide your itinerary for your next family trip,” said Grauman.

The Ego4D university consortium will release the data this year for permitted research uses, and in early 2022 researchers will invite experts to a challenge to teach machines to understand the world in the first person.

By Editor