Google launches Project Gameface technology on Android, a "mouse" that will allow you to control apps and games with your face

May 15. (Portaltic/EP) –

Google has announced the launch of Project Gameface on Android, the "mouse" that allows applications and games to be controlled with head movements and facial expressions, which is now available as open source for Android developers in order to improve accessibility on devices with this operating system.

Within the framework of its annual Google I/O developer event, the technology giant announced its latest news related to Artificial Intelligence (AI), as well as to its Android operating system.

Among the news shared, Google has announced that it has launched the Project Gameface technology on Android, which will allow users to control applications with their face and facial expressions.

Project Gameface is an open source tool that Google announced at last year's Google I/O event. As defined by the company, it is a "mouse" that allows users to control a computer cursor through head movements and various facial gestures.

In other words, this technology makes it possible to control a game by, for example, raising the eyebrows to click or opening the mouth to move the cursor. All of this is aimed at making the use of technology more accessible for people with mobility impairments or diseases such as muscular dystrophy.

Now, Google has shared that the source code of this technology is open so that developers can start creating Android apps with this control system.

Thus, as explained in a statement on its blog, the company has replicated the idea of moving the computer cursor with the head and has integrated it into Android smartphones. With this, the front camera of the device in question is used to analyze users' facial expressions and head movements, translating these gestures into "intuitive and personalized" control.

In fact, users will be able to configure their experience with this technology, customizing the facial expressions, the size of the gestures or even the speed of the cursor, among other options.

To integrate Project Gameface into Android, Google has indicated that, on the one hand, it used the operating system's accessibility service to "create a new cursor". On the other, it has also leveraged the face landmark detection API of its MediaPipe framework to program the cursor so that it moves across the screen following the movement of the user's head.
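As an illustration, the following is a minimal Kotlin sketch of how an Android app might set up MediaPipe's Face Landmarker to process front-camera frames. It is based on the publicly documented MediaPipe Tasks API, not on Google's Project Gameface code; the model file name, the single-face setting and the callback wiring are assumptions made for the example.

```kotlin
import android.content.Context
import com.google.mediapipe.framework.image.MPImage
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarker
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarkerResult

// Create a Face Landmarker that runs on live camera frames and also
// outputs blendshape (facial expression) scores for gesture detection.
fun createFaceLandmarker(
    context: Context,
    onResult: (FaceLandmarkerResult) -> Unit
): FaceLandmarker {
    val options = FaceLandmarker.FaceLandmarkerOptions.builder()
        .setBaseOptions(
            BaseOptions.builder()
                .setModelAssetPath("face_landmarker.task") // assumed model file in app assets
                .build()
        )
        .setRunningMode(RunningMode.LIVE_STREAM)   // asynchronous front-camera frames
        .setNumFaces(1)                            // one user controlling the cursor
        .setOutputFaceBlendshapes(true)            // expression scores (eyebrows, mouth, ...)
        .setResultListener { result: FaceLandmarkerResult, _: MPImage ->
            onResult(result)                       // hand each result to the cursor logic
        }
        .build()
    return FaceLandmarker.createFromOptions(context, options)
}

// Each camera frame, converted to an MPImage, would then be submitted with:
// faceLandmarker.detectAsync(mpImage, frameTimestampMs)
```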

Additionally, this API is capable of recognizing 52 facial gestures, such as raising the eyebrows, opening the mouth or winking. Google has therefore used these gestures to "map and control a wide range of functions."
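Building on the previous sketch, the snippet below shows one way such blendshape scores could be mapped to actions, for instance raising the eyebrows to click or opening the mouth to move the cursor. The category names follow MediaPipe's documented blendshape set, while the threshold and the helper functions are hypothetical placeholders rather than Google's actual mapping.

```kotlin
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarkerResult

// Hypothetical gesture-to-action mapping based on blendshape scores (0.0 to 1.0)
// returned by the Face Landmarker for the detected face.
fun handleGestures(result: FaceLandmarkerResult, threshold: Float = 0.6f) {
    val blendshapes = result.faceBlendshapes()
    if (!blendshapes.isPresent || blendshapes.get().isEmpty()) return

    // Scores for the first (and only) detected face, keyed by category name.
    val scores = blendshapes.get()[0].associate { it.categoryName() to it.score() }

    val browsRaised = (scores["browInnerUp"] ?: 0f) > threshold
    val mouthOpen = (scores["jawOpen"] ?: 0f) > threshold

    if (browsRaised) performClick()   // e.g. dispatch a tap via the accessibility service
    if (mouthOpen) moveCursor()       // e.g. start moving the on-screen cursor
}

// Placeholder actions an accessibility service could implement.
fun performClick() { /* AccessibilityService.dispatchGesture(...) */ }
fun moveCursor() { /* update the virtual cursor position */ }
```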

In this way, in addition to games, Android applications that include Project Gameface will make it possible, for example, to type on instant messaging platforms by controlling the cursor with directional head movements.

With all this, Google has reiterated that the Project Gameface code "is now open source on GitHub", so that anyone can access it to implement it in their applications and "make Android devices more accessible."
