Part of OpenAI’s founding team creates its own company "Safe Superintelligence" to create security-focused AI

OpenAI has been one of the pioneers in bringing Artificial Intelligence models to the general public through tools they can interact with directly, such as ChatGPT. However, both this tool and the company itself have come under scrutiny due to concerns about safety and the implications that AI could have for humanity.

Focusing on this specific problem, one of the founders of OpenAI and part of his team have embarked on a new company, Safe Superintelligence, which aims to develop new Artificial Intelligences with a safety-oriented approach.

According to the founder himself, Ilya Sutskever, the company's approach will be based on a development model where progress, safety and user protection are the priorities, insulated from the commercial and cost pressures that other companies face.

Today, the company's website, ssi.inc, shows only a plain-text statement explaining its intention to create the world's first safe AI laboratory, advancing the capabilities of this technology without putting users at risk.

Initially, SSI's offices are located in Palo Alto and Tel Aviv.

Article Editor: Antonio Delgado

Computer Engineer by training, editor and hardware analyst at Geeknetic since 2011. I love taking apart everything that passes through my hands, especially the latest hardware we receive for review. In my free time I tinker with 3D printers, drones and other gadgets. For anything, here I am.