
Publications / Conference Presentations

A Collective Intelligence Approach to Safe Artificial General Intelligence

AGI Seattle 2024

Craig A. Kaplan, PhD

If Artificial General Intelligence proves to be a “winner-take-all” scenario where the first company or country to develop AGI dominates, then the first AGI must also be the safest. The safest and fastest path to AGI may be to harness the collective intelligence of multiple AI and human agents in an AGI network. 


This approach has roots in seminal ideas from four of the scientists who founded the field of AI: Allen Newell, Marvin Minsky, Claude Shannon, and Herbert Simon.

How to Create AGI and Not Die

International Conference on Artificial Intelligence and Machine Learning, San Francisco 2024

Craig A. Kaplan, PhD

To ensure the safe and beneficial development of AGI, a collaborative approach involving both humans and AI is proposed. By actively engaging humans throughout the development process, AI's values can be better aligned with human values. This approach allows for a broader and more diverse input from millions of people, leading to a more representative and reliable reflection of human values in AGI. This collaborative approach contrasts with the alternative approach of relying on a small group of experts to define a set of rules for AI to follow, which may not accurately capture the full spectrum of human values.

How to Create AGI and Not Die

IFoRE 2023 by Sigma Xi
Long Beach, CA 2023

Craig A. Kaplan, PhD

The safest path to AGI is to create a community of human and AI agents. Keeping humans in the loop for as long as possible maximizes the opportunity for humans to align the values of AI before it achieves superintelligence. Enabling millions of humans to teach AI agents their values ensures that the values of AGI reflect a statistically representative and valid sample of human values. This approach stands in stark contrast to the idea of allowing AI to teach itself values by following a "constitution" created by an elite few.
