Future of Life Institute
(2014; Boston; 5 founders, 6 staff, 14 advisors; futureoflife.org)
Seeks to mitigate threats from four existential risks it has identified: climate change, nuclear weapons, biotechnology, and AI. The institute catalyzes research into safeguarding life and developing optimistic visions of the future. Partner organizations include the Future of Humanity Institute, the Global Catastrophic Risk Institute, and the Machine Intelligence Research Institute. It informs the public by creating and disseminating relevant content and research.
- Future of Life Awards to honor people who have helped preserve humanity.
- AI Grants Competitions focused on projects that enable the development of AI that is beneficial to society and robust, in the sense that its benefits come with some guarantees.
- Ethics of Value Alignment Workshops
- autonomousweapons.org – A website designed to help concerned citizens learn more about threats from autonomous weapons and how they can help.
Leadership: Jaan Tallinn (one of the 5 founders)
Note: Advisors have included Stephen Hawking, Elon Musk, and Morgan Freeman.