History
Hiroshi Yamakawa, a researcher of artificial general intelligence, posted on a social networking site (December 12, 2019) the view that the essence of the singularity lies in the limitations of evolutionary strategies based on “autonomous decentralization + diversification + local competition.” In response, the evolutionary linguist Kazuo Okanoya and others commented that they had arrived at similar views around the same time, which sparked a discussion among many people, including the science fiction writer Satoshi Hase. In the course of these discussions, Hiroshi Nakagawa proposed holding an event, which led to the seminar “The End of Life Evolution and the World after the Singularity” (March 19, 2020) with Hiroshi Yamakawa and Kazuo Okanoya. (ref. FB thread in Japanese: http://bit.ly/3725z8v)
Aim of this seminar:
The three basic elements of the evolutionary strategy appear to be autonomous decentralization, diversification, and local competition. In other words, distributed agents with increasing diversity develop autonomously and compete locally. Through this strategy, humankind has developed language, society, civilization, technology, and so on at a rate accelerating beyond that of biological evolution.
This evolutionary strategy has worked effectively when individuals and their populations develop against environmental threats and pressures in relatively large environments. However, as humanity has increased its power, its by-products have created artificial risks such as nuclear war and climate change. These problems likely stem from the destruction of the “locality” of competition, which was a premise of the evolutionary strategy. The essence of the technological singularity may thus be the limit of the evolutionary strategy itself. This risk is also unavoidable for non-human intelligent creatures and advanced AI agents yet to come if they follow a similar strategy. This seminar therefore discusses whether the destruction of locality really limits the evolutionary strategy; if so, whether it is necessary to reconsider the values of “autonomous decentralization” and “diversification” that underlie the current evolutionary strategy; and whether a new form of growth will emerge beyond it.
To sustain an increasingly globalized, complex, and fast-moving human society, support from AI systems with greater autonomy will be inevitable. It is therefore necessary to treat advanced AI and its intelligence explosion not as a risk factor but as a means of overcoming global risks. In the latter half of the seminar, we hold an open discussion with the audience on what capabilities and what value standards advanced AI should have if we are to advance into the future together with it.
Video contents:
- Video vol.1 (Japanese, English subtitled)
  Opening: Kazuo Okanoya
  Introduction: Hiroshi Yamakawa
  Thread from AI: Hiroshi Nakagawa
- Video vol.2 (Japanese, English subtitled)
  The end of the evolution strategy brought about by technological advance: Hiroshi Yamakawa
- Video vol.3 (Japanese, English subtitled)
  End of language generation and evolution: Kazuo Okanoya
- Video vol.4 (Japanese, English subtitled)
  Emergence of Environment: Satoshi Hase
- Video vol.5 (Japanese, English subtitled)
  Discussion
  Moderator: Hiroshi Nakagawa
  Panelists: Satoshi Hase, Hiroshi Yamakawa, Kazuo Okanoya
The full list of videos is available here
Speakers:
- Hiroshi Yamakawa (Whole Brain Architecture Initiative, RIKEN AIP)
- Kazuo Okanoya (University of Tokyo)
- Hiroshi Nakagawa (RIKEN AIP)
- Satoshi Hase (Science Fiction writer: guest)
Organizers:
- Co-Hosted by The Whole Brain Architecture Initiative
- Co-Hosted by New academic field “Linguistic Evolution for Co-Creative Communication”
- Supported by Alt Inc. with its AI meeting-minutes system
- Video production and distribution by Eurasia Film Festival
Related literature:
- Yuval Noah Harari, Sapiens: A Brief History of Humankind, Harper, 2015.
- Yuval Noah Harari, Homo Deus: A Brief History of Tomorrow, Vintage Digital, 2016.
- Hiroshi Yamakawa, “Peacekeeping Conditions for an Artificial Intelligence Society,” Big Data Cogn. Comput. 2019, 3, 34.