
#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization


Eliezer Yudkowsky is a researcher, writer, and philosopher on the topic of superintelligent AI. Please support this podcast by checking out our sponsors:
Linode: https://linode.com/lex to get $100 free credit
House of Macadamias: https://houseofmacadamias.com/lex and use code LEX to get 20% off your first order
InsideTracker: https://insidetracker.com/lex to get 20% off

EPISODE LINKS:
Eliezer’s Twitter: https://twitter.com/ESYudkowsky
LessWrong Blog: https://lesswrong.com
Eliezer’s Blog page: https://www.lesswrong.com/users/eliezer_yudkowsky
Books and resources mentioned:
1. AGI Ruin (blog post): https://lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities
2. Adaptation and Natural Selection: https://amzn.to/40F5gfa

PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
YouTube Full Episodes: https://youtube.com/lexfridman
YouTube Clips: https://youtube.com/lexclips

SUPPORT & CONNECT:
– Check out the sponsors above, it’s the best way to support this podcast
– Support on Patreon: https://www.patreon.com/lexfridman
– Twitter: https://twitter.com/lexfridman
– Instagram: https://www.instagram.com/lexfridman
– LinkedIn: https://www.linkedin.com/in/lexfridman
– Facebook: https://www.facebook.com/lexfridman
– Medium: https://medium.com/@lexfridman

OUTLINE:
Here are the timestamps for the episode. On some podcast players you should be able to click the timestamp to jump to that time.
(00:00) – Introduction
(05:19) – GPT-4
(28:00) – Open sourcing GPT-4
(44:18) – Defining AGI
(52:14) – AGI alignment
(1:35:06) – How AGI may kill us
(2:27:27) – Superintelligence
(2:34:39) – Evolution
(2:41:09) – Consciousness
(2:51:41) – Aliens
(2:57:12) – AGI Timeline
(3:05:11) – Ego
(3:11:03) – Advice for young people
(3:16:21) – Mortality
(3:18:02) – Love


