
Eliezer Yudkowsky
Research Fellow at Machine Intelligence Research Institute
The original AI alignment person. Missing punctuation at the end of a sentence means it's humor. If you're not sure, it's also very likely humor.
Articles
-
2 weeks ago |
ifanyonebuildsit.com | Eliezer Yudkowsky
The scramble to create superhuman AI has put us on the path to extinction — but it's not too late to change course, as two of the field's earliest researchers explain in this clarion call for humanity. In 2023, hundreds of AI luminaries signed an open letter warning that artificial intelligence poses a serious risk of human extinction. Since then, the AI race has only intensified. Companies and countries are rushing to build machines that will be smarter than any person.
-
Feb 19, 2025 |
lesswrong.com | Eliezer Yudkowsky | Ebenezer Dukakis | Kris Moore | David Gross
Working in the field of genetics is a bizarre experience. No one seems to be interested in the most interesting applications of their research. We’ve spent the better part of the last two decades unravelling exactly how the human genome works and which specific letter changes in our DNA affect things like diabetes risk or college graduation rates. Our knowledge has advanced to the point where, if we had a safe and reliable means of modifying genes in embryos, we could literally create superbabies.
-
Feb 14, 2025 |
80000hours.org | Robert Wiblin | Eliezer Yudkowsky
So one criticism of “the AGI ideology,” as these people would put it, is that AGI is not foreordained… But when we talk about it as if inherently it’s coming, and it will have certain properties, that deprives citizens of agency. Now, the counterposition I would offer is: you don’t want to equip groups trying to shape history with a naive model of what’s possible.
-
Sep 24, 2024 |
lesswrong.com | Eliezer Yudkowsky | Lucius Bushnaq | Thomas Kwa | Joey KL
Crossposted from Twitter with Eliezer's permission. A common claim among e/accs is that, since the solar system is big, Earth will be left alone by superintelligences. A simple rejoinder is that just because Bernard Arnault has $170 billion, it does not mean that he'll give you $77.18. Earth subtends only 4.54e-10 = 0.0000000454% of the angular area around the Sun, according to GPT-o1.
X (formerly Twitter)
- Followers
- 190K
- Tweets
- 26K
- DMs Open
- Yes

RT @WillCainShow_: .@So8res on A.I.: "Once they build AIs that are smarter than us and give them things like robot bodies. If they have eno…

RT @Aella_Girl: You never really know how good someone is until you see how they act in a situation where they'd be rewarded for cruelty. T…

Some people say they liked this one better than previous podcasts. https://t.co/xdiGrP7tYN