
Eliezer Yudkowsky
Research Fellow at Machine Intelligence Research Institute
The original AI alignment person. Missing punctuation at the end of a sentence means it's humor. If you're not sure, it's also very likely humor.
Articles
- 2 months ago | lesswrong.com | Eliezer Yudkowsky | Ebenezer Dukakis | Kris Moore | David Gross
Working in the field of genetics is a bizarre experience. No one seems to be interested in the most interesting applications of their research. We’ve spent the better part of the last two decades unravelling exactly how the human genome works and which specific letter changes in our DNA affect things like diabetes risk or college graduation rates. Our knowledge has advanced to the point where, if we had a safe and reliable means of modifying genes in embryos, we could literally create superbabies.
- 2 months ago | 80000hours.org | Robert Wiblin | Eliezer Yudkowsky
So one criticism of “the AGI ideology,” as these people would put it, is that AGI is not foreordained… But when we talk about it as if inherently it’s coming, and it will have certain properties, that deprives citizens of agency. Now, the counterposition I would offer is: you don’t want to equip groups trying to shape history with a naive model of what’s possible.
- Sep 24, 2024 | lesswrong.com | Eliezer Yudkowsky | Lucius Bushnaq | Thomas Kwa | Joey KL
Crossposted from Twitter with Eliezer's permission. A common claim among e/accs is that, since the solar system is big, Earth will be left alone by superintelligences. A simple rejoinder is that just because Bernard Arnault has $170 billion, does not mean that he'll give you $77.18. Earth subtends only 4.54e-10 = 0.0000000454% of the angular area around the Sun, according to GPT-o1.
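The quoted fraction can be sanity-checked by hand: the solid angle of Earth's disc seen from the Sun is roughly πR²/d², and dividing by the full sphere's 4π steradians gives R²/(4d²). A minimal Python sketch, assuming standard values for Earth's mean radius (~6371 km) and the Earth-Sun distance (~1 AU); the excerpt itself attributes the 4.54e-10 figure to GPT-o1:

```python
import math

# Assumed standard values (not taken from the post itself):
R_EARTH_KM = 6371.0   # Earth's mean radius in km
AU_KM = 1.496e8       # mean Earth-Sun distance in km

# Solid angle of Earth's disc from the Sun: ~pi * R^2 / d^2 steradians.
# Dividing by the full sphere (4*pi sr) gives the fraction R^2 / (4 * d^2).
fraction = (math.pi * R_EARTH_KM**2 / AU_KM**2) / (4 * math.pi)

print(f"{fraction:.3e}")          # ~4.53e-10, matching the quoted 4.54e-10
print(f"{fraction * 100:.10f}%")  # ~0.0000000453%
```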
- Aug 30, 2024 | theguardian.com | Eliezer Yudkowsky | Tom Chivers | Alex Hern | Rudi Zygadlo
This week we are revisiting the Black Box series. This episode was first broadcast on 21 March 2024. For decades, Eliezer Yudkowsky has been trying to warn the world about the dangers of AI. And now people are finally listening to him. But is it too late? Illustration: Eleanor Shakespeare/The Guardian
X (formerly Twitter)
- Followers: 190K
- Tweets: 26K
- DMs Open: Yes

RT @So8res: Whether or not criminals deserve due process is beside the point. The point is that due process is how the state determines whe…

RT @patrissimo: While I’m a big fan of the new administration, I am strongly against the tariffs, and have been frustrated that even econ l…

RT @Ike_Saul: The tone of victimhood from the Trump-right is really startling right now. You control the House, Senate, and White House. Yo…