Gabriel Mukobi

Articles

  • Jun 13, 2023 | forum.effectivealtruism.org | Mark Xu | Gabriel Mukobi

    The Alignment Research Center’s Theory team is starting a new hiring round for researchers with a theoretical background. Please apply here. What is ARC’s Theory team? The Alignment Research Center (ARC) is a non-profit whose mission is to align future machine learning systems with human interests.

  • Jun 6, 2023 | lesswrong.com | Gabriel Mukobi

    The Story as of ~4 Years Ago: Back in 2020, a group at OpenAI ran a conceptually simple test to quantify how much AI progress was attributable to algorithmic improvements. They took ImageNet models which were state-of-the-art at various times between 2012 and 2020, and checked how much compute was needed to train each to the level of AlexNet (the state-of-the-art from 2012). Main finding: over ~7 years, the compute required fell by ~44x.
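    The headline number in that excerpt implies a steady rate of algorithmic progress, which can be checked with a few lines of arithmetic. This is a minimal sketch using only the two figures quoted above (a ~44x compute reduction over ~7 years); it is not a recomputation from the original OpenAI data:

    ```python
    import math

    # Figures as quoted in the excerpt above (not recomputed from source data)
    reduction_factor = 44   # total fall in compute needed to match AlexNet
    span_years = 7          # approximate span of the measurement

    # Number of successive 2x reductions packed into that 44x
    halvings = math.log2(reduction_factor)

    # Implied time for training compute requirements to halve
    halving_time_months = span_years * 12 / halvings

    print(f"~{halvings:.1f} halvings -> one halving every "
          f"~{halving_time_months:.0f} months")
    # A ~44x reduction over ~7 years works out to a halving of required
    # compute roughly every 15-16 months.
    ```

    In other words, the quoted result corresponds to algorithmic efficiency roughly doubling every 16 months over that period.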

  • Feb 15, 2023 | lesswrong.com | Garrett Baker | Daniel Kokotajlo | Julian Bradshaw | Gabriel Mukobi

    Yesterday in conversation a friend reasserted that we can just turn off an AI that isn't aligned. Bing Chat is blatantly, aggressively misaligned. Yet Microsoft has not turned it off, and I believe they don't have any plans to do so. This is a great counter-example to the "we can just unplug it" defense. In practice, we never will. To demonstrate this point, I have created a petition arguing for the immediate unplugging of Bing Chat. The larger this grows, the more pertinent the point will be.
