Samuel K. Moore

New York

Senior Editor at IEEE Spectrum

Senior editor at IEEE Spectrum, the flagship publication of IEEE. Spectrum delivers news and analysis on computing, energy, semiconductors, and other tech.

Articles

  • 5 days ago | spectrum.ieee.org | Samuel K. Moore

    Naveen Verma’s lab at Princeton University is like a museum of all the ways engineers have tried to make AI ultra-efficient by using analog phenomena instead of digital computing. At one bench lies the most energy-efficient magnetic-memory-based neural-network computer ever made. At another you’ll find a resistive-memory-based chip that can compute the largest matrix of numbers of any analog AI system yet. Neither has a commercial future, according to Verma.

  • 1 week ago | spectrum.ieee.org | Samuel K. Moore

    Ask what—if anything—is holding back the AI industry, and the answer you get depends a lot on who you’re talking to. I asked one of Bloomberg’s former chief data wranglers, Carmen Li, and her answer was “price transparency.” According to Li, the inability of most of the smaller AI companies to predict how much they will need to spend for the privilege of renting time on a GPU to train their models makes their businesses unpredictable and has made financing AI companies unnecessarily expensive.

  • 1 month ago | spectrum.ieee.org | Samuel K. Moore

    In data centers, pluggable optical transceivers convert electronic bits to photons, fling them across the room, and then turn them back into electronic signals, making them a technological linchpin for controlling the blizzard of data used in AI. But the technology consumes quite a bit of power. In a data center containing 400,000 GPUs, Nvidia estimates that optical transceivers burn 40 megawatts.

  • 1 month ago | evdriven.com | Samuel K. Moore

    In data centers, pluggable optical transceivers convert electronic bits to photons, fling them across the room, and then turn them back into electronic signals, making them a technological linchpin for controlling the blizzard of data used in AI. But the technology consumes quite a bit of power. In a data center containing 400,000 GPUs, Nvidia estimates that optical transceivers burn 40 megawatts.

Contact details

Socials & Sites

X (formerly Twitter)

Followers: 578
Tweets: 247
DMs Open: No