Samuel K. Moore

New York

Senior Editor at IEEE Spectrum

Senior editor at IEEE Spectrum, the flagship publication of IEEE. Spectrum delivers news and analysis on computing, energy, semiconductors, and other tech.

Articles

  • 2 weeks ago | spectrum.ieee.org | Samuel K. Moore

    This week at the IEEE Electronic Components and Packaging Technology Conference, Intel revealed that it is developing new chip-packaging technology that will allow for bigger processors for AI. With Moore’s Law slowing down, makers of advanced GPUs and other data center chips are having to add more silicon area to their products to keep up with the relentless rise of AI’s computing needs.

  • 3 weeks ago | spectrum.ieee.org | Samuel K. Moore

    Naveen Verma’s lab at Princeton University is like a museum of all the ways engineers have tried to make AI ultra-efficient by using analog phenomena instead of digital computing. At one bench lies the most energy-efficient magnetic-memory-based neural-network computer ever made. At another you’ll find a resistive-memory-based chip that can compute the largest matrix of numbers of any analog AI system yet. Neither has a commercial future, according to Verma.

  • 1 month ago | spectrum.ieee.org | Samuel K. Moore

    Ask what—if anything—is holding back the AI industry, and the answer you get depends a lot on who you’re talking to. I asked one of Bloomberg’s former chief data wranglers, Carmen Li, and her answer was “price transparency.” According to Li, the inability of most smaller AI companies to predict how much they will need to spend to rent time on GPUs to train their models makes their businesses unpredictable and financing them unnecessarily expensive.

  • 1 month ago | spectrum.ieee.org | Samuel K. Moore

    In data centers, pluggable optical transceivers convert electronic bits to photons, fling them across the room, and then turn them back into electronic signals, making them a technological linchpin for controlling the blizzard of data used in AI. But the technology consumes quite a bit of power. In a data center containing 400,000 GPUs, Nvidia estimates that optical transceivers burn 40 megawatts, or roughly 100 watts per GPU just for the optics.

Socials & Sites

X (formerly Twitter)

Followers: 578
Tweets: 247
DMs Open: No
Samuel K. Moore @SamuelKMoore
26 Jul 22

This one goes to 11. No, make that 232 https://t.co/v2hHrxTTxB

Samuel K. Moore @SamuelKMoore
12 Jul 22

Sure the #JamesWebbSpaceTelescope images are awesome, but why not find out: 1) How the heck the optics work 2) How it stays at 2.7 degrees above absolute zero 3) How you send 28 megabits/s a distance of 1.5 million kilometers #UnfoldTheUniverse @NASAWebb https://t.co/3JWB931jo2

Samuel K. Moore @SamuelKMoore
11 Jul 22

Fodder for @HelloGarglers and @hellobuglers https://t.co/5rs1H3y7Vn Just think of how much work went into this...