IBM’s cloud adds support for Nvidia’s fastest GPUs yet


IBM today announced that users of its Bluemix cloud will soon be able to add two Nvidia Tesla P100 accelerator cards to their bare metal servers. The company says the feature will launch later this month, and when it goes live, IBM will likely be the first major cloud provider to support these chips, which offer up to 4.7 teraflops of double-precision performance and 16 gigabytes of memory per card.
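For anyone provisioning one of these bare metal servers, a quick sanity check is to enumerate the attached GPUs through the CUDA runtime. The sketch below is a generic illustration, not part of IBM's announcement, and assumes the Nvidia driver and CUDA toolkit are already installed on the machine.

```cuda
// check_gpus.cu -- minimal sketch: enumerate the CUDA devices on a server.
// Assumes the Nvidia driver and CUDA toolkit are installed (not part of IBM's announcement).
// Build: nvcc check_gpus.cu -o check_gpus
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable devices found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // A Tesla P100 reports compute capability 6.0 and roughly 16 GB of memory.
        std::printf("GPU %d: %s, %.1f GB, compute capability %d.%d\n",
                    i, prop.name,
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0),
                    prop.major, prop.minor);
    }
    return 0;
}
```

On a server configured with the two P100 cards described above, this should list two devices; the exact names and memory figures depend on the hardware actually provisioned.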

There is still a chance that Google could beat IBM to market, though. Late last year, Google announced that it would support Nvidia's newest GPUs early this year, but it hasn't said exactly when it plans to launch the feature. We asked Google for an updated timeline but haven't heard back yet.

AWS also offers GPU support, of course, and its instances can be outfitted with up to 16 GPUs (those are the older K80 cards, though 16 of them still add up to a lot of raw computing power). Microsoft's Azure offers a similar setup, with support for up to four of Nvidia's slightly older GPUs.

The Tesla P100 is Nvidia's fastest GPU yet and has been optimized for deep learning workloads. Beyond machine learning, though, plenty of other high-performance computing applications now use GPUs to speed up their processing.
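As a rough illustration of the kind of work these cards offload, here is a minimal double-precision kernel of the sort that GPU-accelerated HPC codes run at much larger scale. It is a generic sketch assuming a CUDA-capable device, not code from IBM's or Nvidia's announcements.

```cuda
// daxpy.cu -- minimal sketch of a double-precision kernel (y = a*x + y),
// the kind of arithmetic the P100's FP64 throughput is aimed at.
// Generic illustration only; not taken from the IBM or Nvidia announcements.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void daxpy(int n, double a, const double* x, double* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<double> hx(n, 1.0), hy(n, 2.0);

    // Copy the input vectors to GPU memory.
    double *dx, *dy;
    cudaMalloc(&dx, n * sizeof(double));
    cudaMalloc(&dy, n * sizeof(double));
    cudaMemcpy(dx, hx.data(), n * sizeof(double), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy.data(), n * sizeof(double), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    daxpy<<<(n + 255) / 256, 256>>>(n, 3.0, dx, dy);
    cudaDeviceSynchronize();

    cudaMemcpy(hy.data(), dy, n * sizeof(double), cudaMemcpyDeviceToHost);
    std::printf("y[0] = %.1f (expected 5.0)\n", hy[0]);

    cudaFree(dx);
    cudaFree(dy);
    return 0;
}
```

Real deep learning and HPC workloads layer frameworks and tuned libraries on top of kernels like this, but the basic pattern of moving data to the card and running massively parallel arithmetic there is the same.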

“As the AI era takes hold, demand continues to surge for our GPU-accelerated computing platform in the cloud,” said Ian Buck, Nvidia’s general manager for accelerated computing, in today’s announcement. “These new IBM Cloud offerings will provide users with near-instant access to the most powerful GPU technologies to date — enabling them to create applications to address complex problems that were once unsolvable.”

The rush to bring the newest and fastest GPUs to the cloud shows how the big cloud platforms are now racing to offer their customers the fastest machines for deep learning and artificial intelligence applications. Basic cloud computing capacity, after all, is quickly becoming a commodity, but for the time being the major players can still differentiate their services with better GPU support, at least until that, too, becomes table stakes.