AI News | How Intel FPGAs Accelerate Microsoft Bing's Intelligent Search

  • Overview
  • Transcript

This episode looks at Microsoft’s Bing Intelligent Search and how Intel® FPGAs make real-time AI possible by providing customized hardware acceleration to complement Intel® Xeon® CPUs!

Intel FPGAs Accelerate Artificial Intelligence for Deep Learning in Microsoft’s Bing Intelligent Search

Subscribe to the Intel Software YouTube Channel

AI News YouTube Playlist

Hi, I'm David Shaw, and welcome to the weekly edition of AI News. Today, we'll discuss how Intel FPGAs accelerate deep learning in Microsoft's Bing Intelligent Search. Bing recently launched features such as returning results that include relevant information across multiple sources and even hover-over definitions of uncommon words. Bing Intelligent Search provides answers instead of just web pages. It enables an understanding of words and the meaning behind them.

How do Intel FPGAs deliver intelligent search? In applications like Bing, they make real-time AI possible by providing customized hardware acceleration to complement Intel Xeon CPUs. This is a great example of how Intel FPGAs enable developers to design accelerator functions directly in the hardware to reduce latency, increase throughput, and improve power efficiency. Read more about how Intel FPGAs can help at the links provided. Don't forget to like this video and subscribe. And we'll see you next week for more AI news.
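The CPU-plus-accelerator split described above can be sketched in a few lines. This is a hypothetical illustration, not Microsoft's or Intel's actual code: the `AcceleratorStub` class and `score_documents` kernel are invented stand-ins for what, in a real deployment, would be an FPGA bitstream driven through a vendor runtime. The point is the division of labor: the host CPU keeps orchestrating requests while the compute-heavy kernel runs asynchronously on the "accelerator."

```python
# Hypothetical sketch of a CPU + accelerator offload pattern.
# In a real system the kernel would be synthesized into FPGA logic
# and dispatched through a vendor runtime; here a thread pool stands in.

from concurrent.futures import ThreadPoolExecutor


def score_documents(query_vec, doc_vecs):
    """Compute-heavy scoring kernel: the kind of function mapped onto hardware."""
    return [sum(q * d for q, d in zip(query_vec, doc)) for doc in doc_vecs]


class AcceleratorStub:
    """Stand-in for an FPGA offload queue (invented for illustration)."""

    def __init__(self):
        self._pool = ThreadPoolExecutor(max_workers=1)

    def offload(self, query_vec, doc_vecs):
        # Returns immediately; the kernel runs "on the accelerator."
        return self._pool.submit(score_documents, query_vec, doc_vecs)


accel = AcceleratorStub()
future = accel.offload([1.0, 0.5], [[2.0, 2.0], [0.0, 4.0]])
# The CPU is free to handle other requests here while the kernel runs.
scores = future.result()
print(scores)
```

Because the offloaded kernel is fixed-function and pipelined in real FPGA deployments, it delivers the low latency and high throughput the episode mentions, without tying up general-purpose CPU cores.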