DataON announces Intel Select Solutions for AI Inferencing


Accelerate artificial intelligence (AI) inferencing and deployment on an optimized, verified infrastructure based on Intel technology

ANAHEIM, Calif. – June 18, 2020 – DataON, the industry-leading provider of hybrid cloud solutions for Microsoft Azure Stack HCI with cloud-based Azure Services, has collaborated with Intel to offer verified Intel Select Solutions for AI Inferencing, powered by 2nd Gen Intel Xeon Scalable processors.

Businesses are increasingly looking to artificial intelligence (AI) to increase revenue, drive efficiencies, and innovate their offerings. AI use cases powered by deep learning (DL) generate some of the most powerful and useful insights. Use cases such as image classification, object detection, image segmentation, natural language processing, and recommender systems can enable advances across numerous industries.

Intel Select Solutions for AI Inferencing give you a jumpstart in deploying efficient AI inferencing algorithms on solutions built on validated Intel architecture, so you can innovate and go to market faster. To speed AI inferencing and time to market for applications built on AI, these solutions combine several Intel and third-party software and hardware technologies.

Intel Select Solutions for AI Inferencing combine 2nd Gen Intel Xeon Scalable processors, Intel Optane solid state drives (SSDs), and Intel 3D NAND SSDs, so your business can quickly deploy a production-grade AI infrastructure built on a performance-optimized platform that offers high-capacity memory for the most demanding applications and workloads. For Base configurations, the Intel Xeon Gold 6248 processor provides an optimized balance of price, performance, and built-in technologies that enhance performance and efficiency for inferencing on AI models. 2nd Gen Intel Xeon Scalable processors include Intel Deep Learning Boost, a family of acceleration features that improves AI inference performance using the specialized Vector Neural Network Instructions (VNNI).
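
To illustrate the kind of workload Intel Deep Learning Boost targets, here is a minimal, hypothetical sketch (not part of the Intel Select Solutions reference stack) that uses PyTorch dynamic quantization to run a small model in INT8, the numeric format whose matrix math VNNI accelerates on 2nd Gen Intel Xeon Scalable processors:

    # Hypothetical sketch: INT8 inference via PyTorch dynamic quantization.
    # The model and shapes are made up for illustration only.
    import torch
    import torch.nn as nn

    # Stand-in FP32 model; a real deployment would load a trained network.
    model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))
    model.eval()

    # Quantize the Linear layers to INT8. On CPUs with Intel DL Boost, the
    # backend can execute this INT8 matrix math with VNNI instructions.
    quantized = torch.quantization.quantize_dynamic(
        model, {nn.Linear}, dtype=torch.qint8
    )

    # Run inference on a dummy batch.
    with torch.no_grad():
        output = quantized(torch.randn(8, 512))
    print(output.shape)  # torch.Size([8, 10])

The design point is that 8-bit integer inference lets VNNI perform in a single instruction the multiply-accumulate sequence that previously required several, which is the source of the inference speedup on these processors.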

Intel Optane technology fills critical gaps in the storage and memory hierarchy, enabling data centers to accelerate their access to data. This technology also disrupts the memory and storage tier, delivering persistent memory, large memory pools, fast caching, and storage in a variety of products and solutions.

AI inferencing performs best when the cache tier is on fast SSDs with low latency and high endurance. Intel Optane SSDs power the cache tier in these Intel Select Solutions. Intel Optane SSDs offer high input/output (I/O) operations per second (IOPS) per dollar with low latency, coupled with an endurance of 30 drive writes per day (DWPD), making them ideal for write-heavy cache functions. The capacity tier is served by Intel 3D NAND SSDs, delivering optimized read performance with a combination of data integrity, performance consistency, and drive reliability.

Intel Select Solutions

These solutions add to DataON’s first-generation Intel Select Solutions for Windows Server Software-Defined Storage, second-generation Intel Select Solutions for Azure Stack HCI, and Intel Select Solutions for Microsoft SQL Server. These solutions are designed by DataON and verified by Intel to reduce the time required to evaluate, select, and purchase hardware for today’s workloads and applications. They are rigorously benchmark-tested with today’s high-priority workloads, helping businesses realize smooth deployments and optimal performance. Because pre-defined configurations eliminate guesswork and ensure predictability, businesses can take advantage of new technologies faster.

“DataON was an inaugural partner for Intel Select Solutions for Windows Server Software-Defined Storage, and we are proud to continue to add to our family of Intel Select Solution offerings,” said Howard Lo, vice president of sales and marketing, DataON. “Having these validated solutions demonstrates our commitment to providing Intel-based solutions for popular workloads such as Azure Stack HCI, SQL Server, and AI Inferencing. Customers will be able to realize a faster time-to-value and reduced operational costs with these pre-validated solutions.”

“Inferencing on deep neural networks demands substantial resources, along with high performance and consistent quality,” said Jake Smith, Director, Data Center Technologies at Intel. “Intel Select Solutions for AI Inferencing, delivered through industry leaders like DataON, accelerate time to deployment, help protect IT investments, and provide our customers with a turnkey platform composed of validated Intel architecture building blocks for low-latency, high-throughput inference performed on a CPU.”

About DataON

DataON is a hybrid-cloud computing company focused on delivering Microsoft Azure Stack HCI, on-premises storage systems, intelligent edge appliances, and cloud-based Microsoft Azure Services. We help enterprises and customers who have made the “Microsoft choice” modernize their IT with Microsoft applications, virtualization, and data protection through a complete and turnkey experience.

With over 650 HCI clusters and 150PB of storage deployed, DataON’s enterprise-level solutions are designed to provide the highest levels of performance, manageability, and security. DataON is a Microsoft Gold Partner, Microsoft Cloud Service Provider (CSP), and an Intel Platinum Partner.

For more information, go to www.dataonstorage.com or call +1 (714) 441-8820.

 

Intel, the Intel logo, Xeon, Optane and other Intel marks are trademarks of Intel Corporation or its subsidiaries. All trademarks, registered trademarks, service marks, brands and names mentioned herein are property of their respective owners.

