I recently caught up with Bernie Wu, Head of Business Development at MetalSoft, to discuss core industry trends: how integrated technologies from Big Data to AI are driving entirely new ways for businesses to operate, how these leaps are impacting their infrastructures and workloads, and where he sees the next generation of computing heading. Bernie has 25+ years of leadership experience at data center, hardware, and software infrastructure companies including Conner/Seagate, Cheyenne Software, Trend Micro, FalconStor, and Levyx. He holds a BS/MS in Engineering from UC Berkeley and an MBA from UCLA.
At this point in 2020, with the impact of COVID-19 and other factors, increasingly distributed infrastructures are amplifying the need for remote, automated data center management. In addition, the rise of new kinds of workloads such as cloud-native applications, containers, Big Data, NoSQL databases, and artificial intelligence is driving the need for bare metal infrastructure.
insideAI News: What are you seeing evolving quickly within companies where integrated technologies from Big Data to AI are driving entirely new ways for businesses to operate?
Bernie Wu: Integrated technologies spanning Big Data to AI were originally architected and developed to run on and consume entire dedicated computing clusters. However, developers have often chosen to use these technologies on virtualized cloud platforms to facilitate operational agility and rapid innovation. As applications based on these technologies mature, there is more interest in optimizing the costs, performance, and efficiencies associated with multi-cloud and private cloud strategies. In addition, 5G opens a new frontier of applications that increases the volume of data that must be collected and analyzed at the edges of the internet in real time. This is driving the need for further innovation in distributed, resilient infrastructure and analytics.
insideAI News: How are these leaps impacting their infrastructures and workloads? What do you foresee as the next generation of computing?
Bernie Wu: Advances in these areas are driving renewed interest in running applications directly on bare metal infrastructure and increasing demand for lights-out automation of that infrastructure. Containers orchestrated by Kubernetes are increasingly becoming the cloud-native application platform of choice for public and private clouds. An equally important need is to speed up AI and Big Data workloads by running them on bare metal servers equipped with GPUs, FPGAs, and other types of kernel bypass accelerators. Finally, the shift towards edge computing and the Internet of Things will also drive demand for new types of automated, remote management platforms and tools.
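As a rough illustration of this pattern, the sketch below uses the Kubernetes Python client to schedule an AI training container onto a GPU-equipped bare metal node. The node label, container image, and GPU count are illustrative assumptions, not MetalSoft specifics.

```python
# Minimal sketch: pinning a GPU-accelerated training pod to a bare metal node.
# The "node-type" label and image name are hypothetical examples.
from kubernetes import client, config

config.load_kube_config()  # use local kubeconfig credentials

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="training-job"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        # Target bare metal nodes advertising GPUs (label is an assumption).
        node_selector={"node-type": "baremetal-gpu"},
        containers=[
            client.V1Container(
                name="trainer",
                image="example.com/ai-trainer:latest",  # placeholder image
                resources=client.V1ResourceRequirements(
                    # Request one GPU exposed by the device plugin.
                    limits={"nvidia.com/gpu": "1"}
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```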
insideAI News: How does a bare metal infrastructure empower companies to do more with these greater capacities?
Bernie Wu: In our own studies, we have estimated that cost savings of up to 5X and performance/latency improvements of over 10X can be attained by migrating certain types of Big Data and AI workloads from overly virtualized server and networking environments to bare metal infrastructure clouds. Multi-tenant bare metal infrastructure cloud software platforms that include self-service provisioning and orchestration of a user’s networking, storage, and dedicated servers can also deliver greater infrastructure agility, fewer “noisy neighbor” problems, and stronger security isolation between different workloads. DevOps teams can also begin to apply Infrastructure as Code (IaC) approaches to managing bare metal clouds and their edges.
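As a rough sketch of what an IaC-style workflow against a bare metal cloud might look like, the snippet below declares a desired set of servers and asks a provisioning API to create them. The endpoint, payload fields, and hardware profiles are hypothetical placeholders, not any specific vendor’s interface.

```python
# Hypothetical Infrastructure-as-Code sketch for a multi-tenant bare metal cloud.
# The API endpoint and payload schema are assumptions for illustration only.
import requests

BARE_METAL_API = "https://baremetal.example.com/api/v1"  # placeholder endpoint
TOKEN = "REPLACE_WITH_API_TOKEN"

# Desired state, kept in version control alongside application code.
desired_servers = [
    {"hostname": "bigdata-node-1", "profile": "gpu-accelerated", "network": "tenant-a-vlan"},
    {"hostname": "bigdata-node-2", "profile": "gpu-accelerated", "network": "tenant-a-vlan"},
]

def provision(server_spec):
    """Ask the bare metal platform to converge a server to the declared spec."""
    resp = requests.post(
        f"{BARE_METAL_API}/servers",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=server_spec,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for spec in desired_servers:
        result = provision(spec)
        print(f"provisioned {spec['hostname']}: {result.get('status', 'unknown')}")
```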