In this special guest feature, Steve Conner, Vice President of Solutions Engineering at Vantage Data Centers, discusses how AI is rewriting the rules of future computing. As AI becomes increasingly advanced, data centers must be able to adapt and scale to meet the needs of their clients.
Machine Learning Solves Data Center Problems, But Also Creates New Ones
In this special guest feature, Geoff Tudor, VP and GM of Cloud Data Services at Panzura, argues that AI poses both opportunities and risks in the automation of the data center. This article provides an overview of AI's impact on the data center and how companies can prepare their storage infrastructure for these technologies.
HPE Brings Artificial Intelligence to the Data Center
Hewlett Packard Enterprise (NYSE:HPE) announced an artificial intelligence (AI) recommendation engine designed to simplify infrastructure management, reduce guesswork, and improve application reliability. HPE InfoSight is an industry-leading predictive analytics platform that brings software-defined intelligence to the data center, with the ability to predict and prevent infrastructure problems before they happen.
ROOT Data Center – Wholesale Provider to Implement AI and Machine Learning for Reduced Downtime Risk
ROOT Data Center announced that it is the first wholesale data center in the world to use artificial intelligence (AI) and machine learning to reduce the risk of data center downtime. ROOT Data Center has partnered with Litbit, a state-of-the-art AI and machine learning technology firm, to deploy the technology within ROOT's Montréal-based facility.
The Cost of Updating Older Data Centers for Big Data Needs
In this contributed article, freelance writer Avery Phillips suggests that before your company makes the leap to update your servers and other data center equipment for the needs of your big data deployments, you should make sure you understand the challenges ahead.
Argyle Data Extends Predictive Analytics Offerings to Enterprise Data Centers
Argyle Data has expanded its core machine learning and AI application suite to serve enterprise clients in areas including IoT security, financial services, and online/mobile banking.
Interview: Adaptive Computing Brings Big Workflow to the Data Center
“Our thought process was that Big Data + a better Workflow = Big Workflow. We coined it as an industry term to denote a faster, more accurate and more cost-effective big data analysis process. It is not a product or trademarked name of Adaptive’s, and we hope it becomes a common term in the industry that is synonymous with a more efficient big data analysis process.”
Big Data Sparks Business for Containerized Data Centers
Over at InformationWeek, Kevin Fogarty writes that big data growth has more companies adopting containerized data centers and other solutions to meet storage demands. Demand is growing so quickly that data center construction projects are getting backed up or delayed at unprecedented rates, according to Data Center Journal. As a result, record numbers of […]
Alice & Bob to Integrate Cat Qubits in Datacenters of the Future, Accelerated by NVIDIA Technology
Alice & Bob, a quantum hardware manufacturer and leading QPU designer, announced it is working to accelerate the integration of quantum technology into industry by introducing cat qubits into the datacenters of the future.
The Future-Proofed Datacenter: DDC Delivers 85kW Air-Cooled Density for AI and HPC Workloads
[SPONSORED POST] At DDC, the global leader in scalable datacenter-to-edge solutions, we are taking an innovative approach to building new and retrofitting legacy datacenters. Today, our patented cabinet technology can be deployed in nearly any environment or facility and supports one of the highest-density, air-cooled thermal loads—85kW per cabinet—on the market. In a recent deployment with TierPoint, a US-based colocation provider, we are supporting a 26,000 sq ft facility augmentation outside of Allentown, PA.