Big Data Industry Predictions for 2021

2020 has been a year for the ages, with so many domestic and global challenges. But the big data industry has significant momentum moving into 2021. To give our valued readers a pulse on important new trends leading into next year, we here at insideAI News heard from all our friends across the vendor ecosystem to get their insights, reflections and predictions for what may be coming. We were very encouraged to hear such exciting perspectives. Even if only half actually come true, Big Data in the next year is destined to be quite an exciting ride. Enjoy!

Daniel D. Gutierrez – Editor-in-Chief & Resident Data Scientist

Analytics

The “analytic divide” is going to get worse. Like the much-publicized “digital divide,” we’re seeing the emergence of an “analytic divide.” The pandemic drove many companies to invest in analytics, while others were forced to cut anything they didn’t view as critical to keeping the lights on – and for these organizations, analytics was on the chopping block. This means the analytic divide will widen further in 2021, and the trend will continue for many years to come. Without a doubt, winners and losers in every industry will continue to be defined by those that are leveraging analytics and those that are not. – Alan Jacobson, Chief Data and Analytics Officer at Alteryx

Gone are the days of piecemeal analytics and reporting solutions that fulfill niche business use cases. This is unsustainable. Companies cannot have highly departmentalized analytics implementations that solve only localized problems while the larger business never sees the full benefit. Instead, analytics will be done on all data the company has access to, implemented collaboratively by a variety of interest groups with different skill sets (e.g., data scientists, lines-of-business leaders) and with a full-on focus on operationalizing analytics insights in near real time. In other words, no more piecemeal solutions and no more pure science experimentation. – Sri Raghavan, Director, Data Science and Advanced Analytics Product Marketing at Teradata

Prescriptive analytics will be a key component of digital transformation success: Advanced analytics are becoming mainstream as businesses increasingly collect and analyze data across their organizations, with 35% of U.S. manufacturers deploying advanced analytics in the past three years. For AI to have a significant impact across the value chain, prescriptive analytics will be the catalyst to optimize performance. Prescriptive analytics will become an essential piece for scaling AI within organizations, leveraging product and customer data to advise AI models on how to improve processes, adjust production and increase efficiency. Prescriptive analytics enables constant improvement of an AI model by continuously monitoring and adjusting based on evolving conditions. Prescriptive models can then enable decision automation, where the models take the best course of action based on those prescriptions. Going beyond predictive analytics to prescriptive analytics will ultimately enable digital transformation success for manufacturers in 2021. – George Young, Global Managing Director of Kalypso
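
To make the predictive-versus-prescriptive distinction concrete, here is a minimal, hypothetical sketch (all numbers invented): a predictive model supplies demand forecasts, and a linear program prescribes the production plan that best uses constrained capacity.

```python
# Illustrative sketch only: a prescriptive step layered on a predictive one.
# Assume a model has already forecast demand for two products; linear
# programming then prescribes production quantities.
from scipy.optimize import linprog

predicted_demand = [120, 80]   # hypothetical forecasts from a predictive model
profit = [40, 55]              # profit per unit for products A and B
machine_hours = [2, 3]         # hours consumed per unit
hours_available = 400

# linprog minimizes, so negate profit to maximize it.
result = linprog(
    c=[-p for p in profit],
    A_ub=[machine_hours],
    b_ub=[hours_available],
    bounds=[(0, d) for d in predicted_demand],  # never produce above forecast
    method="highs",
)
plan = result.x  # prescribed units of A and B
print(f"Produce {plan[0]:.0f} units of A and {plan[1]:.0f} units of B")
```

The prediction alone says what demand will be; the prescriptive layer says what to do about it.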

Augmented analytics and self-service will be in wider demand given the distributed workforce and hunger for information. In response, traditional analytics will be increasingly disrupted by AI. The growth of the distributed workforce will create greater demand for augmented analytics, where the individual user is guided through the process of creating queries to get immediate answers to their data questions. We are seeing a convergence of analytics and AI in two areas – at the infrastructure level and at the analyst level. People are beginning to realize that they have different data pipelines providing data for an analytics engine while building a separate stack for ML. Instead of two completely separate stacks, we see these converging into an infrastructure that is easier to maintain while ensuring that the same data supplies both engines. A second convergence will happen around the ‘hunger’ for information and bridging the gap to answer questions using data. Traditional analytics will be increasingly disrupted by AI: platforms (such as Tableau, Power BI, etc.) will start to be displaced by bots and virtual assistants that are conversational in nature. We see this as a push for speed through a pull for self-service. We also anticipate NLP becoming more widely used in 2021. – Scott Schlesinger, Global Data, Analytics & AI Practice Leader at Ness

The lines between IT and other departments will continue to blur, particularly when it comes to data and analytics. Data and analytics have the potential to drive extremely positive and meaningful business outcomes, and when that happens, there is often powerful collaboration across different functional areas, as each one has a level of accountability for the success of the analytics approach. Areas like data governance, data literacy, open data platforms, and the integration and utilization of data in different parts of the enterprise will enable business users to perform tasks traditionally reserved for IT teams, and the data that business units generate will feed into platforms that IT manages. This — coupled with a shortage of data scientists and analytics professionals — also means that data platforms will become more seamless and easy to deploy so that all parts of an organization can leverage them. – Frances Zelazny, CMO of Signals Analytics

In the 2000s, putting Microsoft Office on your resume could make you a good candidate for a job, but a decade later it was a skill that was taken for granted. Nowadays, SQL proficiency can make you stand out, but what will happen in the years ahead?

As data literacy rises, analytics skills will become the norm for all business professionals and start to disappear from candidates’ resumes. Just as you’re unlikely to see ‘Office proficiency’ today, you’re unlikely to see ‘data proficiency’ by the end of the decade. We’ve entered a third wave of analytics, and with it the expectation that business users can interact with data without the help of an expert. Very soon, if you’re unable to marry hard data with business context to define and execute a strategy, you’re going to struggle in the workplace. The ideal candidate for businesses in 2021 and beyond will be a person who can both understand and speak data — because in a few short years, data literacy will be something employers demand and expect. Those who want to get ahead are acquiring these talents now. – ThoughtSpot CEO Sudheesh Nair

As companies shift their data infrastructure to a federated (one engine queries different sources), disaggregated (compute is separate from storage is separate from the data lake) stack, we’ll see traditional data warehousing and tightly coupled database architectures relegated to legacy workloads. But one thing will remain the same when it comes to this shift – SQL will continue to be the lingua franca for analytics. Data analysts, data engineers, data scientists and product managers along with their database admins will use SQL for analytics. – Dave Simmen, Co-founder and Chief Technology Officer (CTO), Ahana

Organizations everywhere are escalating their use of analytics systems but are challenged by the need for event-data platforms that can perform real-time data wrangling. In 2021 organizations will demand intelligent data platforms that can consume static and streaming data from a variety of sources in any format, size or velocity; wrangle the data (enrich and map) on the fly; and deliver the data to systems, devices and applications securely and in real time. – Sean Bowen, CEO of Push Technology

One single SQL query for all data workloads. The way forward is based not only on automation, but also on how quickly and widely you can make your analytics accessible and shareable. Analytics gives you a clear direction of what your next steps should be to keep customers and employees happy, and even save lives. Managing your data is no longer a luxury, but a necessity – and determines how successful you or your company will be. If you can remove complexity or cost of managing data, you’ll be very effective. Ultimately, the winner of the space will take the complexity and cost out of data management, and workloads will be unified so you can write one single SQL query to manage and access all workloads across multiple data residencies. – Raj Verma, CEO of SingleStore

AI and analytics capabilities were provided by different platforms and teams in the past. Over the years, these platforms have been converging: AI teams now focus more on the algorithmic side, while AI and analytics platform teams have merged to provide the software infrastructure for both analytics and AI use cases. – Haoyuan Li, Founder and CEO, Alluxio

As data professionals, we have a responsibility to the broader public. I think that within the next year we will see progress toward a code of ethics within the data analytics space, led by conscious companies who recognize the seriousness of potential abuses. Perhaps the US government will intervene and pass some version of its own GDPR, but I believe that technology companies will lead this charge. What Facebook has done with engagement data is not illegal, but we’ve seen that it can have deleterious effects on child development and on our personal habits. In the coming years, we will look back on the way companies used personal data in the 2010s and cringe in the way we do when we see people smoking on a plane in films from the 1960s. – Jeremy Levy, CEO of Indicative

Emotion is a key factor affecting customer behavior and has a strong influence on brand loyalty. Therefore, it is increasingly useful for companies to find a way to measure emotions of customers during their decision-making processes. Emotional analytics focuses on studying and recognizing the full gamut of human emotions that includes mood, attitude and personality. It employs predictive models and AI/ML to analyze human movements, word choices, voice tones, and facial expressions. Emotional analytics can help companies build a more holistic customer profile, understand how to influence emotions and develop customized product and services tailored to individuals. Sentiment analysis about products and services, across geographies, social networks, and review web sites enables companies to better understand and improve their customer satisfaction level. Using emotional analytics, companies can better understand how their marketing and services influence emotion in order to provide more positively engaging customer experiences. – Paul Moxon, SVP, Data Architecture at Denodo
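
As a small, concrete example of one ingredient of emotional analytics, the sketch below runs text sentiment analysis with the Hugging Face transformers pipeline and its default model; the reviews are invented examples, not data from any vendor mentioned here.

```python
# Minimal sketch of text-level sentiment analysis, one ingredient of the
# broader "emotional analytics" described above.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # uses the pipeline's default model
reviews = [
    "The checkout flow was effortless and delivery beat the estimate.",
    "Support kept me on hold for an hour and never solved my issue.",
]
for review, result in zip(reviews, classifier(reviews)):
    print(f"{result['label']:>8} ({result['score']:.2f}): {review}")
```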

Getting product analytics right is hard. Every interaction results in mounds of data, and digging through that to find that ‘needle in the haystack’ insight requires a lot of effort, discipline, and time to make it work. These barriers to entry mean data analysis is often limited to companies who have the resources, bandwidth, and the knowledge to do it right. But it’s also a discipline that’s growing in importance — even before the pandemic, consumer interactions with brands were generally happening on digital platforms, and now they are there almost exclusively. There is a wealth of information out there that can explain the ROI of each interaction, and without a doubt, some of it is potentially game-changing. But, frankly, we’re human, and if we have to work hard to get value out of something, we’re going to be less likely to do it consistently. That’s why in 2021, analytics will move from being a reactive game — gathering data that analysts then have to sift through to find those insights — to a proactive one, connecting teams directly to those “a-ha!” moments that inspire immediate and informed action. – Matin Movassate, CEO and Founder at Heap

The further development of predictive analytics will shape the future for companies that adopt VSM. Over recent years, value stream management (VSM) platforms have improved the way organizations develop software, but what will really move to the forefront in 2021 is VSM predictive analytics shaping organizations’ knowledge and foresight of what their customers need. Visibility into the software delivery process will enhance the ability to make informed decisions based on that insight and become a differentiator for companies that rely on software. As we go forward, companies will have to embrace VSM platforms if they want to become a software player. But it will be the improved visibility and utilization of predictive analytics that VSM provides that will enable companies to understand what technology and products matter most to their customers so they can move in that direction. The importance of visibility also points to how vital gathering data is. While many companies talk about visibility, they don’t discuss what it takes from a data perspective. Collecting data requires a common data model across the value stream. If you want visibility, the ability to fix things fast, and measurement of the value you’re delivering, it’s always about proving you know how to do it and convincing the powers that be to invest in that vision. VSM platforms will provide clear advantages for those who choose to use them through the power of data-driven decisions. – Bob Davis, CMO, Plutora

The Next Evolution of Analytics Brings a Federated, Disaggregated Stack – A federated, disaggregated stack that addresses the new realities of data is displacing the traditional data warehouse with its tightly coupled database. The next evolution of analytics foresees that a single database can no longer be the solution to support a wide range of analytics as data will be stored in both data lakes and a range of other databases. SQL analytics will be needed for querying both the data lake and other databases. We’ll see this new disaggregated stack become the dominant standard for analytics with SQL-based technologies like the Presto SQL query engine at the core, surrounded by notebooks like Jupyter and Zeppelin and BI tools like Tableau, PowerBI, and Looker. – Dave Simmen, Co-founder and Chief Technology Officer, Ahana
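
As a hypothetical illustration of the stack described here, the sketch below issues one federated Presto query that joins a data-lake table with an operational database. Catalog, schema, table, and connection details are all invented; the presto-python-client package supplies the DB-API interface used.

```python
# Hypothetical sketch of the federated stack described above: one Presto SQL
# query joins a data-lake table (hive catalog) with an operational database
# (mysql catalog). All names and connection details are invented.
import prestodb

conn = prestodb.dbapi.connect(
    host="presto.example.com", port=8080, user="analyst",
    catalog="hive", schema="default",
)
cur = conn.cursor()
cur.execute("""
    SELECT c.region, SUM(p.amount) AS revenue
    FROM hive.sales.purchases AS p      -- table in the data lake
    JOIN mysql.crm.customers AS c       -- table in an operational database
      ON p.customer_id = c.id
    GROUP BY c.region
    ORDER BY revenue DESC
""")
for region, revenue in cur.fetchall():
    print(region, revenue)
```

The point of the disaggregated design is visible in the FROM clause: the query engine federates across catalogs, so neither table had to be copied into a warehouse first.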

More verticalization of data analytics and data platform technologies: The early 2000s saw a paradigm shift away from traditional relational databases and data warehousing toward building distributed data platforms. Companies like Cloudera led the trend, paving the way for other next-generation data platforms to follow. By the mid-2010s, companies introduced point solutions designed to solve specific pain points that emerged in these new environments (like Databricks for analytics and Snowflake for cloud consolidation). In the next year and throughout the next decade, we’ll see these and many other best-of-breed solutions hit their stride, offering much faster time to value through fully integrated and much more end user-friendly solutions.  This will enable enterprises to dramatically accelerate the building of new data platforms and derive more value from their data faster. – Nong Li, Okera co-founder and CTO

Artificial Intelligence

As businesses look toward goals to reopen and recoup sufficient revenue streams, they’ll need to leverage smart technologies to gather key insights in real-time that allow them to do so. Adopting artificial intelligence (AI) technologies can help guide companies to understand if their strategies to keep customers and employees safe are working, while continuing to foster growth. As companies recognize the unique abilities of AI to help ease corporate policy management and compliance, ensure safety and evolve customer experience, we’ll see boosted rates of AI adoption across industries. – Hillary Ashton, EVP and Chief Product Officer at Teradata

In 2021 we will see AI, machine learning and IoT define and shape our lives and behaviors, a phenomenon that will continue for many years to come. These advancements impact how we work, how we buy, how we spend, how we do every little thing in our lives. But I think the real star that companies will turn to will be the enabling technologies such as cloud and edge computing, which will continue to dominate due to their ability to process and manage all the necessary data that fuels AI, ML, and IoT, as well as enabling technologies like iPaaS, APIM and RPA. These technologies will continue to lead the digital transformation charge for businesses as they move from manual or paper-driven business to digital businesses that can finally tap the power of AI and IoT. – Manoj Choudhary, CTO at Jitterbit

Artificial Intelligence becomes less artificial in 2021: Even with a vaccine for COVID-19 on the horizon, how people work and interact has fundamentally shifted. In the new year, remote work will continue, social distancing requirements will remain, and supply chains will continue to face disruption. This new way of life demands a new way for companies to continue operations effectively across the value chain – from the product to the plant to the end user. The use of artificial intelligence (AI) will be the standard for addressing these challenges. However, without considering how humans will interact with and leverage these new autonomous systems, AI will fail. In 2021, enterprises will take a human-centered approach to AI initiatives, understanding user needs and values, then adapting AI designs and models accordingly, which will, in turn, improve adoption. Enterprises must put the same focus on people and culture as on the technology itself for AI to be successful. Organizational change management (OCM) teams will be critical for driving digital transformation and AI forward by bringing people along for the change journey and setting the organization up for measurable results. Proper change management is the most important – yet overlooked – aspect of any digital transformation initiative. – George Young, Global Managing Director at Kalypso

In 2021, enterprises will move away from quick wins by relying on AI systems, to focus on lasting and meaningful business value. This change will drive deeper data literacy initiatives across organizations. It will require people to learn new skills and behave in new ways. – Sundeep Reddy Mallu, Head of Analytics at Gramener 

Most consumers will continue to be skeptical of AI. With several big consumer brands in the hot seat over questionable AI ethics, most people still don’t trust AI. For many, it’s because they don’t understand it or don’t even realize they’re using it daily. Consumers are getting so many AI-powered services for free — Facebook, Google, TikTok, etc. — that they don’t understand what they’re personally giving up in return — namely their personal data. As long as the general public remains naïve, they won’t be able to anticipate the dangers AI can introduce or how to protect themselves — unless the market better educates customers or implements regulations to protect them. Despite this, there’s some evidence that we’re turning the corner on AI’s trustworthiness. Eighty-one percent of business leader respondents to Pega’s upcoming survey said they’re optimistic that AI bias will be sufficiently mitigated in five years. Businesses had better hope this turns out to be true – because as more of the public wakes up to how AI impacts their lives, and in some cases plays favorites, they will continue to ask harder questions that further erode trust in AI, forcing businesses to answer to them. – Vince Jeffs, Senior Director – Product Strategy, Marketing AI and Decisioning, Pega

AI powered digital workers will help businesses stay strategic in the long-term. Few disagree with the notion that AI and automation are essential to companies’ survival going forward. However, research has indicated that most companies have not fully realized the benefit of their AI and automation investments. By linking powerful AI capabilities to business processes through the digital workforce, we’ll increasingly see organizations implement AI driven automation at scale. AI infused automation will increasingly be linked to core strategic initiatives such as improved customer focus, revenue growth, capital allocation, supply chain management, risk management, cost and operational efficiency and more. AI powered digital workers will be leveraged as primary tools for executing on corporate strategy and managing enterprise scale risks. Rapid and effective adoption of automation will increasingly be seen as an essential component to remaining competitive in markets. – Eric Tyree, Head of AI and Research at Blue Prism

AI experimentation will become more strategic. Experimentation takes place throughout the entire model development process – usually every important decision or assumption comes with at least some experiment or previous research to justify it. Experimentation can take many shapes, from building full-fledged predictive ML models to doing statistical tests or charting data. Trying all combinations of every possible hyperparameter, feature handling, etc., quickly becomes intractable. Therefore, we’ll begin to see organizations define a time and/or computation budget for experiments as well as an acceptability threshold for the usefulness of the model. – Florian Douetteau, CEO and co-founder of Dataiku
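
A minimal sketch of what such an experiment budget might look like in practice appears below; the thresholds, budget, and search space are all invented, and scikit-learn is used only for brevity.

```python
# Sketch of a time-budgeted random search: stop when either the time budget is
# spent or a model clears an acceptability threshold, per the idea above.
import random
import time

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

TIME_BUDGET_S = 60          # computation budget for the whole experiment
ACCEPTABLE_SCORE = 0.90     # usefulness threshold: stop early once reached

best_score, best_params = 0.0, None
deadline = time.monotonic() + TIME_BUDGET_S
while time.monotonic() < deadline:
    params = {
        "n_estimators": random.choice([50, 100, 200]),
        "max_depth": random.choice([4, 8, 16, None]),
    }
    score = cross_val_score(RandomForestClassifier(**params), X, y, cv=3).mean()
    if score > best_score:
        best_score, best_params = score, params
    if best_score >= ACCEPTABLE_SCORE:
        break  # good enough: stop spending the budget

print(f"best CV accuracy {best_score:.3f} with {best_params}")
```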

In 2021, we will finally see AI go mainstream. As a result of COVID-19, businesses were forced to digitally transform in order to survive in the new normal. According to our research, digital acceleration shows no sign of stopping in the new year, with 86% of companies currently reaping the benefits of better customer experience through AI, a trend likely to continue. The pandemic has also changed business priorities for AI investment. For instance, we’ve seen companies shift from simpler tasks like automation to focus on workforce planning and simulation modeling. As organizations continue to see benefits from their digital investments in complex processes, AI will only become more widespread over the next year. – Anand Rao, Global Artificial Intelligence Lead at PwC

Convergence of AI & BI will boost data insights. AI has been part of every corporate discussion over the past 5 years. And yet, challenges persist in democratizing advanced AI insights across large sections of employees. As new AI-powered BI products emerge, silos will be broken and every user will be able to leverage data analytics and find insights easily. Simple interfaces, personalized insights, and engaging data experiences will become the hallmarks of data analytics in 2021 and beyond. – Dhiren Patel, MachEye’s Chief Product Officer & Head of Customer Success

Racial bias in many AI-driven facial recognition algorithms has been a big topic of conversation over the past year and came to a head amid the social unrest of 2020. Research has found widespread evidence that racial minorities were far more likely than whites to be misidentified. In 2021, we will see the correction of AI bias become a major topic for any company that leverages AI or facial recognition technology. By using government-issued documents, you can quickly and easily prove ID ownership by analyzing the face on the document and comparing it to the face trying to access your system. 2021 will be the year that AI bias comes to light, and companies will begin implementing radical change to eliminate racial bias in their software — some of which can be done by putting a deliberate focus on fairness and on training the company’s ML systems to reduce racial facial recognition errors. – Mohan Mahadevan, VP of Research, Onfido

2021 will be the year that teams go from casually dating AI to being in a committed relationship. AI isn’t just for R&D projects anymore. It’s time to commit to adopting these solutions instead of just flirting with them. We have to automate now. – David Karandish, Founder and CEO of Capacity

With the confluence of computational power, internet-scale data and modern machine learning algorithms, we have broken remarkable new ground with AI over the past few years. In the coming years, we will enter an expansionary era, where a long tail of commercial use-cases will be prototyped, packaged and productionized – either to enhance existing products and services or to create entirely new ones. – Dave Costenaro, Chief Data Officer at Capacity 

AI Success Moves From General Purpose to Niche Focuses. While AI investment continues to grow in the enterprise, businesses are reevaluating their tech stacks to accommodate niche AI, rather than “general purpose” black boxes that claim to do everything. Niche, perfected use-cases that solve specific problems are going to take budget priority, rather than automation which promises to do everything. – Viral Bajaria, CTO at 6sense

Rise of Artificial Narrow Intelligence: Not long ago, AI meant what we now know as artificial general intelligence, like self-driving cars or image recognition. Today, however, there is a new category of artificial narrow intelligence that tries to replicate a human decision-making process. From a supply chain perspective, this new AI can help inform better decision making around every aspect of a supply chain, from “How do I fill a truck?” to “How do I get products delivered on time?” In 2021, I envision an increase in these narrow solutions taking over tactical and smaller-scale decisions. – Andy Fox, Director of Global Impact with LLamasoft

At the fringes, we will begin to see “Counter-AI” start to materialize. As governments try to track people, and businesses try to manipulate them or gain deep insights into their behavior, I predict a backlash of methods to foil tracking and customer 360s. Not unlike the work various groups have done on anti-facial-recognition tools, we will begin to see high- and low-tech methods for boggling the AIs used to monitor and understand us. – Jonas Bull, Head of Architecture for Atos North America’s AI Lab, in partnership with Google Cloud

As more law enforcement agencies begin to adopt AI- and ML-based solutions, there’s an onus on them to abide by ethical policies and to remove bias from such tools. Departments will begin to establish their own policies and work with governing bodies on responsible and ethical AI usage, including proper training for the relevant teams and business functions, as well as creating an environment with an ethos of data-driven and responsible decision-making. Going a step further, law enforcement organizations will continue to ensure AI systems are vetted to be bias-free and corrected as needed. And they will open a line of communication with the public to promote transparency regarding the use of these tools. – Heather Mahalik, Senior Director of Digital Intelligence, Cellebrite

We’ll see more data-driven companies leverage open source for analytics and AI in 2021. Open source analytics technologies like Presto and Apache Spark power AI platforms and are much more flexible and cost effective than their traditional enterprise data warehouse counterparts that rely on consolidating data in one place–a time-consuming and costly endeavor that usually requires vendor lock-in. Next year will see a rise in usage of analytic engines like Presto for AI applications because of its open nature – open source license, open format, open interfaces, and open cloud. – Dipti Borkar, Co-founder and Chief Product Officer (CPO), Ahana

The industry will shift away from generic horizontal AI platforms, such as IBM Watson and Amazon Lex, toward domain-specific AI-powered products and managed service models. Generic platforms are not solutions. They start cold, without any training data or data model structure — building this, then optimizing it in production, is an expert and resource-intensive task beyond most companies’ capability. The move from the early innovator market into mass-market adoption will be driven in 2021 by the adoption of domain-specific AI-powered products that are pre-trained for a specific industry and proven to work. – Jake Tyler, co-founder & CEO, Finn AI

In 2021, AI won’t be mapped on the human spectrum of competence. We can have algorithms that crush any human at chess but are unable to make a cup of tea and computer programs that can perform mathematics millions of times faster than humans but, if asked who might win the next World Cup, they wouldn’t even understand the question. Their capabilities are not universal. We’ve reached a point with AI where we simultaneously overestimate and underestimate the power of algorithms. When we overestimate them, we see human judgment relegated to an afterthought – a dangerous place to be. The use of a “mutant algorithm” in grading A-level results is the scandal du jour in the UK, despite the algorithm producing many results that simply violate common sense. When we underestimate algorithms, we see entire industries crumble because they didn’t see change on the horizon. How can the traditional taxi business compete when Uber’s algorithm can get you a ride in less than 3 minutes? In 2021, expect engineers to avoid AI and algorithmic blunders by not trying to map algorithms onto the human spectrum of competence. Using AI technologies – such as any-context speech recognition – to enhance what humans can do and finding the right balance between AI automation and human knowledge for real world use cases – such as customer experience and web conferencing – will begin to shape the effective use of AI for the future. – Ian Firth, VP at Speechmatics

Responsible AI / ML will become the hottest topic in the cloud ML industry. Given society’s increased emphasis on combatting unfairness and bias and the overall interest in better interpretability and explainability of machine learning models, cloud providers will invest and enhance their ML offerings to offer a full suite of responsible ML / AI capabilities that will aim to satisfy and reassure regulators, modelers, management and the market on the fair use of ML. Meanwhile, AI / ML will continue to see explosive growth and usage across the whole industry, with significant enhancements in ease-of-use and UX combining within a responsible AI / ML framework to drive the next growth spurt of this sector. – Yiannis Antoniou, analyst, Gigaom

AIOps for networking will become mainstream: Next year, AIOps will go from theory to practice for many organizations. With the increase in remote workers and the home becoming the new micro-branch, AI will become table stakes for delivering a great client-to-cloud user experience while controlling IT support costs for remote employees. IT teams will need to embrace AIOps to scale and automate their operations. AIOps cloud SaaS will turn the customer support paradigm upside down: instead of users submitting tickets to IT, AI will proactively identify users with connectivity or experience issues and will either resolve them (the self-driving network) or open a ticket with suggested remediation actions for IT. – Bob Friday, CTO of Mist Systems, a Juniper Networks company

Artificial intelligence and machine learning will play a much more integral role in supply chain strategy than in previous years. The need for more real-time insights throughout the supply chain will continue to grow in 2021, especially as supply chain organizations re-evaluate their operations as a result of sudden changes in buying behaviors during the COVID-19 pandemic. To address this need, supply chain organizations will need to look to artificial intelligence (AI) and machine learning (ML) enabled technology to upgrade from current descriptive analytics and leverage predictive and prescriptive analytics — which provide recommended actions before an incident occurs based on previous actions. Oftentimes, companies experience a mess of silos and fragmentation due to being acquired by large companies that have different systems. In 2021, supply chain stakeholders will look to deploy digital twins across all modules as an extra layer of visibility and to ensure synchronization between a company’s existing systems and new technology, such as sensors and nano sensors, which are coming to market in increasingly larger volumes. – Mahesh Veerina, CEO of Cloudleaf

Bias in AI causes harm at great scale – from impacting the recruitment process by reinforcing gender stereotypes to racial discrimination in credit scoring and lending. Organizations know that hiring a diverse workforce can provide a level of truth for AI models, and they know that training data needs to be constantly monitored for bias, as it impacts the quality and accuracy of algorithms. They also know that there is no current benchmark for ethics-based measurements to truly mitigate bias in AI, and that there needs to be. In 2021, we’ll see organizations moving past just acknowledging and “worrying” about bias in AI and start to make more significant moves to solve for it – because it will be required. Specific teams and/or initiatives will be formed to combat all the concerns that fall under the umbrella of responsible AI, including everything from inherent bias in data to treating data trainers fairly. Establishing responsible AI initiatives will not only become a board-level mandate for some, but the partners and customers of companies leading AI efforts will demand it. – Appen CTO Wilson Pang   

AIOps Will Heat Up to Enhance the Customer Experience and Deliver on Application Assurance and Optimization. With a year of unpredictability behind us, enterprises will have to expect the unexpected when it comes to making technology stacks infallible and proactive. We’ll see demand for AIOps continue to grow, as it can address and anticipate these unexpected scenarios using AI, ML, and predictive analytics. The increasing complexity of digital enterprise applications spanning hybrid on-premise and cloud infrastructures coupled with the adoption of modern application architectures such as containerization will result in an unprecedented increase in both the volume and complexity of data. While data overload from modern digital environments can delay repair and overwhelm IT Ops teams, noisy datasets will be a barrier of the past as smarter strategies and centralized AIOps systems help organizations improve the customer experience, deliver on modern application assurance and optimization, tie it to intelligent automation, and thrive as autonomous digital enterprises. In fact, conventional IT Operations approaches may no longer be feasible – making the adoption of AIOps inevitable to be able to scale resources and effectively manage modern environments. – Ali Siddiqui, Chief Product Officer, BMC Software

The stark reality is that 2021 will be the year when those actually doing AI will start achieving value at scale, while those spending months training brittle models and failing to catch up will be at an increasing, exponential, disadvantage. Last mile challenges won’t get any easier – but a fundamental shift in thinking and approach will be critical to overcoming complexity obstacles. – Dr. Josh Sullivan, Head of Modzy 

Elegant risk assessment: As the AIOps space continues to mature, we see an opportunity for vendors to refine their risk assessment capabilities to enable customers to fix issues with near-certainty, without breaking anything else in the system. In 2021, one area where we will see increased focus from vendors and more adoption among users will be enabling more elegant dependency mapping, so engineers can accurately assess risk as part of the remediation process or build-deploy cycle for software changes, to ensure that a change in one part of an environment won’t break the system elsewhere. – Michael Olson, Director, Product Marketing at New Relic

ML on the edge will be one of the major focuses of the AI/ML industry in 2021. Demand for intelligent edge applications is rising rapidly in the automotive, smart factory, and smart home industries. With efficient edge ML development tools widely available and semiconductor companies launching new MCUs with ML features, adoption of edge ML applications will become a major trend. – Sang Won Lee, CEO of Qeexo

The clinical community will increase their use of federated learning approaches to build robust AI models across various institutions, geographies, patient demographics and medical scanners. The sensitivity and selectivity of these models are outperforming AI models built at a single institution, even when there is copious data to train with. As an added bonus, researchers can collaborate on AI model creation without sharing confidential patient information. Federated learning is also beneficial for building AI models for areas where data is scarce, such as for pediatrics and rare diseases. – Kimberly Powell, Vice President & General Manager, NVIDIA Healthcare
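
For readers unfamiliar with the mechanics, below is a minimal numpy sketch of federated averaging (FedAvg), one common federated learning algorithm: each institution trains locally and shares only model weights, never patient data. The local training step is a stand-in, and the institutions and data are invented.

```python
# Minimal sketch of federated averaging (FedAvg). Only weights leave each
# institution; the weighted average becomes the new global model.
import numpy as np

def local_update(global_weights, local_data):
    """Stand-in for a real local training loop at one institution."""
    noise = np.random.normal(scale=0.01, size=global_weights.shape)
    return global_weights + noise  # pretend we took a few gradient steps

def fedavg_round(global_weights, institutions):
    updates, sizes = [], []
    for data in institutions:
        updates.append(local_update(global_weights, data))
        sizes.append(len(data))
    weights = np.array(sizes, dtype=float) / sum(sizes)
    # Average locally trained models, weighted by local dataset size.
    return np.average(np.stack(updates), axis=0, weights=weights)

global_w = np.zeros(10)
hospitals = [np.ones((500, 10)), np.ones((1200, 10)), np.ones((300, 10))]
for _ in range(5):  # five communication rounds
    global_w = fedavg_round(global_w, hospitals)
```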

AI Center of Excellence: Companies have scrambled over the past 10 years to snap up highly paid data scientists, yet their productivity has been lower than expected because of a lack of supportive infrastructure. More organizations will speed the investment return on AI by building centralized, shared infrastructure at supercomputing scale. This will facilitate the grooming and scaling of data science talent, the sharing of best practices and accelerate the solving of complex AI problems. – Charlie Boyle, Vice President & General Manager, NVIDIA DGX Systems

AI Expression Will Narrow in on Seamless User Experiences: As we look at the history of AI, algorithms were king and user experience came second. But as we head into 2021, AI-enabled applications will be increasingly focused on usability as a priority. The best expressions of AI are seamless for the user and work unobtrusively in the background. Platforms supported by AI/ML will find new ways to lead users to better conclusions and solutions. This happens by interrogating huge volumes of data, looking for anomalies, insights and trends, then presenting results in the appropriate business context. Truly frictionless AI/ML should be the end-goal for all business platforms. I hope to see more sophisticated applications of AI that will identify what each user is trying to accomplish and automatically surface insights that can be leveraged for quick action. This ease-of-use will be incredibly valuable for the broad base of users, both technical and non-technical. – Sanjay Vyas, CTO of Planful

Ethical AI will take a key role in product development in 2021, but it is a difficult problem to solve: Ethical AI is becoming an important issue and a difficult dilemma. Companies are using data and AI to create solutions, but they may be bypassing human rights in terms of discrimination, surveillance, transparency, privacy, security, freedom of expression, the right to work, and access to public services. To avoid mounting reputational, regulatory and legal risks, ethical AI is imperative and will eventually give rise to AI policy. AI policy will ensure a high standard of transparency and protective measures for people. In the data sphere, CEOs and CTOs will need to find ways to eliminate bias in algorithms through careful analysis, vetting and programming. – Krishna Tammana, CTO of Talend

Next year, we will see companies focus on, adopt and develop AI solutions that actually deliver ROI as opposed to gimmicks or building technology for technology’s sake. Organizations will be focused on demonstrable progress and measurable outcomes and will therefore invest in solutions that solve specific problems. The companies that have a deep understanding of the complexities and challenges their customers are looking to solve and are willing to invest their R&D dollars in the solutions will find success. – Joe Petro, CTO at Nuance Communications, Inc.

The AI skills gap will persist, and organizations will think of new ways to adapt. It’s been difficult for organizations to hire the talent needed to deploy AI and reap all the benefits, with half of industry insiders reporting this challenge. What’s more, many organizations have accelerated digital transformation initiatives by a matter of months or years – but there is a discrepancy in available talent and training opportunities to support these initiatives. Due to increased demand, we predict that companies will offer more upskilling initiatives and incentives for employees to learn new skills, as well as work to build data and AI literacy across all levels of the organization. The pandemic has presented an opportunity for organizations to prioritize these actions and help employees develop new skills in their rapid transition to remote work. Looking ahead, 2021 will be about education – both operating in a new normal and catching up to the expedited digital initiatives. – Traci Gusher, Principal, Data & Analytics, KPMG

Addressing bias in AI algorithms will be a top priority, with guidelines rolled out for machine learning support of ethnicity in facial recognition. Enterprises are becoming increasingly concerned about demographic bias in AI algorithms (race, age, gender) and its effect on their brand and potential to raise legal issues. Evaluating how vendors address demographic bias will become a top priority when selecting identity proofing solutions in 2021. According to Gartner, more than 95% of RFPs for document-centric identity proofing (comparing a government-issued ID to a selfie) will contain clear requirements regarding minimizing demographic bias by 2022, an increase from fewer than 15% today. Vendors will increasingly need clear answers for organizations that want to know how their AI “black box” was built, where the data originated and how representative the training data is of the broader population being served. As organizations continue to adopt biometric-based facial recognition technology for identity verification, the industry must address the inherent bias in these systems. The topic of AI, data and ethnicity is not new, but it must come to a head in 2021. According to researchers at MIT who analyzed imagery datasets used to develop facial recognition technologies, 77% of images were male and 83% were white, pointing to one of the main reasons why systematic bias exists in facial recognition technology. In 2021, guidelines will be introduced to offset this systematic bias. Until that happens, organizations using facial recognition technology should ask their technology providers how their algorithms are trained and ensure that their vendor is not training algorithms on purchased data sets. – Robert Prigge, CEO of Jumio

We will see a gradual and constant introduction of AI into more areas of our lives. After 2020 put a big stress test on many machine learning models, with COVID-19 producing data significantly different from the data the models were trained on, I would be surprised to see a big bang of revolutionary AI next year. In 2021, we will see a more gradual and constant introduction of AI into more areas of our work and private lives, where tangible value can be demonstrated without overstraining user acceptance of AI. – Tobias Komischke Ph.D., UX Fellow, Infragistics and Adjunct Professor, Rutgers University

The Rise of Explainable AI/Machine Learning: Expect developers and business users to demand more insight and reasoning into AI and machine learning algorithms and how they are applied. Wide-scale adoption of these solutions will occur only after we build trust in the underlying technology, which can happen only if the drivers of a given prediction are explained to the end user. For example, in the context of machine learning in recruiting, why a given candidate is recommended for a particular role is important both to allow the hiring manager to make an informed decision and to expose the risk of unintentional (or malicious) bias in hiring practices. Because most AI/ML models are somewhat of a black box, users and developers don’t have visibility into why the models make the decisions they do. More insight into this process will ensure people understand the factors on which the model is basing its decisions, and it will also help developers prevent an adversary from exploiting it. – Jim Stratton, CTO, Workday
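
As an illustration of the per-prediction insight described here, the following is a minimal, hypothetical sketch using the open-source SHAP library on an invented tree-ensemble model; it is not any vendor's implementation, just one common way to surface the drivers of a single prediction.

```python
# Sketch of per-prediction explanation with SHAP, one common way to open the
# "black box": each value is a feature's push toward or away from the
# positive class for this one prediction. Model and data are invented.
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=500, n_features=6, random_state=0)
model = GradientBoostingClassifier().fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:1])  # contributions for one prediction

for i, contribution in enumerate(shap_values[0]):
    print(f"feature_{i}: {contribution:+.3f}")
```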

At present, more than 80% of CEOs across industries believe that AI needs to be explainable for it to be trusted. On the other hand, only about 35-40% of enterprises have a detailed understanding of how and why their systems produce the outputs they do. This is especially true for TM&E, which gathers information on hundreds of millions of customers worldwide. Moreover, as the volume of data increases exponentially, regulatory oversight will become even more critical. Businesses will need to assess the risks and challenges for every data science project, including its social impact, privacy, bias, and transparency. In a significant step in this direction, a recent paper by Begley et al., 2020, introduces a meta-algorithm for fairness-imposing perturbation on an unfair model. It provides a new perspective on model fairness, flexibility, and stability, without compromising the effects of training-time interventions. Development of such ethical AI solutions and toolkits, in sync with guidelines published by international bodies like IEEE SA, OECD and the EU Commission, will be key to maintaining the trust of both businesses and their consumers. – Chandramauli Chaudhuri, Principal Data Scientist Telecom, Media and Entertainment, Fractal Analytics

AI will be integrated throughout every step of an organization’s day-to-day operations: In 2021, we’ll finally see artificial intelligence (AI) become embedded in all aspects of how organizations run — and it will become the way businesses create a competitive advantage, deliver new products and services, transform the back office and improve the customer experience. This will include using AI to help mitigate risk and optimize costs, such as predicting issues in supply chains and suggesting alternative suppliers. Security technologies will see an increased use of AI, which will also be used more often to prevent activity and attacks from threat actors, including combatting ransomware or the exfiltration of sensitive data. AI facial recognition technologies in CCTV systems coupled with keycard systems, proximity devices and building diagrams will be used to quickly identify intruders in a building. – Laserfiche CIO Thomas Phelps 

The pandemic accelerated the digital transformation of healthcare at an unprecedented rate. Now, what used to be considered “nice-to-have” technology has become a vital lifeline for providing remote patient care. While telehealth and telemedicine will continue to dominate in 2021, we need improved artificial intelligence (AI) and analytics, as well as practical interoperability, to unlock its full potential. For example, AI with deep learning could accurately analyze and detect potential problems, even from images patients send from their mobile phones during telehealth sessions. And when that process is enriched with accurate and complete data that can be shared among providers, patients, payers, and pharmacies, AI can reach its full power to provide faster care and reduce costs in a virtual setting. – Kunal Agarwal, Senior Vice President, Product Innovations and Interoperability, DrFirst

AI is ubiquitous — Entering 2022, AI will finally have entered the mainstream. AI will be the unifying force behind all key technology advances serving employees and customers: from workflow, to process automation, guided selling, customer communication, and customer data platforms. AI is what will power customer engagement and employee engagement. – Michael Maoz, SVP, Innovation Strategy at Salesforce

In order for businesses of all sizes to fully capitalize on AI and data analytics, 2021 is poised to be the year of ‘small data.’ Small data will allow businesses to become more agile, taking segments of data and gleaning instant insights for increased productivity. Accessing more specific datasets with powerful workstations allows data & business analysts to deliver the real-time insights needed to make informed business decisions. To achieve this, companies will need to invest in smarter solutions to gain an agile and predictive view of their world. Small data will enable a whole new breed of AI creating a ‘connected and open enterprise’ experience for collaboration. – Mike Leach, Solution Portfolio Lead for Client AI, Professional VR and Remote/Rack Workstation Solutions at Lenovo

It is possible that in 2021 we will see more breakthroughs with impact on the same order of magnitude as GPT-3 or DeepMind’s AlphaFold – giant leaps of historical importance that shape industries in a profound way. An increasing number of organisations will continue adopting AI, and we will see a transition from pilot studies to real organisational changes that integrate AI into their processes. AI integration does not happen overnight, and more work is needed at the organisational level to achieve ‘organisational learning’ on top of ‘machine learning’. In healthcare, we will see more IPOs of companies that integrate AI into their drug discovery processes. – Lomax Ward, Partner, Luminous Ventures

AI and ML will go beyond predictions: While predictions are one of the most valuable outcomes, AI and ML must produce actionable insights beyond predictions that businesses can consume. AutoML 2.0 automates hypothesis generation (a.k.a. feature engineering) and explores thousands or even millions of hypothesis patterns that were never possible with the traditional manual process. AutoML 2.0 platforms that provide automated discovery and engineering of data “features” will be used to deliver more clarity, transparency, and insight as businesses realize that data features are not just suited to predictive analytics, but can also provide invaluable insights into past trends, events, and information. This adds value by allowing businesses to discover the “unknown unknowns”: trends and data patterns that are important, but that no one had suspected would be true. – Ryohei Fujimaki Ph.D., Founder & CEO of dotData
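
To make the hypothesis-generation idea concrete, here is a toy sketch (not dotData's product, with invented column names and data) that enumerates candidate aggregation features from a transactions table and ranks them against a target, a vastly scaled-down version of the search such platforms automate.

```python
# Toy sketch of automated hypothesis generation: enumerate candidate
# aggregation features and rank them by correlation with the target.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
tx = pd.DataFrame({
    "customer_id": rng.integers(0, 100, 5000),
    "amount": rng.exponential(50, 5000),
})
target = pd.Series(rng.integers(0, 2, 100), name="churned")  # per customer

candidates = {}
for col in ["amount"]:
    for agg in ["sum", "mean", "max", "count"]:  # the hypothesis space
        candidates[f"{col}_{agg}"] = tx.groupby("customer_id")[col].agg(agg)

features = pd.DataFrame(candidates)
scores = features.corrwith(target).abs().sort_values(ascending=False)
print(scores)  # the highest-ranked hypotheses become engineered features
```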

AI will become mainstream in 2021, but a “one size fits all” approach won’t work, especially for AI-powered contract management: Now that AI is more widely used and enterprises in every industry are discovering new ways to harness its capabilities, we finally see AI technology becoming mainstream in 2021. This means even organizations that are considered “late adopters” will begin to automate various tasks using AI. However, when IT teams consider deploying an AI solution, it’s important to remember that the AI/ML model must be adaptable to the specific needs of each organization. In the case of contract lifecycle management (CLM), the digital contracting system must be agile enough to scale with the organization and its changing needs over the next 10+ years. If the CLM isn’t flexible and tailored to the organization’s specific challenges at that time, the company will be forced to roll out a completely new system every couple of years, which is extremely costly and time-consuming. AI can be extremely effective if implemented properly, but like other powerful and sophisticated tools, it must be done right. – Colin Earl, CTO of Agiloft  

AI and ML finally grow up: Companies have been talking about utilizing AI and machine learning for nearly a decade, but the progress that’s been made in those areas is close to zero because of the poor state of the data. With the investment in data integrity, the use and applications of AI and ML will accelerate in 2021 and beyond. – Amy O’Connor, Chief Data and Information Officer at Precisely

Big Data

In 2021, open and free data collection will fuel future innovations. A recent survey from Frost & Sullivan found that 54% of IT decision-makers expressed a need for large-scale data collection to keep pace with their businesses’ growth and online competition. However, for businesses to utilize online data effectively, it first needs to be accessible – not blocked. Today, businesses often prohibit public data collection attempts despite collecting such data themselves. This situation is caused by two major factors: the continuous need to block malicious or fraudulent online activity as part of security precautions, and the notion that this public data contributes to a company’s competitive edge. I believe that during 2021 and onwards companies will realize that public data collection is part of general and necessary ongoing business conduct. They will also realize that data isn’t everything when it comes to a business’s competitive edge. Areas such as inventory, prices, product quality and service quality play a big role as well. Once that realization settles in, blocking data will serve only to protect against abusive online activities. To secure ethical data collection, I hope we all promote an open exchange of information in central data hubs. Sites will continue to block abusers; this will not change. However, they may permit ethical data collectors. Ultimately, the future of online data collection is up to those who control it. At the rapid rate that data is being produced, future data collection efforts will need to evolve and grow. Companies will need automated data collection to keep up with their competitors and to gather data at a faster rate. After all, the speed at which companies can collect fresh data will determine their relevancy and success. – Ron Kol, CTO at Luminati Networks

Data will become truly operational on an enterprise scale: The amount of data businesses have is growing exponentially – there are more sources, types and amounts than ever before, plus increasing amounts of data are being delivered in near-real time. But to truly understand, access and take action on data, enterprises will need to change how they consume it — starting by cutting out the middleman. By finding ways to automate the data cataloguing and profiling processes, employees – including those with less of a technical background – will be able to get the data they need to effectively and efficiently make good business decisions. – Eric Raab, SVP, Engineering and Product, Information Builders

It’s essential to capture and synthesize “alternative” data: How early could we have detected COVID-19? Studies of “alternative” data – in this case, traffic data outside hospitals in Wuhan and keyword searches by Internet users in that area – indicate that the virus may have been circulating in late 2019. The investment community has been a pioneer in using alternative data, including audio, aerial photos, water quality, and sentiment. This is the front line for data-driven innovation, and getting an edge here can result in huge gains. But in the wake of 2020, alternative data will become mainstream, with the goal of spotting anomalies much earlier. From that, we can get derivative data, which comes from combinations, associations and syntheses with data from systems of record. As IDC says: “As more data gets captured and becomes available from external sources, the ability to use more of it becomes a differentiating factor. That includes taking lessons from industries other than your own.” This trend, similar to what Gartner calls “X analytics,” isn’t new but is finally becoming an important foundation of modern data and analytics, thanks to cheaper processing and more mature AI techniques – including knowledge graphs, data fabrics, natural language processing (NLP), explainable AI and analytics on all types of content. This trend is completely dependent on ML and AI, as the human eye can’t catch it all. – Dan Sommer, Senior Director, Global Market Intelligence Lead at Qlik
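
As a concrete, simplified illustration of spotting anomalies early in an alternative data series (synthetic data, invented threshold), a rolling z-score can flag a sustained shift in something like daily traffic counts:

```python
# Illustrative sketch of early anomaly spotting on an "alternative" data
# series, e.g. daily traffic counts outside a facility.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
traffic = pd.Series(rng.normal(1000, 50, 120))  # ~120 days of synthetic counts
traffic.iloc[100:] += 400                       # inject a sustained shift

rolling = traffic.rolling(window=30)
# Compare each day to the previous 30 days (shift(1) excludes the current day).
z = (traffic - rolling.mean().shift(1)) / rolling.std().shift(1)
anomalies = z[z.abs() > 3]
print(anomalies.index.tolist())  # days flagged well before a trend is obvious
```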

In the industry we often talk about breaking down data silos, but we should acknowledge that some silos will always be there. In large organizations you will always have local departments or regions that have their own tools or databases, and that will continue. If you have data sovereignty, that local office in your organization will have a silo. That’s why the best approach is to look at how you can have a better understanding of the data you have. A data intelligence platform can serve as your index and your map, showing you the silos you have and how they are connected by providing a 360-degree view of data assets. – Stijn “Stan” Christiaens, co-founder and CTO of Collibra

OpenTelemetry will create data overload. In 2021, the use of OpenTelemetry will become the new industry norm. Yes, it will make data collection easier by creating consistency across sources — but it will also create a data firehose for companies, making it even harder to find the small portion of data containing actionable insights. The constant stream of data will overwhelm companies if they don’t have a system in place to quickly find the 5% that is truly actionable. Because of this, IT teams will shift their focus from acquiring data to building a framework to take action from data. As teams do so, it will be imperative to implement tools that can immediately start surfacing actionable data in the time it takes to make a cappuccino. – Phil Tee, CEO of Moogsoft
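For readers new to OpenTelemetry, the sketch below shows the shape of instrumenting Python code with the reference SDK. It is illustrative only: module and class names have shifted across SDK versions, and the console exporter stands in for a real backend.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Wire a tracer provider to a console exporter (stand-in for a real backend).
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("checkout-service")  # hypothetical service name
with tracer.start_as_current_span("process_order") as span:
    span.set_attribute("order.items", 3)  # attributes fuel later filtering
```

The ease of emitting spans like this is exactly why Tee predicts a firehose: the hard part becomes filtering, not collecting.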

A digital twin is a virtualized model of a process, product or service. The pairing of the virtual and physical worlds allows data analysis and system monitoring to identify problems before they occur – preventing downtime, uncovering new opportunities and even supporting planning for the future through simulations. This generation of digital twins allows businesses not only to model and visualize a business asset, but also to make predictions, take action in real time and use current technologies such as AI and ML to augment and act on data in clever ways. – Anil Kaul, CEO at Absolutdata

Digital transformation will – at last – start to become transformational. At this point, “digital transformation” has become a buzzword that all enterprises have learned to recognize, yet the vast majority (80%, according to IDC) of these efforts are still too tactical in nature. Robotic process automation (RPA), for example, may be considered a transformational tool, but on its own it’s not. In order for organizations to see true transformation in 2021, they’ll need to leverage more advanced platforms that combine core automation and AI features – such as text analytics, document understanding, and process mining. It’s also critical that these platforms have low-code capabilities that enable citizen developers to build and deploy enterprise-grade automations that drive value back to their organizations. Without that, it will continue to be challenging for companies to deliver enterprise-wide digital transformation – which is fueled by the ability to easily deploy automation, even to the most complex processes. – Guy Kirkwood, Chief Evangelist at UiPath

Companies Will Race to Solve a New Data Gravity Problem – Employees Working from Home: COVID-19, and the massive shift it has caused in the number of employees working from home, has exacerbated companies’ existing data gravity challenges. Even if (and hopefully when) the pandemic subsides, a much more distributed workforce is here to stay – and with a more distributed workforce comes more distributed data. This compounds companies’ existing data gravity problem with their on-premises infrastructure. These companies want to move their applications and workloads to the public cloud, but the gravity of data on existing on-premises infrastructure makes this difficult, hindering their digital transformation initiatives. Expect to see in 2021 a surge of corporate investment in technologies and services that address this data gravity challenge. For example, enterprise demand will help drive the growth of new wired 10G technologies and wireless 5G technologies that allow companies to ensure the connection between their edge data and their clouds is fast, responsive, reliable, and secure. In addition, expect to see greater adoption of Backup as a Service (BaaS) and other intelligent data management solutions that enable companies to move much of their on-premises data, and the data found on employee laptops, to the cloud, while also providing the ability to secure, protect, govern, and otherwise control the distributed data that remains at the edge. Companies will also invest in training and other change management services that will enable them to build a cloud-based culture for their distributed workforces. Companies that confront this data gravity problem head-on can keep their increasingly distributed data environments from slowing their move to the cloud – a move they must make if they hope to gain the flexibility, scalability and agility they need to foster innovation in today’s digital economy. – Manoj Nair, General Manager at Metallic

Business Intelligence

Proliferation of low-code/no-code ML. The increase of low-code and no-code ML systems, designed to make AI more accessible to companies, will help improve adoption of AI. However, eventually companies will reach a ceiling and outgrow the one-size-fits-all approach, seeking more advanced use cases for AI that require deeper expertise. Ultimately, the need for customization will increase the need for qualified data scientists, rather than low-code systems replacing them. We aren’t going to automate away the need for data scientists any time soon. — Kevin Goldsmith, CTO, Anaconda

Business Intelligence is shifting to a new paradigm of advanced data analytics with the integration of Natural Language, Natural Search, AI/ML, Augmented Analytics, Automated Data Preparation, and Automated Data Catalogs. This will transform business decision-making processes with higher-quality real-time insights. – Ramesh Panuganty, CEO of BI company MachEye

BI and AI will deepen their liaison. Whether scoring BI data sets against ML models and visualizing the predictions, or leveraging natural language processing for generating visualizations, insights, and summaries, AI and BI will increase their synergies. And as conventional BI capabilities continue to commoditize, vendors will need BI+AI as a new front in the innovation wars. – Andrew Brust, analyst, Gigaom

Chatbots

Employee to Enterprise – Conversational AI adoption will be natural and often the first contact. Conversational AI is normalized and here to stay. Interfaces that guide consumers through the online marketplace, employees through training courses and users through search engines and websites saw great returns on investment when outfitted with advanced Conversational AI technology. – Shiva Ramani, CEO of iOPEX

AI will not displace human beings any time soon. When you look at the use of AI in consumer-facing operations today, it’s mainly used in AI-supported chatbots and customer personalization features. If we look at how consumers have taken advantage of AI-supported features during the pandemic, we can see that they’re actually using them to resolve issues faster through human agents. Companies like Bank of America, which has a consumer-facing AI-powered chatbot named Erica, saw consumers using Erica to find the best way to engage customer support teams. Rather than asking Erica questions to fix any issues directly, customers simply asked Erica how they should go about reaching the customer service team to rapidly resolve their problem with the appropriate human agent. – James Isaacs, President and CEO of Cyara

Today, we interact with bots more than ever before, whether it’s customer service chatbots or the AI on our devices, like Siri and Alexa. These bots are used for real-time decision making to automate processes that were previously done by humans. For example, bots have automated the retail return processes for companies like Amazon. However, it becomes more complicated for enterprises to manage the identities of automated bots, especially when they are interacting with other bots at machine speed. The identities of bots must be managed and protected by the enterprise, similar to employee and customer identity, so that data isn’t compromised. This is important for CIOs and security leaders to keep in mind, because using bots for automation purposes will open new attack vectors if those bots’ APIs are hacked. – Jasen Meece, CEO of Cloudentity

NLP (natural language processing) changes the conversation on data analysis: Just as we are using Google Home and Alexa in our everyday lives, conversational analytics through NLP will be the golden ticket for enterprises in extracting valuable big data insights from their business operations. This includes unearthing trends that may have gone unnoticed and allowing experts from within the enterprise to engage with data in a meaningful way. – Sam Mahalingam, CTO, Altair

Conversational AI, first and foremost, needs a ubiquitous messaging channel to converse on. The rise of business messaging on IP-based channels such as Whatsapp, GIP and others is driving a resurgence in the use of conversational AI. Companies across industries such as banking, e-commerce, retail and travel are now enabling conversational AI for virtually every customer touchpoint, including marketing, sales and support. Powered by recent advances in natural language processing (NLP), conversational AI is poised to transform how consumers interact with businesses. – Beerud Sheth, CEO of Gupshup

Cloud

I think we will begin to see a more thoughtful, balanced approach to multi- and hybrid cloud adoption, particularly for hybrid cloud. We are getting past the public versus private cloud conversations, and businesses are accepting the reality that cloud is not an either/or decision. Historically, we have seen “public cloud” associated with cutting-edge innovation and “private cloud” associated with slow, legacy businesses that are resistant to change. This sentiment is changing, as businesses begin to better understand the value they can get from a hybrid cloud architecture that enables them to deploy agile, modern applications on the platform that best balances their specific cost, performance, security, compliance and governance needs. With this comes an increase in hybrid-enabling technologies such as containers and hybrid integration platforms. Another consideration is tethered compute – a hyperscale cloud provider solution running in your own data center. Examples are AWS Outposts, Google Anthos, and Microsoft Azure Stack. Although adoption of these has been slow to date, we could start to see the beginning of growth here as customers see the value of private/public cloud coupled with the consistency of hyperscale cloud service consumption. – Kim King, Director of Product Marketing – Cloud Management at Snow Software

COVID-19 Accelerates Cloud Spending: With the increase of remote working due to the COVID-19 pandemic, companies are investing a larger portion of IT budgets on cloud-based technologies, moving away from paper-based processes. Enterprises’ average cloud spending is up 59% from 2018 to $73.8M in 2020. That trend will continue into 2021 as companies are forced to adopt strategies to work remotely and recognize the benefits of maintaining those modes of operating even as they begin to transition employees back to physical locations. A prime example will be contracting where COVID drove digital transformation of the contract request, approval, execution, and post-award management systems and has laid the groundwork for even more advancements in contract lifecycle management. – Harshad Oak, General Manager, Customer Adoption & Value, at Icertis

Once considered the “layover” on the way to the cloud, hybrid is now the destination: A hybrid cloud approach used to be considered the stepping stone to a cloud-first implementation. Now, customers are seeing that a hybrid approach makes the most sense, both strategically for their business needs and economically. According to IDC, 70% of customers’ apps and data remain outside the public cloud. With that in mind, in 2021 we’ll see even more customers embrace a hybrid approach. Due to data latency, application entanglement, and security and compliance concerns, we see more and more organizations across industries wanting to keep their data on-premises. At the same time, partially due to pandemic economics, data egress charges and vendor lock-in with public cloud providers, the reality is that CIOs and IT orgs are embracing hybrid as the outcome and not a means to an end. – Keith White, General Manager, GreenLake Cloud Services

Cloud agility is fantastic, but it can easily lead to runaway costs. Similarly, shared on-premise big data clusters often waste resources. Both of these result in missed SLAs. If they want to eliminate chronic overspend, companies need to institute a method to monitor and manage their cloud spend. The most effective way to do this is through observability and auto-tuning. – Ash Munshi, CEO, Pepperdata

Cloud storage will become commoditized. Competition for apps will increase. Cloud vendors will be concerned less about where data resides, and have more financial stakes in what’s being done with it. As the cloud becomes more about apps and less about storage, there will be a commoditization of the storage layer and more competition on applications and services. As enterprises migrate data from on-prem to the cloud, they are looking to move the data directly into apps, such as Snowflake and Databricks, to perform analytics faster rather than into storage first. – WANdisco CEO David Richards

Database/Data Warehouse/Data Lake

The solutions that companies use to store their data will continue to evolve rapidly in the next year. We are seeing increased migrations to open source relational database solutions, non-relational database solutions, PaaS-based database solutions, and combinations thereof. The primary focus of these initiatives can be grouped under the heading of reducing operating costs, whether they are undertaken to reduce hefty support contracts from vendors like Oracle and Microsoft (both the open source and non-relational database migrations fall into this category), to reduce headcount expense (migrations to PaaS services fall into this category), or to gain performance efficiencies by migrating to a more purpose-built database solution. Data migration is happening right now and at a large scale, so there are many considerations to be made when transitioning to these new database solutions, including the capabilities of the future-state solution versus the current state, the impact on licensing and support contracts, and a method to ensure that the correct solutions are deployed. While PaaS solutions provide some great benefits, DBAs are still required to monitor and manage those systems and work with application teams to drive efficiencies in performance, availability, and security. – Marc Caruso, Chief Architect, Syntax

360. That’s the number of database systems out in the wild. And while choice is good and finding the right tool for the job is smart, it also adds major complexity. As companies move to modernize in the cloud, they will seek simplification, which will lead to massive consolidation in the database market. Database vendors that offer multi-functional capabilities will win, rather than a multitude of niche databases that need to be stitched together and require different ways of accessing data. – Franz Aman, CMO of relational database company MariaDB

The database market will grow to $1 trillion by 2025. For the last two decades, there’s been an iron grip on the database market, with IBM, Oracle and SAP HANA leading the charge. Now we are seeing a changing of the guard, which gives customers the option of deciding what is best for their business. Forrester even points out that the public cloud infrastructure market will grow 35% to $120 billion in 2021. I predict that the database market cap will grow to $1 trillion by 2025 and that seven to 10 really strong database companies will grow significantly in the next decade. – Raj Verma, CEO of SingleStore

The Data Lake Can Do What Data Warehouses Do and Much More: While the separation of compute and data provides advantages for data lakes over data warehouses, data warehouses have historically had other advantages over data lakes. But that’s now changing with the latest open source innovations in the data tier. For example, Apache Iceberg is a new table format that provides key data warehouse functionality in the data lake such as transactional consistency, rollbacks and time travel while introducing new capabilities that enable multiple applications to work together on the same data in a transactionally consistent manner. Another new open source project, Project Nessie, builds on the capabilities of Iceberg as well as Delta Lake by providing Git-like semantics for data lakes. Nessie also makes loosely-coupled transactions a reality, enabling a single transaction spanning operations from multiple users and engines including Spark, Dremio, Kafka and Hive. – Tomer Shiran, co-founder of Dremio
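As a rough illustration of the time-travel capability Shiran mentions, the PySpark sketch below reads an Iceberg table as of a past timestamp. It assumes a SparkSession already configured with the Iceberg runtime and catalog; the table name db.events is hypothetical, and option names follow Iceberg’s Spark documentation.

```python
# Assumes `spark` is a SparkSession with the Iceberg runtime configured.
# Inspect the table's snapshot history (an Iceberg metadata table).
spark.read.format("iceberg").load("db.events.history").show()

# Read the table as it existed at a point in time (milliseconds since epoch).
events_then = (
    spark.read
         .option("as-of-timestamp", "1609459200000")  # 2021-01-01 00:00 UTC
         .format("iceberg")
         .load("db.events")
)
print(events_then.count())
```

Because every write produces a new snapshot, rollbacks and reproducible reads fall out of the same mechanism.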

Three major trends will emerge in 2021: the return of the metadata layer, embedded AI and automated analytics, and new simplified query interfaces designed specifically for business users. The return of metadata layers, as key foundational components of analytic solutions, is needed to support improved governance and extensibility of data assets. With smart metadata layers, new simplified user interfaces will emerge that allow business users to interact with data in a more guided way, reducing the time to insight with minimal analytical skills. AI and automated analytics will shift from the enterprise domain toward software vendors, who will embed these capabilities and enable mass adoption via their customer bases. – Glen Rabie, CEO at Yellowfin

Data Engineering

Companies will reinvest in the data engineer and data pipelines. One impact of 2020 was that a lot of companies shifted to a survival-first approach, which resulted in a “grab-and-go” mentality to their data integration. As businesses’ bottom lines are stabilizing and we’re seeing more predictability at the macroeconomic level, our prediction is that 2021 is the year of the data engineer, and that companies are going to get back to a “built to last” approach for data pipelines. “Built to last” for the water in your pipes at home means that water is always on, clean and at the right temperature. “Built to last” for data means that you build smart data pipelines to ensure timeliness and confidence in your data analytics. – StreamSets CEO Girish Pancha

Companies will realize the need to put more effort into DevOps: There is still so much work that needs to be done with DevOps pipelines, including securing and testing the delivery process. The software developer community knows where it needs to go, but the work and obstacles in the way are always bigger than expected. Because of this, I am skeptical we’ll see big changes in 2021 in terms of tooling or CI/CD patterns. Rather, we’ll see more people realize they need to put more effort into their DevOps pipelines, processes, and validation. They will double down to accelerate and improve their CI/CD automation. Only when these processes are mature can organizations have confidence in their delivery practices and tooling. – Fred Simon, co-founder and Chief Data Scientist, JFrog

Data operations (DataOps) will connect data custodians to data consumers in real time, at unprecedented scale: With the continued convergence of data management and data analytics, and an exponential rise in data consumers (such as data analysts, application developers, data scientists and citizen data scientists), the need to seamlessly operationalize data for business value across currently disjointed workflow steps (connect, explore, clean, transform, model, deploy, visualize, monitor, improve, scale) takes center stage in digital transformation programs large and small. Data operationalization throughout these workflow steps becomes the new “data trapped in source system silos” challenge for data and innovation leaders. – Petteri Vainikka, President of Product Marketing at Cognite, a global industrial DataOps company

Kubernetes usage continues to lag for critical business apps:  Even though Kubernetes has become a red-hot trend among tech media and influencers, few organizations are actually deploying Kubernetes for critical business apps. Even forward-looking enterprises that have widely deployed containers, such as those in the tech and financial services sectors, tend to only use Kubernetes for a small fraction of their containerized workloads. Operational difficulties are the biggest issue. Simply put, Kubernetes is very challenging to deploy, manage and run at scale.  To make things more complicated, deploying Kubernetes in production and at scale requires massive internal buy-in. Unlike microservices, which are leveraged almost exclusively by developer teams and don’t necessarily require top-down approval from IT to adopt, Kubernetes impacts an organization’s entire infrastructure stack and often must be greenlit by the CIO before it can be fully deployed. Kubernetes also has a steep learning curve. It is a radical departure from VMs, which the majority of IT pros still rely on to deploy and run infrastructure and applications. Because of these factors, Kubernetes will not yet see widespread usage supporting business apps next year. Eventually, though, organizations will realize they can turn to pre-made Kubernetes clouds to overcome these challenges, and full production deployments at scale will finally take off. – Ankur Singla, CEO of Volterra 

Data Governance

IT will infuse access governance with intelligence to protect workforce cybersecurity in 2021. Accelerating changes in enterprise technologies, cyberthreats and the user landscape are increasing pressure on traditional identity governance and administration (IGA) solutions and, in turn, on security and compliance teams. On top of growing compliance risks, enterprise IT environments become more complex every year, increasing the number of applications and systems to which companies provide user access. These challenges are driving organizations to seek out AI-driven solutions that simplify and automate the access request, access approval, certification and role modeling processes. In 2021, we will see AI increasingly employed to enable an autonomous identity approach. AI-infused authentication and authorization solutions will be layered on top of, or integrated with, existing IGA solutions, providing contextual, enterprise-wide visibility by collecting and analyzing all identity data, and enabling insight into different risk levels of user access at scale. The use of AI will allow systems to identify and alert security and compliance teams about high-risk access or policy violations. Over time we will see these AI systems produce explainable results while increasing automation of some of the most difficult cybersecurity challenges inside the enterprise. – Eve Maler, CTO at ForgeRock

We saw the global implementation of AI governance frameworks take off in 2020, with enterprises asking for details on the outcomes of AI applications. Ensuring an appropriate level of explainability of AI applications is key, as is using good-quality data, ensuring auditability, being ethical, fair and transparent, complying with data protection requirements, and implementing effective cybersecurity measures. Implementation of AI governance frameworks is currently seen most in financial services and banking, but in 2021 we’ll see it become more widespread. Other verticals such as healthcare, e-commerce and mobility services will begin to use it as a competitive differentiator. For instance, healthcare providers are beginning to be more transparent about how data is used, and how they are ethical and fair in protecting that data. If businesses want to stay ahead of the curve, they should start developing ethical AI frameworks now in order to position themselves as leaders in this global movement. – Mohan Mahadevan, VP of Research, Onfido

AI Will Gain Momentum in Cloud Security and Governance. In 2021, AI will go far beyond simply detecting anomalies and thereby flagging potential threats to security teams. Cloud governance is an increasingly complex task and is quickly reaching a point where it’s impossible for humans to manage alone. AI will increasingly be relied on in the coming year to maintain cloud hygiene by streamlining workflows, managing changes and archiving. Once proper cloud hygiene is established and maintained with AI, it will also be used as a strategic predictive knowledge tool. By predicting and addressing threats and vulnerabilities, AI will help enterprises create the best possible outcome for their cloud environments. Leveraging AI as a strategic asset will empower CIOs to make informed decisions about their cloud environments, such as evaluating costs and compliance risks. – Keith Neilson, Technical Evangelist for CloudSphere

As we look to 2021, we will see the conversation of ethical AI and data governance be applied to multiple different areas, such as contact tracing (fighting COVID-19), connected vehicles and smart devices (who owns the data?), and personal cyber profiles (increased cyber footprint leading to privacy questions). – Cindy Maike, VP of Industry Solutions, Cloudera

Data governance for a multi-environment reality. Long gone are the times where organizations simply housed all their own data on-premise or even just within one cloud provider. Now organizations have data on-premise and are partnered with several cloud providers based on their specific needs. This reality has created a “rethink” of how data governance needs to be approached. Organizations must determine how their current data governance will be impacted and what needs to be adjusted, how to monitor data quality in the cloud, and how to manage data movement in and out of the cloud (and the massive expense that comes with that). – Todd Wright, Head of Data Management and Data Privacy Solutions at SAS

‘Active’ data governance will be “normalized.” GDPR gave rise to a “traditional” approach to governance in many organizations in 2016. Four years later, it’s clear that those efforts have largely failed. Meanwhile, an “active” approach emerged, where governance efforts are prioritized by activity and policies are put into action where data is actually used. This active approach was first attempted by innovators and early adopters, attracted by its novelty and logic – and they achieved exceptional results. Such success has been attracting increasingly conservative institutions, and in 2021 we’ll see the beefy part of the bell curve adopt this increasingly vetted and popular approach. – Aaron Kalb, co-founder and CDAO at Alation

Data Science

2020 was brutal for some firms, rewarding for others, and challenging for all. As we enter 2021, laggards face an existential imperative to reinvent themselves digitally, while even leading firms struggle to keep pace with demand. All of these enterprises need to capitalize on 100% data integration with predictable costs, reliable performance and real-time visibility. – Bonnie Holub, Practice Lead, Data Science, Americas at Teradata

Data democratization will become the new norm. It’s the job of the CDO to ensure expansion of growth across the entire business. This can be achieved by providing structured data that people can actually use. A successful CDO should democratize data so that it’s accessible and understandable by people. A good CTO will complement the CDO by creating the necessary tooling to find the required data. This means giving users a set of visualization and reporting tools that allow them to dig into the data and generate insights. As we move into 2021, we’ll continue to see further and tighter collaboration between these two roles, driven by necessity. If you have tools with bad data, you’re exacerbating the data challenge. If you have limited tools, only a small subset of people can do anything with the data. – Derek Knudsen, Chief Technology Officer at Alteryx

Citizen analysts will increasingly up-skill to become data scientists. The growing complexity of most industries and companies also means that once we see self-reliance in terms of developing IT processes or using analytics, there will quickly be a huge push to expand that skill-set further. With the market erratically changing from month to month there will be a much greater emphasis placed on data science than ever before. This, in turn, will drive more citizen analysts to up-skill to become data scientists. – Sharmila Mulligan, Chief Strategy and Marketing Officer at Alteryx

Python data visualization libraries will sync. We’re finally starting to see Python data visualization libraries work together, and this work will continue in 2021. Python has had some really great visualization libraries for years, but there has been a lot of variety and confusion that make it difficult for users to choose appropriate tools. Developers at many different organizations have been working to integrate Anaconda-developed capabilities like Datashader’s server-side big data rendering and HoloViews’ linked brushing into a wide variety of plotting libraries, making more power available to a wider user base and reducing duplication of efforts. Ongoing work will further aid this synchronization in 2021 and beyond. — James A. Bednar, Sr. Manager, Technical Consulting, Anaconda
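As one example of the integration Bednar describes, the sketch below combines Datashader’s server-side rendering with HoloViews’ linked brushing. It is a minimal illustration assuming recent versions of holoviews, datashader and bokeh; exact behavior varies by release.

```python
import numpy as np
import pandas as pd
import holoviews as hv
from holoviews.operation import histogram
from holoviews.operation.datashader import datashade
hv.extension("bokeh")

# A million points: too many to ship to the browser, fine to datashade.
df = pd.DataFrame(np.random.randn(1_000_000, 2), columns=["x", "y"])
pts = hv.Points(df, ["x", "y"])

# Selections made in one plot highlight the same rows in the other.
ls = hv.link_selections.instance()
layout = ls(datashade(pts) + histogram(pts, dimension="x"))
layout  # render in a notebook
```

The point of the convergence work is that pieces like these compose across libraries instead of each plotting package reimplementing them.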

Business skills will become more critical than ever for data scientists. Data scientists will need to speak the language of business in order to translate data insight and predictive modeling into actionable insight for business impact. Technology owners will also have to simplify access to the technology, so that technical and business owners can work together. The emphasis for data scientists will be not just on how quickly they can build things, but on how well they can collaborate with the rest of the business. – Florian Douetteau, CEO and co-founder of Dataiku

Self-service has evolved to self-sufficiency: In a virtual world, self-service needs to evolve. When there are no instruction manuals and no one there to hold a user’s hand, a fast, intuitive ramp-up becomes a hygiene factor for adoption, and compelling user interfaces will no longer be a nice-to-have. But we’ve also seen that users often don’t want to self-serve; they increasingly expect insights to come to them. As a result, we’ll see more micro-insights and stories for the augmented consumer. In addition, data is too often overlooked. Empowering users to access data, insights and business logic earlier and more intuitively will enable the move from visualization self-service to data self-sufficiency. AI will play a major role here, surfacing micro-insights and helping us move from scripted and people-oriented processes to more automated, low-code and no code data preparation and analytics. If more people can be self-sufficient with data earlier in the value chain, anomalies can be detected earlier and problems solved sooner. – Dan Sommer, Senior Director, Global Market Intelligence Lead at Qlik

Historically, companies put a lot of value on people who were “Data Scientists.” Going forward, there will be a need to hire people who are experts in data collection. For AI models to work, vast amounts of data are required, and critical data still resides in silos in many organizations; hence, individuals with skills in data collection will be in high demand. – Clara Angotti, President of Next Pathway

Data scientists will play a critical role in the development of a COVID-19 vaccine. From development of a vaccine to analysis of trials and deployment, data will be the key to knowing whether we have found a preventative solution. Data scientists will be as important as traditionally trained scientists in producing the first viable vaccine. Knowing that speed is critical, accelerating the delivery of a vaccine will require a great deal of agility and automation in managing data: people must be able to manage, make decisions with and trust that data, and new automated systems will enable the innovations that ultimately lead to a vaccine. – Infoworks CEO Buno Pati.

While data continues to rule the world, organizations still find themselves struggling to leverage that data for a true competitive advantage. The Citizen Data Science movement has emerged to widely promote the ability to manipulate and interpret data. But is there a better way? Given that raw, uninterpreted data sitting somewhere in a system isn’t very helpful, wouldn’t it be smarter (and easier) to simply bring business meaning to the data and repair the data, rather than fix the people? – Kendall Clark, founder & CEO of Enterprise Knowledge Graph Platform developer Stardog

We’ll See an Uptick of Architecting for Data Science: Mastering data management will be top of mind for many IT groups as they look to improve business intelligence and agility. For this reason, data science—the umbrella under which artificial intelligence, machine learning, automation, data lakes and others thrive—will see huge growth in 2021. From analyzing data-driven behaviors to transform grocery shopping to leveraging powerful computing in the cloud to improve media production models, data science will take the lead for many to stay competitive. Too expensive to provision on their own, many of these companies will outsource their data science projects to third parties with a subscription model. — Dustin Milberg, Field CTO Cloud Services at InterVision

Automate Your Pipelines to Unleash the Full Potential of Data Scientists: Data scientists are too often busy with tasks like data preparation, feature engineering and modeling. As these tasks become augmented with tools that help automate these steps, we’ll see data scientists trade in routine tasks for time spent on deeper, strategic approaches that will make them invaluable resources.  We expect to see more systematic implementations of business AI solutions to make ad-hoc analyses more efficiently repeatable. – Justin Silver, Ph.D. an AI Strategist at PROS

Data science and AI/ML belong together. Companies and technology vendors will increasingly engage in federating intelligence through APIs: endpoints are becoming a “must” for consuming models, whether in internal applications or through value networks externally. Expanding data scientists’ learning curves to cover business problems and operational concepts will accelerate DevOps processes for embedded AI applications. – Radu Miclaus, Director of Product, AI and Cloud, Lucidworks

With conflicting team priorities, data mutinies will be on the rise. Data teams today are on a collision course with conflicting priorities. For infrastructure teams, building for scale, security, and cost are of the utmost importance while engineering teams prioritize flexibility, development speed, and maintainability. Meanwhile, data scientists and analysts are focused on the availability and discoverability of data, and the connectivity of tools. As enterprises scale their efforts and their teams to build new data products, the interconnectedness and resulting complexity can be paralyzing for these groups. If organizations continue to cater to one group’s needs amidst these conflicting priorities, we can anticipate a rise of “data mutinies” in 2021 – in which internal users create their own engineering organizations with a mandate to move quickly and free themselves from these conflicts. – Sean Knapp, founder and CEO of data engineering company Ascend.io

Data science leaders create new team practices to accommodate scale. There will be an accelerated shift in 2021 away from a focus on algorithms and techniques (one data scientist building one model) to enabling large teams to collaborate across disparate data science orgs, functions (IT and business), and geographies. To succeed, data science leaders will need well-defined practices for model operations, deployment, monitoring and retraining, and new tools for better collaboration and team management. – Nick Elprin, co-founder and CEO of Domino Data Lab

Deep Learning

The adoption of Deep Learning based enterprise solutions in startups and enterprises will see a gradual uptick. The key hindrance will continue to be the costs of procuring GPU instances and high-cost human resources. – Sundeep Reddy Mallu, Head of Analytics at Gramener 

As we all witnessed in recent years, research and development in Natural Language Processing has progressed rapidly, driven by breakthroughs in Transformer language models such as BERT and GPT-3. While these models achieve state-of-the-art performance, they require large datasets and large amounts of computational resources for training and inference, with a significant carbon footprint. We will see more efforts and research into new model architectures and training techniques that address the concerns of carbon emissions and very long training times, with space- and compute-efficient models that make these breakthroughs more accessible; recent models like Performers with fast attention will serve as catalysts in this direction. – Kavan Shukla, Data Scientist, Finn AI
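In the meantime, distillation already offers a practical route to smaller footprints. As a minimal sketch with the Hugging Face transformers library, a distilled model can stand in for its full-size parent at a fraction of the compute cost:

```python
from transformers import pipeline

# A distilled model trades a little accuracy for a much smaller
# compute (and carbon) footprint than its full-size parent.
clf = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(clf("Efficient attention makes large models cheaper to run."))
```

Techniques like the Performer’s linear-time attention aim at the same goal from the architecture side rather than by shrinking a trained model.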

Hardware

Hardware and software converge with the rise of AI-specific hardware. As Apple’s announcement of the M1 chip showed, purpose-built hardware is becoming more mainstream, meaning that people will begin to think more about the actual hardware that they are working on than they did previously—including data scientists. The rise in ML-specific hardware will likely lead to performance improvements, but also provides another variable in model deployment. It’ll be particularly impactful in cloud and mobile environments. This will further break down the wall that has traditionally existed between hardware and software, with AI use cases leading the way. — Kevin Goldsmith, CTO, Anaconda

Since 2012, AI compute power has grown at 5X the rate of Moore’s Law, doubling approximately every 3.5 months. Given the growing number of applications built on top of AI engines impacting our everyday lives – some even critical to humanity as a whole (e.g., modeling and solving for climate change) – finding a solution to this performance scaling mismatch is high on every serious fabless and chip manufacturing company’s priority list. The need for shifts in how Moore’s Law is perceived will become more apparent in 2021. The latest trend has been to talk about writing more efficient software to yield year-over-year performance improvements. This is a risky bet, since the development of fundamentally new algorithms cannot happen on a schedule and is therefore not compatible with the traditional semiconductor tick-tock advancement cadence. Underlying compute technologies must also improve. We will continue to see shifts and improvements in the coming year. – Nick Harris, CEO and co-founder of Lightmatter

In-memory Computing

In 2021, accelerated by COVID-19 and stricter regulations, enterprises will continue to drive their data transformation initiatives to thrive in the burgeoning online-digital economy. Extreme speed, cloud agility, and operational analytics will be adopted by enterprises to optimize data-driven operations and to rapidly introduce new services and applications. Technology solutions based on a cloud-native data fabric, also known as a Digital Integration Hub, will allow organizations to offload and decouple from legacy systems of record and databases to meet their digital and analytical requirements, and to migrate to the cloud without the need to completely divest from their existing mission-critical systems. The introduction of in-memory speed and scale for analytics and BI will fuel real-time reporting and visualization of fresh data and enable ML models to utilize more accurate real-time data for online services such as loan approvals, fraud analysis and customer 360 capabilities. AIOps will also be a focus and will be deployed to automate and streamline complex data and analytics operations, reduce time-to-market and lower costs while minimizing human error. – Adi Paz, CEO, GigaSpaces

In 2020, the COVID-19 pandemic drove many businesses, especially those in food delivery, ecommerce, logistics, and remote access and collaboration services, to dramatically scale out and upgrade infrastructure to maintain high application performance in the face of surges in website visitors, delivery requests, sales transactions, video streaming and more. Many of these businesses found that the fastest approach to maintaining or improving performance while simultaneously increasing application throughput was to deploy a distributed in-memory data grid (IMDG) – built using an in-memory computing platform such as Apache Ignite – that can be inserted between an existing application and disk-based database without major modifications to either. The IMDG improves performance by caching application data in RAM and applying massively parallel processing (MPP) across a distributed cluster of server nodes. It also provides a simple path to scale out capacity because the distributed architecture allows the compute power and RAM of the cluster to be increased simply by adding new nodes. In 2021, IMC platforms will become easier to use and the number of knowledgeable IMC practitioners will continue to grow rapidly. This will enable IMC adoption to spread across more industries and to a wider pool of companies. As a result, more businesses will be better positioned to take advantage of IMC for rapid application acceleration, not just for response to the demands of COVID, but also to meet new strategic and competitive demands as the pandemic threat abates. – Nikita Ivanov, CTO and founder of GridGain Systems
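For a feel of the developer experience Ivanov describes, here is a minimal sketch using Apache Ignite’s Python thin client (pyignite) against a local node. The cache name and key are hypothetical, and a real deployment would connect to a multi-node cluster.

```python
from pyignite import Client  # pip install pyignite

client = Client()
client.connect("127.0.0.1", 10800)  # default thin-client port of a local node

# Reads and writes hit RAM across the cluster, not a disk-based database.
cache = client.get_or_create_cache("session_cache")
cache.put("user:42", "cart=[sku-1, sku-2]")
print(cache.get("user:42"))

client.close()
```

Because the application talks to the cache rather than the database, the existing application and database can stay largely unmodified, which is the low-friction insertion point the prediction highlights.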

IoT

IoT adoption in the enterprise will heat up more than ever: In light of the pandemic’s impacts on business, enterprises will be looking for new or additional ways to increase the speed to decision making in 2021. IoT can play a role in this. From a BI standpoint, the challenge is to recognize that IoT has different data models that need to be accommodated, like performance over time. Reducing the lag time between data production and operations will be key. The smartest organizations will realize that they can’t simply spend money on this, but instead need to be strategic to create new data models that share thoughtful insights. – Eric Raab, SVP, Engineering and Product, Information Builders

The pandemic has greatly accelerated the need for companies to complete their Industry 4.0 transformations with solutions that allow them to have more flexibility, visibility and efficiency in their operations. We’ll see an acceleration of adoption of solutions that help address that need, ranging from machine learning and machine vision to advanced analytics. As the economy bounces back, we’ll continue to see investment in the foundational OT infrastructure with more IT capabilities to allow the broad ecosystem of players to deploy these solutions, and we’ll see Industry 4.0 adoption significantly ramp up in 2021. – Christine Boles, VP, IoT Group and GM, Industrial Solutions Division, Intel

Explosion of edge computing: We will continue to see an increase in edge computing throughout the data center industry due to increased computing and speed demands from consumers and companies. A low latency network is critical in environments that strive to maximize compute throughput and reduce server idle time. – Timothy Vang, Ph.D., vice president of marketing & applications for Semtech’s Signal Integrity Products Group

Edge is the new cloud: For companies scaling smart factory initiatives in 2021, real-time availability of mission-critical workloads will be necessary to ensure business outcomes. Edge computing will complement existing cloud infrastructure by enabling real-time data processing where the work takes place (e.g., motors, pumps, generators, or other sensors). Implementing integrated analytics from the edge to the cloud will help these enterprises maximize the value of investments in digital systems. The industry will continue to move toward more decentralized compute environments, and the edge will add significant value to digital transformation initiatives. By integrating edge functionalities with existing cloud infrastructure, organizations will worry less about logistical IT considerations and, instead, focus on rethinking what’s possible in a smart machine: What questions can it answer faster? What new problems can it solve? How can it protect operations better? Analysts note that by 2022, 99% of industrial enterprises will utilize edge computing for this reason. – Keith Higgins, VP of Digital Transformation for Rockwell Automation

Creative minds push IoT forward: IoT and smart product development will hinge on creative designs and thoughtful solutions as technical improvements of microprocessors slow, with engineers running up against the limits of what is physically possible and chipmakers nearing the theoretical limit for how small these devices can be made. Post-Moore’s Law product development will rely on the ingenuity of engineers and designers to create imaginative solutions to business and societal problems and improve everyday consumer processes, instead of simply relying on the next generation of powerful chipsets. – Sam Mahalingam, CTO, Altair

Enabled by reliable, high-performance 5G connectivity and real-time edge computing capabilities, enterprises will more rapidly implement AI technology to support various use cases across their organizations, including robotic assembly lines, autonomous vehicles, augmented reality training, and more. These intelligent, AI-driven solutions will automate traditional, manual processes to increase performance and visibility while lowering cost. Traditional manufacturing settings employ a significant amount of manual labor. To keep operational costs low, many industries have taken their factories outside of the U.S. By implementing AI throughout factories, manufacturing can automate many resource-expensive tasks. For example, Tesla was able to place its factories in one of the world’s most expensive geographies by automating most of its factory functions. By utilizing AI to operate their robot workers, there is a massive opportunity for the western world to reshore or nearshore manufacturing operations. In 2021, manufacturers will lay the groundwork for company-wide, cost-effective industrial automation. Over the next five years, we will see these industries completely transform themselves due to the automating power of real-time AI. – Vinay Ravuri, CEO of EdgeQ

Machine Learning

Investment dollars in IT Operations will shift from vanilla workflow automation to native AI/ML solutions with a drive to become digital operations. Workflow operations and their respective automation will naturally evolve to include AI/ML solutions as the technology becomes more powerful. AI and ML are advancing and in turn improving workflow automation as companies collect more data as well as shift organization and administrative operations. – Shiva Ramani, CEO of iOPEX

Enterprises will find new applications for machine learning technologies that automate manual processes and enhance monitoring capabilities. Companies will look for products that deliver deeper monitoring, more automation and value-added information across their IT spend. For example, availability solutions that provide application-aware monitoring and automation of configuration and management tasks would be prioritized over traditional failover solutions. New innovations in HA will emerge to handle the increasing complexity of failures and disasters brought on by IoT devices and their dependencies. – Cassius Rhue, VP, Customer Experience, SIOS Technology

Historically, algorithms were more about machine learning and neural networks. We are now seeing more and more machines that are self-contained and can teach and train themselves in a way that is remarkably similar to the subconscious part of the human brain. In other words, algorithms used to mimic the analytic part of the brain; now they are mimicking the largest, most powerful, and most intriguing part of the human brain, which we call common sense, gut feelings, and intuition. Instead of relying on human beings to train and teach them, today’s unsupervised machine algorithms are able to gather massive amounts of data, create pictures of the world, and make deductions that are very similar to ones that would be made by human beings. We’re coming into a world where computers can train themselves. – Mark Gazit, CEO of ThetaRay
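Unsupervised anomaly detection is one concrete form of what Gazit describes: the model learns a picture of “normal” from unlabeled data. A minimal sketch with scikit-learn’s IsolationForest, on synthetic data standing in for transactions:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(10_000, 2))   # routine activity
odd = rng.normal(6, 1, size=(10, 2))          # a handful of outliers

# No labels given: the forest learns what "normal" looks like on its own.
model = IsolationForest(contamination=0.001, random_state=0).fit(normal)
print(model.predict(odd))   # -1 flags anomalies, 1 means inlier
```

Production systems layer far more sophistication on top, but the absence of labeled examples is the defining trait.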

Reducing bias: this year, there have been many necessary conversations around bias and mitigation in AI algorithms and around how to address the societal impacts of algorithm-based personalization. However, we need to continue development of tools that provide insight into the results of ML systems, reveal bias, and check drift in deployed models over time. This becomes ever more critical as more of these systems are put into production, to ensure that we’re not perpetuating or creating sources of harmful bias. — Kevin Goldsmith, CTO, Anaconda
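Checking drift in deployed models need not be exotic; a two-sample Kolmogorov–Smirnov test per feature is a common starting point. A minimal sketch with SciPy, on synthetic data standing in for a training-time and a live feature distribution:

```python
import numpy as np
from scipy.stats import ks_2samp

def drifted(train_col, live_col, alpha=0.01):
    """Two-sample KS test: has this feature's live distribution shifted?"""
    stat, p_value = ks_2samp(train_col, live_col)
    return p_value < alpha

rng = np.random.default_rng(1)
train = rng.normal(0, 1, 5000)      # feature as seen at training time
live = rng.normal(0.5, 1, 5000)     # same feature in production
print(drifted(train, live))         # True -> investigate or retrain
```

Bias auditing requires more than distribution checks, but routine drift monitoring like this is a prerequisite for catching problems once models are in production.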

Organizations whose early successes in machine learning have spurred them to expand their programs are finding that a fast-moving production line of high-quality datasets is the fuel that will drive that expansion. This will elevate Data as a Service to a high priority for data engineering teams. – Luke Han, co-founder and CEO, Kyligence

The Ability to Trust and Operationalize ML will be 2021’s Litmus Test For Survival: On top of a pandemic and a recession, we’re continuing to grapple with exponentially growing amounts of data and the ever-increasing complexities of new technologies. If businesses want to be successful in making sense of their vast stores of data and technical complexities, they must leverage and operationalize machine learning models in explainable and easy-to-understand ways. It is no longer enough to focus on getting models into production; the focus must now be on getting models into the hands of business users and decision-makers. But to operationalize, businesses must be able to trust in, derive understanding from, and communicate about a model’s ability to meaningfully impact business potential. In 2021, a business’ ability to trust its models – to the extent that it is able to produce action from AI-derived insight – will determine its ability to survive. – Santiago Giraldo, Senior Product Marketing Manager of Machine Learning, Cloudera

Companies of all sizes and at all stages are moving aggressively toward operationalizing machine learning efforts. There are several popular frameworks for model training, with TensorFlow and PyTorch leading the game. Just as Apache Spark is considered the leader for data transformation jobs and Presto is emerging as the leading technology for interactive querying, 2021 will be the year we see a frontrunner begin to dominate the broader model training space, with PyTorch and TensorFlow the leading contenders. – Haoyuan Li, Founder and CEO, Alluxio
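Whichever framework wins, the core training loop looks broadly similar. A minimal PyTorch sketch on random stand-in data:

```python
import torch
from torch import nn

# A tiny regression network; real models differ, the loop shape does not.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

X, y = torch.randn(256, 10), torch.randn(256, 1)  # stand-in batch
for epoch in range(100):
    opt.zero_grad()              # clear gradients from the previous step
    loss = loss_fn(model(X), y)  # forward pass
    loss.backward()              # backpropagate
    opt.step()                   # update weights
```

Operationalizing ML is largely about wrapping loops like this in reproducible pipelines: versioned data in, versioned model artifacts out.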

SaaS change data as the missing piece for ML/AI: Organizations with a focus on artificial intelligence and machine learning will continue to hunger for meaningful training datasets that can be fed into their ML algorithms to spot cause-and-effect change patterns over time. To do this, they will turn to their ever-changing datasets in 3rd party cloud/SaaS applications as inputs into these algorithms. This will create pressure for them to capture and ingest every single change in that data over time into their DataOps ecosystem. – Joe Gaska, CEO of GRAX

Role played by AI and ML will expand as identity intelligence comes to the forefront. As we reach a tipping point in the future of authentication, users are increasingly security-aware when it comes to protecting their digital identities online. Identity verification will become increasingly contextual, and AI will play an expanding role to determine the dynamic risk of access which a rule-based system simply cannot provide. Supervised and unsupervised deep learning, reinforcement learning, and genetic algorithms will not just apply pre-defined inference models but will also allow security solutions to adapt to changing enterprise behavior and learn from other companies as they encounter and mitigate threats. Combating deep fakes with in-built algorithms, deriving value from big data and driving decision-making through powerful analytics will play a key role in identity intelligence. – Rajesh Ganesan, Vice President, ManageEngine (division of Zoho Corp.)

There will be more consolidation in the ML platform space. As AI became the “it” technology over the last few years, a raft of AI infrastructure companies popped up and began peddling AI platforms to ease the task of building models for companies looking to leverage AI. While it sounds good on the surface, there is no identified business task being solved here; it’s simply more efficient use of technology, and that’s hard to sell. It’s likely that the VCs who backed these plays will begin severing the cash lifelines in 2021. – Lexalytics CEO and Chief Scientist Paul Barba

The Time is Now to Adopt Responsible Machine Learning

The era in which tech companies had a regulatory “free ride” has come to an end. Data use is no longer a “wild west” in which anything goes; there are legal and reputational consequences for using data improperly. Responsible Machine Learning (ML) is a movement to make AI systems accountable for the results they produce. Responsible ML includes explainable AI (systems that can explain why a decision was made), human-centered machine learning, regulatory compliance, ethics, interpretability, fairness, and building secure AI. Until now, corporate adoption of responsible ML has been lukewarm and reactive at best. In the next year, privacy regulation (such as GDPR and CCPA), antitrust scrutiny, and other legal pressures will push companies to adopt responsible ML practices. – Rachel Roumeliotis, VP of AI and Data Content at O’Reilly Media
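
One widely available building block for the explainability piece is model-agnostic feature importance. A minimal sketch using scikit-learn’s permutation importance on a stand-in model and synthetic data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Synthetic stand-ins; in practice this would be a production model and data
X, y = make_classification(n_samples=500, n_features=6, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)

# Permutation importance: shuffle one feature at a time and measure how much
# the score drops -- a model-agnostic way to see which inputs drive decisions.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for i, score in enumerate(result.importances_mean):
    print(f"feature_{i}: {score:.3f}")
```

This alone does not make a system fair or compliant, but it gives business users a first answer to the question of why the model decided what it did.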

Robotics

With the need to keep people out of close quarters persisting into the new year, we will naturally see significant investment in automation. However, for maybe the first time, robotics will be taking on the mundane, simple human tasks as opposed to the more difficult and strategic ones. We have seen robots assist humans in many complicated applications, such as robots trained to perform the most precise microsurgeries. Robots will now start to take on tasks that let essential workers who previously needed to be on site work remotely. With more investment in augmented and virtual reality, for example, we will see robot security guards controlled by remote workers roaming office and factory floors, and remote operators piloting drones to pick and pack boxes in a warehouse. In 2021, the revolution will be roboticized. – Ahson Ahmad, Chief Product and Customer Officer, Ripcord

Security

Deepfakes will become a significant threat to business integrity. COVID-19 has forced in-person communication to go virtual, which means businesses are relying on video conferencing to conduct meetings more than ever before. While the notion of deepfakes may not be new, they are getting increasingly sophisticated and are becoming remarkably easy to generate. Take ThisPersonDoesNotExist.com, for example, which leverages AI to create completely believable images of people that don’t exist in real life. If this process can be conducted with relatively little information, then certainly hackers can leverage work profiles used for video conferencing technology — which have employees’ names and pictures automatically associated with them — to create convincing fakes. – James Carder, Chief Security Officer for LogRhythm

Prediction: As Fraud Detection Becomes Harder, ML Fraud Models Will Strengthen But Use More Recent Datasets: To determine fraud risk, companies typically train their machine learning (ML) models on a dataset of past transactions they believe will be representative of the future. However, the huge impact of COVID-19 on consumer data and behavior has created a disconnect: past data is no longer representative of the future. This has led many organizations to use either underfit models that score well on historical data but don’t catch new fraud patterns, or overfit models that create a lot of surprises, such as flooded manual review queues or more chargebacks and fraud. Many companies have also shifted from ML to rules-based models and manual reviews that rely more on human intuition. In 2021, companies will be able to leverage their understanding of these new behavioral patterns to start building stronger ML models again. To be successful, however, they will need to use more recent data, rebuild models incrementally as new patterns emerge, and assess their progress as they go. – Arjun Kakkar, Vice President of Strategy & Operations at Ekata
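
A minimal sketch of that retraining discipline, assuming a hypothetical transaction log and feature columns, is a time-ordered split that trains only on post-shift data and validates on the most recent window:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical transaction log with a timestamp and a fraud label
df = pd.read_csv("transactions.csv", parse_dates=["timestamp"])

# Train on post-shift behavior rather than the full multi-year history
recent = df[df["timestamp"] >= "2020-06-01"].sort_values("timestamp")

# Time-ordered split: validate on the newest 20% to assess progress as you go
cut = int(len(recent) * 0.8)
train, test = recent.iloc[:cut], recent.iloc[cut:]

features = ["amount", "velocity_24h", "new_account"]  # assumed columns
model = GradientBoostingClassifier().fit(train[features], train["is_fraud"])
print("holdout accuracy:", model.score(test[features], test["is_fraud"]))
```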

Artificial intelligence has created new security threats, the greatest of which may be deepfakes. Deepfakes are fake audio, video, or images that rely on artificial intelligence to mimic reality. In the wrong hands they can have serious consequences, such as deepfake fraud. While we haven’t seen many of these attacks yet, in 2019 fraudsters used deepfake audio to steal over $200,000 from a UK-based energy company. And with remote work environments giving fraudsters more ammunition to carry out their attacks, 2021 will be the year that real-time deepfake audio becomes a practical threat, and businesses will have to remain vigilant to ensure they don’t get scammed. Businesses should be wary of any suspicious phone calls, and never send money or share sensitive information without verifying that a caller is who they claim to be. Additionally, setting up basic cybersecurity tools and protocols can prevent fraudsters from gaining access to the sensitive information they need to create deepfake images and audio in the first place. Cybersecurity researchers are working on tools to detect deepfake content, but until then, companies will need to rely on their intuition and existing cybersecurity tools to make sure they don’t get duped. – Terry Nelms, PhD, Sr. Director of Research, Pindrop

Fueled by the influx of data breaches and the perceived exploitation of personal data by Big Tech, consumer data privacy will continue to be a huge focus in 2021 and beyond, and we can expect to see more legislation introduced that protects consumer rights and fines businesses for the irresponsible usage of data. To cultivate trust and improve the customer experience in an increasingly competitive business landscape, more organizations will give consumers ownership and control of their personal data in the coming years. By combining ethical, compliant and privacy-preserving principles with technology infrastructure built to scale for the future, society will move towards a system where the value of data benefits individuals and enterprises alike. – James Kingston, VP of Research and Innovation Partnerships at Dataswift, AI researcher, and Director of the HAT-LAB.

Data security governance is a required and critical building block for threat mitigation. Until recently, most data governance programs have focused on data flows and analytics without thinking much about security. New data privacy laws and regulations have forced data stakeholders such as the CDO, CFO, CISO, and DPO to make data security one of the necessary building blocks of their data governance efforts. But data security governance is complex, as no single vendor product can implement all of the required controls. In 2021, as businesses continue to collect and process more and more data, they will have to figure out how to quickly unify their information, so their entire organization is drawing from the same trusted and secure well. Next, businesses need to implement and manage their data sources through a data protection system with the necessary privacy controls in place, so data threats are mitigated. These steps will ensure future business and financial risks are minimized. – Anne Hardy, CISO of Talend

AI will be Key to Bolstering Security in a Remote World. Security is top-of-mind for any organization’s C-suite that has embarked on a digital transformation journey, but its importance has only been accelerated by the pandemic. With so many endpoints scattered across the world as employees have the flexibility to work remotely from wherever they choose, vulnerabilities multiply. A major trend we will see in 2021 and beyond is the application of AI to security measures, because humans alone cannot monitor, control and check each endpoint to adequately or efficiently protect a modern enterprise. If security leaders (especially those at Fortune 500 companies) don’t make the time and financial investment to enhance security with AI now, they can expect to be targeted by hackers in the future and scramble to protect their data. – Scott Boettcher, VP, Enterprise Information Management, NTT DATA Services
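
One common shape this takes is unsupervised anomaly detection over endpoint telemetry, which needs no labeled attack data. A minimal sketch using scikit-learn’s IsolationForest on simulated telemetry (the feature set is invented for the example):

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Simulated per-endpoint telemetry: [logins_per_hour, mb_sent_out, failed_auths]
rng = np.random.default_rng(0)
normal = rng.normal(loc=[5, 50, 1], scale=[2, 20, 1], size=(500, 3))
suspicious = np.array([[40, 900, 12]])  # exfiltration-like outlier

# Fit on normal behavior only; no labeled attacks are required, which matters
# when endpoints number in the thousands and threats are novel.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)
print(detector.predict(suspicious))  # -1 flags an anomaly
```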

Secure AI will usher in new privacy regulations across geographies, governments, and industries: In the coming year, we can expect to see more data privacy standards and regulations emerge, both by industry and by geography. We’re already beginning to see more interest from governments in how enterprises are operating within their borders so they can better preserve the privacy of their constituents. At the industry level, insurance, healthcare, and financial services will continue to see highly regulated mandates. Because of the pandemic, we’ve witnessed the world move to a new reality of digital purchasing and procurement. Although it won’t be at the size, scale, or completeness of an industry like financial services, I foresee businesses in online retail and online service procurement quickly stepping into a much more regulated space than they have experienced historically. – Rick Farnell, President and CEO, Protegrity

Storage

Legacy NAS is Dead for AI. With the introduction of PCIe Gen4, I/O rates have now completely broken away from the evolution of CPU cores. Legacy NFS providers are stuck with single-stream TCP that is rate-limited by the capability of a single CPU core on the application server. PCIe Gen4 will double the peak I/O performance of applications in 2021, while single-core CPU performance will see no comparable doubling. There is no greater concentration of single-host I/O than in the AI market, for applications such as machine learning and deep learning. To resolve this, customers will seek solutions that support multi-threading, RDMA, and the ability to bypass CPUs altogether, as is the case with NVIDIA’s GPUDirect Storage. The demand to keep GPUs and AI processors fed and efficient will dramatically outstrip the I/O capabilities of legacy TCP-based NAS, leading customers to walk away from legacy NAS altogether in 2021. – Renen Hallak, Founder and CEO of VAST Data

Object storage shatters the myth that it’s only used for archive. Although object storage is best known as a backup and archive storage solution, three trends will expand that perception in 2021. First, flash-based object storage will gain favor in data analytics workloads that also have high capacity requirements. Second, S3-compatible storage will simplify Kubernetes deployments, making it a logical choice for modern applications. Third, cloud-native applications will increasingly be deployed on-premises, driving the need for on-prem S3-compatible storage to enhance application portability. As a result, more organizations will use object storage to support compute-heavy use cases, such as AI, ML and data analytics, shattering the “cheap and deep” myth once and for all. – Jon Toor, CMO for Cloudian
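
The portability argument is concrete: the standard S3 client only needs to be pointed at a different endpoint. A minimal boto3 sketch, where the endpoint address, bucket name, and credentials are placeholders:

```python
import boto3

# Point the standard S3 client at an on-prem S3-compatible store; the rest of
# the application code is identical to what would run against public cloud S3.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.internal.example.com",  # placeholder
    aws_access_key_id="ACCESS_KEY",          # placeholder credentials
    aws_secret_access_key="SECRET_KEY",
)
s3.upload_file("training_batch.parquet", "ml-datasets",
               "batches/training_batch.parquet")
```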

Organizations are now collecting massive amounts of machine learning and IoT data. If your company depends on collecting and analyzing data to operate and succeed, what happens if that data is not fully backed up and easily recoverable? Most companies are thinking mainly about data analysis and much less about data backup or security. But as data increasingly moves from analysis to production environments, that’s when protection becomes critical. Cutting-edge storage tools increasingly rely on AI and machine learning to automate the data backup process. Given the exploding size of enterprise data, these intelligent tools will become vital for maintaining an efficient backup process that can quickly and effortlessly react to changing requirements while saving untold hours on manual backups. – Shridar Subramanian, CMO of StorageCraft

Verticals

The potential for AI to improve supply chain processes has been an area of focus for companies for at least five years, but after the disruptions caused by COVID-19, many supply chain analysts and enterprises have turned their attention to AI as a possible solution to their woes. 67% of enterprises invested in some technology solution to help them weather the pandemic, and 60% of industrial enterprises are looking to AI specifically. However, AI models are fueled by data: the accuracy, scope, and capabilities of a model depend entirely on the training data behind it, and that data must be organized and labeled in a machine-readable format before an AI program can digest it. Before they embrace AI, enterprises must leverage modern integration technology to automatically compile data from interactions with their ecosystem of suppliers, partners, traders and customers in a format that is structured to fuel AI models. – John Thielens, CTO at Cleo
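
As a small illustration of that structuring step, nested interaction events can be flattened into one machine-readable table. The event records and field names below are invented for the example:

```python
import pandas as pd

# Invented raw interaction events from across the partner ecosystem
events = [
    {"partner": "supplier-a", "type": "shipment",
     "detail": {"sku": "X1", "qty": 120, "late": True}},
    {"partner": "carrier-b", "type": "delivery",
     "detail": {"sku": "X1", "qty": 120, "late": False}},
]

# Flatten nested JSON into a single structured table -- the kind of
# shape an AI model can actually be trained on
df = pd.json_normalize(events)
df.columns = [c.replace("detail.", "") for c in df.columns]
print(df)
```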

Throughout COVID-19, telcos have responded with urgency, rapidly extending network capacity and high-speed connectivity to aid governments and health care systems and to enable remote working and virtual classrooms. But the unrelenting pressure will take a toll in 2021 by accelerating several trends that were gaining ground even before the current crisis began, pushing telcos to expedite their digital transformations so that they can survive and thrive. Artificial intelligence will be table stakes, helping telcos deliver superior performance in both the short and long term by allowing them to better cope with fluctuating demand levels, leverage data-driven insights to adjust to disruptions, and adapt to sharp shifts in consumer priorities. – Hriday Ravindranath, Chief Technology and Information Officer at global telco BT
