New Advancements in GenAI Are Warding Off an AI Winter

Small language models, diversified revenue streams and algorithm advancements will see GenAI continue to grow in the coming months. 

Businesses across industries have already embraced the potential of AI, but many are now grappling with the task of delivering AI-powered solutions that have a tangible impact and deliver a high return on investment (ROI).

According to Isabel Al-Dhahir, Principal Analyst at GlobalData, a leading provider of AI-powered market intelligence, delivering on AI is not a straightforward endeavour. However, advancements in AI algorithms, continued diversification of revenue streams, and the rise of small language models (SLMs) will all see AI, and particularly generative AI (GenAI), continue its growth through Q4 and into 2025.

SLMs, diverse revenue streams and more efficient algorithms 

There are three key drivers for GenAI’s continued growth, the first of which is the rising prominence of SLMs, typically defined as models with fewer than 10 billion parameters. Compared with large language models (LLMs), SLMs are cheaper and more energy-efficient to train and deploy.

Further to this, as the risks and impacts of larger models become more widely known, SLMs could prove to be a more practical alternative for enterprises as they can be designed for domain-specific functions. They are also safer as they can be operated locally, thus reducing the risk of data breaches.
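To make the local-deployment point concrete, here is a minimal sketch of running a small open model entirely on local hardware, assuming the Hugging Face transformers library and PyTorch are installed; microsoft/phi-2 (roughly 2.7 billion parameters) is used purely as an example checkpoint, and the prompt is illustrative, not tied to any vendor's product.

```python
# Minimal sketch: running a small language model locally, so prompts and
# outputs never leave the machine. Assumes: pip install transformers torch.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "microsoft/phi-2"  # any open sub-10B-parameter checkpoint would work
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "Summarize the key clauses in this supplier contract:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because inference happens on the organization's own hardware, sensitive data never has to be sent to a third-party API, which is the data-breach advantage described above.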

Next is the diversification of revenue streams. AI vendors are monetizing the technology through various channels such as licensing, data-as-a-service (DaaS), and AI-as-a-service (AIaaS). By delivering specific AI solutions for different customers, AI vendors will continue to offer an attractive proposition for a broad range of industries.  

Lastly, algorithmic advances such as compression, pruning, and quantization are producing comparable outputs with lower compute requirements. This means that less advanced hardware could potentially be employed, thus democratizing access to AI and mitigating the impact of compute scarcity.
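As a rough illustration of one of these techniques, the sketch below applies post-training dynamic quantization in PyTorch to a toy fully connected network standing in for a real model; the layer sizes are arbitrary, and the example is a generic demonstration of the idea rather than any vendor's actual pipeline.

```python
# Minimal sketch of post-training dynamic quantization in PyTorch: weights of
# the linear layers are stored as 8-bit integers instead of 32-bit floats,
# shrinking the model and reducing compute at inference time.
import io
import torch
import torch.nn as nn

# A toy stand-in for a much larger transformer-style network.
model = nn.Sequential(
    nn.Linear(4096, 4096),
    nn.ReLU(),
    nn.Linear(4096, 4096),
)

quantized = torch.ao.quantization.quantize_dynamic(
    model,            # model to quantize
    {nn.Linear},      # layer types to convert
    dtype=torch.qint8,  # 8-bit integer weights
)

def size_mb(m: nn.Module) -> float:
    """Serialize the model to an in-memory buffer and report its size in MB."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes / 1e6

print(f"fp32 model: {size_mb(model):.1f} MB")
print(f"int8 model: {size_mb(quantized):.1f} MB")  # roughly 4x smaller
```

The same trained model produces near-identical outputs at a fraction of the memory footprint, which is why such techniques let less advanced hardware serve workloads that previously demanded high-end GPUs.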

Low enterprise uptake, unclear path to profitability and compute power limitations 

There are, however, still several challenges that could limit GenAI’s growth. Beyond basic use cases, enterprises are now demanding explainability, domain-specific knowledge, high and deterministic accuracy, and predictable savings and costs for integrated AI tools, which today’s general-purpose models cannot deliver. This is where the popularity of SLMs will likely surge, as they can be tailored to an organization’s own needs.

Elsewhere, a recurring challenge is the cost of implementing AI at scale and bringing projects from pilot to production, which can become extremely expensive due to hardware and cloud hosting costs.

Vendors are also burning through billions of dollars on training and inference for their AI models and are in fierce competition with each other. Following Meta’s release of its open-access Llama 3, competition has only intensified, with user pricing subsequently decreasing. It remains to be seen whether this is sustainable, and it is rumored that OpenAI will make a $5 billion loss in 2024.

Lastly, compute power is increasingly scarce due to the limited availability of GPUs. In practice, this means that only well-funded organizations will be able to afford high-performance computing, leaving startups behind and potentially stalling innovation.

Staying ahead of the GenAI curve 

Despite these challenges, most vendors are already ahead of the curve and successfully pivoting their efforts towards SLMs and more advanced technologies. Meanwhile, the energy requirements, security risks, and validity concerns around larger models have raised questions at the enterprise adoption level.

Now and into next year, GenAI and its array of potential capabilities and benefits remain a core focus for organizations across industries, even if many are still at an early stage of adoption. With refocused attention and innovative thinking, generative AI will safely avoid being left out in the cold.
