Trending: How Energy Efficient LLMs Could Boost GenAI Use in eCommerce


UC Santa Cruz researchers have devised a method to significantly reduce the energy costs of running large language models.

It’s a development that could significantly impact the use of artificial intelligence (AI) in eCommerce. By slashing power consumption, their approach may make advanced AI capabilities more accessible and affordable for businesses of all sizes.

“We got the same performance at way less cost — all we had to do was fundamentally change how neural networks work,” Jason Eshraghian, an assistant professor of electrical and computer engineering at UC Santa Cruz’s Baskin School of Engineering and the study’s lead author, said in a Thursday (June 20) news release. “Then we took it a step further and built custom hardware.”

The Cost of AI in eCommerce

Currently, running advanced AI models like ChatGPT comes with a hefty price tag. Recent estimates suggest OpenAI pays nearly $700,000 per day in energy costs alone. Those costs get passed along in pricing and could create a significant barrier for smaller businesses looking to leverage AI in their eCommerce operations.

The UC Santa Cruz team’s research aims to address the high energy costs associated with running advanced AI models. By eliminating matrix multiplication, the most computationally expensive element of running large language models, they were able to make the model more energy efficient.

“Neural networks, in a way, are glorified matrix multiplication machines,” Eshraghian said. “The larger your matrix, the more things your neural network can learn.”
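To see why this single operation dominates the cost, consider a minimal Python sketch of one dense layer. The sizes here are hypothetical, chosen only for illustration and not taken from the study:

```python
import numpy as np

# Illustrative only: a single dense layer is essentially one matrix multiplication.
# Real LLM layers use matrices with thousands of rows and columns, stacked dozens deep.
hidden_size = 4096
token_activations = np.random.randn(1, hidden_size)   # activations for one token
weights = np.random.randn(hidden_size, hidden_size)   # learned parameters

# The core operation: each output value is a dot product of length hidden_size,
# so one layer alone costs hidden_size * hidden_size multiply-adds per token.
output = token_activations @ weights
print(output.shape)  # (1, 4096)
```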

The researchers claim their approach is strikingly efficient.

“We were able to power a billion-parameter-scale language model on just 13 watts, about equal to the energy of powering a lightbulb and more than 50 times more efficient than typical hardware,” Eshraghian said.

This level of efficiency could enable eCommerce platforms to offer advanced AI-driven features like personalized recommendations, chatbots and dynamic pricing at a fraction of the current cost.

Implications for Mobile eCommerce

The team’s innovation also has significant implications for mobile eCommerce. Rui-Jie Zhu, the paper’s first author and a graduate student in Eshraghian’s group, noted in the news release: “We replaced the expensive operation with cheaper operations.”
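The article does not spell out what those cheaper operations are, but the team's published paper describes restricting weights to the values -1, 0 and +1, so each multiplication collapses into an addition, a subtraction or a skip. The sketch below is a rough Python illustration of that idea, not the team's actual implementation; the function name and toy sizes are invented for this example:

```python
import numpy as np

def ternary_matvec(ternary_weights, activations):
    """Compute W @ x using only additions and subtractions.

    ternary_weights: 2D array whose entries are only -1, 0, or +1.
    activations: 1D array of input values.
    """
    out = np.zeros(ternary_weights.shape[0])
    for i, row in enumerate(ternary_weights):
        # +1 entries add the activation, -1 entries subtract it, 0 entries skip it.
        out[i] = activations[row == 1].sum() - activations[row == -1].sum()
    return out

rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(8, 8))   # toy ternary weight matrix
x = rng.standard_normal(8)

# Same result as a standard matrix multiplication, but with no multiplies.
assert np.allclose(ternary_matvec(W, x), W @ x)
```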

The reduction in computational complexity achieved by the UC Santa Cruz team could enable full-scale AI models to run directly on smartphones. This advancement comes at a time when mobile shopping is rapidly growing.

If implemented, this technology could significantly enhance mobile shopping experiences and app-based eCommerce by allowing more sophisticated AI-driven features like personalized recommendations and advanced search capabilities to run directly on users’ devices.

Building on their software advancements, the team extended their research by collaborating with other UC Santa Cruz faculty to develop custom hardware. This specialized hardware was designed to maximize the efficiency gains of their new approach.

“These numbers are already really solid, but it is very easy to make them much better,” Eshraghian said. “If we’re able to do this within 13 watts, just imagine what we could do with a whole data center worth of compute power. We’ve got all these resources, but let’s use them effectively.”

For eCommerce giants with vast data centers, this could mean significant cost savings and improved AI capabilities. For smaller businesses, it could level the playing field, allowing them to compete with more sophisticated AI-driven strategies.

As PYMNTS previously reported, Big Tech companies like Microsoft and Google are struggling to profitably monetize their generative AI products due to high production, development and training costs.

As the eCommerce industry continues to evolve, innovations like this could reshape how businesses interact with customers, manage inventory and make strategic decisions. While the technology is still in its early stages, its potential to democratize advanced AI capabilities in the eCommerce sector is profound.

The researchers have open-sourced their model, potentially accelerating adoption and further innovation in the field. As Eshraghian puts it, “We’ve fundamentally changed how neural networks work.” The eCommerce world will be watching closely to see how this change translates into real-world applications and competitive advantages in the digital marketplace.
