James Henderson

Evaluating the enterprise process of manufacturing intelligence

The ability to acquire and apply knowledge should not be underestimated in business – it is a precious commodity, vital to corporate well-being.

On the wafer-thin line between surviving and thriving sits information – building know-how, shaping strategy and driving execution. Whether good information or bad, that’s the stock-standard enterprise play.

But now, a new process is on the table. A process designed to surgically extract new levels of value from organisations in a controlled and scalable manner.

“We’re moving from computation and calculation – which has been around for 60 years – to cognition,” stated Michael Dell, Chairman and CEO of Dell Technologies.

Michael Dell (Dell Technologies)

In entering the age of artificial intelligence (AI), the enterprise is embarking on an era in which knowledge is power.

Arguably, most already understand such a progression – the why was outlined many years ago and the direction of corporate travel is clear. The problem has always been the how.

“Every company at its foundation is intelligence – fundamentally every company is an intelligence manufacturer,” outlined Jensen Huang, Founder and CEO of Nvidia.

“We now have the ability to manufacture intelligence. The last Industrial Revolution was the manufacturing of software; previously, it was manufacturing electricity – now we are manufacturing intelligence.”

Huang joined Dell during the opening keynote of Dell Technologies World 2024, hosted in Las Vegas amid the backdrop of heightened AI market interest.

“I get asked all the time, how big is AI?” Dell shared. “Like there’s some super secret answer that only I know. Sorry, I don’t know and I don’t think anybody really knows the answer to this question.

“What is the demand for intelligence? Is there a limit to the demand for intelligence? What is the appropriate investment in infrastructure to meet this potentially limitless demand? I can tell you this, the early movers are making massive bets.”

Curiosity moves into cognitive deployments, fuelled by data

As the company’s figurehead, Dell has observed a sharp escalation in enterprise interest in AI, notably during the most recent quarters, to the point that customer conversations are now “dominated” by the topic.

“70-80% of the questions are about AI, and that’s unprompted from customers,” Dell confirmed.

At this stage of maturity, however, current dialogue tends to produce more questions than answers.

  • What does it mean?
  • How does it work?
  • What do we do first?
  • What are the tools?
  • What are you doing?
  • What are the use cases?

“There’s a lot of interesting curiosity in AI,” Dell summarised. “The reason for that is pretty simple – there’s just an enormous opportunity to make businesses more productive, more efficient and reimagine what they do.

“Of our top 200 customers, 15-20% now rank AI as the number one priority in their business, which is driven from the board at the top of the organisation.”

Beyond the leading pack of “aggressive” companies setting the pace, a “big portion” of the market is still attempting to make sense of the use cases and first steps required, followed by a lagging tail of “really slow” organisations.

“But think about it,” Dell questioned. “Whatever it is that you do in your company, are you making that better by using technology? We used to call that digital transformation – and I suppose we still call it that – but now we have these AI tools.

“Imagine if the top 2000 companies in the world adopted this idea? That’s a lot of productivity, efficiency and speed. That creates better companies and stronger companies would be unleashed.”

In response, Dell as a business is centring all GenAI initiatives around the core fundamentals of engineering, product quality, support and services, security and supply chain. This is a vendor throwing the kitchen sink at becoming “faster, stronger and smarter” to best compete in “both a sprint and a marathon”.

Jensen Huang (Nvidia) and Michael Dell (Dell Technologies)

“Because we’ve been doing this with hundreds of the largest companies, we have a playbook on how to do this on a repeated basis,” Dell added.

Dell AI Factory is the playbook now positioned to market.

Rolled out to offer customers access to an AI portfolio from device to data centre to cloud, the suite of solutions is supported by an open ecosystem of technology partners creating AI applications at the top of the stack.

Collaboration with Hugging Face will provide on-premises deployment of GenAI models, while a continued partnership with Meta will simplify deployment of Meta Llama 3 models, providing test results, performance data and deployment recipes.

Alignment with Microsoft Azure AI Services will speed deployment of AI services, such as speech transcription and translation capabilities.
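To make the on-premises idea concrete, the sketch below shows what local inference with a Hugging Face-hosted Llama 3 model might look like. It is a minimal illustration rather than part of Dell’s validated designs: the model id, prompt and generation settings are assumptions, the gated model requires accepting Meta’s licence on Hugging Face, and local GPU capacity (plus the accelerate package) is presumed.

```python
# Minimal sketch: running a Hugging Face-hosted Llama 3 model on local,
# on-premises hardware. Model id, prompt and settings are illustrative only.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # gated model; licence acceptance required
    device_map="auto",  # spread weights across whatever local GPUs are available
)

result = generator(
    "Summarise the key benefits of running inference close to the data:",
    max_new_tokens=120,
    do_sample=False,
)
print(result[0]["generated_text"])
```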

“The idea here is that you start with data, because if you don’t have any data then you don’t have any AI,” Dell explained. “This is an area we know a lot about because no company in the world has provided more storage capacity than Dell.”

Citing market research, Dell advocated against migrating all data to a centralised environment such as the public cloud, noting that large language models can be 75% more cost effective running on-premises instead. So much so that 83% of enterprise CIOs are expected to repatriate select workloads from the public cloud in 2024, according to Enterprise Strategy Group.

“We’ve learned this over time,” Dell shared. “You get locked in and as your data grows and you want to put it to work, it gets harder and more expensive.”

According to Dell, the on-premises shift has been driven by two key factors – inference and data gravity.

“You want to bring AI to your data, not the other way around,” he advised. “This should be hosted on accelerated and optimised AI infrastructure. Then through our ecosystem of partners, build AI factories with your data, performance services, costs and security under your control.”

Simply put, a data centre of the future – built and designed from the ground up with an AI-first mindset.

“Data has been and is at the centre of everything,” Dell reiterated. “It’s the rocket fuel so go where the data is. In the AI Factory, data is the fuel. What are you creating at the end of the factory line? You’re creating intelligence.”

After data comes modernised infrastructure delivering enhanced compute power across storage, servers and networking, supported by new innovations in the GPU space.

“Put all that together and then add services,” Dell said. “Then access a broad ecosystem of software, closed and open LLMs, small and large LLMs, PCs and edge devices. Put all that together and it’s an AI Factory.”

Building on the how of bringing AI to life in the enterprise, Huang acknowledged that the entire computing stack has been overhauled. Instead of humans engineering applications and writing programs, those same people will now assemble AI teams built on LLMs.

“These business applications of the future can run on-premises or in the cloud and organisations will have to build state-of-the-art AI factories,” Huang detailed.

Under the banner of Dell AI Factory with Nvidia, the expanded partnership now includes new advancements across servers, edge, workstations, solutions and services.

The full-stack offering – which includes computing, networking and software – is designed to drive the copilots, coding assistants, virtual customer service agents and industrial digital twins of the digital enterprise.

Michael Dell (Dell Technologies)

“This partnership is going to be the first and largest GenAI go-to-market in history,” Huang claimed. “Building AI factories and delivering it to the enterprise as a solution.”

From a market standpoint, the opportunity is seismic.

“Immediately, there’s a trillion dollars worth of data centres in the world that were created for the last generation of IT – that’s going to be completely modernised to this new form,” Huang predicted.

“On top of that, a whole new class of data centres will be designed for one purpose – the purpose of manufacturing intelligence at scale. We call it AI factories.”

Enterprise use cases emerge, externally and internally

The path to the promised land of AI is scattered with enterprise use cases and examples of best practice – notably from ServiceNow, which also shared the opening keynote at the conference, attended by more than 13,000 executives from end-user and channel partner organisations.

“Every workflow in every enterprise, in every industry, in every corner of the world will be reinvented with GenAI,” forecasted Bill McDermott, CEO of ServiceNow.

“But it’s a mess out there, ladies and gentlemen. The enterprise is a mess. When you have people swivel chairing among 17 applications a day, no wonder they don’t want to come back to the office.”

The ambition for McDermott is to propel ServiceNow to “defining enterprise software company” status – or, as Huang referenced it, “the AI operating system for the enterprise”.

Powering such efforts is Dell infrastructure, upon which the vendor’s cloud platform runs 24/7.

“This is the reliability gold standard in the enterprise, and that’s a fact,” McDermott claimed. “That compute platform powering our cloud has enabled us to execute beautifully and maintain a near 99% retention rate, which is unheard of.

“We use Dell servers to train our LLMs and we’re putting AI to work for people, which is really what it’s all about. We’re now live with GenAI solutions on Dell compute and are achieving 48% acceleration to innovate, with two times greater deflection rates on both the employee experience and the customer experience with conversational self-service.”

McDermott was on the same stage in Las Vegas days prior, headlining ServiceNow Knowledge 2024.

“I was also joined by Jensen then and one of the things that he said – and I think it’s appropriate – is when a train is moving by really fast, it’s like you can’t even believe how quick it looks,” he shared. “But when you get on the train, it doesn’t seem like it’s moving that fast at all. Think about it.”

Bill McDermott (ServiceNow) and Michael Dell (Dell Technologies)

Internally speaking, Dell as an organisation has identified primary use cases in which “distinct value” can be created for customers, in addition to improved internal efficiencies. This spans content delivery and content creation on the customer service side of the business, as well as via sales and marketing functions, plus supply chain.

“Firstly, we had to simplify, standardise and automate,” explained Jeff Clarke, Vice Chairman and COO of Dell. “If you don’t have simplified processes that have been standardised and automated consistently across your enterprise, then applying AI isn’t a very efficient method to fix broken processes or ways in business.”

Such a discovery came early in the process of applying AI internally across the wider company. For Clarke, this was the first natural checkpoint, laying the foundation upon which insights and knowledge can be extracted from data.

“Take services as an example,” he shared. “We have used our telemetry data and the amount of contacts that come into the company each year to improve our methodologies and responses to customers.

“We use an approach called ‘best known actions’ and ‘best known methods’ which is machine learning and AI driven. We are taking information from customers and are solving the types of problems that our customers encounter.”

The aim? To ensure every service technician within the organisation becomes more efficient and effective.

“Every interaction with our company now has the knowledge of the entire install base,” Clarke added. “We can now see improvements in accuracy and the amount of time it takes to solve a customer issue.

“Then, ultimately, we can arrive at a point of turning that into long-term predictive maintenance and care while also feeding that information back into our product organisation to make our products better.”

In building a closed loop, the business can now hear directly from the field, triage a specific customer problem and take a long-term improvement view to maximise the benefits of this knowledge.
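For illustration only, the sketch below imagines such a “best known actions” loop in miniature: incoming telemetry is matched against previously successful resolutions, and confirmed fixes are fed back into the knowledge base. A plain dictionary lookup stands in for the machine-learning-driven matching Clarke describes, and all field names, error codes and actions are hypothetical.

```python
# Hypothetical sketch of a "best known actions" feedback loop. A dictionary
# stands in for the ML-driven matching described in the article; every field
# name and error code here is invented for illustration.

# Knowledge base of resolutions mined from past support cases.
best_known_actions = {
    ("E1001", "storage"): "Replace the drive flagged by SMART telemetry",
    ("E2040", "thermal"): "Update fan firmware and re-run thermal calibration",
}

def recommend(event: dict) -> str:
    """Match an incoming telemetry event to a previously successful fix."""
    key = (event["error_code"], event["component"])
    return best_known_actions.get(key, "No known action - escalate for triage")

def record_resolution(event: dict, action: str) -> None:
    """Close the loop: feed a confirmed fix back into the knowledge base."""
    best_known_actions[(event["error_code"], event["component"])] = action

# A field event arrives, gets a recommendation, and its eventual fix is
# captured so every future interaction benefits from the install base.
event = {"error_code": "E3310", "component": "memory", "asset": "server-042"}
print(recommend(event))   # not yet known, so it escalates
record_resolution(event, "Reseat DIMM in slot A2 and update BIOS")
print(recommend(event))   # now returned as a best known action
```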

“Content creation is another one,” Clarke highlighted. “Our company creates a lot of information and we’ve been able to curate a new methodology and process to cut the amount of time that it takes us to deliver new content by 90% – with more than 90% fewer touches.”

This change was triggered by a commitment to simplify and standardise the process, dovetailed with automation and the application of intelligence to improve the quality of content deliverables. Whether for internal or external use, a consistent desire to improve content performance was also a strong factor in the transformation process.

“One of the biggest benefits has been that we can give more time to our most valuable resources,” Clarke noted.

“Our developers are spending more time developing, writing code and using new technologies and techniques to provide us with productivity gains. Likewise, we’re freeing up more selling time on the commercial side of the business.”

In closing, Clarke simplified the proposition: “factories take raw materials – in this case, data – and turn those materials into something useful – in this case, insights.”
