
Is Liquid Cooling the Key Now that AI Pervades Everything?

30 September 2025 at 13:13
B. Valle

Summary Bullets:

• Data center cooling has become an increasingly difficult challenge as AI accelerators consume massive amounts of power.

• Liquid cooling adoption is progressively evolving from experimental to mainstream, starting with AI labs and hyperscalers, then moving into the colocation space, and later to enterprises.

As Generative AI (GenAI) takes an ever-stronger hold in our lives, the demands on data centers continue to grow. The heat generated by the high-density computing required to run AI applications that are more resource-intensive than ever is pushing companies to adopt ever more innovative cooling techniques. As a result, liquid cooling, which used to be a fairly experimental technique, is becoming more mainstream.

Eye-watering amounts of money continue to pour into data center investment to run AI workloads, and heat management has become top of mind due to the high rack densities being deployed. GlobalData forecasts that AI revenue worldwide will reach $165 billion in 2025, marking annual growth of 26% over the previous year. The growth rate will accelerate to 34% in 2026 and continue to climb in subsequent years; in fact, the CAGR over the forecast period will reach 37%.


GlobalData AI revenue forecast – Source: GlobalData

The powerful hardware designed for AI workloads is growing in density. Although average rack densities are usually below 10 kW, AI training clusters of 200 kW per rack are feasible in the not-too-distant future. Of course, the average number of kW per rack varies a lot depending on the application, with traditional IT workloads for mainstream business applications requiring far fewer kW per rack than frontier AI workloads.

Liquid cooling is a heat management technique that uses liquid to remove heat from computing components in data centers. Liquid has a much higher thermal conductivity than air as it can absorb and transfer heat more effectively. By bringing a liquid coolant into direct contact with heat-generating components like CPUs and GPUs, liquid cooling systems can remove heat at its source, maintaining stable operating temperatures.
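
To make the thermal argument concrete, the back-of-the-envelope sketch below compares how much air versus water would have to flow through a rack to carry away the same heat. It uses textbook fluid properties and a hypothetical 200 kW rack with a 10°C coolant temperature rise; the figures are illustrative, not vendor specifications.

```python
# Back-of-the-envelope comparison of air vs. water cooling capacity.
# Heat removed: Q = m_dot * c_p * dT (mass flow x specific heat x temperature rise).
# Fluid properties are textbook values; the 200 kW rack is hypothetical.

RACK_HEAT_W = 200_000   # hypothetical 200 kW AI training rack
DELTA_T_K = 10.0        # allowed coolant temperature rise

fluids = {
    # name:   (c_p [J/(kg*K)], density [kg/m^3])
    "air":    (1005.0, 1.2),
    "water":  (4186.0, 997.0),
}

for name, (cp, rho) in fluids.items():
    mass_flow = RACK_HEAT_W / (cp * DELTA_T_K)  # kg/s needed to absorb the heat
    vol_flow_l_s = mass_flow / rho * 1000       # liters per second
    print(f"{name:>5}: {mass_flow:7.1f} kg/s = {vol_flow_l_s:9.1f} L/s")
```

Run with these values, the sketch shows air needing thousands of liters per second against only a few liters per second of water, which is why direct liquid contact at the heat source scales where air cannot.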

Although there are many diverse types of liquid cooling techniques, direct-to-chip, also known as “cold plate,” is the most popular method, accounting for approximately half of the liquid cooling market. This technique uses a cold plate mounted directly on the chip inside the server; the direct contact enhances heat transfer efficiency and enables heat to be removed at its source. The method also allows high-end, specialized servers to be installed in standard IT cabinets, similar to legacy air-cooled equipment.

There are innovative variations on the cold plate technique currently under experimentation. Microsoft is prototyping a method that takes the direct-to-chip technique one step further by bringing liquid coolant directly inside the silicon where the heat is generated. The method applies microfluidics via tiny channels etched into the silicon chip, creating grooves that allow cooling liquid to flow directly across the chip and remove heat more efficiently.

Swiss startup Corintis is behind the novel technique, which blends the electronics and the heat management system, two elements that have historically been designed and manufactured separately, creating unnecessary obstacles when heat has to propagate through multiple materials. Corintis created a design that integrates the electronics and the cooling from the beginning, so the microchannels sit right underneath the transistors.

Technology Leaders Can Leverage TBM to Play a More Strategic Role in Aligning Tech Spend with Business Value

19 September 2025 at 12:44
S. Soh

Summary Bullets:

  • Organizations are spending more on technology across business functions, and it is imperative for them to understand and optimize their tech spending through technology business management (TBM).
  • IBM is a key TBM vendor helping organizations to drive their IT strategy more effectively; it is making moves to extend the solution to more customers and partners.

Every company is a tech company. While this is a cliché, especially in the tech industry, it is becoming real in the era of data and AI. For some time, businesses have been gathering data and analyzing it for insights to improve processes and develop new business models. By feeding data into AI engines, enterprises accelerate transformation by automating processes and reducing human intervention. The result is less friction in customer engagement, more agile operations, smarter decision-making, and faster time to market. These, at least on paper, are the promises of AI.

However, enterprises face challenges as they modernize their tech stack, adopt more digital solutions, and move AI from trials to production. Visibility into tech spending, and the ability to forecast costs, especially with many services consumed on a pay-as-you-go basis, is a challenge. While FinOps addresses cloud spend, a more holistic view of technology spend is necessary, covering legacy on-premises systems, GenAI costs (pricing is typically token-based), and labor-related costs.

This has made the concept of TBM more crucial today than ever. TBM is a discipline that focuses on enhancing business outcomes by providing organizations with a systematic approach to translating technology investments into business value. It brings financial discipline and transparency to IT expenditures with the aim of maximizing the contribution of technology to overall business success. Technology is now widely used across business functions, such as enterprise resource planning (ERP) for finance, human capital management (HCM) for HR, customer relationship management (CRM) for sales, and supply chain management (SCM) for operations. Based on GlobalData’s research, about half of tech spend today already comes from budgets outside the IT department. TBM is becoming more crucial as the use of technology grows even more pervasive across the organization, especially with AI being embedded into workflows. Moreover, TBM capabilities also help to elevate tech leaders within an organization into strategic business partners.

IBM is one of the vendors that offer a comprehensive set of solutions to support TBM, in part enabled by acquisitions such as Apptio (which itself had acquired Cloudability and Targetprocess) and Kubecost. Cloudability underpins IBM’s FinOps and cloud cost management, a key component already seeing great demand due to the need to optimize cloud workloads and spend as companies continue to expand their cloud usage. Apptio offers IT financial management (ITFM), which helps enterprises gain visibility into their tech spend (including SaaS, cloud, on-premises systems, labor, etc.) as well as usage and performance by app or team. This enables real-time decision-making, facilitates the assessment of IT investments against KPIs, makes it possible to shift IT budget from keeping the lights on to innovation, and supports showback/chargeback to promote fairness and efficient usage of resources. With Targetprocess, IBM also has a strategic portfolio management (SPM) solution that helps organizations plan, track, and prioritize work, from the strategic portfolio of projects and products down to the software development team. The ability to track work delivered by teams and determine the cost per unit of work allows organizations to improve time-to-market and align talent spend to strategic priorities.
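
As a rough illustration of the showback and cost-per-unit-of-work idea, here is a minimal sketch. The cost pools, team names, usage shares, and work units are invented for the example and do not reflect Apptio’s actual allocation model.

```python
# Minimal showback sketch: allocate shared IT cost pools to teams in
# proportion to measured usage, then compute a cost per unit of work.
# All figures and names are illustrative.

monthly_costs = {"cloud": 120_000.0, "on_prem": 45_000.0, "labor": 200_000.0}

usage_share = {        # fraction of each cost pool consumed by each team
    "payments":  {"cloud": 0.50, "on_prem": 0.20, "labor": 0.40},
    "mobile":    {"cloud": 0.30, "on_prem": 0.10, "labor": 0.35},
    "analytics": {"cloud": 0.20, "on_prem": 0.70, "labor": 0.25},
}

work_delivered = {"payments": 480, "mobile": 350, "analytics": 120}  # e.g., story points

for team, shares in usage_share.items():
    allocated = sum(monthly_costs[pool] * frac for pool, frac in shares.items())
    unit_cost = allocated / work_delivered[team]
    print(f"{team:>10}: showback ${allocated:>10,.0f} -> ${unit_cost:,.0f} per unit of work")
```

Even this toy version shows why the discipline matters: two teams with similar budgets can have very different costs per unit of work once shared pools are allocated by actual usage.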

Besides IBM, ServiceNow’s SPM helps organizations make better decisions about which initiatives to pursue based on resources, people, budgets, and so on. ServiceWare is another firm that offers cloud cost management, ITFM, and a digital value model for TBM. Other FinOps and ITSM vendors may also join the fray as market awareness grows.

Moreover, TBM should not be a practice reserved for the largest enterprises; rather, its relevance depends on the level of tech spending involved. While IBM/Apptio serves many enterprises (e.g., 60% of Global Fortune 100 companies) with tech spend well over $100 million, other vendors (e.g., MagicOrange and Nicus) have more cost-effective solutions targeting mid-sized enterprises. IBM is now addressing this customer segment with a streamlined IBM Apptio Essentials suite, announced in June 2025, which offers the fundamental building blocks of an ITFM practice that can be implemented quickly and more cost-effectively. Based on GlobalData’s ICT Client Prospector database, in the US alone there are over 5,000 businesses with total spend exceeding $25 million, which expands the addressable market for IBM.

For service providers, TBM is also a powerful avenue for deeper engagement with enterprises, delivering a solution that drives tangible business outcomes. Personas interested in TBM include CIOs, CFOs, and CTOs. While TBM tools and dashboards are readily available, service providers can play a role in managing the stakeholders and designing the processes. Through working with multiple enterprise customers, service providers are also building experience and best practices that help deliver value faster and avoid potential pitfalls. Service providers such as Deloitte and Wipro already offer TBM to enterprise customers. Others should also consider working with TBM vendors to develop a similar practice.

IBM Think on Tour Singapore 2025: An Agentic Enterprise Comes Down to Tech, Infrastructure, Orchestration, and Optionality

28 August 2025 at 17:30
D. Kehoe

Summary Bullets:

• Cloud will have a role in the AI journey, but it is no longer the destination. The world will be hybrid and multi-vendor.

• Agentic AI manifests from this new platform but will be a double-edged sword. Autonomy is proportionate to risk. Any solution that goes to production needs governance.

The AI triathlon is underway. A year ago, the race was about the size of the GenAI large language model (LLM). Today, it is about the number of AI agents connecting to internal systems to automate workflows, and it is moving toward the overall level of preparedness for the agentic enterprise. The latter is about giving AI agents much higher levels of autonomy to set their own goals, self-learn, and make decisions, and possibly to manage agents from other vendors, in ways that impact customers directly (e.g., approving home loans, dispute resolution, etc.). This, in turn, influences NPS, C-SAT, customer advocacy, compliance, and countless other metrics. It also raises many legitimate legal, ethical, and regulatory concerns.

Blending Tech with Flexible Architectures

While AI in many of its current forms is nascent, getting things right often starts with placing the right bets. The IBM vision, as articulated, aligns tightly with the trends on the ground: broadly, automation, AI, hybrid and multi-cloud environments, and data. Not every customer will follow the same flight path, but multiple options are key in the era of disaggregation.

In February 2025, IBM completed its acquisition of HashiCorp, a company that foresaw public cloud and on-prem integration challenges years ago and invested early in developer tools, automation, and infrastructure as code. Contextualized to today’s language models, enterprises will continue to have different needs. While public cloud will likely be the ideal environment for model training, inferencing or fine-tuning may work better at the edge. Hybrid is the way, and automation is the glue. GlobalData’s CXO research shows that AI is accelerating edge infrastructure, not cloud, with considerations such as performance, security, compliance, and cost causing the pendulum to swing back.

Watsonx Orchestrate

The acquisition of Red Hat six years ago helped solidify the open-source approach in the IBM DNA, and this is even more relevant for AI now. Openness also translates to middleware, and one of the standouts of the event is the ‘headless architecture’ approach with Watsonx. Decoupling the frontend UI/UX from the backend databases and business logic shifts the focus away from the number of agents and toward how well autonomous tasks and actions are synchronized in a multi-vendor environment. Traditional vendors have a rich history of integration challenges, so an open platform approach that works across many of the established application environments and other frameworks is the most viable option. In this context, IBM shared examples ranging from a global SaaS provider using Watsonx to support its own global orchestration roll-out, to direct selling to MNCs with large installed bases of competing solutions, to partners who bring their own (BYO) agents. IBM likely wants to be seen as having the most open platform, rather than the best technology in a tightly coupled stack.
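
A minimal sketch of the ‘headless’ idea follows: a thin orchestrator sequencing tasks across agents from different vendors behind one neutral interface, with no frontend involved. All class and function names here are illustrative stand-ins, not the watsonx Orchestrate API.

```python
# Sketch of headless, vendor-neutral agent orchestration: each vendor's
# agent is wrapped in one common contract, and the orchestrator owns the
# sequencing and hand-offs. Names are illustrative, not any vendor's API.

from abc import ABC, abstractmethod

class AgentAdapter(ABC):
    """Vendor-neutral wrapper around an agent from any vendor."""
    @abstractmethod
    def run(self, task: str) -> str: ...

class CrmVendorAgent(AgentAdapter):
    def run(self, task: str) -> str:
        return f"[crm-agent] handled: {task}"

class ItsmVendorAgent(AgentAdapter):
    def run(self, task: str) -> str:
        return f"[itsm-agent] handled: {task}"

def orchestrate(workflow: list[tuple[str, AgentAdapter]]) -> list[str]:
    # The orchestrator synchronizes tasks across vendors; no UI is assumed.
    results = []
    for task, agent in workflow:
        results.append(agent.run(task))
    return results

print(orchestrate([
    ("update customer record", CrmVendorAgent()),
    ("open change ticket", ItsmVendorAgent()),
]))
```

The design point is that the adapter contract, not any single vendor’s agent framework, is what makes multi-vendor synchronization tractable.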

The Opportunity

Agentic AI’s great potential is a double-edged sword. Autonomy is proportionate to risk, and risk can only be managed with governance. This can include guardrails (e.g., ethics) and process controls (e.g., explainability, monitoring and observability, etc.). Employees will need varying levels of accountability and oversight too. While IBM is a technology company with its own products and infrastructure, it also has its own consulting arm with 160,000 staff globally. Most competitors will lean toward the partner-led approach; for IBM, both options are on the table, whichever path a customer takes. This is important for balancing risk with technology evolution. Still, very few AI proofs of concept ever make it to production, and great concepts will require the extra consulting muscle, especially through multi-disciplinary teams, to show business value. Claims of internal capability need to walk a tightrope with vendor agnosticism to keep both camps motivated and the markets confident.

GPT-5 Has Had a Rocky Start but Remains an Extraordinary Achievement

15 August 2025 at 12:05
B. Valle

Summary Bullets:

  • OpenAI released GPT-5 on August 7, 2025, a multimodal large language model (LLM) with agentic capabilities.
  • This is the latest iteration of the famous chatbot, and the most important upgrade since the release of the previous generation, GPT-4, in 2023.

As sometimes happens when a product is thrust with such force into the realm of popular culture, the release of GPT-5 sparked a veritable PR crisis, leading CEO Sam Altman to make a public apology and backtrack on the decision to remove access to all previous AI models in ChatGPT. Unlike enterprise customers, who received advance warning of such moves, consumer ChatGPT users did not know their preferred models would disappear so suddenly. The ensuing kerfuffle highlighted the strangely co-dependent relationship that some people have developed with the technology, creating no end of background noise surrounding this momentous release.

In truth, OpenAI handled this launch rather clumsily. But GPT-5 remains an extraordinary achievement, in terms of writing, research, analysis, coding, and problem-solving capabilities. The bête noire of generative AI (GenAI), hallucination, has been addressed (to a limited degree, of course), and GPT-5 is significantly less likely to hallucinate than previous generations, according to OpenAI. With web search enabled on anonymized prompts representative of ChatGPT production traffic, GPT-5’s responses are around 45% less likely to contain a factual error than GPT-4o. The startup claims that across several benchmarks, GPT-5 shows a sharp drop in hallucinations, about six times fewer than o3.

However, safety remains a concern. OpenAI has a patchy record in this area: Altman famously lobbied against California Senate Bill 1047 (SB 1047), which aimed to hold AI developers liable for catastrophic harm caused by their models if appropriate safety measures weren’t taken. In 2024, members of OpenAI’s safety team quit after voicing concerns about the company’s record in this area.

Meanwhile, there has been talk in industry circles and trade media outlets of artificial general intelligence (AGI) and GPT-5’s position in this regard. However, the AI landscape remains so dynamic that this is missing the point. Google’s announcement on August 5, 2025 (in limited research preview) of Google DeepMind’s Genie 3 frontier world models, which help users train AI agents in simulation environments, positions the company against AI behemoth Nvidia in the realm of world AI. World AI in this context means technologies that integrate so-called “world models,” i.e., simulations of how the world works from a physics, causality, or behavior perspective. It could be argued that this is where true AGI resides: in real-world representations and in the trenches of the simulation realm.

On the other hand, Google’s latest salvo in the enterprise space has involved a fierce onslaught of partnerships, with several deals announced in the last 48 hours. Oracle will sell Google Gemini models via Oracle’s cloud computing services and business applications through Google’s developer platform Vertex AI, an important step to boost its capillarity in corporate accounts. With Wipro, Google Cloud is going to launch 200 agentic AI solutions in different verticals that are production-ready and accessible via Google Cloud Marketplace. And with NTT Data, Google is launching industry-specific cloud and AI solutions, with joint go-to-market investments to support this important launch.

The AI market is advancing at rapid speed, including applications of agentic AI in enterprise environments. This includes a variety of AI-driven applications and platforms that are transforming business processes and interactions. The release of GPT-5 is simply another tool in this direction.

The Season of Agentic AI Brings Bold Promises

31 July 2025 at 16:59
C. Dunlap – Research Director

Summary Bullets:

  • Spring/summer platform conferences led with AI agent news and strategies.
  • AI agents represent the leading innovation of app modernization, but DevOps should be wary of over-promising.

During this season of cloud platform conferences, rivals are vying to own the headlines and do battle in the cloud wars through their latest campaigns and strategies involving AI agents.

2024’s spring/summer conferences led with GenAI innovations; 2025’s led with agentic AI. AI assistants and copilots have transformed into tools used to create customized agents, unleashing claims of new capabilities for streamlining integrations with workflows, speeding the application development lifecycle, and supporting multi-agent orchestration and management. Vendors are making bold promises about agentic AI’s ability to eliminate a multitude of tasks once performed by humans and to take workflow automation to new heights.

AI agents, which can autonomously complete tasks on behalf of users leveraging data from sources external to the AI model, are accelerating the transition towards a more disruptive phase of GenAI. Enhanced memory capabilities enable the AI agents to develop a greater sense of context, including the capacity for “planning.” Agents can connect to other systems through APIs, taking actions rather than just returning information or generating content.
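
The sketch below shows that act-observe loop in miniature: the agent decides on an action, the runtime executes it against an external system, and the observation is fed back until the task is done. The hard-coded ‘policy’ and the weather tool are toy stand-ins for an LLM and a real external API.

```python
# Minimal agent loop: choose an action, execute it via a "tool" (an API
# call), feed the observation back, repeat until done. Everything here is
# a toy stand-in; a real agent would ask an LLM to pick the next action.

def get_weather(city: str) -> str:
    """Stand-in for a call to an external API."""
    return f"18C and cloudy in {city}"

TOOLS = {"get_weather": get_weather}

def toy_policy(goal, history):
    """Return the next (tool, argument) action, or None when finished."""
    if not history:                      # nothing observed yet: act once
        return ("get_weather", "Brussels")
    return None                          # enough context: stop acting

def run_agent(goal: str) -> str:
    history = []
    while (step := toy_policy(goal, history)) is not None:
        tool_name, arg = step
        observation = TOOLS[tool_name](arg)   # take an action, not just text
        history.append(observation)
    return f"goal={goal!r}; observations={history}"

print(run_agent("brief me on travel conditions"))
```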

Recap of the latest AI agent events:

  • Amazon announced Bedrock AgentCore, a set of DevOps tools and services to help developers design custom applications while easing the deployment and operation of enterprise-grade AI agents. The tools are complemented with new observability features found in AWS CloudWatch.
  • Joining the Google Gemini family of products, including Gemini 2.5 and Pro, Vertex AI Agent, ADK, and Agentspace, is Google Veo 3, a GenAI model providing more accessibility to high quality video production.
  • OpenAI released ChatGPT agent, an AI system infused with agentic capabilities that can operate a computer, browse the web, write code, use a terminal, write reports, create images, edit spreadsheets, and create slides for users.
  • Anthropic released Claude Code, which uses agentic search to understand an entire codebase without manual context selection and is optimized for code understanding and generation with Claude Opus 4.
  • IBM announced watsonx Orchestrate AI Agent, a suite of agent capabilities that include development tools to build agents on any framework, pre-built agents, and integration with platform partners including Oracle, AWS, Microsoft, and Salesforce.

Cloud platform providers are strategically highlighting their most salient strengths. These range from the breadth of their cloud stack offerings to mature serverless computing solutions to access to massive developer communities via popular Copilot tools and marketplaces. Yet all are focused on gaining mind share amid heated campaigns from not only traditional platform rivals but also an increasingly crowded ecosystem of new platform and digital services providers (in the form of infrastructure providers) vying to catch the enterprise developer’s attention.

Recent vendor announcements aim to strike a chord with over-taxed enterprise IT operations teams, with claims of easing the operational provisioning complexities involved in moving modern apps into production. Use cases supporting these claims remain scarce, and details to substantiate the new streamlined and low-code methods, particularly around AI agent orchestration, are still vague in some cases. Enterprises should remain vigilant in seeking out technology partners with a deep understanding of an evolving technology that comes with a lot of promises.

Carriers Grow Traffic Significantly While Also Delivering Energy Efficiency

10 July 2025 at 12:25
R. Pritchard

Summary Bullets:

  • Comcast has nearly doubled the energy efficiency of its network ahead of its 2030 target while also carrying 76% more data.
  • Other examples of greater energy efficiency through new technology include BT Global Fabric, where the replacement of legacy platforms will see a 79% energy consumption reduction.

Comcast announced that it is near to reaching its goal of doubling its network energy efficiency ahead of its 2030 target, stating that it is “delivering dramatically more data at faster speeds and greater reliability at the highest quality for our customers, all while conserving the amount of energy needed to power our network.”

Comcast reported that it achieved an 11% reduction in energy usage between 2019 and 2024 while carrying 76% more traffic over the same period, as all customer segments use their connections for applications and services needing higher bandwidths, ranging from streaming video to unified communications. As a result, the energy savings combined with network growth have delivered a 49% reduction in electricity per consumer byte since 2019 (from 18.4 kWh [kilowatt-hours] per terabyte to 9.3 kWh in 2024). Like many others, Comcast has noted both the increase in data driven by the artificial intelligence (AI) revolution and AI’s potential to optimize network performance, including enhanced monitoring and network diagnostics.
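
The two sets of figures reconcile: 11% less energy spread over 76% more traffic works out to roughly half the energy per byte, which matches the cited kWh-per-terabyte numbers. A quick check:

```python
# Sanity check of Comcast's reported efficiency figures.

energy_ratio = 1 - 0.11        # 2024 energy use relative to 2019
traffic_ratio = 1 + 0.76       # 2024 traffic relative to 2019

per_byte_ratio = energy_ratio / traffic_ratio
print(f"energy per byte vs. 2019: {per_byte_ratio:.3f} "
      f"(~{(1 - per_byte_ratio) * 100:.0f}% reduction)")

# Cross-check against the kWh-per-terabyte figures cited in the article:
print(f"kWh/TB check: {9.3 / 18.4:.3f} -> ~{(1 - 9.3 / 18.4) * 100:.0f}% reduction")
```

Both routes land at roughly a 49% per-byte reduction, consistent with the company’s claim.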

The other trend driving improved sustainability and efficiency in networks is the latest generation of equipment, with decommissioned legacy technology having been far less efficient. GlobalData analysis has found that replacing copper lines with fiber can be up to 85% more efficient, and power-saving measures using AI can lead to energy savings of up to 40%.

Another notable example is BT’s move to the BT Global Fabric Network-as-a-Service (NaaS) platform, which replaces multiple previous technology platforms and will result in a 79% energy consumption reduction. These technology developments and evolutions are all helping to keep telecoms service providers – national and international – in the vanguard of reducing greenhouse gas (GHG) emissions. Given recent flash floods in Texas (US) and wildfires across Europe and Canada, alongside further destructive climate change impacts on society and nature, these examples of progress should be celebrated and encouraged.

Advancing AI 2025 Event: AMD Heeds the AI Opportunity

30 June 2025 at 14:26
B. Valle

Summary Bullets:

• AMD’s “Advancing AI 2025” event, held in San Jose, California (US) in June 2025, helped analysts delve deeper into the company’s strategy for the next few years.

• The chip designer aims to build a fully open ecosystem and stack, supported by a string of acquisitions, including Silo AI and Brium.

AMD has continued to execute on its annual roadmap cadence since launching the AMD Instinct MI300 GPUs in late 2023. The launch of the AMD Instinct MI350 series, with a fourfold jump in performance compared with the previous generation, was a highlight of the conference. As AI agents become more prevalent, compute requirements will grow, driving exponential demand for infrastructure. AMD also focused on its software roadmap and highlighted the importance of an open ecosystem, something the company has invested in through acquisitions.

The chip designer announced the launch of the AMD Instinct MI350 series GPUs, the fourth generation within the AMD Instinct family, and the forthcoming rack servers based on these chips, slated for availability in late 2025. The company also previewed the AMD Instinct MI400 processors, due in 2026, which will run on AMD’s Helios rack, pitted against Nvidia’s Vera Rubin.

AI is moving beyond the data center to intelligent devices at the edge and PCs. AMD expects to see AI deployed in every single device, running on different architectures. From a portfolio standpoint, the company offers a suite of computing elements spanning GPUs, DPUs, CPUs, NICs, FPGAs, and adaptive SoCs. Its strategy is based on delivering a broad portfolio of compute engines so customers can match the right compute to the right use case, and on investing in an open, developer-first ecosystem that supports every major framework, library, and model. The chip designer believes that an open ecosystem is central to the future of AI and claims to be the only company committed to openness across hardware, software, and solutions.

Openness shouldn’t be just a buzzword because it will be critical to scale adoption of AI over the coming years. AMD has invested heavily both organically and through acquisitions to promote its open software stack; in the last year, it made 25 strategic investments in this area, including the Finnish company Silo AI, and more recently, Brium. Other acquisitions across the entire AI value chain include ZT Systems, Pensando, Lamini, Enosemi, and Xilinx. However, there are always risks associated with inorganic growth that the company needs to actively address.

However powerful AMD’s hardware may be, a common criticism in the industry is that the software cannot match Nvidia’s CUDA platform. AMD has pinpointed software as a key AI enabler and therefore a crucial focus, shaping its M&A plans. The ROCm 7 software stack is designed to broaden the coverage of AI models by accelerating the pace of updates and to foster a developer-first mentality, with integration with open-source frameworks top of mind. This lends capillarity to the AMD hardware and makes it easier to scale.

The company highlighted that demand for compute from inference workloads will soon equal that from model training, although training will remain the foundation for developing AI systems. As AI undertakes complex tasks like reasoning, driving demand for more compute, inference will soon become the majority of the market. AMD is positioning inferencing as a crucial differentiator, with a focus on “tokens-per-dollar” as a metric.
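
As a simple illustration of the metric, the sketch below computes tokens-per-dollar from serving throughput and hourly cost. The accelerator names and figures are hypothetical, not AMD or Nvidia benchmarks.

```python
# Tokens-per-dollar: how many output tokens a dollar of serving time buys.
# Throughput and cost figures below are made up for illustration only.

def tokens_per_dollar(tokens_per_second: float, cost_per_hour: float) -> float:
    return tokens_per_second * 3600 / cost_per_hour

accelerators = {               # (tokens/s per server, $/hour) - hypothetical
    "accelerator_a": (12_000, 14.0),
    "accelerator_b": (9_500, 9.0),
}

for name, (tps, cost) in accelerators.items():
    print(f"{name}: {tokens_per_dollar(tps, cost):,.0f} tokens per dollar")
```

The point of the metric is visible even in toy numbers: a slower but cheaper part can win on tokens-per-dollar, which is exactly the economic argument inference vendors want buyers to weigh.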

Looking ahead, the chip designer believes there is further opportunity in an environment where customers have not invested enough in the refresh cycle of the last couple of years. However, with the industry still relatively immature in AI, it is difficult to predict how successful the agentic AI experiment will be. Many enterprises remain in the PoC phase with lots of projects still in their infancy, and it is difficult to project the real size of the opportunity within this market. For a deeper analysis of the event, please read GlobalData’s report Advancing AI 2025: AMD Announces MI350 GPUs and Targets the Inference Opportunity, June 30, 2025.

B2B Advertising Campaigns Underline Importance of SMB Market

8 August 2024 at 11:45
R. Pritchard

Summary Bullets:

• BT, Orange, and Vodafone have launched significant multi-channel advertising campaigns targeted at the small and medium-sized business (SMB) market, underlining its importance to future growth.

• They all emphasize the evolving role of the service provider beyond connectivity, with a focus on security and digital business-enabling solutions that characterize the future of enterprise.

Traditionally, telecom companies have focused mass media advertising campaigns on the consumer market. But now major European service providers are advertising to the enterprise market, focusing on smaller businesses (e.g., SMBs). This reflects both their strategic shift toward SMBs as the segment offering the best potential for revenue growth, and the fact that their portfolios of technology solutions have become far more relevant to running businesses of all sizes – the market has moved beyond connectivity.

BT Business, Orange Business, and Vodafone Business (there is a trend here in the naming conventions) have all been using adverts across TV, online, and other digital media to promote their business solutions. All their adverts cover similar messages but are also distinctive and memorable.

The BT Business campaign is based on the line ‘We’ve Got Your Back.’ It aims to “showcase BT’s support for every type of business, from the person just starting out at their kitchen table, to major multinationals and critical public services.” The focus of the campaign is to recognize that every business today is a digital business. The goal for BT is to position itself not merely as a supplier, but as a partner offering digital solutions and security alongside reliable connectivity:


BT Business campaign – screengrab

Orange Business aims to underline “the importance of network and digital integrators in the digital ecosystem.” The campaign illustrates the challenges of interconnected components in a complex digital landscape as well as underlines Orange Business’ ability to help across technology areas such as AI, IoT, connectivity, cloud, data, and cybersecurity, promoting the concept that ‘it works better when we work together.’ The campaign avoids the dullness often associated with technology by taking a comedic angle across its adverts, with the goal of making businesses think ‘maybe they should have asked Orange Business?’


Orange Business campaign – screengrab

Vodafone Business’ campaign is based on the strapline ‘Your Business Can’ (echoing Vodafone Group’s ‘Together We Can’ strapline) and is aimed at SMBs that can benefit from digital tools to help boost productivity and security. It also looks to help move the perception of Vodafone as ‘just’ a mobile company to support its strategic push into the broader business market with solutions such as cybersecurity, collaboration tools, and connectivity products.


Vodafone Business campaign – screengrab

Although adverts are often seen as ‘fluffy,’ these three campaigns absolutely underline the seriousness with which these major service providers are focused on the SMB market. Telecom companies globally have realized that the SMB market provides the best opportunity for selling additional services beyond connectivity, with the goal of adding revenues from value-added solutions, leveraging their resources to differentiate against price-focused competitors, and cementing longer-term, stronger relationships with customers. Adverts might just be seen as ‘a bit of fun,’ but this is serious stuff.

Colt and Proximus’s Wholesale API Trial Highlights the Importance of Industry Standards Bodies in Achieving Automation

30 July 2024 at 19:47
Gary Barton – Analyst, Business Network and IT Services

Summary Bullets:
• Colt and Proximus partner for a proof of concept (PoC) focusing on API collaboration, based on the MEF’s Lifecycle Service Orchestration (LSO) Sonata standard and designed to enhance carrier-to-carrier automation.
• GlobalData expects wholesale providers to embrace API technology to enhance and automate inter-network functionality with the end target of delivering automated end-to-end services across multi-vendor networks.

Colt Technology Services and Proximus recently announced they have collaborated to trial a new PoC designed to enhance carrier-to-carrier automation. The collaboration was the first of its kind between the two providers and focused on provisioning network services between the UK and Belgium. The crucial aspect of the trial is that it was able to support Proximus’ Wholesale E-Access service and Colt’s On Demand Network-as-a-Service (NaaS) platform across both networks on an end-to-end basis.

The trial was based on the MEF’s LSO Sonata APIs, which are a crucial part of the industry standards organization’s effort to promote and enable carrier-to-carrier automation. The MEF has become one of the most important players in the wholesale scene in driving the levels of automation that are crucial for telecoms service providers to increase the efficiency of their networks and to provide the level of automation demanded by wholesale (and, to a lesser extent, enterprise) customers.

The partnership is in response to a growing awareness across the telecoms sector that for API-driven automation to work it needs to achieve scale. To do that, cooperation is necessary between a critical mass of leading industry providers. This approach has already generated success for the GSMA Open Gateway’s APIs on the mobile side. Demos such as this by Colt and Proximus are fundamental to achieving similar momentum on the fixed side. The MEF can highlight the fact that carriers who have adopted this API framework also include Verizon, Orange, Telia, Zayo, PCCW, and Lumen. It now has further evidence, albeit on a relatively small scale, of the efficacy of its APIs.
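
To give a flavor of what buyer-side automation in this spirit looks like, here is a hedged sketch of a serviceability check in the qualification/quote/order flow that LSO Sonata standardizes. The base URL, endpoint path, and JSON fields below are illustrative placeholders, not the normative MEF schema.

```python
# Sketch of a buyer-side carrier-to-carrier automation call, in the spirit
# of MEF LSO Sonata's product offering qualification step. The URL and all
# field names are hypothetical placeholders, not the MEF specification.

import requests

SELLER_API = "https://api.example-carrier.net/sonata"   # hypothetical seller

def check_serviceability(site_address: str, bandwidth_mbps: int) -> dict:
    """Ask a partner carrier whether a service can be delivered off-net."""
    payload = {
        "product": "ethernet-access",
        "site": {"address": site_address},
        "bandwidth": {"value": bandwidth_mbps, "unit": "Mbps"},
    }
    resp = requests.post(f"{SELLER_API}/productOfferingQualification",
                         json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    result = check_serviceability("Rue de la Loi 1, Brussels", 1000)
    print(result.get("qualificationResult"))   # e.g., serviceable or not
```

The value of a common standard is that this same call shape works against any compliant seller, which is what turns one-off integrations into an ecosystem.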

Proximus’ Wholesale E-Access service is a core component of its efforts to deliver a digitalized, end-to-end Network-as-a-Service (NaaS) proposition. Longer term, it is seeking the ability to add extra features such as universal CPE (uCPE) and virtual network functions (VNFs). Colt has offered on-demand services including Ethernet and internet for some time, and it too is adding a growing set of functions to its NaaS portfolio. Building these capabilities demands significant investment, and true value will not be achieved until NaaS functionality is available off-net through a zero-touch ordering process.

Full global enablement of cross-provider automation APIs will take multiple such PoCs. Indeed, this isn’t the first API partnership for Colt. It was one of the first to partner with AT&T to give AT&T’s business customers the ability to place automated orders for Colt’s Ethernet services. AT&T customers were also able to validate site addresses, check service availability, get a quote, and place automated orders on Colt’s network, which cut ordering times from days or weeks to minutes.

GlobalData expects the adoption of APIs to continue to grow in 2024. Telecommunications wholesale carriers are by no means the only players in the API space, but they will continue to play a leading role in the charge toward API ecosystems as part of the search for new ways to monetize their networks. These APIs will also pave the way for increased AI functionality within and across networks.

Vodafone Business Germany’s Sustainability Tool Underlines Urgency Of EU Regulation Compliance

22 July 2024 at 09:11
R. Pritchard

Summary Bullets:

• Vodafone Business Germany’s partnership with Envoria offers CSRD-compliant ESG reporting for enterprises as service providers continue to shift toward offering technology-enabled business solutions.

• A growing range of ESG reporting portals is being made available to enterprises by telecoms service providers to meet evolving legislation and customer demand.

Vodafone Business Germany has teamed with Envoria, a sustainability reporting specialist, to help its customers meet the environmental, social, and governance (ESG) reporting requirements of the European Union Corporate Sustainability Reporting Directive (CSRD). This requires sustainability reporting by large companies and listed SMBs. Some non-EU companies also must report if they generate over EUR150 million in the European Union market. The companies must apply the rules for the first time in the 2024 financial year for reports published in 2025.

The new offering, ESG Navigator, aims to act as an all-in-one solution for collecting and analyzing ESG data. According to Vodafone, “companies can use the tool as a central collection point for all relevant data, bringing together sources from different locations and systems for processing and analysis.”

The tool lets customers create and export reports, including visualizations that comply with the CSRD’s predefined templates. Alongside the tool, Vodafone Germany is also offering workshops, training courses, and demos to help customers’ relevant employees to use the software correctly.

The scope of regulation for telecoms service providers and their customers continues to grow, both because technology plays an ever greater role in all business and personal aspects of life and because the technology sector as a whole is a major source of greenhouse gases. The latter has recently been highlighted by the impact of generative AI (GenAI) on data center usage, with both Microsoft and Google admitting that heavy cloud computing usage has set back their net-zero targets. They remain committed to reducing their emissions, but they will have to be very smart and innovative to meet their goals. This also affects enterprises along the supply chain as part of Scope 3 emissions targets, which are tougher to meet than Scopes 1 and 2, where companies have greater direct control.

Failure to comply with the EU’s goals, or with goals set by corporations themselves, will potentially attract fines from the EU (and other) regulators. Other potential negative impacts include reputational damage and the loss of a growing number of customers and consumers who choose only to deal with enterprises they perceive to be responsible. This is particularly true in Germany: even though the Green Party’s vote share in the recent European Parliament election fell to 12%, it is indicative of substantial environmental concern in the country.

Offering solutions like these also makes commercial sense for telecom services providers targeting the enterprise market. As noted in GlobalData’s recent quarterly SMB Watch report, there is a continued movement away from selling connectivity products to selling services that have tangible positive business impacts. These are largely focused on productivity and security, but increasingly include sustainability – and they will embrace GenAI over time too.

AT&T Under Fire After Disclosing Massive Data Breach

19 July 2024 at 17:19
Amy Larsen DeCarlo – Principal Analyst, Security and Data Center Services

Summary Bullets:

• AT&T divulged that the call and text records of 109 million cellular customers had been unlawfully downloaded from a third-party cloud provider’s environment.

• Wired magazine reports AT&T paid $370,000 to hackers to delete the records, which included cell site data. While the hacker provided a video of the deletion, there is no way to prove the threat actors don’t have a copy of the records.

AT&T is feeling the heat after admitting that the call and text records of 109 million wireless customers had been illegally downloaded from third-party provider Snowflake’s cloud. The records, which include the incoming and outgoing phone numbers and cell site locations that these communications were relayed through, covered a more than six-month time span in 2022 and a single day in January 2023.

In a Securities and Exchange Commission (SEC) filing this month, AT&T disclosed an internal investigation discovered the theft in April 2024. At the Department of Justice’s request, AT&T delayed a public disclosure so the agency could investigate. At least one person, a US citizen, was arrested in Turkey. The Federal Communications Commission is also probing the breach.

Wired magazine reports that AT&T paid a hacking group $370,000 in cryptocurrency to delete the records. While the bad actors provided a video showing the data deletion, there is no way to prove that the cyber criminals don’t have other copies of the records.

The theft involves call and text records of almost all of AT&T’s cellular customers as well as customers of mobile virtual network operators (MVNOs) including Cricket and Boost. While the data doesn’t include personally identifying information such as names or social security numbers, the scale of the breach and the inclusion of communicating phone numbers and location data present a damning picture of its severity.

Security and intelligence experts are sounding the alarm on how valuable this information would be to bad actors and espionage agencies. The identities of individual customers can be linked to the phone numbers contained in the metadata via public records. Adding cell site data provides the kind of information intelligence agencies and other entities seek in order to map individuals’ communications and movements.

This metadata can be used for several different applications, including discerning the connection between phone numbers through network mapping, geofencing analysis for targeted advertising, behavioral pattern recognition to establish travel patterns, fraud, and cold-case resolution. Intelligence agencies around the world have tapped into these types of records for surveillance purposes.
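
A toy example makes the sensitivity concrete: even a handful of anonymized records yields a who-contacted-whom graph and a rough movement trace. The records below are fabricated for illustration.

```python
# Toy illustration of why call metadata is sensitive: without any names,
# a contact graph and site history fall out of the records almost for free.
# All records below are fabricated.

from collections import defaultdict

call_records = [                      # (caller, callee, caller's cell site)
    ("555-0101", "555-0202", "site_A"),
    ("555-0101", "555-0303", "site_A"),
    ("555-0202", "555-0303", "site_B"),
    ("555-0404", "555-0101", "site_C"),
]

contacts = defaultdict(set)
sites = defaultdict(set)
for caller, callee, site in call_records:
    contacts[caller].add(callee)      # edges of the who-contacted-whom graph
    contacts[callee].add(caller)
    sites[caller].add(site)           # repeated sites sketch a movement pattern

for number in sorted(contacts):
    print(f"{number}: talks to {sorted(contacts[number])}, seen at {sorted(sites[number])}")
```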

This is not AT&T’s first major security incident this year. In March 2024, AT&T disclosed that the passwords of 7.6 million customers were stolen. That theft occurred in 2019. AT&T never clarified why it took so long to notify its customers of that breach.

Big questions loom about the lack of security protections for such high-value and high-volume data. Why did it take so long for AT&T to identify that breach? What actions is the company taking to ensure that customer data is protected in the future?

Turkcell’s Initiatives Will Play a Central Role in Turkey’s Ambitions as a Regional Data Center Hub

18 July 2024 at 10:46
I. Patel

Summary Bullets:

• News of Turkcell’s $27 billion investment over its 30-year history will spur further development in the country’s technology infrastructure as Turkey’s sovereign wealth fund seeks to sell its quarter stake in the operator, possibly to a wealthy Gulf player.

• With this announcement, Turkcell is positioning itself strategically in Turkey as the go-to operator for enterprise solutions for government and large enterprises, and it is presenting itself as an indispensable partner for B2B services.

Turkish telecoms operator Turkcell disclosed it has invested $27 billion in communications and technology since its founding in 1994, of which $350 million has been allocated to data center technology. While this represents a meager 1.3% of the total investment, GlobalData expects the figure to ramp up significantly as the telecoms operator moves away from legacy products, services, and solutions toward B2B and enterprise. GlobalData forecasts that Turkey’s data center and hosting market will grow from $551 million in 2023 to $729 million, representing a CAGR of 6.8%. The telecoms operator says its objective is to make the country a ‘global data hub.’ To maximize this opportunity, on July 11, 2024, Turkcell announced a new data subsidiary, TDC, which now operates as a standalone entity. Turkcell states it has evolved from a telecoms operator to an end-to-end technology provider.

The announcement comes on the back of Turkey’s sovereign wealth fund (TWF) reportedly mulling the sale of its 26.2% stake in Turkcell, which it bought in 2020. Highlighting the success story of Turkcell is critical to the sale in a market that has been seen to play second fiddle to Gulf players in terms of next-generation enterprise technologies. Arabian investors are reportedly interested in purchasing the stake: It would make sense for Gulf operators like stc or Etisalat by e& with significant interests in data centers to buy as they seek to boost their own data center footprint.

Turkcell currently provides cloud and data hosting services for over 3,000 local and international businesses, boasting eight data centers (four of which are next-generation-grade) and an active white space of 36,500 square meters – the largest of any Turkish operator. The localization rate – i.e., domestic production of technology for the construction of data centers – averages well above 50%. The company operates across multiple sectors and is well-diversified. Continuous investment will assist Turkcell in maintaining relevance and steady revenue growth in an ever-evolving market.

Competition in the data space is not limited to Turkcell’s telco competitors like Turk Telekom and Vodafone Turkey, which also operate their own centers. Global giants like Amazon, Google, and Microsoft are also vying for dominance in the data and cloud technology space. The question remains whether Turkcell will be able to compete effectively in the country and region against its rivals and the global hyperscalers in terms of reliability and scalability, even with the $350 million invested. Where Turkcell can compete is in its ability to offer aggressive pricing, superior uptime, and advanced local features. If a Gulf investor or telco does acquire the stake, Turkcell might become an effective competitor against the likes of Google, Amazon, and Microsoft; it would almost certainly solidify the operator’s status as a leading data center provider in the region.

Generative AI Watch: Salesforce’s World Tour Event Confirms the Trend Toward Vector Databases

11 June 2024 at 12:16
B. Valle

Summary Bullets:

• Salesforce is leveraging generative AI (GenAI) capabilities to address customer pain points such as processing unstructured data, unlocking the value in this data by creating unified customer profiles.

• Salesforce Data Cloud will be available on Hyperforce in the UK in July 2024. Salesforce Hyperforce aims to address customers’ growing appetite for compliance, safety, and standardization in the public cloud.

The Salesforce World Tour took place on June 6, 2024, at the ExCeL Centre in East London (England), with sponsors such as Amazon Web Services (AWS), Cognizant, Deloitte, and PWC. The annual event included workshops, demos, and discussions with partners, the announcement of an AI center, and innovations in the Salesforce Data Cloud and Slack platforms. For GenAI observers, the most salient news was the general availability of Salesforce Data Cloud Vector Database, built into the Salesforce Einstein 1 Platform, which infuses GenAI into the vendor’s CRM platform, Salesforce Customer 360. The vector database collects, “ingests,” and combines structured and unstructured data about end users, which is of great importance to Salesforce’s customers and, by extension, their own customers. According to the vendor, around 80% of customer data is scattered across internal corporate departments in an unstructured form, “trapped” in PDFs, emails, chat conversations, transcripts, customer reviews, and so on. This data can be leveraged to build a closer overall relationship with the customer by creating a unified profile of the so-called customer journey. Being able to ground all types of data in Salesforce Data Cloud – where it is processed – unlocks a wealth of valuable information, and not just for engaging with the customer in positive ways: it also makes it possible to react with maximum agility to problems such as product recalls, returns, and so on.

During the keynote, it was emphasized that personalization is a critical tenet of customer engagement and one of the advantages of deploying GenAI in customer-facing verticals. “Putting data to work” was one of the highlights of the speech as well as how enterprises can augment employee productivity through upskilling to increase use of GenAI tech internally. Overcoming the fear factor and general mistrust of GenAI is also essential. Although there were no new product launches per se, the vendor announced that Salesforce Data Cloud will be available on Salesforce Hyperforce in the UK. Salesforce Hyperforce is designed to help firms tackle data residency problems by creating a layer where all Salesforce applications are integrated across the same compliance, security, privacy, and scalability standards. The solution is built for the public cloud and is composed of code rather than hardware so that all applications can be safely delivered to locations worldwide. Salesforce Hyperforce provides a common layer to deploy all the application stacks, offering Salesforce’s version of similar solutions available in the market. These solutions allow companies to handle data compliance for an increasingly fragmented technology world. Enterprises serve their employees and customers globally while providing choice and control for residency and compliance.

The event was also a launchpad for the Salesforce AI Center, whose pilot will be inaugurated in the UK to encourage collaboration among AI experts, support Salesforce partners and customers, and facilitate training and upskilling programs. The company said the center, which is planned to be the first of many globally, has capacity for 300 people and is located in the Blue Fin Building near Blackfriars, London (England). Recognizing the value of training in the nascent GenAI market, Salesforce has set itself the ambitious goal of upskilling 100,000 developers worldwide, leveraging a string of similar centers globally. The London facility will open on June 18, 2024 and is part of a $4 billion investment drive in the UK and Ireland.

Salesforce continues to incorporate GenAI across its portfolio from its data visualization platform Tableau, to Einstein for analytics, and Slack for collaboration. The company claims the Salesforce Data Cloud tool leverages all the metadata in the Salesforce Einstein 1 Platform by connecting unstructured and structured data, reducing the need to fine-tune large language models (LLMs) and enhancing the accuracy of the results delivered by Salesforce Einstein Copilot, Salesforce’s conversational AI assistant. Vector databases are not new, but the GenAI “revolution” has brought them to the forefront as enterprises use them alongside retrieval-augmented generation (RAG) techniques to link their proprietary data with LLMs, like OpenAI’s GPT-4, enabling them to generate more accurate results. Vector databases are becoming widespread because they power the RAG technique and are used by enterprises to build chatbots for employees needing to access internal company information (e.g., researchers using an AI hub or salespeople pulling information from knowledge hubs). Rivals including Oracle, Amazon, Microsoft, and Google have their own vector databases; Salesforce demonstrates its early investments in GenAI are bearing fruit with the Salesforce Data Cloud Vector Database launch.
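
For readers unfamiliar with the pattern, the sketch below shows RAG in miniature: embed documents as vectors, retrieve the nearest match for a query, and prepend it to the LLM prompt. The word-count ‘embedding’ keeps the example self-contained; a production system would use a learned embedding model and an actual vector database rather than this toy index.

```python
# Minimal RAG sketch: vectorize documents, retrieve the most similar one
# for a query by cosine similarity, and ground the LLM prompt with it.
# The word-count "embedding" is a toy stand-in for a learned model.

import math
from collections import Counter

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

documents = [
    "return policy: items may be returned within 30 days",
    "shipping: orders ship within 2 business days",
    "warranty: hardware is covered for one year",
]
index = [(doc, embed(doc)) for doc in documents]   # the toy 'vector database'

query = "how many days do I have to return an item"
qvec = embed(query)
best = max(index, key=lambda pair: cosine(qvec, pair[1]))[0]

prompt = f"Answer using this context:\n{best}\n\nQuestion: {query}"
print(prompt)              # this grounded prompt would be sent to the LLM
```

Grounding the prompt in retrieved proprietary data, rather than fine-tuning the model on it, is exactly the trade-off that has made vector databases a mainstream enterprise component.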
