How Data Centre Investment Just Overtook Oil
Global investment now favours data centres over new oil supplies, reflecting the scale of electricity demand created by AI and the increasing importance of digital infrastructure to national economies.
Data Spending Overtakes Oil For The First Time
The International Energy Agency has reported that global spending on data centres will reach around 580 billion US dollars this year, overtaking the 540 billion dollars allocated to new oil supply projects. The agency described this comparison as a clear marker of how modern economies have become anchored in digital services, cloud computing and large-scale AI models, all of which require vast physical infrastructure and reliable electricity.
Electricity Use To Triple By 2035
Electricity use from data centres is projected to approximately triple by 2035. AI systems are a major driver, and the IEA expects half of all demand growth to take place in the United States, with Europe and China accounting for most of the remainder. Many new facilities are located near existing clusters around large cities, with around half of the sites currently in development designed to deliver at least 200 megawatts.
The concentration of this growth is, therefore, already testing the limits of energy systems. Grid connection queues for new facilities continue to lengthen, and in several regions networks are so congested that new requests have been paused. Shortages of transformers, cables and other grid components are adding to delays. These issues highlight how the rise of AI is now tightly linked to national energy planning, rather than being a purely digital challenge.
Electricity Systems Under Growing Pressure
The IEA describes the global system as entering an “Age of Electricity”, with most new energy demand coming through power grids rather than fossil fuels. Investment in electricity generation has increased significantly since 2015, yet grid investment has not kept pace. New solar and wind capacity is being deployed at record levels, but the lines and substations needed to carry this electricity to major users are often slowed by planning processes and supply chain constraints.
Cooling demand is creating additional pressure. For example, rising temperatures and rising incomes in many regions are driving higher peak electricity loads from air conditioning. These peaks often coincide with the load patterns of data centres, electric vehicles and electrified heating. As a result, grids are increasingly stretched while they await new capacity and greater flexibility from storage technologies.
In several established markets, energy regulators have warned that large electricity users may need to be subject to stricter technical rules or new pricing structures to ensure network stability. Data centres are therefore becoming part of broader energy security discussions, particularly in regions where supply margins are tightening.
Power Shortages Slow Construction Across EMEA
Power constraints are now directly affecting the pace of new construction across Europe, the Middle East and Africa. New research from Savills shows that only around 850 megawatts of new power capacity for data centres has been delivered across the region so far this year, an eleven per cent decline on the same period last year. New take-up has also slowed to approximately 845 megawatts, roughly half of 2024’s level.
This slowdown is not driven by falling demand. In fact, total contracted power capacity has risen to almost 14,500 megawatts, up by twelve per cent year-on-year. Also, occupancy rates have increased to ninety-one per cent, and around a quarter of new take-up is now pre-let. These figures illustrate that operators are securing power well ahead of time because there is no guarantee that future capacity will be available when needed.
Savills found that established hubs, including France, Germany, the UK and Ireland, continued to expand over the past year. Strong growth was also recorded in emerging markets such as Portugal, Saudi Arabia, Spain, the UAE and Sweden, where land and power are more readily available. This suggests that some operators are shifting attention to secondary and tertiary locations that offer fewer bottlenecks and more flexible permitting.
The Effects of Cost Inflation
Cost inflation remains a significant factor. Across EMEA, data centre build costs now range between roughly 7.3 million and 13.3 million US dollars per megawatt of IT load, and some cities have experienced double-digit annual increases in the cost of land, labour and equipment. These rising costs are lengthening project timelines and prompting developers to form closer relationships with suppliers to secure key components earlier.
Also, electricity consumption forecasts continue to add urgency. For example, one well-known industry analysis last year suggested that up to forty per cent of data centres could face power availability constraints by 2027, and that total electricity consumption for AI-optimised servers could reach around 500 terawatt hours. This would represent more than two and a half times the level recorded in 2023.
Superconductors Move Into Data Centre Design
While grid upgrades are essential, many of the most immediate challenges are emerging inside existing data centre campuses. For example, as AI systems become more computationally intensive, rack-level power has risen from tens of kilowatts to around 200 kilowatts in just a few years. Some operators are now planning for 600 kilowatts per rack, and there is growing discussion of multi-megawatt rack architectures.
A US-based engineering company, backed by several major technology investors including Microsoft, has now adapted high-temperature superconducting cables for use within data centres. The firm’s first commercial system is designed to deliver three megawatts of low-voltage power through superconducting cables cooled with liquid nitrogen to approximately minus 196 degrees Celsius. At that temperature the material carries electricity with effectively zero resistive loss, which in turn supports far higher power density.
The company reports that its cables require around twenty times less physical space than equivalent copper cables and can deliver power roughly five times farther within a campus. A demonstration installation has already been completed at a simulated facility, and pilot deployments at live data centres are expected next year ahead of a planned commercial launch in 2027. These technologies do not replace the need for additional grid capacity, but they allow operators to make better use of limited on-site power and cooling infrastructure.
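To put those rack figures in context, here is a rough back-of-envelope sketch of how quickly a fixed on-site feed is consumed as densities rise. The three-megawatt feed and the 200 kW and 600 kW rack densities come from the figures above; the 40 kW figure simply stands in for the earlier “tens of kilowatts” era, and the rest is an assumption for illustration.

```typescript
// Back-of-envelope sketch: how many racks a fixed on-site feed can supply
// as rack densities rise. The 3 MW feed and the 200 kW / 600 kW densities
// are quoted above; 40 kW stands in for the earlier "tens of kilowatts" era.
const feedMW = 3;
const rackDensitiesKW = [40, 200, 600];

for (const kwPerRack of rackDensitiesKW) {
  const racks = Math.floor((feedMW * 1000) / kwPerRack);
  console.log(`${kwPerRack} kW racks: a ${feedMW} MW feed supports ~${racks} racks`);
}
// Prints 75, 15 and 5 racks respectively, which is why cable size, routing
// space and cooling inside the hall become the binding constraints.
```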
Data Centres And AI Companies
For data centre operators, the expansion in investment highlights both opportunity and risk. For example, facilities with dependable power connections, competitive energy prices and space for expansion can attract long-term demand from cloud providers and AI companies. At the same time, rising construction costs, lengthy permitting and potential regulatory intervention make project planning more complex. There is increasing attention on how much electricity AI infrastructure consumes, which may influence approval processes in some regions.
AI companies face equally important considerations. Access to high-density, well-powered infrastructure directly shapes the pace at which new models can be trained and deployed. Delays in securing suitable hosting capacity can slow research progress or increase operational costs. There is also growing pressure for AI to run on renewable energy, which means the location of data centres and the structure of power contracts matter more than ever.
Governments, Economies And Businesses
Governments now have to balance national competitiveness with energy security and climate commitments. Data centres underpin cloud services, logistics, digital payments and AI-driven innovation, yet they also place significant demands on power networks. This means that policymakers must decide where new facilities can be built, how grid upgrades should be prioritised and how to maintain public support when large projects are proposed near urban areas.
Economically, the sector supports construction, engineering, manufacturing and digital roles. The long-term nature of data centre contracts also encourages investment in renewable energy, battery storage and potentially small modular nuclear reactors, which several countries are exploring as a source of stable low-carbon power for high-demand sites.
For ordinary businesses using cloud and colocation services, the main effects are likely to be on reliability, availability and cost. Capacity constraints may lead to higher hosting costs in busy regions, while areas with strong renewable resources and efficient planning may become more attractive for new deployments.
Investors And Infrastructure Funds Increase Their Exposure
Investors and infrastructure funds also continue to increase their exposure to the sector. Since 2021, around eighty to ninety per cent of the value of closed data centre deals has involved private equity, infrastructure funds or real estate investors, compared with half in 2020. This reflects confidence in long-term demand for digital infrastructure, but it also raises questions about concentration of ownership in assets that underpin national digital resilience.
Challenges And Criticisms
The scale of AI-related electricity use has raised many questions about environmental sustainability, especially where data centres draw power from grids still reliant on fossil fuels. Concerns have been raised about water consumption for cooling, land use in crowded urban regions and the impact of construction on local communities.
Energy regulators have also highlighted system risks linked to large power users. For example, data centres can influence grid stability if they ramp up unexpectedly or disconnect suddenly, prompting discussions about new standards or pricing structures. There are wider equity concerns too, as global statistics show that hundreds of millions of people still lack basic access to electricity while trillions of dollars flow into advanced digital infrastructure.
What Does This Mean For Your Business?
The trends here show a sector that’s expanding rapidly while running up against some clear structural limits. Investment is rising because demand is strong and immediate, yet the electricity needed for large-scale AI is difficult to deliver at the pace operators require. This creates a landscape where data centres are becoming essential to economic performance, but their growth is constrained by the slow evolution of energy infrastructure.
Operators now depend far more on securing reliable power than on adding floorspace or equipment. This means that sites with firm grid connections and competitive energy costs will be best placed to meet rising AI demand, while regions with slow planning processes or congested networks risk falling behind. AI companies face similar pressures because training and running advanced models depend on reliable access to powerful, energy-intensive processing systems. Delays caused by grid bottlenecks or supply chain issues can slow deployment and raise operating costs.
Also, governments must now balance digital competitiveness with energy security and climate targets. Data centres support cloud services, logistics, payments and AI innovation, so the ability to host them is becoming a strategic priority. Grid upgrades, renewable investment and more efficient permitting processes will be required if countries want to remain competitive. This matters directly to UK businesses, which rely on stable cloud services and cost-effective data processing. Rising pressure on electricity networks could influence the reliability and price of digital services across the economy.
Investors are continuing to increase their involvement because long-term demand remains strong, although greater private ownership of strategic infrastructure raises questions about affordability and resilience. Meanwhile, environmental concerns around electricity use, water consumption and land availability remain under close scrutiny. These issues highlight the importance of ensuring that rapid AI and cloud expansion aligns with national climate goals and local community interests.
The overall picture, therefore, appears to be that of a sector that will continue to grow but will be shaped most of all by the availability, cost and cleanliness of electricity. The choices made now on grid investment and energy policy will likely define how quickly AI infrastructure can expand and how the associated benefits are shared across economies and industries.
Google AI Tools Let Anyone Build Interactive Map Projects
Google has introduced a new set of AI tools that allow developers and non-technical users to build interactive map projects from simple text prompts, marking one of the biggest upgrades to Google Maps Platform in years.
AI Agents Designed To Build Projects, Not Just Maps
Google Maps has long been one of the world’s most widely used mapping services. According to Google, more than 10 million websites and apps rely on Maps Platform for location data, imagery, routes and place information. The latest update signals a major step towards turning Maps from a data source into a fully assisted creation environment, where AI agents handle much of the early design and coding work.
The new features are powered across the board by Google’s Gemini models. At their core are several tools intended to simplify how interactive map experiences are created and embedded into apps, websites and AI products. These include Builder agent, Maps Styling agent, the Code Assist Toolkit, Grounding Lite and Contextual View. Each sits at a different point of the development workflow, but all aim to reduce the time, effort and specialist knowledge usually required to work with geospatial data.
Builder Agent Brings Prototyping Down To A Prompt
Builder agent is presented as the centrepiece of the update. It is a geospatial AI agent that turns natural language instructions into functioning prototypes. For example, a user can type “create a Street View tour of a city”, “create a map visualising real-time weather in my region” or “show pet-friendly hotels in the city”, then let the agent build an interactive map with the relevant data and code.
The system works by combining Gemini with Google Maps Platform APIs for Maps, Routes, Places and Environment. It produces a ready-to-test prototype along with the full source code and a written solution guide. Users can then export the code, drop in their own API keys, test it, and refine it further in Firebase Studio or their preferred development tools.
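For readers who have not worked with the Maps JavaScript API, the sketch below shows the sort of boilerplate a generated prototype replaces: a map initialised on a page where the API script and key are already loaded, plus a couple of markers. It is a hand-written illustration rather than actual Builder agent output, and the hotel data, coordinates and element ID are placeholders.

```typescript
// Hand-written illustration of the kind of map bootstrap a Builder agent
// prototype automates. Assumes the Maps JavaScript API script is already
// loaded on the page with your own API key; "map" is a <div> id chosen
// for this example.
function initPetFriendlyHotelsMap(): void {
  const map = new google.maps.Map(document.getElementById("map") as HTMLElement, {
    center: { lat: 51.5072, lng: -0.1276 }, // central London, placeholder
    zoom: 13,
  });

  // Placeholder data; a real prototype would pull this from the Places API.
  const hotels = [
    { name: "Example Hotel A", lat: 51.511, lng: -0.119 },
    { name: "Example Hotel B", lat: 51.503, lng: -0.135 },
  ];

  for (const hotel of hotels) {
    new google.maps.Marker({
      position: { lat: hotel.lat, lng: hotel.lng },
      map,
      title: hotel.name,
    });
  }
}
```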
Google is positioning Builder agent as a way to collapse weeks of early scoping into just a few minutes. It is designed to remove the need for specialist geospatial experience, thereby potentially helping product managers, designers, researchers or smaller technical teams to move quickly from idea to working demo. Google says this reduces the learning curve, supports faster experimentation and increases confidence when deciding whether to invest development time.
A New Approach To Map Styling For Brands
The second major tool announced by Google is Maps Styling agent. This tool allows users to prompt the AI to create custom map styles that match a brand’s visual identity or highlight specific features such as landmarks, roads, lakes or points of interest.
For example, instead of editing style configurations manually, a user can ask the agent to apply a particular theme, colour palette or emphasis. This means a retailer could request a branded map that highlights store locations and access routes. A tourism app could ask for a theme that emphasises heritage sites and walking trails. A transport provider could request a clean map focused on stations, lines and interchanges.
These styles can be generated in Google AI Studio and used across mobile or web applications, giving designers more control without requiring in-depth map-styling knowledge.
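As an illustration of what such a style artefact looks like, the sketch below uses the JSON style format the Maps JavaScript API already accepts. It is hand-written rather than agent-generated, and the colours and feature choices are placeholders for a “heritage sites and parks” theme.

```typescript
// Hand-written illustration of the "styles" format the Maps JavaScript API
// accepts, i.e. the kind of artefact a styling agent would generate from a
// prompt. Colours and feature emphasis are placeholders.
const heritageTheme: google.maps.MapTypeStyle[] = [
  { featureType: "poi.park", elementType: "geometry", stylers: [{ color: "#cfe8c9" }] },
  { featureType: "poi.attraction", elementType: "labels", stylers: [{ visibility: "on" }] },
  { featureType: "road.highway", elementType: "geometry", stylers: [{ visibility: "simplified" }] },
  { featureType: "water", elementType: "geometry", stylers: [{ color: "#a3c7df" }] },
];

const styledMap = new google.maps.Map(document.getElementById("map") as HTMLElement, {
  center: { lat: 48.8566, lng: 2.3522 }, // Paris, placeholder
  zoom: 14,
  styles: heritageTheme,
});
```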
Grounding Lite Connects AI Assistants To Maps Data
Google is also preparing to launch Grounding Lite, a feature that lets developers link their own AI models to up-to-date information from Google Maps using the Model Context Protocol, known as MCP.
This allows an AI assistant to answer practical location-based questions such as “How far is the nearest supermarket?”, “What would my commute look like from here?” or “Where are the closest rooftop cafés?” using live map data rather than static or outdated datasets.
Google points to use cases such as real estate apps that can instantly surface commute times and nearby amenities, or travel apps that can offer personalised recommendations based on local geography. Grounding Lite is designed as a more accessible and cost-effective version of the existing Gemini grounding tools for developers who want accuracy without having to fully adopt Gemini themselves.
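Google has not published integration details beyond the MCP reference, so the sketch below is purely conceptual. It shows the general shape of a location tool call that an assistant might make and then ground its answer in; the endpoint, tool name and response fields are invented for illustration and are not part of any announced API.

```typescript
// Entirely hypothetical sketch of the tool-call shape an assistant might
// use once Grounding Lite is available. The endpoint, tool name and
// response fields below are invented for illustration only.
interface PlaceResult {
  name: string;
  distanceMetres: number;
  walkingTimeMinutes: number;
}

async function nearestSupermarket(lat: number, lng: number): Promise<PlaceResult> {
  const response = await fetch("https://example.invalid/maps-grounding/tool-call", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      tool: "find_nearby_place", // hypothetical tool name
      arguments: { location: { lat, lng }, placeType: "supermarket" },
    }),
  });
  // The assistant would cite this live result in its answer rather than
  // relying on whatever static data was in its training set.
  return (await response.json()) as PlaceResult;
}
```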
Contextual View Adds Interactive Maps Inside AI Responses
Another feature launching globally is Contextual View, a low-code component from the new Google Maps AI Kit. It lets developers embed interactive map elements directly into AI-generated answers.
This means that, if an AI assistant is asked for things to do in a city, it can now respond with a written list alongside a 3D visual display of each area. If a user asks about hiking routes, the assistant can show a map that highlights the trails, terrain changes and surrounding points of interest.
The aim is to give AI products a much richer, more visual response layer, using familiar Google Maps interfaces rather than custom-built ones.
Code Assist Toolkit Brings Maps Knowledge Into Developer Tools
Google has also released a Code Assist Toolkit that connects AI coding assistants to the latest Google Maps documentation using an MCP server. This means a developer can ask, inside their coding environment, how to use a particular Maps API feature or which method is required for a specific task. The AI then responds using verified documentation instead of outdated or generic information found elsewhere.
The toolkit also links into Google’s command line interface for Gemini, allowing developers to pull Maps examples, patterns and instructions directly into their workflow. Google says this reduces debugging time and encourages consistent, accurate use of Maps APIs.
Businesses And Users
For businesses, the upgrades are likely to reduce development overheads and shorten experimentation cycles. For example, a property company could use Builder agent to create a neighbourhood exploration tool that combines Street View tours, local schools and air quality layers before refining it into a full feature. Also, a retail brand could produce custom-styled maps for store finders across its digital properties without extensive engineering support.
Smaller companies may also find the barrier to entry reduced. For example, teams without specialist mapping knowledge can still prototype experiences, explore new concepts and present map-based ideas to stakeholders. Also, agencies and consultancies may be able to validate client concepts far more quickly, with clearer early examples.
Gradually Introduced
For everyday users, these changes are likely to appear gradually. Google has already enabled hands-free Gemini interactions within Maps in some regions, along with additional features such as incident alerts and speed limit information. As Grounding Lite and Contextual View are adopted, users may start seeing more AI-driven maps embedded inside customer service chats, booking tools, property apps, travel guides or workplace dashboards.
For Google, the update could be said to strengthen its position as the default mapping layer for both traditional applications and AI-integrated products. As AI assistants become more important in everyday digital experiences, Google is making sure Maps is the dataset these assistants rely on. This may deepen Google’s relationship with advertisers too, since visual mapping layers open up new possibilities for location-based content, commercial listings and branded experiences.
Competitors will, no doubt, feel the pressure from this latest announcement. For example, companies such as Mapbox and HERE have already started offering AI-supported design tools, but Google’s combination of vast location data, Gemini integration and low-code components gives it a strong advantage at a time when many businesses are shifting their digital experiences into conversational interfaces.
Challenges And Concerns
As with most AI updates, there are still a few issues to address. Reliability, for example, remains one of the biggest challenges for AI-generated maps and code. Even when grounded in live data, an AI model can misunderstand a prompt, misapply a parameter or generate code that seems correct but fails in practice. Teams will still need rigorous testing processes despite the convenience these tools offer.
Privacy concerns are likely to grow as well. Location-based AI responses depend on user queries and contextual details passing through AI systems, raising questions about how this information is stored, processed and combined with other datasets. The use of MCP servers that directly connect AI tools to Google’s documentation and data may also attract regulatory scrutiny in regions where competition and data protection rules are strict.
Another concern is lock-in. For example, as businesses become more reliant on Gemini-specific agents and Maps APIs, moving to a different provider could become more difficult. Google presents the new features as standardised and flexible, but the most powerful capabilities still sit inside its own platform and models.
Cost transparency will also matter to smaller developers. For example, prototypes may be quick to build, but AI grounding, Maps embedding and visual responses still consume API usage, which can add up quickly if not managed carefully.
For now, Google’s update highlights a move towards AI-assisted geospatial design, where much of the early thinking, coding and styling work can be carried out automatically, and where map-based answers become a standard part of AI conversations across industries.
What Does This Mean For Your Business?
These developments point to a future where mapping tools become more accessible, quicker to deploy and far more tightly integrated with AI products across every sector. The move from manual design work to prompt-driven prototyping is already changing how teams test concepts and bring new ideas to the table. For UK businesses in particular, this could mean shorter development cycles, faster decision-making and the ability to explore new digital services without relying on large specialist teams. It also lowers the barrier for smaller firms that want to experiment with location-based features but have never had the resources to invest heavily in geospatial expertise.
There is a broader industry impact too. For example, Google is strengthening its position at the centre of the mapping ecosystem, which places pressure on competitors and gives advertisers new formats to work with. This creates opportunities for brands that want richer, more contextual digital experiences, but it will also invite questions about how much influence Google holds over the frameworks that developers rely on. Regulators, privacy groups and consumer advocates are likely to monitor how these tools handle data, how AI responses are grounded and how transparent the wider system becomes.
The real test now lies in how reliably these tools perform at scale and how responsibly they are used. AI-driven code, even when backed by accurate map data, still needs careful oversight. Businesses will want confidence that outputs are dependable, users will want clarity about how their information is being processed, and competitors will look closely at whether the integration of AI and Maps creates any unfair advantages. Google’s latest update opens new possibilities for innovation across apps, services and AI assistants, yet the next phase will depend on trust, governance and the ability of organisations to put these tools to work in a secure and transparent way.
Apple Partner Programme Cuts App Store Fees For Mini App Makers
Apple has introduced a new Mini Apps Partner Programme that halves App Store commissions for qualifying mini apps to 15 percent, marking a significant shift in how Apple wants developers to build and monetise app-within-app experiences.
A New Approach To App Store Revenue
Apple has announced that developers who host mini apps inside a larger iOS or iPadOS app can now qualify for a reduced 15 percent commission on digital purchases made within those mini apps. Mini apps are small, self-contained experiences built using web technologies such as HTML5 and JavaScript. They run inside a host app rather than being downloaded as separate apps from the App Store.
Apple has supported this format since 2017 under its App Review Guideline 4.7, which covers mini apps, mini games, streaming games, chatbots, plug-ins and emulators. This new programme is the first time Apple has offered a financial incentive that directly targets this part of its ecosystem.
The standard App Store commission can be as high as 30 percent for many in-app purchases, so the new 15 percent rate represents a meaningful reduction for developers who operate or contribute to mini app ecosystems. Apple says the aim is to help developers “grow their business” while ensuring that mini apps continue to meet App Store safety, age-rating and payments standards.
What Mini Apps Are And Why They Matter
Mini apps allow users to access small, task-based experiences without installing a separate full app. For example, a user might open a messaging app and launch a mini game, shopping experience, restaurant booking or banking tool directly within it.
The concept has existed for years in China, where WeChat mini programmes have become a major part of digital life. They let users book taxis, play games, access government services or shop online, all from within a single app. Tencent’s ecosystem has grown to such an extent that analysts estimate it has well over a billion active users.
Similar ideas have now appeared in other services. For example, LINE (a Japanese messaging and social app), Alipay (a major Chinese digital payments platform), Telegram (the global messaging app) and Discord (a communication platform popular with gaming and online communities) all offer mini app-style features. ChatGPT also appears to be moving into the space by allowing users to open external services from inside its chatbot, including travel, retail, music and design tools. The trend reflects a growing shift in how people discover and interact with digital services.
Apple’s move, therefore, could be seen as a clear signal that Apple intends to support similar patterns on iOS, rather than allowing super apps, AI platforms or rival ecosystems to define this behaviour without Apple’s involvement.
Why Now?
The timing appears tied to several overlapping pressures. For example, regulatory scrutiny has intensified in the United States, Europe and the United Kingdom over Apple’s control of in-app payments and its App Store commissions. Authorities have questioned whether Apple’s rules limit competition. Apple has already faced investigations around super apps, with US regulators arguing that Apple’s policies restricted the growth of app-within-app ecosystems.
There is also a commercial backdrop. Reports have suggested that Apple and Tencent previously agreed a 15 percent commission for purchases made through WeChat’s mini apps. Given the scale of WeChat’s reach, even a small slice of that activity could be extremely valuable to Apple over time.
Apple is also most likely responding to the rise of AI platforms that attempt to reduce reliance on traditional apps. Some developers have speculated that if users spend more time in AI chatbots and transact through them, the App Store’s central role could weaken. Mini apps give Apple a way to reassert influence over this changing landscape.
How The Programme Works
To join the Mini Apps Partner Programme, a developer must operate a host app available on the App Store for iOS or iPadOS. The host app must comply with all Apple Developer Programme conditions and App Review Guideline 4.7, including the requirement to provide a detailed manifest listing every mini app and its metadata.
Participating apps must support specific Apple technologies. These include the Advanced Commerce API, which manages in-app purchase flows for mini apps, and the Declared Age Range API, which helps developers present age-appropriate content. Apple says this provides a safer and more consistent experience for customers.
Developers must also use Apple’s in-app purchase system. Purchases inside mini apps can include consumables, non-consumables, auto-renewing subscriptions and non-renewing subscriptions. If these purchases meet Apple’s criteria and are handled through the Advanced Commerce API, the 15 percent commission applies.
This means that while developers receive a lower fee, they must integrate more of Apple’s commerce and safety tools to qualify. Apple has made clear that the reduced rate is conditional on this deeper technical alignment.
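Apple’s exact manifest schema is not described here, but the hypothetical sketch below illustrates the kind of metadata a host app would need to declare for each mini app under Guideline 4.7. All field names are invented for illustration and should not be read as Apple’s actual format.

```typescript
// Hypothetical illustration only: host apps must provide a manifest listing
// every mini app and its metadata, but the exact schema is Apple's. The
// field names here are invented to show the kind of information involved.
interface MiniAppEntry {
  id: string;               // stable identifier within the host app
  name: string;
  developer: string;
  ageRating: string;        // aligned with the Declared Age Range API
  offersPurchases: boolean; // true if it sells items via the Advanced Commerce API
}

const miniAppManifest: MiniAppEntry[] = [
  { id: "city-puzzler", name: "City Puzzler", developer: "Example Games Ltd", ageRating: "4+", offersPurchases: true },
  { id: "table-booker", name: "Table Booker", developer: "Example Dining Ltd", ageRating: "4+", offersPurchases: false },
];
```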
What It Means For Developers
For developers who run mini app ecosystems, such as messaging platforms, digital wallets or gaming communities, the financial impact is straightforward: a lower fee means more revenue stays within the ecosystem. A host app might use that extra revenue to invest in more mini apps, or to share income with third-party creators more generously.
The programme may also make it more attractive for smaller studios or service providers to build mini apps instead of full native apps. A mini app can be faster to develop and easier to distribute because users do not need to search for or install anything. Host apps with large user bases could become important distribution channels for businesses of all sizes.
At the same time, some developers have voiced concerns about the additional work required to meet Apple’s technical requirements. The Advanced Commerce API and the detailed manifest process introduce extra steps that may be burdensome for small teams.
Competitors And The Wider App Market
Apple’s move directly intersects with the strategies of companies that operate super app-like platforms. WeChat’s mini app ecosystem is the clearest example, but others are emerging. Google already supports Android instant apps, and messaging services worldwide are experimenting with their own in-app experience formats.
The rise of mini apps could gradually change consumer behaviour. For example, if users spend more time inside host apps that contain multiple mini apps, they may download fewer standalone apps from the App Store. This is both an opportunity and a risk for Apple, since it could reduce direct App Store engagement while opening new revenue paths within host environments.
AI platforms also play a role here. For example, mini apps inside AI tools introduce a new layer of app discovery and interaction. Apple’s decision to strengthen the economic and technical framework for mini apps may help keep developers focused on the App Store ecosystem rather than diverting too much attention to alternative platforms.
What Users And Businesses Will Notice
For everyday users, the change will mostly be felt inside the apps they already use. Mini apps can launch quickly, offer simple interfaces and provide focused features without requiring a full install.
For businesses, this change could widen opportunities to appear inside high-traffic apps without committing to a full native app build. For example, retailers, travel companies, financial services, entertainment platforms and many other sectors could use mini apps to reach customers more efficiently. The standardised payment and refund process through Apple may also reassure customers, particularly those making purchases in unfamiliar mini apps.
Challenges And Criticisms
However, some early reactions from developers suggest that Apple’s programme may reinforce, rather than relax, Apple’s control. For example, developers must adopt Apple’s payment tools and age-rating systems to qualify for the lower commission, which critics argue keeps Apple firmly in charge of the revenue chain.
There are also ongoing concerns about App Store competition. Although the fee is lower, developers still cannot use their own payment rails. Privacy groups have questioned whether Apple’s age-rating system will satisfy regulators who are proposing stricter verification measures.
Discoverability remains another challenge for mini apps in general. For example, in large host apps, mini apps risk becoming difficult to find unless the platform provides strong search tools or clear navigation. Apple’s metadata requirements aim to improve transparency and quality, but they also increase the workload for developers who manage large mini app catalogues.
Investors Positive
That said, investors appear to view the programme positively, describing it as a strategic move that supports revenue growth while strengthening Apple’s position as app behaviour evolves. The response from developers and users over the coming months will reveal whether the balance of incentives works in practice.
What Does This Mean For Your Business?
Apple’s decision could reshape how developers think about distributing lightweight digital experiences, since the economic incentive is far stronger than anything Apple has offered around mini apps before. The requirement to adopt Apple’s own commerce and safety tools keeps the company firmly at the centre of the transaction chain, yet the reduced commission makes the trade-off more appealing than previous arrangements. This balance will matter as host apps weigh up whether the increased technical work is justified by the additional revenue and the chance to attract more third-party creators.
The wider market impact could be significant because mini app ecosystems have already changed digital behaviour in other regions. Apple’s move suggests that similar patterns may now emerge more visibly on iOS. If host apps that already attract large audiences begin to expand their mini app offerings, a growing share of daily digital activity could take place inside these environments instead of through standalone native apps. This may alter how services are discovered, how often users browse the App Store and how developers plan their product strategies.
UK businesses may find that mini apps offer an important new route to reach customers who prefer quicker, simpler interactions. A retailer, service provider or travel firm could appear inside a widely used host app rather than relying entirely on its own native app to attract attention. This could mean lower development costs and broader reach, and it could give businesses access to an environment where purchases, refunds and subscriptions are handled through familiar Apple systems that many customers already trust.
Others will also be watching the adoption rate closely. For example, regulators may take interest in how Apple links the lower commission to its own technologies, especially in markets where competition and platform control are under scrutiny. Developers will want to see whether Apple’s technical requirements remain manageable as the number of mini apps grows. Host platforms will need to balance the commercial opportunity with the operational responsibility of policing large catalogues of third-party content.
It is likely that the coming months will show whether developers embrace the model at scale or continue to rely on native apps and alternative platforms. Apple has put forward a clearer financial incentive at a moment when the structure of the app ecosystem is evolving, and both the market response and the regulatory environment will shape what happens next.
Company Check: Rightmove Shares Slide Over Major AI Investment
Rightmove’s share price has fallen by more than a quarter after the UK’s biggest property portal told investors it will slow near-term profit growth in order to fund a major new programme of artificial intelligence investment.
Rightmove Shares Plunge
Rightmove used its early November trading update to outline a significant shift in strategy, announcing plans to spend around £60 million over the next three years on AI-driven upgrades to its platform, tools and internal systems. The company said this investment is central to how it intends to run the business, improve its product suite and position itself for higher long-term growth.
Consequently, the company now expects underlying operating profit to rise by only 3 to 5 per cent in 2026, compared with about 9 per cent growth this year. Revenue growth for 2026 is still forecast at between 8 and 10 per cent. Investors reacted sharply to the reduction in profit expectations, pushing the share price down by as much as 28 per cent during early trading. The stock recovered some ground later in the day, although it still closed more than 12 per cent lower and hit a new 52-week low.
Chief executive Johan Svanstrom said the company was “already working on a wide range of exciting AI-enabled innovations” and that AI is “becoming absolutely central” to how Rightmove operates. He said the investment would “create an even stronger platform and higher-growth business over time”, with the company targeting more than 10 per cent annual revenue growth by 2030.
Why Is Rightmove Investing So Heavily In AI?
Rightmove has framed this new strategy as a deliberate move into an investment phase that runs from 2026 to 2028. The company says the spending will cover three areas, which are:
Consumer-facing improvements, including conversational search tools that allow users to describe what they want in natural language, more personalised recommendations and a virtual mortgage assistant that can guide users through affordability and product options.
Major upgrades to Rightmove’s internal systems, where AI is expected to automate workflows, speed up data processing and improve customer service. This includes what the firm described as re-platforming significant parts of its back-end infrastructure.
Research and development focused on new products and revenue lines. Rightmove has already identified more than two dozen AI projects it wants to develop during this investment cycle.
The company said these tools are designed to help both consumers and agents by improving search accuracy, increasing the speed of listing updates and delivering more actionable insights from the portal’s large data sets.
Why The Market Reacted So Strongly
The sharp fall in Rightmove’s share price reflects a combination of surprise and wider market anxiety. For example, investors have traditionally viewed Rightmove as a highly predictable, low-risk business with very high margins, low capital requirements and steady subscription income from estate agents and developers. Therefore, a sudden fall in forecast profit growth, even if temporary, looks like a significant departure from expectations.
Analysts have also noted that although investing for future growth is usually welcomed, the size and timing of the planned AI spend left investors questioning whether the short-term hit to profit was justified. Commentators have highlighted that while AI could help Rightmove make better use of its data and improve efficiency, there were concerns the company might be committing substantial funds to technology projects without clear evidence of how quickly they would deliver returns.
Analysts at UBS described the move as a “strategic pivot” that leaves the market with unanswered questions about the timing and return on investment. Others, including RBC (Royal Bank of Canada’s investment banking and research division) and Peel Hunt (a UK-based investment bank and equity research firm), have taken a more positive view, suggesting the sell-off may be overdone and arguing that the investment could help Rightmove maintain its lead in an increasingly competitive market.
The update also seems to have come at a difficult moment for global technology stocks more broadly. For example, fears of an overheating AI sector have triggered sell-offs across US and European markets over the past week, and investors appear cautious about companies committing large sums to AI projects with uncertain payoff periods. Rightmove’s update therefore landed in a market already highly sensitive to any sign of increased spending on AI.
Implications For Agents And The Property Industry
Rightmove’s paying customers are estate agents, lettings agents, developers and other professionals who rely on listings and data tools to win clients and run their businesses. For them, the impact of the new strategy depends on whether the AI improvements genuinely make the platform more effective.
Rightmove has already launched products such as Optimiser Edge, which uses data to help agents target new instructions and improve marketing. Strong take-up has encouraged Rightmove to double down on data-led tools. If the new AI tools deliver as promised, agents could access richer insights into pricing, demand, buyer behaviour and lead quality. That could help them work more efficiently and justify the portal’s subscription fees, which have been a long-running flashpoint in the industry.
Some agents may welcome these updates, whereas others are likely to be concerned that rising investment costs could lead to further fee increases. This tension has been reflected in a new legal claim against Rightmove that accuses the company of unfair pricing. Although the claim is separate from the AI announcement, the perception of rising costs will remain a key issue for stakeholders.
Developers, landlords and corporate property owners may benefit from more accurate pricing tools, better audience targeting and stronger data on local market dynamics. If Rightmove uses its data to launch new B2B products, this could strengthen its position further across the property ecosystem.
Homebuyers, Renters And Landlords
For individual users, the difference will mostly be seen in the platform experience. For example, conversational search could make it easier to find suitable homes without navigating multiple filters. AI-driven recommendations may also surface properties more relevant to specific needs or preferences, while improved data analysis could give buyers and renters clearer insights into pricing trends, local demand and affordability.
These tools could, therefore, save time for renters, first-time buyers and families trying to navigate an often opaque and fast-moving market. More accurate recommendations could also mean fewer wasted viewings.
However, these improvements come with quite a few questions. For example, more personalisation means more data use, and users will want to know how their information is used, stored and analysed. There is also a broader debate over whether automated valuation tools could introduce bias or distort local pricing. As AI becomes more visible in property technology, these issues will attract increased attention.
The UK Property Market
Rightmove dominates online property search in the UK by a wide margin, which means any shift in its operating model has potential implications for the wider housing market. Better search tools could, in theory, improve matching between buyers and sellers, shortening transaction times and reducing friction.
More accurate pricing tools may help reduce the difference between asking and achieved prices, particularly in slower markets. Improved analytics could help developers understand demand patterns and assist landlords in managing rental portfolios.
That said, a more advanced platform could also strengthen Rightmove’s dominant position. Competitors such as Zoopla and OnTheMarket already invest heavily in technology, and they may now face pressure to respond with their own accelerated AI programmes. If Rightmove’s AI tools become significantly more advanced than its rivals’, agents may feel increasingly locked in. This raises questions about competition, pricing power and how much choice agents and landlords can realistically exercise.
The Portal Landscape
It’s worth noting here that Rightmove’s rivals have been developing their own AI tools for some time now, particularly in automated valuation, user search and agent dashboards. The size of Rightmove’s latest programme may lead competitors to increase their own investment or reposition themselves more aggressively on pricing.
For estate agents, the announcement could signal a future in which portals compete less on the volume of listings and more on the intelligence and value added by their technology. The next few years are likely to be defined by how well AI helps agents attract vendors, manage leads and handle day-to-day operations.
For investors, the key question here may be whether this investment pays off in the second half of the decade. Rightmove believes its operating profit will begin to rebound after 2028 and that higher growth rates will follow. The stock market will, no doubt, be watching pretty closely to see whether those expectations translate into real performance, stronger user engagement and a clear competitive edge.
What Does This Mean For Your Business?
The immediate challenge for Rightmove is proving that these investments will deliver practical improvements rather than simply increasing costs. Investors will want to see evidence that AI tools can streamline operations, deepen user engagement and support new revenue lines without undermining the predictability that has defined the business to date. Estate agents will also be watching closely, because AI-driven workflows and data products will only be seen as valuable if they genuinely help them win instructions, price properties more accurately and run leaner operations. For those users, the question is less about the scale of the investment and more about whether it translates into tools that make day-to-day work faster and more effective.
Homebuyers, renters and landlords face a different set of considerations. If conversational search and personalised recommendations improve accuracy and reduce wasted time, the platform could feel more intuitive and more useful during a move. Concerns around data use and algorithmic fairness will still need addressing as AI-driven products expand, but the potential for clearer pricing insight and better matching remains significant. The wider property market will also feel the effects, because more precise analytics and smarter discovery tools could influence how quickly homes sell, how properties are valued and how landlords plan their portfolios.
The broader implications for UK businesses centre on adoption, competition and capability. Many firms are assessing how quickly they should modernise their own digital systems, and Rightmove’s shift illustrates how even established, highly profitable businesses are accelerating their AI strategies. It signals to UK decision-makers that AI investment is increasingly being treated as a long-term infrastructure requirement rather than an optional upgrade. For companies that supply or rely on property insights, there may be new opportunities to integrate richer data streams into planning, risk assessment and market forecasting.
Competitors will now need to decide whether to match this level of investment or differentiate more clearly on price and service. The risk for the wider portal landscape is growing concentration if Rightmove’s AI programme strengthens its lead, although the response from rivals is likely to shape how the market evolves. If they produce credible alternatives with strong AI features of their own, agents and landlords may benefit from greater choice and lower pressure on fees.
For regulators and policymakers, the developments highlight a sector where data, pricing power and platform dominance intersect. The balance between innovation and competition will be important because the benefits of AI will only be felt widely if the market remains open, transparent and fair for users. The next few years will reveal whether this investment cycle creates a more efficient and more dynamic property ecosystem or whether it intensifies existing concerns about market concentration and rising costs.
Security Stop-Press: Cyber Insurance Payouts Triple
Association of British Insurers (ABI) figures show that cyber insurance payouts in the UK have tripled, reaching £197 million in 2024 as businesses face increasingly costly cyber-attacks, particularly from ransomware and malware.
The number of cyber insurance policies has also risen, with 17 per cent more businesses taking out coverage in 2024. However, experts warn that not all claims are guaranteed to be paid. Insurers are tightening requirements, and failure to meet security standards or maintain effective recovery plans may limit payouts.
Businesses must ensure they implement robust cybersecurity measures, including secure backups and effective recovery plans. Cybersecurity should be a core business priority, with regular risk assessments and a proactive security culture to mitigate risks and safeguard against costly attacks.
Sustainability-In-Tech: Want A Data Centre In Your Shed?
An Essex couple have become the first in the UK to heat their home using a mini data centre in their garden shed, in a trial designed to cut energy bills and support low income households through the transition to net zero.
Pilot Scheme
Terrence and Lesley Bridges live in a modest two bedroom bungalow near Braintree in Essex. Their home is owned by Eastlight Community Homes, a social housing provider, and they are part of a pilot run jointly by UK Power Networks and Thermify through an innovation project called SHIELD. The couple were selected for the pilot because they rely heavily on their heating, especially as Lesley lives with spinal stenosis and is in significant pain when temperatures drop.
Thermify HeatHub – Huge Savings
Since the installation of the Thermify HeatHub, their monthly energy costs have fallen from around £375 to between £40 and £60. Terrence said: “It truly is brilliant. I’m over the moon that we got picked to trial this out. You can’t fault the heating system, it is a 100 per cent improvement on what we had before.” Lesley added: “You don’t need to go to a sauna after coming here.” Their experience is one of the first real world demonstrations of a heating concept that blends clean energy, digital infrastructure and social support.
Who Is Thermify?
The heating unit in the Bridges’ shed is called a HeatHub. It is developed by the British company Thermify, which offers cloud computing services to businesses. Instead of housing its servers in a single large data centre, Thermify installs small clusters in people’s homes, where the heat generated by data processing is captured and used as low cost domestic heating.
The wider programme is actually part of SHIELD, which stands for Smart Heat and Intelligent Energy in Low income Districts. SHIELD is run by UK Power Networks through the Strategic Innovation Fund. Its aim is to help people who would normally be excluded from the shift to low carbon technologies because of high upfront costs. The project brings together Thermify, Eastlight Community Homes, community energy groups and technical partners to develop what they describe as a Social ESCo model. Under this model, equipment such as solar panels, batteries and HeatHubs is funded upfront by an energy services company and repaid over time through the value created by the technologies.
How The Data Centre In Their Shed Works
Inside the HeatHub are around 500 Raspberry Pi Compute Modules, all submerged in a special oil. As these computers run cloud tasks for Thermify’s business clients, the electricity they use becomes heat, which raises the temperature of the surrounding oil. That heat is then transferred into a heat store and the home’s central heating and hot water systems.
The principle is simple: computers turn electricity into information, but all of that electricity eventually becomes heat. Traditional data centres spend significant amounts of extra electricity on cooling systems that remove the heat and release it into the air. Thermify’s approach uses that unavoidable heat twice by turning it into a resource for the household.
A dedicated network line is installed so the unit can send and receive data without affecting the resident’s broadband. From the resident’s point of view, it behaves much like a boiler, controlled through familiar heating settings. The Bridges’ shed also contains a solar inverter and a battery, meaning their HeatHub is part of a small integrated energy system that stores and manages electricity through the day.
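A rough calculation shows why even a shed-sized cluster is a useful heat source. The figure of around 500 compute modules comes from the description above; the per-module power draw is an assumption made for this sketch rather than a Thermify specification.

```typescript
// Rough illustration of the "all electricity becomes heat" point above.
// The per-module draw is an assumption for this sketch, not a Thermify figure.
const modules = 500;      // Raspberry Pi Compute Modules in the HeatHub (from the article)
const wattsPerModule = 7; // assumed average draw per module under load
const hoursPerDay = 24;

const heatOutputKW = (modules * wattsPerModule) / 1000; // ≈ 3.5 kW continuous
const heatPerDayKWh = heatOutputKW * hoursPerDay;       // ≈ 84 kWh of heat per day

console.log(`~${heatOutputKW.toFixed(1)} kW of heat, ~${heatPerDayKWh.toFixed(0)} kWh per day`);
// Because Thermify's business clients pay for the computing, that heat reaches
// the household at little or no marginal cost, which is where the savings arise.
```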
Why It Cut Their Bills
In the Bridges’ case, the combination of the HeatHub, solar panels and battery storage has transformed their energy use. Thermify pays for the electricity needed to run the computing tasks because this is part of its service to business clients. The heat produced from this process is supplied to the home at a low or no cost because the energy is already being paid for. SHIELD tenants who receive HeatHubs also pay a small standing charge for heat, although UK Power Networks expects this to be significantly lower and more predictable than the cost of gas for many low income families.
Thermify points to independent modelling that suggests this kind of distributed computing could reduce carbon emissions from data centre operations by about 75 per cent on average. SHIELD’s own modelling suggests combining HeatHubs with solar and batteries could reduce household energy costs by 20 to 40 per cent and cut heating related emissions by more than 90 per cent.
Data, Energy And Tech Companies
The concept has clear implications for cloud and data centre operators. For example, data centres already account for roughly 2.5 per cent of the UK’s electricity consumption and the sector’s demand is forecast to grow rapidly in the next five years. As more companies expand into artificial intelligence and digital services, pressure is rising to reduce the environmental impact and find practical uses for the heat that data centres produce.
Distributed systems like Thermify’s also offer an alternative to building ever larger centralised facilities. Although HeatHubs cannot handle the heavy workloads required for advanced artificial intelligence, they can run many common tasks such as analytics, apps or batch processing. If rolled out at scale, the model could create a network of tens of thousands of small data nodes that serve business customers while heating homes. SHIELD itself has a long term ambition to deploy up to 100,000 such systems a year by 2030.
The approach may also interest energy companies and grid operators. For example, embedded assets such as HeatHubs can help manage peaks and troughs in local demand and provide flexibility services to the grid. SHIELD is exploring how these devices might be combined with peer to peer energy trading and other smart local energy systems.
Sustainability Advantages
There’s clearly an environmental case for improving overall energy efficiency and reducing reliance on fossil fuels. With up to 30 per cent of a data centre’s electricity used solely for cooling, capturing that heat and using it to warm homes can replace the need for gas and reduce the total energy wasted.
There are also potential social benefits to consider here. For example, many low income households cannot afford the upfront investment needed for heat pumps or solar installations. SHIELD’s Social ESCo model aims to solve this by funding the equipment and repaying costs through the value generated by the assets. Early stages of the project show strong interest among tenants who are worried about energy bills but keen to adopt cleaner solutions.
Not A Totally New Idea
It should be noted here that the idea of using data centre heat in buildings is not new. For example, in Devon, a startup called Deep Green operates a washing machine sized digital boiler at a local swimming pool. The servers inside the unit warm the mineral oil surrounding them and the captured heat is used to heat the pool. Reports indicate that the installation has reduced the pool’s gas use by more than half and cut emissions by dozens of tonnes of CO₂ each year. A recent investment from Octopus Energy aims to expand similar units to more than one hundred pools across the UK.
Also, another British company, Heata, attaches small servers to domestic hot water tanks. Homeowners earn a payment for hosting cloud workloads and the heat from the servers warms their water. In mainland Europe, district heating networks in cities such as Odense, Paris and Stockholm already capture heat from large data centres to supply nearby homes and offices.
Key Challenges And Criticisms
Although the Bridges’ results are positive, there are ongoing questions about reliability and long term performance. For example, HeatHubs depend on a steady demand for cloud computing. If business workloads fall or move to other locations there could be uncertainty about how much heat is produced and how backup systems would operate. Trials like SHIELD allow operators to test these scenarios before any wider rollout.
There are also some practical issues to consider. HeatHubs need secure network connections, scheduled maintenance and clear communication so residents understand how the system works. Social landlords also have to consider noise, space and safety. Early feedback from SHIELD has highlighted the importance of strong support and simple user experience.
There is also a broader debate about whether heat reuse can keep pace with the rapid growth in data centre energy demand. Artificial intelligence training and inference use far more electricity than the kind of workloads Thermify deploys. Even with heat capture, growing numbers of data centres will place pressure on local electricity networks. Policymakers and regulators are increasingly encouraging heat reuse but stress that it must be combined with wider grid planning and efficiency measures.
For now, however, the Bridges’ warm bungalow in Essex has become a test case for how computing and heating might come together, offering an early glimpse of a model that could reshape how data centres are built and how homes are heated in the years ahead.
What Does This Mean For Your Organisation?
The trial highlights how digital infrastructure and domestic energy systems can support each other, which is why it is gaining interest across the UK. Data centres are expanding rapidly as businesses adopt artificial intelligence and cloud services, yet their rising electricity use and waste heat are becoming harder to manage. A system that captures this heat and delivers it as affordable, low carbon warmth offers clear benefits for households and creates a more efficient model for the tech companies that rely on constant processing power.
There are important implications for UK businesses here. For example, a distributed network of small data hubs could give companies access to computing capacity with a lower environmental impact, supporting sustainability commitments while easing pressure on the wider grid. Energy providers and local authorities may also see value in systems that help stabilise local demand and offer predictable heating costs for low income residents.
The Social ESCo model is another key part of the story, as it removes the upfront cost barrier that prevents many households from adopting low carbon technologies. If the model proves reliable at scale, it could influence how social housing providers, councils and developers approach retrofit programmes and new energy installations.
Heat reuse is likely to become more common as the UK works towards decarbonising heat. Projects like SHIELD show how data processing, renewable generation and home heating can be combined in a practical way, although long term questions remain around reliability, workload availability and system management. Even so, the Bridges’ experience demonstrates how an integrated approach can reduce bills, cut emissions and provide a template that could be adapted for both homes and businesses in the years ahead.