I asked 5 data leaders about how they use AI to automate – and end integration nightmares




ZDNET’s key takeaways

  • Your ability to exploit data relies on strong underlying processes.
  • AI can be your best friend when it comes to data integration.
  • Focus on consistency, orchestration, capabilities, and culture.

As many as 63% of business leaders describe their organizations as data-driven. However, only one in two executives is confident in their organization’s ability to deliver timely business insights.

If your business is going to make the most of its information, it will need a way to make its data available and accessible. Step forward emerging technology, which increasing evidence suggests can be the key to unlocking value from information.


Whether it’s integrating platforms, merging companies, or working across geographies, professionals must manage a complex range of sources. Here’s how experts say that AI and automation can help.

1. Drive internal consistency

Joel Hron, CTO at global content and technology specialist Thomson Reuters (TR), said his organization uses AI to overcome data and system integration challenges in software engineering.

“We’ve found great benefit across various modernization and migration activities,” he said. “We heavily use AI tools to help ensure compliance with accessibility standards and things like that.”

That pioneering work continues at pace. Hron said TR’s corporate development teams are currently creating an internal AI system for due diligence to drive more consistency in deal evaluation, risk assessment, and potential risk mitigation.

“It’s really a super powerful idea,” he said. “They’ve been building that for the last month or two, coupling it quite nicely with a legal operations product that we sell in the market called HighQ.”


Hron said TR is an acquisitive company that spends time integrating systems. While the benefits of these AI-enabled developments for his company are clear, could the tool above one day be used by external clients? Maybe, he said.

“If we can create something really useful for us, why not bring it to market? But at this point, our focus is about how we make this tech useful for all of our M&A activity and help drive not just speed and efficiency, but consistency in the deals that we do.”

2. Orchestrate your insights

Miko Chen, lead data engineer at Create Music Group, uses data and AI to improve her company’s operational processes, and she advises other professionals to explore leading-edge tools.

The Los Angeles-based music technology specialist uses AI and orchestration capabilities in Astronomer’s Airflow service Astro to manage over 600 data pipelines.

Create has used Astro to integrate its BigQuery and Google Cloud Storage technologies, and APIs from Spotify, YouTube, Apple Music, and Amazon Music, into a layer that manages data pipelines for operational activities, such as analytics and financial forecasting for labels and artists.
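The pattern behind those 600 pipelines can be sketched without Astro itself: at its core, an orchestrator maintains a dependency graph of tasks and runs each one only after its upstreams finish. Below is a minimal pure-Python illustration of that idea, with hypothetical task names standing in for Create’s real pipelines; it does not use Airflow’s actual API.

```python
from graphlib import TopologicalSorter  # stdlib topological ordering (Python 3.9+)

# Hypothetical pipeline names standing in for Create's real DAGs.
# Each key depends on every task in its value set.
pipelines = {
    "fetch_spotify_api": set(),
    "fetch_youtube_api": set(),
    "load_gcs_raw": {"fetch_spotify_api", "fetch_youtube_api"},
    "transform_bigquery": {"load_gcs_raw"},
    "artist_analytics": {"transform_bigquery"},
    "financial_forecast": {"transform_bigquery"},
}

def run_order(dag: dict[str, set[str]]) -> list[str]:
    """Return one valid execution order: every task runs after its upstreams."""
    return list(TopologicalSorter(dag).static_order())

order = run_order(pipelines)
print(order)
```

An orchestrator such as Airflow adds scheduling, retries, and monitoring on top of this ordering, but the dependency-graph core is the same.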


“We want to provide better data to help our clients make decisions, instead of randomly thinking about what they should do,” she said.

“For example, if they want to host a concert, they can use our insights to consider which city they should pick versus which city they can select next time. So, with our data, our artists and our clients can make this proactive decision.”

Create is an acquisitive business, and Chen said her team also uses Astro to consolidate data.

“With Astro, we can easily move data around different places, such as from one organization to another, or from one country to another,” she said.

3. Explore current capabilities

Huy Dao, director of data and machine learning platform at Booking.com, said it’s important for professionals to understand the technological capabilities in their existing data stacks.

Booking was already a Snowflake customer when Dao joined the firm in August 2023. While this platform was proving its worth, he knew the technology offered additional capabilities, particularly for creating AI-enabled services.

“We now use it for way more than the warehousing component,” said Dao, who explained to ZDNET his team’s direction of travel during the past two and a half years.

“All our sensitive data access goes through Snowflake, and we are also using the latest available capabilities, both Cortex AI and Cortex Analyst. We’re exploring the Snowflake Semantic View, and we are also interested in the Horizon Catalog, which can interconnect with other data catalogs.”


Rather than just providing a consolidated source of information, Dao said Snowflake provides an AI-enabled solution that helps professionals solve intractable business challenges. In short, business users now have the power to enact change, thanks to AI.

“The platform reduces the barrier to entry. So, instead of having only 200 users who can access and use our data, we can have 2,000 users because Snowflake makes it easy,” he said.

“With some of the AI capabilities, you don’t even have to write SQL to query the data. Things like that make it easy for people who traditionally did not have data skills.”

4. Focus on marginal gains

Richard Corbridge, CIO at property specialist Segro, told ZDNET that AI and automation can play a crucial role in helping his company bring disparate data assets together. He gave ZDNET the example of cross-European sustainability data.

“We need to monitor, by law, our carbon footprint and sustainability stats. In Poland, they send the meter readings as PDFs. In Germany, they come as digital, automatic readings. In the UK, they might come as photos of utility meters,” he said.

“We’ve got to work out how we take all those disparate ways of reporting on energy usage, put them into one place, and turn it into a Segro report on carbon footprint and energy use.”


Previously, that laborious task was completed by a human specialist working across Excel spreadsheets. Now, with the help of AI and process automation, Corbridge’s team is freeing up human resources and creating business benefits.

“We’re building AI capability to go out, bring the data back, put it into the database, find out when it’s not right, and point out illogical elements, such as where the meter reading is the same as last month, so there must be something wrong,” he said.

“It’s cool stuff in a small, impactful area. I’m a bit of a geek on this one, and the results have been super exciting.”
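The kind of sanity check Corbridge describes can be sketched in a few lines. The example below is a simplified illustration with hypothetical sites and reading values, not Segro’s actual rules or data model: it flags a cumulative meter reading that is unchanged or has decreased since the previous month.

```python
from dataclasses import dataclass

@dataclass
class MeterReading:
    site: str
    month: str   # e.g. "2025-10"
    kwh: float   # cumulative meter total

def flag_illogical(current: MeterReading, previous: MeterReading) -> list[str]:
    """Flag readings that can't be right for a cumulative meter."""
    issues = []
    if current.kwh == previous.kwh:
        issues.append("reading unchanged since last month")
    elif current.kwh < previous.kwh:
        issues.append("cumulative reading decreased")
    return issues

prev = MeterReading("warsaw-01", "2025-09", 1200.0)
curr = MeterReading("warsaw-01", "2025-10", 1200.0)
print(flag_illogical(curr, prev))
```

In practice such checks would sit after the ingestion step (PDFs, digital feeds, or photos normalized into one table), so that a human only reviews the readings the system cannot reconcile.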

5. Reduce the management load

Ankur Anand, CIO at global technology and talent solutions provider Nash Squared, said the biggest impact AI has on data management is in data mapping and normalization.

“AI reduces your integration effort by almost 30% to 40% and gives you far more accurate results compared with the traditional approach of using Excel,” he said.

Using BlueGecko, an AI-enabled data management platform from technology specialist Nextgenlytics, Anand’s team has automated time-consuming data-mapping processes, particularly during post-M&A activities.


He explained to ZDNET how the system produces accurate results at key stages, such as during ETL (Extract, Transform, Load) processes.

“BlueGecko understands the data, maps the data, explains how the two systems are talking to each other, and the values within those systems, and through that approach, the technology helps to accelerate your work around ETL development,” he said.
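The data-mapping step that tools like BlueGecko automate can be illustrated, in a very simplified form, with fuzzy matching of column names between two systems. The column names below are hypothetical, and this string-similarity approach is only a sketch of the general technique, not BlueGecko’s actual method:

```python
from difflib import SequenceMatcher

# Hypothetical column names from two systems being merged post-acquisition.
source_cols = ["cust_name", "invoice_amt", "created_dt"]
target_cols = ["customer_name", "invoice_amount", "creation_date", "region"]

def best_match(col: str, candidates: list[str]) -> tuple[str, float]:
    """Pick the candidate column whose name is most similar to `col`."""
    scored = [(c, SequenceMatcher(None, col, c).ratio()) for c in candidates]
    return max(scored, key=lambda pair: pair[1])

mapping = {c: best_match(c, target_cols) for c in source_cols}
for src, (tgt, score) in mapping.items():
    print(f"{src} -> {tgt} ({score:.2f})")
```

A production mapper would also compare data types and sample values, and route low-confidence matches to a human reviewer rather than applying them automatically.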

Anand’s advice for other professionals looking to integrate data and systems is to focus on culture.

“Think about people who have been using other tools,” he said. “What are the change management processes that you can bring in? Success is not just about deployment; adoption is also important.”






Recent Reviews


As I’m writing this, NVIDIA is the largest company in the world, with a market cap exceeding $4 trillion. Team Green is now the leader among the Magnificent Seven of the tech world, having surpassed them all in just a few short years.

The company has managed to reach these incredible heights with smart planning and by making the right moves for decades, the latest being the decision to sell shovels during the AI gold rush. Considering the current hardware landscape, there’s simply no reason for NVIDIA to rush a new gaming GPU generation for at least a few years. Here’s why.

Scarcity has become the new normal

Not even NVIDIA is powerful enough to overcome market constraints

Global memory shortages have been a reality since late 2025, and they aren’t just affecting RAM and storage manufacturers. Rather, this impacts every company making any product that contains memory or storage—including graphics cards.

Since NVIDIA sells GPU and memory bundles to its partners, which solder them onto PCBs and add cooling to create full-blown graphics cards, the company doesn’t just have to battle other tech giants to secure a chunk of TSMC’s limited production capacity for its GPU chips. It also has to procure massive amounts of GPU memory, which has never been harder or more expensive to obtain.

While a company as large as NVIDIA certainly has long-term contracts that guarantee stable memory prices, those contracts aren’t going to last forever. The company has likely had to sign new ones, considering the GPU price surge that began at the beginning of 2026, with gaming graphics cards still being overpriced.

With GPU memory costing more than ever, NVIDIA has little reason to rush a new gaming GPU generation, because its gaming earnings are just a drop in the bucket compared to its total earnings.

NVIDIA is an AI company now

Gaming GPUs are taking a back seat

A graph showing NVIDIA revenue breakdown in the last few years. Credit: appeconomyinsights.com

NVIDIA’s gaming division had been its golden goose for decades, but come 2022, the company’s data center and AI division’s revenue started to balloon dramatically. By the beginning of fiscal year 2023, data center and AI revenue had surpassed that of the gaming division.

In fiscal year 2026 (which began in late January 2025 and ends in late January 2026), NVIDIA’s gaming revenue has contributed less than 8% of the company’s total earnings so far. On the other hand, the data center division has made almost 90% of NVIDIA’s total revenue in fiscal year 2026. What I’m trying to say is that NVIDIA is no longer a gaming company—it’s all about AI now.

Considering that we’re in the middle of the biggest memory shortage in history, and that its AI GPUs rake in almost ten times the revenue of gaming GPUs, there’s little reason for NVIDIA to funnel exorbitantly priced memory toward gaming GPUs. It’s much more profitable to put every memory chip they can get their hands on into AI GPU racks and continue receiving mountains of cash by selling them to AI behemoths.

The RTX 50 Super GPUs might never get released

A sign of times to come

NVIDIA’s RTX 50 Super series was supposed to increase the memory capacity of its most popular gaming GPUs: the 16GB RTX 5080 was to be superseded by a 24GB RTX 5080 Super, the 16GB RTX 5070 Ti by a 24GB RTX 5070 Ti Super, and the 12GB RTX 5070 by an 18GB RTX 5070 Super. But according to recent reports, NVIDIA has put the refresh on ice.

The RTX 50 Super launch had been slated for this year’s CES in January, but after missing the show, it now looks like NVIDIA has delayed the lineup indefinitely. According to a recent report, NVIDIA doesn’t plan to launch a single new gaming GPU in 2026. Worse still, the RTX 60 series, which had been expected to debut sometime in 2027, has also been delayed.

A report by The Information (via Tom’s Hardware) states that NVIDIA had finalized the design and specs of its RTX 50 Super refresh, but the RAM-pocalypse threw a wrench into the works, forcing the company to “deprioritize RTX 50 Super production.” In other words, it’s exactly what I said a few paragraphs ago: selling enterprise GPU racks to AI companies is far more lucrative than selling comparatively cheaper GPUs to gamers, especially now that memory prices have been skyrocketing.

Before putting the RTX 50 Super refresh on ice, NVIDIA had already slashed its gaming GPU supply by about a fifth and started prioritizing models with less VRAM, like the 8GB versions of the RTX 5060 and RTX 5060 Ti, so this news isn’t that surprising.

So when can we expect RTX 60 GPUs?

Late 2028-ish?

A GPU with a pile of money around it. Credit: Lucas Gouveia / How-To Geek

The good news is that the RTX 60 series is definitely in the pipeline, and we will see it sooner or later. The bad news is that its release date is up in the air, and it’s best not to even think about pricing. The word on the street around CES 2026 was that NVIDIA would release the RTX 60 series in mid-2027, give or take a few months. But as of this writing, it’s increasingly likely we won’t see RTX 60 GPUs until 2028.

If you’ve been following the discussion around memory shortages, this won’t be surprising. In late 2025, the prognosis was that the RAM-pocalypse wouldn’t end until 2027, maybe 2028. But a recent statement by the chairman of SK Hynix, one of the world’s three largest memory manufacturers, warns that the global memory shortage may last well into 2030.

If that turns out to be true, and if the global AI data center boom doesn’t slow down in the next few years, I wouldn’t be surprised if NVIDIA delays the RTX 60 GPUs as long as possible. There’s a good chance we won’t see them until the second half of 2028, and I wouldn’t be surprised if they miss that window as well if memory supply doesn’t recover by then. Data center GPUs are simply too profitable for NVIDIA to reserve a meaningful portion of memory for gaming graphics cards as long as shortages persist.


At least current-gen gaming GPUs are still a great option for any PC gamer

If there is a silver lining here, it is that current-gen gaming GPUs (NVIDIA RTX 50 and AMD Radeon RX 9000) are still more than powerful enough for any current AAA title. Considering that Sony is reportedly delaying the PlayStation 6 and that global PC shipments are projected to see a sharp, double-digit decline in 2026, game developers have little incentive to push requirements beyond what current hardware can handle.

DLSS 5, on the other hand, may be the future of gaming, but no one likes it, and it will take a few years (and likely the arrival of the RTX 60 lineup) for it to mature and become usable on anything that’s not a heckin’ RTX 5090.

If you’re open to buying used GPUs, even last-gen gaming graphics cards offer tons of performance and can handle any AAA game you throw at them. While we likely won’t get a new gaming GPU from NVIDIA for at least a few years, at least the ones we’ve got are great today and will continue to chew through any game for the foreseeable future.


