France bets €500 million that quantum computing is the tech race Europe can finally win



Europe has spent a decade watching American and Chinese companies capture every major technology wave: cloud, mobile, social, AI. Quantum computing may be the exception. A cluster of French startups, backed by €500 million in government funding and underpinned by some of the world’s strongest physics research, is positioning France as a serious contender in a race where legacy advantages count for remarkably little.

At the centre of the French effort is Alice & Bob, a Paris-based startup whose “cat qubit” technology, named after Schrödinger’s thought experiment, takes a fundamentally different approach to the field’s central problem: errors. Quantum computers manipulate individual particles whose states are so fragile that any interaction with the outside world destroys the computation. Most approaches compensate with massive redundancy, using thousands of physical qubits to produce a single reliable “logical” qubit. Alice & Bob’s cat qubits are designed to autonomously correct certain errors at the hardware level, potentially reducing the number of physical qubits needed by orders of magnitude.
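The scale of that redundancy is easiest to see with a back-of-the-envelope sketch. The code below is purely illustrative: the premise that a surface code needs roughly d² physical qubits per logical qubit (code distance d), and that a hardware-corrected cat-qubit design needs only a one-dimensional repetition code against the remaining error type, are simplifications, and the specific parameters (distance 25, repetition length 25) are assumptions, not figures from Alice & Bob or any published roadmap.

```python
# Back-of-the-envelope comparison of physical-qubit overhead for two
# error-correction strategies. All parameters are illustrative
# assumptions, not published hardware specs.

def surface_code_physical(logical_qubits: int, distance: int = 25) -> int:
    """Assume a surface code needs roughly distance**2 physical qubits
    per logical qubit."""
    return logical_qubits * distance**2

def cat_qubit_physical(logical_qubits: int, repetition: int = 25) -> int:
    """Assume bit-flips are suppressed in hardware, so each logical
    qubit needs only a 1-D repetition code of the given length."""
    return logical_qubits * repetition

logical = 128  # the PROQCIMA 2030 demonstrator target
print(surface_code_physical(logical))  # 80000
print(cat_qubit_physical(logical))     # 3200
```

Under these toy assumptions, the hardware-corrected design needs 25 times fewer physical qubits for the same 128-logical-qubit machine, which shows why any orders-of-magnitude claim hinges on how much of the correction can be pushed into the hardware itself.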

“It’s not about being faster,” says co-founder and CEO Théau Peronnin, who founded the company in 2020 with Raphaël Lescanne. “It’s about being so dramatically faster that you change what is feasible.” The company raised €100 million in a Series B round in January 2025, led by Future French Champions, AXA Venture Partners, and Bpifrance, bringing total funding to €130 million. It is now investing $50 million in a new laboratory north of Paris, with a clean room for in-house chip fabrication and a test facility for progressively larger machines.

Five companies, five qubit architectures

Alice & Bob is not operating in isolation. France’s PROQCIMA programme, a government initiative aiming to deliver a fault-tolerant quantum computer demonstrator with 128 logical qubits by 2030 and a 2,048-logical-qubit commercial system by 2035, has selected five companies for its €500 million first phase: Alice & Bob (cat qubits), Pasqal (neutral atoms), Quandela (photonics), Quobly (silicon spin), and C12 Quantum Electronics (carbon nanotubes). The programme is structured as a competition: after four years, the three most promising approaches advance; after eight, only two remain.

The diversity is the strategy. Rather than betting on a single qubit architecture, as the US has effectively done with superconducting circuits, France is funding parallel approaches, each with distinct advantages. Pasqal, which is planning a public listing at a reported $2 billion valuation, already has neutral-atom quantum computers deployed in high-performance computing installations across Europe. Quobly reached a milestone in December 2025 when its isotopically enriched silicon wafers entered STMicroelectronics’ 300mm production line in Crolles, the first integration into a high-volume commercial semiconductor fab. Quandela has partnered with OVHcloud to make its processors available via sovereign cloud infrastructure by mid-2026.

Olivier Ezratty, an academic whose 1,500-page compendium “Understanding Quantum Technologies” has become a standard reference, notes that the French companies share a common advantage: lower machine and energy costs compared to their American competitors. In a field where cryogenic cooling and error correction drive enormous power consumption, that advantage may prove more consequential than raw qubit counts.

The competitive landscape

France is not the only European country with quantum ambitions. Finland’s IQM announced in February that it would become the first publicly listed European quantum company through a $1.8 billion SPAC merger, with a primary listing on the NYSE and a possible dual listing in Helsinki. IQM has raised over $600 million in total and already deploys superconducting quantum computers. The UK has Oxford Quantum Circuits and Riverlane, the latter focused on quantum operating systems.

The American incumbents remain formidable. Google, which acquired cat-qubit-adjacent startup Atlantic Quantum in October 2025, IBM, and a constellation of well-funded competitors have deeper pockets and larger engineering teams. But Peronnin argues the playing field is more level than it appears. “At the end of the day, it’s a maths challenge,” he says. “There is no unfair advantage from legacy technology like classical computing, so there is no reason to be shy.”

The physics talent pipeline supports his confidence. France has produced three Nobel Prize-winning physicists in recent decades: Serge Haroche (2012) for quantum optics, Alain Aspect (2022) for quantum entanglement experiments, and Albert Fert (2007) for spintronics. All came from institutions such as École Polytechnique and École Normale Supérieure that feed directly into the country’s quantum ecosystem, and both of Alice & Bob’s founders are products of that pipeline.

The gap between promise and product

Peronnin is candid about where the technology stands. “At the moment, the machine we have is no more powerful than your telephone,” he says. “We’re on the flat part of the exponential curve.” The quantum computers that Alice & Bob and its French peers have placed in companies like Air Liquide are not yet delivering on the technology’s transformative promise. Their purpose is to train a community of specialists who will be ready when the hardware catches up.

The applications, when they arrive, are potentially enormous. Peronnin describes drug development, currently dominated by trial and error, as a field where quantum simulation of molecular interactions could transform what is feasible. Materials science, cryptography, financial modelling, and logistics optimisation are all candidates for quantum disruption.

The race will be “winner-takes-all,” Peronnin predicts, comparing it to IBM’s dominance in classical computing. That framing may be optimistic — quantum computing could fragment along application-specific lines rather than consolidating around a single platform. But it captures the strategic stakes for Europe: after decades of building world-class research and watching the commercial value migrate to Silicon Valley, quantum represents a technology where the science and the business might, for once, stay in the same place.

“We have what it takes to win it,” Peronnin says. “It’s about believing in ourselves.” Coming from the CEO of a company whose quantum chip is currently less powerful than a smartphone, that is either delusion or the kind of conviction that turns exponential curves into market share. The €500 million in government funding suggests France, at least, is betting on the latter.



As I’m writing this, NVIDIA is the largest company in the world, with a market cap exceeding $4 trillion. Team Green is now the leader among the Magnificent Seven of the tech world, having surpassed them all in just a few short years.

The company has managed to reach these incredible heights with smart planning and by making the right moves for decades, the latest being the decision to sell shovels during the AI gold rush. Considering the current hardware landscape, there’s simply no reason for NVIDIA to rush a new gaming GPU generation for at least a few years. Here’s why.

Scarcity has become the new normal

Not even Nvidia is powerful enough to overcome market constraints

Global memory shortages have been a reality since late 2025, and they aren’t confined to RAM and storage manufacturers. The crunch hits every company making a product that contains memory or storage—including graphics cards.

Since NVIDIA sells GPU and memory bundles to its partners, which they then solder onto PCBs and add cooling to create full-blown graphics cards, NVIDIA doesn’t just have to battle other tech giants for a chunk of TSMC’s limited production capacity to produce its GPU chips. It also has to procure massive amounts of GPU memory, which has never been harder or more expensive to obtain.

While a company as large as NVIDIA certainly has long-term contracts that guarantee stable memory prices, those contracts aren’t going to last forever. The company has likely had to sign new ones, judging by the GPU price surge that began at the start of 2026 and has kept gaming graphics cards overpriced ever since.

With GPU memory costing more than ever, NVIDIA has little reason to rush a new gaming GPU generation, because its gaming earnings are just a drop in the bucket compared to its total earnings.

NVIDIA is an AI company now

Gaming GPUs are taking a back seat

A graph showing NVIDIA revenue breakdown in the last few years. Credit: appeconomyinsights.com

NVIDIA’s gaming division had been its golden goose for decades, but come 2022, the company’s data center and AI division’s revenue started to balloon dramatically. By the beginning of fiscal year 2023, data center and AI revenue had surpassed that of the gaming division.

In fiscal year 2026 (which began in late January 2025 and ends in late January 2026), NVIDIA’s gaming revenue has contributed less than 8% of the company’s total revenue so far. The data center division, meanwhile, has generated almost 90% of NVIDIA’s total revenue in fiscal year 2026. What I’m trying to say is that NVIDIA is no longer a gaming company—it’s all about AI now.

Considering that we’re in the middle of the biggest memory shortage in history, and that its AI GPUs rake in almost ten times the revenue of gaming GPUs, there’s little reason for NVIDIA to funnel exorbitantly priced memory toward gaming GPUs. It’s much more profitable to put every memory chip they can get their hands on into AI GPU racks and continue receiving mountains of cash by selling them to AI behemoths.

The RTX 50 Super GPUs might never get released

A sign of times to come

NVIDIA’s RTX 50 Super series was supposed to increase the memory capacity of its most popular gaming GPUs. The 16GB RTX 5080 was to be superseded by a 24GB RTX 5080 Super, the same upgrade awaited the 16GB RTX 5070 Ti, and an 18GB RTX 5070 Super was to replace its 12GB non-Super sibling. But according to recent reports, NVIDIA has put the lineup on ice.

The RTX 50 Super launch had been slated for this year’s CES in January, but after missing the show, it now looks like NVIDIA has delayed the lineup indefinitely. According to a recent report, NVIDIA doesn’t plan to launch a single new gaming GPU in 2026. Worse still, the RTX 60 series, which had been expected to debut sometime in 2027, has also been delayed.

A report by The Information (via Tom’s Hardware) states that NVIDIA had finalized the design and specs of its RTX 50 Super refresh, but the RAM-pocalypse threw a wrench into the works, forcing the company to “deprioritize RTX 50 Super production.” In other words, it’s exactly what I said a few paragraphs ago: selling enterprise GPU racks to AI companies is far more lucrative than selling comparatively cheaper GPUs to gamers, especially now that memory prices have been skyrocketing.

Before putting the RTX 50 Super series on ice, NVIDIA had already slashed its gaming GPU supply by about a fifth and started prioritizing models with less VRAM, like the 8GB versions of the RTX 5060 and RTX 5060 Ti, so this news isn’t that surprising.

So when can we expect RTX 60 GPUs?

Late 2028-ish?

A GPU with a pile of money around it. Credit: Lucas Gouveia / How-To Geek

The good news is that the RTX 60 series is definitely in the pipeline, and we will see it sooner or later. The bad news is that its release date is up in the air, and it’s best not to even think about pricing. The word on the street around CES 2026 was that NVIDIA would release the RTX 60 series in mid-2027, give or take a few months. But as of this writing, it’s increasingly likely we won’t see RTX 60 GPUs until 2028.

If you’ve been following the discussion around memory shortages, this won’t be surprising. In late 2025, the prognosis was that we wouldn’t see the end of the RAM-pocalypse until 2027, maybe 2028. But a recent statement by the chairman of SK Hynix, one of the world’s three largest memory manufacturers, warns that the global memory shortage may last well into 2030.

If that turns out to be true, and if the global AI data center boom doesn’t slow down in the next few years, I wouldn’t be surprised if NVIDIA delays the RTX 60 GPUs as long as possible. There’s a good chance we won’t see them until the second half of 2028, and I wouldn’t be surprised if they miss that window as well if memory supply doesn’t recover by then. Data center GPUs are simply too profitable for NVIDIA to reserve a meaningful portion of memory for gaming graphics cards as long as shortages persist.


At least current-gen gaming GPUs are still a great option for any PC gamer

If there is a silver lining here, it is that current-gen gaming GPUs (NVIDIA RTX 50 and AMD Radeon RX 9000) are still more than powerful enough for any current AAA title. Considering that Sony is reportedly delaying the PlayStation 6 and that global PC shipments are projected to see a sharp, double-digit decline in 2026, game developers have little incentive to push requirements beyond what current hardware can handle.

DLSS 5 may be the future of gaming, but it has few fans today, and it will take a few years (and likely the arrival of the RTX 60 lineup) for it to mature and become usable on anything that’s not a heckin’ RTX 5090.

If you’re open to buying used GPUs, even last-gen gaming graphics cards offer tons of performance and can handle any AAA game you throw at them. While we likely won’t get a new gaming GPU from NVIDIA for at least a few years, at least the ones we’ve got are great today and will continue to chew through any game for the foreseeable future.


