The following is a guest editorial courtesy of Aditya Singh, Head of Product and Strategy at online broker INFINOX.
Having read the FCA’s recent research note on Quantum Computing Application in Financial Services, my overarching takeaway was that it offers a timely and useful map of where quantum computing could intersect the industry. INFINOX, as a broker operating at the cutting edge of trading platforms, algorithmic infrastructure, and copy-trading, is watching these developments closely. But while the report makes some important points, there are areas where we think the risks and implications run deeper.
The FCA is right to identify optimisation, machine learning, and stochastic modelling as early candidates for quantum application. Yet in practice, this is likely to mean that quantum capacity, like cloud storage and AI compute, will be concentrated in the hands of a few very large providers. For brokers like INFINOX, which already deliver algorithmic trading through MT4/MT5, copy trading via IX Social, and analytics across IX ONE, the question is not theoretical: if we plug a quantum-enhanced module into our algorithmic trading ecosystem, we are instantly dependent on that third-party quantum cloud. That creates concentration risk and potential systemic fragility – a parallel to the way financial services now depend on a handful of hyperscale cloud providers.
The FCA could have pushed harder on this point, because dependency on a small group of quantum vendors is not a niche concern; it’s a structural vulnerability.
The probabilistic nature of quantum outputs is also more problematic than the paper suggests. Trading signals, portfolio weights, or risk models generated by a quantum subroutine are not deterministic. For a platform like IX Social, where strategies are copied in real time by thousands of clients, probabilistic variance makes explainability (already a challenge with AI) even harder. Under the FCA’s Consumer Duty, telling a retail client that a trade was selected because the quantum model ‘probably’ thought it was optimal is insufficient. Regulators will need a new standard of explainability that accounts for probabilistic reasoning, and firms like ours will need to prove it at the product level.
Early adoption raises another fairness issue. If quantum confers a real edge in optimisation or latency-sensitive strategies, large players with the capital to integrate it may entrench their advantage. For brokers competing to give clients a level playing field, that raises uncomfortable questions about access and market integrity that deserve more attention than the report gives them.
Finally, the FCA nods to data migration complexity, but for us that risk is immediate: migrating live trading data, partner commission structures in our IB portals, or risk models in IX ONE into a quantum environment introduces novel attack vectors and operational risk. It is not just a back-office concern – it is a live front-end issue for clients and partners.
On international standards, coordination is essential, but the bigger challenge is pace. Hardware, cryptography, and algorithm development are racing ahead; regulatory frameworks, if fragmented, will create arbitrage opportunities and inconsistent resilience. Achieving alignment will be difficult, but it is the only way to prevent global fragmentation.
In short, the FCA’s paper is a strong opening salvo. But the next phase must get more specific: what does probabilistic explainability mean under Consumer Duty, how do we guard against concentration in the hands of a few quantum vendors, how do we manage the fairness gap between early and late adopters, and how do we migrate live financial data without introducing new systemic risk? Those are the questions brokers like INFINOX need answered if we are to integrate quantum responsibly into our products.