Tokenization Moving from Hype to Reality Across Financial Services, Broadridge Report Reveals
PRNewswire · 2025-10-27 20:15
Core Insights

- The adoption of tokenized assets is accelerating in the financial services industry, with custodians leading the way: 63% currently offer tokenized assets and 30% plan to do so within two years [1][2][3]
- The Broadridge Tokenization Survey indicates that tokenization is set to reshape capital markets, enhance efficiency, and democratize access for investors [2]
- Early adopters of tokenization report significant benefits, while non-adopters perceive fewer advantages, highlighting a widening gap between the two groups [6]

Custodians and Asset Managers

- Custodians are at the forefront of tokenization, with 91% reporting improvements in efficiency, security, and innovation from offering tokenized assets [3]
- Asset managers are beginning to accelerate adoption: only 15% currently offer tokenized products, but 41% plan to launch them soon [3][4]
- Wealth managers are more cautious, with only 10% currently offering tokenized assets and 33% planning to adopt them in the next two years [4]

Challenges and Barriers

- Regulatory uncertainty is cited as the biggest challenge to tokenization adoption, affecting 73% of institutions surveyed [5]
- Other barriers include security concerns, infrastructure gaps, and a lack of common standards, all of which weigh on the adoption plans of asset and wealth managers [5]

Broadridge's Position

- Broadridge is emerging as a leader in supporting tokenized trading, with its Distributed Ledger Repo (DLR) solution processing an average of $339 billion in daily trade volume [7]
- The company is committed to facilitating the trading of digital assets across its technology platforms [7]

Future of Tokenization

- Scaling tokenization offerings will require common standards, regulatory clarity, and robust technology partnerships [8]
- A cultural shift is needed for institutions to treat tokenization as a core strategy rather than a side project [8]