Decentralized AI Compute
Tianrong Internet Products and Services Inc. (OTC: TIPS) Announces Front-End Marketplace Launch for DEPINfer – Decentralized GPU Compute Platform Powered by $DEPIN Token on Solana
Prism Media Wire· 2026-03-25 13:00
Decentralized AI compute marketplace enables seamless GPU sharing, instant access, and secure transactions powered by $DEPIN on Solana

MOUNTAINHOME, Pa., March 25, 2026 – PRISM MediaWire (Press Release Service – Press Release Distribution) – Tianrong Internet Products and Services Inc. (OTC: TIPS) (“TIPS” or the “Company”), a Penn ...
Node AI ($GPU) Launches Phase 01 of GPU Aggregator with AWS, Azure, Vast AI & More — Alongside GPU DAO & Staking 2.0
GlobeNewswire News Room· 2025-06-04 19:30
Core Insights
- Node AI has launched Phase 01 of its GPU Aggregator, a one-click deployment solution that integrates GPUs from over 50 global providers, including AWS, Azure, Vast AI, GCP, and RunPod [1][4][8]
- The launch aims to democratize access to high-performance compute, positioning Node AI as a key player in decentralized AI infrastructure [3][4]

GPU Aggregator Overview
- The GPU Aggregator serves as a unified compute marketplace, giving users a single interface to access global compute resources [4]
- It enables real-time selection of the best pricing and performance for AI workloads, making deployment more efficient and cost-effective [8]

Decentralized GPU Renting & Lending
- Node AI connects GPU owners with AI developers, supporting both model training and live inference [5]
- Users can lend idle GPU power to earn $GPU tokens and rent compute on demand through smart contracts [8]

Tokenomics & Revenue Model
- The total supply of $GPU tokens is capped at 100 million, with approximately 96 million currently in circulation [6][14]
- The revenue model is based on real ETH fees from compute usage, which are distributed to stakers, ensuring sustainability and fair participation [10][14]

Infrastructure and Performance
- Node AI's compute backbone is designed for high performance, allowing instant deployment of AI endpoints [12]
- The platform includes enterprise-grade cooling and power infrastructure, as well as redundant systems to ensure uptime for AI model deployment [14]

Future Developments
- Upcoming features include deeper routing intelligence for the GPU Aggregator, dApp integrations for AI projects, and a benchmarking suite for hardware performance transparency [14]
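The aggregator's "real-time selection of the best pricing and performance" can be pictured as a cost filter over provider quotes. The sketch below is purely illustrative: the provider names come from the release, but the `Quote` type, the `pick_provider` helper, and every price and spec figure are assumptions, not Node AI's actual API or data.

```python
# Illustrative sketch (not Node AI's API): choose the cheapest provider
# quote that satisfies a workload's GPU-memory and throughput minimums.
from dataclasses import dataclass

@dataclass
class Quote:
    provider: str       # e.g. "AWS", "Azure", "Vast AI", "GCP", "RunPod"
    usd_per_hour: float
    gpu_mem_gb: int
    tflops: float       # advertised FP16 throughput

def pick_provider(quotes, min_mem_gb, min_tflops):
    """Return the cheapest quote meeting the workload's requirements."""
    eligible = [q for q in quotes
                if q.gpu_mem_gb >= min_mem_gb and q.tflops >= min_tflops]
    if not eligible:
        raise ValueError("no provider meets the requirements")
    return min(eligible, key=lambda q: q.usd_per_hour)

# Hypothetical quotes for demonstration only.
quotes = [
    Quote("AWS", 3.20, 80, 312.0),
    Quote("Vast AI", 1.85, 80, 312.0),
    Quote("RunPod", 2.10, 48, 150.0),
]
best = pick_provider(quotes, min_mem_gb=80, min_tflops=300.0)
print(best.provider)  # prints "Vast AI": cheapest quote that qualifies
```

A production router would also weigh latency, region, and availability, but the core decision is this filter-then-minimize step.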
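The revenue model, in which "real ETH fees from compute usage ... are distributed to stakers," implies a pro-rata payout by staked balance. A minimal sketch, with entirely hypothetical staker names, balances, and fee amounts (not Node AI's actual contract logic):

```python
# Illustrative sketch: split an ETH fee pool among stakers in
# proportion to their staked $GPU balance. All figures are made up.
def distribute_fees(stakes, fee_pool_eth):
    """Return each staker's ETH share, pro rata by staked $GPU."""
    total_staked = sum(stakes.values())
    return {addr: fee_pool_eth * amount / total_staked
            for addr, amount in stakes.items()}

stakes = {"alice": 600_000, "bob": 300_000, "carol": 100_000}  # $GPU staked
payouts = distribute_fees(stakes, fee_pool_eth=10.0)
print(payouts["alice"])  # prints 6.0 — 60% of the 10 ETH pool
```

Because payouts scale with stake, the scheme the release calls "fair participation" reduces to this proportional split.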