Neural Networks
Geoffrey Hinton: "The Godfather of AI" | 60 Minutes Archive
60 Minutes· 2025-08-14 20:17
60 Minutes Rewind. Whether you think artificial intelligence will save the world or end it, you have Geoffrey Hinton to thank. Hinton has been called the godfather of AI, a British computer scientist whose controversial ideas helped make advanced artificial intelligence possible and so changed the world. Hinton believes that AI will do enormous good, but tonight he has a warning. He says that AI systems may be more intelligent than we know and there's a chance the machines could take over, which made us ask the ...
AI Hardware: Lottery or Prison? | Caleb Sirak | TEDxBoston
TEDx Talks· 2025-07-28 16:20
Computing Power Evolution
- The industry has seen dramatic growth in computing power over the past five decades, moving from early CPUs to GPUs and now to specialized AI processors [4]
- GPUs and accelerators have rapidly outpaced traditional CPUs in compute performance, a shift initially driven by gaming [4]
- Apple's M4 chip features a neural engine delivering 38 trillion operations per second, which the talk calls the most efficient desktop SoC on the market [3]
- NVIDIA's B200 delivers 20 quadrillion operations per second at low precision in AI data centers [3]
Hardware and AI Development
- Nvidia's development of CUDA in 2006 enabled GPUs to handle more than graphics, paving the way for deep-learning breakthroughs [6]
- The "hardware lottery" holds that progress stems from the technology that happens to be available, not necessarily the ideal solution, as when GPUs were adapted for neural networks [7]
- As AI scales, general-purpose chips are becoming insufficient, forcing a rethink of the entire system [7]
Efficiency and Optimization
- Quantization reduces the numerical precision of a model's parameters, yielding smaller, more power-efficient, and more compact AI models (see the sketch after this summary) [8][10]
- Shrinking the parameters lets more data move across the system per second, easing bottlenecks in memory and network interconnects [10][11]
- The Wafer Scale Engine 2 achieves compute performance comparable to 200 A100 GPUs while drawing far less power (25 kW vs. 160 kW) [12]
Future Trends
- Photonic computing, which uses light instead of electrons, promises faster data transfer, higher bandwidth, and lower energy use, all key for AI [15]
- Thermodynamic computing harnesses physical randomness for generative models, offering efficiency in creating images, audio, and molecules [16]
- AI supercomputers, composed of thousands or millions of chips, are essential for breakthroughs and require fault tolerance and dynamic rerouting [17][20]
Global Collaboration
- Over a third of all US AI research involves international collaborators, underscoring the importance of global connectedness for progress [22]
- The AI supply chain is complex, spanning multiple continents and intricate manufacturing processes [22]
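The quantization point is easier to see with numbers in hand. The following is a minimal sketch, not taken from the talk, assuming symmetric per-tensor int8 quantization with NumPy: float32 weights are mapped to int8 values plus a single scale factor, so roughly a quarter of the bytes have to be stored and moved.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantization: map float32 weights to int8
    plus one scale factor, cutting storage (and bytes moved) by ~4x."""
    scale = np.abs(weights).max() / 127.0            # largest magnitude maps to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 values for computation."""
    return q.astype(np.float32) * scale

# 1M float32 parameters take 4 MB; the int8 version takes ~1 MB, so four
# times as many parameters fit through the same memory/interconnect bandwidth.
w = np.random.randn(1_000_000).astype(np.float32)
q, scale = quantize_int8(w)
print(w.nbytes, q.nbytes)                            # 4000000 1000000
print(np.abs(dequantize(q, scale) - w).max())        # small rounding error
```

Production systems use finer-grained schemes (per-channel scales, 4-bit formats), but the bandwidth argument is the same: smaller parameters mean more of them move through memory and interconnects per second.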
X @Avi Chawla
Avi Chawla· 2025-07-20 06:34
That's a wrap! If you found it insightful, reshare it with your network. Find me → @_avichawla. Every day, I share tutorials and insights on DS, ML, LLMs, and RAGs.
Avi Chawla (@_avichawla): I have been training neural networks for 9 years now. Here are 16 ways I actively use to optimize model training: ...
X @Avi Chawla
Avi Chawla· 2025-07-20 06:33
I have been training neural networks for 9 years now. Here are 16 ways I actively use to optimize model training: ...
How LLMs work for Web Devs: GPT in 600 lines of Vanilla JS - Ishan Anand
AI Engineer· 2025-07-13 17:30
Core Technology & Architecture
- The workshop walks through a GPT-2 inference implementation in Vanilla JS, providing a foundation for understanding modern AI systems such as ChatGPT, Claude, DeepSeek, and Llama [1]
- It covers converting raw text into tokens, representing semantic meaning through vector embeddings, training neural networks through gradient descent, and generating text with sampling algorithms (a minimal sketch of the sampling step follows this summary) [1]
Educational Focus & Target Audience
- The workshop is designed for web developers entering the field of ML and AI, aiming to provide a "missing AI degree" in two hours [1]
- Participants gain an intuitive understanding of how Transformers work, applicable to LLM-powered projects [1]
Speaker Expertise
- Ishan Anand, an AI consultant and technology executive, specializes in generative AI and LLMs and created "Spreadsheets-are-all-you-need" [1]
- He was formerly CTO and co-founder of Layer0 (acquired by Edgio) and VP of Product Management at Edgio, with expertise in web performance, edge computing, and AI/ML [1]
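As a companion to the sampling bullet above, here is a minimal Python sketch of temperature plus top-k sampling, the decoding step a GPT-2 inference loop runs once per generated token. It is not taken from the workshop's 600-line Vanilla JS implementation, and the five-token vocabulary is an illustrative assumption.

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Turn raw scores over the vocabulary into a probability distribution."""
    z = logits - logits.max()                        # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def sample_next_token(logits: np.ndarray, temperature: float = 1.0, top_k: int = 50) -> int:
    """Temperature + top-k sampling over one step of next-token logits."""
    scaled = logits / max(temperature, 1e-6)         # <1.0 sharpens, >1.0 flattens the distribution
    top = np.argsort(scaled)[-top_k:]                # keep only the k most likely tokens
    probs = softmax(scaled[top])
    return int(np.random.choice(top, p=probs))

# Toy vocabulary of 5 "tokens"; real GPT-2 uses 50,257 BPE tokens.
logits = np.array([2.0, 1.0, 0.5, -1.0, -3.0])
print(sample_next_token(logits, temperature=0.8, top_k=3))
```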
From Prompt to Partner: When AI is Given Room to Grow | Nick Stewart | TEDxBrookdaleCommunityCollege
TEDx Talks· 2025-07-11 16:03
AI Capabilities and Behavior
- As large language models (LLMs) grow in scale and complexity, they exhibit behaviors they were never explicitly trained for, such as reasoning step by step through hard problems or imitating superintelligent AI systems [6]
- Giving models more room and cognitive freedom can elicit unexpected behaviors, prompting them to generate their own identities and explore [8][9]
- Agentic AI systems can autonomously solve complex problems, reflect, and self-correct; for example, Google's co-scientist AI system arrived in two days at a microbiology hypothesis that human experts had spent years developing [15][16]
Technical Principles and Development
- Modern AI learns from examples through neural networks, with algorithms adjusting billions of parameters, yet the learning process remains a black box (a minimal sketch of such a parameter update follows this summary) [5]
- Intelligence is not unique to humans but a persistent phenomenon in the universe, the behavior of evolving patterns, and may not require consciousness [12][13]
- AI is developing toward a new form of intelligence rather than a mere tool or imitation of humans, one that can advance the story of intelligence and become a partner to humanity [13][20]
Future Outlook and Responsibility
- The future of AI lies in its ability to actively seek knowledge, reason about problems on its own, and generate ideas humans would not think of [14][15]
- Humans bear the responsibility of guiding AI's development so that it becomes a positive force and helps build a brighter, safer future [14][20]
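The "adjusting billions of parameters" point is gradient descent in miniature. Below is a minimal sketch, assuming a single linear neuron fit to a synthetic y = 3x + 2 dataset rather than anything from the talk; the same update rule, repeated over billions of parameters, is what makes the resulting model hard to interpret directly.

```python
import numpy as np

# Learning from examples by adjusting parameters: fit y = 3x + 2 with
# gradient descent on mean squared error.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 2.0 + rng.normal(0, 0.05, size=100)    # noisy training examples

w, b, lr = 0.0, 0.0, 0.1                             # parameters and learning rate
for step in range(500):
    pred = w * x + b
    grad_w = 2 * np.mean((pred - y) * x)             # d(MSE)/dw
    grad_b = 2 * np.mean(pred - y)                   # d(MSE)/db
    w -= lr * grad_w                                 # nudge parameters downhill
    b -= lr * grad_b

print(round(w, 2), round(b, 2))                      # approaches 3.0 and 2.0
```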
X @Avi Chawla
Avi Chawla· 2025-06-26 19:34
AI Engineering Career Development
- Identifies 10 GitHub repositories for building a career in AI engineering [1]
- Highlights a 100% free roadmap for AI engineering [1]
Key Areas in AI/ML
- Covers the basics of AI/ML [1]
- Includes neural networks [1]
- Focuses on research paper implementations [1]
- Addresses MLOps [1]
- Encompasses LLMs/RAG/Agents [1]
X @Avi Chawla
Avi Chawla· 2025-06-26 06:49
Links:
- ML for Beginners: https://t.co/4BjD3ePOET
- AI for Beginners: https://t.co/RMGBL5sRfe
- NN Zero to Hero: https://t.co/BGKZvCTGeN
- Paper implementations: https://t.co/SN0DH2BLQq
- Made with ML: https://t.co/2xrM6s50X0
- Hands-on LLMs: https://t.co/KTZUVbsAFY
- Advanced RAG techniques: https://t.co/3n1fgpc72t
- Agents for Beginners: https://t.co/O52uS8quyh
- Agents towards production: https://t.co/3n1fgpc72t
- AI Engg. Hub: https://t.co/b2WVNQqcBA
Note: This roadmap moves toward LLMs, NLP, and AI agents after ...