Hardware Lottery
Large Scale AI on Apple Silicon (as mentioned by @AndrejKarpathy) — Alex Cheema, EXO Labs
AI Engineer · 2025-06-20 22:52
Scientific Rigor and Progress

- Scientific progress is not always linear; inertia within the scientific community can hinder the adoption of new ideas even when the methodology is sound [8][18]
- Questioning assumptions is crucial for scientific advancement, as illustrated by historical examples from physics and from rat experiments [9][16]
- Oversimplifying science is a common pitfall; publishing both successful and unsuccessful results is important for transparency and learning [18][35]

AI Development and Hardware

- The "hardware lottery" holds that the best research ideas in AI don't always win: success often depends on how well an idea fits the available hardware and the existing paradigms [22]
- Large Language Models (LLMs) can create inertia by reinforcing existing practices, such as the dominance of Python, making it harder for new programming languages to gain adoption [23][24]
- GPUs addressed the von Neumann bottleneck of CPUs by changing the ratio of bytes loaded to floating-point operations executed, enabling significant performance improvements in AI [21]

Exo's Solution and Research

- Exo is developing an orchestration layer for AI that runs on different hardware targets, aiming to provide a reliable system for managing distributed devices in ad hoc mesh networks [25]
- Exo models everything as a causally consistent set of events, building a causal graph it can use to reason about the system and keep data consistent across distributed devices [26][27]
- Exo's technology pairs complementary hardware for LLM generation, such as combining an Nvidia Spark (high compute) with a Mac Studio (high memory bandwidth) [28][29]
- Exo is researching new optimizers that are more efficient per FLOP than Adam but require more memory, leveraging the high memory-to-FLOPs ratio of Apple silicon [31][32][33]
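The bytes-loaded-to-FLOPs ratio mentioned above is usually called arithmetic intensity. A rough sketch of the idea (the matmul cost model is the standard back-of-envelope estimate, not from the talk): workloads with many FLOPs per byte can keep a GPU's arithmetic units busy, while low-intensity workloads are memory-bound.

```python
def arithmetic_intensity(flops, bytes_moved):
    """FLOPs executed per byte of memory traffic."""
    return flops / bytes_moved

def matmul_intensity(n):
    """Arithmetic intensity of an n x n float32 matrix multiply:
    ~2*n^3 FLOPs against ~3*n^2 * 4 bytes (read A, read B, write C)."""
    flops = 2 * n ** 3
    bytes_moved = 3 * n ** 2 * 4
    return arithmetic_intensity(flops, bytes_moved)

# Large matmuls do hundreds of FLOPs per byte, so they are compute-bound;
# elementwise ops (intensity ~O(1)) are memory-bound instead.
print(matmul_intensity(4096))  # ≈ 683 FLOPs per byte
```

This is why changing the bytes-to-FLOPs balance, as GPUs did relative to CPUs, matters so much for AI workloads.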
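To illustrate the causally consistent event model, here is a minimal sketch of events that record their causal parents, forming a DAG one can query for "happened-before" relationships. The names and structure are hypothetical, for illustration only, and are not Exo's actual API.

```python
import itertools

class Event:
    """An event that records its causal parents, forming a causal DAG."""
    _ids = itertools.count()

    def __init__(self, data, parents=()):
        self.id = next(Event._ids)
        self.data = data
        self.parents = tuple(parents)

def happened_before(a, b):
    """True if event a is in event b's causal history."""
    frontier = list(b.parents)
    while frontier:
        e = frontier.pop()
        if e is a:
            return True
        frontier.extend(e.parents)
    return False

# Two devices write concurrently; a later merge depends on both.
# Each write is causally before the merge, but neither write
# happened before the other, so the writes are concurrent.
w1 = Event("device A: write x=1")
w2 = Event("device B: write x=2")
merge = Event("merge", parents=[w1, w2])
print(happened_before(w1, merge))  # True
print(happened_before(w1, w2))     # False -> concurrent
```

Reasoning over such a graph lets a distributed system decide which updates are ordered and which need reconciliation.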
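Why pair a high-compute device with a high-bandwidth device? A common back-of-envelope model (an assumption here, not figures from the talk): prefill is roughly compute-bound at about 2 FLOPs per parameter per token, while autoregressive decode streams all weights once per token and is roughly bandwidth-bound. All the numbers below are hypothetical.

```python
def prefill_tokens_per_sec(flops_tflops, params_b):
    """Prefill is roughly compute-bound: ~2 FLOPs per parameter per token."""
    return flops_tflops * 1e12 / (2 * params_b * 1e9)

def decode_tokens_per_sec(bandwidth_gb_s, model_gb):
    """Decode streams every weight once per token, so it is
    roughly limited by memory bandwidth / model size."""
    return bandwidth_gb_s / model_gb

# Hypothetical 70B-parameter model at ~1 byte per parameter (8-bit).
params_b, model_gb = 70, 70
print(prefill_tokens_per_sec(1000, params_b))  # compute-heavy box handles prefill
print(decode_tokens_per_sec(800, model_gb))    # bandwidth-heavy box handles decode
```

Under this model, the two phases stress different resources, which is why routing each phase to the device that is strong in that resource can beat either device alone.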
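A rough sketch of the optimizer memory trade-off: Adam keeps two fp32 state tensors (first and second moments) per parameter, and an optimizer keeping more state per parameter scales that footprint up. The four-state optimizer below is a hypothetical stand-in, not the specific optimizer Exo is researching.

```python
def optimizer_state_gb(params_b, states_per_param, bytes_per_state=4):
    """Extra optimizer memory (GB) beyond the weights themselves,
    for a model with params_b billion parameters."""
    return params_b * 1e9 * states_per_param * bytes_per_state / 1e9

# Adam: two fp32 moments per parameter.
print(optimizer_state_gb(7, 2))  # 7B model -> 56.0 GB of optimizer state
# A hypothetical optimizer keeping four states per parameter.
print(optimizer_state_gb(7, 4))  # 112.0 GB
```

On hardware with a high memory-to-FLOPs ratio, spending extra memory like this to save FLOPs per step is a plausible trade.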