Dive into the next phase of AI evolution with insights from Christopher Olston of Google Research on how Big Data is transforming artificial intelligence. Explore the key elements of successful Big-AI projects, from algorithms to scalable data management, and learn the essential optimization techniques to keep your project thriving. Discover the importance of maintaining and evolving your codebase while staying informed about cutting-edge algorithms. Embrace higher-level programming abstractions for enhanced efficiency and seamless scalability in your AI endeavors.
We can be at the center of AI 2.0
Christopher Olston, Google Research
AI is getting its groove back
• ... largely thanks to Big Data
  • e.g. Watson, Siri, Google Translate
• Building Big-AI systems is easy, thanks to scalable data management building blocks
  • BigTable, Map-Reduce, Pregel, ...
• Life is good
NOT REALLY …
• Life of a Big-AI project:
  • Commit to an algorithm
  • Bust it up into map functions, co-processors, ...
  • Optimize the crap out of it (see the sketch after this list):
    • Caching, batching
    • Indexing, clever encoding
    • "Stupid map-reduce tricks"
  • Never, ever disband the project (who else could understand the debris field that is your code?)
  • To keep entertained while you maintain your ossified code, read papers about new algorithms and muse, "it would be cool if we could try that"
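To make the pain concrete, here is a minimal Python sketch of what such hand-optimization typically produces. It is invented for this summary, not taken from the talk, and every name in it (`mapper`, `lookup_feature`, `BATCH_SIZE`) is hypothetical: the underlying counting algorithm is trivial, but the caching, batching, and encoding tricks are welded directly into the map function.

```python
from collections import defaultdict

BATCH_SIZE = 1000   # magic number, tuned once and never revisited
_feature_cache = {}  # hand-rolled "caching" of an expensive lookup

def lookup_feature(token):
    # "Clever encoding": pack each token into a 16-bit key to shrink
    # shuffle traffic. Lossy and collision-prone, but it was fast once.
    if token not in _feature_cache:
        _feature_cache[token] = hash(token) & 0xFFFF
    return _feature_cache[token]

def mapper(records, emit):
    # One shard of the busted-up algorithm. Partial counts are buffered
    # ("batching") and flushed periodically instead of emitted per token.
    buffer = defaultdict(int)
    for i, record in enumerate(records, 1):
        for token in record.split():
            buffer[lookup_feature(token)] += 1
        if i % BATCH_SIZE == 0:
            for key, count in buffer.items():
                emit(key, count)
            buffer.clear()
    for key, count in buffer.items():  # final flush
        emit(key, count)

# Local smoke test: collect emitted pairs into a dict-based reducer.
totals = defaultdict(int)
def emit(key, value):
    totals[key] += value

mapper(["big data big ai", "ai 2.0"], emit)
```

Trying a new algorithm in code like this means untangling all three tricks at once, which is exactly how a project's codebase ossifies.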
We Need Higher-Level Programming Abstractions
• But unlike SQL etc.:
  • Power: Turing complete
  • Syntax: math should look like math
  • Control: physical transparency
• Declarative programs that "just work" on small data (for experimentation, debugging)
• Target scalable platforms (e.g. map-reduce), and choose which optimizations to apply, via operational-style annotations (sketched below)
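Here is a minimal sketch of what this style could look like, assuming a hypothetical `@scalable` annotation (neither the decorator nor any name below comes from the talk): the spec itself is plain, declarative Python that runs directly on small data, while the annotation carries the operational choices a compiler could use to target a scalable backend.

```python
def scalable(backend="local", use_combiner=False):
    """Hypothetical annotation: attaches an execution plan to a spec
    without changing the spec itself (physical transparency)."""
    def wrap(fn):
        fn.plan = {"backend": backend, "use_combiner": use_combiner}
        return fn
    return wrap

@scalable(backend="mapreduce", use_combiner=True)  # annotation, not a rewrite
def term_counts(docs):
    """Declarative spec: count(t) = number of occurrences of t across docs."""
    counts = {}
    for doc in docs:
        for term in doc.split():
            counts[term] = counts.get(term, 0) + 1
    return counts

# "Just works" on small data, for experimentation and debugging:
print(term_counts(["big data", "big ai"]))  # {'big': 2, 'data': 1, 'ai': 1}

# A compiler/runtime could read the annotation to pick physical operators:
print(term_counts.plan)  # {'backend': 'mapreduce', 'use_combiner': True}
```

The design point this sketch illustrates: experimentation happens on the unannotated small-data path, and scaling up changes the annotation, not the algorithm.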