Deep Learning Content on InfoQ
-
Denys Linkov on Micro Metrics for LLM System Evaluation
Live from QCon San Francisco, we talk with Denys Linkov, Head of Machine Learning at Voiceflow. Linkov shares insights on using micro metrics to evaluate and refine large language model (LLM) systems, highlighting the importance of granular evaluation, continuous iteration, and rigorous prompt engineering in building reliable, user-focused AI systems.
-
Generally AI - Season 2 - Episode 6: The Godfathers of Programming and AI
The hosts discuss the Godfather of AI, Geoffrey Hinton, who helped develop and popularize pivotal algorithms such as backpropagation, co-developed the t-SNE technique for visualizing high-dimensional data, and inspired a resurgence in neural networks with AlexNet's success. They then turn to John von Neumann, whose impact spanned mathematics, the Manhattan Project, and game theory, and, most importantly for this episode, the von Neumann computer architecture.
-
Namee Oberst on Small Language Models and How They are Enabling AI-Powered PCs
In this podcast, Namee Oberst, co-founder of AI Bloks, the company behind the AI framework LLMWare, discusses Small Language Models (SLMs), a recent trend in generative AI and language model technology, and how these smaller models enable edge computing on devices and power AI-powered PCs.
-
Generally AI - Season 2 - Episode 3: Surviving the AI Winter
Roland Meertens and Anthony Alford discuss the historical cycles of AI "summers" and "winters": alternating periods of optimism and decline in AI research. The conversation follows the story of neural networks through to the resurgence of AI with backpropagation and deep learning in the 2010s. They also explore the potential for a future "AI winter" as technological advances face both hype and skepticism.