Large Language Models Content on InfoQ
-
Namee Oberst on Small Language Models and How They are Enabling AI-Powered PCs
In this podcast, Namee Oberst, co-founder of AI Bloks, the company behind the AI framework LLMWare, discusses recent trends in generative AI and language model technologies, focusing on Small Language Models (SLMs) and how these smaller models are enabling edge computing on devices and powering AI-enabled PCs.
-
Generally AI - Season 2 - Episode 1: Generative AI and Creativity
Hosts Roland Meertens and Anthony Alford discuss how AI is being used to make creativity more accessible. While some Generative AI content lacks variety and artistic depth, there is potential for AI to assist human creators rather than replace them. They also explore the challenge of evaluating generative AI models.
-
AI, ML, and Data Engineering InfoQ Trends Report 2024
One of the regular features of InfoQ is its trends reports, each of which focuses on a different aspect of software development. These reports provide the InfoQ audience with a high-level overview of the topics to pay attention to this year. In this episode, members of the InfoQ editorial staff and friends of InfoQ discuss the current trends in the domain of AI, ML, and data engineering.
-
Meryem Arik on LLM Deployment, State-of-the-Art RAG Apps, and Inference Architecture Stack
In this podcast, Meryem Arik, co-founder/CEO at TitanML, discusses innovations in generative AI and Large Language Model (LLM) technologies, including the current state of large language models, LLM deployment, state-of-the-art Retrieval Augmented Generation (RAG) apps, and the inference architecture stack for LLM applications.
-
Making Code Explain Itself – Observability Through AI
In this podcast, Shane Hastie, Lead Editor for Culture & Methods, spoke to Dr. Elizabeth Lawler, founder and CEO of AppMap, about observability in the age of AI, creativity in programming, and the problems developers face on a day-to-day basis.