Generative AI 101
Welcome to Generative AI 101, your go-to podcast for learning the basics of generative artificial intelligence in easy-to-understand, bite-sized episodes. Join host Emily Laird, AI Integration Technologist and AI lecturer, to explore key concepts, applications, and ethical considerations, making AI accessible for everyone.
Episodes

Wednesday Jul 17, 2024
What is Anthropic's Claude?
In this episode of Generative AI 101, we explore the world of Anthropic's Claude AI—a chatbot born from the minds of ex-OpenAI siblings and backed by tech giants like Google and Amazon. Picture Claude as the cool, thoughtful cousin of ChatGPT, capable of handling 200,000 tokens at once. With the latest iteration, Claude 3.5 Sonnet, this AI sets new benchmarks in graduate-level reasoning, broad knowledge, and coding proficiency. Whether you need an advanced model for automation or a speed demon for instant translation, Claude is redefining what smart, safe AI can do.
Connect with Emily Laird on LinkedIn
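If you want to try Claude yourself after listening, here is a minimal sketch using Anthropic's official Python SDK. It assumes the `anthropic` package is installed, an API key is set in the `ANTHROPIC_API_KEY` environment variable, and the Claude 3.5 Sonnet model ID shown below; check Anthropic's documentation for current model names.

```python
# A minimal sketch, assuming the `anthropic` Python SDK and an ANTHROPIC_API_KEY env var.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.messages.create(
    model="claude-3-5-sonnet-20240620",  # Claude 3.5 Sonnet; model IDs may change over time
    max_tokens=300,                       # caps the reply length, not the 200,000-token context window
    messages=[
        {"role": "user", "content": "Summarize what a context window is in two sentences."}
    ],
)

print(response.content[0].text)
```

Note that `max_tokens` only limits the length of the reply; the 200,000-token figure from the episode refers to how much context Claude can read at once.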

Tuesday Jul 16, 2024
What is Google's Gemini?
In this episode of Generative AI 101, we explore the slick, high-octane world of Google's Gemini. Think of it as the James Bond of AI—sharp, sophisticated, and always ahead of the curve. We'll dish out the inside scoop on Gemini's cutting-edge tech, from its inception to its rise as a superstar in the AI universe. Get ready to explore its suave capabilities, from powering chatty virtual assistants to mastering the nuances of human language and crunching mountains of data like it's nothing. Plus, we'll sprinkle in some juicy tidbits about Gemini's meteoric rise, its flair for languages, and the latest bells and whistles.
Connect with Emily Laird on LinkedIn
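For the hands-on crowd, here is a rough sketch of calling Gemini from Python with Google's `google-generativeai` package. The package, the model name, and the API-key setup are assumptions for illustration; consult Google's documentation for what is current.

```python
# A minimal sketch, assuming the google-generativeai package and a valid API key.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")            # placeholder key from Google AI Studio
model = genai.GenerativeModel("gemini-1.5-flash")  # model name may differ in your account
response = model.generate_content("Explain in two sentences what makes a model multimodal.")
print(response.text)
```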

Monday Jul 15, 2024
What is ChatGPT?
In this episode of Generative AI 101, we explore the modern marvel that is ChatGPT. Discover what "GPT" stands for and how this "Generative Pre-trained Transformer" operates, processing text like a high-powered engine. Learn what ChatGPT can do, from generating human-like responses to engaging in multi-language conversations and analyzing vast text data. Tune in to explore the incredible capabilities and the global impact of ChatGPT.
Connect with Emily Laird on LinkedIn
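If the episode leaves you wanting to poke at GPT programmatically, here is a small sketch against OpenAI's chat API. It assumes the `openai` Python package (v1 or later), an `OPENAI_API_KEY` environment variable, and whichever chat-capable model your account can access.

```python
# A minimal sketch, assuming the openai Python SDK (v1+) and an OPENAI_API_KEY env var.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

completion = client.chat.completions.create(
    model="gpt-4o",  # substitute any chat-capable model available to you
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "In one sentence, what does 'Generative Pre-trained Transformer' mean?"},
    ],
)
print(completion.choices[0].message.content)
```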

Wednesday Jul 10, 2024
How LLMs Make Coherent Text
In this episode of Generative AI 101, go on an insider’s tour of a large language model (LLM). Discover how each component, from the transformer architecture and positional encoding to the multi-head attention layers and feed-forward neural networks, contributes to creating intelligent, coherent text. We’ll explore tokenization and resource management techniques like mixed-precision training and model parallelism. Join us for a fascinating look at the complex, finely-tuned process that powers modern AI, turning raw text into human-like responses.
Connect with Emily Laird on LinkedIn
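To make those components concrete, here is a toy, untrained PyTorch sketch of a single transformer block, showing token embeddings, positional encoding, multi-head attention, and the feed-forward network discussed in the episode. The dimensions are illustrative only, and real LLMs stack many such blocks.

```python
# A toy, untrained transformer block in PyTorch (dimensions are illustrative, not realistic).
import torch
import torch.nn as nn

vocab_size, d_model, n_heads, seq_len = 1000, 64, 4, 16

tokens = torch.randint(0, vocab_size, (1, seq_len))       # tokenized input IDs
embed = nn.Embedding(vocab_size, d_model)(tokens)         # token embeddings
positions = torch.arange(seq_len)
pos_embed = nn.Embedding(seq_len, d_model)(positions)     # learned positional encoding
x = embed + pos_embed                                      # inject word order into the embeddings

attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
attn_out, _ = attn(x, x, x)                                # multi-head self-attention
x = nn.LayerNorm(d_model)(x + attn_out)                    # residual connection + layer norm

ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(), nn.Linear(4 * d_model, d_model))
x = nn.LayerNorm(d_model)(x + ffn(x))                      # feed-forward network + residual + norm

logits = nn.Linear(d_model, vocab_size)(x)                 # scores over the vocabulary for each position
print(logits.shape)                                        # torch.Size([1, 16, 1000])
```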

Tuesday Jul 09, 2024
Training Large Language Models (LLMs)
In this episode of Generative AI 101, we explore the intricate process of training Large Language Models (LLMs). Imagine training a brilliant student with the entire internet as their textbook—books, academic papers, Wikipedia, social media posts, and code repositories. We’ll cover the stages of data collection, cleaning, and tokenization. Learn how transformers, with their self-attention mechanisms, help these models understand and generate coherent text. Discover the training process using powerful GPUs or TPUs and techniques like distributed and mixed precision training. We'll also address the challenges, including the need for computational resources and ensuring data diversity. Finally, understand how fine-tuning these models for specific tasks makes them even more capable.
Connect with Emily Laird on LinkedIn
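As a taste of what that looks like in code, here is a toy mixed-precision training step in PyTorch. The two-layer "model" and the random token batch are stand-ins for a real LLM and a cleaned, tokenized corpus, and distributed training across many GPUs is left out for brevity.

```python
# A toy mixed-precision training step (the tiny model and random batch are placeholders).
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
vocab_size, d_model = 1000, 64

model = nn.Sequential(nn.Embedding(vocab_size, d_model), nn.Linear(d_model, vocab_size)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

inputs = torch.randint(0, vocab_size, (8, 32), device=device)  # a batch of token IDs
targets = torch.roll(inputs, shifts=-1, dims=1)                # objective: predict the next token

with torch.autocast(device_type=device, enabled=(device == "cuda")):
    logits = model(inputs)                                     # (batch, sequence, vocab)
    loss = nn.functional.cross_entropy(logits.view(-1, vocab_size), targets.view(-1))

scaler.scale(loss).backward()   # gradient scaling keeps small fp16 gradients from underflowing
scaler.step(optimizer)
scaler.update()
optimizer.zero_grad()
print(loss.item())
```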

Monday Jul 08, 2024
The Evolution of Large Language Models (LLMs)
In this episode of Generative AI 101, we trace the evolution of Large Language Models (LLMs) from their early, simplistic beginnings to the sophisticated powerhouses they are today. Starting with basic models that struggled with coherence, we'll see how the introduction of transformers in 2017 revolutionized the field. Discover how models like GPT-2 and GPT-3 brought human-like text generation to new heights, and learn about the advancements in GPT-4, which offers even greater accuracy and versatility. Join us to understand the incredible journey of LLMs, from data training to fine-tuning, and how they've transformed our digital interactions.
Connect with Emily Laird on LinkedIn

Wednesday Jul 03, 2024
What is a Large Language Model (LLM)?
In this episode of Generative AI 101, we explore Large Language Models (LLMs) and their significance. Imagine chatting with an AI that feels almost human—you're likely interacting with an LLM. These models, trained on massive datasets, understand and generate text with impressive accuracy. With billions of parameters, they handle a wide range of tasks from chatbots and virtual assistants to sentiment analysis and document summarization.
Connect with Emily Laird on LinkedIn
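If you would like to poke at a (very small) language model yourself, here is a quick sketch using Hugging Face's `transformers` library. GPT-2 is tiny next to today's LLMs, but the text-in, text-out interface is the same idea the episode describes; the package install and the model download on first run are assumed.

```python
# A minimal sketch, assuming the transformers package (with torch) is installed;
# the GPT-2 weights are downloaded automatically on first run.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Large language models are", max_new_tokens=30)
print(result[0]["generated_text"])
```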

Tuesday Jul 02, 2024
Natural Language Processing Techniques & Concepts
In this episode of Generative AI 101, we explore the core techniques and methods in Natural Language Processing (NLP). Starting with rule-based approaches that rely on handcrafted rules, we move to statistical models that learn patterns from vast amounts of data. We'll explain n-gram models and their limitations before diving into the revolution brought by machine learning, where algorithms like Support Vector Machines (SVMs) and decision trees learn from annotated datasets. Finally, we arrive at deep learning and neural networks, particularly Transformers, which enable advanced models like BERT and GPT-3 to understand context and generate human-like text.
Connect with Emily Laird on LinkedIn
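To ground the classical machine-learning stage from the episode, here is a toy scikit-learn sketch: word and bigram features (the n-grams mentioned above) feeding a linear SVM trained on a handful of hand-labeled sentences. Both scikit-learn and the tiny made-up dataset are assumptions for illustration.

```python
# A toy sketch of the classical-ML stage: n-gram features plus a linear SVM (scikit-learn assumed).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

texts = ["I loved this movie", "what a great episode", "terrible and boring", "I hated every minute"]
labels = ["pos", "pos", "neg", "neg"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())  # unigram + bigram features
clf.fit(texts, labels)
print(clf.predict(["this was great", "boring movie"]))  # expected ['pos' 'neg'] on this toy data
```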

Monday Jul 01, 2024
Natural Language Processing (NLP) Concepts
In this episode of Generative AI 101, we break down the fundamental concepts of Natural Language Processing (NLP). Imagine trying to read a book that's one long, unbroken string of text—impossible, right? That’s where tokenization comes in, breaking text into manageable chunks. We’ll also cover stemming and lemmatization, techniques for reducing words to their root forms, and explain the importance of stop words—the linguistic background noise. Finally, we’ll explore Named Entity Recognition (NER), which identifies key names and places in text. These basics form the foundation of NLP, making our interactions with technology smoother and more intuitive.
Connect with Emily Laird on LinkedIn
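For a hands-on feel, here is a short NLTK sketch walking through tokenization, stemming, lemmatization, and stop-word removal on a single sentence. NLTK and its downloadable data packages are assumptions; Named Entity Recognition, also covered in the episode, usually calls for a library such as spaCy and is omitted here.

```python
# A minimal NLTK walk-through: tokenization, stemming, lemmatization, stop-word removal.
# Newer NLTK versions may also require the "punkt_tab" data package for word_tokenize.
import nltk
from nltk.tokenize import word_tokenize
from nltk.stem import PorterStemmer, WordNetLemmatizer
from nltk.corpus import stopwords

nltk.download("punkt"); nltk.download("wordnet"); nltk.download("stopwords")

text = "The running dogs were chasing cats across the gardens"
tokens = word_tokenize(text)                               # break the text into manageable chunks
print(tokens)

stemmer, lemmatizer = PorterStemmer(), WordNetLemmatizer()
print([stemmer.stem(t) for t in tokens])                   # crude root forms: "running" -> "run"
print([lemmatizer.lemmatize(t.lower()) for t in tokens])   # dictionary forms: "dogs" -> "dog"

stops = set(stopwords.words("english"))
print([t for t in tokens if t.lower() not in stops])       # drop the linguistic background noise
```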

Friday Jun 28, 2024
The History of Natural Language Processing (NLP)
In this episode of Generative AI 101, we journey through the captivating history of Natural Language Processing (NLP), from Alan Turing's pioneering question "Can machines think?" to the game-changing advancements of modern AI. Discover how NLP evolved from early rule-based systems and statistical methods to the revolutionary introduction of machine learning, deep learning, and OpenAI's GPT-3. Tune in to understand how these milestones have transformed machines' ability to understand and generate human language, making our tech experiences smoother and more intuitive.
Connect with Emily Laird on LinkedIn

Lecturer + Speaker
Transform your business with Emily Laird's captivating presentation on Generative AI. An AI expert and dynamic speaker, Emily breaks down complex concepts in a way that is both clear and entertaining. Perfect for businesses and organizations eager to discover AI's potential.