Generative AI 101
Welcome to Generative AI 101, your go-to podcast for learning the basics of generative artificial intelligence in easy-to-understand, bite-sized episodes. Join host Emily Laird, AI Integration Technologist and AI lecturer, to explore key concepts, applications, and ethical considerations, making AI accessible for everyone.
Episodes

Wednesday Jul 10, 2024
How LLMs Make Coherent Text
In this episode of Generative AI 101, go on an insider’s tour of a large language model (LLM). Discover how each component, from the transformer architecture and positional encoding to the multi-head attention layers and feed-forward neural networks, contributes to creating intelligent, coherent text. We’ll explore tokenization and resource management techniques like mixed-precision training and model parallelism. Join us for a fascinating look at the complex, finely tuned process that powers modern AI, turning raw text into human-like responses.
Connect with Emily Laird on LinkedIn
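For listeners who want to see one of these components in code, here is a minimal sketch of sinusoidal positional encoding, one common way of telling a transformer where each token sits in a sequence. The NumPy implementation and dimensions are illustrative assumptions, not taken from the episode.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal position encodings,
    alternating sine and cosine across the embedding dimensions."""
    positions = np.arange(seq_len)[:, np.newaxis]            # (seq_len, 1)
    dims = np.arange(d_model)[np.newaxis, :]                 # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                         # (seq_len, d_model)
    encoding = np.zeros((seq_len, d_model))
    encoding[:, 0::2] = np.sin(angles[:, 0::2])              # even dimensions: sine
    encoding[:, 1::2] = np.cos(angles[:, 1::2])              # odd dimensions: cosine
    return encoding

# Example: positions for a 10-token sequence in a 16-dimensional toy model.
print(sinusoidal_positional_encoding(10, 16).shape)          # (10, 16)
```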

Tuesday Jul 09, 2024
Training Large Language Models (LLMs)
In this episode of Generative AI 101, we explore the intricate process of training Large Language Models (LLMs). Imagine training a brilliant student with the entire internet as their textbook—books, academic papers, Wikipedia, social media posts, and code repositories. We’ll cover the stages of data collection, cleaning, and tokenization. Learn how transformers, with their self-attention mechanisms, help these models understand and generate coherent text. Discover the training process using powerful GPUs or TPUs and techniques like distributed and mixed-precision training. We'll also address the challenges, including the need for computational resources and ensuring data diversity. Finally, understand how fine-tuning these models for specific tasks makes them even more capable.
Connect with Emily Laird on LinkedIn
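As a rough illustration of the mixed-precision training mentioned above, the sketch below shows the common PyTorch pattern of autocasting the forward pass and scaling the loss. The model, data, and hyperparameters are placeholders chosen for this example, and it assumes a CUDA-capable GPU; it is not the training setup of any particular LLM.

```python
import torch
from torch import nn

# Placeholder model and synthetic data purely for illustration.
model = nn.Linear(512, 512).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler()       # scales the loss to avoid fp16 underflow

for step in range(100):
    inputs = torch.randn(32, 512, device="cuda")
    targets = torch.randn(32, 512, device="cuda")

    optimizer.zero_grad()
    with torch.cuda.amp.autocast():         # run the forward pass in half precision
        loss = nn.functional.mse_loss(model(inputs), targets)

    scaler.scale(loss).backward()           # backprop with the scaled loss
    scaler.step(optimizer)                  # unscale gradients, then update weights
    scaler.update()                         # adjust the scale factor for the next step
```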

Monday Jul 08, 2024
The Evolution of Large Language Models (LLMs)
In this episode of Generative AI 101, we trace the evolution of Large Language Models (LLMs) from their early, simplistic beginnings to the sophisticated powerhouses they are today. Starting with basic models that struggled with coherence, we'll see how the introduction of transformers in 2017 revolutionized the field. Discover how models like GPT-2 and GPT-3 brought human-like text generation to new heights, and learn about the advancements in GPT-4, which offers even greater accuracy and versatility. Join us to understand the incredible journey of LLMs, from data training to fine-tuning, and how they've transformed our digital interactions.
Connect with Emily Laird on LinkedIn

Wednesday Jul 03, 2024
What is a Large Language Model (LLM)?
In this episode of Generative AI 101, we explore Large Language Models (LLMs) and their significance. Imagine chatting with an AI that feels almost human—you're likely interacting with an LLM. These models, trained on massive datasets, understand and generate text with impressive accuracy. With billions of parameters, they handle a wide range of tasks from chatbots and virtual assistants to sentiment analysis and document summarization.
Connect with Emily Laird on LinkedIn

Tuesday Jul 02, 2024
Natural Language Processing Techniques & Concepts
In this episode of Generative AI 101, we explore the core techniques and methods in Natural Language Processing (NLP). Starting with rule-based approaches that rely on handcrafted rules, we move to statistical models that learn patterns from vast amounts of data. We'll explain n-gram models and their limitations before diving into the revolution brought by machine learning, where algorithms like Support Vector Machines (SVMs) and decision trees learn from annotated datasets. Finally, we arrive at deep learning and neural networks, particularly Transformers, which enable advanced models like BERT and GPT-3 to understand context and generate human-like text.
Connect with Emily Laird on LinkedIn
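To make the n-gram idea concrete, here is a toy bigram model in plain Python: it counts word pairs in a tiny made-up corpus and predicts the most likely next word. It is a deliberately simplified sketch of the statistical approach the episode contrasts with deep learning, not a production language model.

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each preceding word (bigram counts).
bigram_counts = defaultdict(Counter)
for prev_word, next_word in zip(corpus, corpus[1:]):
    bigram_counts[prev_word][next_word] += 1

def predict_next(word: str) -> str:
    """Return the word most frequently observed after `word` in the corpus."""
    followers = bigram_counts.get(word)
    return followers.most_common(1)[0][0] if followers else "<unknown>"

print(predict_next("the"))   # 'cat' — seen twice after 'the'
print(predict_next("cat"))   # 'sat' or 'ate' (a tie in this toy corpus)
```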

Monday Jul 01, 2024
Natural Language Processing (NLP) Concepts
In this episode of Generative AI 101, we break down the fundamental concepts of Natural Language Processing (NLP). Imagine trying to read a book that's one long, unbroken string of text—impossible, right? That’s where tokenization comes in, breaking text into manageable chunks. We’ll also cover stemming and lemmatization, techniques for reducing words to their root forms, and explain the importance of stop words—the linguistic background noise. Finally, we’ll explore Named Entity Recognition (NER), which identifies key names and places in text. These basics form the foundation of NLP, making our interactions with technology smoother and more intuitive.
Connect with Emily Laird on LinkedIn
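For listeners who want to try these concepts hands-on, the snippet below walks through tokenization, stemming, lemmatization, and stop-word removal using NLTK as one possible library; it assumes NLTK is installed and its data packages ("punkt", "wordnet", "stopwords") have been downloaded. Named Entity Recognition is typically handled by a separate pipeline such as spaCy and is omitted here.

```python
import nltk
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer, WordNetLemmatizer

# One-time downloads (uncomment on first run):
# nltk.download("punkt"); nltk.download("wordnet"); nltk.download("stopwords")

text = "The runners were running quickly through the parks of London."

tokens = nltk.word_tokenize(text)                      # break text into word tokens
stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()

stems = [stemmer.stem(t) for t in tokens]              # crude root forms: 'running' -> 'run'
lemmas = [lemmatizer.lemmatize(t) for t in tokens]     # dictionary forms: 'parks' -> 'park'
content = [t for t in tokens if t.lower() not in stopwords.words("english")]

print(stems)
print(lemmas)
print(content)   # stop words like 'the', 'were', 'of' are filtered out
```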

Friday Jun 28, 2024
The History of Natural Language Processing (NLP)
In this episode of Generative AI 101, we journey through the captivating history of Natural Language Processing (NLP), from Alan Turing's pioneering question "Can machines think?" to the game-changing advancements of modern AI. Discover how NLP evolved from early rule-based systems and statistical methods to the revolutionary introduction of machine learning, deep learning, and OpenAI's GPT-3. Tune in to understand how these milestones have transformed machines' ability to understand and generate human language, making our tech experiences smoother and more intuitive.
Connect with Emily Laird on LinkedIn

Thursday Jun 27, 2024
What is Natural Language Processing (NLP)?
Let's explore Natural Language Processing (NLP). Picture this: you’re chatting with your phone, asking it to find the nearest pizza joint, and it not only understands you but also provides a list of places with mouth-watering photos. That’s NLP in action. We'll explain how NLP allows machines to interpret and respond to human language naturally, like teaching a robot to be a linguist. Discover its key applications, from virtual assistants and machine translation to sentiment analysis and healthcare. Tune in to learn why NLP is the magic making our interactions with technology smoother and more intuitive.
Connect with Emily Laird on LinkedIn

Wednesday Jun 26, 2024
Transformers Mini Series: How do Transformers Process Text?
In this episode of Generative AI 101, we explore how Transformers break down text into tokens. Imagine turning a big, colorful pile of Lego blocks into individual pieces to build something cool—this is what tokenization does for AI models. Emily explains what tokens are, how they work, and why they’re the magic behind GenAI’s impressive outputs. Learn how Transformers assign numerical values to tokens and process them in parallel, allowing them to understand context, detect patterns, and generate coherent text. Tune in to discover why tokenization is important for tasks like language translation and text summarization.
Connect with Emily Laird on LinkedIn
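As a concrete illustration of tokens becoming numbers, the sketch below uses the Hugging Face transformers library with the GPT-2 tokenizer; the choice of library and model is our own assumption for the example, not something specified in the episode, and the exact subword splits will vary by tokenizer.

```python
# Requires: pip install transformers
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

text = "Transformers turn text into tokens."
token_ids = tokenizer.encode(text)                    # map text to integer IDs
tokens = tokenizer.convert_ids_to_tokens(token_ids)   # inspect the pieces behind the IDs

print(tokens)      # subword pieces, e.g. ['Trans', 'formers', ...]
print(token_ids)   # the numerical values the model actually processes
```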

Tuesday Jun 25, 2024
Transformers Mini Series: How do Transformers work?
In part two of our Transformer mini-series, we peel back the layers to uncover the mechanics that make Transformers the rock stars of the AI world. Think of this episode as your backstage pass to understanding how these models operate. We’ll break down the self-attention mechanism, comparing it to having superhuman hearing at a party, and explore the power of multi-head attention, likened to having multiple sets of ears tuned to different conversations. We also delve into the rigorous training process of Transformers, from the use of GPUs and TPUs to optimization strategies.
Connect with Emily Laird on LinkedIn
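For those curious about what self-attention looks like in code, here is a minimal NumPy sketch of scaled dot-product attention, the core operation described in the episode. It deliberately omits the multi-head projections, masking, and everything else a real Transformer layer adds, and the toy shapes are our own assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal single-head attention: each query attends to all keys,
    and each output is a weighted sum of the values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                             # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)     # softmax over keys
    return weights @ V                                          # blend values by attention weight

# Toy example: 4 tokens, each an 8-dimensional embedding.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(x, x, x).shape)              # (4, 8)
```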

Lecturer + Speaker
Transform your business with Emily Laird's captivating presentation on Generative AI. An AI expert and dynamic speaker, Emily breaks down complex concepts in a way that is both clear and entertaining. Perfect for businesses and organizations eager to discover AI's potential.
