Session Proposals

Abhigyan Shrivastava

Building Context-Aware RAG Models in Python

In this talk, we’ll walk through how to build context-aware Retrieval-Augmented Generation (RAG) systems in Python. The idea is to combine the power of large language models with our own custom data to enable semantic search over that data. Whether we’re creating AI assistants, document Q&A systems, or chatbots, this session, drawing on my experience building several RAG models at Adobe, will cover both the fundamentals and the more advanced implementation details of retrieval-augmented LLMs in Python.

The talk will begin with a quick overview of RAG systems and why they matter, and then I’ll dive into how to set up semantic search with vector databases and connect that with LLMs for contextual responses.

1. Introduction to RAG: what it is and why it's useful

2. How to convert documents into vector embeddings for search

3. Using FAISS or Chroma to store and query vectors in Python

4. Hooking up LLMs to retrieve context-aware answers

5. Common challenges (like hallucinations) and how to address them
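The core retrieval loop from steps 2–4 can be sketched in a few lines. This is a toy illustration, not code from the talk: the hashed bag-of-words `embed` function stands in for a real embedding model (e.g. a sentence-transformer), and `TinyVectorStore` stands in for FAISS or Chroma with brute-force cosine search.

```python
import math
from collections import Counter

def embed(text, dim=64):
    """Toy embedding: hashed bag-of-words, L2-normalized.
    A real RAG system would call an embedding model here."""
    vec = [0.0] * dim
    for word, count in Counter(text.lower().split()).items():
        vec[hash(word) % dim] += count
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a, b):
    return sum(x * y for x, y in zip(a, b))

class TinyVectorStore:
    """Stand-in for FAISS/Chroma: stores vectors, ranks by cosine similarity."""
    def __init__(self):
        self.docs, self.vecs = [], []

    def add(self, doc):
        self.docs.append(doc)
        self.vecs.append(embed(doc))

    def query(self, text, k=1):
        qv = embed(text)
        ranked = sorted(zip(self.docs, self.vecs),
                        key=lambda dv: cosine(qv, dv[1]), reverse=True)
        return [doc for doc, _ in ranked[:k]]

store = TinyVectorStore()
store.add("RAG combines retrieval with generation")
store.add("Kafka is a distributed event streaming platform")

# Retrieve context and build a grounded prompt for the LLM (step 4).
context = store.query("what is retrieval augmented generation", k=1)
prompt = f"Answer using only this context: {context[0]}"
```

Grounding the prompt in retrieved context, rather than relying on the model's parametric memory alone, is also the main lever against the hallucination problem mentioned in step 5.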

Olena Kutsenko

Bringing stories to life with AI, data streaming and generative agents

Storytelling has always been a way to connect and imagine new worlds. Now, with Generative Agents - AI-powered characters that can think, act, and adapt - we can take storytelling to a whole new level. But what if these agents could change and grow in real time, driven by live data streams?

Inspired by the Stanford paper "Generative Agents: Interactive Simulacra of Human Behavior", this session explores how to build dynamic, AI-driven worlds using Apache Kafka, Apache Flink, and Apache Iceberg. We'll use a Large Language Model to power conversation and agent decision-making, integrate Retrieval-Augmented Generation (RAG) for memory storage and retrieval, and use Python to tie it all together. Along the way, we’ll examine different approaches for data processing, storage, and analysis.
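As a rough illustration of the memory mechanism from the Stanford paper, an agent can store timestamped observations and retrieve them by a blend of recency and relevance. This is a hypothetical sketch, not code from the session: relevance here is plain keyword overlap, where a real implementation would use vector embeddings backed by the RAG store described above.

```python
class AgentMemory:
    """Sketch of a generative agent's memory stream: observations are
    stored with timestamps and retrieved by recency + keyword relevance."""
    def __init__(self, decay=0.99):
        self.decay = decay
        self.memories = []  # list of (timestamp, text)

    def observe(self, text, ts):
        self.memories.append((ts, text))

    def retrieve(self, query, now, k=2):
        q_words = set(query.lower().split())

        def score(mem):
            ts, text = mem
            recency = self.decay ** (now - ts)  # exponential time decay
            overlap = len(q_words & set(text.lower().split()))
            return recency + overlap

        ranked = sorted(self.memories, key=score, reverse=True)
        return [text for _, text in ranked[:k]]

mem = AgentMemory()
mem.observe("met Klaus at the library", ts=0.0)
mem.observe("planning a Valentine's party", ts=50.0)
mem.observe("bought coffee downtown", ts=90.0)

# The most recent memory loses to the one relevant to the question.
recalled = mem.retrieve("who is coming to the party", now=100.0, k=1)
```

The weighting between recency and relevance (and the importance term the paper adds) is exactly the kind of design decision the live data streams would drive in the system described above.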

By the end, you’ll see how data streaming and AI can work together to create lively, evolving virtual communities. Whether you’re into gaming, simulations, research or just exploring what’s possible, this session will give you ideas for building something amazing.

Olena Kutsenko

Mastering real-time anomaly detection with open source tools

Detecting problems as they happen is essential in today’s fast-moving world. This talk shows how to build a simple, powerful system for real-time anomaly detection with Python. We’ll use Apache Kafka for streaming data, Apache Flink for processing it, and AI to find unusual patterns. Whether it’s spotting fraud, monitoring systems, or tracking IoT devices, this solution is flexible and reliable.

First, we’ll explain how Kafka helps collect and manage fast-moving data. Then, we’ll show how Flink processes this data in real time to detect events as they happen. We’ll also explore how to add AI to the pipeline, using pre-trained models to find anomalies with high accuracy. Finally, we’ll look at how Apache Iceberg can store past data for analysis and model improvements. Combining real-time detection with historical data makes the system smarter and more effective over time.
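To make the detection step concrete, here is a minimal sliding-window z-score detector in pure Python. This is an illustrative stand-in, not the talk's actual model: in the architecture described above, logic like this (or a pre-trained model) would run inside a Flink operator consuming from a Kafka topic.

```python
import math
from collections import deque

class RollingZScoreDetector:
    """Flags a reading as anomalous when it lies more than `threshold`
    standard deviations from the mean of a sliding window of recent values."""
    def __init__(self, window=20, threshold=3.0):
        self.values = deque(maxlen=window)
        self.threshold = threshold

    def update(self, x):
        is_anomaly = False
        if len(self.values) >= 5:  # wait for a few points before judging
            mean = sum(self.values) / len(self.values)
            var = sum((v - mean) ** 2 for v in self.values) / len(self.values)
            std = math.sqrt(var)
            if std > 0 and abs(x - mean) / std > self.threshold:
                is_anomaly = True
        self.values.append(x)
        return is_anomaly

detector = RollingZScoreDetector(window=10, threshold=3.0)
stream = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.3, 10.1, 50.0, 10.0]
flags = [detector.update(x) for x in stream]  # only 50.0 is flagged
```

Note that once the spike enters the window it inflates the mean and standard deviation for subsequent readings; this is where replaying historical data from Iceberg to retrain or recalibrate the model pays off.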

This talk includes clear examples and practical steps to help you build your own pipeline. It’s perfect for anyone who wants to learn how to use open-source tools to spot problems in real-time data streams.

Olena Kutsenko

Building agent systems with streaming data

AI-powered agent systems are becoming essential for automation, personalization, and real-time decision-making. But how do we ensure that these agents can process information continuously, maintain context, and provide intelligent responses at scale?

This talk explores how Apache Kafka and Apache Flink can be used to build dynamic real-time agent systems using Python. We'll start with the basics of agent-based systems - how they work, how they communicate, and how they retrieve and generate relevant knowledge using Retrieval-Augmented Generation. Then, we'll look into real-time streaming architectures, showing how Kafka handles message passing between agents and Flink processes events to track context and enable intelligent responses.
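The message-passing pattern between agents can be sketched in-process with standard-library queues. This is a hypothetical miniature, not the session's code: each `queue.Queue` stands in for a Kafka topic, and the agent's `context` list stands in for the state that Flink would track across events.

```python
import queue

# Stand-ins for Kafka topics: one in-process queue per topic.
topics = {"questions": queue.Queue(), "answers": queue.Queue()}

class Agent:
    """Minimal agent: consumes from one topic, accumulates context,
    and produces a response to another topic."""
    def __init__(self, name, in_topic, out_topic):
        self.name = name
        self.in_topic = in_topic
        self.out_topic = out_topic
        self.context = []  # conversation history carried across messages

    def step(self):
        msg = topics[self.in_topic].get_nowait()
        self.context.append(msg)
        # A real agent would call an LLM here, passing self.context
        # (optionally enriched with RAG-retrieved documents) as the prompt.
        reply = f"{self.name} handled '{msg}' (context size={len(self.context)})"
        topics[self.out_topic].put(reply)

responder = Agent("responder", "questions", "answers")
topics["questions"].put("What is event streaming?")
topics["questions"].put("How does RAG help agents?")
responder.step()
responder.step()
answers = [topics["answers"].get_nowait() for _ in range(2)]
```

Swapping the queues for Kafka topics gives you durability and fan-out across many agents, which is what makes the pattern scale beyond a single process.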

By the end of this session, you'll have a clear roadmap for designing AI-driven agent systems that are context-aware, efficient and work with a continuous stream of data.

Whether you're working on chatbots, monitoring systems, or intelligent automation, this talk will provide practical insights into bridging streaming data with generative AI to power the next generation of autonomous agents. It is suitable for beginners and experts alike.
