A private AI assistant on Telegram that runs on your own machine gives you the convenience of Telegram with the privacy of local AI processing. This live workshop shows you how to build one using OpenClaw and Docker Model Runner, with a working assistant by the end of the 4-hour session.
By Packt Publishing · Refunds up to 10 days before the event
Telegram's clean bot API, large user base, and developer-friendly approach make it ideal for a private AI assistant. Combined with OpenClaw and Docker Model Runner for local processing, you get the best of both worlds — Telegram's interface with complete AI privacy.
OpenClaw is an open-source personal AI assistant that went viral in early 2026 with 200K+ GitHub stars. It runs on your own devices and connects to WhatsApp, Telegram, Slack, and more. No subscription. No data leaving your machine.
Docker Model Runner is Docker's native feature for running large language models locally on your machine. It gives you an OpenAI-compatible API that OpenClaw uses as its AI brain — complete data privacy, no cloud costs.
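Because the API is OpenAI-compatible, any standard HTTP client can talk to the local model. The sketch below is illustrative only: the endpoint URL and model tag are assumptions, so check your own Docker Model Runner configuration for the actual port and the model you pulled.

```python
import json
import urllib.request

# Assumed values -- adjust to match your local Docker Model Runner setup.
DMR_URL = "http://localhost:12434/engines/v1/chat/completions"
MODEL = "ai/llama3.2"  # whichever model you pulled locally

def build_payload(prompt: str) -> dict:
    """Standard OpenAI-style chat-completions request body."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str) -> str:
    """POST the prompt to the local model and return the reply text.

    Requires Docker Model Runner to be running and reachable at DMR_URL.
    """
    req = urllib.request.Request(
        DMR_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Nothing here leaves your machine: the request goes to localhost, which is the whole point of the setup.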
OpenClaw gives you the assistant interface and messaging integrations. Docker Model Runner gives you the AI brain, running privately on your machine. Together they create a production-grade private AI assistant that you fully own.
Setting this up from scattered documentation takes days of debugging. This live workshop gives you a complete guided build in 4 hours with a live instructor answering your questions. Packt has delivered 108 workshops worldwide.
Six modules. From local AI setup to a private AI assistant running in your Telegram.
Understand the Gateway, channels, and skills architecture. Set up and configure OpenClaw locally from scratch.
Run and manage local LLMs using Docker Model Runner. Pull models, configure memory, and understand the OpenAI-compatible API.
Configure DM pairing, allowlists, sandbox mode, and proper access controls for your local AI deployment.
Deploy your AI assistant to real messaging platforms without sending data to any third-party cloud service.
Design an extensible assistant architecture. Add skills, configure personality, and set up proactive automation.
Deploy your OpenClaw and Docker setup to a VPS for always-on, 24/7 availability.
A private AI assistant on Telegram — processing locally, deployed, and responding.
A fully functional local AI assistant running on your machine
Docker Model Runner configured with your chosen LLM model
OpenClaw connected to WhatsApp or Telegram
Security and privacy configuration you can trust
A reusable architecture for future AI assistant projects
Certificate of completion from Packt Publishing
Rami Krispin builds private AI assistants on Telegram in production environments.
Rami is a Senior Manager of Data Science and Engineering, Docker Captain, and LinkedIn Learning Instructor with deep expertise in building and deploying production AI systems. He guides you step by step from a blank terminal to a fully deployed private AI assistant — answering your questions live throughout the 4-hour session.
Developers who want a genuinely private AI assistant accessible through Telegram.
Everything you need to know about building a private AI assistant on Telegram.
Privacy comes from where the AI processing happens. Most Telegram AI bots send your messages to cloud AI APIs for processing. Your private AI assistant built in this workshop processes messages entirely through Docker Model Runner running locally on your own machine — your conversation content never reaches any external AI server.
Yes. Because your Telegram bot is accessible through Telegram itself, you can interact with your private AI assistant from any device that has Telegram — phone, tablet, desktop. The AI processing always happens on your local machine or VPS, regardless of which Telegram client you use to send the message.
Yes, for personal use. Your laptop running Docker Model Runner and OpenClaw is sufficient. Your private Telegram AI assistant works whenever your laptop is on. The workshop also covers VPS deployment for always-on availability if you need your assistant to respond when your laptop is off.
Your private Telegram AI assistant can handle any conversational task that the underlying open weight model supports — answering questions, helping with writing, summarising content, brainstorming, coding assistance, and general conversation. The instructor covers how to optimise your model choice for your specific use case.
Privacy configuration includes: OpenClaw allowlists to restrict which Telegram users can interact with your bot, verifying that Docker Model Runner makes no external AI API calls (checkable with network monitoring), and configuring OpenClaw not to log conversation content beyond what is needed for context. The instructor covers all privacy configuration during module three.
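Conceptually, an allowlist is a simple gate in front of the model. The sketch below illustrates the idea only; it is not OpenClaw's actual implementation, and the user IDs and function names are made up for the example.

```python
# Illustrative allowlist gate -- not OpenClaw's real code.
# Only Telegram user IDs on the list ever reach the local model;
# everyone else is refused before any AI processing happens.
ALLOWED_USER_IDS = {123456789}  # hypothetical example ID

def run_model(text: str) -> str:
    # Stand-in for the call to the local model via Docker Model Runner.
    return f"(local model reply to: {text})"

def handle_message(user_id: int, text: str) -> str:
    """Gate every incoming Telegram message on the allowlist."""
    if user_id not in ALLOWED_USER_IDS:
        return "Sorry, this assistant is private."
    return run_model(text)
```

The important property is that the check happens before the model call, so unauthorized messages never touch the AI pipeline at all.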
Yes. OpenClaw's skills system lets you add new capabilities to your private Telegram AI assistant written in Python. Skills can integrate with external APIs (for real-time data), automate tasks, set reminders, or add any custom functionality you need. The instructor covers the skills architecture during module five of the workshop.
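OpenClaw's actual skill interface is not documented here, but as a rough illustration, a skill can be thought of as a small Python function the assistant invokes on your behalf. The reminder example below is entirely hypothetical in its names and signature.

```python
# Hypothetical reminder skill -- the real OpenClaw skill API may differ.
from datetime import datetime, timedelta

# In-memory store for pending reminders (a real skill would persist these).
REMINDERS: list[tuple[datetime, str]] = []

def remind_skill(text: str, minutes: int) -> str:
    """Store a reminder due in `minutes` and confirm it back to the chat."""
    due = datetime.now() + timedelta(minutes=minutes)
    REMINDERS.append((due, text))
    return f"Reminder set for {due:%H:%M}: {text}"
```

A skill like this runs locally alongside the assistant, so even custom automations keep your data on your own machine.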
4 hours. Live instructor. Private AI assistant on Telegram by the end. Seats are limited.
Register Now → Sunday, April 26 · 9am to 1pm EDT · Online · Packt Publishing